December 09, 2022
On Fri, Dec 09, 2022 at 02:11:15PM +0000, Paul Backus via Digitalmars-d wrote: [...]
> Some features from the article that as far as I know have no equivalent in dub:
> 
> 1. Caching registry locally.
> 
> > Resolving environments feels instant, as opposed to the glacially slow Conda that Python offers. The global "general" registry is downloaded as a single gzipped tarball, and read directly from the zipped tarball, making registry updates way faster than updating Cargo's crates.io.

IMO speed in this area is critical.  Room for dub to improve.


> 2. Multiple registries with namespacing.
> 
> > Pkg is federated, and allows you to easily and freely mix multiple public and private package registries, even if they have no knowledge of each others and contain different packages with the same names.

This is the way to go.  Flat package namespaces just don't cut it anymore in this day and age.  Some kind of namespacing is necessary.

One may argue, D's ecosystem is so small, what need do we have of package namespacing?  Well, think of it this way: if my D program is so small, what need do I have of module namespacing?  Answer: it sets the groundwork for future expansion.  If we start off on the wrong foot, it will be difficult to add namespacing later when the ecosystem grows.


> 3. Cross compilation.
> 
> > The BinaryBuilder package allows you to cross-compile the same program to all platforms supported by Julia

This IMO would be a big selling point for dub.  We need a comprehensive cross-compiling solution that doesn't need manual hacking to work. LDC's built-in Windows target is awesome, but still needs manual setup. Cross-compilation to Mac is still incomplete, and WASM support is separate and needs work.


T

-- 
Frank disagreement binds closer than feigned agreement.
December 09, 2022
On 12/9/2022 9:28 AM, H. S. Teoh wrote:
>>> The BinaryBuilder package allows you to cross-compile the same
>>> program to all platforms supported by Julia
> 
> This IMO would be a big selling point for dub.  We need a comprehensive
> cross-compiling solution that doesn't need manual hacking to work. LDC's
> built-in Windows target is awesome, but still needs manual setup.
> Cross-compilation to Mac is still incomplete, and WASM support is
> separate and needs work.

Recently, dmd acquired the ability to cross compile for all its supported platforms. Cross linking, though, remains a problem.

December 09, 2022
On 12/9/2022 6:11 AM, Paul Backus wrote:
>> Resolving environments feels instant, as opposed to the glacially slow Conda that Python offers. The global "general" registry is downloaded as a single gzipped tarball, and read directly from the zipped tarball, making registry updates way faster than updating Cargo's crates.io.

dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.
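The idea is mechanically simple. Here is a rough Python sketch (illustrative only — not dmd code, and the helper name is made up) of reading source members straight out of a gzipped tarball without ever touching the filesystem:

```python
import io
import tarfile

def read_sources_from_tarball(tarball_bytes):
    """Read .d source files directly out of a gzipped tarball,
    without extracting anything to disk."""
    sources = {}
    with tarfile.open(fileobj=io.BytesIO(tarball_bytes), mode="r:gz") as tar:
        for member in tar.getmembers():
            if member.isfile() and member.name.endswith(".d"):
                # extractfile gives a file object into the archive; no temp files.
                sources[member.name] = tar.extractfile(member).read().decode("utf-8")
    return sources
```

A compiler doing the same thing would hand these buffers to its lexer exactly as if they had come from disk, which is why no expansion step is needed.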

December 10, 2022
On Thursday, 8 December 2022 at 17:47:42 UTC, Walter Bright wrote:
> Here's a good thought provoking article:
>
> https://viralinstruction.com/posts/goodjulia/
>
> A couple of things stood out for me:
>
>
> 1. https://viralinstruction.com/posts/goodjulia/#the_package_manager_is_amazing
>
> I've never thought of a package manager that way.
>
>
> 2. "Rust, for example, may have a wonderfully expressive type system, but it's also boilerplate heavy, and its borrowchecker makes writing any code that compiles at all quite a time investment. An investment, which most of the time gives no returns when you're trying to figure how to approach the problem in the first place. It's also not entirely clear how I would interactively visualise and manipulate a dataset using a static language like Rust."
>
> I've always thought that a great strength of D was its plasticity, meaning you can easily change data structures and algorithms as you're writing and rewriting code. Apparently this is much more difficult in Rust, which will inevitably result in less efficiency, even if the compiler for it generates very good code.

I speak with a fair bit of experience with both Rust and D. In my opinion, what gives the
writer of the Julia article heartburn about Rust has nothing to do with static vs. dynamic typing. Rust is difficult to learn because its insistence upon GC-less memory safety places a significant memory-management burden on the programmer. That's what all the ownership rules are about and the notorious borrow-checker is relentless in enforcing those rules. This is not a language for prototyping. You have to have a very clear idea of your design decisions and how they relate to the ownership/borrowing rules, or you will find yourself in a world of considerable frustration.

This is much less true of D (and, I'm sure, Go and Nim, with which I have only a little experience). It's also less true of Haskell, with which I have a lot of experience, which also has a demanding compiler. But those demands are mostly about proper use of Haskell's type system and don't off-load work on the programmer because there's an empty space where a GC ought to be.

Having said all this, once you learn how to deal with Rust, you learn where the land-mines are and how to avoid them. Using it then becomes a more normal experience, but the time and effort to get to that steady-state is greater than any language I've ever used in 60+ years of writing code. I will say that the compiler provides excellent error messages, as well as many Lint-ish suggestions about eliminating unnecessary and/or unused things from your code. Cargo is also very solid -- easy to use and well documented. Once you get your code to compile, it's much like Haskell -- it works, modulo your own logic errors.


December 10, 2022

On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
> On 12/9/2022 6:11 AM, Paul Backus wrote:
>>> Resolving environments feels instant, as opposed to the glacially slow Conda that Python offers. The global "general" registry is downloaded as a single gzipped tarball, and read directly from the zipped tarball, making registry updates way faster than updating Cargo's crates.io.
>
> dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.

I would love that, I usually only read the forums & lurk, I made an account just to say this!

In my case, I don't use dub, or any other package manager; I'm a strong believer that your project should build without an internet connection. In a lot of my projects, what I do is keep all my dependencies in tar files, and the build script unpacks them and then builds them.

It would be lovely if I could skip the unpacking and directly feed them into dmd.

Not only that, this could also be an awesome way of easing distribution of programs as source.

I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`

December 09, 2022
On 12/9/2022 9:53 PM, Greggor wrote:
> I would love that, I usually only read the forums & lurk, I made an account just to say this!

Hmm. Looks like I'm not the only one!
December 10, 2022
On 10/12/2022 6:53 PM, Greggor wrote:
> I would love that, I usually only read the forums & lurk, I made an account just to say this!

You don't need an account to post; it's entirely optional.
December 10, 2022

On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
> On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
>> On 12/9/2022 6:11 AM, Paul Backus wrote:
>>>> Resolving environments feels instant, as opposed to the glacially slow Conda that Python offers. The global "general" registry is downloaded as a single gzipped tarball, and read directly from the zipped tarball, making registry updates way faster than updating Cargo's crates.io.
>>
>> dmd could be enhanced to read source files from a zip or tarball, so these wouldn't have to be expanded before compilation.
>
> I would love that, I usually only read the forums & lurk, I made an account just to say this!
>
> In my case, I don't use dub, or any other package manager; I'm a strong believer that your project should build without an internet connection. In a lot of my projects, what I do is keep all my dependencies in tar files, and the build script unpacks them and then builds them.
>
> It would be lovely if I could skip the unpacking and directly feed them into dmd.
>
> Not only that, this could also be an awesome way of easing distribution of programs as source.
>
> I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`

While I think there are many advantages in not using a package manager, I also think that having one is really productive. Being able to just add a dependency and have the tool automatically pick the most recent version and then record that version information is absolutely important, and it is easy. The compilation command would never stay that small unless you're working on a really small project.

Although dub has many problems today, I would really like to see it becoming more and more useful. Currently, for me, as a package manager it is good enough. As a build system, it does not fit all my requirements, for which I have been opening issues on its repo.

But unfortunately, migrating my project, which is pretty big right now, from dub to any other build system is currently unviable for me: I would lose days getting it stable again, time I could use for coding more features.

I can say that for any newcomer, dub is a blessing: not needing to know any build CLI or specific details that shouldn't be required of every developer is a game changer. The build process is a very important step in every project out there, and having one which is simple to use is the best choice.

December 10, 2022

On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
> On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
>> [...]
>
> I would love that, I usually only read the forums & lurk, I made an account just to say this!
>
> In my case, I don't use dub, or any other package manager; I'm a strong believer that your project should build without an internet connection. In a lot of my projects, what I do is keep all my dependencies in tar files, and the build script unpacks them and then builds them.
>
> It would be lovely if I could skip the unpacking and directly feed them into dmd.
>
> Not only that, this could also be an awesome way of easing distribution of programs as source.
>
> I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`

If you already have the dependencies then it does build without an internet connection.

You can also use git submodules with dub if needed.
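For what it's worth, dub can already consume dependencies vendored inside the project tree: a path-based dependency in dub.json points at a local directory, so no registry lookup or network access is involved. A minimal sketch (the package name and `vendor/mylib` directory here are hypothetical):

```json
{
    "name": "myproject",
    "dependencies": {
        "mylib": { "path": "./vendor/mylib" }
    }
}
```

This works the same whether the directory is a git submodule, a Fossil checkout, or an unpacked tarball.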

December 11, 2022
On Saturday, 10 December 2022 at 14:24:53 UTC, max haughton wrote:
> On Saturday, 10 December 2022 at 05:53:37 UTC, Greggor wrote:
>> On Friday, 9 December 2022 at 19:07:21 UTC, Walter Bright wrote:
>>> [...]
>>
>> I would love that, I usually only read the forums & lurk, I made an account just to say this!
>>
>> In my case, I don't use dub, or any other package manager, I am a strong believer of your project should build without a internet connection. In a lot of my project what I do is have all my dependencies in tar files & the build script unpacks them and then builds them.
>>
>> It would be lovely if I could skip the unpacking and directly feed them into dmd.
>>
>> Not only that, this could also be an awesome way of easing distributing programs via src.
>>
>> I would love it if I could do something like `dmd myproject.tar.xz -of=myproject`
>
> If you already have the dependencies then it does build without an internet connection.
>
> You can also use git submodules with dub if needed.

I'm not using git for version control, I'm using Fossil, so I'm not sure if this is applicable to me.

It's not just about building without an internet connection; it's more of a measuring stick I use. There are several reasons I do it, but mainly it's "trauma" from the JS/NPM ecosystem.

The two stories I'm sharing may not seem relevant, but they are to the point I'm making.
Maybe I'm a goof & live in a bubble, but I'd like to argue that D currently has a better dependency experience than most ""nicer"" systems, based on some anecdotal evidence :^)

JS:
>My experience with the JS ecosystem, especially with node.js, is that anything I have written or used that relied on NPM packages, if left unmaintained for a little while, will stop working or gain odd bugs for no good reason. At one point I had the freedom to decide to do an experiment while working on a personal project in JS/node, where I chose not to use NPM and minimized my dependency use, & it changed my perspective on JS: the language is fine; it's the tooling, culture and ecosystem around it that make it less so.
>
>Eventually I wrote a single-file library that held a lot of functions I'd reuse, and what's funny about this is that I did what some C devs do: I created the "single header" library, but in the context of JS.

Python:
>Please understand that I am not a Python programmer, so I may have gone at this the wrong way, but I do have a very negative experience with Python as a user, and I feel that it's still important to mention.
>
>About two months ago I set up on my machine a distribution of Stable Diffusion (an image-gen AI; it's cool, look it up) and every step of the way it was a horrid experience. First I tried following the instructions provided by the project; they used a package manager called Anaconda, and it took over 10 minutes for it to "resolve", and even then it failed to figure it out.
>After throwing away an hour of my life, I decided to do it myself, so I figured out the packages I needed and installed them via pip. I even thought I was being a smart cookie and used something called virtual environments, and yes, it did work well — until my distribution updated the Python version and everything exploded.
>I found out that the Python ecosystem uses a lot of native C/C++ libs and that a lot of Python libraries are wrappers around them, which creates a lot of "fun" when Python updates.
>
>The virtual environment in Python does not include a copy of the Python install at the time it was created and instead symlinks to it :/, so it's not actually isolated; it's just a crutch for pip, because Python, like a lot of languages, follows the horrid practice of installing a dependency globally (or user-wide). After more hair loss I gave up on updating the deps; nothing seemed to work, and I have never written any Python, so I am not the person to fix it. What I ended up doing was building Python 3.10 from source, creating an install just for this program, and changing the symlink in the Python env to point to it. What's even sadder is that I found out you can't move a Python environment after you make it: when building Python, I found out you have to hard-code the path of where the interpreter lives. What twisted mind thought this was a good idea?
>
>I wish the people doing AI had picked a better language. I do not understand how they reproduce anything, but I guess they are smarter than I am, so maybe I'm just too dumb for Python.
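[Aside from the editor: the symlink behaviour described above is the default, but the stdlib's venv module can be told to copy the interpreter instead. A minimal sketch — the helper name is made up, and note that even a copied env still records the base install's path in pyvenv.cfg, so it is not fully self-contained:]

```python
import sys
import venv
from pathlib import Path

def make_copied_env(target):
    """Create a venv whose interpreter is a copy, not a symlink, so a
    swapped-out system Python doesn't leave dangling links behind.
    (The env still depends on the base install's stdlib via pyvenv.cfg.)"""
    builder = venv.EnvBuilder(symlinks=False, with_pip=False)
    builder.create(target)
    bindir = Path(target) / ("Scripts" if sys.platform == "win32" else "bin")
    # Return the interpreter entries that are real files rather than symlinks.
    return [p for p in bindir.glob("python*") if not p.is_symlink()]
```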

Dlang:
>Looking at my currently open D project, I have a couple of dependencies, and none of them have any sub-dependencies — that's Awesome!
>When DMD updates on my system, most things still work; at worst I get a deprecation message in my build log and I investigate it. The upgrade tends to be very simple — that is Awesome!
>Unlike my C++ experience, most of the time Phobos has what I need & there is zero reason to reach for some 3rd-party nonsense.
>When I do use an external dependency, they tend to be of good quality; most don't pull in a ton of sub-dependencies or have complicated build steps. In most cases I can just take a DUB package, add the src to my src tree, and call it a day.

My goal here is not to appear like a luddite or to tell others to be one. I think having nice tooling is a good goal; I'd love to have a good package manager for D with a quality ecosystem.

Here is how I would go about this:
	* A dependency should always be just a tar/zip file with src code in it.
	* The dependency tar/zips should always be stored in the project directory & not in some system or user folder.
	* No use of symlinks.
	* To help discourage NPM insanity, build in a ""bloat"" measuring tool: how many total dependencies are you using? How many KLOC is it?
	* https://code.dlang.org should have a link for a manual direct download of the src zip for all versions.

	Github, Gitea instances & Fossil all have a way of providing a zip file for releases; src zips are already a near-universal method of publishing code. So this theoretical package manager could be really fast and light: it's basically an overglorified text search & download tool.
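[Aside from the editor: the ""bloat"" measuring tool proposed above is small enough to sketch. Here is an illustrative Python version — the function name and the convention of vendored `*.tar.gz` dependencies are assumptions, not an existing tool:]

```python
import tarfile
from pathlib import Path

def measure_bloat(vendor_dir):
    """Tally vendored dependencies: how many tarballs there are, and how
    many total lines of code they contain (returned as a rough KLOC figure)."""
    deps = 0
    total_lines = 0
    for tarball in sorted(Path(vendor_dir).glob("*.tar.gz")):
        deps += 1
        with tarfile.open(tarball, "r:gz") as tar:
            for member in tar.getmembers():
                if member.isfile():
                    # Count newlines without extracting anything to disk.
                    total_lines += tar.extractfile(member).read().count(b"\n")
    return deps, total_lines / 1000.0
```

Printed after every build, two numbers like "7 deps, 42.3 KLOC" make dependency creep visible long before it becomes NPM-grade insanity.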