Dependencies

Many people have soured on dependencies lately.

Here are the ways I see out of the current situation.

  1. Everyone starts reimplementing (or pasting) everything for themselves.
  2. Libraries start getting a lot bigger, more feature complete, and better.
  3. We make the distro system safer, signed, pinned, better, and find trusted maintainers to host and vet stuff.

I think the first is bad and the second is good, but whether we go with the first, the second, or both, the third is also going to be necessary, and if done well enough, it might be enough on its own.

Many years ago, when I was just starting out on Linux, I watched a friend write a Perl script, and he added a dependency from CPAN: a language extension that provided a switch statement.

The script as a whole was like half a page and I was like “What in the heck! Are you adding a library for just one line, a line you could’ve just as easily done with some ifs and elses and buts?”

He was like “That’s something you’re going to have to get over right away.” Code reuse is a good practice and making code reuse easy and painless is good.

I don’t know how far to stretch that.

Here are two straight-up facts:

  1. There are dependencies that are gratuitous and unnecessary.
  2. There are dependencies that make your code significantly more readable and comfy.

That second fact, to me, suggests that a hardline NIH approach is not the right path forward. The first fact suggests that pruning our deps a little bit is possible and probably good, but we are gonna need a good way to handle dependencies regardless. The “zero deps” approach is something I’m not a fan of, any more than I’m a fan of those unwieldy, over-dependent trees.

There are some practical problems that, no matter what, we’re gonna need to solve, like the tension between distros and devkits, and the perils of just pulling and running some rando’s unverified code.

Here’s an outline for a way forward (I can’t build this stuff myself, so this is just daydreaming):

  1. Integrate the language’s custom package manager with the distros so that we can apt install and apt remove and resolve dependencies with apt and dpkg. (And rpm and whatever the other dorks have.) There’s a sketch of one possible bridge right after this list.
  2. But have the language’s custom package manager / repo be its own thing that’s as easy to contribute to as it is today. Its own thing, but integrated with the distro systems like apt.
  3. But don’t let people automatically install stuff that a repo manager hasn’t eyeballed, at least not without ten thousand --force --yes-i-am-sure --really type flags. No more installing out of random URLs and repos all over.
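To make point 1 concrete, here’s a minimal sketch in Python of one way a language’s package manager could hand its modules over to dpkg. The “mylang” naming and the install path are made up for the example; the DEBIAN/control file and dpkg-deb are the real Debian mechanisms.

    import shutil
    import subprocess
    from pathlib import Path

    def build_deb(module: str, version: str, src: Path, out: Path) -> Path:
        """Wrap one language module as a .deb so dpkg/apt can track and remove it."""
        # "mylang" is a stand-in prefix; it also keeps us from clashing
        # with real distro package names.
        pkg = f"mylang-{module.lower()}"
        root = out / f"{pkg}_{version}"
        # Install under a path the language runtime would search (invented here).
        shutil.copytree(src, root / "usr" / "lib" / "mylang-modules" / module)
        debian = root / "DEBIAN"
        debian.mkdir(parents=True)
        (debian / "control").write_text(
            f"Package: {pkg}\n"
            f"Version: {version}\n"
            "Architecture: all\n"
            "Maintainer: mylang bridge <nobody@example.invalid>\n"
            f"Description: mylang module {module}, repackaged from the language repo\n")
        # dpkg-deb --build turns that tree into a .deb right next to it.
        subprocess.run(["dpkg-deb", "--build", str(root)], check=True)
        return root.parent / (root.name + ".deb")

Once a module is installed that way, dpkg -L lists its files and apt remove uninstalls it, same as any other package.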

This means that… let’s take a fictional language foo that uses foos-modules. The language has great tooling so it’s easy for anyone to make and upload and use a foos-module.

There’s also a foos-modules package manager that works like your normal gem, go get, npm type app but also has the following features:

  1. Every module has at least two versions: latest-uploaded and latest-vetted. By default, you get the latest-vetted. This goes especially for recursive dependency resolution. You can pin to older versions (if so, we need some sorta in-app CVE alert system) and you can, with enough hoops, get newer, unvetted versions. (There’s a sketch of this after the list.)

  2. The modules hook into the distros’ normal package managers. So you can apt remove and dpkg -L and so on.

  3. And, when it’s time for the distro to package an app, the distro can bring in and pin vetted versions of the dependencies (again, that means some sorta built-in CVE alerter is needed) and bless those versions officially.
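Here’s a rough sketch of how the vetted-by-default resolution from point 1 could behave. The index format, module names, and version numbers are all made up:

    from dataclasses import dataclass, field

    @dataclass
    class Release:
        version: str
        vetted: bool
        deps: list[str] = field(default_factory=list)

    # Toy index, newest release first per module.
    INDEX = {
        "web": [Release("2.0", vetted=False, deps=["json"]),
                Release("1.9", vetted=True, deps=["json"])],
        "json": [Release("3.1", vetted=True)],
    }

    def resolve(name, allow_unvetted=False, picked=None):
        """Pick latest-vetted for name, then recurse over its deps."""
        picked = {} if picked is None else picked
        if name in picked:
            return picked
        choice = next((r for r in INDEX[name] if r.vetted or allow_unvetted), None)
        if choice is None:
            raise RuntimeError(f"{name} has no vetted release; "
                               "bring the ten thousand --force type flags")
        if not choice.vetted:
            print(f"WARNING: {name} {choice.version} hasn't been vetted yet")
        picked[name] = choice
        for dep in choice.deps:
            resolve(dep, allow_unvetted, picked)
        return picked

    print(resolve("web"))  # web 1.9 (latest-vetted) plus json 3.1; never web 2.0

Pinning below latest-vetted would then be the thing that triggers the in-app CVE alerter.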

I know Andrew’s video says that Debian forbids overly small packages and, contradictorily, also forbids including the dependencies right there in the package. If that’s true, one of those rules is something they’re gonna have to get over, perhaps in conjunction with a system like what I’m outlining here.

System-wide vs vendored & contained

I came into FOSS and Linux just as the debate over dynamic linking was roaring. People had been statically linking, and the hot new thing was dynamic linking. The operating system had one version of a library, say version 17, and all the apps linked to that version.

This is what I refer to as the traditional “Debian approach”. There are packages, but the binaries in those packages can be dynamically linked to libraries in other packages.

These days, the fad is to “vendor” or “contain” things so that each app “carries with it” all the libraries it needs. Often specific, stale versions, sometimes bleeding edge.
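To make the difference concrete, here’s a hedged little illustration in Python, using ctypes. zlib stands in for the shared library, and the vendored path is invented:

    import ctypes
    import ctypes.util

    # Traditional approach: ask the dynamic linker for the one system-wide
    # copy. Every app doing this picks up a security fix the moment the
    # distro updates that single library.
    system_zlib = ctypes.CDLL(ctypes.util.find_library("z"))

    # Vendored / contained approach: the app loads its own private, pinned
    # copy from inside its own tree. An update to the system copy never
    # reaches it; someone has to rebuild or re-ship the app.
    # (Hypothetical path:)
    # vendored_zlib = ctypes.CDLL("/opt/someapp/vendor/libz.so.1.2.11")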

There are some pros and cons to either approach.

What I like to do is to use the traditional approach (and be on the stable distribution) for 99% of the software I use and then maybe have three or four “pet apps” where I use the dev version, maybe with a few patches of my own. Usually I don’t need any special library versions for these, but for some of them I do. I remember, decades ago now, keeping an unstable chroot just for one app (Inkscape) that I wanted to compile from trunk.

If there’s a problem with a library in the traditional Debian approach, you can update that one library and every app picks up the fix, whereas if there’s a problem with a library in the vendored / contained approach you need to hunt it down in every app you’ve got. Even abandoned apps.

The traditional approach is also so much faster. I know that Podman is a miraculous feat of engineering with remarkably little overhead for what it does. But what it does is still a big thing. I can run a crazy amount of services on a puny server thanks to using the traditional Debian approach.

I know there are drawbacks, or there must be, because they used to call it “dependency hell”, but it just works really well and I’m gonna stay with it.

It’s also a question of scale. On the small scale, in small organizations, the Debian approach makes a lot of sense and you can stretch valuable resources to go far. It’s not just about wallet resources, it’s also about electricity and e-waste.

I’m not ignorant about how, in bigger organizations, thanks to bugs in our economic systems, frugality has become a dirty word: you can spend money to make money, and throwing computers at a problem is cheaper than spending time on it. I just think it’s wack.