
/tech/ - Technology

"Technology reveals the active relation of man to nature"

File: 1678482063414.png ( 21.89 KB , 900x878 , tux-package.png )

 No.11964

What's your take on Linux software distribution?

There's a lot of buzz around Flatpak and Flathub at the moment; they are currently implementing a monetization feature, and for some reason Eric Schmidt, the ex-Google CEO, is somehow involved.
https://www.theregister.com/2023/02/27/flathub_app_store/
(And no, Flathub isn't called "Flub".)
I'm worried that once money is involved it will attract scammers and litigate-for-a-payout types. Also, the payment processor they want to use is Stripe, and that's probably not anonymous.

Ubuntu has removed Flatpak from its official releases to push its Snap package manager instead. I wonder if they have other reasons than "we're going to make our own package manager, with blackjack and hookers" to yeet Flatpak from their system.

I think the best package manager in the end might be Nix.

Is going from distro repositories to this type of stuff going to improve software distribution on Linux?

 No.11965

The best package manager is Portage.
It's a shame it's on that convoluted mess that is Gentoo.

 No.11966

File: 1678499576257.jpg ( 246.34 KB , 1280x720 , pkgsrc-on-linux.jpg )

Many people seem to forget that NetBSD's pkgsrc exists and supports Linux among other systems.
It has minimal dependencies, even compared to Portage, which makes it integrate well with LFS, KISS, or suckless-style distros.

>>11964
>Is going from distro repositories to this type of stuff going to improve software distribution on Linux?
The problem flatpak, snap and nix are trying to solve is reproducibility.

Source-based package managers usually ensure that all dependencies are available in the correct versions.
Nix is more sophisticated in how it configures packages and in exposing a filesystem that contains exactly the needed dependencies to the project being compiled.
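
Roughly, nix gets there by deriving a package's store path from a hash over all of its build inputs. Here is a toy sketch of the idea in Python (this is not nix's actual hashing scheme, and the package data is made up for illustration):

# Toy model: a package's "store path" is derived from a hash over everything
# that can influence the build -- name, version, build flags and the exact
# store paths of its dependencies. Change any input and the result lands in
# a different path instead of silently replacing the old build.
import hashlib

def store_path(name, version, deps=(), flags=()):
    h = hashlib.sha256()
    for part in (name, version, *sorted(flags), *sorted(deps)):
        h.update(part.encode())
        h.update(b"\0")                     # unambiguous separator
    return f"/nix/store/{h.hexdigest()[:32]}-{name}-{version}"

zlib = store_path("zlib", "1.2.13")
curl_plain = store_path("curl", "8.0.1", deps=[zlib])
curl_gnutls = store_path("curl", "8.0.1", deps=[zlib], flags=["gnutls"])
print(curl_plain != curl_gnutls)            # True: same version, different inputs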

Binary package managers nowadays only distribute dynamically linked executables and libraries, which often need to be of a particular version to function together. Dynamically linked libraries are rarely compatible between different versions, because even trivial bug fixes can change the ABI, whereas compiling all dependencies yourself would only fail after an API change. Many distros invest a great deal of effort into keeping the repos of their binary package managers working. Binaries distributed for those distros but not included in the repos, especially proprietary ones, will inevitably stop working unless they are always updated in sync with the particular distro.
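
To make the ABI point concrete, here is a toy demonstration with Python's ctypes (the struct and its fields are invented for illustration, this is not any real library): the "application" was built against one layout, the updated "library" uses another, the source-level API never changed, and yet every field after the inserted one now sits at the wrong offset.

# Same API, different memory layout: an ABI break in miniature.
import ctypes

class FooV1(ctypes.Structure):            # layout the app was compiled against
    _fields_ = [("id", ctypes.c_int),
                ("name", ctypes.c_char * 8)]

class FooV2(ctypes.Structure):            # layout after a "trivial" fix that
    _fields_ = [("id", ctypes.c_int),     # inserted a field in the middle
                ("flags", ctypes.c_int),
                ("name", ctypes.c_char * 8)]

obj = FooV2(id=42, flags=0, name=b"tux")            # what the new library builds
view = ctypes.cast(ctypes.pointer(obj),             # what the old app sees when
                   ctypes.POINTER(FooV1)).contents  # it reads that same memory
print(view.id)    # 42  -- fields before the insertion still line up
print(view.name)  # b'' -- 'name' now reads the bytes of 'flags' instead
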
I don't know about snap, but flatpak has several very general dependency collections ("runtimes"), and an application requests a specific version of one. The application is then granted access to all dependencies in that runtime's filesystem and, optionally, to user-specified directories. Applications and runtimes are distributed as compressed filesystem images on flathub or private repos. Thus flatpak applications should only stop working when the underlying distro breaks flatpak, bubblewrap, dbus, x11, cgroups, … but not when the host's Gnome or KDE changes.
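
A toy sketch of that launch-time contract, again in Python (the data structures are invented, this is not flatpak's real metadata format): the app pins a runtime by id and branch, and either gets exactly that dependency set or fails loudly, instead of silently mixing library versions like it would on the host.

# Toy model of runtime pinning: resolve exactly the requested runtime or fail.
installed_runtimes = {
    ("org.freedesktop.Platform", "22.08"): ["glibc", "gtk4", "mesa"],
    ("org.freedesktop.Platform", "23.08"): ["glibc", "gtk4", "mesa"],
}

def launch(app_id, runtime_id, branch):
    deps = installed_runtimes.get((runtime_id, branch))
    if deps is None:
        raise RuntimeError(f"{app_id}: runtime {runtime_id}//{branch} not installed")
    return f"{app_id} sees only its own files plus {deps}"

print(launch("org.example.App", "org.freedesktop.Platform", "22.08"))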

Both types of software distribution are without a doubt less error-prone than previous approaches. Still, I don't think the latter method will solve all portability problems. Some programs are inherently fragile outside of their target environment; I have seen even their flatpak builds fail for reasons I couldn't pin down.

 No.11979

File: 1678814504654.png ( 162.95 KB , 720x302 , pwq7ax9bvcs81.png )

>I think the best package manager in the end might be Nix
I think it is Portage, because of USE flags and compile-time optimizations.
>Ubuntu has removed Flatpak from its official releases
ubuntu destroying itself again? (imagine my shock)
>Is going from distro repositories to this type of stuff going to improve software distribution on Linux?
No, it will definitely make things worse. I think sticking to repositories that share a single packaging format is best for a system, both for sanity and for performance. AppImages might come in handy here, since you can make packages (ebuilds, PKGBUILDs, etc.) with an AppImage inside (though this should be avoided), which keeps each system sane, i.e. you don't have to mix other packaging formats into the same system the way Snaps and Flatpaks do.

 No.12034

>>11964
flatpak/snap/appimage/etc. are all cancer because they encourage vendoring dependencies, which leads to gigantic, sloppy, unmaintainable codebases (and of course it's a Trojan horse for proprietary software). it's basically going to turn linux into windows. use guix/nix

 No.12035

>>12034
No matter how badly designed these are, they solve a very real problem with proprietary programs on linux. That linux is very brittle in supporting binary distributions of software should not be seen as a positive. See my previous post on how dynamic linking is inherently broken >>11248
>it's basically going to turn linux into windows.
And that's a good thing. Package repositories are the better approach in most cases, but newbies have a point when they lament the lack of portable installers. I usually compile niche software from source, yet seeing the way many projects are headed towards using meson and other unnecessarily convoluted build systems, I fear I will have to rely more on binary distributions in the future.

 No.12041

>>12035
>And that's a good thing.
Eat shit.

 No.12043

>>12041
Just to be clear: do PEs make Windows actively worse, and do APKs do the same for Android? This is a genuine question; I haven't used either in some time.
You seem to be eating from the trashcan of the freedesktop developers though, if you consider the current instability of the linux userspace desirable.

 No.12044

>>12035
>they solve a very real problem with proprietary programs on linux
How will that dynamic play out?
Does it mean more people will use Linux because it lets them continue using that one proprietary program?
Or is it going to become a Trojan horse where a proprietary layer spreads on top of the open-source operating system?

>it's basically going to turn linux into windows.

>And that's a good thing.
I don't know what you mean by this, but the Windows way is to download some random setup.exe from a website and run it. That's probably one of the worst ways to get programs, because there are no security or software update features.
Windows has copied some of the software-repository features from Linux: you can use the winget package manager to install programs from the PowerShell CLI, in a way not too dissimilar from apt or dnf on Linux. Of course it's rather limited, and many people use the Chocolatey package manager instead because it has dependency management and versioning. So Windows may also end up with a sprawling web of package managers.

>I usually compile niche software from source, yet seeing the way many projects are headed towards using meson and other unnecessarily convoluted build systems, I fear I will have to rely more on binary distributions in the future.

I'm kinda hoping that there will be a little more consolidation for these kinds of environments, both for build systems and for stuff like Flatpak runtime dependencies, so that you don't have to install so many of these big chonkers.
Or, if not consolidation, then at least a way to reduce the file sizes. Installing a program that by itself is only a few megabytes but needs hundreds of megs of dependencies is always an oof-moment.

 No.12045

>>12044
>is it going to become a Trojan horse where a proprietary layer spreads on top of the open-source operating system
The question is not whether proprietary software will arrive, but how well it will be supported in the long term. If it is made for Linux, it should forever be usable/piratable/crackable on Linux, the same way it currently is on Windows.

>That's probably one of the worst ways to get programs, because there are no security or software update features.

Package repositories are the right approach in most cases. They aren't right for software that is not actively maintained and depends on unstable dependencies such as GTK or Qt (if those were stable, GTK 2 and Qt 4 would still be supported). This is why such dependencies need to be distributed with the software, either through static linking (sufficiently elegant, but currently unsupported) or through the overengineered, freedesktop-endorsed leaky sandboxing that is flatpak.

>I'm kinda hoping that there will be a little more consolidation for these kinds of environments, both for build systems and for stuff like Flatpak runtime dependencies

This consolidation is a direct consequence of build systems getting more complex. Before autoconf/automake, developers wrote their own configure scripts. The proliferation of CMake coincided with the death of many smaller build systems that were often iterative improvements on make.

Theoretically, the thinning-out of build systems meant they became more portable. The problem is that CMake, Meson and other abominations are often not as robust as more minimal approaches.
dwm comes with a pre-written Makefile. When I compile it statically, some libraries are not automatically included, so I need to grep for the symbols and add the missing libraries. I have even done this with individual libraries/objects of larger autoconf-generated projects. You can easily fix many portability issues with autoconf, because the end result is a tree of Makefiles.

I cannot fathom how the individual build steps of CMake fit together. It can generate Makefiles, but these Makefiles in turn call cmake again, which generates similar Makefiles again. The only way to account for missing libraries, headers or include directories is to pass them in the global CFLAGS/LDFLAGS variables.
With Meson you don't even have that option: your environment variables are simply ignored. This would only be an inconvenience if the build scripts written for Meson let you correct these options, but they don't. When you try to make Meson compile static executables, it actively fights you.

 No.12047

File: 1679094492867.jpeg ( 33.97 KB , 612x612 , 30b7c8203e33031cd1f3052a1….jpeg )

>>12045
>If it is made for linux, it should forever be..
stopped reading

so you want to support outdated abandoned software indefinitely into the future?

That's gotta be one of the most retarded things I have ever heard.

 No.12048

>>12047
>so you want to support outdated abandoned software indefinitely into the future?
So you want every video game older than 4 years to be unplayable? People in the future won't be writing a compatibility layer for every LLVM release (or wait that's exactly what flatpak is).

Who do you think writes software? It is written by huge corpos pushing out freedesktop-enforcing crapware, by individual programmers solving their own problems and then getting on with their lives, or in many cases by an entity in between that won't be around in a decade or so.
Most GNU projects were written by individuals or small teams of developers and are barely being maintained, if at all. Many of them still work, some of them don't, and others compile on the BSDs but no longer on Linux (glibc at work).

Backwards compatibility is not a burden in the case of Linux, it is actively sabotaged by the support rentiers at Red Hat. Programs depending on ncurses, X11, OpenGL, SDL, Motif, FLTK, or GNUstep can still be compiled on a recent Linux install (unless they depend on some esoteric former glibc feature). Programs depending on libgnomeui cannot.

 No.12049

>>12048
>So you want every video game older than 4 years to be unplayable?
No, I want every game dev to use an open-source engine like Godot or Ren'Py and use a proprietary license only for their assets.
You know, so that games can actually be maintained indefinitely once they are abandoned by the profit-seeking devs.

Also, not even Windows has that kind of backwards compatibility with software: try playing Win XP games on Win 11.

>Most GNU projects were written by individuals or small teams of developers and are barely being maintained, if at all.

>Many of them still work
If it's not broke, don't fix it

>Backwards compatibility is not a burden in the case of Linux, it is actively sabotaged by the support rentiers at Red Hat.

conspiratard theory
backwards compatibility is always a burden, because you need to maintain an outdated codebase that might become a bottleneck for future development

 No.12050

>>12049
>use a proprietary license only for their assets
*art assets

you know, so that people can actually safely pirate it and not worry about malware

AI generated art is gonna make artists obsolete anyway, or at least the highly skilled ones

 No.12051

File: 1679145268817.jpg ( 27.88 KB , 458x458 , 1670716670245357.jpg )

>>11964
I think the main problem is that there should be a clear delimitation between gnu/linux as a tool (gentoo, nix, guix, etc.) and gnu/linux as a consumer OS

I use gentoo as my daily driver because I like foss and programming (as a hobby) and portage is more or less what I would do if I decided to write a personal package manager. if I also wanted reproducibility, the end result would be very similar to guix. these package managers are just what a programmer would expect, the intuitive approach to the problem so to speak

I have no idea what people who want to use linux as a consumer OS to do office work or play windows games need, but I'm sure their requirements are not the same as mine. I don't want them to modify my tools to accommodate their needs, and the feeling is probably mutual

so that's my take I guess. I had to program on windows at work some years ago and it was terrible. the one size fits most approach is a waste of time

>>12035
>meson
>unnecessarily convoluted build system
you probably use shit like autotools and cmake already. meson is way less convoluted than the alternatives

 No.12061

>>12051
>I think the main problem is that there should be a clear delimitation between gnu/linux as a tool (gentoo, nix, guix, etc.) and gnu/linux as a consumer OS

I think if you do this and appeal to normies by making gimped distros like silverblue to more easily facilitate browsing facebook, you get hordes of screeching uneducated retards like the userbase of /r/linux going WHY DOESN'T THE DOLBY ATMOS FOR MY NETFLIX WORK, WHY DOES THE SETTINGS MENU HAVE SO MANY BUTTONS, IT'S CONFUSING, LINUX IS SHIT!!!

and in response to that you have huge developers like Red Hat pandering to them by writing software like GNOME, which is intentionally gimped in functionality yet crowds out all other alternatives in the ecosystem because it has so much institutional force behind it, plus shit like baked-in DRM, TPM attestation (to enforce the DRM), immutable root (to keep the retards from breaking their distro and whining about it), Wayland, where screen recording and keybindings don't work half the time (because some retard might download malware that would keylog them), ad nauseam.

linux should always remain a tool. if the tool can be made easier to use without compromising on functionality, then fine. but it should always remain a tool.

 No.12062

File: 1679702623951.jpg ( 930.34 KB , 850x1200 , 1679681748776788.jpg )

>>12061
>I think if you do this
do what? keep a delimitation? do you think there should be no difference between gentoo and ubuntu?
from the rest of your message I can see that you are not so retarded as to suggest that. in principle we agree, but it's not like you can stop IBM and Red Hat from doing what they're doing. the best and only realistic alternative is to try to keep the tool separate from the consumer environment - this is what I call delimitation

 No.12064

>>12062
the problem is it's difficult to keep "linux the tool" and "linux the consumption device" segregated so that the latter doesn't crush the former

 No.12066

File: 1679793025543.png ( 474 KB , 1024x1024 , 1678486210341586.png )

>>12064
Isn't that what computers and technology fundamentally are? I guess phones are probably a more extreme example, because computers you can at least program on and create with, but I would say that computers are, to some degree, inherently consumption devices.
