Michael Eriksson's Blog

A Swede in Germany

Posts Tagged ‘Linux’

Linux, undue safety checks, and lack of overrides


A recurring annoyance with Linux is the presence of overly strong safety checks with no true or no obvious workaround, following the general idea that, e.g., “if a file goes missing, a program might malfunction; ergo, a file must never, ever go missing”. (Where a better attitude would be to write programs that behave sanely if and when a file does go missing.)

For instance, earlier today, I found that I could not delete a directory and was met with the spurious claim “Directory not empty”. The directory, however, was very much empty, including that there were no “dot files”.* Even just moving the directory out of the way was impossible. After some searching, I found the explanation: I had at some point had an editor (vim) running in the directory, paused it, and forgotten it. Killing the editor resolved the issue. But here we have at least two problems, namely a misleading error message and the disputable idea that the presence of the editor** should have prevented the deletion. It would be better to remove the file or directory and let the editor (more generally, the program at hand) handle the situation on its own.

*In Unix-like systems and file systems, files beginning with a “.” are given special treatment and are not shown per default by most tools. (Or, at least, most older tools. Newer ones are often ignorant of the convention or deliberately ignore it through forcing some type of one-size-fits-all approach based on MS-Windows behavior.)

**The exact reason is unclear, as I did not investigate further. I speculate that the editor had an open file (handle) in the directory, which lived on as some type of zombie after the file’s deletion, and/or that the editor had this directory as its working directory with a similar result. With hindsight, I should have had a look at the counts in the inode, but I did not think of that until I had already killed the editor. (In particular, there is some minuscule possibility that these counts are corrupted, which could conceivably lead to a similar error message, even absent a program with open file handles and whatnot.)
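As a small illustration of the zombie-handle speculation above (a sketch of the general mechanism, not a reconstruction of the vim incident): a process that keeps a handle open can still read a file whose directory entry is already gone, while the directory itself lists as empty.

```shell
# Sketch: a deleted file lives on while a process holds it open.
dir=$(mktemp -d)
printf 'hello\n' > "$dir/f"
exec 3< "$dir/f"            # keep a read handle open on the file
rm "$dir/f"                 # the directory entry is now gone...
leftover=$(ls "$dir")       # ...so the directory lists as empty
data=$(cat <&3)             # ...but the data is still reachable via fd 3
exec 3<&-
rmdir "$dir"                # and the directory can now be removed
```

On a local file system the rmdir would have succeeded even with the handle open; the “Directory not empty” behavior I saw presumably involved an additional layer on top of this mechanism.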

Similarly, I often have trouble closing encrypted drives with cryptsetup:* Even though the unmount worked** and even though e.g. lsof finds nothing anywhere, I often receive a stubborn error message that X is still in use. (Where “X” refers to whatever entry in /dev/mapper is current.) Even e.g. waiting half an hour, just in case, makes no difference. I have even inspected what processes are running before and after, and there has, so far, never been a difference.*** Advice on the Internet amounts to various tricks to “force” a close by e.g. dmsetup—but this brings no improvement. Worse, in a next step, putting the drive (usually an external one used for backups) to sleep, for a safe removal, with udisksctl does not work either, because I cannot do that without first getting rid of the /dev/mapper entry. Forcing udisksctl does not work either… (In all cases: even when acting as root.) Here both cryptsetup and udisksctl should have true force powers to close, shut down, and whatnot at the discretion of the user, should he be certain that this is what he wants. The current solution is amateurish, counterproductive, and user-hostile.

*Here an explanation might be too lengthy. Those who know the tool will understand; those who do not are out of luck.

**If there was an open file, or similar, this would be the logical point to complain. In fact, I do not see how an open file could have an effect after a successful unmount.

***Which, for instance, rules out some obscure situation like with the first example, although the possibility remains that some already existing process has, e.g., opened a file handle in the wrong place. (But, if so, on its own and without my doing.)

(I have found a very ugly workaround for the udisksctl limitation, namely to go into /sys/devices, find the USB entry with the right idProduct/idVendor, as indicated by lsusb -vv, and then echo a “1” into the “remove” file. Example: “echo 1 >> /sys/devices/pci0000\:00/0000\:00\:14.0/usb2/2-1/remove”. Alternatively, it should be possible to write a small program of one’s own that makes the right underlying system call. Of course, while this implies that I can safely remove my USB drive, it does nothing to reduce the pollution of /dev/mapper with spurious entries.)
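The workaround can be sketched as a small function. The SYSROOT parameter is an invention of this sketch (it allows a dry run against a fake tree); on a real system the tree to walk would be under /sys, and root privileges would be needed before the write to “remove” actually detaches anything.

```shell
# Hedged sketch: walk a sysfs-like tree, match idVendor/idProduct,
# and write "1" into the matching device's "remove" file.
remove_usb() { # usage: remove_usb SYSROOT idVendor idProduct
  root=$1 vend=$2 prod=$3
  for d in "$root"/*; do
    [ -f "$d/idVendor" ] || continue
    [ "$(cat "$d/idVendor")"  = "$vend" ] || continue
    [ "$(cat "$d/idProduct")" = "$prod" ] || continue
    echo 1 > "$d/remove"   # on real hardware this detaches the device
    echo "removed $d"
  done
}

# Dry run against a fake tree:
fake=$(mktemp -d)
mkdir "$fake/2-1"
printf '1234\n' > "$fake/2-1/idVendor"
printf 'abcd\n' > "$fake/2-1/idProduct"
: > "$fake/2-1/remove"
out=$(remove_usb "$fake" 1234 abcd)
```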


Written by michaeleriksson

November 20, 2022 at 8:09 pm

Further misadventures


Warning: The following serves mostly as stress/tension release. With one thing and another, the Nazi series will likely see an interruption until late next week.

After my long period of problems (cf. earlier texts), things were really beginning to look up again, a day here-and-there with construction works notwithstanding. True, I had partially bought this improvement through slacking off, neglecting my writing and reading, letting the mail mount up again, and not, for now, getting to the bottom of a few outrages (notably, the inexcusable behavior of building management and the local chimney sweep, where I have put off very thorough complaints for over a year)*.

*If and when I get around to them, I might also write a few blog entries on the topic. Chances are that you will not believe me, because the situations are so utterly absurd. (And those complaints are on a very different level from the ones in this text.)

Still, things were looking up, I was beginning to get my energy back, writing was beginning to look good again, I was beginning to read heavier material, and I had the new project of researching emigration from Germany (which by now borders on being a Leftist dictatorship).

Then the screen of my newish computer just dies…

(Fortunately, just a few hours after the latest full backup. I have hopes that the issue will be repairable, as it might simply be a loose contact somewhere, but there is no guarantee, notebook repairs are often disproportionately expensive relative to the original price, and there is no telling how long this might take.)

This, then, amounts to less than four months of use, while its predecessor might have worked for four years.

The next day, yesterday, I went to the local Mediamarkt to look for replacements*. Again, a bit of good luck among the bad—had this happened a little earlier, Mediamarkt might have been closed or off limits due to Covid restrictions.

*As shown both now and around New Year’s, notebooks are highly troublesome when something goes wrong, as the user has to start almost from scratch and he can be restricted in his work for days. With a desktop, I could usually just buy a new one and spend five minutes switching hard drives, while a mere monitor issue could be solved by just replacing the monitor. (Yes, notebook hard drives can also be replaced, but they are much trickier to access, might differ too much in size to fit in another notebook, and the driver situation can be trickier, which makes for more work post-replacement.)

I walked, as I always do, the few kilometers, but found myself more tired and lacking in energy than I would have expected from such a distance. Too much time indoors, due to a mixture of COVID restrictions, low temperatures, and (during last summer) prolonged bronchitis, has really damaged my fitness. (And then we have the question what this might imply for the future. If I fail to compensate through that much harder work, it might very well cost me a few years of my life when I am in my eighties.)

I walked around Mediamarkt, looking for suitable specimens, beginning, close to the entry of the second floor, with a set of marked-down-due-to-damage computers. Marked down? Maybe, but, apart from the Chromebooks, they were still more expensive than I cared for, and I had the slight fear that some mixture of stampeding inflation, bottlenecks for various chips and whatnots, and market segmentation* would make the affair far more expensive than intended.**

*E.g. in that only Chromebooks and various Android devices can be had cheaply, while a “grown up” computer goes for massively more. Chromebooks et al, however, are not suitable for my current purposes.

**A few years back, I wrote about an atypical lack of progress or even regression in terms of bang-for-buck when it comes to computers. At that time, the decades-long trend towards ever more bang for the buck was temporarily broken. A reason for this might have been the vastly increased demand for smartphone components.

I walked over to where the regular items were found, easy to spot and well displayed—and so expensive that I could not believe my eyes. The cheapest (!) went for 900-something Euro, while the median might have been in excess of 1200.* As a comparison, my first notebook, bought in, maybe, 2000, went for less than 3000 DM or around 1500 Euro. (Yes, this was a low-end specimen and there have been more than twenty years of inflation—but there have also been more than twenty years of technological progress.)

*Reservation: I go by memory and did not take exact mental notes. The general idea holds, even should I have the details wrong.

Then, the day seemed saved: in a much less visible aisle, I found a handful of notebooks at much more moderate prices, of which I picked two (of different models) for a total of less than 900 Euro. These, while low-end, were even of better bang-for-buck than the last time around.*

*Which points to the broken trend having resumed in the interim. Imagine my relief.

Sadly, the year is 2022 and the notebooks still all came with a useless and price-increasing Windows installation. Again: 2022—not 2002. This shit should be long behind us.

I went back home on foot and, having a bad conscience about my fallen endurance, picked a road a little longer and much hillier. (Wuppertal has no end of hills, if one picks the right or, depending on perspective, wrong path.) The result: For the first time in years, even with my building in sight, I had to stop to get my breath back—and the last time around I had a load both heavier and more awkward to carry.* Halfway up to the third floor, I had to halt again—also for the first time since that heavier-and-more-awkward load. Once in my apartment, I put my notebooks down, kicked off my shoes, dropped my jacket to the floor, too tired to hang it up, and then lay down on the floor, myself, where I spent several minutes. Now, I am not saying that this day would have been an outright picnic in the past, but… Two years ago, I would neither have had to stop, nor would I have found myself on the floor afterwards—and I suspect that I would have held a higher average tempo.

*The sum of bag and notebooks might have been around 5 kg, maybe less. (Weight is an area where there really has been progress.) To my very vague recollection, the prior event, involving furniture, might have been at 16 kg, but, in all fairness, over a shorter and flatter course.

After a brief excursion, at snail’s-pace-by-my-standards, to buy food, I spent most of the remainder of the day feeling really lousy, as I do after an overexertion. The intended high point of the day was a Tex-Mex pizza from the local store (one of my favorite dishes). I put it off until the evening—delayed gratification and all that. Twenty-five minutes in the oven, as I like the pizza crispy and firm, and it should have been good to go. But no. I tried to fish it out onto a plate with a fork, as I always do and as has never failed me in the past. This time, the far-from-firm dough split around the fork tines and the pizza landed in a heap on the oven lid. I tried to grab the heap with the fork and a hand, and the fork just went through it again, leaving nothing that could reasonably be eaten.*

*I do not know what the problem was. A possibility is that I had not turned one of the knobs far enough, but, if so, I should have noticed it when I turned off the oven—as I always have on the very few prior occasions when a knob has been short of the mark.

Today, I began the installation of Linux (Gentoo). Here things grew tiresome again. For starters, I had, around New Year’s, downloaded the installation manual* to an e-reader—which should be perfect right now. But no. When I opened the document, the font was on the small side, so I picked a larger one. The result: The reader locked up in “hour-glass mode” for so long that it went into power-save** mode before the document had reloaded. Once done, simply going from page 2 to page 3 caused another massive delay, after which the power-save mode was reactivated. After turning the thing on again, I was still on page 2… After several repetitions, I tried to go back in size, as things had worked to begin with. The reader worked for half an eternity, went into power-save mode—and was, surprise, still using the larger font afterwards. Several repetitions brought no improvement.

*Note that Gentoo is a distribution for somewhat more proficient users, and that there is a lot more manual work and own decisions to make than with e.g. Debian.

**Due to the minutes of waiting. The battery, to avoid misunderstandings, was fully charged.

I gave up and began the installation on the first notebook. Just as I recalled, the installation medium did not contain the installation instructions (a bizarre choice, especially considering how little space would be needed), and the central “man” command for displaying other documentation was equally missing. Fortunately, “cryptsetup” was present and I could mount my (encrypted) backup drive, where I, among other needed things, did have a copy of the installation guide. From here on, things went much smoother than around New Year’s; in part, because I had some experience; in part, because I could forego a number of steps and just populate most of the hard drive from the backup drive. However, there were still quite a few curses, due to the incompatibilities of the defaults in the installation shell and my own ingrained-in-my-fingers preferences. Some obscure errors held me back for a while, because I had forgotten to manually add a “/tmp” directory (which I do not backup), including that tmux refused to start.*

*And why is it so hard to give decent error messages? Pretty much the first rule of writing error messages is to indicate what object caused the problem. It should never be e.g. “File not found!”, but “File XYZ not found!”. Ditto, never “Access denied!” but “Access to XYZ denied!” (unless the object is obvious from the interaction).
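The rule above fits in two lines of shell; the function name and message wording are merely illustrative:

```shell
# Always name the object in the error message, not just the failure.
require_file() {
  [ -e "$1" ] || { echo "Error: file '$1' not found" >&2; return 1; }
}

msg=$(require_file /no/such/file 2>&1) || :
```

A user (or log reader) seeing the resulting message immediately knows which file to look for, instead of having to guess among all files the program might have touched.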

Still, apart from some issue with the sound,* I had a working computer and a working Internet much, much faster than last time around—and I decided to carry on with the second, too, today. (Originally intended for tomorrow.)

*A missing driver, likely. I will look into that later. The device is found by “lspci” but is not in e.g. the “/dev” tree.

But no. The second notebook definitely has a functioning hard drive, as it managed to boot into the pre-installed Windows when I was a little slow with entering the BIOS. However, when I tried the Gentoo installation, this hard drive simply could not be found. There is not even a “/dev” entry. If I have the energy, I will troubleshoot tomorrow, but in a worst case, I might have to replace the boot image for the installation. Absurd. Again, the year is 2022 and interfaces should be sufficiently standardized that something like this simply cannot happen.

(I also have some misgivings about the keyboard layout. As I noticed during my brief experiments, a few keys had been moved out of position in a manner that could be extremely annoying to the touch typist. Another first rule, and another one all too often violated—if you design keyboards, keep the touch typist in mind, not just the hunt-and-peck typist.)

Written by michaeleriksson

April 23, 2022 at 9:52 pm

Odd usability decisions and rsync


One of the most popular tools among e.g. system administrators is rsync, which allows efficient and flexible synchronization of files between different directories—even when located on different servers.

However, every second time that I use it, I feel like tearing my hair in frustration:

For some reason, the makers of rsync decided to implement something better governed by flags through obscure and unintuitive “directory semantics” (for want of a better word), and the behavior of rsync varies depending on whether a source and/or destination directory has a trailing directory separator*. Moreover, this behavior is incompatible with almost any other tool, including the Unix command cp, for which it is a natural replacement.** Indeed, I would go as far as calling it a “best practice” to normalize directory inputs with a directory separator to exclude*** it before further processing, in order to ensure that both cases are handled identically and to avoid programming errors through assuming that a directory separator has to be added (or removed) at some later stage, e.g. when specifying the name of a new sub-directory to be created. Of course, here we have another reason why rsync’s behavior is unfortunate: in a programmatic context, a normalized directory could lead to a very different behavior from the intended one—as could a minor slip of the keyboard.

*In the Unix world, a slash, i.e. “/”. I have not investigated the behavior on other systems, including whether rsync is tied to the slash or the local directory separator, but I go with the more generic term for now.

**The cp command CoPies files and directories. In most cases, it is perfectly good at this, but rsync can be a superior choice for the same task in certain circumstances. Consider, e.g., copying between two servers over an imperfect network connection. If the connection fails during a use of cp, one can either start over from scratch or spend time on a manual clean-up, and even then a partially transmitted file has to be re-transmitted from scratch. With rsync, the command can be repeated and the transfer will automatically be resumed with little overhead. (Interestingly, rsync can be used to save an interrupted cp, but then why not use rsync to begin with?)

***Why “exclude”? In part, through convention; in part, because the directory separator is not a part of the name of the directory, and it makes little sense to keep it at the end of a directory, even when given through a full path, when there is no further sub-directory that it could separate.

Specifically, I am caught time and again by the trailing directory separator of the source directory leading to a different treatment of the destination directory.* If a trailing directory separator is present, the files of the input directory are put directly into the output directory; if it is absent, they are put into a sub-directory** with the same name as the input directory. Not only is this very easy to forget, and not only is this highly counter-intuitive, but the standard file-name completion of e.g. Bash automatically adds a trailing slash when it expands a directory name, implying that the user who has used completion to generate the name has to explicitly think about removing that slash (should it not be wanted in conjunction with rsync—in almost any other context it will be either wanted or irrelevant).

*At least in terms of manifestation. Conceptually, it might possibly be argued that the source directory is treated differently. Cf. the rsync–cp comparison that follows.

**Created, should it not already be present. I suspect that the original motivation for these special behaviors related to the complication that such a sub-directory could or could not already be present.

For comparison: “cp -r x y”, “cp -r x/ y”, “cp -r x y/”, and “cp -r x/ y/” all do the same thing—they copy the directory x to the directory y, where there will be a new sub-directory with the appropriate name. In contrast, “cp -r x/* y” (or “cp x/* y/”; in both cases, note the asterisk, which here does not point to a footnote) copies the individual files and sub-directories present in x to y.* An “rsync -r x y” does the same** as the first four cp commands; “rsync -r x/ y” does the same** as the fifth (and sixth).

*Excepting “hidden files”, as the “*” is expanded thus by Bash and shells in the same family. Other shells might have a different behavior. Writing this footnote, I suspect that this could be another clue to the origins of rsync’s idiosyncratic behavior—a poorly thought-through attempt to reduce the dependency on the shell (or scripting language) used.

**With reservations for details, e.g. that cp might give an error and/or ask for a user decision when it tries to copy something which already exists or that, cf. above, cp-with-Bash is more restrictive in terms of hidden files.
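The cp side of the comparison is easy to demonstrate in a scratch directory (this sketch uses only cp, so it can be tried without rsync installed):

```shell
# Demo: for cp, the trailing slash is irrelevant; only the asterisk
# switches between "copy the directory" and "copy its contents".
work=$(mktemp -d)
cd "$work"
mkdir x y1 y2
echo hi > x/f
cp -r x  y1     # creates y1/x/f: the directory itself is copied
cp -r x/ y1     # same result: cp ignores the trailing slash
cp -r x/* y2    # creates y2/f: only the contents are copied
```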

Pure insanity.

How to do it better? Well, one option would be to just have a flag that indicates whether the input directory, itself, should be copied or just its contents—while any trailing slashes are entirely ignored.
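Such a flag-based interface can be sketched as a thin wrapper: trailing slashes are stripped unconditionally, and an explicit flag decides the behavior. The wrapper name and the “--contents” flag are inventions of this sketch, and it only prints the rsync command it would run, so the mapping is visible without rsync installed:

```shell
# Sketch: slash-insensitive rsync wrapper with an explicit flag.
rsync_sane() {
  contents=no
  [ "$1" = "--contents" ] && { contents=yes; shift; }
  src=${1%/} dst=${2%/}              # normalize away trailing slashes
  if [ "$contents" = yes ]; then
    echo rsync -avz "$src/" "$dst"   # sync only the contents
  else
    echo rsync -avz "$src" "$dst"    # sync the directory itself
  fi
}

plain=$(rsync_sane x/ y/)
conts=$(rsync_sane --contents x/ y/)
```

With this interface, a slash added by Bash completion (or a normalization step in a script) can no longer silently flip the semantics.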

Excursion on (and reservation for) flags:
The behavior of these commands can vary considerably depending on what flags are given. The rsync “r” flag is roughly equivalent to the cp one, according to the documentation, and I use it for consistency between examples. In practical use, I almost always call rsync with “avz”, of which the “a” includes the full effect of “r”. I have cp aliased to “cp -i”, which increases the “interactiveness”, in case of name collisions, over the “vanilla” cp. (Similarly, I have mv aliased to “mv -i”.)

Written by michaeleriksson

May 31, 2020 at 2:40 pm


Problems with adduser


Doing some light administration, I stumbled upon a few idiocies in the Linux tool “adduser”* and the associated config-file /etc/adduser.conf on my Debian** system.

*Used, unsurprisingly, to add new users to the system.

**Who is to blame for what is not clear. As I have grown increasingly aware over the last few years, the Debian developers make quite a few poor decisions and changes to various tools and defaults that do not correspond to the wishes of the original tool-makers or that violate common sense and/or established “best practices”.

  1. This tool is the source of the “skeleton” files* added during user creation (well, I knew that already) and contains a config variable for the corresponding location (SKEL)—but provides no obvious way of turning this idiocy off entirely. (Although a secondary config variable for blacklisting some files, SKEL_IGNORE_REGEX, can probably be (ab-)used for this purpose; the better way is likely to just keep the directory empty.)

    *Presumably so called because they provide a metaphorical skeleton for the user’s home directory. Note that there are other mechanisms that create unwanted contents in home directories. (One example is discussed in an earlier post.)

    Why is this an idiocy? Well, while there might be some acceptable use case for this, the typical use is to fill the respective user’s home directory with certain default configurations. This, however, simply does not make sense: Configuration settings should either be common, in which case they belong in global files, not in individual per-user copies; or they should be individual, and then the individual user should make the corresponding settings manually. In neither case does it make sense to copy files automatically.

    Indeed, in the former case, the users and administrators are put out, because (a) the skeleton files must be kept synchronized with the global configuration files (to boot, in an unexpected and counter-intuitive manner), and (b) the users get a snapshot of the configuration—the configuration as it was at the time of user creation, without any changes that were made later.

    In the latter case, a user who wants his own config file can simply copy the global configuration file manually—should he want it at all.*

    *Experienced users tend not to want anyone else’s preferences in their config files, and have often made too extensive changes or are set on a certain set of values they have used for years, implying that they will likely not want the global files to begin with.

    Now, one comparatively rare case where skeleton files could make sense, is when setting up several sets of users that have different characteristics (e.g. administrators, Java developers, C developers)—but that will not work with this mechanism, because the skeleton files are common for all users. In order to get this to work, one would have to provide an entire new config file or play around with command-line settings—and then it is easier to just create the users without skeleton files and then copy the right files to the right users in a secondary step.

    As an aside, I do not recommend the sometime strategy of having a user-specific config file call a global config file to include settings from there (as might have been a workaround in the case of skeleton files): This tends to lead to confusion and unexpected effects, especially when the same user-specific config file is used on several systems (e.g. a work and a home computer), when global config changes, or when something is done in an inconsistent order. Instead, I recommend the individual user to either use only the global file or only his own.
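    In config terms, the two routes mentioned above might look as follows (a sketch against Debian’s /etc/adduser.conf; the regex value is the speculative (ab)use, not a documented off switch):

```shell
# /etc/adduser.conf — two sketched ways to neutralize skeleton copying:
SKEL=/etc/skel              # keep this directory empty (the simple way)
SKEL_IGNORE_REGEX=".*"      # or (ab)use the blacklist to skip every file
```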

  2. When the home directory is created, the typical access-control defaults (“umask”) are ignored* in favor of the config variable DIR_MODE—and this variable has the idiotic and inexcusable value 0755**. In other words, home directories are created in such a manner that everyone can read*** the directory per default. It is true that this will not give the rights to read the contents of files****, but being able to see file names can be bad enough in itself: Consider e.g. if the wrong person sees names like “resignation_letter”, “proposal_plan”, “porn”, “how to make a bomb”, …

    *Such duplication of responsibility makes it harder to keep security tight, especially since the admins simply cannot know about all such loopholes and complications—if in doubt because they change over time or can vary from Unix-like system to Unix-like system.

    **The best default value is 0700, i.e. “only the owner can read”; in some cases, 0750, i.e. “owner and group members can read”, might be an acceptable alternative.

    ***To be more specific, list the directory contents and navigate the directory. (Or something very close to this: The semantic of these values with regard to directories are a bit confusing.)

    ****Files (and sub-directories) have their own access rights that do respect the value of the umask (at their respective creation).
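    The mismatch is easy to demonstrate: under a restrictive umask, an ordinary mkdir produces exactly the owner-only mode that adduser’s DIR_MODE default bypasses (the fix being DIR_MODE=0700 in /etc/adduser.conf):

```shell
# Demo: a directory created under umask 077 gets mode 700,
# i.e. owner-only access, with no separate setting to remember.
work=$(mktemp -d)
cd "$work"
umask 077
mkdir via_umask
mode=$(stat -c %a via_umask)   # 700, as the umask dictates
```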

  3. The format of a user name is subject to restrictions from the configuration variable NAME_REGEX. Unfortunately, this variable only appears to add restrictions. Quoting the documentation:

    adduser and addgroup enforce conformity to IEEE Std 1003.1-2001, which allows only the following characters to appear in group and user names: letters, digits, underscores, periods, at signs (@) and dashes. The name may not start with a dash. The “$” sign is allowed at the end of usernames (to conform to samba).

    An additional check can be adjusted via the configuration parameter NAME_REGEX to enforce a local policy.

    This is unacceptable: Unix-like systems typically accept almost any character in a username, and what name schemes or restrictions are applied should be a local decision—not that of some toolmaker.

    For reasons of interoperability through a “least common divisor”, it makes great sense to apply some set of restrictions per default; however, these restrictions must be overrideable and should have been integrated in NAME_REGEX (or a second, parallel variable).

    As an aside, I am quite surprised that “@” is allowed per default, seeing that this character is often used to connect a user with e.g. a domain or server name (as with email addresses). When the user name, itself, can contain an “@”, it becomes impossible to tell for certain whether “X@Y.Z” is a user name (“X@Y.Z”) or whether it is a user name (“X”) combined with a server or domain (“Y.Z”). In the spirit of the aforementioned “least common divisor”, I would not only have expected this to be forbidden—but to be one of the first and most obvious things to be forbidden. (I would speculate that there is some legacy issue that requires that “@” remain allowed.)
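    The quoted default restrictions translate into a one-line regex, sketched here as a grep check (the function name is an invention of this sketch): letters, digits, underscores, periods, “@” and dashes, no leading dash, and an optional trailing “$”.

```shell
# Sketch: validate a user name against the quoted default rules.
valid_name() {
  printf '%s' "$1" | grep -Eq '^[A-Za-z0-9_.@][A-Za-z0-9_.@-]*\$?$'
}

valid_name alice   && ok1=yes || ok1=no   # plain name: accepted
valid_name -bad    && ok2=yes || ok2=no   # leading dash: rejected
valid_name 'smb$'  && ok3=yes || ok3=no   # samba-style trailing $: accepted
valid_name 'X@Y.Z' && ok4=yes || ok4=no   # "@" accepted, despite the ambiguity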

Written by michaeleriksson

July 3, 2018 at 4:44 am

XDG, lack of respect for users, and bad design


Normally and historically, using Linux has been much more pleasant than using MS-Windows, at least for those who stay away from KDE, Gnome, and co. Unfortunately, there has long been a negative trend towards worse usability, greater disregard for the user’s right to control his own computer, etc. (See e.g. [1] or [2] for similar discussions.)

Today, after doing some experiments with various X* and WM setups, I found something utterly inexcusable, a truly Microsoftian disregard for the user’s interests:

*By which I refer to the window system named “X”—unlike many other instances where I have used it as a placeholder.

Somehow, a tool xdg-user-dir had been activated for the first time during my experiments (whether this tool was installed per default or had snuck its way in during my experiments, I do not know)—and promptly created a slew of directories in one of my user accounts. There was no inquiry whether I wanted them created; they were just silently created. To boot, they were exactly the type of directories that I do not want and that I have always deliberately deleted when they were present on installation: Random directories like “Desktop”, “Pictures”, “Music” bring me no value* whatsoever and do not in any way, shape, or form match my computer use—starting with the fact that I use different user accounts for different tasks and order my files and whatnot according to entirely different criteria**. They clutter my file system—nothing more, nothing less.

*Note that, unlike with MS Windows, these are not necessary for the OS not to throw a fit. (Although it is conceivable that one of the worse desktop environments would—but then I do not use these!)

**The exact criteria vary, but the most common is “by topic”, rather than “by type”. For instance, a video and an eBook on the same topic would land in the same directory; two eBooks on different topics would land in different directories.

Having a look at the documentation, the functioning of this tool appears to be fundamentally flawed, even apart from its presumptuousness: In my reading, the directories would simply have been created again, and again, and again, had I just deleted them. This too is inexcusable—a manually deleted directory should stay deleted unless there are very strong reasons speaking to the contrary*/**. Now, I have the computer knowledge and sufficient drive that I actually could research and solve this issue, through finding out what idiotic program had added the directories and make sure that a repetition did not take place (and do so before my other user accounts were similarly molested)—but this is still a chunk of my time that I lost for absolutely no good reason, just because some idiot somewhere decided that he knew better than I did what directories I should want. For many others, this would not have been an option—they would have deleted the directories, seen them recreated a while later, deleted them again, …, until they either gave up with a polluted user account or screamed in frustration.

*Such reasons would typically involve a directory or file that is needed for practical operations while not being intrusive to the user (almost always implying a “dot” or “hidden” file). Even here, however, querying the user for permission will often be the right thing to do. To boot, it is rarely the case that an application actually needs to create anything in the user account, when not explicitly told to do so (e.g. through a “save” action by the user). Off the top of my head, I can only think of history files. For instance, creating a config file is only needed when the user actually changes something from the default config (and if he wants his config persistent, he should expect a file creation); before that it might be a convenience for the application, but nothing more. Temporary files should be created in the temp-directory, which is in a central place (e.g. /tmp) per default (and should the user have changed it, there is an obvious implicit consent to creation). Caching is a nice-to-have and would also often take place in a central location. Indeed, caching is an area where I would call for great caution and user consent both because of the potential privacy issues involved and because caching can take up quite a bit of storage space that the user might not be aware of. Finally, should the user wish to save something, it is up to him where to save it—he must not be restricted to an application specific directory. All-in-all, most cases where an application “needs” a directory or file will be pseudo-needs and signs of poor design.

**Looking specifically at the directories from above, I note that they are not hidden—on the contrary, they are extremely visible. Further, that they almost certainly are intended for one or both of two purposes: Firstly, to prescribe the user where he should put his this-and-that, something which is entirely unacceptable and amateurish—this is, remains, and must be the user’s own decision. Secondly, to ensure that various applications can rely on these directories being present, which is also entirely unacceptable and amateurish: Applications should not make such assumptions and should be capable of doing their job irrespective of the presence of such directories. On the outside, they can inquire whether a missing directory should be created—and if the offer is turned down, the applications still need to work. If they, unwisely, rely on the existence of, say, a picture directory at all, it should also be something configurable. They must not assume that it is called “Pictures”, because the user might prefer to use “Images” and already have other tools using that directory; similarly, they must not assume that the directory, irrespective of name, is in a given position in the file system, because the user might prefer e.g. “~/Media/Pictures” over “~/Pictures”; he might even have put all his pictures on a USB-drive or a server, potentially resulting in entirely different paths.
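A graceful lookup of the kind called for here takes only a few lines; note that PICTURES_DIR below is a variable made up for this illustration, not an existing standard:

```shell
# Resolve a pictures directory without assuming that ~/Pictures exists.
# PICTURES_DIR is a hypothetical override variable, purely illustrative.
pictures_dir() {
    if [ -n "${PICTURES_DIR:-}" ]; then
        printf '%s\n' "$PICTURES_DIR"     # an explicit user preference wins
    elif [ -d "$HOME/Pictures" ]; then
        printf '%s\n' "$HOME/Pictures"    # conventional default, if it exists
    fi
    # otherwise: print nothing -- and, crucially, create nothing
}

pictures_dir
```

An application built on such a helper keeps working when the lookup yields nothing; it simply does without a pictures directory, instead of manufacturing one behind the user's back.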

Looking up XDG on the web gives a negative impression. For instance, the Wikipedia page has a list of hosted packages/projects, some of which are among my least favorite, because they are e.g. redundant, do more harm than good, or replace superior solutions, including D-Bus, PulseAudio, systemd, and Poppler*. To boot, they often violate the design principles that used to make Unix and its derivatives great. Some others leave me ambivalent, notably Wayland: Even assuming that X needs replacement or improvement**, Wayland seems to be the wrong way to go through reducing both flexibility and separation of concerns.***

*At least with regard to how it has screwed up xpdf.

**It quite probably does, about three decades (!) after the last major specification release. However, in my own use, I cannot think of anything major that bothers me.

***With the reservation that I have not read anything on Wayland in years and am not aware of the latest state: Because I am not a Ubuntu user, I have not been forced to a practical exposure.

Looking at the “Stated aims” further down the page, most are good or neutral (with some reservations for interpretation); however, “Promote X desktops and X desktop standards to application authors, both commercial and volunteer;” is horribly wrong: Promoting such standards to desktop authors is OK, but not to application authors. Doing the latter leads to unnecessary dependencies, creates negative side-effects (like the unwanted directories above), and risks us landing in a situation where the system might need a desktop to function—not just a window manager or just a terminal.

For instance, I have now tried to uninstall everything with “xdg” in its name. This has left me with “xdg-utils” irremovable: If I do uninstall it, a number of other packages will go with it, including various TeX and LaTeX tools that have nothing to do with a desktop. In fact, there is a fair chance that they are all strictly command-line tools…

I have also searched for instances of “freedesktop” (a newer name, cf. Wikipedia). Trying to uninstall “hicolor-icon-theme” (admittedly not something likely to be a source of problems) leads to a request to also uninstall several dozen packages, many (all?) of which should reasonably still work without an external icon theme. By all means, if icons can be re-used between applications, try to do so; however, there must be sufficient basic “iconity” present for a good program to work anyway—or the programs must work sufficiently well without icons to begin with. Indeed, several of these are either command-line tools (that should not rely on icons in the first place) or make no or only minimal use of icons (e.g. pqiv and uzbl).

Worse, chances are that a considerable portion of these tools only have an indirect dependency: They do not necessarily need “hicolor-icon-theme” (or “xdg-utils”). Instead they rely on something else, e.g. a library that has this dependency. Here we can clearly see the danger of having too many dependencies (and writing too large libraries)—tools that do not need a certain functionality, library, whatnot, still need to have it installed in order to function. This leads to system bloat, greater security risks, and quite possibly diminished performance. Unfortunately, for every year that goes by, this problem appears to grow worse and worse.
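Whether a dependency is direct or merely inherited through a library can be checked with standard Debian tooling; a sketch (a Debian-style system with apt is assumed, and the guard keeps it harmless elsewhere):

```shell
# Inspect where a stubborn dependency comes from (Debian-style system
# with apt assumed; the guard keeps the sketch harmless elsewhere).
if command -v apt-cache >/dev/null 2>&1; then
    # Which installed packages depend directly on the icon theme?
    apt-cache rdepends --installed hicolor-icon-theme 2>/dev/null \
        || echo "hicolor-icon-theme is not known to apt on this system"
else
    echo "apt-cache not available; nothing to inspect"
fi

# Equally illustrative, to be run manually:
#   apt-get remove --simulate hicolor-icon-theme   # what would be dragged along?
#   aptitude why hicolor-icon-theme                # how did it get installed at all?
```

Packages that show up only via `aptitude why`, through some intermediate library, are exactly the indirect dependencies complained about above.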

Forcing XDG functionality and whatnot into applications that do not actually need them is a bad thing.

(Similarly, a great deal of my skepticism against D-Bus arises from the fact that it is “needed” by every second application, but is rarely actually used for something sensible. In the vast majority of cases, the application would have been just as good without D-Bus, but it still has a dependency and it still presumes to start an unwanted D-Bus at will—which it typically does not clean up afterwards…)

Written by michaeleriksson

May 8, 2018 at 1:31 pm

Follow-up: Linux vs. GNU/Linux

with 6 comments

In light of a lengthy reply by a user codeinfig to an earlier post on the issue of “Linux” vs. “GNU/Linux”, I revisit this topic.

This in two parts: An extension of the original discussion (partially driven by the reply, but mostly held abstract) and a more specific rebuttal of said reply (formulated in terms of a direct answer).

General discussion:

  1. At the time of my original post, I actually was not aware of the amount of controversy that surrounded this issue, mostly seeing Stallman’s position as an example of flawed thinking, and my intention was (like in much of my previous writings) to point to such flaws. (Possibly, because commercial Unixes dominated my experiences of the Unix-like OSes until the early new millennium.)

    With hindsight, it was highly naive of me to not expect the topic to be “hotter”: This is the Internet, and more or less any question that could cause controversy will be discussed/argued/flame-warred at length—even be it something so trivial-seeming as a name. To boot, this issue appears to be almost as old as Linux, giving it plenty of time to have been discussed.

  2. I stress again that I do not claim that “Linux” is an appropriate name when we do not speak of the kernel (cf. previous statements). However, “GNU/Linux” does not solve the involved problem. On the contrary, it compounds it further, because arguments against using “Linux” are the stronger against “GNU/Linux”. (However, this might very well have been different in the 1990s.)
  3. “GNU”, on its own, has at least three different meanings: The OS envisioned by Stallman, the GNU project, and the GNU set of tools/programs. Of these, I would consider the first the least relevant today, because this OS has simply not materialized in its full intended form, even after several decades, and I honestly cannot recall when I last heard that meaning used prior to this discussion. Even as early as 1994, when I started college and made my first contacts with Unix (specifically SunOS), the tools were arguably more important than the OS: The default setup after login consisted of one instance of Bash (running in an Xterm) and one instance of GNU Emacs; the main computer work for the first semester consisted of writing and executing LISP programs using Emacs—even under a commercial Unix version, with its own set of pre-existing editors, shells, whatnots, GNU tools had been preferred. An alternate editor and semi-imitation of Emacs, MG, was typically referred to as “Micro-GNU”*, showing the relative importance of Emacs within the GNU sphere at the time.

    *The actual name was “Micro-GNU-Emacs”, with the intended focus on “Emacs”, with “GNU” only serving to avoid confusion with other (less popular) variations of Emacs. (A distinction that hardly anyone bothers with today, “Emacs” being used quasi-synonymously with “GNU Emacs”, just like “Windows” usually contains an unspoken “Microsoft”.) However, so dominant was Emacs in the perception of GNU that most people shortened the wrong component of the name…

    But, by all means, let us go with the OS-meaning*: We say “GNU” and mean an OS. Even now, however, the use by codeinfig does not make sense. He appears to use “GNU” (resp. “gnu”) as an all-encompassing term for the OS in a very wide sense** or even the whole system, effectively saying that not only is e.g. a Debian system “GNU/Linux”—it is actually “GNU”… This goes towards the absurd, because even when we speak of “GNU” as an OS, the possible interpretations are 1) the whole original vision by Stallman, i.e. “GNU/HURD” and 2) just the “GNU” part of “GNU/HURD” (resp. “GNU/kernel-of-your-choice”). If we take the first, Debian is only even a GNU candidate when the HURD kernel is used (which will not be the case when we have a Linux-version of Debian) and speaking of just “GNU” in the manner of codeinfig is clearly wrong; even speaking of “GNU/Linux” would be clearly wrong a priori. If we take the second, “GNU/Linux” would still be conceivable (before looking at other aspects of the issue), but the equation GNU = GNU/Linux would be obviously incorrect.

    *For simplicity of discussion I will try to stick to this meaning in the rest of the text, where the difference between the three matter and where the right choice is not clear from context. Note that earlier references do not necessarily do so.

    **An annoying problem in this situation is that it is very hard to define the border between OS and application in such a manner that everyone is happy and all circumstances are covered. A more fine-grained terminology would be beneficial, just like dividing the year into winter and summer would be simplistic. (However, this is secondary to the current discussion.)

  4. If we do use the OS meaning, then, yes, I would consider GNU mostly irrelevant today. It is of historical importance and it might very well grow important again, but today it is dwarfed by Linux, various BSD variants (arguably including MacOS), and possibly even the likes of OpenSolaris and its derivatives. And, no, this OS is not what e.g. I have running right now.

    On the other hand, the GNU tools/programs and the GNU project are highly relevant and immensely valuable to the world of non-commercial/-proprietary computing.

  5. GNU/Linux systems are certainly conceivable: Take GNU (in the sense of an OS without the kernel) and add Linux as a kernel. Such systems might even be present today. A typical Linux-kerneled* distribution, however, is simply not an example of this.

    *See what I did there!

  6. Some seem to think that “because system A uses GNU components it is GNU” or “[…] it should use GNU in its name”. This line of reasoning does not hold up: It is simply not practical to mention every aspect of a system (be it in IT, Formula One racing, or house building), and GNU does not today play so large a part that it warrants special treatment over all other aspects, including e.g. the X server and associated tools or the desktop. Again, this might have been different in the 1990s, but today is today. Cf. my first post.

    Notably, any even semi-typical Linux-kerneled system of today runs a great variety of software from a number of sources, and limitations in naming like “Linux”, “GNU/Linux”, whatnot, simply make little sense. Let a user name his five or ten most used “regular” applications and his desktop or window manager (depending on what is central to him), and we know something about his system, his needs, and his user experience. For most users, the rest is just an invisible implementation detail. Hell, many use even the command line only as a last resort… (To their own major loss.)

    That GNU possibly was the first major attempt at a free or open-source OS is not relevant either. Consider by analogy Project Gutenberg: Its founder claims* to be the first to think of the concept of eBooks: Should any party dealing with eBooks be forced to include “Gutenberg” in its name, resulting e.g. in the “Gutenberg/Tinder” reader? Or should ordinary book publishers be forced to refer to the original Gutenberg, for using printing presses? No—both notions are absurd. They might deserve to be honored for early accomplishments and, certainly, someone might choose to voluntarily name something in honor (as Project Gutenberg did with the original Gutenberg)—but no obligation can conceivably be present.

    *I very much doubt that this is true. Yes, his idea goes back to, IIRC, the early 1970s or late 1960s, but even back then it cannot have been something entirely unthought of, be it as a vision for the (real) future or as sci-fi. Vannevar Bush published ideas several decades earlier that at least go somewhat in the same direction.

  7. Some arguments appear to go back to a variation of moral superiority, as with Stallman’s arguments (also linked to in my original post) or codeinfig’s below. Notably: Linux is not free (in the sense of free software etc.) enough/does not prioritize freeness enough; ergo, GNU is morally superior and should be given precedence. This too is a complete non sequitur that would lead to absurd consequences, especially because the different parties have different priorities for deliberate reasons.

    Someone who does share GNU’s priorities might, for all I care, choose to voluntarily include GNU as a part of the name of this-or-that. However, no obligations can exist and those who do not share said priorities have absolutely no reason to follow suit. More: It would be hare-brained if they did…

As for the more specific reply*, I start by noting that there are clear signs that you** have not understood (or misrepresent) what I am actually saying, and that it is hard to find a consistent line of reasoning in your text (your language does not help either). If you want another iteration of the discussion, I ask you to pay more attention to these areas.

*I have left some minor parts of the original reply out. There can be some changes in typography and formatting, for technical reasons. I have tried to keep the text otherwise identical, but it is conceivable that I have slipped somewhere during editing or spell-checking—always a danger when quoting extensively. The full original text is present under the link above, barring any later changes by codeinfig.

**I address codeinfig directly from here on.

“whether the emphasis is on GNU alone or GNU and HURD in combination matters little for the overall argument.”

it covers more of the argument than you realise, and it is the flaw in your argument.

you are making gnu out to be a tiny subset of what it is, and making it less significant (details aside, you are greatly diminishing what it is) so that it pales next to “linux.”

this is unfair for several reasons— first, you do not understand what gnu is. you think gnu is just some software that isnt useful to the (everyday) user. its a misrepresentation that would lend at least some weight to your argument, if it werent a misrepresentation.

The “GNU” vs “GNU/HURD” distinction only makes sense if we abuse “GNU” in the manner I have dealt with above. The rest is largely a distortion of what I say. In particular, I have never claimed that GNU would pale to Linux (in the kernel sense)—I claim that it pales in comparison to the overall systems. (Which really should be indisputable.) If you re-read my original post, you will find that I clearly point to uses of GNU tools that are not obvious to the end user; however, the simple truth remains that for someone who does not live on the command line or in Emacs, the overall importance of GNU is not so large that it deserves special treatment over some other parties.

You do not seem to understand how many different components of various types and from different sources go into building e.g. a Debian system (be it as a whole or as the OS), with many of them present or not present depending on the exact setup. We simply do not have anything even remotely close to an almost-just-GNU system with Linux dropped in in lieu of HURD, which seems to be your premise.

gnu was “the whole thing” before linux was a kernel. the web browser is not “linux” either, it is a browser. but we call it “linux.” xwindows predates “linux” by nearly a decade, but we call it “linux.”

when we call these things “gnu” you fail to understand that *that is what it was called already* and “linux” is no more a web browser than gnu was at the time, but somehow its ok for linux to presume itself to be all those things, but its “riding coattails” if gnu helps itself by being included.

Here we have several misrepresentations, e.g. the claim that the browser would be called “Linux”—this simply is not the case. Neither were those things already called “gnu” in the past. Notably, in the time before Linux-kerneled systems broke through, the clear majority of users were running commercial Unixes, e.g. SunOS, with GNU either absent or represented through a few highly specific tools, e.g. Emacs. While it is true that GNU was conceived as “the whole thing” (by the standards at its conception), this does not imply that it actually is “the whole thing” when it is included in a greater context and at a much later date. By analogy, if someone launches his own car company A, and another car company B, thirty years later, uses parts delivered from A, other parts from other companies, and parts that it has produced on its own, should B’s products then be referred to as “A” or as “B”? Obviously: “B”. In addition, due to the absence of HURD there is no point of time prior to Linux where GNU actually, even temporarily, was “the whole thing”, making the claims of precedence the weaker.

Note the item on historical influence above.

Note that my original formulation concerning coat-tails, a) referred explicitly to “better-known-among-the-broad-masses”, which is indisputably true and makes no implication concerning e.g. practical importance, b) was used to demarcate the outer end of the spectrum of interpretations of the situation—I never say that Stallman’s intent is to ride the coat-tails of Linux, only that this is the worst case interpretation.

the whole idea that linux is entitled to do this but gnu is not is special pleading *all over the place.*

its special pleading that strawmans the heck out of what gnu is in the first place— with a generous “side” of ad hom for why stallman thinks we should call it that. oh, its his quasireligious views…

No such special pleading takes place. I clearly say that “Linux” is a misnomer—but that “GNU/Linux” is a worse misnomer through compounding the error.

Watch your own strawmanning!

no, his arguments are not quasireligious. they are philosophical and even practical. thats an ad hom attack, and the only thing religion has to do with it is in parody (and other related ad hom from critics.)

If you actually read what he says, you will find that he is quite often religious/ideological and lacking in pragmatism: He has an idea, this idea is the divine truth, and thou shalt have no other truth. Watch his writing on free this-and-that in particular.

i suspect that at some point (to be fair, you havent yet) you will accuse me of being some kind of stallman devotee. i was an open source advocate first, but i switched to free software after years of comparing the arguments between them. open source is a corporate cult, partly denounced by one of its own founders.

i switched to free software because it lacks the same penchant for rewriting history, for splitting off and then accusing those who didnt follow of “not being a team player,” and basically is more intellectually honest than open source and “linux.”

but its like an open source rite of passage to nitpick about “gnu/linux,” and it tends to follow a formula. you left out the part about how “free” is a confusing word with multiple meanings— sort of like “apple computer.”

I have not yet, and I will not here either—and it would not matter if I did: Your arguments remain the same irrespective of whether you are a devotee or not.

As for your motivations to prefer free over open: They have no relevance to the naming issue.

“To the best of my knowledge, no-one, Stallman included, has suggested that we refer to GNU (!) as GNU/Linux.”

in most instances he does. your definition of gnu is in fact, partial and subset, so he has never suggested we refer to that subset by anything. i dont believe he has ever referenced the subset you call “gnu” at all.

See the general discussion above and why this does not make sense. (But I do not rule out that Stallman too can have said something that does not make sense.)

” `The question is rather whether a Linux (sensu lato) system should be referred to as GNU/Linux.’ “

no, thats a loaded question, and a fine ingredient for a circular argument. “its already called linux.” well, it was already called gnu. but again, *somehow* linux is entitled to do that and gnu isnt, even though gnu was already calling it that.

for stallman and many others who have not been swayed by over a decade of these “dont bother calling it gnu” articles, the question is whether gnu should be referred to as only “linux.”

It was never called GNU and even if it were, you cannot demand that others, building new products, where GNU is a subset, propagate the name in perpetuity. Cf. above.

the answer to that question is cultural, and already explained— *if you care about software freedom* then gnu is a signifier. to a programmer this makes sense— its self-documenting.

from a marketing perspective, this is ridiculous. to a “linux” fan (to torvalds himself) this is riduculous. to me, its a *lot* more honest. however, what stallman has done is establish a brand that shows something living up to a promise.

“gnu” is quality control (a brand) for user rights. and linux really isnt. it really really isnt, but why it isnt is a separate debate. im not trying to write you an oreilly book here.

so again— if you want to signify user freedom, call it gnu/linux. (if it were up to me, the /linux would be dropped, stallman was trying to be fair.) if you want to signify whatever the heck “linux” stands for, call it whatever you want. i call fig os “fig os,” but fig os is a gnu/linux distro.

This is the flawed moral argument discussed above. In addition, why should e.g. Torvalds include “GNU” to stress free software when free software is not his priority? If he does not include it, how can I (you, Stallman, …) presume to alter the name based on having another priority?

As an aside, if the reasoning went in the other direction, i.e. “You are not free enough to use our name, so stop using it”, this would make a lot more sense. (Assuming that someone sufficiently non-free did use “GNU”.)

“Not at all: A world-view in which GNU is so important that it would warrant top-billing in the context of a Linux system is outdated—not GNU it self.”

thank you for clarifying, but you have not explained why gnu is not important enough to warrant top-billing, except to say that applications are more important to users who dont know why gnu is important.

Again: There are many components from different sources that make up a system. GNU is just one of them. (If you feel that GNU truly outweighs all others to such a degree that it warrants top-billing, it is up to you to prove this.) Further: What is important to the user is what matters in the end. If, by analogy, Bash or Emacs behaves the same, but is now implemented in Java or C#, it remains Bash resp. Emacs. The developers might be in understandable tears, but the world goes on. Implementations are fungible to the users—the result of the implementation is not. Hell, when I use Vim, Bash, and Tmux* under Cygwin, I have almost the same experience as when working under Debian, even when actually on a Windows machine… Even speaking of “Windows user” and “Linux [or whatnot] user” makes a lot less sense today than it did in the past, and it is often more sensible to speak of e.g. “Bash user” and “Vim user”.

*Note that of the three, only one is a GNU program.

your article is mostly assertions, and you do start to explain some of them though i still think it rests mostly on ad hom and assertion. its a very common set of assertions too— made year after year after year, i even made them myself once long ago.

Ad hominem and assertions pretty much match what I see in your writing…

“I am saying that building e.g. a Debian system without GNU is conceivable.”

thats hardly fair. gnu has been vital to all this for a quarter of a century (bsd can make a similar argument, considering that the only reason gnu was necessary was they were tied up in gaining the rights to their own work.)

I doubt that GNU has been that important for a whole quarter of a century, but even if it has been, that is irrelevant: It is not that important now. This is not a matter of fairness. Light-bulbs were great; today, they have been replaced by LEDs and other newer technologies. (Be it for technical reasons or through legislation.) I do not look up at my ceiling lamp and say a quick prayer to Edison (or one of the other inventors involved in the development of light-bulbs).

you make gnu less essential by creating a strawman version of what it is, so that you can say “but this isnt a good enough reason to warrant top-billing.”

we cant agree on the validity of your argument if you insist on misrepresenting what you weigh the importance of.

in fact, your argument suggests to me that the name is more important than the thing itself. i mean— gnu wouldnt be essential if this mostly-hypothetical thing like gnu were created instead! but thats no reason to call my entire operating system:

gnu! linux!

stallman has said lots of times that the name really isnt important. seems to really go against this whole thing, eh? people miss his asterisk where he says its important that everything gnu exists to accomplish is not forgotten for a side movement that reframes years of work to deliver freedom to the user as “just a practical way to develop software.”

Most of the above is more of you misrepresenting (or misunderstanding) what I am actually saying. As for the name vs. the thing: The topic of my post is the name and flawed reasoning around the name; ergo, I deal with the name and the flawed reasoning.

open source creates the need for this, stallman says “well if youre going to misrepresent everything we do, at least give us three letters of credit for this enormous amount of software youre relying on.” and people say more or less: “wow, the nerve of THIS GUY!”

its so funny because all he wants is for people to not forget that the entire point of all this was free software.

His entire point was free software. Torvalds’ (and many others’) is not. And again: If GNU was given credit in the name, then there are other parties with a similar right.

your argument is whether it should be called gnu/linux or not. but it never addresses the years-old argument of why it should be called gnu— it makes up its own reason, and then steps on it.

it really is a giant strawman. and i appreciate that you are almost certainly sincere and wouldnt create a strawman just to be a jerk. but its still a strawman.

Your claims make no sense, unless you truly are under the misapprehension that a typical system consists of the kernel, various GNU components, and a few trivial other bits. This is very, very far from the truth.

“GNU GPL, however, is irrelevant for the functioning, it would be easy to replicate something similar,”

haha— it isnt irrelevant at all, ask torvalds if it is. it isnt easy to replicate something similar either, and its less easy to get people to use such a thing. pulling off copyleft (when a billion dollar corporation was heavily dedicated to defeating it) was a serious coup. youre making it out to be a bunch of words in a file.

go make a gnu gpl— go ahead. show me. have anybody you can find to help in on it, too. have someone show me how easy it is.

“there are other available licenses. Notably, such other licenses, e.g. something in the Apache or BSD families, are often preferred by people outside GNU, because these parties have other priorities than free software.”

and thats the thing. they dont do what the gpl does, they dont achieve what the gpl does, but you consider them replacements. its apples and oranges.

Firstly, you assume (again) that copy-left, free this-and-that, whatnot is the priority of everyone. It might be your and Stallman’s priority, but it simply is not a global priority—and it is not needed to build e.g. an open-source computer system. Linux, Debian, …, does not need the GPL to exist.

Now, if someone does want a copy-left license? Firstly, there are other copy-left licenses around, if possibly with a somewhat different coverage. Secondly, combining another existing license with aspects of the GPL and/or with a few days’ research by a lawyer should yield something quite passable. (True, there might be a few issues to sort out over time, e.g. due to ambiguities or complications with different jurisdictions, but not something that would require several decades to build.)

“GCC is mostly important for building the system, not for using it.”

special pleading, all over the place.

Not in the least: How would you justify including the compiler used to build the system as a component of the system? (Except for those proportionally rare cases when it is actually used to compile other programs when later using the system.) See also below.

“In fact, even now, many build setups contain explicit checks for the presence of GCC and automatically fallback to CC, should GCC be absent.”

and this is a bit of trivia, because all this stuff we have now would not exist (and would not be maintained) if everyone had to use cc. its like you know the significance of gcc but choose to ignore it when its convenient to your argument.

open source would not exist without gcc. linux might, but not the linux we have today. some little usenet gem that wasnt developed by half as many people— because they needed gcc to do it.

That is a very far-going claim. Can you back it up? I doubt it. In the case of open source in general, it is definitely incorrect, as can be seen by the many projects that use e.g. Java instead of C… Also keep in mind that in the absence of GCC, someone is likely to have started to improve CC or to build a more suitable compiler as need arose. Consider e.g. how Git came into being; note that GCC as it is today is a very different beast from what it was when Torvalds started his work; note that GCC contains much that is not needed for Linux in the first place (e.g. unused languages) or is not essential for the existence of Linux (e.g. compilation for architectures outside of the main-stream PC processors).
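The “check for GCC, fall back to CC” behavior mentioned in the quoted line amounts to a few lines of configure-style shell; a sketch of the pattern:

```shell
# Prefer gcc, fall back to plain cc -- the fallback pattern that many
# configure scripts implement. Purely a sketch; real scripts also verify
# that the chosen compiler can actually produce a working binary.
CC=
for candidate in gcc cc; do
    if command -v "$candidate" >/dev/null 2>&1; then
        CC=$candidate
        break
    fi
done
if [ -n "$CC" ]; then
    echo "using C compiler: $CC"
else
    echo "no C compiler found" >&2
fi
```

The very triviality of the check illustrates the point: the build setup is written against a role (“a C compiler”), not against GCC specifically.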

why even mention cc when you could have talked about clang instead? because i already addressed that when i talked about bsd, and because the part about cc is hypothetical (and at best, unlikely.)

One of several claims that make poor sense even on the sentence level. Besides: When did you address CC? Why would what I say about CC be unlikely?

As for Clang, I was not aware of it until now (but did consider mentioning LLVM in addition to CC, or the possibility of having started the original development with even a non-Unix compiler). However, its existence proves my point: Even if CC would not be a realistic replacement for GCC today, another tool definitely is. And: Other tools capable of filling the role of GCC have been possible at any point.

“glibc is possibly the most deeply ingrained dependency (and a better example than my original GRUB); however, this is still just one library.”

and linux is just one kernel. so what? its a monolithic kernel, and glibc is a monolithic library. theyre both enormous. you cant make them smaller by counting units, thats absurd.

glibc is far smaller than the kernel, and the kernel, by its very character, is the core of an OS. (glibc is not even the largest individual library—quite far from it, actually.) What you mean by “counting units” is not clear to me. The relevance of whether they are monolithic or not is lost on me.

“Here too we have the situation that glibc is not used because it is the only alternative, just the best.”

so once again, we shouldnt call it “gnu/linux” because gnu is just a bunch of vital components that arent vital because you could easily replace them with a bunch of drastically inferior alternatives that no one actually wants.

hmpf. yes, im taking some liberties with my version of your argument, but only to try to get its author to appreciate how much of a stretch it is.

“As with GCC, its absence would simply have led to something else being used.”

so *hypothetically*, gnu doesnt deserve top billing. because it could be less important than it is, if it werent.

You miss my point: If we look at the situation as it is and say “part X is important today; ergo, if part X had never existed, the whole would not exist”, we ignore both the possibility of a replacement that would still have made the whole viable and the considerable likelihood that something else would have evolved over time to fill the same role, or that the role would have been covered in a different manner. If we look at the situation today and see that just removing e.g. glibc would cause a given system to fail catastrophically, we cannot conclude that the system would not have existed had glibc not been present in (hypothetically) 1990—and therefore we cannot conclude that the existence of the system is contingent on glibc and, by implication, GNU. As a consequence, when you say “i said without gnu. no gnu gpl, no gcc, no glibc. you go right ahead, since gnu is irrelevant now. remove it, and find out what you get”, the answer is “without GPL, GCC, and glibc, we would see something that is recognizably approximately what we have today”. We might have ended up with a king penguin instead of an emperor penguin, but we are still talking penguins. Now, a scenario that removes GNU entirely from the early Linux development, that could have been a very major problem—but that does not imply that Linux and/or Linux-kerneled systems cannot exist without GNU today, or that e.g. glibc is so central that they would never have come into existence without it.

“Even now, keeping the interface intact and replacing the implementation with a non-GNU variation would be technically feasible.”

but then, why should we rewrite glibc just to deny gnu the billing it allegedly doesnt deserve now?

That is not what I suggest: The point is that if glibc was no longer an option, hypothetically because a GPL violation necessitates its removal, a work-around is available. Yes, this might be tantamount to a team of surgeons operating around the clock to put in an artificial heart that buys the patient time until a real heart transplant is possible; no, it does not equal a dead patient.

“arguments that speak against referring to a system by the name of its kernel also speak against using the names of individual libraries, build-tools, and whatnots.”

except that you are oversimplifying the “linux is just a kernel” argument, failing to understand what people actually want with the name “gnu/linux,” not aware of why they want it, and making the name out to be more important than what the name refers to.

Not at all: The only way I can see to make your statement make sense is to posit that “GNU/Linux” would actually be enough to cover the entire OS (at a minimum) or the OS + a considerable portion of the rest of the system. This, however, is not even close to being the case. It is conceivable that a working “GNU/Linux” (only) system is buildable today, but it would not be the equivalent of e.g. Debian, Fedora, Suse, …

As to name vs. thing, cf. above.

“I grant that Linux would conceivably not exist today without the presence of GNU in the past”

nor the present.

Prove your assertion. Remove GNU today, where do you see the insurmountable obstacle? (As opposed to the far more likely transitional period of blood, sweat, and tears.)

“if we speak of the system as a whole (the sensu lato), I refer to my post for a discussion why GNU is no longer important enough to define the system.”

now it is a discussion. it was an assertion, which leaned a bit on misunderstanding and special pleading and ad hom.

I strongly disagree.

“I do. Cf. above and your apparent confusion of what GNU is.”

i am not “confused” about what gnu is. gnu was from the beginning, a fancy-pants latin phrase (“the whole thing.”)

since the 1990s, a bunch of people have suggested that it is just a bunch of applications that everyday people dont really use.

your argument is built around the suggestion being a fact.

See the general discussion for various meanings and your incorrect interpretation.

given that the conclusion of your argument is that we should agree with them, i would call your entire argument circular.

Your claim makes no sense, shows that you have not understood me, and raises doubts as to whether you understand what a circular argument is.

we dont have to rewrite history. however, i would say you argue (quite unintentionally, because i think you really do misunderstand the nature and premise of your own argument) that history doesnt deserve to not be rewritten.

it is not necessary to rewrite history to refer to gnu and “linux” instead of gnu/linux.

Again, you make no sense.

it is necessary to rewrite glibc, the gpl, and reestablish so much of gnu that you generously refer to as “linux,” in order to make most of the PREMISES of your argument into facts.

If you believe that, you definitely have not understood what I am actually saying.

if the premises are false, the argument isnt sound. in your reply, you spend a lot of time defending the logic of your argument based on a more hypothetical premise.

the premise of your argument was just false. the logic is heavily just assertion.

You have not shown that my premise is false; yet you seem to rely on faulty premises or faulty understandings yourself.

there isnt any need for ad hom, its simply wrong. but thats not important.

Where have I used ad hominem? Do you understand what this actually implies?

what matters is that in ten years, people will still be trying to get “gnu” removed from “gnu/linux.” and we can have this debate all the way there. i do hope we get breaks for the restroom though.

I do not see that as something that really matters, and the opinion that “GNU” does not belong in the name is likely to grow stronger, for reasons that include a further lessening of GNU’s practical relevance, a smaller proportion of people who know of GNU at all, and a growing importance of both the distribution aspect and the desktop aspect.

“I do consider free software highly beneficial, but free software is not a core priority of Linux”

and that is exactly why stallman says it shouldnt be called just linux. because free software is not a core priority of it.

thats his entire argument. if you care about free software, call it gnu.

it has nothing to do with percentages of code, it has nothing to do with riding coattails.

it has everything to do with why gnu was created in the first place. not to write glibc, not to give you a web browser.

gnu was created to give the user freedom. and if you care about that, calling it “linux” ignores the original purpose, paints something relevant in modern times into enough obscurity that people think its just about user applications— and lets “linux” come along and assert boldly that freedom doesnt matter.

its not about ego, religion, or percentage of code. its about whether you care about freedom or not.

See the general discussion for why this is a faulty argument when it comes to the name.

funny thing, its always implied that stallman is just nitpicking, but year after year (after year after year) open source nitpicks that “gnu” isnt important enough to be in the name.

I, personally, have implied no such thing. That “GNU” does not belong is not nit-picking.

free software never convinces everyone to add “gnu” and open source never convinces everyone to drop it, but both sides continue to nitpick this for decades.

The division into free and open software should not play a role when discussing the name issue. If it does, something is fundamentally wrong with the approach.

your argument is most likely honest, if lacking context and history. the argument itself has its own history, though open source doesnt learn from the failure of the argument youre making, it just keeps reasserting it.

the history of your argument is that it is constantly made— 20+ years running now.

i made it myself, over a decade ago— i abandoned it because it was silly.

From what I have seen so far, the lack of historical and contextual understanding seems to be more on your side, with the one reservation that I was actually not aware of the extensive history of the argument. The reasoning you apply today, e.g. that what I refer to as moral superiority above should affect the name, is the silly part.

Written by michaeleriksson

April 25, 2018 at 6:25 am

“Linux” vs “GNU/Linux”

with 5 comments

A recurring claim is that “Linux” is an inappropriate term (when not referring specifically to the kernel) and that “GNU/Linux” would be better—a claim made especially by Richard Stallman, who is the founder and main force behind GNU…

However, this view is at best outdated*—at worst, it is an attempt to ride on the coat-tails of a better-known-among-the-broad-masses project. Most likely, however, it is a sign that Stallman is too fixated on his own vision of “GNU/HURD”**, and is unable to see that there are other perspectives on the world: Since his focus is on GNU, those who use Linux instead of HURD obviously appear to use “GNU/Linux” instead of “GNU/HURD”. This, however, has very little relevance for the typical Linux user:

*GNU used to be a much bigger deal than it is today, for reasons both of changing user demographics/behaviors/wants and of an increased set of alternative implementations and tools. Certainly, Linux (in any sense) would have had a much tougher time getting off the ground without GNU.

**HURD was conceived as the kernel-complement to GNU roughly three decades ago—and has yet to become a serious alternative to e.g. Linux.

The general criticism that Linux is just the kernel and that the user experience is dominated by user programs (and other non-kernel software, e.g. a desktop) is quite correct. (This can be seen wonderfully by comparing an ordinary Linux computer and an Android smart-phone: They have very little in common in terms of user experience, but both use a Linux kernel. Conversely, Debian has made releases that use a non-Linux kernel.) However, in today’s world, most Linux users simply do not use many GNU programs, they have correspondingly little effect on the user experience, and a functioning Linux system entirely without them* is conceivable.

*The main problem being “hidden” dependencies. For instance, most Linux computers use GRUB for booting and GRUB is a GNU tool. However, none of these hidden dependencies are beyond replacement.

For instance, a typical Linux user might use Firefox or Chrome (both non-GNU), LibreOffice (non-GNU), a few media applications (typically non-GNU), … Even most parts of the OS in an extended sense will typically not be GNU programs, e.g. the X-server, the window manager, the log-in manager, the network manager, a desktop environment, … The best way to approximate the user experience would likely be to speak of e.g. “distribution/desktop”, e.g. “Debian/KDE”*, especially seeing that most desktop environments insist on providing their own, entirely redundant tools for tasks that more generic tools already do a lot better, including text editors, music players, image viewers, …

*KDE is a user-hostile disaster that I strongly recommend against, but it is likely still the most well-known desktop environment. Generally, not everyone uses a desktop environment at all, but most do.

Even those, like yours truly, who actually do use a lot of GNU programs are not necessarily bound to GNU: Most important GNU tools are re-implementations of older tools, and there are alternate implementations available even in the open- and free-source worlds. Are the GNU variations of e.g. “ls”, “mv”, “awk” better than the others? Possibly. Would it kill someone to switch? No. Even a switch from Bash to Ksh or Zsh would not be close to the end of the world. Admittedly, there might be some tools that are so significantly better in the GNU version that users would be very troubled to switch (gcc?) or that are not drop-in replacements (e.g. gnumeric). These, however, typically are either developer tools or have a small user base for other reasons. Most modern users will not actively use a compiler—or will not need the extras of gcc for their trivial experiments. Most users will opt for a component of an office suite (e.g. LibreOffice) over gnumeric. Etc.
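As a quick way to see how much GNU one actually uses: the common GNU tools identify themselves in their version output. A minimal sketch (the tool selection is arbitrary; it assumes the tools support “--version”, which BSD variants often do not—a failure that by itself answers the question):

```shell
# Check which of a few common command-line tools are GNU implementations.
for tool in ls awk sed grep make; do
  if "$tool" --version 2>/dev/null | head -n 1 | grep -q GNU; then
    printf '%s: GNU\n' "$tool"
  else
    printf '%s: non-GNU (or no --version)\n' "$tool"
  fi
done
```

On a typical desktop distribution most of these report GNU; on e.g. Alpine (busybox) or a BSD, most do not—and the system still works.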

For that matter, even on the command line, my two most extensively used programs (vim, mplayer) are not from GNU either…

Yes, using “Linux” is misleading (but generally understood correctly from context); no, using “GNU/Linux” is not an improvement. On the contrary, “GNU/Linux” is more misleading, shows a great deal of ignorance, and should be avoided in almost all cases*.

*An obvious exception would be a situation where GNU is the core topic and a contrast between GNU-with-the-one-kernel and GNU-with-the-other-kernel is needed.

GNU still plays a very valuable role through providing free-software alternatives for many purposes. This role, however, is not of a type that it justifies “GNU/Linux”.

As an aside, Stallman’s own arguments focus unduly on the free-software aspect: Most of his text seems to argue that GNU is valuable through being keener on free software than Linux—something that is entirely irrelevant to the question of naming. (In general, Stallman appears to see free software as a quasi-religious concern, trumping everything else in any context.)

Written by michaeleriksson

April 14, 2018 at 4:33 am

The declining security of Linux (and sudo considered harmful)

with 4 comments

Naive approaches to computer security have long been a thorn in my side, starting with the long-lasting Windows assumption of a single user and user account on a system. (Originally explicit, in that no second user account or user control was available; in the last ten or so years, in the form that the standard case is one user and one user only—who, if at all possible, should only ever work with one account.)

Unfortunately, Linux has also taken a turn for the worse over the years, often taking extremely naive approaches, prioritizing the convenience of the inexperienced user over security*, and opening holes that even a highly proficient user is often unaware of—and with more and more holes as time goes by.

*With the dual effect that those who want security have to put in a load of work (and likely still fail) and that many users are not aware of how poor their security is. Notably, the naive users might be pleased about the convenience—but they too are victims of the poor security. I would even argue that because they are naive, there is a greater obligation to protect them through implementing strong default security.

A prime example is the default file permissions (umask), which on most modern systems are set so that anyone can read the files of everyone else… This is so obviously wrong and idiotic that whoever is responsible should be taken out and shot. The obviously correct default behavior, and what matches the reasonable intent on almost all systems, is permissions where either only the owner is allowed to read a file, or only the owner and the members of the file’s “group”*. One of the first things I do with a new installation is to restrict the default file permissions to owner only—if something else is needed for a specific file, I override the default.

*The standard file permissions on Unix-like systems divide the world into the owner, the group, and everyone else. By assigning users to a group, they can be given different access to certain files than “everyone else”, without being the owner.
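A minimal sketch of the three permission classes in practice (the file name is hypothetical; `$(id -gn)` just picks the current user’s primary group so the example runs anywhere):

```shell
# Create a file readable by owner and group, but not by "everyone else":
touch report.txt
chgrp "$(id -gn)" report.txt     # assign a group (here: the user's own primary group)
chmod u=rw,g=r,o= report.txt     # owner: read/write; group: read; others: nothing
ls -l report.txt                 # -rw-r----- ...
```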

This misconfiguration is particularly dangerous because it is unexpected, it is often only discovered when it is (potentially) too late, and it requires an above-average amount of knowledge to correct*.

*It is not enough to simply change the default setting: Each and every file that has already been created with that setting must have its individual setting corrected.
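As a sketch of the fix: the umask value 077 masks out all group/other bits for newly created files, and the recursive chmod is what the footnote refers to (apply it with care—it flattens any deliberate exceptions):

```shell
# Restrict the default to owner-only; typically placed in ~/.profile or /etc/profile:
umask 077

# Files created from now on carry no group/other permissions:
touch private.txt
ls -l private.txt    # -rw-------

# Already-existing files keep their old permissions and must be fixed explicitly,
# e.g. for everything under the home directory:
chmod -R go-rwx "$HOME"
```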

Another particularly annoying and dangerous problem is demonstrated by utterly conceptually flawed tools like sudo, pkexec, and polkit: Much like the execution controls in Windows, they assume that a user has a varying amount of rights to do things depending on how he does them. (E.g. through calling a command with or without sudo, or through giving or not giving a password to polkit.) While these tools are intended to increase security, they instead open up ridiculous security holes, and increase the likelihood both of users being given rights that the admins never intended them to have and of hostiles being able to achieve “privilege escalation”*.

*Roughly, an attacker starting with a certain set of rights that do not pose a danger and tricking the system into giving him more rights until he does pose a danger. This is a central part of cracking a computer system.

Consider sudo: The intention of sudo is that when a user executes the command X as “sudo X” (instead of just “X”), it is as if root (the main admin user) executed the same command. Now, what commands are allowed to “sudo” for a certain user is configurable, but this configuration can be a bitch. Take something as harmless as an editor: If the user can “sudo” the editor, he can now change system files, manipulate the password storage, read documents that should be secret, … The system is effectively an open book that a skilled cracker can exploit and infiltrate as he sees fit. OK, so we do not allow editors (and a number of more obvious things like command shells, commands to delete files, and the like). Now what about all the other applications that are not editors but still have the ability to execute editors or have the ability to even just save a file? What about those that can execute commands (e.g. through a “shell escape”—a very common mechanism on Unix-like systems)? They too must be ruled out. Etc. But here is the real devilry: How do we find out what commands have what abilities? This is a virtually impossible task, with many nasty surprises—e.g. that the standard pager (“less”; seemingly only intended to view files) has the ability to launch an editor… The only chance is to reduce the “sudoable” commands to an absolute minimum, carefully verify that minimum, and (more likely than not) conclude that the users now do not receive the convenience that sudo was intended to give them.
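To illustrate how hard the vetting is, consider a sudoers entry that looks minimal (the user name is hypothetical). Even the archiver tar can execute arbitrary commands, via its checkpoint-action mechanism, so the entry below effectively hands out a root shell:

```
# Looks harmless: let "backupuser" run only tar as root.
backupuser ALL = (root) NOPASSWD: /usr/bin/tar

# But GNU tar can run arbitrary commands via checkpoint actions, e.g.:
#   sudo tar -cf /dev/null /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/sh
# ...which drops the user into a root shell.
```

The same pattern repeats for a long list of seemingly innocuous tools, which is exactly why the vetting amounts to a virtually impossible task.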

The task of configuring sudo is made the harder because most Linux distributions appear to work on the assumption that any system is a single-user system (as with Windows above)—and cram whatever gives the user convenience into the corresponding configuration. Looking at the configuration file /etc/sudoers on my current system*, I find e.g.

*No worries: While the configuration file is still there, the actual sudo program has been removed.

# Allow members of group sudo to execute any command
%sudo	ALL=(ALL:ALL) ALL

The comment line says it all.

Now, a good admin would not assign the group “sudo” to just anyone and would use far more granular settings to give individual users what they need. However, not all admins* are good, and this approach practically invites the admin to be lazy and assign rights carelessly. To boot, this makes it easy for the Linux distribution to screw up, because the consequences of a change become hard to predict, e.g. when default group assignments or default configuration entries are altered. In one horrendous case I heard of some months ago, the default configuration actually gave everyone, irrespective of group, the right to “sudo” anything, resulting in a system with no actual security anymore…

*Note that the admin is often quite, quite poor as an admin: Admins are not just found in big enterprises—the family member who takes care of the family’s computers is also an admin.

Others do truly stupid things, like https://help.ubuntu.com/community/Sudoers, which gives an example of how to add an editor (!) to the configuration—and this in a section titled “Common Tasks”…

myuser ALL = (root) NOPASSWD:NOEXEC: /usr/bin/vim

This example lets the user “myuser” run as root the “vim” binary without a password, and without letting vim shell out (the :shell command).

Well, preventing “shell out” (more properly “shell escape”, one of the issues I mention above) is good, but obviously the idiot who wrote this has failed to understand that an editor is lethally dangerous too (cf. above). For instance, “sudo vim /etc/shadow” gives a malicious user the possibility to change the root password, after which he can trivially gain a root shell—without needing a “shell out”.
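The point that an editor needs no shell escape to do damage can be seen even non-interactively: vim’s silent ex mode rewrites files straight from the command line, and under sudo the target could just as well be /etc/shadow (the file below is a harmless stand-in):

```shell
# vim can modify a file entirely from the command line, no interaction needed:
printf 'root-password-hash\n' > shadow-standin
vim -es -c '%s/root-password-hash/attacker-chosen-hash/' -c 'wq' shadow-standin
cat shadow-standin    # attacker-chosen-hash
```

NOEXEC blocks the :shell command, but it does nothing against :w /some/system/file—the editing itself is the danger.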

In contrast, the earlier approach was very sound: Either a user account had the right to do something or it did not—end of story. Usually, “did not” applied when not dealing with the user’s own files. When more rights were needed for a task, the physical user had to log in with a new user account with more rights in the relevant area (and typically fewer in other areas!)—if he was trusted with such an account*. Yes, sudo can be more convenient, but that convenience is bought with a horrendous drop in security.

*If he was not trusted, then he correctly had no opportunity to do whatever he wanted to do.

The one saving grace of sudo is that it makes life a little safer for those who would otherwise take even greater risks in the name of convenience, by giving themselves dangerous rights all the time. This, however, is not a valid reason to make life that much less secure for the users who actually try to be secure and know how to handle themselves. This is like noting that condoms reduce pleasure and replacing condoms with some other mechanism that gives more pleasure—but does so at the price of not actually preventing pregnancy and disease transmission…

As a rule of thumb: If someone recommends that you use sudo, discount anything he says on security issues. This tool is simply one of the worst security ideas in the history of Linux.

I have seen some truly absurd cases, e.g. one nitwit who adamantly insisted that logging in as root on a terminal was very dangerous, but still threw sudos around willy-nilly. (While logging in as root is never entirely without danger, a terminal is the least dangerous place to do so, seeing that this reduces the risk of a snooper catching the password, removes the temptation of starting various GUI programs, and drastically reduces the risk of forgetting that one is using the root account and mistakenly doing something stupid.)

Excursion for the pros:

Those who know a little more about Unix security might see a major advantage of sudo in the reduced need for suid programs. This might or might not have been an advantage at some point in time, but I have worked for years without using sudo and have never needed to change anything in this regard. I conclude that what should work works, be it through appropriate group settings, daemons, or suid programs that are there irrespective of the presence of sudo. In addition, I am not convinced that suid programs, the potential dangers notwithstanding, are a greater evil than sudo, at least not after considering the relative likelihood of an admin doing something stupid—it is not just a question of which approach is the safer technically, but also of which approach gives us the better protection from human errors.

Written by michaeleriksson

December 6, 2016 at 11:07 pm

Blogroll update

with one comment

Last week, I was directed to a page wishing to prove that more than 1 % of all desktop users use Linux. Considering its approach of actually trying to get sufficiently many Linux users to announce themselves, this is a herculean task, which can benefit from a little help—like inclusion in a few blogrolls.

I recommend that any Linux user drop by to increase the statistic.

(Note: An email address must be given. While the site looks legitimate overall, I recommend the precaution of using a disposable address.)

By the FIFO principle, The Thoughtful Animal is removed. That blog was first discussed here.

Written by michaeleriksson

October 12, 2010 at 12:15 pm

The trial of the year—Victory! (Follow up)

leave a comment »

As I wrote in March, a jury ruled in favour of Novell in the fight against SCO, whose widely-considered-faulty claims had caused great costs and uncertainty for a number of other parties (including, obviously, Novell).

There was still some remaining uncertainty in theory (considering the overall situation and previous judgements, a practical problem was unlikely), because there were further “findings of fact” and various motions to be decided by the judge. As Groklaw now reports:

Judge Ted Stewart has ruled for Novell and against SCO. Novell’s claim for declaratory judgment is granted; SCO’s claims for specific performance and breach of the implied covenant of good faith and fair dealing are denied. Also SCO’s motion for judgment as a matter of law or for a new trial: denied. SCO is entitled to waive, at its sole discretion, claims against IBM, Sequent and other SVRX licensees.


Maybe I should say cases closed. The door has slammed shut on the SCO litigation machine.

Written by michaeleriksson

June 11, 2010 at 6:09 pm

Posted in Uncategorized
