Archive for December 2018
A few thoughts on grocery shopping and holiday crowds
Earlier today, I went grocery shopping. Between the restrictive German opening regulations (yesterday/Sunday closed; tomorrow/holiday closed) and the event character that New Year’s has for many, it was very, very crowded. This was made the worse by people having failed to make (or adhere to) some basic observations/guide-lines that any adult shopper should long have been aware of:
- Never leave a shopping cart standing on its own: Unattended shopping carts are a great cause of “traffic” and access problems, and it is much better to move with the shopping cart.
Notably, a typical aisle is roughly half-blocked (some entirely) by a shopping cart, which implies (a) that other shoppers can only pass in one direction at a time, (b) that people have to forego the contents behind the cart entirely, (c) that people either have to forego the contents on the other side of the aisle too or will cause the entire aisle to be blocked.* A moving cart, in contrast, will soon be gone (making the issue temporary in any given place) and it poses no obstacle to “two-way” traffic.
*Some relief can be found through moving the cart forwards or backwards; however, in crowded situations, there is not always much room to do so. Personally, I am also loath to move other people’s carts by more than a nudge or two, unless they are severely misparked—it has always struck me as rude.
In a twist, unattended shopping carts can sometimes cause a chain-reaction, when one unattended cart makes it impossible for another shopper to reach a certain spot without leaving his own cart behind.
For those who insist on leaving their carts around, at least make sure that the cart is parked as close to one side of the aisle and as parallel to it as possible, and that it does not protrude into e.g. a “cross-aisle”.
(Foregoing a shopping cart entirely is an option, but it is not always practically possible, and the additional gain compared to not parking the cart seems small.)
- Similarly, try to keep moving as much as possible and try not to be unduly slow: Slow movers slow others down and increase the risk of blockages. Standing around to look in detail at various products, especially in groups and/or with carts, causes problems similar to the above. By all means, shop consciously and make comparisons, but adjust your behavior to the degree of crowdedness and the importance of the decision—which of several low-end sekt bottles to buy is not worth five minutes of discussion between three people.*
*While I did not time them, I did see a group of three people having exactly this discussion for what must have been a somewhat longish time: I co-incidentally passed them by on two occasions with several minutes in between. (And, yes, we are talking the likes of “Rotkäppchen”—not expensive Champagne brands.)
- Do not bring family/group members that are not needed (at least on crowded days): Not only is every extra person an additional burden on space, but people who move in groups tend both to form obstacles in a different manner than the same number of people moving individually, and to move a lot slower on average (cf. above; note that the slowest individual sets the tempo for the entire group).
As a special case: Do not bring small children (in general) and small children in prams (in particular), unless there is no other option.*
*On non-crowded days, this would apply for another reason—the undue amount of noise that they so often make. If you do bring small children, make sure to silence crying and whatnot as swiftly as possible. To just let children cry out loud without reacting, as some parents do in at least Germany, is inconsiderate towards the rest of the world and poor parenting towards the children.
- While the above items deal with how to reduce the impact of a crowd,* there is also the matter of reducing the crowd, itself. For instance, those who shop faster will make the crowd smaller once they are gone; those who can manage to go on an earlier day** will distribute the crowd more evenly; etc. Such planning has the additional advantage of directly benefiting the planner, himself. (The earlier items, in contrast, mostly help others, with the hope that others will also adhere to them and thereby return the benefit.)
*Whether a shopper or a shopping cart is currently moving or standing still does not (or only marginally) affect the amount of space taken. However, those that move still cause a smaller disturbance (as surprising as it might seem).
**Admittedly not always easy: Forego shopping today, and there is a three-day stretch without shopping, with the last opportunity before that stretch being a Saturday (which tends to be fairly crowded even on ordinary days, and was likely much worse than usual due to New Year’s). Forego the Saturday too, and it is a four-day stretch. Over this year’s Christmas, with both the 25th and the 26th being German holidays, the corresponding stretch was four resp. five days (foregoing the 24th, resp. the 24th and the 22nd/Saturday). While these time periods are by no means insurmountable, they are suboptimal for those with no freezer and a small fridge, on the long side for many “fresh” items, and increase the risk of something unexpectedly running out—let alone the risk that some event-specific item has been forgotten.
Here we can also see a vicious circle: The increase of the crowd increases the amount of time the individual shoppers need, which increases the crowd, which …
- Try to show some degree of awareness of other shoppers and how you might currently impede them: Even on non-crowded days, one or two people who ignore the rest of the world can be a severe hindrance. Cf. an older text on women and awareness of surroundings.
As similar advice to store-managers/-staff: Try to limit activities like stocking shelves to the absolute minimum in crowded situations. If a shelf is about to run out of whatever it contains,* do re-stock—but save the “routine” stocking until the crowd has abated (or do it before the crowd comes). Also try to do this ad-hoc re-stocking with as few and as small carts as possible (cf. the shopping-cart discussion above).
*Not only will greater crowds increase the risk in general, but “special-day” crowds will be disproportionately likely to go for the same items. Some standard items, e.g. milk, could conceivably need multiple re-stockings even on a regular day. Correspondingly, not re-stocking at all is a bad idea.
The misadventures of a prospective traveler
Three issues that have collaborated to drive me nuts today:
- I had promised my father and step-father to try to come to Sweden over Christmas or New Year’s.
For this purpose, I had already made several attempts to find suitable tickets with the Tor Browser (relayed through Tor or directly over my non-torified Internet connection, even with JavaScript enabled). This had proved very annoying and unproductive. For instance: The Lufthansa site simply does not load at all, it just hangs with a perpetually waiting tab. The SAS site loads, appears to work, allows me to fill in all criteria—and then does nothing when I try to submit the search. Ditto EuroWings. A meta-search that I attempted to use was hopelessly slow, insisted on (with every single search) re-including flights with a change of planes,* and also insisted on re-ex(!)cluding potentially cheaper late-night** flights. At least one site interrupted my search by having a JavaScript pop-up demand that I take a survey to improve its usability…*** Almost invariably, these sites had various annoying blend-ins/-outs, animations, overly large images, poorly structured pages, …
*Which roughly doubles the travel time between Düsseldorf and Stockholm (that have the most suitable airports) and is highly unwanted by me.
**A smaller negative than changing flights…
***First tip: Do not molest customers with such pop-ups!
Having given up and postponed the search twice already, and it now being the 28th of December, I decided to install a brand-new vanilla Firefox, in the hope that at least the SAS/EuroWings problems would be explained by some incompatibility with the Tor Browser (as a Firefox derivative) or by my browser version being too old.*
*Many web-sites, in the year 2018!, still fail to make browser-agnostic implementations, insist on very recent versions of browsers, and similar—usually with no indication to the user that they do. (And visiting an eCommerce web-site without JavaScript on is more-or-less bound to fail.)
First attempt, SAS: Everything seemed to work, but the web-site was now even more visually annoying than before. The choosing of dates, for some reason, worked in a different manner than before and was highly counter-intuitive. Seeing that there also (not entirely surprisingly) were no good flights prior to the New Year and that prices were unnecessarily high, I decided to look elsewhere first.
Second attempt, Lufthansa: Still did not load…
Third attempt, EuroWings: To my great and positive surprise, everything appeared to work perfectly, showing me timely and much-cheaper-than-SAS flights. Things kept working as I began my purchase, entering name, address, and whatnot—and I even found the alternative to pay by invoice!* Alas: As I tried to confirm the last step, I was met with an uninformative error message and the request to start again from the very beginning. There was, in particular, no mention of factors like the last few seats on that flight having been suddenly snatched by someone else, or invoice payment not being possible on so short notice**. Before starting over from the beginning, I gave just the last page a second try. I re-entered some (inexplicably deleted) information and re-submitted. Same error message—but now followed by an intrusive pop-up suggesting that I start a chat with someone… I clicked the dismiss button—but, instead of disappearing, the pop-up did a weird and time-consuming animation and kept blocking a significant part of the page even after the animation was finished. At this point, I had had enough, closed my browser, and decided to find other means—which will likely amount to going to a physical travel agency and visiting at some point after the New Year…
*Thereby removing one of my last doubts, namely the risk that I would be forced to pay with a combination of credit card and 3D-secure—which I (a) have never attempted with my new bank, (b) fear would involve the idiotic use of SMS (I do not currently have a cell provider), (c) had found to simply not work at all with my previous bank (the to-be-avoided Norisbank).
**For which I would have had some sympathy; but, in that case, invoice payment should never have been offered in the first place.
These events are the more annoying, seeing that there actually was a time when it was reasonably easy to handle tasks like these over the Internet—it really is not that hard to implement a decent search–choose–pay UI. However, year by year, the usability of various Internet shops and whatnots grows worse and worse, and appears to make more and more specific demands on the browsers. Much of this goes back to the obsession with Ajax. Credit-card payments are also not what they used to be, being much more laborious and likely to fail than in the days before 3D-secure and similar technologies. Worse, from the customer’s point of view, they likely lead to a net loss of security, whereas the stores and involved payment entities see the gains.* Then, if not relevant above, we have the inexcusably poor efforts of various delivery services, notably DHL, which often make it less of a fuss to go to a store and pick up a purchase in person…
*For the customer, the risk that someone will manage to fake a payment is reduced, but if someone does, he has very few options to prove that he was the victim of fraud. Without 3D-secure, the burden of proof was on the other party, and the customer had very little risk at all (short of additional work). The merchants and credit-card acquirers, on the other hand, can have large costs and losses when a fraudulent purchase is followed by a charge-back—and 3D-secure helps them, not the customer, by reducing this risk.
- Installing and setting up Firefox proved to be a PITA. Apart from the issues of the next item, I note that any version of Firefox has tended to come with very poor default settings, including default UI behavior; and that the “new” Firefox is highly reduced compared to the “old”.* After installation and prior to my attempts at finding tickets, I spent at least five minutes going through and correcting settings—that were then, obviously, only valid for that one user account**…
*The changes would be enough for a long text of their own. For now, I will just note that (a) the GUI-configurable settings have been reduced to a fraction of their previous scope, (b) the general attitude described in e.g. [1] is continued.
**I have a number of user accounts for different purposes, in order to reduce the risk of and damage from security breaches and whatnots. This includes separate accounts for eCommerce (the current), my professional activities, ordinary surfing, porn surfing, and WordPress.
To boot, a new dependency was installed: libstartup-notification0. I did some brief searching as to what this is, and it appears to be just a way for an application to change the shape of the cursor during startup… (Beware that my information might be incomplete.) Firstly, why would I want the cursor to change?!? Secondly, even if this was seen as beneficial, it certainly is not reason enough to add yet another dependency—there already are too many useless dependencies, many of them recursive (also see portions of a text linked below).
- The idiotic Debian “alternatives” system and the “desktop nonsense”.
Disclaimer: Some familiarity with Debian or similar systems might be needed in order to understand the below.
When a Debian user installs an application, e.g. Firefox, /usr/bin/firefox (or whatever applies) does not contain the Firefox binary—nor even a link to the Firefox binary. Instead, it links to an entry in /etc/alternatives, which in turn links to the actual binary (unless a certain setup has even more indirections involved). To boot, this system is administered by a poorly thought-through tool (update-alternatives) and/or configuration; worse, it is vulnerable to applications arbitrarily overriding the status quo, as well as adding pseudo-applications (e.g. x-www-browser) that at least I simply do not want polluting my system. (A brief shell sketch at the end of this item illustrates the indirection and my work-arounds.)
In fact, these pseudo-applications are likely the reason why this system was added in the first place—because e.g. x-www-browser can be “provided” by a thousand-and-one different real applications, it would be highly complicated to work with straight links, let alone binaries (especially when one of the “providers” is removed). For real applications, there is a much better way to solve such problems—namely, to just link e.g. /usr/bin/firefox directly to the usually sole instance of Firefox present and give the user an explicit choice of the “default” Firefox every time a new Firefox version is installed or an old one removed.
Why do I not want these pseudo-applications? Firstly, they bring me and most reasonable users, at best, a very minor benefit (for which they bring the cost of the indirections and the greater effort needed when looking for something). Secondly, the “providers” are usually sufficiently different that unexpected effects can occur.* Thirdly, they are often used by other applications in a manner that is highly unwanted: For instance, one of the alleged main benefits of x-www-browser is that any other application, e.g. an email reader, should have an easy way to open an HTML document, without having to bother to check what browsers are installed—but I absolutely, positively, and categorically do not want my email reader to even try this. In a saner world, this would be something configurable in the email reader (and only there), and those who want this endangerment can configure it, while those who do not want it simply do not configure it. By having x-www-browser, the user no longer has such control. Worse: Since the real application behind x-www-browser can change without his doing (be it due to presumptuous applications or an administrator with different preferences), the effects can be very, very different from the expected—e.g. that a known browser with JavaScript, images, and Internet access disabled (appropriate for reading e.g. HTML emails) is replaced with an unknown browser with everything enabled. (Which, in combination with email, could lead to e.g. a security intrusion, leaking of data to a hostile party, activation of unethical tracking mechanisms of who-read-an-email-when, and similar.)
*For instance, there are many highly specific tool families, e.g. awk, whose members will superficially appear to be and behave identically (much unlike e.g. Firefox and Chrome/Chromium, as x-www-browser candidates), but will have subtle differences that can lead to a failed execution or a different-than-expected result in certain circumstances. Such problems, especially when undetected, can have very serious consequences. It is then much better for the user to, depending on circumstances, pick the specific awk-version he needs by explicit call (=> the alternatives system is not needed), make sure (for a one-user system) that he only ever has one instance installed and use the generic “awk”-name (=> the alternatives system is not needed), or restrict himself to only the common base of identical features. In the last case, the alternatives system would have some justification—however, it would place a very high burden on the user in terms of not making mistakes, might still fail due to undocumented differences or bugs, and is vulnerable to other differences, e.g. regarding performance. Obviously, this would also reduce the available capabilities of the tool in question—in many cases, quite severely.
Similar remarks concern the “desktop nonsense” (which would deserve a long text of its own; a partial treatment is present). In this particular case, there are at least two* further mechanisms (/usr/share/applications, /usr/lib/mime/packages/) that cause similar problems, including allowing e.g. email readers to launch things that they should not launch. I have used the tool chattr to forbid additions to these two directories; however, due to the incompetence of the apt implementers and/or package builders, this is only a partial help: Despite these entries being unimportant for the actual functioning of the system/the installed application, the chattr-setting leads to a hard error from the apt-tools. I now have to “de-chattr” the directories, re-attempt the install, manually delete the added files, and “re-chattr” the directories (cf. the sketch below) … Effectively, I do not prevent the directories from being polluted—instead I trade an increased work-load for the benefit of knowing when I have to manually clean them up after pollution.
*Proof-reading, I suspect that /usr/lib/mime/packages is not strictly desktop related, and might better have been treated as a third area. In the big picture, this does not matter. (And I do not have the energy to sort out “what is what” at the moment.)
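To make the above a little more concrete, here is a minimal shell sketch of both the indirection and the chattr work-around, assuming a Debian-style system; the package name, the .desktop file name, and the firefox-esr path are merely illustrative examples rather than a record of my actual installation:

# Inspect the indirection behind a generic pseudo-application:
readlink -f /usr/bin/x-www-browser            # follows the /etc/alternatives link down to the real binary
update-alternatives --display x-www-browser   # lists all registered providers and their priorities

# Pin the generic name to one specific provider (switching the link group to
# manual mode), so that later package installations do not silently change it:
update-alternatives --set x-www-browser /usr/bin/firefox-esr

# The chattr cycle for the two desktop-related directories: temporarily lift
# the immutable flag, install, delete the unwanted entries by hand, lock again.
chattr -i /usr/share/applications /usr/lib/mime/packages
apt-get install some-package                      # illustrative package name
rm /usr/share/applications/some-package.desktop   # example of a manually deleted entry
chattr +i /usr/share/applications /usr/lib/mime/packages
lsattr -d /usr/share/applications                 # verify that the immutable (i) flag is set again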
Plato’s cave and the dangers of a limited world-view
Plato’s famous cave provides an excellent illustration* of the dangers of e.g. censorship (cf. any number of previous texts), opinion corridors, distortion of literary works, echo chambers, and other limits on thought and perception.
*Not necessarily one intended by him, but it still fits well.
Consider someone (for now, voluntarily) sitting in a cave, facing one sole direction, and attempting to understand the world based on the shadows that move on the cave wall: How is this supposed to work? What is there to lose by turning around and trying to catch a glimpse of what happens in another direction? A glimpse of the actual objects, instead of their shadows? A glimpse of the fire? A glimpse of the other walls? Why not climb out of the cave for an entirely new set of perspectives and information? How can the wall-facers even know the extent of their knowledge and ignorance, let alone actually learn something beyond the limited and often distorted information of the shadows?
Yet, exactly this wall-facing is what many people do—most, in the case of the strongly religious or ideological (notably Leftists/PC-ists/Feminists). Indeed, they are similar to the prisoners of Plato’s cave even in that they often react negatively towards those who have seen something else or suggest other truths (also see an excursion below). The condemnation of the unseen, without bothering to look at it first, is a recurring theme.
Now, I am not saying that those who do expose themselves to other walls, the world outside the cave, whatnot, will necessarily change their opinions—nor that they necessarily should (this will obviously depend on how well the old opinion fares compared to the alternatives; also cf. e.g. [1]). However, their opinions afterwards will be better—be it because they have replaced a flawed opinion with something closer to the truth, have modified the old opinion to be more nuanced, or have a greater legitimacy in believing the old opinion (should it have panned out, after all).
In contrast, the opinions of those who refuse to expose themselves to other ideas, perspectives, whatnot are next to worthless, being steeped in ignorance and too weakly tested—and those who deliberately constrain themselves prove their anti-intellectualism and misology, no matter what they might claim or believe themselves to be.
Then there are those (again, notably in the Leftist/PC/Feminist area) who deliberately try to chain others in front of that one wall. They try to censor or libel dissenters; force curricula to have an ideological character with a fixed world-view; re-write children’s literature; promote ignorance of and/or distort history and scientific findings; … My feelings for them are better not put into words.
I have no objections to people who fairly promote their own opinions (world-view, ideas, whatnot)—this is the equivalent of describing the advantages or contents of a particular other eye direction or the surface world. I also raise no objections to those who attack other opinions with reasoning, factual arguments, sound science, and other intellectually honest methods.
However, very many choose very different methods, aiming at, directly or indirectly, preventing their victims from exploring other sources of information, other perspectives, etc. This is not limited to actively preventing access to these information sources and whatnots; it also includes unfair attacks (e.g. most cases of ad hominem), “science” hell-bent on proving a certain point,* indoctrination of those too young or feeble-minded to think critically, and “channel flooding” through ensuring that all** channels that might reasonably be encountered by most people are filled with content supporting or pseudo-supporting the same idea. (These channels would then be akin to the wall, and the “flooders” to the movers of shadow-casting objects, who can further control and distort the perception of the world.)
*Legitimate scientists will often approach e.g. an experiment or a survey with the wish to see a certain conclusion; however, they try to build it fairly and are willing to adapt their opinions, should the results not be those hoped for. Illegitimate ones will often build the experiment/survey/whatnot so that the “right” result is bound to manifest, irrespective of the truth, and/or interpret the results in such a manner that they support their thesis (even when someone more objective would come to another conclusion), and/or report their findings in a highly misleading manner.
**For instance, school-children will be so disproportionately exposed to the channel “school” in many areas (including history and social sciences) that agenda pushing among teachers or text-book publishers can do severe damage; college is not as extreme, with adults having more options, but poses a similar danger. For instance, until the 1990s or even later, the single most important source of news for most people in e.g. Sweden was the few state-owned TV channels, and anyone controlling these also controlled a disproportionate part of the information flow. Websites are mostly a contrast, seeing that anyone can choose to simply visit a different set of websites; however, if e.g. Google and Facebook selectively promote or penalize certain contents based on opinion, this could be seen as a further example.
Excursion on Swedish anti-Feminists and similar groups:
A particularly strong parallel can be found in the journey and treatment of most Swedish anti-Feminists: These, like me, usually grew up with a considerable amount of Feminist indoctrination (through school, newspapers, political propaganda, etc.), eventually found that the Feminist world-view, even Feminist self-portrayal, did not pan out (by a mixture of critical thinking and exposure to alternate opinions/non-ideological science), and were then derided as ignoramuses in need of enlightenment by the Feminists when they tried to point out the many Feminist fallacies to others… This matches the scenario by Plato of a former wall-facer who is moved to the surface and considered deficient upon his return—for his failure to see the world in the same way as the remaining wall-facers.*
*A minor flaw in the analogy is Plato’s description of the returning explorers as being temporarily limited in sight until their eyes have adjusted to the darkness again. I am far from certain that I agree with this portion of his (metaphorical) discussion in any context, and it is likely intended to apply only in the context of education and politicians (or possibly the ideal vs. the mundane, or similar) to begin with.
Excursion on Plato:
At least the linked-to piece (covering Book VII of his “Republic”) is yet another example of writing that is poor, long-winded, and has too low an information density. The ever-repeating, mindless agreement by Glaucon is a particular nuisance—not merely introducing noise, but also being outright annoying after a while.* A much better approach to the dialogue format (if such a format is used at all) would be to have Glaucon go into opposition and force clarifications and/or stronger arguments. As is, such opportunities are limited to variations of him not understanding or understanding incorrectly—and even those drown in the constant amens.
*In a twist, this type of agreement is contrary to the spirit of the above. It is also a long way from the reputation of the Socratic method… (The rest of the “Republic” is on my short-term todo list, but I have no other direct experiences with Plato’s writings.)
A few thoughts around Christmas and myself
Some semi-random thoughts that have gone through my head the last few days:
- Christmas is one of the rare cases where I can feel a certain degree of loneliness: Normally, I am perfectly happy on my own;* however, the mixture of the family-centric holiday and a fair bit of nostalgia (cf. the next item) puts matters of family on my mind. Even so, it is only partially an actual (fleeting) wish for a family—the bigger part is a feeling of being too different and/or of having (in some sense) failed at an aspect of life.
*For as long as I can remember, I have preferred books and TV (and later e.g. the Internet) to people.
This is radically different from my thinking on regular days, where I tend to view the prospect of children and the associated responsibilities and problems with abhorrence, while being ambivalent* or even negative towards the idea of a wife. The one prolonged exception to this was a period of a few months after my mother’s death, when I seriously contemplated looking for a wife, likely as a reaction to the shrinking of my “old” family. (I postponed this until my sabbatical, but the urge was long over by the time that I finally was able to begin this long-delayed sabbatical.)
*I like the idea in principle and would be very happy, should “Miss Right” stumble into my arms; however, my experiences with women, current divorce rates, whatnot, make me seriously doubt my chances of finding someone with a sufficient long-term compatibility that we will both be happy for the duration—not just a few weeks, months, or years. Most women turn out to be obviously incompatible quite soon. (Going into the why would double the length of this text, but I stress that compatibility is not an absolute value judgment—it is a statement about how well two or more entities suit each other.)
As for “being too different” (etc.): This is something that I normally consider irrational—I live by my own standards, not those of others. However, when I am exposed to how large the differences are, as with e.g. Christmas, it can be hard to not feel “off”.
More generally, Christmas appears to bring out a similar contrast in life or a feeling of “being on the outside looking in” among people with no or little family. I can only imagine how it is for those who actually are lonely to begin with…
- Nostalgia is by its nature bitter-sweet, being a longing for something lost and (usually) unrecoverable. Mostly, for me, the positive parts outweigh the negative, be it through the pleasure of recollection or through the opportunity to learn something about myself. Christmas appears to be different, because my main Christmas memories (cf. a text from last Christmas) are so far back that I was a radically different person (e.g. at age four, while I am now closing in on forty-four), and I am not just faced with my-life-as-it-used-to-be but with myself-as-I-used-to-be. While I would not wish to go back and lose what I am today, I do have a strong feeling of loss, as if I had had a little brother who died or as if I somehow could look back into a past life* with the knowledge that this past incarnation was dead.
*I do not actually believe in past lives, but the idea is quite useful in this context.
This feeling also makes me re-evaluate my take on Time Lords (a potentially good further illustration): I watched a lot of “Doctor Who” a few years ago and was particularly fascinated with the idea of multiple (recollected) lives—imagine the understanding and wisdom that could be gained through having lived a dozen-or-so lives, all with a different personality, preferences, skills, experiences, … By now, I fear that the risk of pain would outweigh the positives—imagine having all that nostalgia and “self-death”.*
*To which at least the extrovert must add the deaths of countless friends, companions, lovers, …, that simply had a shorter life-span—an aspect sometimes mentioned on the show.
- I tend to view holidays as “nothing special”*. Indeed, I have ignored almost all holidays since I became an adult—no decorations, no special food, no special activities, no whatnot. My everyday life is good enough as it is, so what would be the point of going through the effort? Christmas and/or New Year’s** is a considerable exception. It is true that I go through less effort than many others;*** however, what I lack in effort must be weighed against the thinking that I usually end up doing. (Also see a much older text.)
*And did just yesterday claim in an email that Christmas did not feel very special this year, with all the other free time that I had through my sabbatical—it appears that I was wrong.
**After my parents’ divorce, I usually celebrated Christmas with my mother and New Year’s with my father, which caused both holidays to take on a Christmas character.
***I put up very little in the way of decorations (or, like this year, never get around to them at all), have special food only insofar as it can be bought ready-made, do not go to church, do not go caroling (not that a Swede would), etc.
- When Christmas, other holidays, vacation periods, sometimes even weekends come, most people appear to stop writing and reading blogs, participating in online forums, and similar. This is highly surprising to me: They have the time and energy to do such things on work-days, but when they finally have a bit of spare time and really* should take the opportunity to increase their activities, well, then they decrease them or cease them altogether… Some might, obviously, be stuck somewhere without an Internet connection, but this is bound to be a minority. Some might be more swamped than usual, but how much extra stress does it take to outweigh not having to work?** Ditto those who want to prioritize family—push the freed work and commute hours onto the family and there will still be plenty of time to go around.
*I assume that most of these enjoy such activities. Those, presumably a small minority, who for some reason force themselves are obviously given a pass. (Then again, if they have to force themselves, would it not be a better strategy to keep the post-work evenings free and reserve such tasks for weekends and vacations?)
**One sub-category consists of those who do not have extra days free and just have the extra stress. However, this is again likely to be a minority, and does not explain the phenomenon during weekends, vacation periods, and holidays less demanding than Christmas.
Each to his own, but, even after close to twenty-five years on the Internet, this still puzzles me.
The effects of our base-line on perception / Follow-up: A few thoughts on traditions and Christmas
Traditions [1] were the topic for a Christmas text last year. In the almost exactly one year since then, I have again and again noted various overlaps with the sub-topic of our perception of normality. More specifically, it seems that there is a point of “normality”, where something becomes so familiar that we do not notice or reflect upon it, or where we experience it highly differently from less familiar phenomena and/or from how others experience the same phenomenon.
A few examples:
- As children, I and my sister often stayed for prolonged times at our maternal grand-mother’s. She declined many wishes for pasta and rice with the argument that “we already had that once this week”—but had no qualms about using boiled* potatoes as the “staple” five to seven times a week. In all likelihood, she genuinely** did not perceive the paradox in this argumentation, being so used to potatoes that they were a standard part of any meal***—just like the glass of milk.
*Mashed or fried potatoes happened on occasion; I am not certain whether she ever served French fries.
**To this it should be noted that she was not very bright—others might have been more insightful even in the face of ingrained eating habits. Unfortunately, back then, I took it to be just another case of dishonest adult “argumentation”.
***She was born in 1924 and grew up with a very different diet from even what I (1975) did, let alone what some born today will. Indeed, left to her own devices, deviations from boiled potatoes were more likely to have been e.g. kåldolmar (cabbage rolls) or rotmos (a rutabaga mash with some admixture of potatoes(!) and carrots) than rice or pasta.
Consider similarly my own caffeine habits*: I drink large amounts of black coffee—no sugar, no milk, no cream, … This despite originally not liking the taste. When it comes to tea, I have tried repeatedly to use it as a substitute, but within a week or two of a cup a day, the experiment always ends, because I do not like the taste.** I have used e.g. Nespresso and Dolce Gusto machines, but eventually grew tired of the taste and returned to drip-brews. Similarly, when I ordered coffee in restaurants, I used to take the opportunity to have an espresso or a cappuccino—today, I almost invariably order a “regular” coffee. What is the difference, especially since I did not originally enjoy coffee? Simply this: I have drunk so much of it that it has become a taste norm. Tea does not have that benefit and other variations of coffee are implicitly measured as deviations from that norm. The latter might even taste better in the short term, but then I simply “grow tired” of the taste.
*Also see parts of [1] and of a text on prices.
**In fairness to tea: I have so far always used tea bags—some claim that they are a poor substitute for tea leaves.
This item has some overlap with (but is not identical to) the concept of “an acquired taste”.
- Why does boy-meets-girl feel less hackneyed than childhood-friends-fall-in-love? (Cf. an excursion in [2].) Well, the former is so common that it does not register in the same way as the latter—despite the paradox. Or take teenage-girl-and-much-much-older-vampire-fall-in-love: Only a very small minority of all works of fiction has this theme, and it would likely amount to a minority even of the vampire genre. Still, it feels so hackneyed that my reaction typically is “not this shit AGAIN—I will watch something else”. A higher degree of rarity can even increase the perceived hackneyedness, because the concept registers more strongly.* Beyond a certain rarity limit, the recognition factor might be so large that the automatic reaction is not “hackneyed” but “plagiarized”…
*However, another partial explanation can be that a theme has still not been explored enough, leaving works using a certain concept too similar. For instance, the overall vampire genre is much more diverse today than in the hey-days of Christopher Lee, because so many new variations of the theme have been tried over time—“vampire movie” does no longer automatically imply scary castles, big capes, the surreptitious biting of sleeping maidens, or similar.
- Virtually every generation complains about the music of the following generations. To some degree this can be due to actual falling quality (e.g. through increased commercialization or a shift of focus from music-on-the-radio to exotic-dancing-on-TV) or a greater filtering of old music (where only the great hits have survived); however, a major part is the base-line that we are used to (likely coupled with nostalgia). Notably, the hit music of a certain period appears to fall mostly into just several fairly specific genres, with a great internal similarity in “sound”. Those who grow up* with a certain sound will tend to see it as a norm, be more likely to be estranged by newer genres and be more able to differentiate within and appreciate the old genres. (Hence complaints like “it all sounds the same”.)
*In my impression, most people listen to more music and more intensely in their youth than at higher ages, and they might be more easily malleable to boot (be it for biological reasons or because the prior exposure has been lower). However, I suspect that amount of exposure is more important than age.
A similar effect is almost certainly present between contemporaneous genres that differ considerably.
- As a small child, I somehow got into a discussion with my parents as to why the clock on the kitchen wall was not audibly ticking. They claimed that it was, but I could not hear anything. On their insistence, I spent a short period listening intently—and there it was! I was simply so used to the sound that it had not registered with me, until I deliberately tried to hear it…
In an interesting contrast, I often found the antique wall-clocks at both my father’s and my maternal grand-mother’s so annoying that I used to stop them—in turn, slightly annoying my respective hosts. This might at least partially have been due to my base-line being “tickless”; however, they were also much louder than the (modern) kitchen-clock, and might also have had a more irregular or prolonged sound. (The antiques used an entirely mechanical, crude-by-modern-standards clockwork with pendulums and whatnots; the kitchen-clock had a modern clockwork, ran on a battery, and likely used a balance wheel.)
As an aside, this points to the risk that isolating oneself from disturbances can lead to an increased sensitivity to the disturbances that do occur, while increased exposure can bring greater tolerance—a dilemma that I have long struggled with as someone sensitive to noise. An extreme example is present in the movie “The Accountant”, in which the autistic protagonist deliberately exposes himself to very loud noises, strobing lights, and physical pain during shorter intervals, apparently trying to increase his tolerance. (I caution that said movie did not strike me as overly realistic.)
- When I lived in Sweden, German seemed a fairly ugly language with too strong (in some sense) pronunciations of many sounds (including “r” and “s”). After twenty years in Germany, it sounds just fine, while I am often struck by Swedish as bland and lacking in character. Back then, I heard how German differed from Swedish; today, I hear how Swedish differs from German.
English is somewhere in between and has not struck me in the same way. However, it is notable that TV and movies have left me with a U.S. base-line, in that I mostly (mis-)register U.S. English as “without an accent”,* while e.g. any version of British English comes across as British**. This is the odder, since I actually consider (some versions of) British English more pleasant to the ear and have a tendency to drift in the “English English” direction, or even towards amateurish pseudo-RP, on those rare occasions that I actually speak English.
*But many versions of U.S. English stand out as non-standard, including the heavy Southern ones.
**Often with a more specific sub-classification, e.g. “standard”, Cockney, Irish, Scottish; in some cases, as something that I recognize as a specific accent but am unable to place geographically. (The same can happen with U.S. dialects, but is much rarer—possibly, because British English is more diverse.)
Outside of examples like the above, there are at least two areas that might be at least partially relevant and/or over-lapping: Firstly, opinion corridors and similar phenomena. Secondly, various physical phenomena, e.g. drug resistance, specificity of training, or how the human body reacts to cold: Apparently, Eskimos “in the wild” have the ability to work without gloves in freezing temperatures for prolonged times without ill-effects, pain, whatnot—but a few years in “civilization” make them lose this ability. Allegedly, Tierra del Fuego natives have (had) the ability to sleep almost naked in free air at low (but not freezing) temperatures, while the typical Westerner can feel cold at a little below room temperature without a duvet. I have myself witnessed two or three Westerners who walk around in t-shirt and shorts all year round (in Sweden and/or Germany—not Florida), at least one of whom made the papers for this habit—he claimed that the body adapts* if one can push through the early discomfort.
*The exact nature of these adaptations is beyond my current knowledge, but at least some of them likely relate to how fast the body switches from a low-insulation to a high-insulation state and how strong the insulation becomes. That this is trainable to some degree can be easily verified through only taking cold showers for a few weeks and noting how strongly the discomfort is reduced in that time frame. An increase in “brown fat” likely also plays a part.
Distortion of literary works / Enid Blyton
While I have always been strongly opposed to censorship, political correctness, intellectual dishonesty, (mis-)editing of the words of others, and similar, there is a particular area that troubles me with an eye on my own contemplations of becoming an author of fiction—presenting distortions of the works of dead authors as if they were the actual works.
For instance, I recently stumbled over the Wikipedia page on “The Famous Five”, and was distraught to read:
In modern reprints, George still wants to be a boy, but the statement that her short hair makes her look like a boy has been removed as it is now considered offensive to assume that girls need long hair to be considered feminine. Anne’s statement that boys cannot wear pretty dresses or like girl’s dolls has been taken out. Julian and Dick now help the girls with cleaning the house and washing dishes.
This extends the series of children’s books* that have been distorted in an irrational and destructive manner, contemptuous of both the author and the readers. (To boot, the claim “now considered” is a further inexcusable lack of encyclopedic standards on the part of Wikipedia. A correct claim would be that e.g., depending on what applies, the “censors** considered it offensive” or “some population groups considered it offensive”.) Not only are such distortions despicable in general, but here the reasons appear to be particularly weak. I note, concerning the hair, that in such young people it can be the only physical differentiation and that judgments like “looks like a boy” and “looks like a girl” have to be measured against the time in which they occurred. To boot, George almost*** certainly otherwise dressed and whatnot as a boy, making the hair just one piece of a puzzle. Removing references to the hair thoroughly distorts the original intentions. Similarly, removing Anne’s statement distorts her character and misrepresents the times. This is especially bad, as it removes the contrast between the boyish/unconventional George and the girly/traditional Anne, weakening the two characters and the “group dynamic”. That the boys help with house-work again misrepresents the times and risks a character distortion—how do we know that they would have helped, had they lived in today’s world? Worse: I strongly suspect that these changes, especially the last, are not so much a matter of wanting to avoid offense as of deliberately influencing modern readers to hold a certain set of values—an utterly inexcusable reason for an already inexcusable act.
*Other examples include “Huckleberry Finn”, “Doctor Dolittle”, and the Swedish “Ture Sventon” and “Pippi Långstrump [Long-Stocking]”—among the at least dozen cases I have heard of. (The true scope of the problem is likely orders of magnitude greater and afflicting many more languages.)
**I call a spade a spade—these people are no better, arguably worse, than regular censors. (To “call a spade a spade” is another example of how unjustified censorship is common: Here, “spade” refers to a digging implement in a saying that goes back to ancient Greece. Still, there are people who consider it offensive because the same sequence of letters, much more rarely, has been used to refer to Black people…)
***It has been a very long time since I read one of the books, and there is some minor room for a combination of character being misremembered and contents not matching what would be reasonable based on first principles.
I note that the motivations given in other contexts tend to be very poor. For instance, Swedish censorship and distortion have been directed at the word “neger” as being offensive—however, unlike the English “nigger”, “neger” was never offensive. This changed at some point in the 1980s or 1990s when the PC movement presumed to declare it offensive. This with no reasonable motivation and likely based on a mindless analogy with the English “nigger”—if the one is offensive, then so must be the other…
A particularly perfidious version, inexcusable beyond the inexcusable, is the claim that certain changes were made because “we” are sure that this is what the (long dead) author would have wanted—a presumption so moronic and/or dishonest that I feel like punching the speaker in the face.
Such changes, worthy of the Ministry of Truth, are a crime against the author, who sees his work distorted, and a crime against the reader, who is refused the opportunity to read the original work and whose view of the world of old is potentially distorted. Indeed, for a member of the politically correct who actually had a brain, would it not make more sense to let the children see that the world was different in the past and draw their own conclusions? Would it not be better that a girl noted that Anne did house-work and that Julian did not—and questioned the “why”? To look at Anne and George and ask who she would rather be? For the girl-who-wants-to-be-boy* (or vice versa) to look at George and how she had the courage to go against convention even back then? Etc.
*However, I am uncertain to what degree George’s wishes were comparable to some modern cases and to what degree she just wished for a more boyish life-style, considered girly-girls silly, whatnot. Not only is my contact with the books too far back, but I doubt that Blyton would have been explicit on the topic (if it even occurred to her).
As for myself, I have not yet made up my mind on whether to become an author of fiction, and chances are that I would never have a sufficient and enduring popularity that such concerns would actually be relevant. However, I state now and for the record that I absolutely and categorically forbid such distortions of any of my works, past, present, or future, and irrespective of type. If I am alive, I will exercise legal options; if I am dead, I will come back to haunt the culprits. The latter especially if someone presumes to try that utterly inexcusable excuse “we know that this is what he would have wanted”—you now know that it is not!
Excursion on other distortions:
Unfortunately, the general problem of distortion is not limited to e.g. censorship and children’s literature. Notably, newer German editions of older texts often come with the claim that the orthography has been “behutsam angepasst” (“cautiously adapted”) or similar, in order to match modern German—and this even for works that were written as late as the 19th century… This might be less harmful than the above, but still brings risks and disadvantages—and most changes are pointless in that the average reader could take the old spelling in stride.* (A better solution would be to add a few corresponding notes. For truly extreme examples, a parallel original and “translated” text is an option.) For instance, one reason to read older books is to get a feel for the historical language, which is no longer possible. For instance, any such change risks an unintended distortion.** For instance, it is possible that the author deliberately chose a more traditional spelling over a more new-fangled one, in which case the alteration is in direct contradiction to his will.
*A notable example is the common use of “th” in many cases where today “t” is used, e.g. “Thal” vs “Tal” (“valley”). Consider e.g. the extinct Neanderthals vs. the valley Neandertal—at the time of their discovery, the valley used the “th” spelling, which is preserved in the anthropological name, while it uses the “t” spelling today. (And, yes, Neanderthal is correctly pronounced with a “t” sound—not with a lisp.)
**E.g. because two words that used to be spelled (slightly) differently are now spelled the same or vice versa, because some rhyme or play on words does no longer work, or because some spelling choices might have been very personal. (The latter especially in times when the orthography was less standardized than today.) An interesting example is the disappearance of older words, word cases, whatnot. Consider e.g. a modernized version of Shakespeare that replaces “thou” with “you”, etc.: This would lose a lot of nuance as to who is in what relationship to/with someone else and how the relationship might change over time.
The problem is not necessarily limited to dead authors either (but is particularly perfidious there, because they cannot defend themselves). Translations are a horrifying source of problems, at least in Germany, where I have encountered many efforts so awful that they should have led to a summary firing. The German translations of Terry Pratchett’s books have often been disastrous (cf. portions of a text on Pratchett’s death)—and do not get me started on German movie translations… While this is often the result of mere incompetence, e.g. ignorance of what a certain word/phrase/reference/… means,* it can also be deliberate. Notably, there is a school of translators who attempt to hide the fact that a work actually is a translation at any and all cost… (Including rather losing a play on words than giving an explanation of it, or rather re-writing cultural references to some highly approximate local equivalent.) This is an anti-intellectualism and dishonesty that is truly deplorable.
Excursion on Blyton:
Blyton might have mass-produced works with little literary value and might, by reputation, actually have approved in exchange for a bit of extra money. None of that matters: The editors have no such actual approval; the distorting effect for the readers remains (cf. above); the works have, irrespective of literary value, a great following and have been loved by millions (implying that any change is likelier to do damage than to do good); and, above all, if this is accepted for one author, what protects other authors? Indeed, even “To Kill a Mockingbird”, widely considered a work of considerable literary accomplishment, has been targeted by the PC crowd. It is important that not one inch be given to these people.
Excursion on tomboys and Feminists:
A peculiarity when it comes to e.g. Feminists and tomboys vs. girly-girls is that “tomboy” is often described as some type of insult or framed in a context of boys/men looking down on the tomboys who “should” be proper girly-girls instead. This repeats a pattern of ignorance and over-generalization about what men are actually like and what they actually think about women—I very much preferred George to Anne at that age, I have preferred girls/women with boyish/mannish interests later in life, and the same applies to a very sizable portion, likely a majority, of the male population. Yes, when it comes to sex and romance, there are many cases where a certain femininity in behavior and style can be attractive; no, when it comes to playing, socializing, whatnot, the tomboy and her adult successor tend to do better. For that matter, too much femininity and/or stereotypically female behaviors are a turn-off in romance too. (Too much make-up, too many shoes, too much emotionality, etc.—the likes of Carrie Bradshaw are not a good ideal.) Of course, even a boyish girl/woman can be quite physically attractive, aesthetically pleasing, and even feminine—this is not an either–or area. (Consider e.g. Evangeline Lilly in “Lost” or Keira Knightley in “Bend it like Beckham”*.) When a man says “tomboy”, it is more likely to be a compliment than an insult.
*Incidentally, a good example of German mistranslations: It was renamed to the alleged English title “Kick [sic!] it like Beckham”… Also a good, if fictional, example of how men tend to view tomboys—compare the positions of the two fathers and the two mothers towards the respective daughters and their “boyish” interests.
A few thoughts on the display of emotions (and similar topics)
There is a lot of prejudice against e.g. the introverted around the display (and presence) of emotions, boring behavior, and similar. Below I discuss some issues relating to these topics from my own point of view and with a partial eye on the Tall Dancer Phenomenon from my earliest writings.
When it comes to “emoting” (in a very wide sense and for lack of a better word), there are many things that I do not do, simply because they are too affected and/or manipulative in my eyes. For instance, I talk relatively monotonously and have toyed with the idea, notably after watching Laurence Olivier in “Rebecca”, of going for a radical change in order to have a better effect on the people around me. However, doing so would be a major affectation and it would be solely for manipulative reasons—even if I stopped well short of Olivier.* Indeed, similar affectations in others have often struck me negatively and reduced their success in interactions with me. For instance, some women looking for a favor go at it with such exaggerated smiles, voices, and gestures that it makes me unwilling to perform that favor… They would be much better off just making a neutral request, accompanied by a factual explanation and, for a bigger request, an offer of a something-for-something. Similarly, a woman who starts with five minutes of flirting** as ground-work before a request is unlikely to do better with me than if she had jumped straight to the point. Smiling politicians, executives, sales people, …, leave me cold, because the smile will almost always be calculated and not reflect something worth-while—and I am naturally*** loath to smile at others as a (deliberate) sign of friendliness, because it seems an affectation even when I am friendly towards them.
*Which I almost certainly would have to: Olivier had many years of training behind him, is highly unlikely to have spoken in that manner without the extensive training, and I suspect that even he spoke differently in private. (If in doubt, because he speaks differently in some other films.)
**If I even recognized it as flirting: Today, I almost certainly would, but in my younger years I could be quite oblivious to flirting, romantic hints and probes, and similar.
***My thoughts (but, so far, not my behavior) have changed over time compared to my natural state: Originally, I saw a smile as something that should just happen, making any “deliberate” smile an affectation. Today, I also see a smile as a legitimate means of deliberate communication, just like a hand-shake or a “hello”, while reserving my disdain for more manipulative uses.
However, even spontaneous and non-manipulative emoting can be off-putting to me, when too exaggerated or too undisciplined*. I am not saying that we should all walk around with poker faces; however, some degree of self-control can benefit those around us and deliberate exaggerations (as seen with some women and many children) are really unbecoming. Such negatives in others have also influenced my own relative reluctance to emote.
*Notwithstanding the partial hypocrisy: I have on some occasions emoted so strongly while working with e.g. WebSphere that near-by colleagues complained. (WebSphere is one of the most infuriating and frustrating software products I have ever encountered. While the colleagues complained, they also sympathized—having to work with it themselves on a daily basis.)
Looking in the other direction, reading emotions, the situation is similarly often a matter of differing preferences and norms. For instance, if a low and a high emoter do not catch each other’s feelings, it might well be that the high emoter over-looks the more subtle* reactions of the low emoter, while the low emoter does not catch a true emotion from the high emoter due to all the “noise”** surrounding it. Similarly, interpretation is hard without knowing the baseline of the counter-part, which is necessary to judge how much noise is usually present and how strong reactions tend to be—without it, we can have problems like a low emoter interpreting something in the counter-part as an emotion that is not one, or over-estimating the strength of an emotion that is there.
*The difference is not necessarily one of emoting vs. not emoting. When actual emotions are present (not just faked), the difference is more likely to be one of strong emoting vs. weak emoting.
**If someone “cries wolf” when no wolf is there, the alarm for an actual wolf might be misinterpreted. From another angle, I have often found the “logical” members of the “Star Trek” universe to be more expressive than the regular ones. Data, e.g., has a baseline that involves very few and small facial expressions—which makes the expressions that do occur the easier to recognize and the more significant. His colleagues move their faces (raise their voices, whatnot) at the drop of a hat—and how are we to know when it was a hat that dropped, and when a bomb?
Then again, if a low emoter does not react (or appear to react) to the emotions of a high emoter, it is not a given that he has missed them. It could also be that he deliberately ignores them (e.g. out of diplomacy, because he sees them as unwarranted or none of his business, awaiting an explicit verbal clarification*, etc.) or that he has reacted while the counter-part failed to notice…
*E.g. to find out whether the counter-part is looking for some particular reaction, or to establish whether he is just venting, looking for sympathy, or wants advice.
Of course, some people (including yours truly) have or have had more genuine problems with reading emotions.* By now, I am probably better than most, courtesy of the implicit training from watching so many movies and TV series,** but I trailed my age bracket until at least my late twenties and was disastrously behind during my school years. The effect of training is not to be underestimated and the introverted (even outside sub-groups with complications like autism) are at a disadvantage, because they, unsurprisingly, tend to spend less time socializing.
*More generally, the great degree of arbitrariness in human behavior and the way that most people ignore reason in favor of emotion often left me stumped in interactions. Women (in general) and women in connection with romance (in particular) were especially problematic. Cf. an earlier footnote.
**Starting with a growing awareness through cartoons, where the emotions are usually portrayed very strongly and have an obvious connection to events. As this awareness was coupled with a more grown-up mind, these emotions became easier to detect, increasingly subtle signs could be read, and increasingly nuanced emotions differentiated. (Of course, other sources of awareness than cartoons featured later in life, including own experiences and verbal descriptions.) That actors tend to exaggerate expressions is an advantage during the early learning stages, but can be a disadvantage later on. Combined with the fact that they are acting, this implies that real-life observations are necessary too.
Excursion on smiles and changing times:
An interesting development is that people in photographs (and paintings) were a lot less likely to smile in the past than today. In the case of politicians, statesmen, whatnot, the difference is particularly large—many past depictions of politicians show someone (almost certainly deliberately) grim- or fierce-looking, while bright, artificial smiles are par for the course today. Presumably, a modern politician wants to send a message of friendliness, while those of yore went for e.g. strength, domination, and the ability to face an assault by a foreign power.
Excursion on emotions vs. emoting:
A common misunderstanding is that people with low emoting are also low on emotion. In actuality, emotions appear to be more-or-less evenly divided between high and low emoters, with the groups simply displaying these emotions differently. I suspect that I, myself, am a fair bit above the male average when it comes to emotional intensity, e.g. in that I tend to feel more intense happiness, grow angrier at injustices, be likelier* to cry during a movie, whatnot. I would even speculate that many emotional people deliberately suppress their emotiveness, somewhat like the Vulcans of “Star Trek”, who do not have in-born emotional control—on the contrary, they are naturally highly emotional and have developed techniques to keep their emotions under control.** Similarly, it is allegedly common for functioning alcoholics to have a facade that leaves other people believing that they have an under-average interest in alcohol.
*Note that “likelier” does not imply “likely”: Most movies with scenes that are intended to have this effect, and often have it on women, are simply too poorly made to actually hit home (e.g. because the scene is too cheesy/exaggerated or because sufficient emotional investment in the characters has not been created).
**However, the Vulcans attempt to control the actual emotions, not just their expression.
Excursion on “mirror neurons”:
A partial explanation for some of the above could lie in “mirror neurons” (or an equivalent mechanism) that trigger some degree of emotional or behavioral “mirroring”, e.g. in that many people reflexively smile back when smiled at. I only very rarely have such an impulse, and tend to just note that “someone is smiling at me”—often with such absentmindedness or (with strangers) such lack of interest that the question of whether I should smile back only occurs to me after the fact… I can even recall a few cases of extremely shy- and insecure-appearing* girls/women slowly starting to smile at me, stopping halfway through, and then sadly fading back to not smiling, when I did not reciprocate. Using smiles as an indicator of e.g. friendliness, potential romantic interest, whatnot, is very prone to error when we do not know how the other party tends to behave.**
*Based on e.g. posture, downward-turned eyes, and similar. To some degree the reasoning could be circular, however, since the type of failed smile is a part of the reason for the classification—confident women tend to jump straight to a full smile without “interactively” checking for feedback.
**Also for reasons like the many fake smiles made by manipulators and the artificially friendly.
Excursion on affectation and manipulation:
There are some things that I actually do that could be considered affectations or manipulations. For instance, I do not let my facial hair grow wild, and I keep it groomed for reasons of aesthetics*. The differentiation from the above cases is not entirely objective and I do not rule out that I will change my mind about e.g. altering my speech patterns—the above is not a discussion of why something would be morally** wrong to do but of why I (and likely many others) have not done it. However, there are at least three differences between the facial-hair and speaking-like-Olivier examples: I groom my facial hair at least partially for my own direct benefit (I have to look at that face in the mirror), not just for the indirect benefit through others. The type of grooming I do is sufficiently common that there is nothing remarkable about it (in fact, people who do not groom at all are more likely to be seen as following an affectation, even if unjustly). There has been no major conscious decision to change anything, but a gradual change of habits until I found something that required little effort while still pleasing me.
*As opposed to practical reasons, which would have neutralized the accusation. (Chances are that if I did not groom, I would eventually be caught by practical reasons, e.g. when my beard landed in my soup; however, such concerns have not explicitly featured in my decision making so far.)
**However, some of the above examples arguably are, e.g. trying to flirt oneself to office favors.
Note that it is conceivable that similar factors played in with Olivier, himself, and it is not a given that his speech seemed remarkably affected to him or his peers (at least while performing)—much unlike a software developer who makes a conscious decision to emulate him in the office. He might simply have started with regular stage English and continually improved himself with an eye on practical stage effect. (Note that while software developers often have cause to speak, speaking is not the core part of the job, and what is said should* be far more important than the delivery.)
*Unfortunately, this is not always the case; however, the impact is still far smaller than for an actor, even when the ideal is not reached.
The status of practical learners
In my earlier days in the Blogosphere, one of my comments* was answered with “So, you are a practical learner!”**. Knowing “practical learner” mostly as a euphemism for those with some limited practical talent and a complete lack of intellectual accomplishment, I almost choked at the perceived insult and condescension.
*This was too long ago for me to remember the context and details.
**To paraphrase my main take-away. Here too, I do not remember the details, but the actual answer was likely a bit longer, and probably not intended to be insulting.
Since then, I have revised my opinion on practical learning considerably. For one thing, I have over the years increased my proportion of practical learning, e.g. in that I have so often found claims by others to be faulty that I now prefer to do my own informal experimentation/trial-and-error/whatnot (not just my own thinking). For another, practical learning (in the literal sense, which I will use throughout below) plays in well with my opinions on learning in general:
There are, somewhat over-simplified, two types of learners: Those who just gather knowledge provided by others and those who gain an understanding from the knowledge of others and/or create new knowledge of their own. A practical learner can to some degree be either; however, the weakest aside, the latter will likely dominate. Examples include anyone who observed an event and drew conclusions about how this event could be reproduced or avoided, what the positive and negative effects were, how the event could be utilized, … Consider a stone-age man who accidentally hits a piece of flint so that it can be used as a cutting implement, realizes that it is a potential cutting implement, tries to create new cutting implements by hitting other pieces of flint, and refines his technique based on further experiences—a practical learner who has done something most of his peers did not do and which helps the group to be more successful. Or consider a software developer who tries a certain approach to solve a problem, sees an unforeseen complication when the code is run, and modifies his approach thoughtfully* the next time around. I certainly suspect that many of the great inventors and researchers have drawn considerably on an aptitude for practical learning.
*Not to be confused with the “worst practice” of making random changes in the code until it appears to be running as intended.
Contrast this with someone who just mindlessly absorbs the contents of books, who can apply the algorithm of long division (see excursion), who knows in what year each of a thousand events took place, who has absorbed but not understood the deep thoughts of others, … (In turn, not to be confused with the mindful reader. Cf. e.g. [1].)
Of course, the border between the practical learner and, e.g., the theorist can be hard to find—is our stone-age man still acting as a practical learner if he takes a thirty-minute break to just think his options through, during which he does not even touch a piece of flint? This, however, is only natural with an eye on how deeper learning works: Deeper learning, with an understanding of the matter involved, always comes from within, from own thought. External influences, be they practical observations, books read, statements heard by others, …, are food for thought—they are not thought it self. The source of this food matters less than what we do with the food. It is true that some sources provide more, more nourishing, or more easily digested food than others, but ultimately it is up to us to do the digesting.
In all fairness, it is likely true that the set of practical learners will contain a comparatively large sub-set of the not-very-bright (including the stereotypical “shop students”), compared to e.g. those who actually learn from books. However, there is no true reason to believe that the sub-set of the very bright would be smaller, even if those might engage in practical learning in other areas (e.g. experimental physics instead of auto mechanics)—and worth-while thinkers will almost certainly have several sources of food for their thoughts. Moreover, there are plenty of readers, likely an outright majority, who are not all that bright either—they read but do not truly learn. Similarly, many or most college* graduates have not truly learned—they have internalized some (possibly, a very considerable) amount of facts, methods, whatnots, but have failed to gain an understanding, cannot draw their own conclusions, are bad at applying what they have internalized, etc.
*School and, increasingly, higher education have a strong tendency to favor the wrong type of learner. Too often, the mindless absorption is rewarded during tests, while understanding brings little or no additional benefit. In some cases, critical thought can be positively harmful to success, e.g. in fields like gender-studies.
Excursion on long-division:
I have never mastered it: In school it was presented as a set of mechanical steps, with no attempt to explain the “why”; I imitated these steps a few times to create the impression that I knew them. After that, I just winged the divisions that came up on tests (usually as comparatively easy steps within a longer calculation). In adult life, the divisions that I encounter are either so trivial that I can easily do them in my head (say, 231/11=21), so complicated that I would use a calculator* anyway, or from a context where I only need an approximate** value to begin with. In the unlikely event that I really need an algorithm, I understand division, the decimal system, etc. well enough that I could create one myself (cf. the sketch below)—which is far more valuable than memorizing a set of steps.
*I do not need long-division to solve e.g. 2319523/2344 using pen and paper, but a calculator removes an entirely unnecessary risk of an accidental error and is usually faster—be it compared to long-division or to an improvised calculation. This especially as the very few such calculations that are needed tend to carry a legal relevance, e.g. the extraction, for my tax declaration, of the VAT from an amount paid that includes VAT.
**That 2319523/2344 is a little short of 1000 will be enough in many contexts.
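To illustrate the claim that such an algorithm can be re-created from an understanding of division and the decimal system, here is a minimal sketch in Python; it is purely illustrative (the function name and the exact form are mine, not something taught in school and not something I have had practical need for):

def long_division(dividend, divisor):
    # Re-create the school algorithm from first principles: go through the
    # dividend digit by digit, carrying the remainder into the next digit.
    assert dividend >= 0 and divisor > 0
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(str(remainder // divisor))
        remainder %= divisor
    return int("".join(quotient_digits)), remainder

print(long_division(2319523, 2344))  # (989, 1307): a little short of 1000, as per the second footnote

(The VAT case from the first footnote is similarly just basic arithmetic: with e.g. the German standard rate of 19 %, the net amount is the gross amount divided by 1.19, and the VAT is the difference between the two.)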
Excursion on men vs. women:
While the problem of lacking understanding (etc.) is quite bad among men, it appears to be considerably worse among women (and is very often combined with the knee-jerk classification of everyone who graduated from college as intelligent). This could turn out to be a major future problem, if the trend of giving women artificial preference in e.g. hiring/promotion and politics continues.
For an example, consider the relative likelihood of a homeopathic physician* being a man vs. being a woman.
*As opposed to an uneducated user who might be forgiven for not seeing through the obvious quackery that homeopathy is. (But women appear to dominate there too.)
Remark on double posts:
Subscribers might have seen two incomplete postings of the above contents. This was caused by my failing to close the “tags” declaration for WordPress within the HTML code.
Sound-bite communications and too much brevity
While I have recently criticized older writers for being overly wordy, some modern forms of writing and whatnot are a greater evil: Those that fail to inform or, worse, actually mis-inform, by reducing communication to tidbits without context.
Consider e.g. a typical modern documentary: Someone with some connection to the topic is allowed to say a single sentence, taken entirely out of the original context*. There is a cut to the next person with some connection to the topic, who is also allowed to say a single sentence. There is a cut [etc., etc.] This goes on for five or ten minutes—and what happens after that I frankly do not know, because I have already turned the documentary off… To make matters worse, the connection is not necessarily very strong and the single sentences are often uninformative irrespective of context. At extremes, a documentary about a film-maker can start off with single sentences by people who at some point made a single film with him—or merely share the same field. (But happen to be quite famous…) The sentences themselves can then amount to “He is the greatest!”, “I loved working with him!”, and similar.
*Judging by appearances, this context is usually a longer interview, which would likely have been more interesting to begin with. True, such interviews are rarely entertaining, but there is something to be learned and understood—and the point of a documentary is to give that opportunity. For entertainment, pick a good sit-com, action movie, or whatever the current mood calls for.
Note that it is very hard to say something sensible with a single sentence, even when that sentence is targeted to the purpose (not just torn out of context). Look e.g. at the immediately preceding sentence: It does a far better job than the aforementioned and is comparatively long, but it is still just a piece of the overall text. Consider how it would read without the context of the overall discussion or note how it would be just a claim without any support for its truth.*
*This is, admittedly, a situation that can be hard to avoid even in a full text, and not always something that I pay great attention to. A major complication is that what might seem self-evident to one person is not so to the next, e.g. because they have different levels of expertise or because their priorities differ. Another is that the overall text might give enough support when enough time is spent thinking, but that the reader will not necessarily be aware of this (or willing to put in the time, or smart enough), while the writer might be too stuck in his own context to realize that there are things that might better be spelled out. Even so, a larger block of text is almost always better than just a single statement.
Or take the reverse approach and contrast “He is the greatest!” with “He revolutionized the use of camera angles and I have never known a film-maker with such a drive for perfection. I was particularly impressed with his movie X, e.g. the scene where […]”: The former is just a sound-good/feel-good claim; the latter allows the viewer/reader/… to gain some insight, make his own verifications, find other sources of information, whatnot—e.g. through watching movie X with particular attention to the mentioned scene and/or camera angles.
The lack of context can even make claims misleading. For instance, “I loved working with him!” might have been exclaimed after an early successful collaboration, and does not necessarily reflect feelings after a later collaboration that led to a major falling-out—and knowing why the feelings were positive can have a massive impact on interpretation. (Did he make the experience pleasant or entertaining? Did he share insights into film-making that improved the speaker’s own abilities? Did he behave professionally when others might have exploded in anger? …) Similarly, “greatest” need not refer to the film-maker as a film-maker—it might have been as a friend, a person, a philanthropist, boxer, or even be referring to physical size. (Other examples can be more subtle, while being vulnerable to similar objections.)
Somewhat overlapping, these single sentences are almost always lacking in nuance and have a tendency towards the hyperbolic—I have yet to hear “X is on my top-ten list, slightly behind Bergman. I waver between him and Fellini for eighth place.”, which presumably is not cool enough for a modern documentary.
I am left with the impression that the documentary makers just try to pump out a certain number of minutes of screen time (never mind its value per minute) and resort to a method that allows even those devoid of skill to produce those minutes. Possibly, a secondary, highly populist, concern can play in: To allow mindless viewers to get some degree of entertainment, e.g. through rapid changes (never mind that the result is unusable for those with a brain or a genuine interest—those for whom documentaries should be made).
The problems are by no means limited to documentaries, however. For instance, there are many journalistic “articles” on the web, especially sport-centric ones, that consist of as much quoted Twitter material as own text… No analysis, no details, no information—just superficial impressions or sound-bite claims by others. My concerns are similar, as is my lack of enjoyment. I am certainly not informed by such crap.
Twitter, it self, is an obvious further example: If someone wants to inform the world that “I am going to the loo! Yaaay!”, Twitter might be an appropriate medium. For readers and writers looking for something with more substance, it is not a good choice.
Politics (especially Left/PC populist) and advertising are, unsurprisingly, other common sources of examples. Consider e.g. the utterly despicable pro-abortion argument (using the word loosely) “It is my body!”. Not only does this anti-intellectually reduce a very complicated* ethical question to a mere slogan, but this slogan is also extremely misleading—the main reason why this question is so tricky is exactly that it is not “my” (i.e. the current woman’s) body that is the main issue! The main issue is the body of the fetus, and involves sub-issues like when this body should be considered a human and when a disposable something else—which in turn involves medical, philosophical, and (for non-atheists) religious considerations. (Other issues not addressed by this slogan include whether and to what degree the interests of the father** and the grand-parents might need consideration and what the medical professionals consider compatible with their own conscience and religion. At the same time, it fails to use the single strongest pro-abortion argument, i.e. the medical risks resulting from illegal abortions.)
*So complicated that I have no clear opinion on the matter: I do not argue “pro-life” here—I argue against useless, illogical, intellectually dishonest, whatnot argumentation and cheap sloganeering.
**Including the negative direction, since he is usually forced to pay for the child when no abortion takes place.
Excursion on depth in general, especially regarding school:
As I have noticed again and again after leaving school*, what passes for education is often sufficiently superficial to be near useless—sometimes, even dangerous. The same can be said about much news reporting (even when the extreme cases above are discounted). Generally, society is filled with shallow information and a shallower understanding—while most people fail to understand that they know and understand very little. School amounts to twelve-or-so years of superficial orientation on most topics, where it would have been better to dig deeper into more select areas. History is possibly the topic in which this is the most obvious. Look at a typical school text on history, consider how many pages are spent on what topics, and then compare this amount of text with e.g. the Wikipedia article on the same topic—and compare the amount of depth, thought, analysis, whatnot. Reading the Wikipedia article once or twice and forgetting ninety-five percent of the data (but not insight!) will usually be more valuable than even memorizing the school text.
*Somewhat similar arguments apply to higher education too, but to a lesser degree and with more honesty: Achieving a diploma is somewhat comparable to getting a driver’s license—proof that someone is fit to enter traffic, but still trailing severely compared to what is expected ten or twenty years down the line. School, in comparison, often amounts to the knowledge that a car has four wheels and uses gasoline—superficial, border-line useless, and ignoring other numbers of wheels and other energy sources.
Excursion on “likes”:
It could be argued that “likes” takes this type of mis-communication to its purest extreme (or it could be argued to be another topic entirely). Consider e.g. the lack of a motivation why something was “liked”; the uncertainty of whether the text/video/whatnot was viewed as high-quality or whether it was the message that was approved; the typical inability to “dislike” something; the pressure some might feel to “like” as a form of payment (or the hope of getting “likes” in return, or the fear of disappointing a friend, …); etc.
Excursion on neglecting core groups:
Documentaries and entertainment (cf. above) are just a special case of a very disturbing tendency to neglect the core group (traditional target/raison d’être/whatnot audience), for which something should be made, in favor of the great masses—failing to realize the betrayal implied, the damage done, and that it might be more profitable to hold a large portion of a niche market than a thin sliver of a mass market. Another good example is museums that (at least in Germany) focus so much on populist entertainment that I, as a core-group member, rarely bother to visit one. A particular problem is the drive to include children, even when they gain nothing museum-specific from the visit, and even when their presence is a disturbance to other visitors. For instance, when I lived in Munich, I visited an automotive or vehicle museum—and was ultimately forced to avoid large areas of the museum, lest I flip out. Why? A large part of the museum was occupied by some type of for-children cinema and a slide—both in the immediate vicinity of parts of the exhibition, both entirely lacking sound barriers. To boot, there was an endless stream of children running around and shouting between them. I note both that neither appeared to have any educational value (even discounting the limited value achievable in young children*) and that any hoped-for gain in long-term interest in museums (if occurring at all) is outweighed by the adult visitors losing their interest. The true explanation is simply a wish to maximize the number of paying visitors—and education be damned. Other examples include sports events going for the ignorant masses, with imbecile commentators, idiotic camera angles, whatnot, and ignoring** those with an interest in and knowledge of the respective sport.
*It is a dangerous myth that children learn better than adults. Even when it comes to raw facts, it is highly disputable. When it comes to gaining an understanding, extrapolating, applying, it is horrendously wrong. See also e.g. [1].
**Note the difference between opening doors for the masses in addition to the core group vs. doing so through locking out the core group.
Follow-up: Revisiting verbosity based on Wesnoth
A minor unfairness in my recent text on Wesnoth was the use of an outdated version (1.12), instead of the latest (1.14).
I have since installed and briefly tried 1.14 through use of Debian’s “backports”. My experiments were cut short by the fact that it was unplayable on my computer for performance reasons. Specifically, the game used up an entire processor core even when doing nothing* and was slow as molasses—click somewhere and the reaction took place ten seconds later…**
*Including when being on the start screen and when having all animations deactivated.
**Note that this is a different type of delay than discussed in the original text: There we had artificial delays deliberately introduced by misguided developers; here we have delays as a side-effect of a too resource-consuming application.
If (as it likely was*) this performance delay was caused by changes to Wesnoth it self, it demonstrates a disastrous attitude so common in today’s software—the assumption that any amount of wasted performance is acceptable in order to gain minimal benefits. Notably, nothing that this game does should cause such performance delays. That the display part can be handled much more efficiently is proved by older versions of the game, the amount of calculation needed during user interaction is negligible, and the observed delays were by no means restricted to the AI’s/computer’s play**. And, no, neither 1.14 nor the older versions use graphics of an impressive kind—strictly 2D, a mostly static map (especially with animations off), no fancy textures, …***
*It is unlikely-but-conceivable that some version incompatibilities or other problems in/with/between the simultaneously backported libraries and/or the original setup led to problems that are not the fault of Wesnoth. (The backports are typically nowhere near as thoroughly tested and come with far fewer guarantees than the “regular” Debian packages.)
**If it had been, it might be explained by deep analysis, simulation, or whatnot preceding the AI’s decisions. However, considering how trivial and fast the AI had been in older incarnations, this would have been a surprising development.
***Which is good: There is no true benefit to be found from such features in a game of this type. Equally, chess is played just as well with a simple wooden board and wooden pieces as if diamonds and rubies were used—likely, better.
I did manage some minor comparisons with 1.12 before giving up, however, and must amend my original criticism slightly: The blend-in of text and movement-to-and-from-a-war-council in 1.12 were not as slow as I perceived them when writing the original text. (However, both are still entirely unnecessary delays. Also note that at least the latter is sped up by a factor of four in my settings, compared to the default.) Here we likely see an effect of different standards of comparison: During regular play, I am used to things happening very quickly; coming straight from experiences with 1.14, I saw the contrast to the truly slow.
(As for other differences between the versions, they appeared to be mostly optical. However, I did not manage any non-trivial game play and tested some of the features either not at all or only very superficially, so there might have been significant changes that I am not aware of.)
Excursion on performance:
It is true that processing power is quite abundant today, with even a low-end cell-phone comparing well to the first computers that I knew. However, it is still not a limitless resource; not all computers have top-end processors (graphics cards, whatnot), notably because even an outdated low-end notebook can handle almost any task with ease;* and there are still factors like environmental impact and battery life to consider when a heavy workload increases energy consumption. In some cases, even heat might be a factor to consider—what if a game is not playable on a hot summer’s day?
*For instance, my current notebook has a quadcore processor topping out at 2 GHz and only “on-board” graphics—and that is more than enough to watch even HD movies, play older versions of Wesnoth, listen to music, browse the Web, etc. Indeed, the average load on my processor is often below 1 %…
It is also noteworthy that some of this computing is a complete waste, e.g. because (as with 1.14) there is no benefit to it. Often, it is based on faulty assumptions about what the user wants.* Often, it is a pseudo-optimization; sometimes, it is even contra-productive.** It is particularly infuriating when an idle application (as with 1.14 above) runs the processor up—by rights, it should have a negligible processor load. Even as a professional software developer, I have problems understanding how they manage the opposite: Is it amateurish “busy waiting”? Is it polling multiple times per second for an event that takes place every few hours? Is it hidden malware that tries to spy on me? Is it an equally hidden bitcoin-mining operation? … (For the first two possibilities, see the sketch after the footnotes.)
*For instance, many applications leave background processes hanging around after they are (ostensibly) terminated. This is usually with the intent that the application should start faster the second, third, fourth, … time around—but what if this does not happen for several weeks or only after a re-boot? Worse, there are some applications (especially on Windows machines) that insist on starting such background processes before the first application start. What if the actual application is then never started?
**A good potential example of contra-productive “optimization” is the “save_index” file of Wesnoth: Interacting with the saves is usually faster after deleting it… (I have not studied its exact purpose, but based on the name, it is likely intended to speed up said interaction. I note that I have never experienced any negative side-effects of the deletion.) And, no, this is not a unique example: I first deleted this file because I knew that emptying a cache (or a similar measure) had often led to a speed-up in other applications and thought that I should at least give it a try. Indeed, while caches can have major benefits in the right situation, they are often more a hindrance than a help. For instance, it only rarely makes sense for an application to add its own file caching on top of the file caching done by the operating system and other mechanisms. (And the introduction of SSDs has created many situations where the value of any file caching is strongly reduced compared to the past.)
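To make the “busy waiting”/polling suspicion above more concrete (without any claim that this is what Wesnoth, or any other specific application, actually does), here is a minimal Python sketch of three ways in which an application can wait for a rare event; the event object and the function names are hypothetical:

import threading
import time

event = threading.Event()  # hypothetical signal for "something happened"

def busy_wait():
    # Amateurish "busy waiting": burns an entire processor core while
    # accomplishing nothing, until the event finally arrives.
    while not event.is_set():
        pass

def aggressive_polling():
    # Polling many times per second: better, but the processor is still
    # woken constantly for something that might occur once every few hours.
    while not event.is_set():
        time.sleep(0.01)

def blocking_wait():
    # The idle-friendly version: the thread sleeps until it is woken up,
    # with a negligible processor load in the meantime.
    event.wait()

An idle application that permanently occupies a core has, in some form or other, likely ended up with one of the first two approaches (or with some genuinely wasteful computation that it should not be doing at all).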