A few thoughts for a HAPPY New Year
As one year ends and the next begins, spending a few words on happiness might be a good idea.
Let me start with a detour over self-driving cars: When I first heard of realistic* plans for these, I was very pleased. Here we might finally have a good alternative to commuting by train, allowing me to read a book or watch a movie in peace and with plenty of space, while getting from A to B. At least in Germany, however, government thinking amounted to “you may have a self-driving car, but you must be constantly alert and ready to immediately take over driving, should a problem occur”. This sounds like a downgrade relative to just driving. Reading and movies are obviously out, as is beginning work in the car, and what is left is basically to stare out the window and listen to the radio—while actually driving would involve activity, continual decision making, and correspondingly more active brain stimulation.** With this approach, the driver-as-passenger of a self-driving car, on the one hand, is turned into a parcel relative to the car and, on the other, does not have the option to seek out activities that he finds more entertaining and/or developing—the worst of both worlds.
*As opposed to speculation and sci-fi; however, I note that these early plans seem to have underestimated or deliberately downplayed the difficulties, which might make “realistic seeming” a better formulation…
**I also have doubts as to how well this would work, as chances are that most drivers-as-passengers would soon lose their concentration and not be able to intervene sufficiently fast in case of problems. A regular driver is kept at least somewhat alert by the constant activity and decision making, even should it just involve a slight adjustment of position on the road, and is in a very different situation. (A good driver would also constantly evaluate the surroundings for potential dangers and whatnots, which would increase the difference further; however, I am uncertain to what degree most drivers actually do this—especially after the first ten minutes and especially while driving on well-known roads, as with a daily commute.)
This approach to self-driving cars shares much with modern life, where old activities, instances of decision making, whatnot have grown rarer or outright disappeared and found no true replacement—old needs are fulfilled in a convenient and allows-us-to-be-lazy manner, but there is too little new to fill the void left behind, unless we go looking for something new of our own.* In many ways, modern life is a journey in a self-driving car—nominally, a “sit down and relax”; in reality, something much less appealing. In some ways, modern life is worse, as the driver-as-passenger at least sets the destination, while the metaphorical destination of life is increasingly set by others, most notably the government.
*Those who are unaware of the void might go through life with an unhappiness that they do not understand; those who go looking might be vulnerable prey to those who promise a purpose by joining a Cause or Crusade; and even those who search in a more informed manner might see a lesser fulfillment, because the “new” is less basal (cf. below) than the “old”.
This the more so since the arrival of COVID and the ensuing excessive and harmful government meddling, which has left the citizens virtually powerless (or revealed their degree of powerlessness?); and it will be the more so if many (especially Leftist) politicians get their way: everyone is taken care of by the government, own decision making and self-determination disappear, etc. Remember “Watership Down” and that human-run warren?
Below, I will look at some of the factors involved, and I invite the reader to think on how these have changed over time and if/when/where (in Western countries) earlier generations might have had an advantage or disadvantage compared to us in terms of happiness:
- Happiness often results from the satisfaction of needs, and the effect appears to be larger the more basal* the need and the greater** the previous deprivation. For instance, someone who has never been truly*** hungry might be objectively better off than someone who has to skip meals, but the satisfaction from even a very plain meal for someone truly hungry can be greater than that of a meal in an expensive restaurant to someone who is not.
*In a classification like Maslow’s hierarchy of needs. With reservations for special cases: for instance, someone who falls asleep after being awake for three straight days might not, as he is asleep, be satisfied in the manner of a hungry eater—and might be too confused to think clearly, or at all, before falling asleep.
**Within reasonable limits. For instance, someone undergoing life-threatening starvation might not have the energy and whatnot to be happy when food is presented.
***What should be considered “truly hungry” can be a further matter of debate, as there is always a further step and even going a full day without eating can seem like nothing to the starving; however, the point is that most of what is called hunger today amounts to “I want to eat” rather than “I need to eat”.
The same principle applies to a wide range of other areas, from the life-sustaining to such trivialities as drinking coffee, e.g. enjoying a cup as a mere bonus while reading a book vs. drinking one after having spent an entire caffeine-less day on one’s feet, outdoors, and in cold weather. (I will stick mostly with eating, be it eating out of necessity or enjoyment, for the examples below, as eating is very easy to relate to.)
- Enjoyment of something often has a deliberate component, in that the enjoyment increases when we pause, cherish the moment, whatnot. The more we have, and the easier life is, the greater the risk that something enjoyable will not be enjoyed to the full degree. Compare e.g. eating a single piece of candy, knowing that this one piece is it for today, with wolfing down two dozen pieces. Similarly, compare the reactions of the girls in the “Little House on the Prairie” books to various gifts (candy or other) that would seem like nothing today. Similarly, note how “firsts” tend to be enjoyed much more deeply and much more often than “fiftieths”,* because the relative scarcity causes greater attention. Etc.
*E.g. a first book, CD, whatnot. This assuming an individual value/enjoyability/whatnot that is sufficiently comparable, as a poor “first” might certainly lose to a top-rate “fiftieth”.
(However, this item can clash with the previous one, as this type of deliberate enjoyment is less basal and/or urgent. It would, for instance, be unreasonable to expect someone who has not eaten in 24 hours to take a single piece of candy and savor it. The natural and appropriate reaction is to make up for the deficiency in nourishment as soon as possible; and the current item is more relevant to those whose more basal/urgent needs are already satisfied. More generally, I do not guarantee that any given item, let alone combination of items, will make sense in any given situation.)
- A degree of happiness or, at least, contentment seems to arise from continually and successfully solving or dealing with small problems, which is likely a reason for the popularity of computer games and for how easy it is to lose track of time while playing them. (Also note the discussion of self-driving cars above.) These problems, however, need not be of a “thinking” nature and can equally involve something more physical.* For instance, being physically tired and putting in a few more steps can bring a satisfaction of its own—and a distraction from the problems of life. (It is hard to dwell on this-and-that when that next step fills one’s mind, which makes this a good approach to “take one’s mind off something”.)
*In addition, there seem to be physiological benefits from physical activity on happiness, both in the short and the long term. My knowledge of this area is superficial, however, and I am more interested in the “psychological” components, except to the degree that these are explained by more physiological effects, e.g. in that something might be more mentally satisfying than something else because of some physiological effect.
The same applies to writing, and I suspect that one of the reasons for my unhappiness at around this time last year was that I had cut down on my writing too far; correspondingly, my very extensive writing during the last few months has made me much happier. (Notwithstanding a great amount of frustration and the like. Cf. a later item and note many past texts. If in doubt, writing makes it harder for that frustration to manifest.)
- The previous item is partially an example of intellectual* stimulation, which is of great importance in avoiding boredom—and need not consist of analyzing the works of great philosophers or solving math problems. On the contrary, for many, a random chat, even without any intellectual* aspirations, can fill the same function—as can the modern teenager’s fiddling with a smartphone. Looking at myself, I have rarely drawn much stimulation from small-talk and e.g., in my childhood, preferred to spend car rides with the family reading, often eliciting complaints that I spent too much time with my nose in a book.** The point is that some activities are stimulating to the one and boring to the other—and that it is hard to be happy when bored. As counterpoints, too much stimulation in one area can lead to a neglect of other areas, while too great a difference in what brings stimulation can make someone an unnecessary outsider.
*The word “intellectual” has more than one meaning, and here two overlapping-but-not-identical meanings are intended.
**It is interesting, however, that my reading was a minority behavior that seemed odd to others, including most fellow school-children, while the similar behavior relating to smartphones has become the norm today, in at least some age groups. (But what is done with a smartphone can vary considerably, while reading can vary in content and intensity but not in being reading.)
- Above, I speak of “successfully solving or dealing with small problems”. When the success is not there, things can go downhill fast. (The same applies to problems that are not small, but it is important to note that even small problems can have this effect.) Consider doing something that, in some sense, “should” work, seeing it fail, making a modification that really “should” make it work, seeing it fail again, making a modification that really, really, really should make it work—and seeing it fail a third time.* No matter how trivial the issue, provided that the time from first to third is sufficiently short, I tend to grow very annoyed and frustrated. To make matters worse, the level of annoyance and frustration can make a fourth or fifth attempt too sloppy, leading to further and unnecessary/self-caused errors and bringing me to the point of explosion.**
*To give a specific example is tricky, as most of my own experiences stem from computers and it is unlikely that any given reader will share a sufficiently common ground among the many uses of computers to make such an example sensible without considerable explanations, and as real-life examples might vary just as much from person to person. However, chances are that he will recognize many such situations from his own life.
**A natural human reaction seems to be to, literally or metaphorically, use more force when things do not work. In the stone age, this might have been a good approach, but not necessarily in the modern world—and certainly not when dealing with computers.
- More generally, a lack of own control can be extremely frustrating—especially, when it comes to things that we rightfully should control. Ditto when our attempts at control are thwarted by others. (Note my text series on choice, including on the illusion of choice, unfair government and choice, and overruled choice.)
Consider e.g. most government involvement in our lives, the horrors of customer “service”, and, in my case, the absurd amounts of renovations that have taken place in my apartment building during times when I have tried to study and write my books. A particular complication is that the government has often removed all realistic means to take own actions in various areas and has provided no adequate substitutes in return (cf. e.g. [1]).
Excursion on diminishing returns:
Diminishing returns is a recurring theme above, to the point that I almost included a separate item on the issue. The core idea, that the “marginal gain” in happiness through having more of something grows smaller the more one has, is sound, but it does clash a little with one of my main points, namely that the removal of a greater need makes for more happiness. Going strictly by diminishing returns, the one who has more is still happier than the one who has less, which is not necessarily the case once we consider who has the greater removal of a current need. A threshold level might also be argued, in that returns do not diminish in a consistent manner, but see an abrupt drop once a certain level has been reached, e.g. that more food brings comparatively much value as long as someone is hungry, but comparatively little once satiation is reached.
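To formalize the clash a little (a sketch of my own, with an arbitrarily chosen utility function, and not a claim taken from any particular literature): with a concave “happiness from quantity x” function, the one who has more is indeed always at least as happy, but the gain from one more unit is largest for the one who has least, and a threshold variant captures the abrupt drop at satiation. In LaTeX notation:

    % Smoothly diminishing returns: u is increasing, so more is (weakly)
    % better, but the marginal gain u'(x) shrinks as x grows.
    \[ u(x) = \log(1 + x), \qquad u'(x) = \frac{1}{1 + x} \]
    % The felt gain from one extra meal, u(x + 1) - u(x), is therefore
    % largest when x is small, i.e. for the one with the greater need.
    % Threshold variant: the marginal gain stays high up to a satiation
    % level s, then drops abruptly instead of tapering smoothly.
    \[ u'(x) \approx \begin{cases} c_{\mathrm{high}} & x < s \\ c_{\mathrm{low}} & x \geq s \end{cases},
       \qquad c_{\mathrm{high}} \gg c_{\mathrm{low}} \]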
Excursion on worry and other constant harms:
Worry is another factor that can cause unhappiness. Here we see a shift from the more personal and more urgent of the past to the more distant and less urgent of today. For instance, worrying over politicians wrecking the energy supply or whether one will be fired/be promoted/pass that exam/whatnot is different from worrying over whether there will be something to eat tomorrow. (Notwithstanding that the mixture of artificial and unnecessary COVID-countermeasures, artificial and unnecessary inflation, artificial and unnecessary energy-supply issues, etc., has made similar concerns hit a wider range of the population than just a few years ago.)
A potentially important difference between e.g. a removed need and worry is that the former gives a strong-but-short push towards happiness, while the latter gives a prolonged push towards unhappiness. Other such factors exist, e.g. many instances of fear, pain, and, of course, hunger. (And note that I do not recommend going on a starvation diet for several days to get the benefit of a single excellent meal, or, more generally, claim that more hardship is better. I do suspect, however, that we have long passed the optimum levels and are now doing increasingly worse by having it too easy.)
As an aside, worry is a paradoxical emotion as things either will work out, in which case the worry was unnecessary, or will not, in which case the worry just made things worse. Worry can have a legitimate place as a motivator, e.g. to ensure that stores for surviving the winter are built up during the warmer days of the year; however, this presupposes that we have sufficient control over our lives and the problems that we worry about.
Excursion on other factors:
There are a great many other factors that can contribute to/detract from happiness, and many that might be relevant in a “now vs. yore” comparison, including those that relate to human evolution vs. current life. (E.g. in that humans might be happier with a greater exposure to nature than most of us get today, with larger families, with less “screen time”, etc.) A particularly interesting possibility is that “ignorance is bliss”, that a better understanding of the world/whatnot makes it harder to be happy. Yet another issue is that having more comes with secondary costs/efforts/stress/whatnot, e.g. risk of theft and need for maintenance. The above focuses on cases that have figured in my recent thought.
Excursion on my own behavior:
This is an area where it is much easier to give advice than to follow it. I have repeatedly set a target for myself to act based on this-or-that observation (not necessarily from the above), e.g. to eat less but with more deliberate enjoyment, but the effort has rarely lasted for more than a day and/or been restricted to a day here and a day there. (I might call these New Year’s resolutions, except that they have never taken place at the right time.)
An interesting example is the trick of deliberately feeling happy: I have developed a knack for simply letting certain positive feelings sweep over me without a trigger event.* The downside is that this feels a bit like a chore, which makes me put it off, and doing so for more than a short time can be boring, even border on the tiring. In a manner of speaking, it is broccoli for the soul—good for me, but not something that happens as often as it could or should.
*I can no more explain how to do this than I can explain how to make my body move; however, I suspect that most readers have to some degree had similar experiences. A partial tip is to notice how parts of the body behave during various types of happiness, notably relating to smiles and orgasms, to try to force this behavior, and to engage in some experimentation with what movements can have what effect. However, much of it does not have an obvious physical aspect.
Evil and treating humans like NPCs
As I have written in the past, much of evil goes back to “a lack of concern for the rights and interest of others, or even the inability to understand that others do have rights and interests” ([1]; with similar contents in other texts), which I have seen as similar to, but distinct from, a claim by Terry Pratchett about evil arising from treating humans like things.
There is an interesting middle ground, which arguably covers both claims in a unified manner—evil arising from treating humans like NPCs (Non-Player Characters) in a computer game. These, within the game, can appear quite human-like; however, when push comes to shove, they have no rights, no intelligence,* no hopes and dreams, and their entire existence might amount to a few bytes of memory and a few lines of code leading to some movement on a screen. I have myself, in strategy games, sent my troops off to kill these by the thousands. (To which must be added a great many other killings in e.g. MUDs.)
*Currently, at least. The future might bring a change.
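To make the “few bytes of memory and a few lines of code” concrete, consider the following minimal Python sketch (my own and hypothetical, tied to no specific game) of how little an NPC can amount to:

    # A minimal, hypothetical NPC: a handful of attributes and a couple
    # of lines of behavior. Nothing here models rights, hopes, or dreams.
    from dataclasses import dataclass
    import random

    @dataclass
    class NPC:
        # The entire "existence": a few bytes of state.
        x: int = 0
        y: int = 0
        hit_points: int = 10

        def wander(self) -> None:
            # The entire "behavior": one random step on a grid.
            dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
            self.x += dx
            self.y += dy

    # "Killing by the thousands" amounts to discarding list entries:
    npcs = [NPC(x=i) for i in range(1000)]
    for npc in npcs:
        npc.wander()
    npcs.clear()  # no weighing of options, no ethical judgment involved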
However, I realize that real-life humans are a very different matter. I might loathe, detest, and despise a great many of my fellow humans (and for good reason), but I have a fundamental realization that they are humans—not NPCs. If I do something harmful to another human, I do it to another human. This something might still be justified, but it requires an actual ethical judgment, a weighing of options, considerations of what my and the counterpart’s rights are, etc.—not just a solipsistic shrug of the shoulders, as if humans were NPCs. On the contrary, there are a great, great many harmful things that I do not do to other humans, because they cannot be justified.
The problem is that most others seem to lack this exact realization and walk around as if they were players of a computer game and everyone* else an NPC. In this, NPCs might be the perfect metaphor for the problem of disregard for others and the many resulting evils.
*Unless, metaphorically, connected to them by headset, as might be the case with close friends and family members: they might realize that someone close is not an NPC, but they do not extend the same realization to the rest of humanity.
Excursion on NPCs proper vs. metaphorical/meme NPCs:
The above is not to be confused with the common use of “NPC” as a derogatory label for e.g. someone mindlessly repeating slogans that he has been told by someone else, useful idiots, and similar groups of low thinkers. While this label is in many ways fitting (the part of “no intelligence” seems particularly apt), it is of no relevance to the above.
However, in a twist, it was repeated recent mentions of such metaphorical NPCs that brought my mind to the topic, by making me reflect on the difference in use. My much longer exposure to literal NPCs, oddly, never brought this idea to my mind.
In another twist, there is a large overlap between such metaphorical NPCs and those of my fellow humans that I “loathe, detest, and despise”. Even so, metaphorical NPCs are not literal NPCs.
Profound vs. trite
An interesting* observation is that what is considered profound (insightful, whatnot) and what trite (trivial, whatnot) depends on factors like how often we have been exposed to this “what” and how far back the first exposure lies. The main break is, of course, between a first exposure and no exposure; however, both time passed and the number of exposures can play a role long after this main break. (On other dimensions, factors like correct–incorrect and depth of understanding can play a role, as with e.g. the superficialness of the trouser–skirt observation below. These factors are off-topic, however. Ditto own level of brains, cf. excursion.)
*With some reservations that should be immediately clear from this text.
A particular annoyance to me is when someone else presents something as if it were original and thought-worthy but leaves me with a feeling of “not this shit again”—and chances are that I have, without realizing it, been on the other side of that equation a number of times. For instance, during my school years, there were at least three occasions when a (different) teacher pulled up some quote about the dire state of the modern youth, and followed the quote by almost triumphantly declaring that it had actually been the words of some commentator from ancient Greece. Well, the first time around I did find it a revelation, and that some issues tend to be eternal is an important insight, but there was little or no benefit to the repetitions.* Similarly, earlier today, I had someone try the old “How do you pronounce ‘ghoti’? You will never guess!!!” riddle/demonstration.** I will never guess? Considering that I first (likely in the context of Shaw’s complaints) encountered this contorted spelling of “fish” no later than in my early twenties, more likely in my teens, and have encountered it on at least another dozen occasions since, I beg to differ.***
*The different teachers hardly set out with an intent to be repetitive, but the end effect is the same. As to “why”, I speculate that there was some type of source on pedagogy, maybe a college course, that had recommended this as a good way to make the students think, and that each of the teachers independently decided to try it, with, in the later cases, no awareness that someone else had already pulled the trick.
**This encounter was the trigger to finally move ahead with this text—and a good thing too: in another few years, it might have seemed too trite to bother with…
***While my interest in language and related matters is likely considerably above average, and I might be at a correspondingly greater risk for repeated exposures, I am not a native English speaker and it puzzles me that an adult native speaker would have failed even to reach one exposure.
Looking over time, I have had ideas that I considered “worthy” at one time but do not consider so today, because I have grown so used to them that the familiarity has bred contempt. (While the respective idea has only rarely changed in value: either it was an “unworthy” idea to begin with or it is still “worthy”.) Similar effects can be present when I encounter an old own idea in the writings of someone else, which might imply that the idea was less original than I thought.* In some cases, a seemingly original idea can turn out to be common knowledge within a certain field, even when it is obscure outside that field. (The main difference is then whether someone has or has not already encountered that field. At the extreme, I had long had ideas in the direction of evolutionary psychology before I became aware that the field of evolutionary psychology existed—likely, because this field is very un-PC, while my exposure to information was more strongly PC-ified in the days before the Internet grew dominant.) In some cases, what amounts to the same idea(s) can be developed independently in different fields.**
*However, I have repeatedly noted that it is very hard to come up with an idea that no-one else has ever had. A difference remains between ideas that are “had” by relatively larger and smaller portions of the population.
**As a specific example, I have encountered (at least apparently) independently developed ideas on queues and queueing in both math/computer science and business studies, with the latter trailing the former considerably. (This at the “textbook level”. I cannot speak for the research level, but as those bright enough to do strong mathematical thinking rarely do business research, whether at all or specifically in an academic setting, I suspect an even larger gap.)
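To give one standard example of the kind of idea involved (textbook facts, with no claim as to which field stated them first): Little’s law and the mean queue length of the M/M/1 queue, here in LaTeX notation.

    % Little's law: the average number in the system equals the arrival
    % rate times the average time spent in the system.
    \[ L = \lambda W \]
    % M/M/1 queue (Poisson arrivals at rate lambda, exponential service
    % at rate mu): with utilization rho = lambda/mu < 1, the average
    % number in the system is
    \[ L = \frac{\rho}{1 - \rho}, \qquad \rho = \frac{\lambda}{\mu} \]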
There have even been quite a few cases where I once had the intention to write something around a certain idea, but later realized that it would be too trite in light of the many others who already had written something on the same idea, lost interest due to the “familiarity breeds contempt” angle, or began to consider the idea “too obvious”. A good example is the display of neoteny in women, which seemed like an important insight to me, maybe, twenty years ago, but where I, today, am almost ashamed to admit that I once considered such a “Duh!” noteworthy. In other news, fire is hot.
The question of encounters with a certain field, or a specific problem,* can be of great importance in that even someone highly educated and highly intelligent might not yet have had any non-trivial contact with a great number of fields, simply because there are so many. Similarly, two highly educated and highly intelligent persons can go through a similar set of fields or problems in a different order, which can make what appears a profound insight to the one seem trite to the other—and vice versa.
*There might be a problem that someone could solve in five minutes or five days, but which he has simply never encountered and, therefore, can say little about. He might then trail someone who needed five hours or five weeks, respectively, but who also has the advantage of prior contact and who already has those five hours/weeks behind him.
An important side-issue is that those who are poor thinkers, poorly read, whatnot can be exposed to an important-seeming idea and be swept away with it, be it because they overestimate its importance or because they lack the perspective built from having seen different takes on the underlying issue. This likely explains much of the success of various religions, ideological movements, and similar. I have particular suspicions that the success of Feminism and “gender studies” goes back, to a large part, to the naivety of the victims, in that many naive young women are exposed, at a late stage and from an external source, to certain thoughts that others have had earlier and on their own. Now having a world-turning moment, the young women think that they have discovered something profound and become adherents—while they have, in fact, discovered something trite and should laugh it off.
For instance, I was no older than five,* maybe younger, when I confronted my parents about skirts and trousers, and why women/girls could choose which one to wear, while men/boys had to stick to trousers. This is an appropriate age to notice and address such obvious things—but what if someone goes for another ten years without doing so and is then fed some line about “Have you ever thought about skirts and trousers? See how the evil Patriarchy oppresses us poor womyn!”. Or consider the idea of a fixed two sexes and/or “genders”: By the time that I was twelve, maybe younger, I had already had repeated exposures to other ideas, including some (non-fiction) book mentioning fish that switched sexes during their life cycles and a sci-fi book where some alien species had three sexes—and this was in the 1980s, far away from the current gender-mania. Then there were various works featuring body swaps, “Dr. Jekyll and Mr. Hyde” scenarios, girls pretending to be boys and vice versa,** which left me well prepared for such ideas (although the former two more often involved same-sex scenarios). I remain unimpressed by the idea of someone being “non-binary”, but how might things look for someone with little brains, little prior reading, and few own thoughts, who is suddenly exposed to such ideas?
*My parents separated when I was five and we moved to another apartment in another town soon after, providing an upper limit.
**This, if I recall correctly, included a portion of one of the Tom Sawyer/Huckleberry Finn books, where the protagonist poses as a girl, and is found out when he catches something in his lap by putting his legs together, like a boy, where (it is claimed) a girl would have spread her legs and used her skirt to make the same catch. These books were written in the late 19th century, so this is hardly something new. Norse mythology, which I enjoyed greatly in my youth, includes several instances of similar situations, including Thor pretending to be Freya in order to get his hammer back and Loki actually managing to get himself pregnant.
As an aside, I find the common claims by various Leftist groups that their opponents are unenlightened and have missed these Big Revelations deeply offensive. More often, or much more often, the opponents are better informed, have gone through all the Big Revelations (but understood that they are not actually Big and/or noticed the distortions introduced by the Left), and, often, have once held similar ideas, which they later abandoned in light of additional evidence and critical thinking. This is particularly notable with Swedish Feminists, who like to claim that their opponents are opponents only because they are unenlightened and should listen to Feminist wisdom and superior insight with a becoming humility, while these opponents have, usually, been indoctrinated into a Feminist worldview* since they were children and have later broken free from it—as the Feminist worldview simply does not match the facts at hand.
*Including e.g. that domestic violence would be a one-sided men-on-women affair, that women only fail at this-or-that due to discrimination/Patriarchy/“structures”/whatnot, and that differences between the sexes would be limited to reproductive functions.
Excursion on memory:
Above, I gloss over the issue of memory, e.g. in that someone might be exposed to a certain idea once and forget it again, or twice and view both exposures as revelations, etc. This, in part, to keep things simple; in part, as truly worthwhile insights are less likely to be truly forgotten. (Yes, they might need a prompt to be remembered and the details might be lost in the fogs of time, but chances are that the big picture will be remembered given a prompt—and many will be remembered even without a prompt.) Exceptions certainly do exist, however, as with the nectarine phenomenon.
Excursion on own thinking and similar effects:
Another issue is how good we are at own thinking, at coming up with own ideas, and similar. Someone who has only few own ideas might think more highly of those that he does have than someone with more ideas; someone who has spent a day reaching a certain conclusion might value it more highly than someone who arrived at the same conclusion in five minutes; etc. A reason for my scepticism towards e.g. business studies and social sciences is that I, when reading such materials, often find myself raising issues like “this cannot be true—consider X”, “but what about special case Y?”, or “important observation Z immediately follows—why is there no mention of this?”, only to see the book address X, Y, and Z a few pages later—and in a manner that makes it clear that the author is now teaching the reader something that he could not expect to figure out on his own.*
*More generally, but decidedly off topic, I am often greatly annoyed by the extreme condescension, arrogance, and attitude of “my readers are idiots” shown by some authors. Those annoying “in this chapter you will learn” intros are just the tip of the iceberg. Consider formulations like “This seems very complicated, but never fear, it is not that hard to understand!”, which would be disputable in a book for first-graders but are still used for adults. Ditto questions like “What pattern do you notice?” and “How do you think that this made X feel?”.
Excursion on other areas:
The above deals mostly with ideas of an “understand the world/humanity/whatnot” type. The phenomenon is much more general, however. Among the many examples are works of fiction with a brilliant or hackneyed plot twist (depending on whether the viewer/reader/whatnot encountered the twist for the first or the tenth time and/or whether he managed to foresee it). Another interesting example is jokes, as with “When is a door not a door? When it is ajar!”. This joke is actually brilliantly funny, but it suffers from the many, many, many repetitions. Worse, chances are that most native English speakers first hear it at too low an age to truly appreciate the pun, while already being tired of the joke by the time that they are old enough. (I am uncertain when I heard it first, myself, but I suspect that I was in my mid teens, which might have made me twice as old as the typical “native” at that first exposure, or more.)
Losing someone is not something special / Follow-up: Lake Woebegone (sic!), where all the teens are more miserable than the others
In a somewhat similar phenomenon to [1], fiction often contains someone suffering a great personal loss, e.g. of a child or a spouse, followed by claims like “You do not know what it’s like!” directed at someone else. While such a loss is indisputably much worse than e.g. being a tall teenage girl (cf. [1]), these claims are often unfair, and the general surrounding grief is definitely far from unique. True, most have not lost a child or, if sufficiently young, a spouse, but very many have. Looking at my own extended family, I can give three examples of lost children—off the top of my head and limited to events that I actually know about: the highly premature deaths of a (would-have-been) great-uncle, a cousin, and a step-nephew, all of whom had at least one living parent at the time. (And if Sweden had participated in various wars, it might have been a lot worse.)
Spouses are trickier, as there is an age factor (these statements are often directed at someone comparatively young, e.g. a police officer or a friend of a similar age). However, in due time, somewhat close to half the population falls into this category (with the other close-to-half being dead), and highly premature deaths are not that rare here either.* This, of course, counting just deaths, not e.g. divorces and other non-lethal losses that can still be quite painful.**
*Looking, again, at my own extended family, one of my grandmothers became a widow at 57 or 58, an aunt would have become so in her forties, except for a preceding divorce, and I know that one of my great-grandfathers was married repeatedly, with a considerable risk that he was a young widower at some point, as divorces were much rarer in his days. Above 60, it is a different ballgame entirely; ditto, if we include divorces.
**Looking at the issue holistically, I suspect that many divorces are worse than deaths, e.g. when a believed-to-be loving wife suddenly takes off with the kids and the contents of the joint bank accounts, files fake charges of this-and-that, and is then rewarded/awarded with full custody, the car, the house, child support, and a hefty alimony by a court.
So, how sure can we be that the counterpart does not know what it is like? Typically, we cannot, and chances are that many of the speakers are too caught up in the moment to think clearly and fairly.
Of course, even among those who truly do not know, many or most can have at least some idea—just like someone who has broken an arm has some idea of what it is like to break a leg and vice versa. I, for instance, have never lost (or, at all, had) a child, so I will not presume to claim that I know what that is truly like—but I do know how it is to lose someone. (The list includes, but is by no means limited to, my mother and all my grandparents, and I am still only in my forties.) Looking at those adults who have lost someone truly significant, say, a child, a spouse, a parent, an unusually close grandparent/aunt/uncle/friend, chances are that they form the majority. (Throw a wider net, to include grandparents, aunts, uncles, friends, and whatnot in general, and it is almost everyone, but too wide a net would dilute the “some idea” too far—like comparing breaking a finger to breaking a leg.)
Then there are so many other personal misfortunes of other types that, again, are so much worse than just being a teenager—say, seeing one’s home burn down, landing in prison without having done anything wrong, losing a leg, or having a bout of cancer. (Also see excursion.)
At the end of the day, losing someone dear is much worse than being a tall teenage girl, but it is not necessarily that much more “special”.
Excursion on other cases and the threat of own death:
There might be other areas worthy of similar critique, but I will not dig into this, except to briefly discuss the pressure of own death, as presented in e.g.* “The Midnight Club”, where kids are not too tall to be happy, but too ill to live into adulthood, and spend their midnights telling each other stories.** Even this, I do with some hesitation, as I have never*** been in the position of knowing that I could consider myself lucky to live another year, and as I certainly have already long left my teenage years, implying that I have had an adult life, even should I drop dead five minutes after publishing this text; however, a somewhat similar point can be made: we will all die, sooner or later. From this fact, we can take two different interpretations, namely: (a) We all have a limited time to live and differ mainly in how much time. (b) Excepting those of us who suffer sudden and premature deaths, we will all eventually be in that same position of knowing that the hourglass is emptying and emptying fast.
*Characters presented with knowledge of near-unavoidable death are common in fiction, even outside the action genre, ranging from teens with terminal illnesses, to Walter White, to those very, very old. Deaths of the younger are more interesting here—in part, because they are more tragic; in part, because they tie in well with [1].
**Interesting idea, but poorly executed, notably with too many pointless mystic and mysterious elements à la “Lost”. A better series would have cut the bullshit and focused more on the stories and the character development. I stopped watching after episode 7 (going by Wikipedia’s episode descriptions) out of 10, when the most interesting character was killed off. The series appears to have been cancelled after those 10 episodes, which, going by the descriptions of the remaining three episodes, might have been a mercy.
***On at least two occasions, I have had a genuine fear for my life on a much shorter time-scale, but these were, obviously, false alarms. (Not counting dreams, where the number is far higher than two.)
Lake Woebegone (sic!), where all the teens are more miserable than the others
As much as I sympathize with the many who had a poor experience of life in high school, or school in general, I find annoying the self-obsession that sometimes follows it in real life, especially among girls, and almost invariably does so on TV.
Certainly, there are some who have it much worse than others, but most of those who complain about “poor me”* actually seem to be fairly average in their experiences. The teenage years can be hard, being in school does not help, and being surrounded by other teens is often a bad thing—but it is approximately the same for everyone. Unrequited love? Bad break-up? Fight with the best friend? A mean-girl clique? Feeling misunderstood? Studying hard and still getting a poor grade? Insecurity about physical attractiveness? Being nervous around the other sex? Being homosexual and insecure both about that and about being around the same sex? Having a close relative die? Parents divorcing? Being forced to move to a new school and starting over without the old friends? Not being made homecoming queen?** Chances are that most high-school students run into several of these and/or several others that simply did not occur to me off the top of my head.
*As opposed to “poor us”: There is a difference between complaining about general problems using personal experiences as examples (I often do, myself; note e.g. the locked-in-a-train situation in my previous text) and painting fairly normal problems as something much worse than what others encounter. Ditto between complaining about one’s own specific situation when it is unusually tough and when it is not. Ditto between just letting off some steam with a personal complaint and demanding the sympathies of the world.
**With variations depending on personal priorities, e.g. “Not being made valedictorian?” and “Not making the football team?”. (I stick with a more female perspective in the main example, as the girls, again, seem to be worse complainers on average.)
Nevertheless, we have many who complain about their specific personal situation relative to everyone else based on just these several items. Consider “Tall Girl”*: Unsurprisingly, this movie deals with a tall girl. She goes to high school and sees her complaints center around being tall and how everyone, supposedly, considers her a freak. Various adventures ensue, and towards the end of the movie she gets up on a pedestal** and gives a speech to the gathered students about how horrible it is to be a tall girl in high school—most of whom likely had problems of a similar or greater magnitude, but who were not given a pedestal by the filmmakers.*** Interestingly, she was not even that tall, being somewhere in the 1.80s, where she could still expect to, e.g., find plenty of taller men. Certainly, she did not seem to consider or take advantage of potential upsides of being tall, say, a chance to be the star of the local basketball team and maybe getting a college scholarship—while there is no or very little upside to most other teenage complaints, e.g., unrequited love. (Well, I suppose that unrequited love might help someone looking for a career in poetry, but …)
*A third rate movie that I caught when stuck in a hotel room with nothing else to do. With reservations for the exact name and various other details.
**Whether literally or just metaphorically, I do not remember.
***In the case of fiction, there is often a question of whether the fault results from a realistically portrayed character or from filmmakers pushing some angle. In the latter case, there is the additional question of whether the angle is more-or-less arbitrary or rooted in personal experiences.
Of course, this type of condescending and self-centered speech is not unique to “Tall Girl”. On the contrary, I have seen a number of variations over the years. The same attitude, without a speech, is very common. Consider “Buffy the Vampire Slayer” for a much more intelligent take and many interesting contrasts between those with superficial problems, like Cordelia and her clique, and those like Buffy (who spends her days in school and her evenings risking her life to fight “the vampires, the demons, and the forces of darkness” or something to that effect). A particularly interesting scene follows shortly after the death of Buffy’s mother: kid sister Dawn is at school, desolate and crying in the bathroom, and it is revealed that this was over some boy or something mean that some inconsequential mean-girl said—the news of her mother’s death has yet to arrive.
The trigger for writing this text, however, is “13 Reasons”, where I am currently midway through episode 4. The premise is that a high-school girl, Hannah, has killed herself and left behind a set of tapes with various recapitulations of how some other students gave her these “13 Reasons” to end her life. So far? Nothing truly impressive. Combined with her accusatory tone and streaks of pontification on tape, and her behavior during flashbacks on screen, she often seems to be over-sensitive, irrational, self-centered, and/or attention seeking. (To the series’ credit, something similar is also stated by one of the other characters.) Examples of her complaints include boys making a list of who-is-or-is-not attractive (apparently male sexism,* even though she did well), which led to one of her “friends” freaking out and blaming her for a breakup.** Well, that sucks, but it is not the end of the world and nowhere near suicide territory. It is certainly not, for instance, comparable to having a parent die or oneself losing a limb in a car crash—and I doubt that suicide is a typical reaction to these either.*** In all fairness, few face a “13”, in lieu of the “several” above, and within a comparatively short time, but, so far, I do not see suicide as an even remotely reasonable reaction—or the death rate of teenagers would be far higher than it is.
*In real life, over twelve years in school, I encountered exactly one such list—made by the girls.
**I am a little vague on the details, but it might be that Hannah’s “friend’s” boyfriend was the one who was complimentary, which led to unwarranted suspicions of an affair with Hannah. The loss of this “friend” is the worst damage seen so far, but, from the overall material, it seems to not have been a true friendship to begin with (hence my scare quotes). This does not lessen the pain in the moment, but it does reduce the practical damage.
***I acknowledge that those who commit suicide in real life do not necessarily have reasons that others would understand, but when an entire TV series is made on the topic, such reasons should be present. I note, in particular, that there have been no signs of pre-existing complications, say, a clinical depression, a severe substance-abuse problem, abusive parents, or a very prolonged state of unhappiness. Moreover, it is clear from the existence of the tapes that suicide was neither a spur-of-the-moment decision, nor a “number 13 was the last straw—I just cannot take it anymore”.
Of course, all this even going by Hannah’s versions of events, the truthfulness of which has been disputed by at least one other character.
I had great hopes after a promising first episode, but right now I am uncertain whether I will even watch episode 4 to its end—in part, because the promises of the first episode do not seem to be fulfilled; in part, because there have been repeated unrealistic “evil male” portrayals;* in part, because I fear that the series will end with some type of cop-out, e.g. a rape scenario, as there are strong signs that a group of boys is trying to keep something very bad quiet.
*Including, in this episode, a photographer who secretly takes photos of other students. The sheer amount of such portrayals in modern fiction is both tedious and annoying. To boot, they can feed into the very distorted view of men that many in modern society (already) have.
Excursion on mean girls and suppressed information:
When I hear about somewhat similar events in real life, say, that some girl did commit suicide or that some girl saw her reputation blown to pieces by incriminating photos, the formulations used typically amount to “poor girl” vs. “mean fellow students”. However, looking at the type of meanness presented, e.g. that the girl with incriminating photos is condemned as a slut or was excluded from her previous group of friends, it usually seems more like something that specifically the other girls would do. (In the second example, additionally, because girls usually have more girls than boys as friends.) Factor in what I have myself seen and heard in real life, most (at least, pre-woke) fictional portrayals,* and the known issue of the ethnicity of criminals being censored by media, and I strongly suspect another case of suppressed information, that the perpetrators are predominantly female. I also note a similar pattern of “society”, “media”, whatnot being blamed for “pressuring” women into this-and-that, where it is often clear that the pressure either stems from other women or from the individual woman, herself, e.g. because she believes that others have certain expectations. (Eating disorders are a recurring example.)
*Note e.g. parts of an older discussion of Carrie and the book itself.
Why the world is going to Hell
As the world appears to go to Hell faster and faster, and as problems that I complained about ten years ago appear to go from a Swedish phenomenon to a world-wide disaster, it might be time to reflect on the causes. How can it, e.g., be that group A brings factual arguments, reasoning, statistics, whatnot, that group B brings ad hominem and other unethical rhetorical tricks, sloganeering, pseudo-arguments and -reasoning that fall apart when prodded with a stick, etc., and that group B wins? How can the virtual astrologers defeat the astronomers? The virtual homeopaths defeat the “allopaths”?
A dominance in media might contribute, certainly. (But how did that dominance arise?) Ditto less stringent schooling. Ditto less exposure to history. Ditto less exposure to past thinkers. Ditto this and ditto that.
The core problem is something else, however, namely that most humans are very bad at thinking (or choose not to think, in the first place).
In particular, someone of “average” intellectual/cognitive abilities, IQ, g, whatnot, is deeply stupid.
I am sorry, but it really, really has to be said:
The average human is deeply stupid.
More than that, even humans a fair bit above average are usually far from ideal. For instance, my main tour at university was at a program* widely considered one of the most challenging in Sweden, loaded with math and physics—the type of program where the (in U.S. terms) average AP math A-scorer has problems keeping up. Even here, I saw plenty of students unable to follow not-too-complicated arguments or who preferred to ask for help instead of thinking for themselves—students who were not just less smart than I was, but who were depressingly far behind.
*Civilingenjör in “teknisk fysik” at the Royal Institute of Technology (KTH) in Stockholm.
In parallel, I took roughly two semesters of business classes at one of the hardest-to-get-into programs* in Sweden—where (again, in U.S. terms) a near perfect GPA and/or SAT score is needed to get in. My impression of most students was not one of awe (certainly not compared to the above), many were more leg-workers than head-workers, and the tests usually checked more for rote-learning and the ability to replicate from memory than for understanding and the ability to apply knowledge independently. Still, this might have been the second-brainiest “peer group” that I ever had.
*Civilekonom at the Stockholm School of Economics. I interrupted my studies when I moved to Germany as an exchange student within my main program. (I deliberately leave out my studies in Germany from the discussion, as issues like my own language deficits made it hard to judge the level of the students.)
Go back to “year nine” of school, where brain development was at an almost adult level and I last interacted with students who had not been strongly pre-filtered* for intelligence—and most were deeply stupid. So stupid, in fact, that I consider it a joke that they, a few years later, gained the right to vote by dint of turning eighteen.
*Years ten through twelve, the Swedish “gymnasiet”, are voluntary, with many of the dumbest dropping out, and with a self-filtering into different programs, some academic, some vocational. I went to the usually-considered-hardest academic program (natural science). Compared to year nine, almost everyone in my new class would have rated in the upper half or better.
Before I switched to writing, I worked in different software positions for over twenty years. Most of my colleagues have likely had Master-level STEM degrees; if not, most have definitely had at least Bachelor-level STEM degrees. Very few have been smart enough to make good software developers; about half were so dumb* that they should have been kept away from the profession entirely. Looking at other departments (e.g. HR, product management, project management), the standard has been far lower, even though most of these have had some type of university-level qualification, often undergraduate degrees in some business or administration topic.
*But, to avoid misunderstandings, a clear majority of these were still above the population average.
Looking at other people that I have interacted with over the years, including roughly half of my pre-college teachers*, most-or-almost-all civil servants, most-or-almost-all customer-service workers, and most-or-almost-all social contacts (outside work), there is a clear dominance of “deeply stupid” and “has no business voting” (among those that I have seen enough of to form an impression). Then we have my impressions of most journalists, many elected politicians, whatnot—just depressing.
*I left for university in 1994. In my impression, the quality has dropped even further since then.
The simple truth is that most activities that humans engage in, even most post-school activities that most have ever encountered, require very little “higher” intelligence—but that tasks like software development, politics, and voting do.* Holding a conversation, e.g., requires comparatively little, because humans come with a tremendous amount of built-in “circuitry” for conversation and what is not built-in can be trained simply through talking a few hours a day. Children and people with an IQ of 80 can do it—as long as the topics include the weather, who has a crush on whom, and what team won the game last Saturday. Performing simple routine tasks after a bit of instruction is not that intellectually straining. Etc.
*To do well, that is: Just getting a position as a software developer is far, far easier than becoming a good software developer, some complete idiots have managed to be elected, and the right to vote is usually handed out in a blanket manner to those who turn eighteen.
But: let the intellectual demands increase and most fall off the map fairly rapidly. Disturbingly many have problems with such elementary concepts as fractions, even when these are explained. Fewer yet could be told the concept and come up, on their own, with simple arithmetic laws for fractions. Most of the population appears unable to learn non-trivial matters from books. Think critically, see through a flawed argument, make abstractions, understand cause and consequence, create new knowledge, understand a math proof, …? Now we are down to a small minority.
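For concreteness, the “arithmetic laws for fractions” that I have in mind are nothing more exotic than the standard rules, here in LaTeX notation:

    \[ \frac{a}{b} + \frac{c}{d} = \frac{ad + bc}{bd}, \qquad
       \frac{a}{b} \cdot \frac{c}{d} = \frac{ac}{bd}, \qquad
       \frac{a}{b} \div \frac{c}{d} = \frac{ad}{bc} \]
    % (with b and d nonzero throughout, and c nonzero for the division)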
One way or another, almost all modern problems boil down to human stupidity and irrationality. For instance, is it really reasonable to allow a say in politics to someone who wrecks a child’s math score for illustrating “3 x 8” by adding 3 eight times instead of adding 8 three times?* Someone who does not understand that causality and correlation are different things? Someone who believes that if X implies Y then not-X necessarily implies not-Y? Someone who fails to understand that a difference in incentives can alter human behavior? Someone who hears “First they came …” and fails to see how it could apply to any other group than the Nazis (or, on the outside, other members of the pseudo-category “Right”)? Someone who cannot understand the point of the previous questions without examples?
*A real example that I encountered on the Internet a few months ago (with reservations for the exact details). Even posing the question is disputable, as it does little to test the child. Picking the one over the other is idiotic, on this level, because both points of view are arithmetically equivalent, and a significant difference will only be relevant when we start to think about math in terms of operators—which is not productive for small children and somewhat arbitrary in general.
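For the record, the arithmetical equivalence and the implication error from the list of questions above amount to no more than the following (standard facts, here in LaTeX notation):

    % Commutativity: both readings of "3 x 8" give the same number.
    \[ 3 \times 8 = \underbrace{8 + 8 + 8}_{3 \text{ terms}}
                  = \underbrace{3 + 3 + \dots + 3}_{8 \text{ terms}}
                  = 8 \times 3 = 24 \]
    % Denying the antecedent: an implication licenses only its
    % contrapositive, not its inverse.
    \[ (X \Rightarrow Y) \;\equiv\; (\neg Y \Rightarrow \neg X),
       \qquad
       (X \Rightarrow Y) \;\not\equiv\; (\neg X \Rightarrow \neg Y) \]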
Also see e.g. [1] for previous discussions.
Excursion on IQ, etc.:
A more extensive and slightly quantified attempt to classify IQ and capabilities is found in a text by James Thompson. Comparing his speculation with my personal experiences and observations, I believe that he errs on the side of optimism in some cases. I suspect that at least some readers will be tempted to use flawed arguments like “Michael speaks of IQ, IQ is this-and-that; ergo, everything above is nonsense”. To this I add that the value of IQ as a measure is well established, contrary to PC propaganda, and that none of the above requires IQ to be valid. (Neither does it require e.g. that IQ is heritable.)
Excursion on my second Master:
I earned a second Master in Germany, a few years after my original studies. As this was a distance program, my interactions with other students were too limited for me to form an impression, but I have written very unfavorably about the quality of the university in the past ([2]).
Excursion on “Civil”-degrees:
Swedish degrees that start with “Civil” are usually broadly equivalent to a U.S. Bachelor immediately followed by a U.S. Master, thesis included. I compare the progress of my own studies with that of a U.S. J.D. in an older text ([3]), which might be a useful illustration.
The complication of the untested evil-doer
One of my most important claims is that “evil is as evil does”, that we should measure others by their actions, not their opinions. (This with several variations, e.g. “fascist is as fascist does” and, in a wider context, the need to judge by actions and not claims, including e.g. with many businesses, politicians, and co-workers whose actions are contrary to their claims.)
A potential weakness of this attitude is that two equally good/evil/stupid/whatnot individuals can differ in their actions through different opportunities and abilities, different levels of material need, different exposure to provocations, and similar. What if the one has proved himself X and the other is currently untested? What if the one has hung over a volcano and the other has not?
A consequence of this weakness is the need to make comparisons only between people who have gone through sufficiently similar situations. For instance, comparing the character of Stalin with that of the office cleaning lady will be futile in almost all cases, because their lives have been too different: with the tables turned, the cleaning lady might have been a genocidal dictator, while Stalin might have been a conscientious cleaner.* On statistical grounds, we might still consider it unlikely that the cleaning lady would prove herself such a monster (especially when she has a good record within her prior opportunities and whatnots), but we cannot rule it out.
*We further have to distinguish between two variations: Firstly, my main focus in this text, a prior life that has simply revealed too little of someone’s character, e.g. in that the someone who would or would not commit mass-murder given the chance has never had the chance. Secondly, more tangentially, a prior life that has changed the character, e.g. in that a series of traumatic experiences created an unreasonable hatred.
Similarly, we cannot conclude that someone who has not done X would be incapable of X merely because he is currently innocent of X, nor consider someone harmless merely because of a lack of prior evil actions, etc.
However, and this is where the Left so often goes wrong, we equally cannot and must* not condemn someone because of something that he might or might not have done in an alternate reality, or might or might not do in the future—especially, at the word of his opponents. This with an eye on at least the following:
*Regrettably, this seems to often be a deliberate Leftist strategy, not just a lack of understanding of the problems involved, including variations like guilt-by-association, use of straw-men, and severe distortions of opinion.
- The presumption of innocence and its underlying principles. If in doubt, everyone is vulnerable to such accusations, including members of the Left. Indeed, guilt-by-association could easily be used against the Left too. If every nationalist (who is almost invariably and ipso facto condemned as Nazi, fascist, racist, supremacist) is just waiting for an excuse to invade Poland and kill Jews, then imagine what can be said about large parts of the Left, in light of Stalin, Mao, Pol Pot, and co.
A reasonable world requires that we are condemned only for what we have done—not for what our opponents claim that we would do when we finally have the opportunity. Some extrapolation might be justified for a changing situation, but only to the degree that it is made likely by prior actions and statements of one’s own, e.g. in that someone with a record of political violence is deemed unsuitable for political office by voters, who might then legitimately fear an abuse of the office to continue such practices on a larger scale. (Ditto e.g. someone with a record of embezzlement or great incompetence.)
- Given sufficiently extreme circumstances, most people can be brought to out-of-character, out-of-norm, or even outright horrendous acts, implying that the mere potential vulnerability in some set of circumstances is not enough—the circumstances have to be sufficiently applicable and likely. Moreover, because the extremity of circumstances, if any, that brings a given individual over the edge is not knowable before those circumstances have actually occurred, it might be better to avoid speculation altogether and (again) apply the presumption of innocence.
For instance, someone strongly opposed to stealing might not be able to resist the temptation when it is a matter of life-or-death, especially when a spouse or child is concerned.* Even murder might be a possibility for many or most: imagine someone pointing a gun at your children and threatening to shoot them, unless you press a button to cause a sweet old lady to be electrocuted—what would you do? (And: if you do press that button, would that make you more or less suitable for a political office than if you did not? Than if you never had been forced to make the choice?)
*In some circumstances, this might even be legally allowed; however, there is no blanket exemption.
- Most people would, if at all, only become genocidal dictators or whatnot under truly extreme circumstances. For instance, the vast majority of even political extremists (let alone the overall population) has never killed a single human. How, then, can we assume that a specific individual would do this-or-that without backing evidence well beyond “he is a political extremist”—let alone the much vaguer (and much more common) “I don’t like his political positions”?*
*An added complication is that many on the Left seem to consider someone “not sufficiently Left” as “extremist”, regardless of other facts, and/or consider any “Right-wing” position as “extremist”.
Even someone who does have a strong predilection will not necessarily act upon that predilection. For instance, there might be members of the extreme Left who would, in principle, like to start a revolution, but who deliberately abstain from any such attempt for various reasons, e.g. respect for the law, fear of setting a precedent for other movements, or an unwillingness to cause the inevitable bloodshed.
Assuming that someone with a certain predilection will act on this predilection is similar to assuming that someone who likes to eat will be obese: some are, most are not.
(Much of the above amounts to seeing a test correctly: If someone has been tested and passed, we can make a positive statement; if someone has been tested and failed, we can make a negative statement; but it is a fallacy to assume that being untested is the same thing as having failed (or passed) the test.)
Untested extrapolation and human nature
In the Firefly universe, Shan Yu is claimed to have said:
Live with a man 40 years. Share his house, his meals. Speak on every subject. Then tie him up, and hold him over the volcano’s edge. And on that day, you will finally meet the man.
The ramblings of fictional dictators are rarely a great source of wisdom, but this one points to one of the most important life lessons we can ever learn:
The true nature of someone or something is often only revealed in the right circumstances—and what the careless observer takes for the truth is often incomplete, occasionally entirely wrong. Before we have seen this someone or something in a sufficiently similar situation, we can often only speculate about the true nature (or the relevant aspect of the overall nature).
This has a very wide applicability, including scientific phenomena,* doors,** businesses,*** …—and, most notably, humans.
*Countless examples exist on many different levels. A high-level example is the contrast between classical physics and quantum mechanics or the theory of relativity.
**Is that solid-looking house-door really an obstacle to a burglar?
***The final impulse to get this particular text done was a reader email concerning Clevvermail, which serves as a great example of how businesses only show their true level of (in-)competence, customer (un-)friendliness, whatnot, when something goes wrong or an unusual situation occurs. See excursion below.
Consider a small selection of the many conceivable examples involving humans:
- Shan Yu’s example: While I do not agree that we would meet “the man”,* chances are that we would indeed learn something new about the victim—possibly even something that radically changes our view. Take someone whom you truly believe you know** and imagine him (or her) in that situation—can you now truly predict how he will react? Will there be tears? Threats? Promises? Negotiation? Cold fear or hysterical panic? …
*Rather, it would show us yet another aspect of the man.
**A spouse, sibling, close friend, … I will mostly use “friend” below, but the examples easily generalize.
Turn the tables: Can you predict how you* would react in this situation?
*A similar table-turning is implied in other examples; however, it is mostly not spelled out, in order to avoid redundancies. (Obviously, the chances that we will have a good idea about ourselves are far greater than for others—often because we have already been tested/tempted/whatnot in a similar manner.)
I would not dare to make the prediction even for myself.
- Vice versa, can you predict the circumstances* in which you or your friend would hold someone else over a volcano? Actually drop him?
*Chances are that they exist, no matter how despicable the act might seem. “Do it—or we kill your family!” would likely do the trick in most cases. Now find the borders. (Or haggle.)
- Is that friend a true friend, someone who can be counted on for help, or just someone you enjoy spending time with? Would he risk his life to save you from a volcano?
- Given the choice, would a certain friend prefer to help you or to obey the law? How would he generally prioritize conflicting obligations, loyalties, whatnot?
- Is that confident sounding colleague actually more competent than the rest or just more confident? Is the better-dressed colleague actually more competent or just better-dressed? Is the higher-up actually more competent or just higher up?
In my experience, these situations are roughly 50–50, and judgment should be based on actual performance.
- Would your children ever lie* to you, cheat on a test, do drugs, …?
*Arguably a bad example—if they are old enough to communicate, they almost certainly do…
A particularly interesting issue is the difference between what we want to do (would do in theory, consider the right thing to do, would do if we had the ability, …) and what we actually do. Will someone who rejects theft be able to stick to his principles when faced with a risk-free opportunity to steal ten million dollars? When stealing a loaf of bread makes the difference between eating for the first time in two days and not eating? Who can tell, when someone has not yet been tested… The difference is often even a physiological issue, e.g. in that another repetition of an exercise becomes so painful that even a strong will falters—or that a truly iron will is eventually foiled by a physical inability to complete a repetition. Then again, lack of trying can also leave us underestimating our abilities. For instance, I have several times gone for a walk, felt so lacking in energy after just a few minutes that I considered going back (“must be a cold waiting to erupt”)—and ended up walking for two hours, feeling more energetic at the end than at the beginning. Intellectual activities are the same: Sometimes sleepiness, a headache, or similar prevents me from keeping myself focused for even five minutes on something that I really want to do—and sometimes I can go into a near trance-like state where I spend hours, with only minimal interruptions, doing something that I would normally look for excuses to avoid.
Excursion on Clevvermail:
Had a single thing been different in my experiences with Clevvermail, chances are that I would never have written the linked-to text—assume that my credit card had not been arbitrarily rejected (due to some technical, non-credit, issue):
This takes care of the first item (see the linked-to text) outright. It also takes care of most of the fourth item, because it directly or indirectly removes most of my interactions with customer service. It turns the last item from a major issue into a mere inconvenience, with no unwarranted suspensions, and no reason for me to terminate my account effective immediately due to a gross breach of contract. I might or might not have terminated it even so, but if I had, it would have been in a more regular manner, with no opportunity for Clevvermail to later harass me with an unjustified claim. (Be it because a deliberate fraud now lacked even a pseudo-justification or, assuming mere incompetence, because the restrictions concerning online account-termination no longer applied, and the account would have been terminated online.) Indeed, I might even have come off with an impression no worse than “has a poor UI” and “abuses my email address for spam”—both of which are bad, but not necessarily signs of anything unusual. (Some degree of email abuse is fairly common, even among businesses normally considered reputable. This does not make it acceptable, but at least less remarkable and with lesser implications.)
In reality, however, my card was rejected—and Clevvermail proceeded to reveal much more about itself to me than to the average customer. (Or so I hope—for their sake…)
Excursion on (fake?) friendliness and “service experience”:
An interesting special case is formed by the very many who are unable to see the difference between friendly, often fake-friendly, service and good service. Smiles, greetings, a few jokes shared, whatnot, are all positive—but they are merely a bonus. The main point must be that the customer gets what he paid for, without any shortage, hidden costs, unplanned efforts of his own, …
Unfortunately, many incompetent or, worse, dishonest people bank on uncritical customers confusing a smile with good service, or hope that a mere smile will spare them from having to make up for an error and the associated costs/efforts imposed on the customer. Notably, the less competent (or more dishonest) someone is, the greater the past opportunity to practice fake friendliness…
To boot, there can be many border-line cases where someone is friendly just as a bonus, without manipulative motivations, and the uncritical customer is to blame for focusing on the wrong criterion. For instance, old ladies seem to judge how “good” a physician is more by friendliness than by demonstrated competence. This does not automatically make the physician incompetent; however, his reputation is still misleading.
As an aside, the fact that I am not blinded by friendliness has repeatedly led me to view people, including several past colleagues, in a radically different manner than the majority did—be it because I have judged them by their incompetence, not friendliness, or because I have had a greater ability to see through their surface than most others.
On my inactivity and human stupidity
Even after returning to the Internet almost a year-and-a-half ago, I have published (or written, for that matter) very little. There are several reasons for this, including that I have decided to cut down on my “extracurriculars” in favour of more post-work relaxation (and have benefited from doing so), and that I have grown more and more critical as to what I consider a text worthy of publishing and a thought worthy of writing up in the first place—to the point that I must force myself to artificially lower my criteria, lest I remain silent.
The greatest reason, however, is something very different: Sheer frustration with the stupidity of most humans, with the way those more in need of feedback are correspondingly less responsive to it, and with how many of the greatest ignorants are sure of their own (imagined) knowledge and understanding. (Including the important special cases of incorrectly believing that knowledge or experience automatically implies understanding, failing to realize that understanding is almost always the most important of the three, and entirely overlooking that none of them is worth much without actual thought.) My activities in the Blogosphere have been particularly unrewarding and frustrating, and it has been a long time since I had a non-trivial activity there.
It is no coincidence that there are many sayings or quotes expressing the principle that the fool is cock-sure and the wise man doubts—nor that the Dunning–Kruger principle has gained fame among those who do think. (Executive summary of Dunning–Kruger: Ability at A goes hand in hand with the meta-ability to judge ability at A.) Indeed, one of the few things that give me some amount of personal pride is simply that I belong to the small minority of people actually willing to actively challenge their own opinions and modify them as time goes by.
The examples of this are very common and the effects extremely demotivating to me. It is proverbially better to light a candle than to curse the darkness (and I have long tried to live by this claim), but there simply comes a point where it is hard to keep it up—especially since many ignorants are not only impervious to candle light, but actively put out candles lit by others. Those who are familiar with my writings will know that I have written a lot about censorship—and the sad truth is that there are many blogs (notably feminist ones) that simply censor comments expressing a dissenting view. This includes even polite comments using factual arguments, links to statistics, pointers to logical errors, … Indeed, the comments most likely to convince a third party are often the ones preferentially censored…

Specifically in the realm of political correctness (in general and to some degree) and feminism (in particular and to a high degree), there appears to be no willingness to actually look for the truth. Instead, pre-formed claims are pushed with great insistence, even when no more justified than e.g. the claims of a creationist: Both kinds live in their own special world where some things just have to be true, because otherwise they would find themselves in another world or have to face possibilities that they cannot cope with. Scientific proof, logical arguments, whatnot, are all secondary: The truth that these point to is abhorred, and therefore they must, ipso facto, be faulty. It is inconceivable that God did not create the world; it is inconceivable that differences in outcome could have any other explanation than differences in opportunity. Anyone claiming otherwise is uninformed and should let himself be enlightened—or is an evil liar deliberately trying to ruin the game, a heretic, a sexist, … Meanwhile, those wishing to “enlighten” the dissenters typically give ample proof of their own ignorance, undeveloped ability to understand arguments, and lacking prowess at critical thinking.

A particular annoyance is the constantly recurring claim that those who criticize feminism (more specifically gender-feminism and feminist populism) are ignorants who must be exposed to the truth—when most critics (at least in Sweden) actually grew up under feminist indoctrination, long took feminist claims to be true, and only over time developed a more nuanced world view, by means of critical thinking, exposure to more scientific information, personal experience contrary to the feminist world-view, and so on: If the feminist claims about e.g. rape statistics, domestic violence, earning capacity, discrimination against women, …, were true, then almost everyone would be a feminist—but I have over time learned that these claims for the most part are invalid. (For reasons that vary from case to case, but often include the hiding of vital details that radically change the interpretation of data, misreporting of data, use of unsound methodology and non-standard definitions, statistics extrapolated to different areas or times without verification of relevance, and even statistics simply made up.)
These problems, however, are by no means limited to the Blogosphere, nor to the politically correct or any other ideology or religion. No, stupidity, irrationality, incompetence, and so on, permeate the world and all its aspects, the main question often being whether a certain phenomenon is explained directly or just indirectly by such factors: Is the advertising industry filled with idiots or does it merely try to convince idiots? (I suspect that it is a bit of both: People of highly disputable competence and judgment trying to preferentially convince the most stupid, irrational, and uninformed consumers.)
Even in software development, stereotypically associated with the gifted and the border-line autistic, there are few who have the competence level they should have and many who have a good standing through social relationships and despite their lack of skill. About five in ten of the colleagues that I have worked with have been so poor that I would simply not have considered them an option, had I been setting up a new team. No more than one in ten is someone I would give a blanket “yes”. Another one in ten may be a border-line case, picked or rejected depending on the available alternatives. The remaining three might do if nothing else is available and sufficient mentoring and reviewing could be guaranteed. Even those worthy of a “yes” typically lack the competence they should have, for the simple reason that they have the competence level of a worthy developer—but typically work as lead developers. Notably, most of them have very limited understanding of their own, instead basing their decisions on rules, recommendations, or things that they have read somewhere, without giving sufficient thought to e.g. why a recommendation is made and when it does not apply because its underlying cause is irrelevant. For instance, the lead developer of a team that I was assisting a while ago was highly surprised by the suggestion of replacing an ugly set of conditionals with a look-up in a map (see the sketch below)—apparently, he was unaware of this obvious and well-established technique that even a junior should (but rarely does) know. Going outside the “yes” developers and the border-line cases, things deteriorate very rapidly. The average developer has no feeling whatsoever for what makes code good or poor, does not use the benefits of polymorphism over if-statements, uses copy-and-paste where he should write a new method or class to abstract the shared functionality, writes test cases that are next to useless through checking the implementation instead of the interface, …
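To illustrate the map-lookup technique (with a deliberately simplified, hypothetical example of my own; the code in the actual project was, of course, something different):

    import java.util.Map;
    import java.util.function.UnaryOperator;

    public class DiscountExample {

        // The "ugly set of conditionals" version:
        static double discountWithIfs(String customerType, double price) {
            if ("GOLD".equals(customerType)) {
                return price * 0.80;
            } else if ("SILVER".equals(customerType)) {
                return price * 0.90;
            } else if ("BRONZE".equals(customerType)) {
                return price * 0.95;
            }
            return price;
        }

        // The same logic as a look-up in a map: adding a customer type is a
        // one-line change, and the table could even be loaded from configuration.
        static final Map<String, UnaryOperator<Double>> DISCOUNTS = Map.of(
                "GOLD",   price -> price * 0.80,
                "SILVER", price -> price * 0.90,
                "BRONZE", price -> price * 0.95);

        static double discountWithMap(String customerType, double price) {
            return DISCOUNTS.getOrDefault(customerType, p -> p).apply(price);
        }

        public static void main(String[] args) {
            System.out.println(discountWithIfs("GOLD", 100.0));  // 80.0
            System.out.println(discountWithMap("GOLD", 100.0));  // 80.0
        }
    }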
It is the same with other professions—if anything, software developers still do better than most other groups. Looking at most business graduates I have dealt with, I marvel that they actually did graduate… Most are lacking in knowledge, almost all are devoid of understanding, and areas such as critical thinking are uncharted territory. Large egos and great efforts to create an appearance of competence are all the more common.
A particularly frustrating problem: The few of us who actually do strive for understanding often see problems, opportunities, solutions, …, that others do not. However, because the ignorants are in the majority, the minority is considered lacking… (E.g. through being seen as obsessing over unimportant details when these particular details actually are important, or as being wrong in a dispute for lacking some insight of the majority—when the reason for the disagreement is that the minority has this insight and several more that the majority is lacking…) A project I worked on last year had me crawling up the walls in frustration for this reason (in several areas, but mainly with regard to Scrum):
I had spent some considerable time deepening my knowledge and understanding of Scrum and was actually enthusiastic (rarely happens with me…) about testing this and that, in particular seeing what gains might be possible through systematic “inspect and adapt”. My efforts were almost entirely blocked by a team that had no understanding of Scrum but merely followed a certain formulaic approach, leaving inspect and adapt (the very core of Scrum) entirely by the wayside. This regrettably extended to both of the Scrum Masters that the project saw: The first had masterfully conned large parts of the company into believing that she was a true expert, making anything she said an ipse dixit during any discussion. In reality, she was a disaster in her role, not merely through failing to understand inspect and adapt, but also through failing Scrum in several critical regards, notably including trying to prescribe what the developers should do and how they should do it (and not limited to Scrum at that). The second had no previous Scrum background, but went through a crash course consisting of two weeks of shadowing number one, combined with some informal tutoring of the blind-leading-the-blind kind. Discussions with her were even less productive, with an even more limited intellect and the one implicit argument of “number one said so, and number one is the expert”. No: Sorry, the only one in the project with any claim whatsoever to being a Scrum expert was yours truly—I was the only one who had bothered to go beyond superficial knowledge and actually gain an understanding of the principles and ideas, as well as the only one who seemed to actually evaluate how well or poorly something worked.
Many examples of how stupidity rules the world can be found in the UIs of modern software, with explanations coming to a high degree from the made-for-idiots camp, but also, if to a lesser degree, from the made-by-idiots camp (e.g. through not understanding the benefits of separation of concerns, not having knowledge of alternate paradigms, or undue prejudice against e.g. command lines). Take web browsers: For a considerable part of the post-2000 period, I was a dedicated Opera user—Opera delivering superior functionality and speed. However, with each subsequent version, Opera grew less and less user-friendly, to the point that I threw up my hands in anger and reluctantly switched to what seemed the least of the many evils: Firefox.

Unfortunately, Firefox has continued with the same user-despising trend as Opera. Negative developments include, but are by no means limited to, removing the options to turn images and JavaScript on/off from the GUI, necessitating a visit to about:config, and severely reducing the usability of the image filtering by removing the generic black-/white-list system in favour of a rights system where rights can only be set for the domain of the current page (but not for e.g. a domain that provides images displayed on that page). Worse, as I recently discovered during the update of an older system: when these options were left in the “off” position in a version that had the toggle in the GUI, an upgrade to a version with the toggle in about:config would automatically, without asking the user, and in direct violation of reasonable expectations, turn them on again—absolutely inexcusable! Generally, Firefox has a severe usability problem through forcing central functionality into unofficial plug-ins that have to be installed separately. Yes, plug-ins are great. No, it is not acceptable to move functionality central to the product into plug-ins, or to force the user to install a plug-in for something that should be done through a setting. (However, installing a plug-in to provide a more advanced version of central functionality is acceptable. A JavaScript on/off switch is a must in a browser, and a per-site toggle very highly recommended, but the full functionality of the NoScript plug-in is legitimately put in a plug-in.)
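To give an idea of what the about:config detour involves: the relevant preferences can also be set via a user.js file in the profile directory. The names below are those of the Firefox versions that I have used, with no guarantee that they survive the next redesign:

    // user.js in the Firefox profile directory:
    user_pref("javascript.enabled", false);     // JavaScript off
    user_pref("permissions.default.image", 2);  // images: 1 = allow all, 2 = block all, 3 = block third-party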
While Firefox removes central functionality, it also includes more and more non-central functionality that rightfully should be (but is not) in a plug-in, e.g. the “sync” functionality. Or what about the many, many URLs that can be found under about:config for a variety of unspecified tasks, some of which are highly likely to include unethical “phone-homes”—and some of which definitely expose data to Google (a by now entirely untrustworthy third party)?
One of my main beefs with Firefox, present since day one, has not improved one iota over possibly some five years: I like to run different instances of browsers for different tasks (at home using different user accounts, at work at least using different profiles). Under Firefox, this means a lot of unnecessary work. For instance, installing a certain plug-in for all users is not possible (or rather: there is an alleged way, but it is poorly documented, it is non-obvious, it requires far more work than a single-user installation, and, judging by my one attempt a few years back, it simply does not work). Profiles, in turn, are very poorly thought-through, having no official means to copy them, requiring command-line intervention to run more than one profile at any given time (cf. the example below), and, when push comes to shove, merely solving a problem that would not have existed in the first place—had Firefox made proper use of config files. If it had, one could just tell it to use the settings from file A for this instance and file B for that instance, with no additional programming and no cumbersome profile concept. Whether using profiles or additional user accounts, a major issue is having to go through a good many settings for each instance: Settings are the most natural thing to export and import between parallel instances—but this is not allowed. What Firefox provides is a means to export bookmarks and similar—but that is near useless for any practical purpose. (Yes, this could be handy when e.g. moving from computer A to computer B. However, then I would most certainly want the settings too. For parallel use, in contrast, the settings are far more important: I may need to alter one or two individual settings between instances, but the websites visited will be almost entirely disjoint.)
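For the record, the command-line intervention amounts to something like the following (a sketch; the profile names are hypothetical and would first be created via “firefox -ProfileManager”, while -no-remote ensures that the second invocation starts a genuinely separate instance instead of deferring to the one already running):

    firefox -no-remote -P work &
    firefox -no-remote -P private &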
One of the most atrocious examples of stupidity is the German “Energiewende”: A massive and costly intervention has been made to move energy consumption and production to “renewable energies”, and many criticize it already for the costs or for the many implementation errors that have unnecessarily increased those costs or distributed them unfairly. Personally, I could live with the costs—and have to admit that the increase in renewable production capacity has been far more successful than I thought it would be. Unfortunately, there is one major, disastrous, and incredibly counter-productive catch: The form of production that has been replaced is almost exclusively nuclear power—while the use of “fossil fuels” (especially coal) has actually increased (!). In other words, the net effect of this massive and costly intervention is increased pollution… (Notably, very few people are aware that fossil fuels do far more damage to the environment and cause far more human deaths on a yearly basis than nuclear power has in its entire history, including the accumulated effects of Chernobyl and Fukushima.)
I could go on and on from a virtually endless list of examples, causing the writing of this article to continue for far too long and ensuring that almost all potential readers would have the feared “too long; didn’t read” reaction. (Not that I have any illusions about the proportion still reading, even as is.) Instead, I prefer to make a cut here, but I will give some honourable mentions that I had originally intended to include with one or several paragraphs each:
- Deutsche Bahn (“German Railways”) demonstrates so much incompetence on a daily basis that I could write several articles on that topic alone.
- Museums used to be a way for those with an interest to actually learn something. Today, they are rapidly degenerating into cheap entertainment, and they pride themselves on their “family friendliness”, which means that those who try to learn have to cope with children running around and screaming without anyone intervening. In many ways, what the typical museum of today does is antithetical to the purpose of a museum…
- The abysmal state of groups like journalists and teachers, who should be among the intellectual elite, yet are so often so embarrassingly poorly informed and poor at thinking.
- Belief in various superstitions and pseudo-sciences, e.g. astrology and homeopathy.
- The lack of queue management in stores, where a further checkout counter is only opened when the queue is already several times as long as it should be—not when it becomes clear that the queue is starting to get out of hand.