Michael Eriksson's Blog

A Swede in Germany

Posts Tagged ‘Psychology’

Lake Woebegone (sic!), where all the teens are more miserable than the others


As much as I sympathize with the many who had a poor experience of life in high school, or school in general, I find annoying the self-obsession that sometimes, especially among girls, follows it in real life and almost invariably does so on TV.

Certainly, there are some who have it much worse than others, but most of those who complain about “poor me”* actually seem to be fairly average in their experiences. The teenage years can be hard, being in school does not help, and being surrounded by other teens is often a bad thing—but it is approximately the same for everyone. Unrequited love? Bad break-up? Fight with the best friend? A mean-girl clique? Feeling misunderstood? Studying hard and still getting a poor grade? Insecurity about physical attractiveness? Being nervous around the other sex? Being homosexual and insecure both about that and about being around the same sex? Having a close relative die? Parents divorcing? Being forced to move to a new school and starting over without the old friends? Not being made home-coming queen?** Chances are that most high-school students run into several of these and/or several others that simply did not occur to me off the top of my head.

*As opposed to “poor us”: There is a difference between complaining about general problems using personal experiences as examples (I often do, myself; note e.g. the locked-in-a-train situation in my previous text) and painting fairly normal problems as something much worse than what others encounter. Ditto between complaining about one’s own specific situation when it is unusually tough and when it is not. Ditto between just letting off some steam with a personal complaint and demanding the sympathies of the world.

**With variations depending on personal priorities, e.g. “Not being made valedictorian?” and “Not making the football team?”. (I stick with a more female perspective in the main example, as the girls, again, seem to be worse complainers on average.)

Nevertheless, we have many who complain about their specific personal situation relative to everyone else based on just these several items. Consider “Tall Girl”*: Unsurprisingly, this movie deals with a tall girl. She goes to high school and her complaints center on being tall and on how everyone, supposedly, considers her a freak. Various adventures ensue, and towards the end of the movie she gets up on a pedestal** and gives a speech to the gathered students about how horrible it is to be a tall girl in high school—most of whom likely had problems of a similar or greater magnitude, but were not given a pedestal by the filmmakers.*** Interestingly, she was not even that tall, being somewhere in the 1.80s, where she could still expect to, e.g., find plenty of taller men. Certainly, she did not seem to consider or take advantage of potential upsides of being tall, say, a chance to be the star of the local basketball team and maybe getting a college scholarship—while there is no or very little upside to most other teenage complaints, e.g., unrequited love. (Well, I suppose that unrequited love might help someone looking for a career in poetry, but …)

*A third-rate movie that I caught when stuck in a hotel room with nothing else to do. With reservations for the exact name and various other details.

**Whether literally or just metaphorically, I do not remember.

***In the case of fiction, there is often a question of whether the fault results from a realistically portrayed character or from filmmakers pushing some angle. In the latter case, there is the additional question of whether the angle is more-or-less arbitrary or rooted in personal experiences.

Of course, this type of condescending and self-centered speech is not unique to “Tall Girl”. On the contrary, I have seen a number of variations over the years. The same attitude, without a speech, is very common. Consider “Buffy the Vampire Slayer” for a much more intelligent take and many interesting contrasts between those with superficial problems, like Cordelia and her clique, and those like Buffy (who spends her days in school and her evenings risking her life to fight “the vampires, the demons, and the forces of darkness” or something to that effect). A particularly interesting scene follows shortly after the death of Buffy’s mother: kid sister Dawn is at school, desolate and crying in the bathroom, and it is revealed that this was over some boy or something mean that some inconsequential mean-girl said—the news of her mother’s death has yet to arrive.

The trigger for writing this text, however, is “13 Reasons”, where I am currently midway through episode 4. The premise is that a high-school girl, Hannah, has killed herself and left behind a set of tapes with various recapitulations of how some other students gave her these “13 Reasons” to end her life. So far? Nothing truly impressive. Combined with her accusatory tone and streaks of pontification on tape, and her behavior during flashbacks on screen, she often seems to be over-sensitive, irrational, self-centered, and/or attention-seeking. (To the series’ credit, something similar is also stated by one of the other characters.) Examples of her complaints include boys making a list of who-is-or-is-not attractive (apparently male sexism,* even though she did well), which led to one of her “friends” freaking out and blaming her for a breakup.** Well, that sucks, but it is not the end of the world and nowhere near suicide territory. It is certainly not, for instance, comparable to having a parent die or oneself losing a limb in a car crash—and I doubt that suicide is a typical reaction to these either.*** Few, in all fairness, see a “13” in lieu of the “several” above, and within a comparatively short time; but, so far, I do not see suicide as an even remotely reasonable reaction—or the death rate of teenagers would be far higher than it is.

*In real life, over twelve years in school, I encountered exactly one such list—made by the girls.

**I am a little vague on the details, but it might be that Hannah’s “friend’s” boyfriend was the one who was complimentary, which led to unwarranted suspicions of an affair with Hannah. The loss of this “friend” is the worst damage seen so far, but, from the overall material, it seems to not have been a true friendship to begin with (hence my scare quotes). This does not lessen the pain in the moment, but it does reduce the practical damage.

***I acknowledge that those who commit suicide in real life do not necessarily have reasons that others would understand, but when an entire TV series is made on the topic, such reasons should be present. I note, in particular, that there have not been any signs of pre-existing complications, say, a clinical depression, a severe substance-abuse problem, abusive parents, or a very prolonged state of unhappiness. Moreover, it is clear from the existence of the tapes that suicide was neither a spur-of-the-moment decision, nor a “number 13 was the last straw—I just cannot take it anymore”.

Of course, all this holds even going by Hannah’s versions of events, the truthfulness of which has been disputed by at least one other character.

I had great hopes, after a promising first episode, but right now I am uncertain whether I will even watch episode 4 to an end—in part, because the promises of the first episode do not seem to be fulfilled; in part, because there have been repeated unrealistic “evil male” portrayals;* in part, because I fear that the series will end with some type of cop out, e.g. a rape scenario, as there are strong signs that a group of boys is trying to keep something very bad quiet.

*Including, in this episode, a photographer who secretly takes photos of other students. The sheer amount of such portrayals in modern fiction is both tedious and annoying. To boot, they can feed into the very distorted view of men that many in modern society (already) have.

Excursion on mean girls and suppressed information:
When I hear about somewhat similar events in real life, say, that some girl did commit suicide or that some girl saw her reputation blown to pieces by incriminating photos, the formulations used typically amount to “poor girl” vs. “mean fellow students”. However, looking at the type of meanness presented, e.g. that the girl with incriminating photos is condemned as a slut or was excluded from her previous group of friends, it usually seems more like something that specifically the other girls would do. (In the second example, additionally, because girls usually have more girls than boys as friends.) Factor in what I have myself seen and heard in real life, most (at least, pre-woke) fictional portrayals,* and the known issue of the ethnicity of criminals being censored by media, and I strongly suspect another case of suppressed information, namely that the perpetrators are predominantly female. I also note a similar pattern of “society”, “media”, whatnot being blamed for “pressuring” women into this-and-that, where it is often clear that the pressure either stems from other women or from the individual woman, herself, e.g. because she believes that others have certain expectations. (Eating disorders are a recurring example.)

*Note e.g. parts of an older discussion of Carrie and the book, itself.


Written by michaeleriksson

November 23, 2022 at 10:58 am

The fallacy of the superior observer


Preamble: This item has been on my backlog for a long time, in part because giving it a fair treatment would require a considerable amount of effort, including for gathering proper examples. Below, I give an abbreviated and a little simplistic treatment, in order to get rid of the backlog item. The general idea should still be clear.

Time and again, I see someone* trying to observe, analyze, or even psycho-analyze the intentions, behavior, whatnot, of someone else in a very smug and superior manner, somewhat like a stereotypical** anthropologist in a jungle village, displaying an attitude of “I observe you; ergo, I am superior to you”, “I observe you; ergo, I understand the world better than you do”, “I observe you; ergo, I understand you and/or your situation better than you do”, or similar. These “ergos”, however, are fallacies. Firstly, the respective conclusion, as such, does not hold.*** Not only is it a non sequitur, but a contradiction can easily be created by having the parties simultaneously (or at different times) observe each other.**** Secondly, the facts at hand are often, but by no means always, the reverse, with the subjects being smarter and/or better informed than the observers.

*I do not claim to be innocent of the same type of analysis, but, attempting to look back, I hope to have avoided the fallacy part, which lies in flawed reactions and conclusions.

**I leave unstated how common this behavior is among real anthropologists in jungle villages.

***Note that bad logic remains bad logic, even should it land at the truth, just like a still-standing watch still stands still even when it shows the right time (as it, twice a day, proverbially does). Also note the major difference between “I observe you; ergo, I am superior to you” and “I observed you do a handful of stupid things and, from your behavior, I am inclined to see myself as the superior”.

****A situation that can easily arise naturally, even for a more formal setting. Consider e.g. Student A trying to earn extra money through participation as a subject for Study X, where Student B has the task of observing his behavior as part of his course requirements—while Student B tries to earn extra money through Study Y, where Student A is the observer. In more informal settings such cases abound, as with those who play “Psych 101” on forums and usually reveal more about themselves than about their victims.

An illustrative case is an anecdote that I read somewhere (approximately paraphrased from memory and with reservations for details):

A psychologist is giving a personality test to a group of engineers.

Psychologist: Any questions?

Engineer: Should we use the same personality on both sides of the page?

Psychologist: You are supposed to answer truthfully!!!!

Engineer: How stupid do you think that we are?

Here, firstly, the psychologist seems* to commit the fallacy, by seeing the engineers as test subjects, who are to be obedient, cooperative, and analyzable, with no regard for their own interests,** much like school children filling in a similar survey—inferior, because they are observed; the psychologist superior, because he observes. Secondly, the engineer seems to believe himself well ahead of the psychologist in terms of understanding the situation and/or of general intellectual capabilities, which is premature.*** If the engineer also applies an “observer mentality”, we actually have two parties simultaneously committing the fallacy against each other. (I suspect that this reciprocal fallacy is somewhat common in certain circles.)

*I can only speculate, and a common contributor to the fallacy and/or a common result of the fallacy is to take speculation to be the truth. (However, for convenience, I will skip the “seems” and whatnots below.)

**Various tests and surveys can reveal more about an individual than he wishes to be known, and if the wrong entity sees the information, and/or if the information is not sufficiently anonymized, this can have negative consequences, e.g. that a promotion goes to someone else, because HR sees a certain character trait as negative.

***From what I have seen of engineers and psychologists, he might well be correct, but the belief is premature, even should it be correct—unless it also draws on earlier interactions and whatnots.

Other typical examples include that guy on a forum who likes to “Psych 101” his co-debaters, their intentions, and their whatnots, often while being entirely wrong;* the Leftist, especially Feminist, ideologues who write papers on groups that they dislike, while assuming that these groups see the world the wrong way, are naive, whatnot;** and, indeed, many anthropologists, psychologists, etc., who try to analyze societal behavior.

*In those cases where I have been on the receiving end and can judge the truth.

**In reality, it is usually the other way around.

An impulse to finally get this item done was the recent mention ([1]) of a paper that “explore[s] how September 11 and subsequent events have been experienced, constructed, and narrated by African American women, primarily from working-class and low-income backgrounds.”, which not only seems to be an absurd topic,* but also stands a good chance of committing the fallacy. (Whether it does/the authors do, I do not know for sure. However, I have seen similar formulations used by those with the wrong attitude before.)

*While I do not think highly of research into “constructed” and “narrated” in general, here a more interesting and potentially legitimate choice would have been a compare and contrast between different groups, e.g. whether men and women, Whites and Blacks, high- and low-income subjects, whatnot, have different views of the events, experienced the events differently, etc.

Finally, as an exercise for the reader: If a psychoanalyst engages in self-observation/-analysis, should we expect the result to be a superiority complex, an inferiority complex, or both?

Excursion on borderline cases:
It can often be hard to draw borders, e.g. between the fallacy and attempts at manipulation of third parties, horribly misguided speculation, and similar. For example, I once saw a (likely German, but set in China) news clip, which featured a well-dressed woman on a bicycle—and a voice-over on how this woman would consider herself something better than the rest of the persons present. However, if the makers of the news clip had even exchanged a single word with this woman, it was not shown on screen—and neither was any other act of hers than riding a bicycle. (And, no, she was not someone famous, nor anyone of any major concern for the clip.) This was certainly poor and unethical journalism, to the point that a firing seems warranted, but was it also an example of the fallacy or was the cause something else?

(If it was an example of the fallacy, it was also hypocrisy beyond belief.)

Excursion on being in charge:
In some cases, e.g. with psychologists performing a study, there can be overlap with another fallacy, namely that the one who is in charge is automatically superior in general and/or that he* is in charge because he is superior rather than, say, by coincidence or because he happens to be on his home turf, e.g. in an office where he wields some power in the name of his employer, while the roles would be reversed in his counterpart’s office. Remember, in particular, what they say about small people who are given a little bit of power.

*But note that the problem, in my impression, is much more common among women, including the type who leads a three-person department and considers herself a big shot, while she is actually small fry by any reasonable standard. This to the point that I wrote the first draft of this excursion using “she” and “her” over the generic “he”.

Excursion on observers revealing themselves and going full circle:
As I note above, the “Psych 101”-ers usually reveal more about themselves than about others. This is a potential issue with observers in general, even among those who do not commit the fallacy (although it might be more common among them), and including me. What we relay to others is almost invariably colored by our perceptions, our priorities, our attempts to guide the perceptions we leave with others,* whatnot, and this allows others to draw conclusions. Of course, their conclusions will in turn be colored by their perceptions (etc.), which can make their conclusions misguided and tell us more about them. And so on. Indeed, one of the issues with “Psych 101”-ers is how often they try to, so to speak, describe the color filters of their victims but end up describing their own color filters instead.

*We all do, to some degree, and that too is something that might inadvertently reveal things about us. For instance, I have a perfectionist drive, but am usually forced to write texts well short of perfect, e.g. for reasons of time or priority. This, and the (maybe, irrationally perceived) risk that the reader will believe that I fail to spot the flaws, often irks me to the point that I add a comment on the issue, as e.g. with the “preamble” at the beginning of this text—even when I know that very few others would have felt the need to do so. (Many of my footnotes are also caused by this perfectionist drive, but in a more immediate manner. Here it is not a matter of what the reader might think, beyond “Too many bloody footnotes!”, but of a wish for completeness and thoroughness.)

Written by michaeleriksson

November 4, 2022 at 1:22 pm

Life as a (bad) cosmic joke, disturbances, and my rotten-to-the-core building


As I noted in an older text:

There are days when I can barely suppress the suspicion that life is a weird cosmic joke, “Truman Show”, or scientific experiment that replaces mice in a labyrinth with humans in a Matrix.

Even this formulation is kind: to speak of a Kafkaesque nightmare sometimes seems justified. (And, indeed, I suspect that Kafka’s popularity is rooted in how recognizable many of his stories are, more reflections of reality in carnival mirrors than something freely painted.)

One recurring issue is that when I set a day aside to deal with e.g. outstanding correspondence or something IRS-related (as in that text), something invariably happens that wreaks havoc on my plans. So yesterday:

Last year, I was plagued with construction work (with a severe reduction of life quality and ability to work on my book over roughly six months) and other unnecessary noise; then the COVID-19 (-20?) lock-down brought a few months of renewed problems, with hours of disturbances per day. Disturbances, obviously, that did not just affect me during those few hours, but also over the day through a ruined mood, uncertainty if and when they would resume, and a severe sleep shortage. Among the side-effects, I have a large backlog of often temper-ruining correspondence and other tasks, because I did not want to risk the additional stress when I was already beset by these disturbances.

As the disturbances had returned to a normal* level, my quality of life was up again, I was in a good mood, and I had cleared my schedule for yesterday, to go through half-a-dozen tasks—one ruined day and then back to a good mood.

*Higher than what I would consider normal based on other apartments, but within the tolerable with occasional use of ear plugs or ANC head-phones.

The day was ruined alright …

I was awakened at 08:57 through various very loud noises and poundings, which were joined by even louder drilling an hour later. I saw myself forced to flee the apartment. I returned around 12:42, and there was brief peace, but the construction work soon began again and lasted, with only short interruptions, until late in the afternoon. (I tried to get by with simultaneous use of ear-plugs, ANC head-phones with loud music, and a sound bar also playing loud music.) Obviously, even attempting the intended tasks was out.

Today, the same shit started up again—twenty minutes earlier …

As always, there was no notification about the scope of the works, and I have no idea whether I will have peace and quiet tomorrow or whether I will see another six months ruined. I have no idea whether it would make sense to try to go on a vacation or find a secondary apartment—or for how long. Obviously, I am worse off than last year, due to the limits posed by the COVID-19 counter-measures, as e.g. travel is less free and spending two hours in a restaurant (to get some time away from the noise) is trickier.

Generally, the amount of problems that have arisen in this house, with these neighbors, with this building management, whatnot, is astounding and entirely disproportionate. Consider for instance:

  1. The entrance to my storage unit in the cellar has been blocked by some type of rubble, pouring out of two openings in the wall, for more than two months … (Note that this does not just make it impossible to access said storage unit, but that it might also be an indication of some type of problem with the building, which would be urgently looked into by any non-negligent building management.) At first, I gave it time, expecting that either building management would act on its own or one of the neighbors would own up to having caused the issue. As there are at least two* representatives of building management who should have noticed it within a week, this should have been reported fairly soon, even without my intervention. Nope … Eventually, I wrote a letter telling building management to clear this mess up, ensure the safety of the building, and find and bill whatever idiot had caused this problem. The result was a one-sentence reply, which amounted to one of the neighbors having been notified. Building management appears to have taken no other action, and whoever the neighbor, he has not either …

    *A weekly cleaning lady for the stairs-and-whatnot and someone who carries the garbage cans from the cellar to the street the one day, for collection, and back again the next. (The former is a professional, the latter might or might not be a local contracted for the purpose as some type of “micro-job”.)

  2. I have terminated my contract with the gas supplier, seeing that I use very little gas and have to pay an entirely disproportionate amount through fixed monthly fees and that I can avoid the annoying chimney sweep (cf. at least [1], [2]).* I received a notification from the gas supplier that my contract was terminated—and, a little later, a second notification amounting to “someone has terminated the gas supply to apartment XYZ, likely an old tenant moving out. Because you are the owner, we have automatically opened a new contract** for you.”, an interpretation of events and an action that is utterly absurd. (I have written back.) To boot, the chimney sweep also refuses to accept either that I have terminated my gas supply or that a terminated gas supply would be a valid reason to not check the heater, which from my point of view is just scrap-metal still hanging on the wall. Performing this check without any gas might be an interesting challenge. I wonder whether physically removing it would be enough …

    *Other reasons include that the old gas heater is old, poorly placed in the bathroom, and overly large: even if I continue with gas in the long haul, it would be after replacing it with something better and more modern. I might or might not do so come winter, but until then I have even less need for gas.

    **This might be an inappropriate term in context, but I do not want to get bogged down in details.

    In effect, I get rid of the gas to avoid unnecessary costs and unnecessary interactions with morons, and the result is that (a) I have no gas and (b) I am still supposed to pay and to interact with morons.

  3. One or several families with children put their prams and whatnot wherever they like. For instance, the door to the cellar is often blocked by prams. Last week, it was a three-seater (!) that would have made it impossible to access the cellar without simultaneously blocking the stairs completely, so that no-one could get in. Even maneuvering it sufficiently to get to the cellar door, even at the price of blocking the stairs, might have required me to go out the front door first (and/or to push the pram out the front door). Certainly, there would have been no chance of getting out of the cellar again, had someone wanting to enter the building put it back while I was down there—and the chances with even a smaller pram might not be brilliant either.* For instance, the access to my storage unit (even rubble aside) is often blocked by a pram. For instance, one of the neighbors often has two trikes (or similar) standing outside her apartment door, partially blocking the stairs and posing a definite risk that someone will stumble and take a fall.

    *There is a connecting door to a neighboring house, so it might be possible to get out that way, assuming that no-one has locked said door …, but that would imply a very major detour.

Excursion on other cases:
Like above, most cases go back to some mixture of human stupidity, irrationality, and lack of respect for others, sometimes in combination with odd coincidences. Politics, e.g., is filled with examples of the opposite being done to what is sound and reasonable, as with some of the COVID-19 reactions*, the attempts to kill nuclear power at all costs, even when this implies that much more damaging fossil fuels must be used, the unholy Conservative/Social-Democrat alliances in German politics, or some of the issues from yesterday’s text**. This, note well, not counting the many cases where I strongly disagree with developments but there is a more legitimate room for different preferences, as with reductions in civic rights “for the safety of the people” and ideologically driven increases in redistributions.

*Probably all, if we look at the specific aspect of lack of a scientific approach and a policy making that amounts to pin-the-tail-on-the-donkey, but that does not automatically imply that all are actually ineffective, harmful, or otherwise negative. See a great number of older texts for more specific discussions.

**My claim there that “the constant exposure to human stupidity and irrationality” would be the main obstacle to my own happiness needs extension with “human lack of respect for others”, as above. Also see e.g. [3] and [4].

Written by michaeleriksson

July 1, 2020 at 11:02 am

A few thoughts on stimuli and emotions


Watching a horror piece*, I find myself reminded of a few similar phenomena with, I suspect, a common explanation.

*The “Black Mirror” episode “Playtest”.

Specifically, I* often experience a build-up of emotions, emotional reactions, and whatnot, where a (for want of a better word) stimulus initially has little impact, but eventually becomes something major. This would be well explained by assuming that certain (again, for want of a better word) channels of the mind, likely with a physiological** background, are continually drained by some mechanism, much like the drain in a basin. When a small stimulus is present, the equivalent of a slightly opened faucet, then the basin remains more-or-less empty, because the drain swallows the water almost immediately. When the faucet is opened wider, the basin will have some water in it, but the amount will be more-or-less fixed***, because the drain still swallows it all, at roughly the same rate as it enters, but there might be some time before any given water particle is swallowed. When the faucet is opened wider yet, the drain will eventually not swallow the water at a sufficient rate and the amount of water will increase until the basin overflows (unless a secondary drain comes to the rescue, as is the case with most modern basins, but not necessarily the human mind).

*I suspect most others too, but I can only actually speak for myself.

**The exact nature is beyond my speculation as my knowledge of e.g. brain physiology has never moved beyond the informed layman’s and my latest readings are a good many years behind me. I do note, however, that some set of this-and-that receptors, re-uptake, etc. would make a decent fit.

***I.e. we still have an equilibrium, or something close to it.
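
To make the metaphor a little more concrete, consider the following toy simulation (Python; entirely my own illustration, with arbitrary numbers, and assuming that the drain removes a fixed fraction of the current level per time step, roughly as a real drain responds to pressure):

    # Toy "faucet and drain" model: a constant inflow into a leaky basin
    # settles at an equilibrium level of inflow / leak_rate; "overflow"
    # occurs when that equilibrium exceeds the basin's capacity.
    # All numbers are arbitrary and purely illustrative.
    def simulate(inflow, leak_rate=0.2, capacity=10.0, steps=200):
        """Return when the basin overflows, or the level it settles at."""
        level = 0.0
        for step in range(steps):
            level += inflow - leak_rate * level  # faucet in, drain out
            if level > capacity:
                return f"overflow at step {step}"
        return f"settles near {level:.2f}"

    print(simulate(inflow=0.2))  # slightly opened faucet: settles near 1.00
    print(simulate(inflow=1.0))  # opened wider: settles near 5.00
    print(simulate(inflow=3.0))  # wider yet: equilibrium 15 > capacity; overflows

The three calls correspond to the three faucet settings above, with equilibria of 1, 5, and 15 units against a capacity of 10; only the last overflows.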

Consider e.g.:

  1. Watching a (scary, non-laughable, non-splatter) horror movie and how the tension and anxiety, even fear, felt increases until the viewer wishes to take a break or otherwise relieve the tension (also cf. below)—and how a short break can make watching a few more minutes that much easier, until the tension has built to the critical level again. This, while a longer interruption, e.g. to write a discussion relating to horror movies,* can lead to a much greater respite before the tension becomes critical again. Of course, in all three cases (continued watching, short break, long break) the actual movie remains the same.

    *However, the aforementioned “Playtest” is not that bad. The situation of the protagonist, in universe, is repeatedly truly horrifying even by horror-movie standards, but the viewer has it easier than with many other works, as there is a fair amount of comic relief and relief through threats that turn out to be harmless, and at least some of the episode is less horror and more “meta-horror” and the usual “Black Mirror” investigations into consequences of technology. Besides, this is a second watching. (Now, “Alien” or “The Blair Witch Project” on a first watching …)

    Similarly, the increasing fear that I have experienced during prolonged times close to potential falls. I am afraid of heights, but usually in a controllable manner. However, I can e.g. recall how I once was in a museum looking down on a few very large statues from several floors up for, possibly, ten minutes. By the end, the originally weak fear had risen so close to a panic that I had to move away, unable to take the anxiety any longer.

  2. The increasing mirth when watching a good sit-com, where the first joke or humorous situation might bring a smile or a giggle, while similarly funny portions bring a stronger and stronger reaction, until the point of major laughter is reached. Here too, on rare occasions, I can wish for an outright break, e.g. by putting an episode on pause for a little while. This in part because any further jokes might prolong the laughter and the positive feeling, but not* make it stronger, implying that it is better to take a break, to let the metaphorical water level sink a bit, and only then continue for a new build-up; in part, because it becomes hard to simultaneously laugh and pay attention to what happens on the screen.

    *Or, if it could, possibly to a degree where the situation, literally, became unhealthy.

  3. The increasing annoyance caused by a continuing disturbance, which goes from a triviality* to a horror as it continues, on and on and on and on. In my case, with sufficiently long disturbances, they can even cause rage. Compatible with the metaphor, I have also found that anger surfaces much more rapidly on days when I have already been angry, often through such a disturbance, e.g. in that I have a first understandable anger because someone has ruined my sleep by raising hell at 6:30 in the morning** and that I have a second anger over something much more trivial later in the morning, e.g. because I prepped my coffee maker before showering, but forgot to turn it on. (Something, which would normally just give me a brief moment of disappointment.)

    *After it first enters the realm of awareness. The pre-awareness time is probably usually fairly short, but it might involve some other mechanism than the “faucet and drain”.

    **Unfortunately, not a fictitious example. This has happened quite often the last few weeks, complemented by several past-midnight incidents and quite a lot of odd stomping, hammering, and whatnot during the days. That the anger, understandably, grows worse with every occurrence is probably yet another mechanism. (Yes, I try to apply stoic principles, but it is not that easy in the moment.)

It seems likely to me that the channels are not entirely separate and/or that the conscious mind can be distracted from one channel to another. Consider e.g. comic relief* in the case of a horror movie—a sudden joke will not only reduce the anxiety from the scary parts, it will also typically have a much stronger humorous effect than the same joke would have in a less stressful situation. (The phenomenon of “nervous laughter” is likely related.) Or consider the rare works of fiction that manage to hit a spot where the viewer/reader/whatnot is simultaneously laughing and crying**: I hardly ever experience it, but it is an immense feeling on those rare occasions.

*Disclaimer: It is possible that I slightly misuse the term here, but my meaning should be clear.

**Due to a very sad (or very happy) situation—not because the laughter has grown strong enough to cause tears in its own right.

More speculatively, I could see a connection with mood swings. For instance, I have the subjective impression that I am more prone to a strong negative reaction when I am on a (positive) emotional high, e.g. after having watched a particularly funny sit-com. Say that I hit my elbow on something: when I am in a neutral mood, my reaction tends to be an equally neutral “that was painful”, but when I am in a strongly negative or positive mood, the reaction is likely to be in the fuck-this-piece-of-a-shit-of-an-object direction.

Looking at autism*, just assuming that autists/aspies/whatnot have a smaller drain (or a smaller basin) would go a long way to explain many differences in behaviors and preferences relative to NTs, in that they are, in reality, not that different, but happen to be triggered by a different level of stimulus, be it through a difference in magnitude or duration (cf. the continuation of the earlier sketch below). Even the likely most stereotypical** behavior, flapping, appears to be more a matter of exceeding some level of excitement or anxiety than anything specifically autistic. For instance, some type of flapping is regularly used in animes*** to indicate exactly extreme excitement—but not autism. On the contrary, it seems much more common in everyday outgoing high-school girls than in even introverted high-school boys. (The boys, those hentai-kuns, appear to be more prone to nose-bleeds, however.)

*I am a suspected aspie.

**Possibly unfairly: I have no recollection of flapping myself and have only very, very rarely seen another suspected non-NT flap (and I worked for two decades in the software industry).

***It might, conceivably, be a trait shared by autists and the Japanese, but that seems less likely.
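
In terms of the toy simulation above (again, purely an illustration of the metaphor, not a claim about actual physiology), a smaller drain means that a stimulus which others comfortably settle under now exceeds the capacity:

    # Same inflow as the middle case above, but a quarter of the drain:
    # the equilibrium (1.0 / 0.05 = 20) exceeds the capacity (10), and the
    # basin overflows. (Uses simulate() from the earlier sketch.)
    print(simulate(inflow=1.0, leak_rate=0.05))  # overflow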

Written by michaeleriksson

May 19, 2020 at 8:39 pm


The effects of our base-line on perception / Follow-up: A few thoughts on traditions and Christmas


Traditions [1] were the topic for a Christmas text last year. In the almost exactly one year since then, I have again and again noted various overlaps with the sub-topic of our perception of normality. More specifically, it seems that there is a point of “normality”, where something becomes so familiar that we do not notice or reflect upon it, or where we experience it highly differently from less familiar phenomena and/or from how others experience the same phenomenon.

A few examples:

  1. As children, I and my sister often stayed for prolonged times at our maternal grand-mother’s. She declined many wishes for pasta and rice with the argument that “we already had that once this week”—but had no qualms about using boiled* potatoes as the “staple” five to seven times a week. In all likelihood, she genuinely** did not perceive the paradox in this argumentation, being so used to potatoes that they were a standard part of any meal***—just like the glass of milk.

    *Mashed or fried potatoes happened on occasion; I am not certain whether she ever served French fries.

    **To which should be noted that she was not very bright—others might have been more insightful even in the face of ingrained eating habits. Unfortunately, back then, I took it to be just another case of dishonest adult “argumentation”.

    ***She was born in 1924 and grew up with a very different diet from even what I (1975) did, let alone what some born today will. Indeed, left to her own devices, deviations from boiled potatoes were more likely to have been e.g. kåldolmar (cabbage rolls) or rotmos (a rutabaga mash with some admixture of potatoes(!) and carrots) than rice or pasta.

    Consider similarly my own caffeine habits*: I drink large amounts of black coffee—no sugar, no milk, no cream, … This despite originally not liking the taste. When it comes to tea, I have tried repeatedly to use it as a substitute, but within a week or two of a cup a day, the experiment always ends, because I do not like the taste.** I have used e.g. Nespresso and Dolce Gusto machines, but eventually grew tired of the taste and returned to drip-brews. Similarly, when I ordered coffee in restaurants, I used to take the opportunity to have an espresso or a cappuccino—today, I almost invariably order a “regular” coffee. What is the difference, especially since I did not originally enjoy coffee? Simply this: I have drunk so much of it that it has become a taste norm. Tea does not have that benefit and other variations of coffee are implicitly measured as deviations from that norm. The latter might even taste better in the short term, but then I simply “grow tired” of the taste.

    *Also see parts of [1] and of a text on prices.

    **In fairness to tea: I have so far always used tea bags—some claim that they are a poor substitute for tea leaves.

    This item has some overlap with (but is not identical to) the concept of “an acquired taste”.

  2. Why does boy-meets-girl feel less hackneyed than childhood-friends-fall-in-love? (Cf. an excursion in [2].) Well, the former is so common that it does not register in the same way as the latter—despite the paradox. Or take teenage-girl-and-much-much-older-vampire-fall-in-love: Only a very small minority of all works of fiction has this theme, and it would likely amount to a minority even of the vampire genre. Still, it feels so hackneyed that my reaction typically is “not this shit AGAIN—I will watch something else”. A higher degree of rarity can even increase the perceived hackneyedness, because the concept registers more strongly.* Beyond a certain rarity limit, the recognition factor might be so large that the automatic reaction is not “hackneyed” but “plagiarized”…

    *However, another partial explanation can be that a theme has still not been explored enough, leaving works using a certain concept too similar. For instance, the overall vampire genre is much more diverse today than in the heyday of Christopher Lee, because so many new variations of the theme have been tried over time—“vampire movie” no longer automatically implies scary castles, big capes, the surreptitious biting of sleeping maidens, or similar.

  3. Virtually every generation complains about the music of the following generations. To some degree this can be due to actual falling quality (e.g. through increased commercialization or a shift of focus from music-on-the-radio to exotic-dancing-on-TV) or a greater filtering of old music (where only the great hits have survived); however, a major part is the base-line that we are used to (likely coupled with nostalgia). Notably, the hit music of a certain period appears to fall mostly into just several fairly specific genres, with a great internal similarity in “sound”. Those who grow up* with a certain sound will tend to see it as a norm, be more likely to be estranged by newer genres and be more able to differentiate within and appreciate the old genres. (Hence complaints like “it all sounds the same”.)

    *In my impression, most people listen to more music and more intensely in their youth than at higher ages, and they might be more easily malleable to boot (be it for biological reasons or because the prior exposure has been lower). However, I suspect that amount of exposure is more important than age.

    A similar effect is almost certainly present between contemporaneous genres that differ considerably.

  4. As a small child, I somehow got into a discussion with my parents as to why the clock on the kitchen wall was not audibly ticking. They claimed that it was, but I could not hear anything. On their insistence, I spent a short period listening intently—and there it was! I was simply so used to the sound that it had not registered with me, until I deliberately tried to hear it…

    In an interesting contrast, I often found the antique wall-clocks at both my father’s and my maternal grand-mother’s so annoying that I used to stop them—in turn, slightly annoying my respective hosts. This might at least partially have been due to my base-line being “tickless”; however, they were also much louder than the (modern) kitchen-clock, and might also have had a more irregular or prolonged sound. (The antiques used an entirely mechanical, crude-by-modern-standards clockwork with pendulums and whatnots; the kitchen-clock had a modern clockwork, ran on a battery, and likely used a balance wheel.)

    As an aside, this points to the risk that isolating oneself from disturbances can lead to an increased sensitivity to the disturbances that do occur, while increased exposure can bring greater tolerance—a dilemma that I have long struggled with as someone sensitive to noise. An extreme example is present in the movie “The Accountant”, in which the autistic protagonist deliberately exposes himself to very loud noises, strobing lights, and physical pain during shorter intervals, apparently trying to increase his tolerance. (I caution that said movie did not strike me as overly realistic.)

  5. When I lived in Sweden, German seemed a fairly ugly language with too strong (in some sense) pronunciations of many sounds (including “r” and “s”). After twenty years in Germany, it sounds just fine, while I am often struck by Swedish as bland and lacking in character. Back then, I heard how German differed from Swedish; today, I hear how Swedish differs from German.

    English is somewhere in between and has not struck me in the same way. However, it is notable that TV and movies have left me with a U.S. base-line, in that I mostly (mis-)register U.S. English as “without an accent”,* while e.g. any version of British English comes across as British**. This is the odder, since I actually consider (some versions of) British English more pleasant to the ear and have a tendency to drift in the “English English” direction, or even towards amateurish pseudo-RP, on those rare occasions that I actually speak English.

    *But many versions of U.S. English stand out as non-standard, including the heavy Southern ones.

    **Often with a more specific sub-classification, e.g. “standard”, Cockney, Irish, Scottish; in some cases, as something that I recognize as a specific accent but am unable to place geographically. (The same can happen with U.S. dialects, but is much rarer—possibly, because British English is more diverse.)

Outside of examples like the above, there are at least two areas that might be at least partially relevant and/or over-lapping: Firstly, opinion corridors and similar phenomena. Secondly, various physical phenomena, e.g. drug resistance, specificity of training, or how the human body reacts to cold: Apparently, Eskimos “in the wild” have the ability to work without gloves in freezing temperatures for prolonged times without ill-effects, pain, whatnot—but a few years in “civilization” make them lose this ability. Allegedly, Tierra del Fuego natives have (had) the ability to sleep almost naked in the open air at low (but not freezing) temperatures, while the typical Westerner can feel cold at a little below room temperature without a duvet. I have myself witnessed two or three Westerners who walk around in t-shirt and shorts all year round (in Sweden and/or Germany—not Florida), at least one of whom made the papers for this habit—he claimed that the body adapts* if one can push through the early discomfort.

*The exact nature of those adaptations is beyond my current knowledge, but at least some of them likely relate to how fast the body switches from a low-insulation to a high-insulation state and how strong the insulation becomes. That this is trainable to some degree can be easily verified through only taking cold showers for a few weeks and noting how strongly the discomfort is reduced in that time frame. An increase of “brown fat” likely also plays a part.

Written by michaeleriksson

December 21, 2018 at 9:27 pm

Conflicting own beliefs and what to do about them


In the set of beliefs* held by anyone, there will be occasional real or imagined conflicts (consider also e.g. the concepts of “cognitive dissonance” and “doublethink”). People differ mainly in (a) the degree to which they are aware of these conflicts and (b) how they handle them. Unfortunately, most people are unaware of most or all conflicts that arise, make no attempts at detecting them, and are prone to just explain away the conflicts that are known—even descending to outright doublethink.** A particular issue with awareness is that a too faulty or incomplete understanding can make such conflicts go undetected.***

*I use “belief” as a catch-all that, depending on context, could include any or almost any belief, idea, opinion, whatnot that implies or would imply something about something else. This includes e.g. “cucumbers are green”, “cucumbers are blue”, “God does [not] exist”, and “I [do not] like chocolate”.

**This includes such absurdities as simultaneously professing to believe in Evolution and Gender-Feminism. Indeed, a great deal of my annoyance with politics/ideology (in general) and Feminism/Leftism/PC-ism (in particular) results from the adherents’ ever-recurring faults in similar directions.

***Consider again Evolution vs. Gender-Feminism: It is, for instance, highly unlikely that evolutionary processes would generate physical differences while keeping mental abilities identical—but exactly that is seen as a given by most Gender-Feminists (and a significant portion of the PC crowd, in general). Similarly, it is highly unlikely that the different roles of men and women in most societies over thousands of generations would have left no trace in form of e.g. natural inclinations. A Creationist–Feminist match-up would be less prone to contradictions.

In many cases, these conflicts are sufficiently trivial that they may be neglected.* For instance, that someone has two favorite dishes, music bands, movie stars, …, rarely has major impact on important decisions.** When it comes to topics that can have a greater impact, especially on others, care should be taken, however. Consider e.g. questions like how to vote in an election, what recommendations to make to others, what agendas to push, …—here it is important to have a sufficiently sound view of the topic; and if beliefs conflict, the view is unlikely to be sufficiently sound.

*A resolution can still bring benefit, e.g. through better self-knowledge, and I would not advise against the attempt.

**However, the resolution is often fairly simple, e.g. that none of two is the favorite and that the word “favorite” is best avoided; or that an opinion has changed over time, while still being professed out of habit.

Giving blanket rules for detection is tricky, but actually reading up* on a topic, gaining an own understanding (as opposed to parroting someone else’s), and deliberately trying to see the “bigger picture” and making comparisons between different fields and ideas, can all be helpful. Above all, perhaps, it is helpful to actually think through consequences and predictions that can be made based on various beliefs, and looking at how they stack up against both each other and against observations of reality. In my personal experience, writing about a topic can be an immense help (and this is one of the reasons why I write): Writing tends to lead to a deeper thought, a greater chance of recollection in other contexts, and a thought-process that continues intermittently long after a text has been completed.

*Note especially that information given in newspapers, in school, or by politicians tends to be too superficial or even outright faulty. Wikipedia was once a good source, but has deteriorated over the years (at least where many topics are concerned). The “talk” pages can often contain a sufficient multitude of view-points, however.

If a conflict has been detected, it should be investigated with a critical eye in order to find a resolution. Here there are at least* five somewhat overlapping alternatives to consider: (a) One or both beliefs are wrong and should be rejected or modified. (b) Both beliefs have at least some justification and they should be reconciled, possibly with modifications; e.g. because they cover different special cases. (c) The conflict is only apparent, e.g. through a failure to discriminate. (d) One or both beliefs are not truly held and the non-belief should be brought to consciousness; e.g. because profession is made more out of habit than conviction. (e) The support of both** beliefs is approximate or tentative (awaiting further evidence), and (at a minimum) this condition should be kept in mind, with revisions according to the preceding items often being necessary.*** Note that the above need not result in rejection of one belief—it can equally be a matter of modification or refinement (and it can also happen to both beliefs). This is one reason why investigation is so beneficial—it helps to improve one’s own mind, world-view, whatnot.

*A deeper effort might reveal quite a few more alternatives. I write mostly off the top of my head at the moment.

**Here it has to be both: If one belief is taken as true and only one as approximate, then it would follow that the approximate one is outright faulty (at least as far as the points of conflict are concerned), which moves us to the “One” case of (a).

***For instance, if two physical theories are not perfectly compatible, the realization that physical theories are only approximations-for-the-now (eventually to be replaced by something better) gives room for an “approximate belief” in either or both theories. As long as work proceeds with an eye at the used assumptions, with the knowledge that the results might not be definite, and while being very careful in areas of known conflict or with poor experimental verification, this is not a major issue. Indeed, such “approximate belief” is par for the course in the sciences. In contrast, if someone was convinced that both were indisputably true, this would be highly problematic.

Again, giving blanket rules is tricky, especially with the very wide variety of fields/beliefs potentially involved and with the variety of the above cures. However, actually thinking and, should it be needed, gathering more information can be very productive. Having a good ability to discriminate is helpful in general; and with (b) and (c) it can be particularly beneficial to look at differences, e.g. if there is some aspect of a case where one belief is assumed to apply that is not present in a case where the other belief is assumed to apply. With (d), it is usually mostly a matter of introspection. (In addition, the advice for detecting conflicts applies to some parts here and vice versa. Often, the two will even be implicit, hard-to-separate, parts of a single process.)

For a specific, somewhat complex example, consider questions around what makes a good or poor book, movie, whatnot—especially, the property of being hackneyed: On the one hand, my discussions of various works have often contained a complaint that this-or-that is hackneyed. On the other, it is quite common for works that I enjoy and think highly of (at least on the entertainment level*) to contain elements of the hackneyed—or even be formulaic. Moreover, I rarely have the feeling that this enjoyment is despite something being hackneyed—this weakness, in itself, does not appear to disturb me that strongly.

*Different works serve different purposes and should be measured with an eye on the purpose. When I watch a sit-com, depth of character is far less important than how often and how hard I laugh; the romance in an action movie is a mere bonus (or even a negative, if there is too much); vice versa, an action scene in a rom-com is mere bonus; plot rarely makes sense in non-fiction; etc. For more “serious” works, more serious criteria and higher literary standards apply.

Is my explicit complaint compatible with my implicit acceptance? To some degree, yes; to some degree, no.

On the “no” side: I suspect, after introspection, that I do or do not find a certain work enjoyable, thought-worthy, whatnot, based on criteria that are not explicitly known to me.* If I find enjoyment (etc.), I am less likely to look for faults; if I do not, I am more likely to look for faults—but there is no guarantee that my original impression was actually caused by the faults now found. Some will almost certainly have been involved; others need not have been; and there might have been other faults involved that I never grew explicitly aware of.

*There are many aspects of different works that can individually have a large impact, and the end-impression is some form of aggregation over these aspects. For instance, consider the impact of music on movies like “Star Wars” and “Vertigo” or on TV series like “Twin Peaks”—change the music, and the work is lessened. Notably, the viewer is rarely strongly aware of the impact of the music (even be it hard to miss in the aforementioned cases).

On the “yes” side there are at least three things to consider: Firstly, a work can be hackneyed and have sufficient other strengths to outweigh this. Poor works are rarely poor due to one failure—they are poor because they fail on numerous criteria, e.g. (for a movie) being hackneyed and having a poor cast, wooden dialogue, unimpressive music, … Being hackneyed is, alone, not a knock-out criterion—being original is an opportunity to gain points that a hackneyed work simply has not taken. Secondly, different criteria can apply to different works,* and being hackneyed is not necessarily an obstacle for the one work, even though it is for another. Thirdly, if something is known to work well, it can be worth using even if it is hackneyed—“boy meets girl” has been done over and over and over again, but it still works. (See also an excursion below.)

*Partly, as in a previous footnote; partly, with an eye on the expected level of accomplishment. For instance, my very positive discussion of Black Beauty must be seen as referring to a children’s book—had I found the exact same contents in a work with the reputation and target group of e.g. James Joyce’s “Ulysses” (which I have yet to read), I would have been less enthusiastic.

All in all, I do not see a problem with this conflict in principle; however, I do suspect that I would benefit from (and be fairer in detail* by) looking more closely at what actually created my impression and less closely at criteria like “original vs. hackneyed”. The latter might well amount to fault finding or rationalization. To boot, I should pay more attention to whether specifically something being hackneyed has a negative effect on me (beyond the mere failure to have a positive effect through originality).

*I doubt that my overall assessment would change very much; however, my understanding and explanation of why I disliked something would be closer to the truth. Of course, it might turn out that being hackneyed was a part of the explanation in a given case; however, then I can give that criticism with a better conscience…

Excursion on expectations:
In a somewhat similar situation, I have sometimes complained about a work having set a certain expectation and then changed course. That is an example of another issue, namely the need to discriminate*. There are setups and course changes that are good, in that they reduce the predictability, increase the excitement, whatnot. This includes well-made “plot twists”. There are, however, other types of expectations and course changes that are highly unfortunate—including those that make the reader (viewer, whatnot) set his mind on a certain genre or a certain general development. A course change here is likely to detract from the experience, because different genres are enjoyed in different manners, and because there is often an element of disappointment** involved. Depending on the change, there can also be a delay and reorientation needed that lessens concentration and enjoyment further. Another negative type of change is (almost always) the one that tries to rejuvenate a TV series or franchise by sacrificing what once made the series worth watching, by “jumping the shark”, and similar.

*Yes, discrimination is also a sub-topic above; however, here we have a too blatant case to be truly overlapping: There is no need for me to re-investigate my own beliefs—only to clarify them towards others. (Except insofar as I might have suffered from a similar fault-finding attitude as discussed above, but that attitude is just an independent-of-the-topic aspect of an example.)

**Note that this holds true, even when the expected and the delivered are more-or-less equivalent in net value. (However, when there is a significant improvement, the situation might be different: I recall watching “Grease” for the first time, with only a very vague idea of the contents; seeing the first scene; and fearing that I was caught in the most sugary, teenage-girls-only, over-the-top romance known to man—the rest was a relief.)

Excursion on “boy meets girl”:
An additional, but off-topic, complication when considering the hackneyed is that there comes a point of repeated use when the hackneyed does not necessarily register as hackneyed and/or is so central to a genre that it is hard to avoid. Consider the typical “boy meets girl” theme. This, in itself, is so common and so basic to movie romance that it rarely registers as hackneyed. In contrast, the rarer “childhood friends fall in love” does*. With “boy meets girl”, the question is less whether the theme lacks originality and more whether the implementation is done with sufficient quality** and whether the details are also lacking in originality (is there, e.g., yet another desperate chase to and through an airport at the end?).

*At least to me, which also shows that there can be a large element of subjectivity involved.

**Oscar Wilde defended against accusations of plagiarism by pointing to the difference between adding and removing a petal when growing tulips: To repeat in a better manner what someone else has already done is not necessarily a fault.

Excursion on good fiction:
More generally, I am coming to the conclusion that fiction (and art, music, whatnot) either works or does not work—and if the end result works, an author (movie maker, whatnot) can get away with more-or-less anything along the road. This includes the hackneyed, poor prose, absurd scenes, artistic liberties with science, a disregard for convention and expectation, the tasteless, … (But the question of “because or despite?” can be valuable, especially with an eye at the different reactions among different readers.) The proof of the pudding is in the eating—not in the recipe.

Written by michaeleriksson

November 17, 2018 at 2:53 am

A few thoughts on traditions and Christmas (and some personal memories)

with 5 comments

With Christmas upon us, I find myself thinking about traditions* again. This especially with regard to the Christmas traditions of my childhood, in light of this being the first Christmas after the death of my mother.

*Mostly in the limited sense of things that e.g. are done once a year on the same day or in a certain special manner, and as opposed to the other senses, say the ones in “literary tradition” and “traditional role”.

It has, admittedly, been quite a long while since I “came home for Christmas”, as she would have put it, and, frankly, the circumstances of my family had made Christmases at my mother’s hard for me to enjoy long before that. However, while the practical effect is not very large for me, there is still a psychological difference through the knowledge that some possibilities are permanently gone, that some aspects of those Christmases would be extremely hard to recreate—even aside from the obvious absence of my mother, herself. Take Christmas dinner: Even following the same recipes, different people can end up with different results, and chances are that even a deliberate attempt to recreate “her” version would be at best a poor* approximation—just like her version was an approximation of what her mother used to make (and my father’s draws strongly on his childhood Christmas dinners). There is simply yet another connection with those Christmases of old that has been cut. In fact, when I think back on the most memorable, most magical, most wonderful Christmases, there are two versions that pop into my head:

*Note that a poor approximation does not automatically imply a poor effort. The point is rather that there are certain tastes and smells that can be important to us for reasons like familiarity and associations with certain memories, and that there can come a point when they are no longer available. I need look no further than my father to find a better cook than my mother, be it at Christmas or on a weekday; however, his cooking is different, just like his signature is—and even if he deliberately tried to copy her signature, the differences would merely grow smaller.

The first, predating my parents’ divorce, with loving and (tautologically) still married parents, a tree with a certain set of decorations, in the apartment we used to live in, and a sister too young to be a nuisance or even to properly figure in my recollections. I remember particularly how I, possibly around four or five years of age, used to spend hours* sitting next to the tree, staring at and playing with the decorations, and listening to a certain record with Christmas songs**. There was one or several foldable “balls” that I used to fold and unfold until the parents complained, and that fascinated me to no end. I have no idea whether the record and decorations exist anymore, we moved from the apartment almost forty years ago, the parents are long divorced—and I am, obviously, a very different person from what I was back then. With my mother dead, Father is the only remaining connection—and my associations with him and Christmas have grown dominated by those Christmases I spent with him as a teenager. (Which in many ways were great, but could not possibly reach the magic and wonder Christmas holds for a small child.)

*Well, it might have been considerably less—I really had no sense of time back then.

**In a twist, my favorite was a Swedish semi-translation of “White Christmas” by the title “Jag drömmer om en jul hemma”—“I’m dreaming of a Christmas back home”.

The second, likely* post-divorce and living in Kopparberg, where my maternal grand-parents resided, featured a setting in the grand-parents’ house and the addition of said grand-parents and my uncle and his family to the dramatis personae. Well, the house has been torn down, most or all of the furniture and whatnots are gone, the grand-parents are both dead, and on the uncle’s side they started to celebrate separately relatively soon (and I was obviously never as close with them as with my parents or grand-parents). Again, I am a very different person, and with Mother dead, there is virtually no connection left.

*With the long time gone by and my young age, I cannot rule out that some pre-divorce Christmas also fell into this category.

However, memory lane is just the preparatory road, not the destination, today. At the core of this post are two somewhat overlapping aspects of most traditions that I find interesting:

  1. What we consider traditional is to a very large part based on our own childhood experiences, both in terms of what is considered a tradition at all and what is considered the right tradition. Comparing e.g. my post-divorce Christmases with my father and with my mother, the two had different preferences in both food and decorations* that often (cf. above) went back to their own childhoods. Similarly, U.S. fiction sometimes shows a heated argument over “star on top” vs. “angel on top” (and similar conflicts)—let us guess which of the parties were used to what as children…

    *Although some of the difference in decorations might be based less in preference and more in inheritance of specific objects.

    As for the very young me, I often latched on to something that happened just once or twice as a tradition, being disappointed when the “tradition” did not continue, say when the paternal grand-mother came visiting and did not bring the expected little marzipan piglet.

    Indeed, many traditions simply “run in the family”, and are not the universal and universally central part of, e.g., a Christmas celebration that a child might think. I recall visiting another family at a young age, thanking them for dinner as my parents had taught me, and being highly confused when their daughter laughed at me. With hindsight, I cannot blame her: The phrase, “tack för maten och kamraten” (roughly “thanks for the food and the friend”), makes no sense, and is likely something my parents just found to be a funny rhyme—it is certainly not something I can recall having heard anywhere else.

    Even those traditions that go beyond the family can still be comparatively limited, e.g. to a geographical area. Christmas itself has no global standard (even apart from the differentiation into the “Christ is born” and “time for presents and Christmas trees/decorations/food” celebrations). There are, for instance, weird, barbaric countries where they celebrate on the 25th and eat Christmas turkey instead of doing the civilized thing and celebrating on the 24th with Christmas ham. The “Modern Family” episode dealing with the first joint U.S.–Colombian Christmas gives several interesting examples, and demonstrates well how one set of traditions can be weird-bordering-on-freakish to followers of another set of traditions.

  2. Traditions, even those that are nationwide, can be comparatively short-lived. Christmas, again, is a great source of examples, with even e.g. the Christmas trees and Santa Claus being comparatively modern introductions, especially in countries that they have spread to secondarily. One of the most important Swedish traditions, for instance, is Disney’s “From All of Us to All of You”*—first airing in 1960 and becoming a virtually instant tradition, often topping the list of most watched programs of the year.

    *While this might seem extremely surprising, it can pay to bear in mind that Swedish children were starved for animation for most of the remaining year, making the yearly showing the more special. Also note the slow development of Swedish TV, with the original broadcast taking place in a one-channel system, and a mere two-channel system remaining in place until well into the 1980s—implying that the proportion of children (and adults) watching was inevitably large. That a TV broadcast of a movie or similar becomes a tradition is, obviously, not without precedent, even if rarely to that degree, with e.g. “It’s a Wonderful Life” and “Miracle on 34th Street” being prominent U.S. examples; and e.g. “Dinner for One” being a New Year’s example in several European countries.

    The entire concept of the U.S.-style Halloween is another interesting example, even when looking just at the U.S. and children (related historical traditions notwithstanding), but the more so when we look at adult dress-ups or the expansion to other countries, including going from zero to something semi-big in Germany within, possibly, the last ten to fifteen years. Fortunately, we are not yet at the point where we have to worry about children knocking on doors and demanding candy, but this might just be a question of time.

    Many traditions, in a somewhat wider sense, are even bound to the relatively short eras of e.g. a certain technology or other external circumstance. Consider, again, TV*: It only became a non-niche phenomenon in the 1950s (possibly even 1960s in Sweden); it was the world’s most dominant medium and one of the most important technologies by the 1980s, at the latest; and by 2017 its demise within possibly as little as a decade seems likely, with the Internet already having surpassed it for large parts of the population. By implication, most traditions that somehow involve a TV can safely be assumed to measure their lives in no more than decades. (Often far less, since many will fall into the “runs in the family” category.) If I ever have children and grand-children (living in Sweden), will they watch “From All of Us to All of You”, punctually at 3 P.M. on December 24th? The children might; but the grand-children almost certainly will not—there is unlikely to even be a broadcast in the current sense. (And even if one exists, the competition from other entertainment might be too large.) Looking in the other direction, my parents might have, but my grand-parents (as children) certainly did not—even TV, itself, was no more than a foreign experiment (and the program did not exist).

    *It is a little depressing, how many traditions in my family have revolved around food and TV—and I doubt that we were exceptional.

    Similarly, how is a traditional cup of coffee made? Well, for most of my life, in both Germany and Sweden, my answer would have been to put a filter in the machine, coffee in the filter, water in the tank, and then press the power button—for a drip brew. However, the predominance of this mode of preparation (even in its areas of popularity) has been short, possibly starting in the 1970s and already being overtaken by various other (often proprietary) technologies like the Nespresso or the Dolce Gusto. Its rule as the dominant method might have lasted less than 30, certainly less than 40, years. Before that, other technologies were more popular, and even outright boiling of coffee in a stove pot might have been the standard within living memory*. Possibly, the next generation will see “my” traditional cup of coffee as an exotic oddity; while the preceding generations might have seen it as a new-fangled is-convenient-but-not-REAL-coffee.

    *My maternal grand-mother (and several other family members) was heavily involved with the Salvation Army. For the larger quantities of coffee needed for their gatherings, she boiled coffee as late as, possibly, the 1990s. While I do not really remember the taste in detail, there was certainly nothing wrong with it—and it certainly beats the Senseo I experimented with some ten years ago.

All of this runs contrary to normal connotations of a tradition—something very lengthy and, preferably, widely practiced. Such traditions certainly exist; going to church on Sunday being a prime example, stretching over hundreds of years and, until the last few decades, encompassing most of the population of dozens of countries. However, when we normally speak of traditions, it really does tend to be something more short-lived and more localized. I have e.g. heard adults speak of the “tradition” of dining at a certain restaurant when visiting a certain city—after just several visits… (It could, obviously, be argued that this is just sloppy use of language; however, even if I agreed, it would not change the underlying points.)

Excursion on other areas and nationalism:
Of course, these phenomena are not limited to traditions, but can also include e.g. national or other group characteristics. A common fear among Swedish nationalists (with similarities in other countries) concerns the disappearance of the Swedish “identity” (or similar)—but what is this identity? More to the point, is the identity that I might perceive in 2017 the same that one of my parents or grand-parents might have perceived in 1967? Great-grand-parents in 1917? There have been a lot of changes since then, not just in traditions, but also in society, education, values, wealth, work environments, spare time activities (not to mention amount of spare time…), etc., and, to me, it borders on the inconceivable that the image of “identity” has remained the same when we jump 50 or 100 years*. Or look, by analogy, at the descriptions of the U.S. “generations”: While these are, obviously, generalizations and over-simplifications, it is clear that even the passing of a few decades can lead to at least a severely modified “identity”.

*Looking at reasonably modern times. In older times, with slower changes, this might have been different. (I use “might”, because a lot can happen in such a time frame; and, at least in historical times, there was always something going on over such time intervals, be it war, plague, religious schisms, …, that potentially could have led to similar variations.)

I strongly suspect that what some nationalists fear is actually losing the familiar and/or what matches a childhood impression: When I think back on Sweden, I often have an idealized image dominated by youthful memories, and this is usually followed by a wish to preserve something like that for eternity, the feeling that this is how the world should be, and this is what everyone should be allowed to experience. While I am rational enough to understand both that this idealized image never matched reality, even back then, and that there are many other idealized images that would be equally worthy or unworthy, I can still very well understand those who draw the wrong conclusions and would make the preservation too high a priority.

Written by michaeleriksson

December 24, 2017 at 7:37 pm

A few thoughts around childhood recollections

with 3 comments

Through a somewhat random chain of association, I find myself thinking about one of my childhood’s favorite objects: Skåpsängen*.

*I am not aware of an English translation. Literally, “säng” is “bed”, “-en” is “the”, and “skåp” can, depending on context, translate as e.g. “cupboard” or “closet”. Below, I will speak of “box” for the “skåp” part, because this matches the internal structure best, even if it was larger and more finely worked than what I picture when I hear “wooden box”. I keep the word with a capital “S” because it always came across as a proper name to me—not a mere noun or a mere description. (This was often the case with me. Cf. “mormorsfranska” below.)

This was a foldable bed-in-a-box, that I used to sleep in when visiting my maternal grand-parents as a young child. As a result of the construction, I lay down with my head well within the box, which was something of a world of its own. Not only did the walls and roof shelter* me, but I often found myself just staring at the walls for minutes at a time, following the grain of the wood, especially the brown patterns formed by wood knots, or admiring one or two little pencil drawings (possibly drawn by my mother in her youth)—almost as good as TV. My positive associations are strengthened by how grand-parents spoil their grand-children and the “exotic” overall environment, with its new smells, different and older furniture**, different food***, toys that once belonged to my mother and her brother …—and, obviously, the grand-parents themselves.

*In my subjective impression. There was, of course, no actual danger or discomfort to shelter against.

**Including some actual antiques that had been handed down from an even older generation than my grand-parents’.

***Including what I thought was named “mormorsfranska”, but was actually just a descriptive “mormors franska”—“[my specific] grand-mother’s [style of] bread rolls”, often given to me while tucked into the bed.

While a trip down memory lane is all fine and dandy*, it is not something that I often write about. However, there are a few thought-worthy things here, and my mind kept wandering back to other childhood memories and potential lessons, a few of which I will discuss below.

*Or not: By now, I am actually feeling quite sad, seeing that the grand-parents (and mother) are all dead, the house was torn down decades ago, Skåpsängen probably does not exist anymore, most of the other things likely have gone the same way, the innocence of childhood has long passed, … One of the risks with looking back at happy times gone by, instead of forward to happy times to come or at the happy times of the now, is that the element of loss can ruin the experience—and the happier the memory, the greater the loss.

The most notable is how my child’s mind could be so fascinated with the walls of the box, where I today might have had a look around and then immersed myself in a book or my computer. This is largely because a child is easier to amuse and stimulate than an adult, who (often) needs something more challenging, and whose curiosity has moved on to other areas. Not only are such contrasts between the child and the adult important in order to understand children and (e.g. in my case) to develop a greater tolerance for them, but when similar variations are present in the adult population they can become a tool to understand humanity as a whole better. Consider e.g. how a difference in intelligence levels can cause one person to view a certain activity as too easy to bother with, while another might be challenged and stimulated, and the activity that challenges and stimulates the former might simply be too hard for the latter; or how some might be more interested in stimulation through thinking and some more* through perception, and/or how the two might have different preferences for channels of perception.

*At least here the “more” is of importance: There seem to be quite a few people who really do not like to think, but few or none who are entirely cold towards sensory perceptions. More often, it is a question of prioritizing them, or some forms of them, lower than other things.

However, another partial explanation is likely the modern tendencies to crave more active forms of stimulation and not to appreciate the little things in life: There can be a benefit found in, for a few minutes a day, just relaxing, cutting out stronger sources of stimulation (e.g. blogging or TV), and just focusing on and enjoying something small in the moment. (While I have resolved to deliberately and regularly do so on a few occasions, the resolution has usually been forgotten within a week. It still happens, obviously, but more accidentally and likely not as often as it should.)

Yet another contributing factor, especially for an adult, is today’s intense competition for our attention: There is so much entertainment, so much to learn, so much to see and do, that a dozen life-times would be too little. Back then, for a child, shortly before lights out*? The competition might have been re-reading a comic or just letting my thoughts wander while staring out into the room…

*Possibly more metaphorically than literally, since I was afraid of the dark and usually insisted that the lights be left on—which could, obviously, have prolonged the time available to look at the box…

An event that took place in Skåpsängen during my very early childhood is another good illustration of the difference between more childish and more adult reactions, and, among adults, between more emotional and more rational ones: The favorite object of my childhood was a toy penguin. At some point after dark, one of its button eyes came off. I raised hell, annoyed my grand-mother (who, understandably, did not see this as a big deal) severely, and ended up being ungrateful when she sewed another button on, without locating the original. (My memory of the exact details is a little vague, but I strongly suspect that if I had seen the “injury” as less urgent and waited until the following morning, the original button would have been used.) Apart from the repeated implications for understanding children and, possibly, humans in general, there are at least two lessons: Firstly, that someone who is very upset and/or makes a lot of noise does not necessarily have a legitimate complaint, or a complaint more worthy than that of more reasonable protesters. Secondly, that we should not expect gratitude from these people if we try to satisfy them…

Importantly, however, I did not complain loudly and stubbornly because of any calculation*—I did it because I was very genuinely upset: I was unable to comprehend that this truly was no big deal. Even if we allow that a child can have a very strong emotional connection to a toy penguin**, this was not a damage that was noteworthy, debilitating, or hard to fix—a few minutes with needle, thread, and (preferably the original…) button, and everything would be fine. For all I know, exactly that could have happened to the other eye at some point when I was asleep and unaware of the events, having no way to tell after the fact. This type of inability to make correct assessments is regrettably very common among adults too, if not in such extremely obvious cases.

*In contrast, I suspect that e.g. a large part of the PC crowd is driven by calculation when it comes to their style of protest. I use similar tactics, on occasion, when dealing with e.g. spamming companies-where-I-placed-a-single-order-and-never-consented-to-any-advertising: Reasoning very obviously does not convince them that they are doing something grossly unethical, so let us see whether they pay attention when a customer leaves in (apparent) anger. (Too early to tell, but I am not optimistic.)

**Which we certainly should: Even now, I find myself having a surprisingly strong reaction when thinking back, stronger than e.g. when thinking of the real-life people that I later went to school with… Similarly, one of the most enduringly popular songs in Sweden, since before my own birth, is “Teddybjörnen Fredriksson”, dealing with the nostalgic feelings of a grown man towards his childhood teddy bear (named Fredriksson). I suspect that it is better known and more beloved among Swedes than the top hits of ABBA and Roxette.

Children do provide many, with hindsight, ridiculous examples. The proudest moment of my life came when I, about four years old, gave my grand-father a tip on how to repair a broken (probably) 16mm film—and he, an actual adult!, followed my tip. Did I save the day, like I thought? No: As I realized later in life, he would have done the exact same thing anyway. (As implied e.g. by the fact that he already had the right equipment for the repair.) Similarly, the first, and possibly only, time I played croquet, at about the same age, I was very proud of having beaten my grown-up uncle. (He claimed that I did, and who was I to disagree, not even understanding the rules…) Can you say “Dunning–Kruger”?

The pride aspect is yet another case where children could differ from mature adults: I am not necessarily free from pride, but this particular type of pride (as opposed to e.g. contentment) over a specific event or a specific accomplishment is comparatively rare, and it seems pointless and vain to me for anything but the greatest accomplishments (major scientific break-throughs, Olympic medals, …). Then again, I need not be representative of adults. For instance, while I keep my college diplomas somewhere in a stack of paper, many others, including my mother, have theirs framed and hung on the wall.

Written by michaeleriksson

November 22, 2017 at 10:03 pm

Disney’s princesses and the wishes of women

with 9 comments

Time and time again, I stumble upon blogs, newspaper articles, and similar, with a thesis along the lines of “Disney and its unrealistic princesses teach little girls what they should like.” (with many variations on who is the culprit, what age the women are, and other details).

Every time I read something like that, I have a near identical comment in mind:

You assume that the girls/women are altered by Disney/whomever. Stop to consider the far more likely explanation that money-makers simply happen to know what women like—and have done the complete opposite: Altered the message to fit the women.

In order to save some time in the future, I have written this post instead, for easy linking. (If you have found this page over such a link, please bear in mind that a one-size-fits-all is rarely a perfect fit: Apply the principles, not the details, to the post in question.)

Disclaimer: I do not claim that this is necessarily a one-way street, but fully acknowledge, e.g., that Disney can affect the girls. My point is rather that the opposite, by Occam’s Razor, should be the default assumption, that the burden of proof is on those blaming Disney, and that, even to the degree that a two-way street is present, the effect of the girls on Disney is likely to be the considerably stronger.

Written by michaeleriksson

July 16, 2010 at 1:03 am

Humor sites

with 2 comments

I try to combine education with pleasure, and have found that humor, in various forms, is not only useful as entertainment, to induce laughs, or to make people happy, but can also give insights into human thinking and behavior, hold up a mirror of self-criticism to the audience, remind us how many deeply stupid people are out there, or otherwise serve a practical purpose.

Correspondingly, I have published a list of humor sites with educational benefits on my website. Your visit is welcome—as are your suggestions for new additions.

Written by michaeleriksson

March 10, 2010 at 5:37 am

Posted in Uncategorized
