Michael Eriksson's Blog

A Swede in Germany


The effects of our base-line on perception / Follow-up: A few thoughts on traditions and Christmas


Traditions [1] were the topic of a Christmas text last year. In the almost exactly one year since then, I have again and again noted various overlaps with the sub-topic of our perception of normality. More specifically, there seems to be a point of “normality” where something becomes so familiar that we no longer notice or reflect upon it, or where we experience it very differently from less familiar phenomena and/or from how others experience the same phenomenon.

A few examples:

  1. As children, my sister and I often stayed for prolonged periods at our maternal grand-mother’s. She declined many wishes for pasta and rice with the argument that “we already had that once this week”—but had no qualms about using boiled* potatoes as the “staple” five to seven times a week. In all likelihood, she genuinely** did not perceive the paradox in this argumentation, being so used to potatoes that they were a standard part of any meal***—just like the glass of milk.

    *Mashed or fried potatoes happened on occasion; I am not certain whether she ever served French fries.

    **To this it should be noted that she was not very bright—others might have been more insightful even in the face of ingrained eating habits. Unfortunately, back then, I took it to be just another case of dishonest adult “argumentation”.

    ***She was born in 1924 and grew up with a very different diet from even mine (born 1975), let alone that of some born today. Indeed, left to her own devices, deviations from boiled potatoes were more likely to have been e.g. kåldolmar (cabbage rolls) or rotmos (a rutabaga mash with some admixture of potatoes(!) and carrots) than rice or pasta.

    Consider similarly my own caffeine habits*: I drink large amounts of black coffee—no sugar, no milk, no cream, … This despite originally not liking the taste. When it comes to tea, I have tried repeatedly to use it as a substitute, but within a week or two of a cup a day, the experiment always ends, because I do not like the taste.** I have used e.g. Nespresso and Dolce Gusto machines, but eventually grew tired of the taste and returned to drip-brews. Similarly, when ordering coffee in restaurants, I used to take the opportunity to have an espresso or a cappuccino—today, I almost invariably order a “regular” coffee. What is the difference, especially since I did not originally enjoy coffee? Simply this: I have drunk so much of it that it has become a taste norm. Tea does not have that benefit, and other variations of coffee are implicitly measured as deviations from that norm. The latter might even taste better in the short term, but then I simply “grow tired” of the taste.

    *Also see parts of [1] and of a text on prices.

    **In fairness to tea: I have so far always used tea bags—some claim that they are a poor substitute for tea leaves.

    This item has some overlap with (but is not identical to) the concept of “an acquired taste”.

  2. Why does boy-meets-girl feel less hackneyed than childhood-friends-fall-in-love? (Cf. an excursion in [2].) Well, the former is so common that it does not register in the same way as the latter—despite the paradox. Or take teenage-girl-and-much-much-older-vampire-fall-in-love: Only a very small minority of all works of fiction has this theme, and it would likely amount to a minority even of the vampire genre. Still, it feels so hackneyed that my reaction typically is “not this shit AGAIN—I will watch something else”. A higher degree of rarity can even increase the perceived hackneyedness, because the concept registers more strongly.* Beyond a certain rarity limit, the recognition factor might be so large that the automatic reaction is not “hackneyed” but “plagiarized”…

    *However, another partial explanation can be that a theme has still not been explored enough, leaving works using a certain concept too similar. For instance, the overall vampire genre is much more diverse today than in the heyday of Christopher Lee, because so many new variations of the theme have been tried over time—“vampire movie” no longer automatically implies scary castles, big capes, the surreptitious biting of sleeping maidens, or similar.

  3. Virtually every generation complains about the music of the following generations. To some degree, this can be due to an actual fall in quality (e.g. through increased commercialization or a shift of focus from music-on-the-radio to exotic-dancing-on-TV) or a greater filtering of old music (where only the great hits have survived); however, a major part is the base-line that we are used to (likely coupled with nostalgia). Notably, the hit music of a certain period appears to fall mostly into a few fairly specific genres, with a great internal similarity in “sound”. Those who grow up* with a certain sound will tend to see it as a norm, be more likely to be estranged by newer genres, and be more able to differentiate within and appreciate the old genres. (Hence complaints like “it all sounds the same”.)

    *In my impression, most people listen to more music, and more intensely, in their youth than at higher ages, and they might be more malleable to boot (be it for biological reasons or because the prior exposure has been lower). However, I suspect that the amount of exposure is more important than age.

    A similar effect is almost certainly present between contemporaneous genres that differ considerably.

  4. As a small child, I somehow got into a discussion with my parents as to why the clock on the kitchen wall was not audibly ticking. They claimed that it was, but I could not hear anything. On their insistence, I spent a short period listening intently—and there it was! I was simply so used to the sound that it had not registered with me, until I deliberately tried to hear it…

    In an interesting contrast, I often found the antique wall-clocks at both my father’s and my maternal grand-mother’s so annoying that I used to stop them—in turn, slightly annoying my respective hosts. This might at least partially have been due to my base-line being “tickless”; however, they were also much louder than the (modern) kitchen-clock, and might also have had a more irregular or prolonged sound. (The antiques used an entirely mechanical, crude-by-modern-standards clockwork with pendulums and whatnots; the kitchen-clock had a modern clockwork, ran on a battery, and likely used a balance wheel.)

    As an aside, this points to the risk that isolating oneself from disturbances can lead to an increased sensitivity to the disturbances that do occur, while increased exposure can bring greater tolerance—a dilemma that I have long struggled with as someone sensitive to noise. An extreme example is present in the movie “The Accountant”, in which the autistic protagonist deliberately exposes himself to very loud noises, strobing lights, and physical pain for short intervals, apparently trying to increase his tolerance. (I caution that said movie did not strike me as overly realistic.)

  5. When I lived in Sweden, German seemed a fairly ugly language with too strong (in some sense) pronunciations of many sounds (including “r” and “s”). After twenty years in Germany, it sounds just fine, while I am often struck by Swedish as bland and lacking in character. Back then, I heard how German differed from Swedish; today, I hear how Swedish differs from German.

    English is somewhere in between and has not struck me in the same way. However, it is notable that TV and movies have left me with a U.S. base-line, in that I mostly (mis-)register U.S. English as “without an accent”,* while e.g. any version of British English comes across as British**. This is the odder, since I actually consider (some versions of) British English more pleasant to the ear and have a tendency to drift in the “English English” direction, or even towards amateurish pseudo-RP, on those rare occasions that I actually speak English.

    *But many versions of U.S. English stand out as non-standard, including the heavy Southern ones.

    **Often with a more specific sub-classification, e.g. “standard”, Cockney, Irish, Scottish; in some cases, as something that I recognize as a specific accent but am unable to place geographically. (The same can happen with U.S. dialects, but is much rarer—possibly, because British English is more diverse.)

Outside of examples like the above, there are at least two areas that might be at least partially relevant and/or overlapping: Firstly, opinion corridors and similar phenomena. Secondly, various physical phenomena, e.g. drug resistance, specificity of training, or how the human body reacts to cold: Apparently, Eskimos “in the wild” have the ability to work without gloves in freezing temperatures for prolonged times without ill-effects, pain, whatnot—but a few years in “civilization” make them lose this ability. Allegedly, Tierra del Fuego natives have (had) the ability to sleep almost naked in the open air at low (but not freezing) temperatures, while the typical Westerner can feel cold at a little below room temperature without a duvet. I have myself witnessed two or three Westerners who walk around in t-shirt and shorts all year round (in Sweden and/or Germany—not Florida), at least one of whom made the papers for this habit—he claimed that the body adapts* if one can push through the early discomfort.

*The exact nature of these adaptations is beyond my current knowledge, but at least some of them likely relate to how fast the body switches from a low-insulation to a high-insulation state and how strong the insulation becomes. That this is trainable to some degree can easily be verified by taking only cold showers for a few weeks and noting how strongly the discomfort is reduced in that time frame. An increase in “brown fat” likely also plays a part.


Written by michaeleriksson

December 21, 2018 at 9:27 pm

Conflicting own beliefs and what to do about them


In the set of beliefs* held by anyone, there will be occasional real or imagined conflicts (consider also e.g. the concepts of “cognitive dissonance” and “doublethink”). People differ mainly in (a) the degree to which they are aware of these conflicts and (b) how they handle them. Unfortunately, most people are unaware of most or all conflicts that arise, make no attempts at detecting them, and are prone to just explain away the conflicts that are known—even descending to outright doublethink.** A particular issue with awareness is that a too faulty or incomplete understanding can leave such conflicts undetected.***

*I use “belief” as a catch-all that, depending on context, could include any or almost any belief, idea, opinion, whatnot that implies or would imply something about something else. This includes e.g. “cucumbers are green”, “cucumbers are blue”, “God does [not] exist”, and “I [do not] like chocolate”.

**This includes such absurdities as simultaneously professing to believe in Evolution and Gender-Feminism. Indeed, a great deal of my annoyance with politics/ideology (in general) and Feminism/Leftism/PC-ism (in particular) results from the adherents’ ever-recurring faults in similar directions.

***Consider again Evolution vs. Gender-Feminism: It is, for instance, highly unlikely that evolutionary processes would generate physical differences while keeping mental abilities identical—but exactly that is seen as a given by most Gender-Feminists (and a significant portion of the PC crowd, in general). Similarly, it is highly unlikely that the different roles of men and women in most societies over thousands of generations would have left no trace in the form of e.g. natural inclinations. A Creationist–Feminist match-up would be less prone to contradictions.

In many cases, these conflicts are sufficiently trivial that they may be neglected.* For instance, that someone has two favorite dishes, music bands, movie stars, …, rarely has major impact on important decisions.** When it comes to topics that can have a greater impact, especially on others, care should be taken, however. Consider e.g. questions like how to vote in an election, what recommendations to make to others, what agendas to push, …—here it is important to have a sufficiently sound view of the topic; and if beliefs conflict, the view is unlikely to be sufficiently sound.

*A resolution can still bring benefit, e.g. through better self-knowledge, and I would not advise against the attempt.

**However, the resolution is often fairly simple, e.g. that neither of the two is the favorite and that the word “favorite” is best avoided; or that an opinion has changed over time, while still being professed out of habit.

Giving blanket rules for detection is tricky, but actually reading up* on a topic, gaining one’s own understanding (as opposed to parroting someone else’s), and deliberately trying to see the “bigger picture” and making comparisons between different fields and ideas, can all be helpful. Above all, perhaps, it is helpful to actually think through the consequences and predictions that can be made based on various beliefs, and to look at how they stack up against both each other and observations of reality. In my personal experience, writing about a topic can be an immense help (and this is one of the reasons why I write): Writing tends to lead to deeper thought, a greater chance of recollection in other contexts, and a thought-process that continues intermittently long after a text has been completed.

*Note especially that information given in newspapers, in school, or by politicians tends to be too superficial or even outright faulty. Wikipedia was once a good source, but has deteriorated over the years (at least where many topics are concerned). The “talk” pages can often contain a sufficient multitude of viewpoints, however.

If a conflict has been detected, it should be investigated with a critical eye in order to find a resolution. Here there are at least* five somewhat overlapping alternatives to consider: (a) One or both beliefs are wrong and should be rejected or modified. (b) Both beliefs have at least some justification and they should be reconciled, possibly with modifications; e.g. because they cover different special cases. (c) The conflict is only apparent, e.g. through a failure to discriminate. (d) One or both beliefs are not truly held and the non-belief should be brought to consciousness; e.g. because profession is made more out of habit than conviction. (e) The support of both** beliefs is approximate or tentative (awaiting further evidence), and (at a minimum) this condition should be kept in mind, with revisions according to the preceding items often being necessary.*** Note that the above need not result in rejection of one belief—it can equally be a matter of modification or refinement (and it can also happen to both beliefs). This is one reason why investigation is so beneficial—it helps to improve one’s own mind, world-view, whatnot.

*A deeper effort might reveal quite a few more alternatives. I write mostly off the top of my head at the moment.

**Here it has to be both: If one belief is taken as true and only one as approximate, then it would follow that the approximate one is outright faulty (at least as far as the points of conflict are concerned), which moves us to the “One” case of (a).

***For instance, if two physical theories are not perfectly compatible, the realization that physical theories are only approximations-for-the-now (eventually to be replaced by something better) gives room for an “approximate belief” in either or both theories. As long as work proceeds with an eye on the assumptions used, with the knowledge that the results might not be definite, and while being very careful in areas of known conflict or with poor experimental verification, this is not a major issue. Indeed, such “approximate belief” is par for the course in the sciences. In contrast, if someone were convinced that both were indisputably true, this would be highly problematic.

Again, giving blanket rules is tricky, especially with the very wide variety of fields/beliefs potentially involved and with the variety of the above cures. However, actually thinking and, should it be needed, gathering more information can be very productive. Having a good ability to discriminate is helpful in general; and with (b) and (c) it can be particularly beneficial to look at differences, e.g. if there is some aspect of a case where one belief is assumed to apply that is not present in a case where the other belief is assumed to apply. With (d), it is usually mostly a matter of introspection. (In addition, the advice for detecting conflicts applies to some parts here and vice versa. Often, the two will even be implicit, hard-to-separate, parts of a single process.)

For a specific, somewhat complex example, consider questions around what makes a good or poor book, movie, whatnot—especially, the property of being hackneyed: On the one hand, my discussions of various works have often contained a complaint that this-or-that is hackneyed. On the other, it is quite common for works that I enjoy and think highly of (at least on the entertainment level*) to contain elements of the hackneyed—or even be formulaic. Moreover, I rarely have the feeling that this enjoyment comes despite something being hackneyed—this weakness, in itself, does not appear to disturb me that strongly.

*Different works serve different purposes and should be measured with an eye on the purpose. When I watch a sit-com, depth of character is far less important than how often and how hard I laugh; the romance in an action movie is a mere bonus (or even a negative, if there is too much); vice versa, an action scene in a rom-com is mere bonus; plot rarely makes sense in non-fiction; etc. For more “serious” works, more serious criteria and higher literary standards apply.

Is my explicit complaint compatible with my implicit acceptance? To some degree, yes; to some degree, no.

On the “no” side: I suspect, after introspection, that I do or do not find a certain work enjoyable, thought-worthy, whatnot, based on criteria that are not explicitly known to me.* If I find enjoyment (etc.), I am less likely to look for faults; if I do not, I am more likely to look for faults—but there is no guarantee that my original impression was actually caused by the faults now found. Some will almost certainly have been involved; others need not have been; and there might have been other faults involved that I never grew explicitly aware of.

*There are many aspects of different works that can individually have a large impact, and the end-impression is some form of aggregation over these aspects. For instance, consider the impact of music on movies like “Star Wars” and “Vertigo” or on TV series like “Twin Peaks”—change the music, and the work is lessened. Notably, the viewer is rarely strongly aware of the impact of the music (even if it is hard to miss in the aforementioned cases).

On the “yes” side there are at least three things to consider: Firstly, a work can be hackneyed and have sufficient other strengths to outweigh this. Poor works are rarely poor due to one failure—they are poor because they fail on numerous criteria, e.g. (for a movie) being hackneyed and having a poor cast, wooden dialogue, unimpressive music, … Being hackneyed is, alone, not a knock-out criterion—being original is an opportunity to gain points that a hackneyed work simply has not taken. Secondly, different criteria can apply to different works,* and being hackneyed is not necessarily an obstacle for the one work, even though it is for another. Thirdly, if something is known to work well, it can be worth using even if it is hackneyed—“boy meets girl” has been done over and over and over again, but it still works. (See also an excursion below.)

*Partly, as in a previous footnote; partly, with an eye on the expected level of accomplishment. For instance, my very positive discussion of Black Beauty must be seen as referring to a children’s book—had I found the exact same contents in a work with the reputation and target group of e.g. James Joyce’s “Ulysses” (which I have yet to read), I would have been less enthusiastic.

All in all, I do not see a problem with this conflict in principle; however, I do suspect that I would benefit from (and be fairer in detail* by) looking closer at what actually created my impression and less closely on criteria like “original vs. hackneyed”. The latter might well amount to fault finding or rationalization. To boot, I should pay more attention to whether specifically something being hackneyed has a negative effect on me (beyond the mere failure to have a positive effect through originality).

*I doubt that my overall assessment would change very much; however, my understanding and explanation of why I disliked something would be closer to the truth. Of course, it might turn out that being hackneyed was a part of the explanation in a given case; however, then I can give that criticism with a better conscience…

Excursion on expectations:
In a somewhat similar situation, I have sometimes complained about a work having set a certain expectation and then changed course. That is an example of another issue, namely the need to discriminate*. There are setups and course changes that are good, in that they reduce the predictability, increase the excitement, whatnot. This includes well-made “plot twists”. There are, however, other types of expectations and course changes that are highly unfortunate—including those that make the reader (viewer, whatnot) set his mind on a certain genre or a certain general development. A course change here is likely to detract from the experience, because different genres are enjoyed in different manners, and because there is often an element of disappointment** involved. Depending on the change, there can also be a delay and reorientation needed that lessens concentration and enjoyment further. Another type of change that is (almost always) negative is the attempt to rejuvenate a TV series or franchise by sacrificing what once made the series worth watching, by “jumping the shark”, and similar.

*Yes, discrimination is also a sub-topic above; however, here we have a too blatant case to be truly overlapping: There is no need for me to re-investigate my own beliefs—only to clarify them towards others. (Except in as far as I might have suffered from a similar fault-finding attitude as discussed above, but that attitude is just an independent-of-the-topic aspect of an example.)

**Note that this holds true, even when the expected and the delivered are more-or-less equivalent in net value. (However, when there is a significant improvement, the situation might be different: I recall watching “Grease” for the first time, with only a very vague idea of the contents; seeing the first scene; and fearing that I was caught in the most sugary, teenage-girls-only, over-the-top romance known to man—the rest was a relief.)

Excursion on “boy meets girl”:
An additional, but off-topic, complication when considering the hackneyed is that there comes a point of repeated use when the hackneyed does not necessarily register as hackneyed and/or is so central to a genre that it is hard to avoid. Consider the typical “boy meets girl” theme. This, in itself, is so common and so basic to movie romance that it rarely registers as hackneyed. In contrast, the rarer “childhood friends fall in love” does*. With “boy meets girl”, the question is less whether the theme lacks originality and more whether the implementation is done with sufficient quality** and whether the details are also lacking in originality (is there, e.g., yet another desperate chase to and through an airport at the end?).

*At least to me, which also shows that there can be a large element of subjectiveness involved.

**Oscar Wilde defended against accusations of plagiarism by pointing to the difference between adding and removing a petal when growing tulips: To repeat in a better manner what someone else has already done, is not necessarily a fault.

Excursion on good fiction:
More generally, I am coming to the conclusion that fiction (and art, music, whatnot) either works or does not work—and if the end result works, an author (movie maker, whatnot) can get away with more-or-less anything along the road. This includes the hackneyed, poor prose, absurd scenes, artistic liberties with science, a disregard for convention and expectation, the tasteless, … (But the question of “because or despite?” can be valuable, especially with an eye at a different reactions among different readers.) The proof of the pudding is in the eating—not in the recipe.

Written by michaeleriksson

November 17, 2018 at 2:53 am

A few thoughts on traditions and Christmas (and some personal memories)


With Christmas upon us, I find myself thinking about traditions* again. This especially with regard to the Christmas traditions of my childhood, in light of this being the first Christmas after the death of my mother.

*Mostly in the limited sense of things that e.g. are done once a year on the same day or in a certain special manner, and as opposed to the other senses, say the ones in “literary tradition” and “traditional role”.

It has, admittedly, been quite a long while since I “came home for Christmas”, as she would have put it, and, frankly, the circumstances of my family had made Christmases at my mother’s hard for me to enjoy long before that. However, while the practical effect is not very large for me, there is still a psychological difference through the knowledge that some possibilities are permanently gone, that some aspects of those Christmases would be extremely hard to recreate—even aside from the obvious absence of my mother, herself. Take Christmas dinner: Even following the same recipes, different people can end up with different results, and chances are that even a deliberate attempt to recreate “her” version would be at best a poor* approximation—just like it was an approximation of what her mother used to make (and my father’s version draws strongly on his childhood Christmas dinners). There is simply yet another connection with those Christmases of old that has been cut. In fact, when I think back on the most memorable, most magical, most wonderful Christmases, there are two versions that pop into my head:

*Note that a poor approximation does not automatically imply a poor effort. The point is rather that there are certain tastes and smells that can be important to us for reasons like familiarity and associations with certain memories, and that there can come a point when they are no longer available. I need look no further than my father to find a better cook than my mother, be it at Christmas or on a weekday; however, his cooking is different, just like his signature is—and even if he deliberately tried to copy her signature, the differences would merely grow smaller.

The first, predating my parents’ divorce, with loving and (tautologically) still married parents, a tree with a certain set of decorations, in the apartment we used to live in, and a sister too young to be a nuisance or even to properly figure in my recollections. I remember particularly how I, possibly around four or five years of age, used to spend hours* sitting next to the tree, staring at and playing with the decorations, and listening to a certain record with Christmas songs**. There was one or several foldable “balls” that I used to fold and unfold until the parents complained, and that fascinated me to no end. I have no idea whether the record and decorations exist anymore, we moved from the apartment almost forty years ago, the parents are long divorced—and I am, obviously, a very different person from what I was back then. With my mother dead, Father is the only remaining connection—and my associations with him and Christmas have grown dominated by those Christmases I spent with him as a teenager. (Which in many ways were great, but could not possibly reach the magic and wonder Christmas holds for a small child.)

*Well, it might have been considerably less—I really had no sense of time back then.

**In a twist, my favorite was a Swedish semi-translation of “White Christmas” by the title “Jag drömmer om en jul hemma”—“I’m dreaming of a Christmas back home”.

The second, likely* post-divorce and living in Kopparberg, where my maternal grand-parents resided, featured a setting in the grand-parents’ house and the addition of said grand-parents and my uncle and his family to the dramatis personae. Well, the house is torn down, most or all of the furniture and whatnots are gone, the grand-parents are both dead, and on the uncle’s side they started to celebrate separately relatively soon (and I was obviously never as close with them as with my parents or grand-parents). Again, I am a very different person, and with Mother dead, there is virtually no connection left.

*With the long time gone by and my young age, I cannot rule out that some pre-divorce Christmas also fell into this category.

However, memory lane is just the preparatory road, not the destination, today. The core of this post consists of two, somewhat overlapping, aspects of most traditions that I find interesting:

  1. What we consider traditional is to a very large part based on our own childhood experiences, both in terms of what is considered a tradition at all and what is considered the right tradition. Comparing e.g. my Christmases with my father and mother post-divorce, they had different preferences in both food and decorations* that often (cf. above) went back to their own childhoods. Similarly, U.S. fiction sometimes shows a heated argument over “star on top” vs. “angel on top” (and similar conflicts)—let us guess which of the parties were used to what as children…

    *Although some of the difference in decorations might be based less in preference and more in inheritance of specific objects.

    As for the very young me, I often latched on to something that happened just once or twice as a tradition, being disappointed when the “tradition” did not continue, say when the paternal grand-mother came visiting and did not bring the expected little marzipan piglet.

    Indeed, many traditions simply “run in the family”, and are not the universal and universally central part of, e.g., a Christmas celebration that a child might think. I recall visiting another family at a young age, thanking for dinner like my parents had taught me, and being highly confused when their daughter laughed at me. With hindsight, I cannot blame her: The phrase, “tack för maten och kamraten” (roughly “thanks for the food and the friend”), makes no sense, and is likely something my parents just found to be a funny rhyme—it is certainly not something I can recall having heard anywhere else.

    Even those traditions that go beyond the family can still be comparatively limited, e.g. to a geographical area. Christmas itself has no global standard (even apart from the differentiation into the “Christ is born” and “time for presents and Christmas trees/decorations/food” celebrations). There are, for instance, weird, barbaric countries where they celebrate on the 25th and eat Christmas turkey instead of doing the civilized thing and celebrating on the 24th with Christmas ham. The “Modern Family” episode dealing with the first joint U.S.–Colombian Christmas gives several interesting examples, and demonstrates well how one set of traditions can be weird-bordering-on-freakish to followers of another set of traditions.

  2. Traditions, even those that are nationwide, can be comparatively short-lived. Christmas, again, is a great source of examples, with even e.g. Christmas trees and Santa Claus being comparatively modern introductions, especially in countries that they have spread to secondarily. One of the most important Swedish traditions, for instance, is Disney’s From All of Us to All of You*—first airing in 1960 and becoming a virtually instant tradition, often topping the list of the most watched programs of the year.

    *While this might seem extremely surprising, it can pay to bear in mind that Swedish children were starved for animation for most of the remaining year, making the yearly showing the more special. Also note the slow development of Swedish TV, with the original broadcast taking place in a one-channel system, and a mere two-channel system remaining in place until well into the 1980s—implying that the proportion of children (and adults) watching was inevitably large. That a TV broadcast of a movie or similar becomes a tradition is, obviously, not without precedent, even if rarely to that degree, with e.g. “It’s a Wonderful Life” and “Miracle on 34th Street” being prominent U.S. examples; and e.g. “Dinner for One” being a New Year’s example in several European countries.

    The entire concept of the U.S.-style Halloween is another interesting example, even when looking just at the U.S. and children (related historical traditions notwithstanding), but the more so when we look at adult dress-ups or the expansion to other countries, including going from zero to something semi-big in Germany within, possibly, the last ten to fifteen years. Fortunately, we are not yet at the point where we have to worry about children knocking on doors and demanding candy, but this might just be a question of time.

    Many traditions, in a somewhat wider sense, are even bound to the relatively short eras of e.g. a certain technology or other external circumstance. Consider, again, TV*: It only became a non-niche phenomenon in the 1950s (possibly even the 1960s in Sweden); it was the world’s most dominant medium and one of the most important technologies by the 1980s, at the latest; and by 2017 its demise within possibly as little as a decade seems likely, with the Internet already having surpassed it for large parts of the population. By implication, most traditions that somehow involve a TV can safely be assumed to measure their lives in no more than decades. (Often far less, since many will fall into the “runs in the family” category.) If I ever have children and grand-children (living in Sweden), will they watch “From All of Us to All of You”, punctually at 3 P.M. on December 24th? The children might; but the grand-children almost certainly will not—there is unlikely to even be a broadcast in the current sense. (And even if one exists, the competition from other entertainment might be too large.) Looking in the other direction, my parents might have, but my grand-parents (as children) certainly did not—even TV, itself, was no more than a foreign experiment (and the program did not exist).

    *It is a little depressing, how many traditions in my family have revolved around food and TV—and I doubt that we were exceptional.

    Similarly, how is a traditional cup of coffee made? Well, for most of my life, in both Germany and Sweden, my answer would have been to put a filter in the machine, coffee in the filter, water in the tank, and then press the power button—for a drip brew. However, the predominance of this mode of preparation (even in its areas of popularity) has been short, possibly starting in the 1970s and already being overtaken by various other (often proprietary) technologies like the Nespresso or the Dolce Gusto. Its dominance might have lasted less than 30, certainly less than 40, years. Before that, other technologies were more popular, and even the outright boiling of coffee in a stove pot might have been the standard within living memory*. Possibly, the next generation will see “my” traditional cup of coffee as an exotic oddity; while the preceding generations might have seen it as a new-fangled is-convenient-but-not-REAL-coffee.

    *My maternal grand-mother (and several other family members) was heavily involved with the Salvation Army. For the larger quantities of coffee needed for their gatherings, she boiled coffee as late as, possibly, the 1990s. While I do not really remember the taste in detail, there was certainly nothing wrong with it—and it certainly beats the Senseo I experimented with some ten years ago.

All of this runs contrary to the normal connotations of a tradition—something very lengthy and, preferably, widely practiced. Such traditions certainly exist; going to church on Sunday being a prime example, stretching over hundreds of years and, until the last few decades, encompassing most of the population of dozens of countries. However, when we normally speak of traditions, it really does tend to be something more short-lived and more localized. I have e.g. heard adults speak of the “tradition” of dining at a certain restaurant when visiting a certain city—after just several visits… (It could, obviously, be argued that this is just sloppy use of language; however, even if I agreed, it would not change the underlying points.)

Excursion on other areas and nationalism:
Of course, these phenomena are not limited to traditions, but can also include e.g. national or other group characteristics. A common fear among Swedish nationalists (with similarities in other countries) concerns the disappearance of the Swedish “identity” (or similar)—but what is this identity? More to the point, is the identity that I might perceive in 2017 the same as the one my parents or grand-parents might have perceived in 1967? Great-grand-parents in 1917? There have been a lot of changes since then, not just in traditions, but also in society, education, values, wealth, work environments, spare time activities (not to mention the amount of spare time…), etc., and, to me, it borders on the inconceivable that the image of “identity” has remained the same when we jump 50 or 100 years*. Or look, by analogy, at the descriptions of the U.S. “generations”: While these are, obviously, generalizations and over-simplifications, it is clear that even the passing of a few decades can lead to at least a severely modified “identity”.

*Looking at reasonably modern times. In older times, with slower changes, this might have been different. (I use “might”, because a lot can happen in such a time frame; and, at least in historical times, there was always something going on over such time intervals, be it war, plague, religious schisms, …, that potentially could have led to similar variations.)

I strongly suspect that what some nationalists fear is actually losing the familiar and/or what matches a childhood impression: When I think back on Sweden, I often have an idealized image dominated by youthful memories, and this is usually followed by a wish to preserve something like that for eternity, the feeling that this is how the world should be, and this is what everyone should be allowed to experience. While I am rational enough to understand both that this idealized image never matched reality, even back then, and that there are many other idealized images that would be equally worthy or unworthy, I can still very well understand those who draw the wrong conclusions and would make the preservation too high a priority.

Written by michaeleriksson

December 24, 2017 at 7:37 pm

A few thoughts around childhood recollections


Through a somewhat random chain of association, I find myself thinking about one of my childhood’s favorite objects: Skåpsängen*.

*I am not aware of an English translation. Literally, “säng” is “bed”, “-en” is “the”, and “skåp” can, depending on context, translate as e.g. “cupboard” or “closet”. Below, I will speak of a “box” for the “skåp” part, because this matches the internal structure best, even if it was larger and more finely worked than what I picture when I hear “wooden box”. I keep the word with a capital “S” because it always came across as a proper name to me—not a mere noun or a mere description. (This was often the case with me. Cf. “mormorsfranska” below.)

This was a foldable bed-in-a-box, which I used to sleep in when visiting my maternal grand-parents as a young child. As a result of the construction, I lay with my head well within the box, which was something of a world of its own. Not only did the walls and roof shelter* me, but I often found myself just staring at the walls for minutes at a time, following the grain of the wood, especially the brown patterns formed by wood knots, or admiring one or two little pencil drawings (possibly drawn by my mother in her youth)—almost as good as TV. My positive associations are strengthened by how grand-parents spoil their grand-children and by the “exotic” overall environment, with its new smells, different and older furniture**, different food***, toys that once belonged to my mother and her brother …—and, obviously, the grand-parents themselves.

*In my subjective impression. There was, of course, no actual danger or discomfort to shelter against.

**Including some actual antiques that had been handed down from an even older generation than my grand-parents’.

***Including what I thought was named “mormorsfranska”, but was actually just a descriptive “mormors franska”—“[my specific] grand-mother’s [style of] bread rolls”, often given to me while tucked into the bed.

While a trip down memory lane is all fine and dandy*, it is not something that I often write about. However, there are a few thought-worthy things and my mind kept wandering back to other childhood memories and potential lessons, a few of which I will discuss below.

*Or not: By now, I am actually feeling quite sad, seeing that the grand-parents (and mother) are all dead, the house was torn down decades ago, Skåpsängen probably does not exist anymore, most of the other things likely have gone the same way, the innocence of childhood has long passed, … One of the risks with looking back at happy times gone by, instead of forward to happy times to come or at the happy times of the now, is that the element of loss can ruin the experience—and the happier the memory, the greater the loss.

The most notable is how my child’s mind could be so fascinated with the walls of the box, where I today might have had a look around and then immersed myself in a book or my computer. This is largely because a child is easier to amuse and stimulate than an adult, who (often) needs something more challenging, and whose curiosity has moved on to other areas. Not only are such contrasts between the child and the adult important for understanding children and (e.g. in my case) developing a greater tolerance for them, but when similar variations are present in the adult population, they can become a tool for understanding humanity as a whole better. Consider e.g. how a difference in intelligence levels can cause one person to view a certain activity as too easy to bother with, while another might be challenged and stimulated, and the activity that challenges and stimulates the former might simply be too hard for the latter; or how some might be more interested in stimulation through thinking and some more* through perception, and/or the two having different preferences for channels of perception.

*At least here, the “more” is of importance: There seem to be quite a few people who really do not like to think, but few or none who are entirely cold towards sensory perceptions. More often, it is a question of prioritizing them, or some forms of them, lower than other things.

However, another partial explanation is likely the modern tendency to crave more active forms of stimulation and not to appreciate the little things in life: There can be a benefit found in, for a few minutes a day, just relaxing, cutting out stronger sources of stimulation (e.g. blogging or TV), and just focusing on and enjoying something small in the moment. (While I have resolved to deliberately and regularly do so on a few occasions, the resolution has usually been forgotten within a week. It still happens, obviously, but more accidentally and likely not as often as it should.)

Yet another contributing factor, especially for an adult, is today’s intense competition for our attention: There is so much entertainment, so much to learn, so much to see and do, that a dozen life-times would be too little. Back then, for a child, shortly before lights out*? The competition might have been re-reading a comic or just letting my thoughts wander while staring out into the room…

*Possibly more metaphorically than literally, since I was afraid of the dark and usually insisted that the lights be left on—which could, obviously, have prolonged the time available to look at the box…

An event that took place in Skåpsängen during my very early childhood is another good illustration of the difference between more childish and more adult reactions—respectively, among adults, more emotional and more rational ones: The most favorite object of my childhood was a toy penguin. At some point after dark, one of its button eyes came off. I raised hell, annoyed my grand-mother (who, understandably, did not see this as a big deal) severely, and ended up being ungrateful when she sewed another button on, without locating the original. (My memory of the exact details is a little vague, but I strongly suspect that if I had seen the “injury” as less urgent and waited until the following morning, the original button would have been used.) Apart from the repeated implications for understanding children and, possibly, humans in general, there are at least two lessons: Firstly, that someone who is very upset and/or makes a lot of noise does not necessarily have a legitimate complaint, or a complaint more worthy than that of more reasonable protesters. Secondly, that we should not expect gratitude from these people if we try to satisfy them…

Importantly, however, I did not complain loudly and stubbornly because of any calculation*—I did it because I was very genuinely upset: I was unable to comprehend that this truly was no big deal. Even if we allow that a child can have a very strong emotional connection to a toy penguin**, this was not a damage that was noteworthy, debilitating, or hard to fix—a few minutes with needle, thread, and (preferably the original…) button, and everything would be fine. For all I know, exactly that could have happened to the other eye at some point when I was asleep and unaware of the events, having no way to tell after the fact. This type of inability to make correct assessments is regrettably very common among adults too, if not in such extremely obvious cases.

*In contrast, I suspect that e.g. a large part of the PC crowd is driven by calculation when it comes to their style of protest. I use similar tactics, on occasion, when dealing with e.g. spamming companies-where-I-placed-a-single-order-and-never-consented-to-any-advertising: Reasoning very obviously does not convince them that they are doing something grossly unethical, so let us see whether they pay attention when a customer leaves in (apparent) anger. (Too early to tell, but I am not optimistic.)

**Which we certainly should: Even now, I find myself having a surprisingly strong reaction when thinking back, stronger than e.g. when thinking of the real-life people that I later went to school with… Similarly, one of the most enduringly popular songs in Sweden, since before my own birth, is “Teddybjörnen Fredriksson”, dealing with the nostalgic feelings of a grown man towards his childhood teddy bear (named Fredriksson). I suspect that it is better known and more beloved among Swedes than the top hits of ABBA and Roxette.

Children do provide many, with hindsight, ridiculous examples. The proudest moment of my life came when I, about four years old, gave my grand-father a tip on how to repair a broken (probably) 16mm film—and he, an actual adult!, followed my tip. Did I save the day, like I thought? No: As I realized later in life, he would have done the exact same thing anyway. (As implied e.g. by the fact that he already had the right equipment for the repair.) Similarly, the first, and possibly only, time I played croquet, at about the same age, I was very proud at having beaten my grown-up uncle. (He claimed that I did, and who was I to disagree, not even understanding the rules…) Can you say “Dunning–Kruger”?

The pride aspect is yet another case where children could differ from mature adults: I am not necessarily free from pride, but this particular type of pride (as opposed to e.g. contentment) over a specific event or a specific accomplishment is comparatively rare, and it seems pointless and vain to me for anything but the greatest accomplishments (major scientific break-throughs, Olympic medals, …). Then again, I need not be representative of adults. For instance, while I keep my college diplomas somewhere in a stack of paper, many others, including my mother, have theirs framed and hung on the wall.

Written by michaeleriksson

November 22, 2017 at 10:03 pm

Disney’s princesses and the wishes of women


Time and time and again, I stumble upon blogs, newspaper articles, and similar, with a thesis along the lines of “Disney and its unrealistic princesses teach little girls what they should like.” (with many variations on who is the culprit, what age the women are, and other details).

Every time I read something like that, I have a near identical comment in mind:

You assume that the girls/women are altered by Disney/whomever. Stop to consider the far more likely explanation that money-makers simply happen to know what women like—and have done the complete opposite: Altered the message to fit the women.

In order to save some time in the future, I have written this post instead, for easy linking. (If you have found this page over such a link, please bear in mind that a one-size-fits-all is rarely a perfect fit: Apply the principles, not the details, to the post in question.)

Disclaimer: I do not claim that this is necessarily a one-way street, but fully acknowledge, e.g., that Disney can affect the girls. My point is rather that the opposite, by Occam’s Razor, should be the default assumption, that the burden of proof is on those blaming Disney, and that, even to the degree that a two-way street is present, the effect of the girls on Disney is likely to be the considerably stronger.

Written by michaeleriksson

July 16, 2010 at 1:03 am

Humor sites


I try to combine education with pleasure; and have found that humor, in various forms, is not only useful as entertainment, to induce laughs, or to make people happy, but can also give insights into human thinking and behaviour, hold up a mirror of self-criticism to the audience, remind us how many deeply stupid people are out there, or otherwise serve a practical purpose.

Correspondingly, I have published a list of humor sites with educational benefits on my website. Your visit is welcome—as are your suggestions for new additions.

Written by michaeleriksson

March 10, 2010 at 5:37 am



Commercial language


It is usually obvious when someone has written a text to inform his readers and when to convince them to do something—and in the latter case, the readers tend to be loath to comply. Whether these “convincing” texts bring a net benefit will depend on the readers’ intelligence, education, and experience; but I note that ad writers (and writers of “corporate” texts) naively tend to use the same cheap language tricks irrespective of target group—in particular, failing to consider that many of the readers will be smarter than the ad writers themselves are…

This type of writing seems to be spreading further and further, even if in a less intrusive form than in advertising. Nevertheless, this increase in self-serving language is an annoyance, while, in fact, serving no-one: On the contrary, I fear that it reflects a lack of humility and self-perspective (or may, conversely, affect thinking).

For instance, in my recent exploration of free (legal) sources of music, I found the following snippets on an overview page for the Internet Archive:

muzic is proud to share their collection with the world in partnership with the Internet Archive.

Why “proud”? Any pride that could be relevant here is the type of pride to be avoided.

Download free recordings of classical music performed live in the Isabella Stewart Gardner Museum’s Tapestry Room. These exclusive recordings from the museum’s regular concert series feature…

The first sentence uses an imperative—a big no-no. The second uses the word “exclusive”; which is not only ad language, but also hackneyed.

Listen to this collection of 78rpm records and cylinder recordings released in the early 20th century.

Imperative again.

RadiOM.org is a unique new music resource providing access to historical and contemporary material recorded over a fifty year-plus span.

“Unique” is another hackneyed ad word—and one that tends to be used untruthfully to an even higher degree than most others. In addition, the sentence as a whole gives a negative impression, e.g. through the use of “providing access”. Consider instead:

RadiOM.org provides music recorded over the last fifty-plus years.

If these had been adverts in a newspaper, I would have said nothing, possibly even thought myself lucky; however, consider the context they were in.

Some articles on my website deal with similar topics, notably Idiocies of ad writing.

(For those unconvinced, consider whether the addition “I am proud to exclusively share this unique article with you. Read it NOW!” would increase or decrease your wish to read it.)

Written by michaeleriksson

February 21, 2010 at 11:37 am