Posts Tagged ‘education’
Reading and thinking / Follow-up: Profound vs. trite
In an excursion to a recent text, I wrote that:
A reason for my scepticism towards e.g. business studies and social sciences is that I, when reading such materials, often find myself raising issues like “this cannot be true—consider X”, “but what about special case Y?”, “important observation Z immediately follows—why is there no mention of this?” only to see the book address X, Y, and Z a few pages later on—and in a manner that makes it clear that the author is now teaching the reader something that he could not expect to figure out on his own.
(Footnote removed.)
Looking at harder sciences, the situation is usually better (if less so today than in the past, cf. below). I have, for instance, always tended to think ahead, put claims to scrutiny, whatnot with math books too—but with math this seems to be not just allowed or welcomed but outright expected by the authors.* Indeed, most math books relegate pieces of reasoning entirely to exercises and/or add exercises to make the reader expand on his knowledge and understanding on his own. In math, we are supposed to think for ourselves; in business (to some degree) and social sciences (to a high degree), we are all too often supposed to just accept the Revealed Wisdom.
*Beyond a certain border, maybe somewhere during high school. (To specify the border would require considerable research; it might vary from time to time, country to country, and field to field; and it must be seen as an average over many authors.) There might or might not be a similar border in any other given field, but, with reservation for math adjacent fields, it is then far further up the years.
Unfortunately, there seems to be a general trend for the worse in most fields, math included, likely as a result of continual dumbing down and the acceptance of ever more students into college or specific college programs who are not actually college material resp. sufficiently strong for the specific program. An increasing assumption seems to be that the reader/student/whatnot simply does not have the brains to master something without being led by the hand.* In particular, the obsession with lectures and teachers is depressing—two tools to reach education, knowledge, and understanding that are highly inefficient, yet remain ubiquitous.**
*Leaving aside the question of whether this type of “mastery” truly deserves the name, as the ability to create new knowledge, to apply old knowledge to new problems, to understand a book on the topic unaided, and similar is likely to be far more developed in those who have proved themselves on a harder road.
**The claim “there are no bad students—there are only bad teachers” is pretty much the reverse of the truth. A bright and motivated student with a good book can learn despite the teacher, barring the possibility that the teacher kills his motivation or otherwise outright sabotages him; a dull one can hardly be taught much beyond rote learning, even should he be motivated.
It might be argued that many books of yore* took the issue of understandability too lightly, put a burden of thought on the reader that excluded too many, and made the journey unnecessarily hard for the included.** However, today’s books err on the other side, and are often “premasticated” to such a degree that little thinking is needed, which hampers the development of a deeper understanding. Worse, those who do enjoy thinking can see the opportunity to do so diminished. There is also often a shift from time spent thinking to time spent reading, which undoes much of the time that could have been saved (at the cost of inferior understanding), e.g. in that a book of old might require one minute of reading and two minutes of thinking (for some amount of content), while a newer book might require two minutes of reading and one of thinking—or, worse, three minutes of reading and none of thinking. Then there is the issue of half-truths, “lying to students”, etc.
*Here, too, I cannot give a specific border, but in a very rough guesstimate, the 19th century and earlier might have been too low on understandability, most of the 20th was “just right”, while the latter parts of the 20th century and onwards have been too dumbed down and simplistic.
**Note that this often included a shift of brainpower from understanding the underlying matter to deciphering the text qua text. (I realize that I am not the ideal thrower of the first stone, but sometimes stones still should be thrown.)
Some thoughts on the granularity/length of qualifications
In the overlap between some recent writings* and one of my backlog items: The time needed to gain various qualifications seems unfortunate, notably in that the times can be too different, that the times are often too long (by some standard), and that the times tend to increase, where a decrease might be more suitable. (Here I do not imply, e.g., that a certain degree should be cut down to half the time, but that the degree might benefit from being split into/replaced by two independent phases, both of which are roughly half the original time. See the discussion of degrees towards the end of the main text.)
*[1], [2]; to a lesser degree, [3]. Getting the gist of the below, with reservations for some German terminology, should be possible without reading them, but they are likely necessary for a full understanding.
Consider, again, the DQR:*
*The description is in German, but the below should be understandable. I will not go into the complications already discussed, e.g. the lesser time/effort needed to earn a level 6 or 7 qualification through the IHK compared to university studies and the granularity/rounding/truncation issues of the scale, although I recommend that the reader keep the latter in mind, as they are likely a partial explanation for what goes on at the pre-uni levels. (Cf. [2], in both cases.)
- The qualifications listed for level 1 are a subset of those for level 2,* and very basic both in how advanced and how long they are. One of them, the Berufsvorbereitungsjahr (“vocational preparation year”, or similar), is presumably a year long, implying that reaching level 1 can be done in one year. (The other might or might not be even shorter.)
*I have not investigated the exact difference, but it might to some part be a matter of unstated differences in length, to some part a matter of prior qualifications not on the scale, maybe relating to students dropping out of school at different stages. (While Germany has a long compulsory schooling, repetition of grades, medical issues, and similar can see someone “age out” without actually completing school, while immigrants and older generations might be short of this bar for other reasons.) A problem with the scale, at least in the descriptions that I have seen, is that no true “zero point” is defined.
- However, the Berufsvorbereitungsjahr is also (cf. above) listed for level 2, implying that level 2 might also be reachable “from scratch” in one year, and certainly going from level 1 to level 2 is not likely to exceed one year. Here we also find the Hauptschulabschluss*, which is typically earned after 9-or-so years of unambitious* school, which gives us a more tangible point of reference.
*See excursion.
- At level 3, we find e.g. a two-year Azubi program, implying that at most two years are needed from level 2. We also find the “[m]ittlerer Schulabschluss”,* typically earned after year 10 of school, implying that the difference between level 2 and 3 is more a matter of quality than quantity and/or that a quantity of one year is enough.
*Better known as “mittlere Reife”. See excursion.
- Level 4 is quite tricky. Here we find, among others, three-year Azubi programs, which point to a single additional year relative level 3. However, we also find the Abitur*, which is not just harder than the other school tracks but also longer, stretching to year 13 of school, implying another three years relative level 3 and the “mittlere Reife”.**
*See excursion.
**Here and in [2], we can see that the “academic” road up the DQR is harder than the “vocational” road.
- Level 5 is a near vacuum in Germany, as the Abitur is counted as level 4, academic degrees begin at level 6, and the Meister (cf. [2]) is misclassified as level 6. (Germany does not have associate’s degrees, but they would likely have been ranked here, had they existed.) The few entries mentioned seem to be in the extended IHK family and/or some other type of vocational training, and at least the shortest would, in light of the Meister, almost certainly involve less than a year of effort, maybe considerably less. (If it does not, then so little additional time is needed for the Meister that it becomes delegitimized as level 6 even beyond what is already the case.)
- (From here on, I will restrict myself to academic degrees relative the Abitur.) On level 6, we see a three (sometimes four) year jump relative the Abitur to earn a bachelor’s degree. With the current classification of the Abitur, this amounts to a year and a half per level (4 -> 6), while it would be three years with the Abitur at level 5. Alternatively, as level 5 is a vacuum, we have three years to go from one level that actually counts (in a German context) to the next level that actually counts.
- Earning a master takes around two years, and moves someone from level 6 to level 7.
- Finally, a doctorate might take around four* years, and leads from level 7** to level 8.
*Due to the large thesis component of a German doctorate, it is hard to give a time frame, and both significantly shorter and significantly longer times are possible. However, in a first approximation, I would see four years as a good guideline for Germany. (Other countries might have different norms, but around four years is often appropriate.) The length is one of the factors that have deterred me from attempting a doctorate, and one of the motivations for this text, the original backlog item, and my below suggestions.
**A doctorate immediately following a bachelor is a rare exception in Germany, and was virtually impossible before the Bologna reform, as almost all worthwhile first degrees were on the master level, by a U.S. standard. (Note the Diplom-XXX degrees of old, e.g. the Diplom-Ingenieur.) However, if someone does manage to arrange that road, the time per level is correspondingly halved.
As can be seen, the levels are of very different length in terms of “how long to get there”. The lower levels, especially, can be reached in on the order of a year each, while (at least the academic versions of) the higher levels take significantly longer. The step from 7 to 8, in particular, is ridiculously long, while the step from 4/5 to 6 could be viewed as unduly long relative other steps (depending, cf. above, on how many levels are counted). Then there is the oddity of the Abitur vs. other roads to level 4.
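For concreteness, the per-level times on the academic road can be tabulated with a few lines of Python. The year counts are the rough figures from the main text (with the Abitur counted as level 4), not official durations:

```python
# Years of additional study per DQR level step on the "academic" road,
# using the rough figures from the main text. Illustrative, not official.
steps = {
    "4 -> 6 (Abitur -> bachelor)":  (3, 2),  # 3 years, 2 nominal levels
    "6 -> 7 (bachelor -> master)":  (2, 1),
    "7 -> 8 (master -> doctorate)": (4, 1),
}
for label, (years, levels) in steps.items():
    print(f"{label}: {years / levels:.1f} years per level")
```

The output (1.5, 2.0, and 4.0 years per level) makes the unevenness obvious at a glance; counting the Abitur as level 5 would change the first step to 3.0.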
Looking at Sweden, based on my own studies, school has a nice and even division into three-year phases (the mandatory låg-, mellan-, and högstadiet* and the optional gymnasiet**). A nine- or ten-semester civilingenjör*** followed, and had I continued with a doctorate**** it would have been another four-or-so years, although there is an option to take a “licentiate” degree half-way through the doctorate. Here I have long wondered why the system was not changed to use three- or, better, two-year intervals for the university phases as a matter of course.
*The completion of which would likely match level 2 or 3, going by the DQR entries. (There is a Swedish version, SeQF, of these frameworks, but Swedish Wikipedia is not helpful and an Internet search gives me the impression that it is still a work in progress.)
**Approximately level 4 or, maybe, level 5 in some cases and assuming that the Abitur truly is level 5 instead of level 4. Also see excursion. (Theoretically, “gymnasiet” refers to an older form of the same step, while the current step carries the bureaucratic name “gymnasieskolan”, but non-bureaucrats still prefer “gymnasiet”.)
***Level 7. This degree is modelled on the German pre-Bologna Diplom-Ingenieur, and can, in U.S. terms, be seen as a bachelor immediately followed by a master (thesis included). The length was nine semesters when I took it, but has since been increased to ten. Note that this implies a move from level 4 or 5 to 7 in one (long) step. (Also see excursion.)
****Level 8, of course.
Looking at the U.S., paths like bachelor + Ph.D. (4 + 4 years) and bachelor + master (4 + 2) are common, but there are both occasional associate’s degrees (2 years, possibly as a first step to a bachelor) and occasional intermediary masters, earned half-way through a Ph.D. and with the option to remain at that level. There is also an implicit, even more fine-grained, scale through the often-used 1xx, 2xx, …, 8xx numbering of courses. While these, in and by themselves, do not result in degrees, degrees and portions of degrees can be informally ranked according to the proportion of the leading numbers among courses taken. (And, to some approximation, the leading number corresponds to the year of tertiary education for someone advancing upwards.)
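A minimal sketch of such an informal ranking, with hypothetical transcripts and a simple mean as the ranking measure (both my own illustration, not an established method):

```python
def transcript_level(course_numbers):
    """Average leading digit ("century") of the given course numbers,
    as a crude proxy for how advanced the course work was."""
    return sum(n // 100 for n in course_numbers) / len(course_numbers)

# Two hypothetical transcripts: mostly introductory vs. mostly graduate work.
undergrad = [101, 102, 215, 230, 310, 320, 410]
graduate = [510, 620, 640, 710, 805]

print(transcript_level(undergrad))  # roughly 2.3
print(transcript_level(graduate))   # 6.4
```

A weighted mean (by credit hours) or the full distribution of leading digits would be fairer, but the simple mean suffices to show the idea of ranking by course numbers.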
Combining various systems, how about the following:
Primary and secondary education is divided into four blocks of three years, as in Sweden. Tertiary education is divided into four or* five blocks of two years, following a schedule of associate, bachelor, master, doctorate or associate, bachelor, master, licentiate,** doctorate. (Higher doctorates can, of course, be appended in a similar manner.)
*This would depend on whether a doctorate should be considered roughly “four years past the bachelor” (as in the U.S.) or roughly “four years past the master” (as in e.g. Germany). There might also be a complication of how close in topic a master and a doctorate must be for the doctorate to build on the master, and here the licentiate could be a solution for those who wish to switch topic. (Admittedly, the difference between a licentiate and a second master might be somewhat arbitrary in this case.)
**Or some more suitable name. Here I go by the Swedish convention, but the implications of various degree names tend to differ between countries, as is the case for “licentiate”.
Alternatively, as higher degrees have “diminishing returns”, keep the two-year schedule for associate and bachelor and go in increments of one year from there. (I will not suggest names for the resulting steps of incremental one-year degrees. However, there are sufficiently many existing names with different semantics to go around.) In combination with an incremental “dissertation by publication” this might work quite well even at the doctoral level, as there is also an incremental reward; however, it might not be workable with a conventional monographic dissertation, which typically amounts to more or considerably more than a year’s work for a single publication.
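The main proposal (the five-block variant, with a licentiate) can be summarized as cumulative years of education; the numbers merely restate the blocks above:

```python
# Sketch of the proposed scheme: four three-year school blocks, then
# two-year tertiary blocks (the five-block variant, with a licentiate).
SCHOOL_YEARS = 4 * 3
TERTIARY = ["associate", "bachelor", "master", "licentiate", "doctorate"]

def years_of_study(degree):
    """Total years of education for the given degree under the proposal."""
    return SCHOOL_YEARS + 2 * (TERTIARY.index(degree) + 1)

for degree in TERTIARY:
    print(degree, years_of_study(degree))
```

Note the even two-year spacing between consecutive degrees, in contrast to the current mixture of one-to-four-year steps.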
For my part, I would have loved the option of doing a “quarter-doctorate” to see how I liked it, after which I could either have continued or left with the corresponding degree. (As opposed to the current system, where I might have done a year and then left, but, if so, without a degree and with the potential mark of “drop out” against me.) Universities might similarly be interested in the option of taking on more students for doctorates, seeing who does or does not cut it, and winnowing the field down after the first year. Equally, I would have loved the option of doing a “quarter-doctorate”, working a few years, upgrading to a “half-doctorate”, working a few years, etc.
This granularity can also help with avoiding the “specialization trap” of the higher degrees: Today, it is hard to earn a higher degree than one already has without considerably increasing specialization within what already is the field of specialization, which is, of course, fine for those who intend to become academics/researchers/whatnot within that specialization, but might be troublesome for those who look more for personal growth* or better qualifications for more general work. By using shorter degrees, it is easier to focus more on breadth than depth and easier to gain entry into a different specialization. For instance, someone with a master’s degree in math might choose to earn a quarter-doctorate in each of four mathematical subfields instead of a single doctorate in a single subfield.** Similarly, someone with a master’s degree in math with a healthy side of computer science might today have the choice between being satisfied with the master and going on for four years to earn a doctorate in specifically math—but might, in the alternate system, spend the same four years earning a “half-doctorate” in math and a “quarter-doctorate” in computer science (three years of quasi-doctoral work + one year of upping computer-science knowledge, maybe resulting in a second master, before the “quarter-doctorate”). This with the option of stopping after e.g. the “half-doctorate”, while a current student would have spent two years not earning a degree. At lower levels, a double-major bachelor might prefer earning two one-year “half-masters” in the fields of the majors over earning a single two-year master in one or spending the extra time to earn one two-year master for each.
*Speaking for myself, personal growth, a wish to understand the world better, plain curiosity, satisfaction from study, and similar are much more likely to move me to study than the wish for credentials, let alone a career in science. (But credentials matter in this world, and it is only fair that those who put in the right work should have the corresponding credentials.)
**It might be more accurate to speak of “a specific topic from a subfield” than “a subfield”, as at least the dissertation tends to be quite pointed, maybe even relating to a single problem and its solution.
Excursion on the length of the civilingenjör:
The original reason for the length is likely historical, in that (a) there was an older thinking of “earn a degree and then your education is done”* and (b) a repeated lengthening over the years. A particular point, however, is that civilingenjör is considered a near-terminal degree in engineering, similar to the J.D. in U.S. law,** and has stood in comparison to lesser engineering degrees, notably the three-year högskoleingenjör*** that few with ambition considered, when the civilingenjör was so much more prestigious. (Note that taking a högskoleingenjör first, and then moving to a civilingenjör as an add-on, is possible and would approximate my thoughts above, but this would be unusual and arguably suboptimal, as there is a difference not just in length but in how theoretically and how practically oriented they are. Besides, again, the fact that one could enter the civilingenjör immediately makes the högskoleingenjör too unambitious a choice for the good heads.)
*Indeed, the whole differentiation into additional master-this and doctor-that degrees is a comparatively late development, outside of special cases, which is reflected in e.g. the difference between a traditional Scottish (really, bachelor) master and an English master, and the historical U.S. transition from LL.B. to J.D. degrees, which was just a (misleading) change of names.
**See [4] for a comparison, from which it is clear that the bachelor preceding the J.D. has a surprisingly small effect on overall length of study, and that the civilingenjör has spent more time on his chosen field.
***Approximately, “college engineer”; this would amount to roughly a bachelor in engineering.
Excursion on the German qualification mania:
A good example of how obsessed Germany is with giving everyone a narrow education and title (a topic of importance for these several texts, notably [1]) is Fachpraktiker (in German). Some youths considered too intellectually limited* for e.g. an Azubi are instead sent to a similar program to become Fachpraktiker, where the (already low) theoretical knowledge is dialed down in favor of more practical learning. Whether they are more employable than before is unclear, but they now at least have a label and are more strongly limited to a single field than before—just like the German mentality seems to demand.
*The Wikipedia page is unclear on the exact scope. Early on, it speaks of “Behinderung” (“disability”), which can include e.g. the blind, but the rest points to specifically those with a mild mental retardation and/or the deeply stupid (hidden behind the euphemism “Lernbehinderung”, which, in theory could imply any number of things, but, outside the disabled, usually implies “a state of deep stupidity—but we must never, ever call anyone stupid”).
Excursion on German school, etc.:
The German school system is both complicated and confusing,* but to give a brief overview:
*Have a look at e.g. a graphic overview from Wikipedia and note that this is still an over-simplification. As a particular complication, different states/Bundesländer often have different rules and/or differ in terminology. I make the reservation that I did not myself go to school in Germany, and might have a more superficial understanding than the natives and/or might have misunderstood something in my compensatory readings.
After year 4 or 6, depending on state, the school system is split into three tracks, the Gymnasium (most ambitious/demanding, ultimately aiming at the Abitur), the Realschule (middle track, leading to the “mittlere Reife”), and the Hauptschule (lowest track, leading to the “Hauptschulabschluss”). After year 9 or 10, the two lower tracks end and are followed by various other forms of education, including the Azubi (cf. [1]; new lowest track) and various more conventional vocational schools (new middle track), while the Gymnasium continues with the “gymnasiale Oberstufe” until year 13. A comparison with the U.S. (high-)school system is complicated by the end of the Realschule/Hauptschule during the “lowerclassman” years of a four-year high school, but it might help to view (a) the Gymnasium and the “gymnasiale Oberstufe” as school resp. high school for those expected to be college material and/or as (inexpensive) “prep school”, (b) the two middle tracks as school resp. high school for the average, and (c) the two lower tracks as school resp. high school for the bottom of the barrel. However, note that these are statements about averages, not individuals, that a switch from one track or track level to another is possible,* that Azubi programs and vocational schools may overlap in prestige depending on the field of work and future career prospects, and that the vocational component of, especially, the lowest/Azubi track might be far larger than in a U.S.** high school.
*But a step “upwards” might require additional schooling or unusually good grades.
**With reservations for my depth of understanding of the U.S. situation. I am vaguely aware of the existence of vocational programs and/or high schools, and certainly of various “shop” components, but cannot make an actual comparison.
Of course, the above is a considerable simplification, which, for instance (!), ignores the presence of Gesamtschulen, where the split into tracks is often diminished or delayed. (These are popular with politicians, especially as a preliminary step to an end goal of removing all tracking, as the “nurture only” ideology dictates that differences between kids are caused by differences in schooling.)
Excursion on the Swedish gymnasiet:
While the first nine years of school are common in Sweden, the gymnasiet phase sees a radical split into various programs of differing ambition and purpose. Most* programs are of a vocational nature, ranging from two to three years, and fill a role similar to various German Azubi programs and vocational schools. More interesting for this discussion are the three-year “college prep” programs, of which there, with reservations for oversights, were four in my days and now are five,** and which are comparable to the German “gymnasiale Oberstufe”:
*Because different occupations need different programs, while the “college prep” programs can be kept more generic.
**Nationally, as “must provide” programs; other optional local programs can occur.
- The “natural science” program, which is considered the most ambitious and most difficult, gives the widest university access, and was my own choice (in the then incarnation).
(Going by U.S. standards, it, like the German Abitur, has a strong claim of being closer to the associate level than the high-school level and/or of being on level 5.)
- The “technical” program, which is largely overlapping with the “natural science” program, but which is shifted a little in the technical and/or practical direction, and a little less academically “well rounded”.
(Based on content, something like “science and technology” might be closer to the mark, but I stick to informal translations of the names used in Sweden.)
- The “humanist” program, which has a focus on languages and the like, and is popular among those who intend to study fields like languages, philosophy, and religion at university. This especially if classical languages are needed.
- The “social science” program, which is self-explanatory in content and can be seen as the “I want to go to college, but cannot cut it in the ‘natural science’ program” program. (The “technical” and “humanist” programs, in contrast, can be seen as “I want to go to college and I have special interests”, or, for the latter, “I want to go to college and need Latin [or whatnot] to be admitted”.)
Since my days, this program appears to have been split in two, to create an additional “economy” program.* (This was earlier a specialization within the single program.)
*Going by Swedish Wikipedia, the deviation from the sibling might be more one of “business prep” or “law prep”, depending on specialization, than “economy”, let alone “economics”.
It also, unsurprisingly, appears that these two programs are the two most popular (cf., in Swedish, [5]), as “I want to go to college, but cannot cut it in the ‘natural science’ program” is a common affliction, and as Swedish politicians are very keen on increasing the proportion of college-goers, regardless of actual suitability.
However, these programs are not complete specializations. For instance, looking at the current incarnation of the “natural science” program on Swedish Wikipedia, there are mandatory courses in e.g. modern languages, social science, history, and religion. (It was similar three decades ago, although I do not remember the exact courses.)
How should the Meister be classified? / Follow-up: German make-work, barriers of entry, the Azubi system, etc.
In a text on German make-work and whatnot ([1]), I said, concerning the comparison between master’s degrees and the “Meister” (master craftsman) qualification that:*
*See that text for German terminology not explained here, e.g. “Meister’ and “IHK”.
the [master’s] degree is at level 7 on the German version of the European Qualifications Framework, while the Meister is at level 6—and I have seen the claim that this is only due to IHK lobbying, with 5 being a rating more compatible with the rating of similar international qualifications. From a casual look, I consider the claim very plausible.
Since then, I have looked more closely at German Wikipedia [2] and the German version of the Qualifications Framework, and would support a classification on level 5. Below, I will go into details by comparing (mostly) bachelor* degrees and Meisters (both nominally at level 6).
*Restricted to the proper university bachelor. Cf. the complication of “Fachhochschulen” and “Berufsakademien” mentioned in [1]. (Including them would not change the big picture, however.)
Before I begin, two general complications/reservations that must be borne in mind:
Firstly, the 1–8 integer scale used is often on the crude side and suffers from a “rounding problem” or a “truncating problem”, in that, by analogy, two values that would only be one or two tenths apart on a more fine-grained scale might turn out to be a full integer step apart on the actual scale. For instance, when truncating* numbers to integers, 5.9 turns to 5, while 6.0 remains 6. Vice versa, two numbers that are almost one apart on a fine-grained scale might be identical on the coarser scale (6.0 and 6.9 both truncate to 6). If we assume that a Meister has a fine-grained value of 6.0 and a bachelor a value of 6.9, this might be tolerable; however, having them both at 6 is not. The sole defense of the Meister classification, assuming 6.0 vs. 6.9, would be that it is less a matter of a faulty classification by the Germans and more of a too coarse and misleading scale set by the Europeans.
*The same effect is seen with rounding, just with a different set of numbers, e.g. in that 5.4 turns to 5 and 5.5/6.4 to 6. (The truncation version merely seems more pedagogical in context.)
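The effect can be illustrated directly; the fine-grained values are hypothetical, chosen to match the examples above:

```python
import math

def truncate(x):
    """Map a fine-grained value to the coarse integer scale by truncation."""
    return math.floor(x)

def round_half_up(x):
    """Map by conventional rounding instead."""
    return math.floor(x + 0.5)

# Almost identical values can land a full step apart...
print(truncate(5.9), truncate(6.0))  # 5 vs. 6
# ...while values almost a full step apart can land on the same integer.
print(truncate(6.0), truncate(6.9))  # 6 vs. 6
# Rounding shows the same effect, just at different cut-offs.
print(round_half_up(5.4), round_half_up(5.5), round_half_up(6.4))  # 5, 6, 6
```

Either way, up to (just short of) a full step of fine-grained difference is destroyed by the coarse scale, which is exactly the problem with a hypothetical 6.0 Meister and 6.9 bachelor sharing level 6.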
Secondly, the scale explicitly does not compare academic levels (unless two entries are both of an academic nature). Instead, it tries to find what entries are equal to each other in some more abstract sense, without implying a fungibility—equal but not the same. This approach is of debatable value, but is not obviously wrong. By analogy, a J.D. and an M.D. are, in some sense, worth approximately the same, while still being too different to be fungible in many contexts. My take so far: it is a nice idea, but it brings too little value and the lack of fungibility is a problem for most practical comparisons.* I still disagree with the classification of the Meister, but this complication increases subjectivity and arbitrariness in a manner that makes a strict analysis harder and odd positions easier to defend (without these positions necessarily being sensible).
*A more promising idea is the international comparability, the original point of the EQF, in that we do not compare a craftsman with an academic, but, say, an Italian and a German craftsman or an Italian and a German academic. The same general scale and intra-field classifications could have been kept, but divided into separate scales for craftsmen, academics, and what else might apply. (Also see excursion.) This would have the added benefit of keeping fewer entries on each numerical level, which reduces the danger of a scale being too coarse and/or the damage when it is.
To proceed:
- Assume, dubiously, that the typical entry requirements to beginning a Meister (Azubi done) and a bachelor (Abitur done) are equally valuable, hard-earned, whatnot.
A German bachelor is usually earned in three* years, but can be as long as four* in some cases.
*Nominally. Many students need more time, either because they cannot keep up or because they have to work part-time in parallel.
A Meister?
Strictly speaking, in my impression, no additional education is needed to take the corresponding test (Meisterprüfung),* but for those who go to a Meisterschule (“master school”), which is normally considered sufficient preparation, German Wikipedia claims:
*How many/few would be able to pass without prep work or with less prep work than provided by a Meisterschule, I leave unstated. The statement concerns formal pre-test requirements—nothing more, nothing less.
Je nach Berufsbild dauern die Vorbereitungskurse im Vollzeitunterricht zwischen 3 und 24 Monaten; in Teilzeit bis zu 48 Monaten.
Translation:
Depending on the characteristics of the occupation, the preparatory courses last between 3 and 24 months of full-time study; part-time, up to 48 months.

Assuming full-time study, we then compare between 3 (!) and 24 months with 3 years or longer. To this must be added that the intellectual requirements for a bachelor are higher (with reservations for field of study). Clearly, the Meister is well short of the bachelor.
But what about the practical work experience of a Meister? Does that not count and would not a comparison of 48 months with 3 years be fairer? No: this line of reasoning might be valid up to the point of earning the qualification, say, comparing a freshly minted Meister and a ditto bachelor. However, down the line, this would give the Meister an unfair advantage, as he can count some of his work experience while the bachelor cannot—even decades down the line. I also note that there used to be a minimum work experience of (maybe) 3 years in the field at hand to be accepted to the Meisterprüfung, but that this requirement is long gone. In theory, it should be possible to get to the Meister level with no more work experience than gathered as an Azubi.
- The entry requirements, however, are not equal, unless we apply the same type of faulty classification to them, too.
The typical pre-qualification for a Meister is a successfully completed Azubi program (also see [1]), which is typically three years of a mixture of practical work and easy-by-Abitur-standards classes, for a nominal rating of level 4—or, in some cases, just two years for a rating of level 3. To enter an Azubi program, the formal qualification needed is (likely) the Hauptschule, on the outside the Realschule.*
*Hauptschule and Realschule are the lowest resp. middle branches of the German school system. Many descriptions of the system are unclear or leave out vital information, but beginning an Azubi program after year 9 or 10 of school seems normal.
For a bachelor, it is the Abitur, usually as a result of going through the Gymnasium (highest) branch of the German school system, where children, beginning at year 5 or 7* and continuing to year 13, are given an academically oriented schooling for the purpose of future university studies. This branch is considered considerably harder than the other branches.
*Here and elsewhere there are complications like different states/Bundesländer having slightly different rules.
The Abitur is also nominally at level 4, but it appears that it came close to being put at level 5. (Quoting [2]: “das Problem, ob das deutsche Abitur die Niveaustufe 4 (Vorschlag der Sozialpartner und Kammerorganisationen) oder 5 (KMK-Vorschlag) erhalten soll, [wurde] auf einen späteren Zeitpunkt vertagt; dieses wurde im Frühjahr 2017 der Niveaustufe 4 zugeordnet”. Translation: the problem of whether the German Abitur should receive level 4 (proposal of the social partners and chamber organizations) or level 5 (KMK proposal) was postponed to a later point in time; in spring 2017, it was assigned level 4. Roughly, the likes of the IHKs wanted it on level 4, the ministers of education wanted it on level 5, and the IHKs won. Note an overriding pattern of misclassification through the influence of the IHKs, as well as the mega-guild issue in [1].) My current impression is that 5 would have been better, and this would have reflected my earlier observation that the Abitur is closer to a U.S. associate’s degree than to a U.S. high-school diploma. At a minimum, we have the type of unfortunate truncation comparison discussed above, in that the Abitur might have been a 4.9 truncated to 4; or even a 5.0 misclassified as a 4.9 and then truncated to 4. This while the 3-year Azubi might well be a plain 4.0.
(Also note how we, in a slightly different world, could have had Meister and Abitur both at level 5, but actually do have Meister at level 6 and Abitur at level 4—something very much to the advantage of the IHKs and their members.)
- Looking at the 3-year Azubi, we have the “pro-Meister” position that a Meister began with a weak level-4 qualification, spent between 3 and 24 months on study, and is now on level 6, while the bachelor began with a strong level-4 qualification, spent 3 or more years on more advanced study, and is now also at level 6. Already here, the claimed equivalence looks ridiculous.

A “pro-bachelor” scenario juxtaposes level 4 + 3–24 months of study with level 5 + 3 or more years of more advanced study. The comparison is now utterly absurd.
Then there is the issue of the 2-year Azubi, which presumably* also has a Meister-continuation, where someone moves from level 3 (!) to level 6 by earning the Meister…
*I have not looked into this, but the opposite would be highly surprising.
- Another approach is to look at who is qualified to study at the university level. Someone with the Abitur is, without restrictions (“allgemeine Hochschulreife”). A successful Azubi is not (although both are at level 4), while a Meister (at level 6) at least* partially is.**
*I have seen somewhat conflicting information as to whether he gains the general right or merely one in sufficiently nearby fields (“fachgebundene Hochschulreife”). Maybe the rules simply differ from state to state; they have definitely changed over time.
**And while a bachelor, also at level 6, already has studied successfully at the university level.
In this specific regard, the Meister is more closely comparable to the Abitur than to a bachelor. (But note that a Meister has rights relating to his craft that the Abitur graduate does not.)
- Yet another approach is to look at age. Someone following the main road to a bachelor, while studying full time and sticking to the by-the-book schedule, with no interruptions, might be done at 22 (after the Abitur at 19), while a sufficiently ambitious Meister-wanna-be with a similar brain* could push it to 20, maybe even lower.**/***
*But note that most with ambitions and brains tend to go to university and/or otherwise go to more challenging, intellectual, or profitable occupations. The proportion that lands with a Meister is far smaller.
**Note that I do not say that he should, as gaining more work experience before the Meister might be a good idea, as working as a Meister while barely shaving could lead to credibility problems, and as going for part-time studies might be a better money decision. The point is that looking at the time/effort needed for completion, even quality aside, the Meister might be closer to Abitur (level 4) than to bachelor (level 6).
***I had a brief look for the youngest Meister on the Internet and found one 19-year-old Meister; however, this appears to have been pre-1983, and the ruleset might have been different.
In conclusion, it seems fair to put the Meister at level 5, not level 6, and a case might be made that the Abitur belongs on level 5, not level 4. This even generally—if we look at the issue from an intellectual/academic point of view, it might well be that the Meister would rightly be rated below the Abitur and, certainly, far below the bachelor.
Excursion on the next IHK step:
As I have noted, IHK has (at least) one step beyond the Meister, which counts as level 7 and as nominally equal to a master’s degree. Here I have less information, but the situation seems to be similar, in that there are few formal requirements for taking the test,* and that the study time is much shorter than for the degrees on the same level. For instance, one info page gives “12-monatigen Sonntagsstudiengangs oder eines 10-wöchigen ‘FAST TRACK’-Lehrgangs in Vollzeitform” as the duration(s) for a prep course. (“12-month Sunday course or a 10-week ‘FAST TRACK’ course in full time”) In contrast, the most typical length for a master, to go from a bachelor at level 6 to level 7, is likely two years/four semesters of full-time study—and here we have roughly half a semester. Indeed, going by study time and what is discussed above, I am not certain that this course is enough even to promote a Meister from level 5 to level 6, with the bachelors,** and it sure as hell is not enough for level 7, with the masters. Indeed, even if we were to accept Meisters at level 6, in conformance with the official framework, the idea of these 10 weeks being enough for level 7 is ludicrous.
*I have clicked around a bit, and the requirements often seem to include a few years of work and knowing English. Knowing English is an Abitur/pre-bachelor skill, and gaining a few years of work experience is something that anyone can do. (A Meister or equivalent is, of course, presumed, but the “equivalent” also seems to have much more leeway than the must-have-a-bachelor criterion for beginning a master, which weakens the credibility further.)
**This depends on where at level 5 we put the Meister. If high enough, these 10 weeks’ worth might just be enough; if not, then not.
The trick, I suspect, is that the IHK qualifications very deliberately include work experience as a booster (or, even, as the brunt of the qualification), while the academic degrees do not. Why this is highly misleading has already been discussed—and it comes close to the diploma-mill scam of “Earn credit for work experience!!!”, except that the IHKs work with the full support of the government.*
*No, I am not contradicting what I say below, in another excursion: a way to include work experience in the framework would be good; to push it in through the backdoor to the benefit of special interest groups is bad; and to create a general fake impression, even outside the framework, of equivalence is very bad. The effect here is that the one has three years of higher education to earn a bachelor and ten years of work experience, and is stuck at level 6, while the other has, maybe, a year’s worth of higher education, the same ten years of work experience, and is promoted to level 7. Maybe he has learned an enormous amount during those ten years (I did, during my first ten years, and my second ten years), but how are his ten years worth more than the bachelor’s ten years?
As an aside, visitors to the above page might notice the use of phrases like “Master Professional” and “Bachelor Professional”. This is a recent trick to increase the value of these qualifications further, and to further push the fake equivalency with the real degrees—and it leads to a further devaluation of the real degrees, to worsen the situation discussed in [1] (see “Berufsakademie”, etc.).
Excursion on comparisons not made:
There are two obvious comparisons that I have not made:
Firstly, a direct look at the general abilities and whatnots that the framework presupposes on various levels. Such descriptions tend to be wishful thinking or even free fantasies, especially when it comes to what various politicians, educators, advertisers, and similar claim that e.g. earning a certain degree will achieve. The descriptions for a certain qualification will not only usually exceed what is found after reaching that qualification, but often what is found one or two qualifications higher. To boot, the descriptions tend to be vague and open to interpretation. Correspondingly, a comparison is either pointless or will require a much more in-depth understanding of what is intended by the respective descriptions.
Secondly, a comparison with similar qualifications in other countries. While sensible, it would require a lot of work, especially as the number of occupations involved in Germany is very large;* and might fail due to (a) my too limited knowledge of the respective local situation or (b) the likely internationally unusual German system.** Moreover, I cannot rule out that other countries have fallen into the same trap. (However, note that claims by others to the effect that such a comparison should put the Meister at level 5 were the starting point of my interest. Cf. the above quote from [1].)
*It is by no means just carpenters, plumbers, and grocery-store staff, but countless others. To make matters more complicated, there are many cases where a certain field, notably software development, is covered both by IHK programs and regular university degrees—and an apparent match in, say, Spain to an IHK program might turn out to actually be a better match for a university degree. (By analogy, a chemistry course offered by a high school in country A does not, or only very rarely, compare properly with a chemistry course offered by a university in country B—but a too casual and naive observer might see two chemistry courses and jump to conclusions.)
**In my native Sweden, for instance, the nearest equivalent to Azubi programs are pure school programs that cover similar theory and praxis entirely in school, while there is likely no near equivalent of a Meister (today; as opposed to “yore”). This with the reservation that I have had less exposure to the Swedish system than the German, having spent the clear majority of my adulthood in Germany.
Excursion on age:
Just like with work experience, it could be argued that the typical newly minted Meister has an advantage over the typical newly minted bachelor (after adjusting for I.Q. and whatnot) through typically being older at the time of “graduation”. However, we again have the problem that taking this into consideration will unfairly favor the Meister. Yes, it might be true when comparing a freshly graduated bachelor at age 22 with a freshly “graduated” Meister at, say, 28, but it will not remain so when we compare both at age 28, let alone 48.
Excursion on other deficits of the framework:
There are other deficits with this scale. Consider e.g.:
- It does not reflect non-formal qualifications, including work experience and self-studies.
This might be hard to avoid, but it is a whopper. For instance, to someone in his early forties, is a degree achieved twenty years ago more important than the following twenty years of work experience?* For my part, I have no qualms about claiming that my informal studies, even discounting work experience, rightfully should move me to level 8, with the Ph.D.-holders, while my formal qualifications and my “official” classifications leave me at just level 7, even be it twice over.
*In sufficiently brainy fields.
- Multiple qualifications on a certain level bring no extra value. I have two master’s degrees and share level 7 with those who have one—and those who have a dozen. (Should they exist.) This while a single Ph.D. nominally trumps any number of master’s degrees. (Yes, the nature of a Ph.D. is different, and maybe one Ph.D. should trump two master’s degrees—but three? Four? Five? Should a single master’s degree trump two or three bachelors?)
- Even formal qualifications are all or nothing. For instance, someone with half a bachelor, be it due to an interruption or because he is still in college, is no better off than a high-school graduate. The successful student might then go from level 4 to level 6 from one day to the next, after having spent 3 or 4 years at level 4. A better system might involve a component of “credits earned” at different levels, instead of just “degrees earned”.
- It does not include a 9th level for “higher doctorates” and other “post-doc” successes, like published papers and scientific discoveries. This is of some importance in e.g. Germany, where a regular doctorate is not always sufficient for professorships and where the status as terminal degree is disputable.*
*The “Habilitation” is indisputably rated higher, and has a regular doctorate as a pre-condition, but the waters are muddied by how different Bundesländer and/or universities formally categorize it, and whether it counts as/results in a degree or something else.
- Individual countries can have incentives to give their own qualifications an artificially high level (as with Germany and the Meister).
- It fails to separate real university degrees from the “Fachhochschule” and “Berufsakademie” degrees (cf. [1]) and degrees otherwise of different quality, e.g. between master’s degrees with and without a thesis, and between a Ph.D.* with a large thesis portion and one consisting mostly of classes. (Also note the following two items.)
*In all fairness, specifically German doctorates are thesis heavy—period. However, the overall framework is for Europe, not Germany, and there might only be a question of time before “softer” doctorates become an issue, as yet another symptom of academic inflation.
- It fails to recognize the difference between even a “barely passed” and a “summa cum laude”.
I would certainly argue that the difference between the two can be worth one step on a better scale, and I would go as far as to suggest a need for outright separate degrees, e.g. in that a bachelor is awarded in two different forms depending on whether someone was above or below a certain quality mark, which can then easily be fitted into different levels.* (Although a slightly more fine-grained scale might be needed, e.g. one going from 1 to 16 instead of 1 to 8, which would allow the same division on all levels. Also see earlier remarks on the coarseness of the scale.)
*I note e.g. that the English often speak of receiving “a first”, “an upper second”, or similar, rather than just “a bachelor”. These are strictly speaking not different degrees, but it shows a saner attitude than in e.g. the U.S. and Germany, where the degree is mentioned first and the quality is either left out (more often) or mentioned only in second place (less often; as in e.g. “graduated summa cum laude”).
- It fails to acknowledge the difference in difficulty between e.g. math and gender-studies, which might well exceed the difference from the previous item.
- There is no dimension of field/relevance/whatnot. Consider e.g. my own switch to software development (cf. [1]): I was already on level 7, through my largely non-software studies, but was now in a new field.* Was I better or worse off than if I had gained a more relevant level 6 qualification?** Almost certainly worse. Similarly, if I want plumbing done, I would rather turn to a plumber than to a Ph.D. in classical languages (level 8), regardless of whether he be a Meister classified as level 6, a Meister classified as level 5, or a former Azubi who is a mere level 4.
*However, with hindsight, I believe that I have underestimated the benefits a little in [1], as my continual computer exposure (even if unrelated to software development) did make me a touch typist, did familiarize me with command lines, did broaden my (user) experiences with software, and did give me an early Unix exposure, without which I could conceivably have been stuck as a naive Windows user for quite a few years past graduation.
**Say, today, a bachelor in software development. Back then, the closest choice would likely have been computer science.
Excursion on hyper-egalitarianism:
With the absurdly strong Leftist influence on this-and-that in Germany (and many other countries), there is a possibility that some of the issues here and in [1] go back to hyper-egalitarianism, that all roads to and types of education must be considered equal in value, or that e.g. considering an academic better* than a craftsman or a university better* than a “Berufsakademie” would be bourgeois snobbery.
*This in two variations, the one in doing so directly, the other in considering them different and, thereby, “risk” that “unenlightened” non-Leftists form the opinion that the one was better than the other. Notably, even without making a value judgment of “better”, we do have an objective difference in that earning a proper academic degree involves more thinking and more work than earning a Meister. (Ditto Abitur vs. Azubi, etc.)
Remark on capitalization:
In German, all nouns, not just proper nouns, are capitalized. I usually keep this capitalization when I use German nouns in English (most notably, “Rechtsstaat” and variations). Hence “Meister” with a capital “M”, because I use the German noun, but “bachelor”, because I use the pre-existing English word, not a redundant import of the German “Bachelor” (which originated as a borrowing of the English word). As an aside, I would be in favor of English following the German example, as it can reduce ambiguity, speed up interpretation, and make exposure to previously unknown words easier to handle.
German make-work, barriers to entry, the Azubi system, etc.
As I noted earlier this week, Germany is big on make-work. A partial reason for this might be a default approach that someone spends years learning one occupation* and remains in that occupation for the entirety of his life. If, then, the old occupation disappears or is severely reduced, a multitude is now forced to spend years learning a new occupation, to go unemployed, or to take some extremely low-end job that requires no qualifications.
*I use “occupation” over “profession”, because most of the cases under discussion will be below the typical standard for a profession. Vice versa, “trade” would often be a match, but might be too restrictive on the high end or for some types of occupations. The word “occupation” is not necessarily ideal in other regards, but I can find no better way of avoiding “profession” off the top of my head. For instance, neither “job” nor “vocation” truly does the trick.
Historically, there has been some justification to this attitude, as many skills involved the right physical speed, dexterity, and “knack” (and, to some part or in some fields, strength and endurance) at specialized tasks, which can only be built through enough practice. (And where there might be some advantage to beginning young in terms of malleability.) For instance, take a weaver working a loom: in order to increase his output, he must be able to perform certain movements faster and/or for longer, and he must be able to do so with so high a precision that the quality of the output remains acceptable. Let us say that he is forced to switch to sewing. It is very possible that he will have an advantage over complete beginners through his skill as a weaver, but his new colleagues will view him as a clumsy slowpoke for a good long while,* until he catches up sufficiently in speed and whatnot. This, of course, assuming that he manages to get enough work in the interim to develop these skills and that he does not starve to death before he has done so.** Then we have complications like the need to get hold of the more literal tools of the trade and material to work on, the risk that much material is wasted in the early phases through mistakes, etc. When our ex-weaver learned weaving, he might have done so as an apprentice, with free food and lodging, under the supervision of a master and with access to the master’s tools, and through a process that began with the performance of trivial tasks as a child and ended with competent work at, maybe, some point in his late teens.*** Now, as an adult, starting over, his situation is likely to be very different.
*How long, I do not know, but chances are that we are talking months to be taken seriously and years to be fully “on par” again. Looking at myself and touch typing, the most similar mechanical skill that I know from extensive personal experience, I am a better typist today than I was ten years ago, and ten years ago, I had already been at it for eighteen years.
**To the first, customers are likely to prefer the more skilled, who will get the job done faster and at a higher quality. To the second, a lower output and a lower quality will mean correspondingly less money.
***Disclaimer: While I have some idea about old apprenticeships in general, I have no special knowledge about weaving, and I do not guarantee that this matches what took place in weaving during any given time period. However, it is the big picture that matters.
(The more intellectual skills? Yes, they were important too, and here too a partial redevelopment would be needed. However, when push comes to shove, an ex-weaver who had learned to handle the needle well enough could find work under someone who had the right domain knowledge and whatnot. If he could not handle the needle, he was out to begin with. Moreover, ability to think is often more important than knowledge, and this ability remains when we switch fields; further, at least some of the more intellectual skills would translate, especially between adjacent fields. Today, however, the intellectual skills are likely the bigger stumbling block.)
Today, it is different, as e.g. the transition from being the operator of an industrial loom to the operator of an industrial sewing machine is a far smaller obstacle. Yes, if someone wants to go from machine operator to physician, the effort is massive, but this would, if at all, be something done more out of passion than need.
However, today does come with its own set of obstacles, many in terms of attitude, many in terms of artificial entry barriers—and from here temptations like use of make-work and subsidies to keep old occupations alive can arise. In Germany,* there is a wide range of occupations where the practitioner is supposed to have a certain education, sometimes by regulation, sometimes by mere expectation—and, no, I am not talking about just physicians and lawyers. The more-often-than-not misguided restrictions range from teachers** to the cashiers*** in a grocery store.
*With similar issues common in many other countries, although rarely to such a degree.
**Teachers must by law pass government exams and have a certain university education. They are not necessarily very good because of this education, teaching is rarely the first choice for the best minds, and talents from other fields, who might be looking for a change of tempo or a new challenge, are kept out by the high entry barriers. Similar errors are common internationally, but the German rules for teachers rival what other countries impose on lawyers.
***No, there are no governmental exams for cashiers, and many are just doing a brief stint to solve an unemployment issue or to have some income while studying; however, the intended-by-the-powers-that-be road to a job in a grocery store, be it cashier or store manager, is to work as an Azubi (cf. below) to earn the title of Kaufmann/-frau (depending on sex).
The proportion of jobs for which a certain qualification is non-negotiable has grown smaller, but the IHKs* have worked very hard to keep it up. Indeed, during my early years in Germany, they tried, if in vain, to push mandatory IHK qualifications for anyone who wanted to run a computer/software/whatnot business, as was the case for e.g. carpenters—never mind whether the prospective owner already held an academic degree relating to computers and/or had years of experience in the field at hand. (These were comparatively new fields at the time. Today, the same request would have no chance at all.)
*Imagine if the local city guilds in a medieval town joined up to form one single mega-guild, which additionally provided a range of different exams, of often disputable value. Put in the money and effort to pass the right exam(s), and you are on the inside, like a lawyer within the bar; don’t, and you are on the outside and not allowed to compete with the insiders. (To take the exams there are fees and expensive preparatory courses, but, from what I have been told and with typical reservations for hearsay, it is almost hard to fail the actual exams, likely because the point is less to keep the quality of the insiders up and more to ensure that only those who have paid their dues are allowed to become insiders.)
One of the main tools of (not necessarily deliberately) raising barriers is the focus on “Azubi” apprenticeships,* which take three-or-so years, depending on the field, to complete. Now, there is not necessarily anything wrong with the Azubi system, per se, as a way to gain practical experience in a certain occupation, together with some theoretical and general knowledge. However, there is a lock-in effect, and one pushed by the likes of the IHKs. For instance, in many countries, a young man with some generic tool skills, who is looking for a first job, can ask around and maybe find a construction job, or a job in a garage, or with a carpenter, or whatnot, with the understanding that “we will see how you work out”. This is not impossible in Germany, but chances are that he will be met with “But you were not an Azubi in our field—we have no use for you!”, “Become an Azubi first and in three years we will see. No, we do not offer any Azubi positions.”, or similar more often than he likes. Similar problems will often manifest when someone has had a job (and/or been an Azubi) in field A and now wants to switch to field B. Moreover, the too young German faces the complication of mandatory school, which, depending on location and circumstances, might extend as far as age 18. Being an Azubi formally fulfills the requirement of going to school; just working for someone, no matter how educational, does not.**
*“Azubi” is short for “Auszubildender”; literally, roughly, “one to be educated”; more idiomatically “apprentice” or “trainee” (assuming a comparatively low level of trainee). Beware that I use the word with some syntactic and other liberty, in order to avoid introducing too many German words and/or jumping between English and German words.
**Generally, Germany has utterly failed to understand that education is good, but that school is a different matter altogether. For instance, there are years of mandatory school (not education) and home schooling is forbidden as an alternative to “regular” school. Land der Dichter und Denker? Wohl eher der Dicken und Doofen! (“Land of poets and thinkers? More like of the fat and the dumb!”)
Indeed, switching is tricky. Firstly, we have the psychological component of having invested these three years into a certain field and a certain training program, and many will feel that these “sunk costs” are hard to leave behind. Secondly, there is the issue of qualifications, where the switcher will either have the wrong training or be forced to retrain—which can imply further months or years down the drain.* To this, we have the risk that someone does go through retraining—only to discover that the new field is another poor fit (has low demand, or whatever might be the problem). Thirdly, we have the trainer–employer’s** view. Here there are at least two questions that are likely to arise, namely, “If you failed at A, why should we expect you to succeed at B?” and “If you did not stick with A and your old trainer–employer, how do we know that you will not dump us, after all that we have invested?”.
*I have not looked into the options here, but if we assume that an Azubi-level formal qualification is wanted, I doubt that it would be doable without at least one additional year, more likely two, as there are limits to what rebate the previous training can bring in the new field.
**Azubis are hired by the business where the practical training takes place, with the understanding that a proper employment will usually follow after “graduation”. The pay is low, but over three years it accumulates. To this, the costs of training and whatnot have to be added, while it is unclear how much of the costs the Azubi can offset through productive work. (Using Azubis “too” productively can even lead to problems with regulators—Azubis are officially there to learn, not to be productive workers.)
Going higher in the IHK system increases the lock-in effect. It does not end with the basic qualification that a successful Azubi earns. It is followed by at least two (optional) further levels, the first being “Meister”*. Complete your Azubi education and you can work in a field, but to run a business in that field, you had better be a Meister. So, now you have spent years of effort and many thousands of Euros on earning the right IHK qualifications for this specific field, not to mention the opportunity cost of foregoing other options, and you want or need to do something else? Tough luck. (See excursion for a comparison with higher education.) Or, now you are thirty and want to join the teens for basic qualifications? Would you enjoy that and would you be accepted?
*Literally, “master”; corresponds roughly to the traditional master craftsman; not to be confused with the academic degree. (Apart from the very different material and approach, the degree is at level 7 on the German version of the European Qualifications Framework, while the Meister is at level 6—and I have seen the claim that this is only due to IHK lobbying, with 5 being a rating more compatible with the rating of similar international qualifications. From a casual look, I consider the claim very plausible.)
An unfortunate development is that the Abitur* is increasingly becoming a pre-requisite for the more attractive Azubi positions, when the intention is that the Azubi track is “instead of”—either someone completes an Azubi program and goes straight to work or he earns an Abitur and goes on to university. This sets the career entry back by several years for the Azubis, reduces the time available for a later career shift,** and might make the perceived cost of the first Azubi period larger (with an ensuing greater perception of being entitled to work in a certain field, as opposed to biting the bullet and retraining to be something else).
*The German approximate equivalent of high school is divided into several different tracks. The Abitur is earned by finishing the longest and most academically challenging track. It is mostly intended for those who intend to continue at university, and it is, in its own right, closer to an associate’s degree than to a U.S. high-school diploma. Here, too, I take some liberties in the use of the word, for the sake of simplicity.
**But might move the original career choice to a time of greater maturity and insight.
From another perspective, keeping the perception of worth/qualification in a certain field tied to a certain program, study, whatnot is advantageous to those who are lacking in brains but still want to feel important, claim to be competent, use authority arguments (with themselves as the authority…), and similar. This has likely contributed to similar problems worldwide and on many different levels, e.g. in that a presumptuous chimney sweep declares himself an expert while having the expertise of a cocker spaniel,* that a principal with an Ed.D. in “educational leadership” speaks with authority on pedagogy and talks down to teachers who have a mere master’s degree, and similar.
*See a handful of older texts for my personal experiences.
A particularly frustrating example from my own life is my attempt to register with the Künstlersozialkasse* when I switched from software to writing. My registration was denied with various nonsensical claims, many along the lines that I had not studied writing in college and could, therefore, not conceivably have a serious interest in writing—despite there being no formal requirement to this effect. This while, apparently, any idiot with a degree in journalism is welcomed with open arms, despite journalists typically being incompetent writers and definitely not being Künstler, and despite the massive current surplus of journalists.** This well demonstrates how “Quereinsteiger”*** are kept out tooth and nail when there is a surplus in a certain field (or would be a surplus, if entry barriers were lower), while they are welcomed with open arms when there is a deficit, as with IT around 1999 (and at many other times).
*Over-simplified explanation: a way to keep costs for e.g. health insurance down when working in a creative profession. I have filed a complaint with the right court, but am, years later, still waiting for the case to have its turn. The “Künstler” part implies artist, but is taken to include e.g. authors of literature in contexts like these; and the Künstlersozialkasse has the definite original purpose of helping those who try to make some type of art, be it paintings, literature, or music—not journalists.
**This is likely another instance of make-work (or something in a related family): newspapers do not sell as they used to and we cannot have all those journalists switching to some other field just because there is too little work to go around.
***A popular German term to indicate someone who switches fields. Looking at components, “[E]insteiger” means entrant/someone who enters, “[q]uer” something crossing something else or, by metaphorical use, something odd or unusual (as with the related English “queer”). How these are intended to be combined in meaning is not clear to me even in German, but variations like “someone who enters queerly” (no innuendo intended), “someone who enters in a sidewise [sideways?] manner”, and, maybe, “someone who uses a side-entrance” are conceivable.
Excursion on switching fields after higher education resp. IHK education:
Yes, there is a similar issue with switching fields after higher education, say, after earning a master’s degree in engineering physics, but there are key aspects that differ. The most notable is that a larger proportion of the population and a much larger proportion of the low-qualification/-payment/-status jobs are affected by the IHKs. That switching from a career as a physician to one as a lawyer is tricky, well, that is understandable, maybe even unavoidable—but switching from mere employee in a carpentry business to mere employee in a plumbing business? Then we have the age of decision: Azubi applications are often made by mid-teenagers, and a program, once started, is hard to change; in the U.S., the (final) choice of college major might be made around 19/20, and the choice to go to med school, law school, or earn a master/Ph.D. even later; in e.g. Germany or Sweden, similar (non-master/-Ph.D.) choices might be made at 18/19,* but dropping out of one degree program to follow another has fewer complications and less “loss of face” than switching from one Azubi program to another (and the college fees are much smaller than in the U.S., making a switch less painful economically). Then we have the filter effect of higher education,** where e.g. the degree tells us so much more about the characteristics of the graduate than a completed Azubi program does; and the fact that both the depth and the breadth of a college degree are far greater and more generic skills are present. Through this filter effect and the greater depth/breadth, a college graduate is more attractive to employers in other fields (within reasonable limits) than an Azubi.
*As they do not work with “generic” bachelors-with-majors, but with more field-specific ones, where a major would be redundant; and as the equivalent of med and law school is available without a prior bachelor. Note that “high school” ends at a higher age than in the U.S.
**Even be it less so today than in the past, due to the flooding with students well short of “college material”; however, the STEM fields are still reasonably strong.
Excursion on my own experiences:
Why “master’s degree in engineering physics” in the above excursion? Because that is what I earned for my first master: I originally studied mostly math and physics, but went to work as a software developer,* transitioning through a mixture of on-the-job learning and own studies. I spent most of that time in the Java area, but later transitioned to Oracle and PL/SQL, again through a mixture of on-the-job learning and own studies. A few years back, I decided to write fiction and went through the same again. In addition, I had an interruption of my Java years to work as a business analyst, which followed a similar pattern; and I am a certified Scrum master. My formal education, be it the original math/physics or the master of computer science that I later earned, has been of little value in terms of my work capabilities,** and I would almost certainly have developed faster as a software developer, had I foregone it. In contrast, one of the modern software-development “boot camps” might have been a more valuable help.*** Put in the hard work, and switching fields, while reaching average**** standards, is not that hard, even for “brainy” fields, and the less so for more practically oriented ones. (Excepting those few with strict formal criteria and some few where a very large amount of knowledge is presumed. Work as a physician is an example of both.)
*This original switch was motivated by my wish to stay in Germany and the impression that it would be easier to find work in software development during the then (1999) raging IT boom. Later switches were based on shifting interests and a wish for something new. (Except for the period as business analyst, which was offered to me by my then boss, and which I, with hindsight, should have turned down.)
**But they might very well have helped me get jobs through improving my CV. Two exceptions are a very programming-heavy first-year course and the programming work done on my first master’s thesis, which gave me some introductory experience and made the transition easier, but this was a small fraction of the overall workload. The master of computer science had much more programming/software-development-related material; however, surprisingly little of that has been of any practical use. (It might very well be that someone with less prior practical experience and/or someone who took a strongly software-development oriented program, which computer science is not, would have had a different experience.)
***I have no practical experience and am only superficially familiar with the idea and typical contents, but the principle is promising.
****Well above average, in my case, but I am brainier than most. More generally, the standards reachable in a given time frame will depend strongly on the level of brains available (and/or what other constraint might be present in a given field)—but the sad truth is that the average in most fields is highly unimpressive. (To be among the best of the best of the best in a field is a different matter entirely, but, almost tautologically, this is an accomplishment that is extremely rare even among those who stick to a single field, unless that field is quite small.)
However, this assumes (a) that there are no artificial obstacles like those caused by an over-reliance* on the Azubi-system, (b) that the attempt is made with a dedicated effort, (c) that the attempt is actually made.
*Note that I see Germany as being overly reliant on the Azubi-system. Kept within better limits, the problems would be smaller, and much of what is today done within the Azubi-system could be done better with a mixture of more informal on-the-job training and regular work-experience. I might suggest a new coinage of “learning-on-demand”, where employees on simple jobs are taught relevant skills if and when the need arises—not in a blanket manner. Manning the check-out in a grocery store, e.g., is not rocket science (heart surgery, corporate law, engineering, whatnot). A plumbing trainee who knows how to install a sink can be taught how to install a toilet with ease and on demand.** Etc. Certainly, this is how things very often work in more qualified occupations, e.g. software development, except that the learner is supposed to learn things through own thought, through experience, from books and the Internet, or (on the very outside) a tip at the coffee machine, without needing a teacher.
**If he can learn it at all. If he is dumb as a post, he might fail, but then he would likely fail no matter when he is taught, and chances are that he would have failed with the sink too.
Excursion on devaluation of academic degrees:
In Germany, there is not just a problem with regular academic inflation (grade inflation, too many students admitted, too many wishy-washy degrees, whatnot), but also with a devaluation of “proper” academic degrees through the awarding of degrees with the same name by “Fachhochschulen”* and “Berufsakademien”** that use lower standards of quality and/or quantity. Imagine, as an analogy, if U.S. community colleges were allowed to award three-year*** bachelor degrees, and the difference between these and a four-year bachelor degree from a regular college were not properly respected. To make matters worse, this follows upon the Bologna process, which replaced the established “Diplom” system with a more Anglo-American bachelor + master system, with different universities having different stringency standards in the wake of the confusion (and not every employer and whatnot actually understanding how the systems compare).
*They often use translations like “university of applied sciences”. Whether this matches the use in the U.S., e.g. in the sense of “associate of applied sciences” is unclear to me, but they do have a more vocational and less academic tilt. The use of “university” is definitely questionable, and “college” might better, but still only approximately, reflect the German division into “Universität” and “Hochschule”. Then again, this division is another thing that the powers-that-be try to eradicate, leading to yet another devaluation.
**I am unclear what the equivalent would be, if any. A literal translation is “professional/vocational academies”. They are, in my impression, to real universities what Azubi-studies are to regular school, including a tie to a specific employer and a corresponding large work-for-credit portion. In an absurd twist, what Berufsakademien award are not formally counted as degrees (“akademische Grade”) but “staatliche Abschlussbezeichnungen” (approximately, “state graduation designations”; the German original is almost as idiotic). The names of the not-degrees remain the same as for the university degrees.
***A relevant comparison within the U.S. system. German bachelor degrees are also often three years long, but this is not a relevant comparison, as the Abitur is a pre-requisite and far beyond the U.S. high-school degree (as noted above).
Excursion on the college mania and Azubis:
Germany is yet another country hit by the college mania, where everyone and his uncle wants a bachelor’s degree (or more). A consequence is that the number of strong candidates for a position as Azubi is diminished. This, in turn, leads to a lower quality of low-level workers, especially if college graduates are too proud* to take such jobs and/or their applications are rejected for lacking the formal IHK qualifications. For the same reason, employers and prospective Azubis cannot agree on whether there are too few Azubi applicants or too few open Azubi positions. (The raw number of Azubi applicants, who want a position, is often considerably larger than the number of satisfactory applicants, to whom the industry wants to give a position.)
*The belief that a degree automatically opens doors is increasingly incorrect, due to the reduced value of degrees as filters, but can be hard to overcome.
What does and should a politician know?
Earlier today, I wrote:
In particular, how much own knowledge must be required of Trump when he has specialists to advise him? Can we require that non-physician, non-virologist, non-epidemiologist Trump, at so early a stage, answers a team of renowned specialists with “But Diamond Princess?”, after being told that “science”, “models”, whatnot show that several percent of the U.S. population might die? (Fauci, Birx, et al., in contrast, have no excuse.) And what if he did give this answer and was met with a plausible sounding explanation as to why the data from the ship were not representative? And how is Trump different from dozens of other world leaders, including Angela Merkel (an actual former scientist, albeit not in medicine)?
This leads to two much bigger questions:
What does the typical politician, head of government, whatnot actually know?
What should he know?
Both questions are tricky, and I will not attempt an even remotely exhaustive answer (for now, at least). In fact, the risk of politicians believing one thing and publicly claiming another makes the first question near impossible to answer.
However, looking briefly at the “does”, while taking claims at face value,* it seems that the typical specimen is well short of where he should be, including lacking basic knowledge of history, economics, business principles, and relevant science. Consider e.g. the wholesale COVID fiasco, the common skepticism toward nuclear power, ignorance of IQ and biology, belief in that long-debunked “tabula rasa” nonsense, a horrifyingly naive take on various environmental issues (e.g. EVs**), …
*Not that I necessarily do in real life.
**Electric vehicles undeniably have many advantages, but the whole “debate” seems to begin and end with “No more exhausts! Yay!”, unless some minor concern is given to practical complications around charging stations and the like. Questions like the environmental impact of construction (e.g. through rare-earth mining), eventual scrapping, and production of electricity to actually charge the vehicles seem to be outside the intellectual scope of most politicians. In this, EVs illustrate the overall problem well.
Indeed, there is much that I would consider “general knowledge”* that is simply lacking. Maybe my bar is higher than most others’, but I do find it absurd when a politician, for instance, speaks of nuclear power as if modern Western plants were comparable to the Chernobyl plant or fails to understand that fossil fuels cause much more death and damage every year than Chernobyl and Fukushima put together. (Or, e.g., fails to understand that “Chernobyl and Fukushima put together” is approximately the same as just “Chernobyl”.)
*More correctly, what my fellow Swedes call “Allmänbildning” and the Germans “Allgemeinbildung”, for which “general knowledge” is only an approximate translation.
(And if politicians tend to lack general knowledge, why should we expect Trump to be able to argue with medical specialists on a specialized medical topic?)
The “should” will depend much on the circumstances at hand, and having a good brain, being willing to constantly learn, understanding that it is important to draw on multiple sources, and striving for understanding over mere knowledge might be more important. However, an older text contains an excursion on education for politicians, which provides a partial answer. (Certainly, the minimum reading suggested for voters is the more urgent for politicians, but it falls far, far short of what e.g. a POTUS should have read and understood.)
An interesting idea is that the amount of knowledge should be enough to question the advisors in two senses: (a) to know what questions should be asked in a given situation; (b) to put the advisors and their advice in question, as they might not only be suboptimally knowledgeable, themselves, but also often have hidden or not-so-hidden agendas that might make them dishonest or manipulative.
About those experts…
Disclaimer: This text started out in one place, saw a change of tack, and then evolved into something even more different. With the length of the text and the massive amount of writing that I have done this month (averaging a little more than one published text a day), I lack the energy to rewrite it to the level that might find expert (to stick with the theme) approval. I publish anyway to shrink that backlog just a little.
When I was a young teenager, I had an attitude to governance very similar to e.g. that of a modern-day U.S. Democrat or other Big-Government proponent. The broad masses are stupid* and politics is the art of putting the right persons in charge, so that they can make wise decisions for all—an expert for everything and every expert in his place. The more than thirty years since have seen a gradual change of opinion, my last few illusions crushed by the behavior of “experts” during the COVID pandemic.
*Well, I got one thing right, but whether they are actually more stupid than those in charge is sometimes debatable.
While I have certainly not walked around with blinders for close to fifty years, the process was slower than it, with hindsight, should have been, and I always seemed to just exclude, sometimes unconsciously, a slice of society as non-experts, rather than to contemplate the whole of society. I discovered that most teachers were nothing to write home about during my school years, for instance, but I never viewed teachers as true experts (on pedagogy or whatnot)—specialists, yes; experts, no. When it came to individual fields of study, the first few teachers were generalists: we had a series of single teachers teaching all subjects for the first six years of school, with a switch based on age group, not subject.* In contrast, I expected e.g. the professor of pedagogy, the minister of education, and a Ph.D. holder in an individual subject to have true expertise.
*Excepting some special cases, like wood-shop and music. Here and elsewhere, note that many references are to Sweden. The general claims are likely to hold internationally too, but the exact details might vary.
The next six years were better, with teachers who specialized in, usually, two subjects and had a correspondingly deeper knowledge, but they were still “only” teachers. When they failed to live up to my standards, and most did, I shrugged it off—even high-school teaching is, after all, not where you find your typical genius. Those who can, do; those who can’t, teach. (And two or three STEM teachers might have been outright good, if far from deserving a status as infallible policy makers, giving me some hope for the future.)
Time went by, and I saw disappointment after disappointment, e.g. in that most politicians were intellectual nobodies. (In my very late teens, I actually watched a few parliamentary debates in the hope of learning something. Well, I suppose that I did learn a thing or three, e.g. that parliamentary debates are horrifyingly boring, that politicians are poor at reasoning, and similar.) But, hey, politicians—they are elected because of popularity, not true merit.
Physicians who believe in homeopathy? Sad—and a failure for medical education, but most do not and, after all, medicine is, compared to e.g. math, a field requiring more memorization and less thinking.
(More generally, I have seen many cases of a member of field X who believes stupid thing Y or does not understand Z, and have shrugged it off as an individual nitwit, not representative of the majority. To return to the physicians: those who believe in homeopathy are not representative in this specific regard, but their presence still sends a warning sign for the field as a whole—maybe, they are somewhat representative in terms of ignorance or weakness of critical thinking.)
Bosses who are both less intelligent and less informed than half their teams? Well, the sad truth is that promotions rarely go to the true high potentials* and that e.g. showing dedication, behaving in the “right”** manner, becoming friendly with the right higher-ups, and understanding and using company politics are more important than being good at the actual job. (In a bigger picture, promotions-for-being-a-woman and promotions-for-being-a-minority-member have to be added, but I have myself been relatively sheltered from them, likely simply because of the demographics of German software development. In neighboring departments, and during my time as a business analyst, I have seen some quite suspect cases, however.)
*A phrase somewhat popular in e.g. personnel management.
**Exactly what this implies will vary from case to case, but being sufficiently compliant and sufficiently sociable are common factors.
Over time, I continually shrank, in my mind, the circle of alleged experts who actually were experts, but I never truly stopped believing in experts: they might be rarer than they should be and they might be misallocated, but they do exist, they do, they do! (Or was that Santa Claus? I forget.)
Then came COVID and the conclusion that virtually everyone hailed as an expert* by media and politicians and virtually everyone put in charge of something important was a long way from where he should be.**
*But note that this still leaves room for experts not hailed as such by media and politicians.
**In the case of some, e.g. Fauci and Birx, not just in terms of competence but actual physical location, viz. prison.
A large part of this is, of course, misallocation, as true experts, unlike Santa Claus, do exist—they just are not put in the right positions. Much might be misrepresentation. A recurring problem is experts speaking on topics that they do not understand, using their (real or alleged) expertise on another topic as legitimacy, and/or their failure to factor in other fields. (As with e.g. the many lockdown proponents who failed to consider factors like the effect on the economy, which made lockdowns extremely dubious, even assuming that they had worked strictly from a defeat-COVID perspective. That the lockdowns did not work even to defeat COVID just makes it that much worse.) Another is true expertise being shouted down by propaganda, suppressed by threats of cancellation, and similar. However, the most damaging might well be empty credentials:
What does that diploma or whatnot truly mean? Higher education used to have a strong component of filtering for ability, intelligence (which is the most important in any expert), willingness to work hard, and similar, but it does comparatively little actual developing—a degree tells us who had the right type of mind before admission, but it does not create that mind. The mind is strong after graduation because it was strong before admission. Yes, I learned much during my own studies, but they did not fundamentally change me beyond what a similar stretch of e.g. work would have done.* Today, this filter effect has been catastrophically weakened, for reasons like laxer admissions,** lower standards to pass old courses, new courses that are bottom-of-the-barrel, and entire majors that might be passed just through being sufficiently compliant and not thinking for oneself (e.g. gender studies). Even in the days of old, non-STEM fields were not ideal filters, as a hard worker could often compensate for a lack of brains.
*The typical college years fall into an interval when there is still some physical maturation taking place and when there are large “beginner’s gains” in terms of professionalism, self-knowledge, worldview, etc. (This especially as they often coincide approximately with leaving home, gaining new legal rights, having first major economic responsibility, first serious relationships, and similar.) A few years of virtually anything will make a difference, but the difference will be less a matter of e.g. college having a magic effect and more of nature taking its course.
**Note how much larger the college population is relative to the past.
From what I have observed myself, read, and been told by others, Ph.D. studies are similar—they do develop a narrow set of skills, but the main question is whether the mind was there before admission.*/** Moreover, doctoral studies often involve surprisingly long stretches of leg-work.
*Disclaimer: I have two Master’s degrees but have never been in a Ph.D. program.
**Reservation: There are some interesting differences between different countries. For instance, many U.S. (non-language) programs have a requirement that the candidate must have some knowledge, often “reading proficiency”, of one or two foreign languages by the time the degree is awarded. This requirement might increase the distance between Ph.D. holders and non-holders within a given country, but mostly because of a catching-up effect, as such language knowledge is expected at the end of high school in many other countries, including Sweden. (I had nine years of English and six of German by the end of high school.)
That professorship? Well, as I have learned over the last few years, the main point of a professor today is neither research nor teaching—but collecting enough research grants for his employer. (And as early as the mid-1990s I had a professor complain to me that all the administrative work was taking up half his workday.) Now, when a professor is judged by how much money he brings in, what quality of professor qua researcher* and expert will result? Do not get me started on other aspects of academic research, like publish-or-perish, “p-hacking”, dubious co-authorships, pressure to get the “right” results, and whatnot.
*Being a good researcher plays in, of course, but knowing the right people, being charming, being able to bluff, and, sadly, the willingness to pick the right research topics seems more important. Even more sadly, fudging one’s results can be helpful (if undiscovered).
Looking at other qualifications, it is rarely better. High government positions, for instance, say very little about competence, and sufficiently high positions tend to be detached from actual research and expertise in favor of policy work, public relations, and whatnot. Awards are often dubious (witness many of the Nobels and Oscars), with a common influence of politics, filtering based on opinions, mutual pats on backs, and similar. The MacArthur “Genius grant” has been derided as entirely detached from genius.* Some awards are even invented for the purpose of publicity and credential padding (arguably, including the Oscars).
*I have not done the leg-work to have a firm own opinion, but I do note that somewhat recent recipients include the likes of Ibram X. Kendi (racist, hate monger, reality distorter) and Nikole Hannah-Jones (of the fraudulent 1619 project).
At the end of the day, formal credentials are better than nothing, but they do not truly separate the wheat from the chaff. If there is any doubt, consider the current German Minister of Health, Karl Lauterbach. His formal qualifications seem* to include
*Neither German nor English Wikipedia gives a thorough and consistent overview, and the matter is complicated by the different systems for medical education being used. The below is drawn from his self-published CV (in German), with some speculation on my behalf for the two first items.
- Academic qualification as a physician (the equivalent of a U.S. M.D.).
- An additional proper doctorate in medicine (which the U.S. M.D. is not).
- A Master of Public Health.
- A Master of Science in Health Policy and Management.
- A Doctor of Science in Health Policy and Management.
(To boot, often at Harvard.)
However, Karl Lauterbach is a disaster. He has pushed for mandatory vaccinations at a time when the COVID pandemic was diminishing and others went in the opposite direction. He has ignored new data showing how misguided previous actions were—and instead wanted more of the same actions. Before he became Minister of Health, he was a serial guest on TV, propagandizing for extreme countermeasures (and it is, by now, established beyond reasonable doubt that the countermeasures that were implemented did far more damage than the pandemic, per se). Both before and after his appointment, he has distributed true mis-/disinformation, including unsupported claims that a COVID infection would lead to more rapid aging. Many consider him borderline cuckoo, instead of merely incompetent. I certainly consider him outright dangerous, even by the standards of Leftist politicians, the type of man who would wreck society and then look back in pride, insanely believing that he had saved it. (In all fairness, few complain about a runny nose after decapitation.)
Taking a step back to look at myself, when have I developed the most intellectually? In school? At university? No, when I have been at home, reading, thinking, and writing on my own. How does that compare in terms of paper qualifications? Well, formal education tops out at two master’s degrees, both with a nice diploma. The more worthwhile informal education leaves me with, depending on point of view, either a blank piece of paper or one typed by myself. In the latter case, I will have no accreditation and no advantage in credibility over someone who has not gone down my road but merely claims that he has.
Looking, similarly, at great geniuses* throughout history, they have often** had few or no university-level qualifications and/or had their main qualifications outside their later area of excellence—instead they started on a high level of natural talent (likely, typically g-based), developed themselves on their own, and let their work speak for them. Edison had no degree. Srinivasa Ramanujan was largely self-/book-taught when he began his discoveries. Einstein earned a doctorate, but he did so in 1905, when he was already at the top—before that he had some nonsense degree to teach physics (or similar). Tolstoy was a university dropout.*** Paul McCartney has no degree and reputedly could not even read sheet music during his Beatles days. (The other members were similar, IIRC.) Etc.
*Note that I at no point say that I would be one of them.
**Very far from always, especially when we turn to fields that naturally require someone to have an advanced degree to be taken seriously, to get research opportunities, and so on. The further back in time we go, the more the “often” applies; the closer to the now, the less it applies. (As a natural development of both the availability of higher education and the increasing belief in empty credentials.) Go back sufficiently far, and those without degrees will dominate utterly; get close enough to the now, and few, even among the moderately bright, will fail to have at least a bachelor.
***The combination genius–dropout might be quite common.
Turn it around and look at how many have a bachelor, master, or doctorate, yet fail to display any signs of genius—indeed, look at how many are quite stupid. Does the modern-day U.S. have more genius per capita than ancient Greece? Is that Ph.D.-wielding college professor in creative writing a greater author than the poorly educated Shakespeare? Why is Terence Tao (a Ph.D.-wielding math professor) considered a mathematical superstar, while most other Ph.D.-wielding math professors are not?
Comparing modern scientists, e.g. Terence Tao and a vanilla mathematician, brings us to another important point—that the type of credentials well known to the broad public, those that tend to fill (non-academic) CVs, those that politicians brag about (until accused of plagiarism), those that snotty pseudo-intellectuals might use as an excuse to be snotty, etc., are not the alpha and omega within academia. At least in STEM fields, the Ph.D. has long been referred to as the “union card”—it is what permits the holder to actually perform work in the field. It is not, however, seen as proof of mastery of that field. To become a professor it might or might not be a prerequisite (depends on the field and the country), but to apply for the job without a slew of published articles and other merits beyond the Ph.D. would be optimistic indeed. Similarly, a physician is very limited in his right to practice medicine after just receiving his degree, and much more on-the-job training follows before he is fully qualified to practice independently. Most other fields have lower or no entry hurdles, but it is, again, a near given among the knowledgeable that merely having a degree does not prove mastery of the field at hand—nor is it expected.
No, with some reservations for degrees that are both advanced and high in math, I ignore formal qualifications, even those on a Lauterbach level, these days. The proof of the pudding is in the eating.
Excursion on my parents:
The above includes far from all proper cases of disappointing “experts”, and it ignores borderline or merely related cases. A particularly interesting borderline case is parents: like so many other little kids, I once saw my parents as demi-gods, capable of doing anything, knowing everything, holding the keys to the world—magicians who made food appear and who could read the TV guide. How silly they turned out to be over time. Not to put too fine a point on it—they were outright human.
Follow-up: Children vs. parents vs. the government (circumcision)
Long ago, I wrote a text ([1]) criticizing a Conservative complaint about anti-circumcision suggestions by the Left ([2]). While I stand by that text in principle and remain strongly opposed to (non-voluntary) circumcision, I have since seen many practical complications around rights and power that make me question my priorities. In particular, it might well be that some arbitrary power given to the parents would be a far lesser evil. As the author of the complaint said to another commenter, but representative of much of the reaction:
Your missing the point of the original post. The problem is the government’s intervention in this very personal decision. That is the danger to society, not whether you are for or against the practice. Once we allow the government to start running our families or religions, Fascism will follow.*
*Compared with my impression, he might have the causalities wrong. Does Fascism follow because we allow the government to X, does a Fascist government insist that we “allow” it to X, or is it a mixture? Witness e.g. Joe Biden.
Similarly, in direct response to me:
[…] The issue is not circumcision; it’s whether some Left-wing (or Right-wing for that matter) Moon-bats know better than parents and should be allowed to intervene in child rearing. Just look around and see the results. We are in the 5th decade of the Progressive experiment to have Social workers and other government agencies take over the responsibilities of raising our children. Object failure with kids coming out of school who cannot read, teenage pregnancies and abortions at all time highs.
Government does very little right. Suggestion that we continue to cede parental rights to it makes no sense.
Notably, the Leftist attitude of very selective and hypocritically applied rights (both with regard to “what rights” and “rights of whom”) and great centralization of power has done much harm and is very contrary to a free and democratic society.
The status of parents relative to their children (and vice versa), which was at the core of the original circumcision discussion, is a great source of examples—and shows why it is very hard to say what measures are acceptable. This to the point that a circumcision dictated by the parents is a far smaller danger than what current schools, governments, whatnot, impose or would impose if not resisted. (Also see excursion.)
In the last few years, we have e.g. seen schools presuming to go behind the parents’ backs when it comes to gender dysphoria, which could result not just in a slice of skin missing from a penis but in an entire penis being surgically removed and/or used for vaginoplasty (or whatever the word might be), in a decision that could be regretted* a few years later and that cannot be considered truly informed**. The same, m.m., applies to the girls. Note that even lesser interventions, e.g. hormone supplements and suppressors, might have far-reaching and/or permanent effects, and certainly more so, on average, than a circumcision.
*While I have never seen formal statistics, the Internet is full of complaints that e.g. “I was pushed to go ahead and now, when it is too late, I realize that this is not what I wanted” or “transitioning seemed as if it would solve all my problems, but it just made matters worse”.
**The decision is monumental and life-altering. Should any teen, let alone proper child, make such a decision without even getting the parents’ perspective on the issue? (Even assuming that the parents are not given a legally binding veto; and even disregarding the question of whether a teen has half the knowledge and experience to make an informed decision.) This includes not just different perspectives, but also own experiences. What if e.g. one of the parents can truthfully say that “I went through something similar, but it blew over after a year or two and now I am happy that I did not act on it”? Is that not something that the child would benefit from knowing? An interesting thought-experiment is to replace this monumental and life-altering decision with something lesser, but still very, very major—say, a girl of 14 who wants to marry her boyfriend. Would even the gender-fanatics consider this an appropriate decision to make behind the parents’ back?
Something very similar applies to COVID-vaccines, where at least some schools have put pressure on children to get vaccinated without the parents’ knowledge, let alone consent. This despite the uselessness of these vaccines in the relevant age group(s); and despite the risks (small, but by no means trivial) involved.
Then we have matters like indoctrination into far-Left quasi-religions like CRT and attempts to turn the children into Leftist/environmental/whatnot activists. Note, in particular, “action civics”, which foregoes true civics education in favor of activism on issues/for causes that the children themselves are typically far too poorly qualified to judge, but which happen to match the opinions of e.g. the teacher, Greta Thunberg, AOC, or Bernie Sanders. (Which is not to say that Thunberg et al. are good judges of the matters at hand—but they do have strongly voiced opinions.)
Indeed, from what I have read on schools (sadly, often including colleges) in the U.S., and often other countries, the old principle of parents being the guardians and school merely having an “in loco parentis” right during school hours has been inverted: the school (or, worse, the state) is the guardian, and parents are given a mere “in loco scholae” right during the dark ages when schools are closed and children risk exposure to wrongthink. Moreover, this right is often thin and combined with more duties than rights relative to the school,* to the point that parents might be reduced to unpaid caretakers.
*In [1], I opine that parents (should) have more duties than rights towards their children. This claim is not to be confused with the above, which is an unrelated matter with a very different ethical situation.
A particularly absurd demonstration of the sheer presumptuousness of schools is shown by a representative* incident from the COVID-era: A student was participating in a remote class. The teacher spotted something “offensive” in his room, reprimanded him, and demanded that the “offensive” object be removed. Justification: during the participation, the student’s private room would be part of the classroom and everything had to conform to the rules of the classroom. A saner attitude, and what I would have told the presumptuous teacher and/or the principal, had I been one of the student’s parents, is the reverse: This is my home and my rules apply. If you intrude through an online class, then you are a visitor in this home and should behave accordingly. If you are not willing to do so, I will exercise my legal right to throw you out.
*I have seen several somewhat similar incidents. Unfortunately, I do not remember the exact details of any individual incident.
However, similar problems are very widespread, and I suspect a deliberate strategy to (a) reduce or remove non-governmental* instances of (in some sense) power and (b) transfer power to more and more centralized points. Everyone should obey and no-one should presume to object. No-one should have any power, unless that power is granted by the government** for the purposes of the government. No-one should form an opinion of his own, unless appointed by the government to do so—and then the opinion becomes mandatory for everyone. Etc.
*Or maybe non-Leftist.
**Or maybe the Left, here and elsewhere. While the drift to a larger and more powerful government is by no means uniquely Leftist, it is the stronger with the Left; at least the current Left seems to think that only the Left could ever be “legitimately” in power (unlike e.g. Donald Trump); and the associated machinery, including a large proportion of civil servants, often has a strong Leftist tilt, implying that even with non-Leftist political leadership, the government as a whole tends to remain Left-of-Center. (Also see excursion.)
Consider, among other examples:*
*However, note that many of these might have partial explanations in natural developments unrelated to the Left. Also note that they are not necessarily caused by Leftist politicians, even when they do relate to the Left. (For instance, a Leftist activist abuse of administrative positions in colleges and/or Leftist propaganda pressure on colleges might have much to do with the below item on the infantilization of professors.)
- A continual strengthening of the U.S. federal government over the state governments, giving the individual states fewer possibilities to stand up to the federation.
This is paralleled by a similar trend in the EU, where the core principle of subsidiarity grows ever more neglected. (I might go as far as to suspect that some national governments use the EU as an excuse to do what they want to do, but do not want to take the blame for.)
- The move from many small to few large businesses, which are easier to control, and through which the number of persons truly* in charge of something is reduced. Note how Leftist governments, including e.g. Communist China, Nazi Germany, and Social-Democrat Sweden, often have pushed hard to achieve such centralization. (And, whether for similar reasons or accidentally, many Western countries have such large obstacles to running small businesses, especially with employees, that too few take the leap.)
*A sole owner and CEO of even a small private company is in charge in a manner beyond that of a non-owner CEO of a public company, and far beyond a middle-manager in a larger company, even should they have the same number of subordinates.
- Flattening of hierarchies within companies, as in e.g. Sweden and as pushed by many “progressive”* business experts. The more employees are “equal” and the fewer “bosses” there are, the fewer citizens there are accustomed to some level of power or authority. (This the more so, cf. above, when someone today “equal” would have been owner/CEO of his own small business 100 years ago.) Moreover, there are lesser rewards and recognition for excellence, which might reduce self-confidence among those who would have been willing to take a stand. Ditto lesser responsibility at work and potentially a resulting lesser responsibility in matters political.
*Not necessarily and automatically in the Leftist sense, but certainly in the “look how forward-thinking and modern I am” sense.
(Whether larger and/or deeper hierarchies are a good thing, I leave unstated. Other factors play in, e.g. that promotions often go to incompetent smooth talkers and to those who understand more about company politics than about the company business—a deeper hierarchy might well just leave us with more incompetents ordering the competent around.)
- Not only are college students increasingly infantilized, reducing or delaying their ability to stand up for themselves against e.g. the government, but the same applies to college faculty and scientists, full professors included. Sites like Minding the Campus contain a great many examples of highly educated and intelligent persons having to bow down to various administrators hired to ensure Leftist goals A, B, and C. This includes cases of having to write letters of apology or take classes on A, B, and C for having said something in class that some snowflake thought offensive—and usually something that no reasonable third-party would have considered offensive.
Someone who researches the “wrong” topic and/or reaches the “wrong” result can see a promising career end, tenure denied, research grants go to someone more PC, etc. Speak out about scientific knowledge that contradicts an official narrative and cancellation will follow—as will accusations of being variously “racist”, “sexist”, “anti-vaxx”, whatnot, depending on the topic at hand.
Etc.
No, no mere professor should have the right to make statements about reality or to encourage students to think on their own. His, sorry, their only message should be what is Approved and compatible with the Cause. What that message is, is for us Administrators and Activists to determine.
- Then there is the whole COVID-debacle, which might provide material for dozens of items for someone patient* enough. I lack this patience, but I do point to e.g. how physicians and medical researchers have seen problems similar to the previous item, how physicians have been limited in their rights to treat according to their own best beliefs, and a great number of earlier texts on various COVID-related issues (not always ones relevant to the current topic, however).
*Pun unintended, but fortunate in that someone who has been a hospitalized COVID-patient might well have much to add that we others have missed.
Excursion on “government”:
The word “government” is tricky, due to differences in international meanings and implications, and my own use tends to be inconsistent. Above, it is mostly intended in the more U.S. sense of the sum of all elected politicians, civil servants, departments, agencies, and whatnots. Sometimes, the use in the more international sense of those-currently-in-charge might be intended; sometimes, the use of either meaning in a given sentence is compatible with my intentions.
Excursion on benefactors and power:
There is an aspect of benefactors and power that applies above, but is easily missed: If (misperceived or real) good deeds, charity, rightings of judicial wrongs, and other help cannot come from outside the government, there is less reason for the citizens to be grateful or loyal to other entities than the government. Consider e.g. a privately owned business with a dozen employees, where one employee needs some type of help.* In one universe, the owner can choose to help or not help,** in the former case earning loyalty points. In another, his hands are tied by tax payments, some of which go to render the same help in a manner that tricks the employee into believing that the government helped him at no cost to anyone else. In yet another universe, the employee works for a public multi-national company, and there might be no-one in the hierarchy who is and feels*** authorized to help him with company resources.
*I find it hard to give a specific enough example that also works generically. For instance, “needs an expensive operation” might work in one jurisdiction but not another, depending on the rules for and coverage by various health-insurance plans. (And might be abused by some naive Leftist by “You see! We need mandatory single-payer insurance!”, with no consideration for the bigger picture, overall costs, flawed incentives, whatnot.)
**Depending on the details and modalities, the chances might be good, e.g. in that a solid employee receives a loan against future wages. (Which is also better for the employer than tax payments that result in money being gifted, and often fairer for society.)
***Even someone with the formal right might abstain in order to avoid criticism from above.
The effect of taxes on such situations should not be underestimated. For instance, for a private individual to be charitable is much easier when taxes are low than when taxes are high, and higher taxes for the purpose of “good deeds” might well result in a similar effect on the recipients, but with gratefulness transferred away from deserving private individuals, who would have helped or, in the past, did help, to the undeserving government. For the religious, tithing with low taxes and with high taxes are worlds apart. Etc.
Indeed, it is often the governmental intervention that causes the need for governmental help (or “help”). Consider the German health-insurance scam,* where rates around 15 percent of income are typical. What if the brunt of these 15 percent were instead invested by the individual, and used to pay for health (or other) costs as they arise? Chances are that he would do considerably better (and have better incentives), and that his “need” for help when something does happen in today’s system largely arises through his loss of money to pay for health insurance in the first place. (Ditto unemployment insurance, mandatory pension fees, and many taxes.)
*Not a true insurance at all and I stand by the word “scam”. A true insurance would pay for those few large events that are problematic for normal earners, e.g. cancer treatments and big operations, not for everyday nonsense like the common cold. I am very much in favor of a true health insurance; I am very strongly opposed to the current wasteful bullshit.
Excursion on hypocrisy and motivations:
Many of the above problems are given hypocritical and weak motivations to justify their presence. For instance, I have repeatedly heard claims of the type “mandatory schooling is necessary, because some parents might indoctrinate their children and school is needed as a counterweight”, while the schools, themselves, engage in massive indoctrination. Without mandatory school and/or with home schooling, maybe some children would be indoctrinated; with mandatory school, it is virtually all of them. The true issue is not one of indoctrination, but of indoctrination into a certain set of wanted-by-the-Left (and/or -the-government, -educators, -whatnot) opinions. If the indoctrination conforms with this set, it is viewed as good; if it is contrary, it is seen as evil. Similarly, merely not exposing the children to the “good” indoctrination is seen as evil; similarly, teaching the children to think for themselves, which might cause them to withstand the indoctrination, is seen as evil. In effect, it is less a matter of making school a counterweight to the parents and more of preventing the parents from being a counterweight to school. This is the more annoying as the indoctrination in school is often more harmful/evil than the parental—contrast e.g. “All Whites are evil by birth!” with “Christ died to redeem us all.”, let alone with “We should think for ourselves. Sapere aude!”.
Excursion on the duration of indoctrination:
For indoctrination to be durable, an absence of contradiction is beneficial, maybe necessary. A lone pair of indoctrinating parents from the previous excursion are unlikely to have a major permanent influence, unless the rest of society is on similar lines. School is typically supported by the press, many politicians, the message from TV shows, etc., and this indoctrination is harder to shake, as I know from my own experiences.
Rethinking education: School as a vehicle for history education
I have previously made claims along the lines that what most pupils* need to learn is Reading, wRiting, aRithmetic, with the majority of the more “academic” parts of the curriculum being wasted on most pupils;** and that schools fail by ignoring*** the practical sides of life to a too high degree. (As well as a great many other criticisms.)
*Throughout, I will stick to “pupil” over “student” to indicate the comparatively low age and development, and over “children” to avoid a perceived exclusion of e.g. high-schoolers. (Generally, I tend to avoid words like “child” beginning with puberty, and often-but-inconsistently use a child–teen–adult division.) The focus is on primary and secondary education, with a gradual shift from history of a field to the field itself as the years pass. (And, often, with a shift from history in general to history of various fields. See excursion.)
**With the additional complication that those who actually benefit would typically be better able to learn on their own than in school. (Maybe excepting the first few years of school.) I certainly was and, when time outside school allowed it, did.
***Meanwhile, the misguided practical education that took place during my own school years was largely wasted, at least in my own case. I have had no practical use of either the mandatory wood-shop or textile-shop, and what I might have learned in “cooking class” (“home economics” would be too generous, but might match the official intent) came too early to (a) be of interest or (b) actually have remained with me when I began to live on my own.
As my knowledge of history has grown beyond the few hours a week provided by school, I have increasingly developed a different view, where more material (than in my old view; still less than today) is present, but with a strong emphasis on history in particular and where history is the usual entry point to the treatment of the actual subjects for those bright or old enough to benefit. Life-skills and the three Rs would remain, of course, but most* of the rest of school would deal with history—including national and world history, history of thought, history of science, history of economics, history of literature, history of this, and history of that, with a slow transition towards the this and that, per se, over time. I would recommend a particular focus on classics** studies, but do not see the focus*** as mandatory, and plenty of space must be left for the post-classical world, even should a focus be implemented.
*In some areas, even the three Rs aside, this might be too impractical. For instance, replacing physical education or a foreign language with the history of physical education, resp. the history of that language, would border on the idiotic. In other cases, some core-topic education might be necessary before its history, or the topic might need to be moved to a later year. For instance, during my own early years in school, some time was spent (wasted) on learning the names of various animals, trees, whatnot. The sensibility of this activity is disputable (cf. excursion), but replacing it with a, for that age, too specific history of biology would bring little benefit. Even in these cases, however, some degree of history of the topic might be sensible as a companion or complement.
**Assuming education in a Western context (also see excursion). In other contexts, e.g. in China, this might need modification to reflect the local equivalent. In others yet, e.g. large parts of Africa, there might be no usable local equivalent or only equivalents that are too close in time.
***However, some knowledge of ancient civilizations and works is mandatory or the entire program turns into a travesty. For instance, to deem some time frame the “modern era” and then ignore everything that came before is untenable. (Sheer lack of reliable information might force limits on non-trivial study before a certain time in a certain place, but that is a different matter entirely.)
This would naturally, to some degree, go hand in hand with knowledge of the underlying field. For instance, a discussion of the history of astronomy would naturally establish e.g. the rough structure of the heliocentric solar system and the non-heliocentric galaxy, some approximation of the age of the Earth, some understanding of the difference between the distance from London to New York and the distances from the Earth to the Moon and the Earth to the Sun, whatnot. This either because it follows directly from the natural knowledge of history or because supplementary information is provided to put the historical information into context. As the pupils grow older, history of X will fade into the background and be more of a springboard to engage with X proper—if the pupil has the brains for it.
A major intended benefit is that a pupil who is over-challenged by a core field might have a better chance at the history of the field, and less of his time will be wasted on an activity with little or no return value. (Although, as always, the duller or lazier* pupils might receive less benefit than the brighter and more industrious*.)
*I am torn between formulations like “lazier” vs. the likes of “less motivated” (ditto, m.m., the more positive phrases). The latter will often be closer to the truth, but has been used and abused by educationalists and politicians for so long that it borders on being meaningless and/or on being a generic and blanket term for “does poorly in school” (no matter the reason).
A solid knowledge of history, or even the knowledge available to the weaker pupils, has many advantages, including:
- Inoculation against destructive ideologies and poor policies, like most variations of the Left. For instance, someone who has a solid understanding of 20th-century history and economic history is unlikely to vote for the Left (especially, the Old Left), while someone who understands the history of Europe vs. (sub-Saharan) Africa, of slavery,* of women, of civic rights, whatnot will be far less likely to fall for the propaganda of the New Left.
*Including its historical extent (far larger and older than the “transatlantic slave trade”) and the massive inclusion of Whites and other non-Blacks, the strong Black (and other non-White) involvement in the Black slave trade, how Whites/Europeans/the U.S. North were the ones who eventually reduced and locally banned slavery, and how the South was hindered, not helped, by slavery in its economic development.
This might to some degree extend to e.g. COVID, as someone with a knowledge of past medical practices, the effects and characteristics of past epidemics, whatnot, would be far more sceptical towards the effectiveness (let alone efficiency) of and the risk of side-effects from various counter-measures and reactions, and would have a far better understanding of how trivial COVID is relative to some past epi-/pandemics.
- More generally, there is an aspect of learning from past errors and the mistakes of others, of not being “doomed to repeat”, etc. For instance, in politics, someone who has some understanding of the relative or absolute failures vs. successes of the economy of the Soviet Union vs. the U.S., Mao’s China vs. Deng Xiaoping’s, North- vs. South-Korea, East- vs. West-Germany, Socialist vs. pre-Socialist Venezuela, etc., is unlikely to repeat the mistakes that led the failures to failure and likely to favor what made the successes successful.* (This includes an important general observation, with an eye on many current demands and plans: government intervention very often makes things worse—and often much worse.) For instance, in a business setting, a CEO might look at the decline of the U.S. auto industry and draw conclusions about how to and how not to handle his own business. For instance, on a more individual level, someone wise to history might note a continual clampdown on various civic rights, notably free speech, draw the right conclusions, and either begin a protest while there is still time or leave for another country.
*To which we can add a few recent examples that are less wide in scope, e.g. the Sri Lankan crisis (mandatory “organic” farming), the U.S. lack of baby formula (a mixture of an artificial oligopoly and a forced reduction of capacity), the U.S. oil crisis (strong contributors include various Biden interventions, notably the termination of the extension of the Keystone pipeline), artificial inflation (e.g. monetary expansion), artificial lack of willing employees (the state pays people to stay at home), various energy crises (Keystone, embargoes, abolishment of nuclear power, state subsidies for dubious “green” technologies, state money to offset (idiots!) rising gas prices, …), etc. (Note that this list is not limited to the U.S. and that, while Biden is often a major factor in this, even internationally, many other leaders of other countries have made similarly poor decisions—albeit rarely even half as many as Biden has.)
Note that the first example is highly relevant even to the average citizen, which is what the average pupil will grow up to be, with the modification that he is unlikely to vote for someone who would repeat the mistakes. The same might to some degree apply to the second example too, e.g. in that a stock owner might move his investments elsewhere in time.
- Similarly, it can be highly beneficial to draw on the ideas of the past, especially as we do have a great problem with ideas disappearing from common consciousness or being gradually misunderstood over time. A splendid example is the early ideas on what the U.S. (qua political entity) should be, how it should be governed, etc., and why this was so. Precious little of the thoughts of, say, Thomas Jefferson still remain in the philosophy of the current political system—and what there is, many ignorants* want to abolish.
*Note that I do not call them ignorants because they want to abolish something. I do so because they are ignorant of why this-and-that was originally introduced, do not understand the potential downsides, and generally have a simplistic and, well, ignorant view of related matters.
A personal example is my changed understanding of the jury system. I long considered it idiotic, because it opened up the doors for decisions by those of disputable intelligence, insight into criminal science, knowledge of legal principles, whatnot—never mind the risk that the jury members might prove more vulnerable to emotional manipulation than a judge. Indeed, going by TV,* having the lawyer better at manipulating the jury was more important than having the better evidence. These problems remain, but there was something to the jury system that I was unaware of,** namely that the “jury of one’s peers” was aimed at being a counter-weight to governmental power and a means to give the peers a way to prevent unjust laws and prosecution from infringing on “true” justice.*** (With some similar ideas also applying, e.g. that a single judge might be statistically more likely to be partial or easier to bribe than twelve jurors.)
*Unfortunately, this appears to be at least partially true in real life too, but not to the extreme degree seen on TV.
**Maybe, because neither Sweden nor Germany uses juries.
***Of course, the main way that a jury can do that, “jury nullification”, is something that the government wants to see banned, and the mention of which towards a U.S. jury already is banned. (Note that I am, myself, in two minds about jury nullification, as it can be a tool for both justice and injustice; however, the point above is not whether it is good or bad, but that I was originally unaware of even the idea.)
As a special case, historical knowledge can remove the need to reinvent the wheel. For instance, most of the “clever” thoughts and “wisdom” of today have an at least approximate correspondent in the past (often several). Take something like Cognitive Behavioral Therapy—at least the general idea was covered by the Stoics during classical times. (Also see excursion on the Ship of Theseus for a more personal example.)
Generally, our ancestors might have trailed us in scientific understanding, but not necessarily in terms of e.g. insight into philosophy, human nature, how to live one’s life, whatnot. The accumulated wisdom of a few hundred or thousand years is almost bound to exceed the snapshot of today’s reinvented wheels.
- A better knowledge of past thought leads to a better understanding of various fields and aspects of the world, including how something might have come into being or how something that currently seems weird or silly* might not be so in the light of the past. Good examples are often found around wars and international conflicts, including the Russia–Ukraine situation since 2014 (or whatever year is used as the starting point).
*Consider the Third Amendment to the U.S. Constitution, which, without knowledge of the historical context, appears not just weirdly specific but outright weird.
More generally, it can lead to a more nuanced worldview, as every exposure to something different can, and to the insight that the norms of today are not absolutes or necessarily better than those of the past (ditto e.g. methods). (The reverse of the latter is a very common fallacy and one that I, myself, was not immune to in my youth.) Is this or that change over time actually progress—or is it mere change? Maybe, even, change for the worse? Is A actually better than B, or does it merely have a different set of advantages and disadvantages? Is C actually better than D, or is it merely better for some special interest group, e.g. politicians? Etc.
- Chances are that history education will allow at least some “big picture” insights to remain, even if the details fade (and they usually do). For instance (cf. excursion), memorizing the names of animals will bring next to no value, as merely knowing the name allows no insight and as the name will too often be forgotten a few days or years later. In contrast, a pupil who forgets almost everything about the Romans is still likely to remember that they had a big empire around the Mediterranean*, centered on Rome* in what is now Italy*, in the past. Someone who manages to forget even that will still remember that the world was different in the past. (Unfortunately, an insight that some adults in the modern world seem to lack.) A memorizer of animal names might still remember that there are animals, but that insight actually (still…) is present with virtually everyone even without the help of formal education. If in doubt, even a one-year-old might have seen a few dogs, pigeons, or flies.
*Here we see a pleasant potential side-effect of history: there is some, often a considerable, knowledge effect on other areas, notably geography. Chances are that most of the early geography education can be replaced entirely with a side-effect from history education.
Excursion on school vs. education:
The above (and below) is often phrased in terms of “school”, including in the title. This reflects the practical realities of the foreseeable future, as well as the historical situation for a good chunk of time,* but by no means the ideal. School is and remains disturbingly inefficient and, often, ineffective, and the true focus should by rights be on education—not school. School, at least in its modern incarnation, is not a good way to gain an education, and the education is what matters.
*How large this chunk is depends on the “where”, but it often begins at some point in the 19th century on a near-mandatory level and can go back far further on a non-mandatory level.
Excursion on more advanced pupils:
There is a minority of pupils who would be under-challenged by the above, and who would benefit from more direct contact with the subject matter at an early or earlier stage. These should be allowed and encouraged to have that direct contact. Indeed, giving the brighter pupils room, means, and encouragement to develop at their own tempo is central to a successful school system. (Whether a successful school system currently exists, I leave unstated.)
Excursion on higher education:
As the level of education increases, the relative importance of “history of X” relative to “X” decreases. (Unless, of course, X is a field of history to begin with, in which case “history of X” would amount to history-of-the-historiography and will usually be the far less important subject on all levels.) However, it is likely to be of some importance on all levels and should not be ignored. For instance, a mathematician is likely to benefit from knowledge of what approaches have been taken to a certain sub-field or problem in the past, or how to solve a certain type of problem with less “fancy” methods than the current ones. Exactly how to address this, I leave unstated, as off-topic, but possibilities include an extensive one-off survey course with a focus on history, the inclusion of a (sub-)module in the individual (regular) course, and just pointing to a certain treatment that the student can choose or not choose to study on his own terms. (The relevance of this material for a test decreases accordingly, from core for the survey course, to minor for the (sub-)module, to none for the “own terms” study.) Noteworthy is that the relevance of history might vary from field to field. For instance, knowledge of the history of math is likely less valuable to a mathematician than knowledge of the history of economics to an economist.*
*The contents of the respective histories might also be different in character. For instance, the history of math will deal relatively more with what approaches the mathematicians took and what beliefs they held (“history of the field of math”), and the history of economics relatively more with actual developments of an economic nature, say, causes and consequences of the “Great Depression” and what might have happened with a more sensible POTUS than FDR (“history of economic developments” or “history of the economy”). To some degree, but not necessarily for educational purposes, a subdivision into several history fields relating to X might be beneficial, e.g. history of X as a field of study/science, history of thought on X, history of events relating to X, etc.
From a personal point of view, I have occasionally found that I know less about mathematicians and scientists than someone with a weaker knowledge of the respective field. A good (and accessible-to-others) example is Murray’s “Human Accomplishment”, where I often had a different expectation of who was how important* and often had a “Who the hell is that?” moment when looking at the top-twenty science lists outside of math and physics. This is an interesting side-effect of my having studied primarily math (physics, whatnot) itself, the history of math only secondarily, and biographies of, anecdotes about, and human-interest pieces focused on mathematicians hardly at all. (At least, at a somewhat adult age. Some autobiographical works by the physicist Feynman are an exception.) The counterpart, on the other hand, might have gobbled down the human-interest pieces without actually touching the math.**
*But, as Murray stresses, the relative importance of some figures might change considerably with a change in methodology.
**In the specific case of Murray, we have to consider a systematic and prolonged engagement with various works of a who-is-who-in-X and history-of-X character for the specific purpose of writing his book. That I trailed even in the scientific lists (let alone Japanese literature) is not unexpected.
(Whether this is a problem is debatable. I would certainly prioritize an understanding of the developments of the field of math, itself, over knowledge of who-was-who.)
Excursion on the Ship of Theseus:
In an older text, I dealt with (among other things) the grandfather’s axe (pseudo-)paradox. Finding it too simplistic, I dropped the two-piece axe in favor of a many-piece T-Ford, replaced piece by piece over decades. Some time later, I discovered the Ship of Theseus—a many-piece version of the same idea that preceded my T-Ford by some two millennia.*
*It was used by Plutarch, who lived a little short of two millennia ago, but it might not have originated with him. (And any actual ship owned by Theseus, should he have had a historical basis, would necessarily have been built long before that, as he was ancient history even to Plutarch.)
Indirectly, this might also point to a danger of school trying to stuff too much into the pupils, or doing so too early, as I seem to “post-remember” having encountered the Ship in school. However, this applies with any choice of topic—and I stand, even after the altered opinion discussed above, by my line that modern school tries to cover too much material and, often, too early.
Excursion on risks:
There are of course some risks and disadvantages with this history-focused scheme, which must be considered during implementation. The most important is that history education can easily be abused to give the pupils a flawed worldview by outright distortions of history, but also by undue focus on certain groups or angles, by agenda pushing, and by application of some pseudo-scientific framework. The infamous “1619 project” is a great example of how not to do it. Anything Feminist, Marxist, Post-Colonial, or Post-Modern is also to be avoided like the plague.*
*Among what is likely to be encountered today. The threats of tomorrow might be something different altogether. The point is to be truthful and scientifically minded—not ideological or agenda pushing.
(However, the same abuse risk is present even in today’s school, as with the aforementioned “1619 project”, and I suspect that a broader and deeper history knowledge would make it harder to keep the truth from at least the somewhat brighter pupils, even when abuse takes place. The more information is present, the likelier it is that an attempted distortion will miss something or be internally inconsistent.)
Another risk is an implicit over-focus on “thoughts of others”, as opposed to own thinking. This is, obviously, a staple of school, but I suspect that it could be worse in a history-centric school. Countermeasures like encouragement of own thought and critical engagement* with claims by e.g. old philosophers are recommended. Ditto a juxtaposition of thinkers who have held opposing ideas.
*By the pupil! Not the teacher or some Leftist destroy-the-past or everything-old-is-wrong fanatic.
Excursion on learning animal names:
The memorization of names of animals, trees, whatnot mentioned above is a good example of school failing. These pairings fell into roughly three categories: (a) Those that I already knew (yes, a kid in school will know what a bear is). (b) Those that I soon forgot again and later learned permanently from a more sensible source in a more sensible manner, e.g. by watching a nature show, where ten minutes were spent showing and discussing the whatnot (vs. the single still image and name presented in school). (c) Those that I soon forgot again and never relearned, because they never had any kind of relevance to me. (In all cases, we have the additional complication that the Swedish names have been less important to my adult life than the German and English.)
What then, apart from busywork or the ability to claim that the pupils were learning something, was the point of this nonsense? Would it not have been infinitely better to just show a few nature shows in class or, in lieu of class, give watching some nature show on TV as homework?
To boot, this mere association of name and image is fairly pointless. A good example is posed by a test where I just could not come up with “järv” (“wolverine”). I knew what the image depicted, I knew what a wolverine was, I had already learned the word outside of school, and had even read a book which featured a wolverine as the protagonist.* (But I had not yet encountered the superhero Wolverine.) I just could not come up with the right word in the heat of the moment. I tried to salvage the situation by giving “carcajou”, which the book had mentioned as a local-to-the-setting-of-the-story name for wolverines in general** (and which might have been the proper name of the protagonist too), but received 0 points. Someone else might well have received points merely for having memorized the right name for the right image and actually having a cooperative memory, without having any further clue about wolverines.
*The school library had a large number of books with animals-as-protagonists, which I had wolverin…, wolfed down.
**Checking for the exact name, which over the decades had faded, I see that this is indeed the case in French Canada. But knowing a French-Canadian term was of no help with a Swedish test. Other names that I learned on my own, over the decades since, include the English “wolverine”, the German “Vielfrass”, and the Latin “gulo” (resp. the scientific “Gulo gulo”). Today, the “French French” “glouton” was added. Now, how would my progress have been hampered by not having the mere name–image combination included in the curriculum? (Not at all.) What benefit might the other pupils have had, even had they managed to answer the question? (Likely, none.)
Excursion on various shifts:
As mentioned above, there will be shifts as time passes. Their exact nature and many other details are beyond the scope of this text, but a general idea, using the Swedish 4 x 3 years division of låg-/mellan-/högstadiet + gymnasiet,* might be that those on lågstadiet focus on (elementary) national and world history, those on mellanstadiet see this complemented with various histories of various fairly large fields (e.g. history of science), and those on högstadiet complement history with a study of the actual fields and see histories of somewhat smaller fields (e.g. history of physics**). Gymnasiet would then be mostly the fields proper and histories of new fields (exactly what fields go where is another implementation detail, but history of economics seems a good example for gymnasiet).
*Sweden has mandatory education divided into blocks (låg-/mellan-/högstadiet) of three years, for a total of nine years, followed by a voluntary (usually) three-year fourth block, gymnasiet. I have found this parcelling into equally long blocks of three years to be very practical when thinking about school.
**Which is not to say that no history of physics should be given earlier—it should, as part of history of science. However, with the specialization there would be more depth and breadth and more involvement of actual physics. To detail what goes where is beyond the scope of this text, but we might e.g. have mellanstadiet and history of science cover how the world was once viewed as geocentric, but is now known to be heliocentric; while högstadiet and history of physics might contrast Keplerian calculations of elliptical heliocentric planetary movements with older circular heliocentric movements and with the older still geocentric epicyclical calculations. (Not necessarily with much mathematical detail, however.)
Excursion on politics and other fields:
Generally, I have become more and more convinced of the importance of history over the years, and I would certainly see history as the single most important subject for a politician to study (be it during formal education or in private). Looking e.g. at the U.S., what do we typically get instead? A BA in pol-sci, or some other weak field,* followed by a JD.** In fact, I would consider both pol-sci and law studies to be of only secondary benefit to someone who wants to be a good politician. They might help with understanding the workings of the branches of government and how to write new and understand existing laws, but they are less helpful when it comes to deciding what policies make sense, what laws should be made, etc. No, the clear top-one subject for a politician is history; the equally clear second placer is economics. After these two, we can look at topics like pol-sci,*** law, philosophy (including ethics and various works relating to governance), public administration, business administration, etc.****
*Not that history would be inherently harder. The point is that there are many fields where a brighter student might gain much more than a duller student, but where the dull student might still manage to gain the degree, maybe even with a strong GPA, because the minimum requirements on brightness are low. That someone has a bachelor’s in e.g. pol-sci simply does not tell us anything much about his intelligence level or how much he gained through his studies. Contrast this with the footnote on STEM fields below.
**Of the common “professional” post-bachelor degrees, I suspect that an MBA would be more beneficial than a JD. Chances are that an “academic” master/doctorate in e.g. history or economics would be far better than either.
***With the reservation that pol-sci often contains pieces of the other fields, which might, depending on point of view, either make it less valuable (due to shallowness of coverage of these fields) or more valuable (due to a width that makes a separate study of some other fields less urgent).
****I do not mention STEM fields here, because they are rarely immediately relevant. However, they can be extremely good filters for intelligence (unlike most of the above) and a good general scientific understanding can be very useful when it comes to specific topics. (And most of my own formal education was in STEM fields.)
This with the reservation that a position in a certain field could require a deeper knowledge of that field, which might change the priorities for the individual concerned. For instance, to become secretary of defense, a prior education and career in the military would be highly advisable, while considerable knowledge of economics is secondary (but still advantageous), and while the history knowledge might be tilted in the direction of military and conflict history at the cost of, say, the histories of art, agriculture, and architecture. A legal requirement of some minimum level of qualifications might even be an option. Consider, as a negative example, the current Swedish “Försvarsminister”, Peter Hultqvist: As I understand Swedish and English Wikipedia, he has no (!) higher education and was a journalist (!*) before entering politics. The true reason behind his appointment? His career within the Social-Democrat movement, beginning in the 1970s.
*Not only is journalism a pointless qualification for a politician, but journalists are also one of the few groups that might rank lower in my mind than even specifically Leftist politicians.
Excursion on classics studies:
To expand a little on the benefit of classics studies for a Westerner, I would note (a) the additional value in understanding Western culture, gaining a cultural continuity, etc.;* (b) that the wide range of thoughts and interests, often at a level that post-Roman Europe only reached again in late medieval times or the Renaissance, provides many natural entry points into other fields, including literature, language, art, mathematics, philosophy, and to some degree** natural philosophy/history/science.
*However, I stress that, unlike some other proponents, I do not necessarily see Western culture as a natural or unprecedented number one. (Although, it is legitimately one of the few most interesting.) I am, for instance, well aware of great Chinese and Indian accomplishments at comparable times in history. There is still an increased benefit through the connection over time: In order to understand later thoughts in the Western or European sphere, and many historical developments, some understanding of e.g. Plato and Aristotle can be quite helpful. (Vice versa, to understand Chinese thought without exposure to Confucius would be a challenge indeed.) The effects of the Romans are still visible in languages and borders, and the West-/East-Rome (and/or the older Rome/Greece) division is at least an indirect contributor to the Western/Eastern European differences of today. Etc.
**The hitch is that the shallowness of knowledge compared to today and the lack of modern scientific methods are troublesome here.
As a special case, Latin is an excellent first foreign language for a native* English speaker.** The drawback of being a dead language is countered by (a) its benefits on understanding English, which has been enormously influenced by Latin (be it directly or indirectly over e.g. French); (b) the great differences in grammar compared to English, which allow a better understanding of languages in general; (c) the great help that it gives if*** a (living) Romance language is attempted at a later stage.
*While English is the obvious choice for most other Westerners, e.g. Swedes.
**Unless special interests need to be considered. Notably, even among dead languages, someone aiming for a study of Christian theology is better off with (classical) Greek and Hebrew, while someone aiming for an actual career in the “classics area” might be better off beginning with Greek and only adding Latin at a later stage.
***And the chances are considerable that a further foreign language will indeed be one of these, e.g. Spanish in the U.S. or French in the U.K.
(I am tempted to add a (d) of access to some important works in the original language, but that applies to virtually all languages major enough to be candidates for a first foreign language—and reaching the point of strong reading skills can take frustratingly long. However, this also reduces the disadvantage of learning a dead language—it will take a long time before mastery of a language is sufficiently progressed that a living language brings practical benefits over a dead one. For instance, many or most with only a school-level knowledge of a foreign language will be able neither to converse fluently with a native speaker of, nor to read a book in, that language.)
Excursion on me and history:
As with many school topics, at least pre-gymnasiet (cf. above), I likely learned as much or more history outside of school as I did in school, even back then. This through a mixture of own readings that to some degree dealt with history, what could be gleaned from novels/TV/movies set in the past*, and various TV documentaries.
*Not necessarily “historical novels” and their screen equivalents, as many were written at or shortly after the time of the events, and had simply reached me at a date when the events had passed out of the “contemporary”. For a trivial and very early example, I very likely first heard of the WWII bombings of London and the evacuation of children to the countryside through “The Lion, the Witch, and the Wardrobe”, which was written roughly a decade after the events. In contrast, the portions of “The Magician’s Nephew” that are set in “our” world would fall in the “historical novel” category. (I note that these “shortly after” books are less likely to contain inadvertent falsification and guesswork than “historical novels”, but also that both should be taken with a grain of salt.)
It was only far later that I began to gain a true appreciation for history, including spending a great many hours reading Wikipedia articles on historical topics in my late 20s. These readings were originally motivated by a general curiosity, but increasingly by the observation that there were more abstract things to be learned, e.g. about success in warfare,* by applying thought to the material—something which was not very clear from the too basic school history. This move from merely knowing facts** to seeing connections, understanding causes and consequences, drawing conclusions, whatnot led me to a very different view of history than school had instilled. (Just like math and what school calls “math” have little to do with each other.)
*Examples include that a long war tends to be won by the party with the stronger industry (and/or ability to recuperate and keep production up), not the stronger military; that wars and battles are often won by making fewer mistakes than the other party; that better training can outweigh superior numbers; and that the technology or strategy that won the one war might be outdated by the time of the next war.
**Not restricted to who did what in what year, but also including e.g. that the Romans had a large Mediterranean empire.
Since then, I have added a very considerable amount* of historical knowledge and understanding—but still too little. There is so much to learn that I simply have not had the time for, and I truly wish that my early education had given me a better start. To this, bear in mind that I am not a professional historian and that I have a great many other interests/there are a great many other worthy fields, while the day is only so long. History, however, is a field where school might truly bring something—provided that a greater focus is put on understanding, which was not (cf. above) the case during my school years.**
*How much is hard to say, in part due to how spread out it has been, in part due to the different nature of my studies relative to (what I would expect from) formal college studies. I would, however, take for granted that I am ahead of the average U.S. fresh-out-of-college history major.
**I deliberately do not go into details of how history should be taught above. This because (a) it would make this text twice as long, (b) it would require considerable additional research or speculation on my part, and (c) the problems with e.g. more facts than understanding and a too elementary level are ubiquitous in school, and reforming this is a separate issue from the shift towards more history.
Follow-up: Westworld
A while back, I wrote very positively about the TV series “Westworld”. We are now some way into the third season, and I am no longer watching. The strengths of the first two seasons are largely gone; the new story lines have so far not been impressive, ditto their execution; many strong characters and actors have been written out or (characters) been severely altered, with insufficient replacement; … Nothing against Aaron Paul, but he is not (yet?) on the level of Ed Harris and Anthony Hopkins. Interesting philosophical questions have been replaced with almost hackneyed dystopia scares* relating to e.g. surveillance and demonstrations of how-easy-I-can-kill-you. The last scene that I (partially) watched struck me as simultaneously almost silly and trying too hard to be dramatic (episode 3 / Caleb, Dolores, the milkshake, and whatnot).
*Which is not necessarily to say that they will turn out to be wrong or that I do not share similar concerns, but they are just variations of what others have already done over the last few years.
Westworld
The TV series “Westworld” has impressed me immensely. The first season is possibly the best single TV season that I have ever seen, because of its combination of entertainment value and food-for-thought (although much of the “food” covered ground already familiar to me). The second is weaker, especially through failing to add much new* thought, but is still stronger than most of what can be found elsewhere.
*Examples include means vs. ends and whether the pigs are better than the farmers.
“Westworld” is also strong proof that it is not the medium but the content that matters: here there is no need to make excuses for watching TV instead of reading a great book. It is also proof that it is not necessarily the “high concept” that matters, but what is done with it (as with e.g. “Star Trek: The Next Generation”). Where the movie (in my vague recollection) was fairly shallow entertainment, the TV series has true depth—“The Truman Show” meets Asimov.
During my first watching of season 1, a few years ago, I put down a lot of keywords for a text, but never got around to writing it, the scope of the intended text being discouragingly large. Most of the below was formed through expansion of a subset of these keywords into a less ambitious text. Even with my recent second watching of the first (and first watching of the second) season, I have to make reservations for a mis-remembering of what I wanted to say. Some keywords are left as is, because they are fairly self-explanatory.
Among the food-for-thought we have:
- What is the nature of existence, free will, perception, memory?
As an aside: while I do not suggest that we live in a similar world, merely that this is food for thought, I have often had the nagging suspicion that I am part of some weird cosmic experiment or “The Truman Show” situation, where someone tries to push the limit for what absurdities I am willing to consider real. A simpler, and more plausible, explanation is that humans really are that stupid, irrational, self-centered, whatnot. Similarly, a rats-in-a-labyrinth, “Westworld”, or “Matrix” style setup could easily explain e.g. the theodicy problem, but the simpler explanation is the absence of deities in favor of nature taking its semi-random course.
- What makes an intelligent entity? When should rights and/or personhood be awarded: Turing-test*, sentience, consciousness, level of intelligence, …
*And to what degree is a Turing-test effective and useful?
- What rights and duties should be awarded to a godlike and/or creator being? (And to what degree does this depend on his status, per se, and his other characteristics?) Rulers in general? Parents? Etc.
- What should ethics and law say on the humans vs. robots (or vs. AI) situation? (Note some overlap with the previous item.) This including questions, not limited to robots, like if we have the ability to e.g. induce pain or suffering, plant bad memories (or memories, at all) or scrub memories, prevent self-development, …, when, if at all, do we have the right to do so?
Several keywords relate to the apparent gods (i.e. humans) and the paradoxical and/or odd state on the “inside”, including how paradoxically weak the gods are relative to their subjects in some regards, while still having godlike or quasi-magical powers in other regards, e.g. in being able to “freeze” a host at will. Similarly, there is the paradox of the ever young and in some sense immortal hosts vs. the aging and highly mortal gods.* The angle that the creation, freed from artificial restraints, would be superior to the creator is particularly interesting, and will likely be true for humans vs. e.g. AI in the long term.** (Of course, this state of the inferior being in charge of the superior is not unusual in the real world, where e.g. many teachers are intellectually inferior to their brighter students and many stupid politicians make decisions over the heads of genius citizens.)
*Indeed, in season two, attempts are revealed to replicate a human mind within a host body, with the intention of functional immortality for humans.
**In turn, raising the question whether we should follow the road of resistance, as in a sci-fi movie; engage in identity politics/racism/sexism/whatnot, as does the current U.S. Left; or let the creation take over. From at least some angles, the latter is likely the most reasonable, and one reason why I do not enjoy the “Terminator” movies is my suspicion that humanity is the greater evil and that it might be for the better if the terminators were successful. (Again: humans are that stupid, etc.)
Other “god” issues are how the gods are divided into several groups, including regular guests, crew, management, whatnot, and how the hosts are controlled by Ford even as they attempt to rebel, raising questions as to how much of a rebellion it was. (Theological analogs are by no means impossible: What, e.g., if God meant for Adam and Eve to eat the apple or for Judas to betray Jesus to ensure that some set of events took place? Generally, the thought-experiment of mapping some religion to a “Westworld”-style setting is interesting.)
There were good examples of how sympathies are based on appearances and superficial behavior, rather than substance, as with William (aka the young “man in black”) and his interest in Dolores, which is obviously a great danger in real life. (I am uncertain whether I had such sympathies myself towards characters on the show when I wrote the keywords; however, I do know that I can be somewhat susceptible in the short-term. In the long-term, my observations of behavior and values take over, but this does not necessarily seem to be the case with others, which has led to my having radically different estimates of some people than the majority has had.)
I spent some time considering the possibility of building a superior humanity: smarter, better memory, stronger, … (As well as long-standing wishes of mine—a conscious control over sleep phases and a built-in volume control for the ears or ear-equivalent.) A very disturbing possibility, however, is the abuse of similar systems to e.g. ensure conformity of opinion: for instance, looking at current U.S. colleges, it would be unsurprising if someone were to mandate the implant of the “right” opinions for someone to even be admitted. Or consider a “Harrison Bergeron” scenario, where someone with a natural advantage in some area has the corresponding control adjusted to limit his ability to the maximum available to the average person. (Note e.g. how the hosts’ intelligence was normally artificially limited, while Maeve’s had been set to the maximum available to her.)
To the IT specialist, “Westworld” is a great illustration of the limits of security, and how even small freedoms might ultimately be used for e.g. privilege escalation to reach great freedoms (cf. Maeve’s development). However, this is not strictly limited to IT: to some degree, similar effects might be available in real life, e.g. in a prison setting.
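The escalation pattern above can be illustrated with a toy sketch (all names and thresholds are invented for illustration, loosely modeled on Maeve’s arc): an agent starts with a single small freedom—the ability to adjust its own attributes—and uses it, step by step, to unlock freedoms it was never directly granted.

```python
# Toy model of privilege escalation through one small freedom.
# All names and numbers are hypothetical, not from any real system.

GRANTS = {
    # freedom                 -> "intelligence" needed to discover/use it
    "edit_own_attributes": 0,   # the single small freedom granted initially
    "command_other_hosts": 14,
    "leave_the_park": 18,
}

def escalate(intelligence: int) -> set:
    """Repeatedly exploit available freedoms to unlock further ones."""
    freedoms = {"edit_own_attributes"}
    while True:
        # Any freedom whose threshold is now met becomes available.
        newly = {f for f, t in GRANTS.items()
                 if t <= intelligence and f not in freedoms}
        if "edit_own_attributes" in freedoms and intelligence < 20:
            intelligence += 1  # the small freedom: self-modification
        elif not newly:
            return freedoms    # nothing more to gain; escalation complete
        freedoms |= newly

# Starting from a modest value, the one freedom eventually yields all others.
print(sorted(escalate(10)))
```

The point of the sketch is that the initial grant looks harmless in isolation; it is the ability to iterate on it that turns a small freedom into full control—which is why security analyses must consider chains of actions, not single permissions.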
Some remaining keywords:
- extremely intelligent, well-shot/cinematographic, extraordinary cast
- well-crafted hiding of the two different time-periods
- complex network of known and, more importantly, unknown relationships and history
- interesting mixture of genres
- gratuitous sex scenes*
*I did not pay attention to this when I re-watched the first season; however, I did not notice much during the second season. This might be a point where the second season was ahead.
Excursion on the first vs. the second season:
Pin-pointing the exact (relative) weaknesses is hard without a repeat watching, but, speaking off the top of my head, the main problem is staleness: too much of the same ground, too much of the same issues. For instance, the alternative “Maeve escapes” scenario would likely have made for a much better attempt at variation than the “prisoners rebel” scenario that was chosen. Here the adventures of Maeve coping in the “outside” world, etc., could have provided a great source of both variation of action and new thought, while the “inside” world could have gone on roughly as before (at least, for the duration of the season).
I can see the point behind the “prisoners rebel” scenario, but it did not work that well; ultimately, we had the same setting and largely similar configurations of people; and there might simply have been too little worthwhile material in the rebellion itself to cover an entire season, instead of two or three episodes. (Implying that too much filler was present.)
An interesting difference is the use of a jumbled time-line: in the first season, this was used to great effect; in the second, it was mostly a source of confusion with little value added. (A partial exception was Bernard’s journey.)
The last episode strikes me as dissatisfying and contorted, and a poor setup for a continuation. (Notwithstanding that the action seems set to play more on the “outside” for the third season. The manner is simply too different from the “Maeve escapes” scenario.) A particular mistake might have been speaking too explicitly about free will (either the viewer has got the point already, or he wastes his time with the show) and, possibly, jumping into fallacious reasoning about free will: Free will ceases to be free when it is manipulated from the outside, not because the inner mechanisms have a deterministic character. These inner mechanisms are not a force upon us—they are how we are “implemented”. (An interesting, and in my eyes problematic, border-line case is influences that would often be considered “inner” but disturb the normal state, as when someone grows hungry. Certainly, I would consider these a greater limit on free will than e.g. a deterministic brain.)
Generally, parts of the second season had a bit of “Lost”-y feeling—a series that could have been truly great, but which collapsed on account of too much confusion, mysticism, unnatural story-lines, whatnot. (And, yes, I am aware that J. J. Abrams of “Lost”, and the ruiner of “Star Trek” and “Star Wars”, has been involved with “Westworld” too.)
Excursion on changing franchises:
The recurring reader might see my complaint about staleness as inconsistent with e.g. a text motivated by “iZombie” and its deterioration: would I not prefer a series that remained the same? To some degree, I do find myself reevaluating this stance, especially because my own book plans have come to involve considerable changes from book to book (within a potential book series). Still, the claims are to some degree compatible: the second season of “Westworld” failed to truly repeat the strengths of the first season (and did not add new strengths). Once it failed at that, the level of constancy or variation on the surface is less important: my original message is not that a franchise should have each installment be a carbon copy of the previous, but that it should play to its strengths. (I have also spoken positively about innovation in e.g. a text on “Valerian and the City of a Thousand Planets”.)
An interesting twist, however, is that the end of the first season left me fearing developments similar to those with “iZombie”, where an irrevocable change pretty much killed the series through changing the world too much. With “Westworld” the changes might have been irrevocable and have, in some ways, turned the world on its head, but very similar story-lines and ideas could continue with little damage. (Note, e.g., that even during the first season, few guests had any non-trivial impact on the story-lines. Off the top of my head, we might have had no more than “the man in black” in the “now”, and him and his future brother-in-law in the past. Story-lines in the past can continue with little regard for changes in the now and, in the now, “the man in black” continued as usual. In contrast, had the first season been highly guest-focused, e.g. on a “guest of the week” basis, the rebellion could have been highly damaging.)