Michael Eriksson's Blog

A Swede in Germany

Archive for November 2018

WordPress and mangling of quotes

with 4 comments

Preamble: Note that the very complications discussed below make it quite hard to discuss them, because I cannot use the characters under discussion and expect them to appear correctly. Please make allowances. For those with more technical knowledge: The entity references are used for the characters that, in decimal Unicode, are 8220 / 8221 (double quotes) and 8216 / 8217 (single quotes). The literal ones correspond to ASCII/Unicode 34, which WordPress converted to the asymmetric 8220 and 8221. (I stay with the plain decimal numbers here, lest I accidentally trigger some other conversion.)

I just noticed that WordPress had engaged in another inexcusable modification of a text that I had posted as HTML by email—where a truly verbatim use of my text must be assumed.* Firstly, “fancy”** or typographic quotation marks submitted by me as “entity references”*** have been converted to literal UTF-8, which is not only unnecessary but also increases the risk of errors when the page or a portion of its contents is put into a different context.**** Secondly, non-fancy quotation marks that I had deliberately entered as literal UTF-8 have been both converted into entity references and distorted by a “fanciness” that goes contrary to any reasonable interpretation of my intentions. Absolutely and utterly idiotic—and entirely unexpected!

*Excepting the special syntax used to include e.g. WordPress tags, and the changes that might be absolutely necessary to make the contents fit syntactically within the displayed page (e.g. to not have two head-blocks in the same page).

**I.e. the ones that look a little different as a “start” and as an “end” sign. The preceding sentence should, with reservations for mangling, contain two such start and two such end signs in the double variation. This is to be contrasted with the symmetrical ones that can be entered through a single key on a standard keyboard.

***A particular type of HTML/XML/whatnot code that identifies the character to display without actually using it.

****Indeed, the reason why I use entity references instead of UTF-8 is partially the risk of distortion along the road as an email (including during processing/publication through WordPress) and partially problems with Firefox (see excursion)—one of the most popular browsers on the web.
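For readers who want to experiment with the two representations, the conversions involved can be sketched in a few lines of Python (a minimal illustration only; the code points are the ones given in the preamble, everything else here is my own example):

```python
import html

# The four fancy quotes discussed above, by decimal code point
# (8220/8221 double, 8216/8217 single; cf. the preamble).
for cp in (8220, 8221, 8216, 8217):
    ref = "&#%d;" % cp            # the entity-reference form, plain ASCII
    literal = html.unescape(ref)  # the literal form (UTF-8 when saved)
    assert ord(literal) == cp

# Turning literal fancy quotes back into entity references, so that the
# text survives transport through ASCII-only or encoding-mangling channels:
text = "\u201cexample\u201d"
safe = text.encode("ascii", errors="xmlcharrefreplace").decode("ascii")
print(safe)  # &#8220;example&#8221;
```

The point of the last conversion is exactly the robustness argument made above: the entity-reference form contains only ASCII characters and cannot be damaged by a tool that mis-guesses the encoding.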

The latter conversion is particularly problematic, because it makes it hard to write texts that discuss e.g. program code, HTML markup, and similar—contexts where fancy quotes are simply not equivalent to plain ones. Indeed, this happened specifically in a text ([1]) where I needed to use three types of quotation marks to discuss search syntax in a reasonable manner—and through this introduction of fanciness, the text becomes contradictory. Of course, cf. preamble, the current text is another example.

This is all the more annoying, as I have a markup setup that automatically generates the right fancy quotes whenever I need them—I have no possible benefit from this distortion that could even remotely compete with the disadvantage. Neither would I assume that anyone else has: If someone deliberately chooses to use HTML, and not e.g. the WYSIWYG editor, sufficient expertise must be assumed, especially as the introduction of fancy quotes is easy within HTML itself—as demonstrated by the fact that I already had fancy quotes in the text, entered correctly.

Excursion on Firefox and encoding:
Note that Firefox insists on treating all* local text as (using the misleading terminology of Firefox) “Western” instead of “Unicode”, despite any local settings, despite the activation of “autodetect”, despite whatever encoding has actually been used for the file, and despite UTF-8 having been the only reasonable default assumption (possibly, excepting ASCII) for years. Notably, if I load a text in Firefox, manually set the encoding to “Unicode”, and then re-load the page, then the encoding resets to “Western”… Correspondingly, if I want to use Firefox for continual inspection of what I intend to publish, I cannot reasonably work with pure UTF-8.

*If I recall an old experiment correctly, there is one exception in that Firefox does respect an encoding declared in the HTML header. However, this is not a good work-around for use with WordPress and similar tools, because that header might be ignored at WordPress’ end. Further, this does not help when e.g. a plain-text file (e.g. of an e-book) is concerned. Further, it is conceptually disputable whether an HTML page should be allowed to contain such information, or whether it would be better left to the HTTP(S) protocol.


Written by michaeleriksson

November 29, 2018 at 8:27 pm

Multiple ideas vs. focused texts

leave a comment »

I have repeatedly encountered claims by authors* that their stories only truly work, come to life, whatnot, when they are based on two or more separate ideas. I have made the same, but more ambivalent, observation regarding my own (non-fiction) texts: The texts that I really have a drive to write, that are the most fun to write, that develop my own thoughts the most, …, tend to be the ones combining two or more ideas. The way of writing described in an older text also almost forces the inclusion of multiple ideas, even when the original drive was not rooted in more than one idea. On the downside, these texts are also the least focused, might give the readers less value for their time, and would be most likely to be torn to pieces if submitted to e.g. a college course on essay writing.**

*Including Stephen R. Donaldson with regard to his “Gap” and “The Chronicles of Thomas Covenant” series. As for the others, I have to pass, seeing that I only took superficial note at the time, and that these encounters stretch back over decades.

**However, I state for the record that I simply do not agree with much of the standard writing advice and that my deviations are more often rooted in disagreement with than ignorance of such advice. This includes, outside of very formal types of writing, keeping a tight focus on a single topic, or focusing on a single side of an issue. (The latter might be more convincing, but should a good essay aim at convincing or at giving insight and attempting to get at the truth?) It also includes “avoid the passive”, which I consider one of the worst pieces of advice a writer can get.

Consider, for a comparatively simple example, a few thoughts around the 2018 Chess World Championships and the fact that all the regular games ended in draws:*

*I originally intended to write a separate text on that matter. Since I sketch the material here, and have an ever-growing back-log, I will forego that text.

On the one hand, I have long observed that, as a sport matures and as we look at higher levels within a sport, (a) players/teams tend to grow stronger defensively at a higher rate than they do offensively and/or differences in scores tend to grow smaller, and (b) the nature of the sport often changes.*

*Consider for (a) the chance of finding, in soccer, a five-goal victory margin or an overall of ten goals in a single game between national teams today vs. fifty years ago, between men’s teams vs. between women’s teams, or between Bundesliga teams vs. between local amateur teams. (With great reservations for temporary fluctuations, the presence of a truly exceptional team, unusually large mismatches in level, and similar.) Consider for (b) how a low-level bowler aims to knock down as many pins as possible, hoping for a strike every now and then, while the pro aims to miss a strike as rarely as possible.

On the other hand, partially overlapping, it seems to me that chess is being crippled by too many draws. Notably, at increasingly higher levels of play, it becomes rarer and rarer for Black to win, implying that the “job” of Black is to get a draw. In contrast, the “job” of White is to win. However, even White’s win percentage is often so low that draw orgies result. Looking specifically at the 2018 World Championships, we also have another negative: The defending champion and eventual winner, Carlsen, was known to be stronger at speed chess than his opponent, Caruana—and in case of a tie after twelve regular games, the match would be determined by a speed-chess tie-breaker. Considering this, Carlsen had a strong incentive to play risk-free games, knowing that a tie was close to a victory and that draws in the individual games favored him. Going by the account on Wikipedia, this almost certainly affected the 12th game.* After twelve consecutive draws, the tie-break came, and Carlsen now won three games out of three… Similarly, even without a favorable tie-breaker, a player who got an early win might then just go for safety draws in order to keep that one-point advantage until the end.

*Going by the other game accounts, there were no strong signs of such caution. (I have not attempted to analyze the games myself, and I might even lack the ability to judge what is “safety first” play on this level.) However, another player in his shoes might well have played with an eye mainly on forcing a tie-break, and have viewed a victory within the regular portion as a mere bonus.

Looking more in detail, my plans did not advance far enough that I can say with certainty what I would have written, except that it would have included a potential rule change to e.g. give Black more points than White in case of a victory and/or to make the split of points uneven (in favor of Black) in case of a draw. The intention would have been to give Black incentives to play harder for a win and/or to make White dissatisfied with a draw, and some discussion of how such rule changes could turn out to be counterproductive would have followed. For the latter, an intended parallel example was the off-side rule in soccer: Abolishing it would lead to more goals, and arguably more exciting* games, if the teams play as they do today, and it could give incentives to play more aggressively through positioning forwards higher up the field to await a pass; however, it could also lead to more defensive play through keeping defenders back even when attacking, in case the ball is lost and a quick counter-attack follows.

*For some value of exciting: I usually find watching soccer to be quite boring.

Here we also have an illustration of one of the problems with more focused texts: If I were to try to divide the above into two (or more) texts, they would each be missing something or be forced to partially duplicate discussions. It could be done. There might even be contexts when it should be done. However, this would entail more work than writing a single text, the result would be lesser in my eyes, and I would, myself, prefer to read the joint text.

The illustration would have been better, had I been further along in my planning. However, consider e.g. how a discussion of the off-side rule in the chess text would have been weakened without a discussion of the more general phenomenon and the context of the comparatively low number of goals in soccer (if in doubt, when compared to e.g. basketball or ice hockey). Goals in soccer, in turn, would be a fairly uninteresting and loose special case without an eye on the wider issue of (a) above. Or consider just discussing the “drawiness” of top-level chess without mention of (b) in general. Etc. For a good example of a text actually written, see [1]: Here a discussion of a specific PED-related controversy is combined with a general discussion of some sub-topics, and then transitions into a questioning of how reasonable the current take on PEDs is. (It could have been two or even three texts, had “focus” been a priority, but, to my taste, the actual text works better.)

Excursion on fiction and multiple ideas:
The above-mentioned claims by authors likely mostly relate to fairly abstract ideas or broad themes that do not automatically point the way;* however, in my experience as a consumer of fiction, the better works often have a number of ideas or concepts of a more concrete kind that combine to make them that much greater. For instance, “Star Wars” without light-sabers would still be “Star Wars”, but it would not have been quite the same. Similarly, “Star Wars” without the Force would still have worked, but … Or consider “Star Trek” and the individual series with and without holodecks. Clearly, the holodeck is not needed, but it adds so many great additional possibilities. It would certainly be possible to make a reasonably good “high concept” series around the holodeck alone. Similarly, “Chuck” basically combines two different TV series—comic spy adventures and the absurdities of a fictional electronics store. Taking just the latter and combining it with the family-life of Chuck would have made for a reasonable comedy series. Taking just the former in combination with family-life would have made for a very good action-comedy series. Having both in one series made it truly great entertainment.

*Donaldson speaks e.g. of a combination of leprosy and unbelief for “The Chronicles of Thomas Covenant”—the road from those ideas to the actual books is quite long and very different books would have been equally possible.

And, no, this is not limited to modern screen entertainment: Shakespeare’s “Romeo and Juliet”, e.g., is not just a tragedy or a love story—but both. It also has more comedy in it than most modern rom-com movies… Then there are life lessons to be drawn, some remarkable poetry, and whatnot. At least the film adaptations by Zeffirelli (outstanding!) and Luhrmann show the room for action scenes.*

*I am uncertain how this could have come across in an Elizabethan theater: On the one hand, the means would have been more limited; on the other, the standard of comparison was necessarily much lower. (Both movies also make remarkable use of music; however, that is independent of the underlying play.)

Written by michaeleriksson

November 29, 2018 at 6:59 pm

Good riddance, CEBIT!

leave a comment »

It appears that the once world-leading German computer fair CEBIT has been canceled: Good riddance! Let other trade fairs follow suit!

While I do not rule out that there are some special cases of fairs that make sense or some minor purposes of a fair that cannot be better solved through other means, fairs are largely pointless—mostly just diverting money to the fair industry and to the local city and its tourist businesses. For others, including regular tourists and “legitimate” business travelers, the effects are mostly negative. This especially through the great troubles of finding hotel rooms during fairs, and the often quite considerable price hikes* that take place on the rooms that can be found. (Note the similarity to the advertising industry, both in purpose and in that it brings more benefit to itself than to its customers—and is usually outright negative for everyone else.)

*During the largest fairs, I have seen prices more than doubled on some occasions.

Going as a consumer* is, judging by my own experiences, fairly pointless: The things that might be interesting to see are what everyone else wants to see, implying that there are queues and crowds. The actual information presented is typically minimal and/or extremely commercial. Information about e.g. products and services is much easier to find on the Internet or through qualified publications in today’s** world. The main benefit might well be the opportunity to get some freebies, e.g. a few magazines—but compared to the ticket price this will rarely be worth the trouble. (And most visitors will also have to factor in travel and hotel costs, etc.) Indeed, I have twice received complimentary tickets to specifically the CEBIT and still chosen not to go, considering the other costs and the time investment too large to make the visit worth the effort…

*If allowed, as with CEBIT: Some fairs are “business only”, which I consider far more sensible, both through creating a greater focus and through reducing the damage to third parties.

**Note that the situation here and elsewhere might have been very different just a few decades ago.

The situation is very similar for those businesses that are there as passive visitors. They might in addition have the option to “check out the competition”, but since they will only see what the competition wants seen, the value is low. There are some networking opportunities, but these face the same popularity issues—especially, as these visitors are likely to be less important players, who bring comparatively little value to the popular targets… Such networking would be better handled by visiting a few conferences, where the participants are better filtered and more time for such purposes is available. Alternatively, a contact service* that matches up businesses with sufficient compatibility in mutual value is likely to create greater benefit.

*I am, admittedly, uncertain to what degree such exist and do a good job; however, I have seen the idea broached repeatedly over the years. If in doubt, creating such businesses and foregoing fairs would be an improvement.

For active participants (i.e. those who have their own stalls and whatnot), the situation is a bit better, but mostly a fair amounts to a publicity opportunity or an occasion to “see and be seen”*. Here we again have the popularity problem—the likes of Apple will garner great interest, while almost no-one will pay attention to an obscure ten-man company. At the same time, Apple does not need to go to trade fairs to get publicity… For that matter, running a product demonstration or a speech over the Internet is not hard, while e.g. putting up a sales brochure is utterly trivial.

*Likely with heavy emphasis on the second part. Indeed, my employer during the dot-com crash deliberately went to computer fairs, including the CEBIT, for the purpose of showing that the company still existed…

At the end of the day, the press and the executives might like fairs, but the benefits compared to the alternatives remain dubious for everyone except the fair organizers, the hotels, etc. For most others, the fairs are an outright negative. For instance, I could have saved many hundreds of Euros and at least several hours of accumulated hotel searches had it not been for the flooding of the Cologne* hotel-market that takes place again and again. Or consider the additional pressure on the (already strained) transport to and from Cologne. Or consider that a very large and central piece of real estate is occupied by the fair area, where there could have been apartment houses for hundreds, likely even thousands, of people, easing the pressure on the over-heated apartment market.

*Cologne is one of the leading fair cities in Germany, and I have spent a part of my freelance career working there. (But the CEBIT, to avoid misunderstandings, took place in Hanover.)

Written by michaeleriksson

November 29, 2018 at 12:38 am

A potential revamping of college tuition

leave a comment »

With regards to college/university there is a subset of problems that could be partially resolved in a simple manner:

  1. In order to ensure a high degree of equality of opportunity and social mobility, it must be possible even for people with low income and little wealth (be it own or when looking at the parents) to gain degrees. (Assuming that they are intelligent and hard-working enough to succeed with the academic side of the equation—few misconceptions are more dangerous than the belief that college creates diamonds out of charcoal.)
  2. Colleges cost money to run, and it is not optimal to finance them through public funds. Not only is the use of “someone else’s” money a bad idea in general, but here those that do not go to college are disadvantaged in an unfair manner.

    Note that this affects the U.S. too, because of the considerable “financial aid” given. Notably, the financial aid is also a driving force behind tuition increases—when the economically weaker buyers of a uniform product are given more money, the sellers have strong incentives to raise prices. The price increase then hits everyone, while only the weaker were given aid, which increases the group that would benefit from aid. To boot, the original aid receivers do not benefit as much as intended, creating a wish for more aid per person. Here there is a risk of a vicious circle.

  3. Academically poor students tend to cost a lot more money than the better students, e.g. in that they require more support outside of lectures and that they are the main reason why the highly inefficient lecture system is still “needed”.
  4. There is a severe over-inflow of students not suitable for college, who force further dumbing down, weaken graduation criteria, etc.
  5. In tuition-heavy countries, colleges have an artificial incentive to let students graduate, pass, get good grades, or even be admitted, irrespective of whether they have actually earned it.
  6. Excessive income, as e.g. with some U.S. colleges, leads to waste, including an ever-growing administration.

    (As an overlapping special case, it could be argued that the U.S. campus system is an evil per se, and that the students would be better off paying directly for own and independent housing, as they do in e.g. Sweden and Germany, rather than paying the colleges to provide housing. Certainly, my impression of the living environment, from U.S. fiction and general reputation, points to it being positively harmful to someone who actually wants to study, which would make it a doubly poor use of money.)

  7. If only partially relevant: Popular programs* often have to reject even qualified students.

    *I use “program” to mean something at least somewhat structured, with an at least somewhat separate admission, and similar. Due to the wide variety of systems in use, this word need not be suitable everywhere. Note that the word “major” would implicitly exclude e.g. master programs and med school, which makes it highly unsuitable, even other potential concerns aside.

Consider the following solution sketch*:

*It is highly unlikely that this sketch would be viable without modifications and there are details to clarify. Complications include what exact numbers to use, whether borders should be sharp or fuzzy, what criteria should determine who belongs where, whether percentages or absolute numbers are better, how many categories are reasonable, what conditions are best suited for what category, …

Colleges are by law forced to let the top 10 percent of students study for free, with costs covered by the colleges’ funds.* Students from 10 (exclusive) to 30 (inclusive) percent are charged approximately at cost**. Students from 30 (exclusive) to 60 (inclusive) percent are charged at cost + some reasonable*** markup. The remaining students can be charged whatever the college wants. There is no additional financial aid.

*It is of fundamental importance that the colleges’ money be used. If, e.g., government money were given to the colleges to cover the costs, the system would fail.

**Based on a reasonable estimate of how much each student costs with regard to what directly relates to the education, e.g. salaries to professors for the courses taken, but not e.g. the cost of running the administration or various sports programs.

***Possibly, 500 or 1000 EUR/semester (resp. the purchasing-power adjusted equivalent in local currency), or some percentage of the costs (on top of the costs themselves).
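As a sketch of how the tiers could translate into concrete fees, consider the following Python fragment. All numbers are purely illustrative: the markup of 750 EUR and the unrestricted fee of 10000 EUR are my placeholder assumptions, not part of the proposal.

```python
def tuition(rank_percentile, cost, markup=750, unrestricted_fee=10000):
    """Fee per semester under the sketched scheme.

    rank_percentile: class rank as a fraction (0.0 = best, 1.0 = worst).
    cost: the per-student cost of the education itself.
    markup and unrestricted_fee: illustrative placeholders only.
    """
    if rank_percentile <= 0.10:
        return 0                  # top 10 percent study for free
    if rank_percentile <= 0.30:
        return cost               # charged approximately at cost
    if rank_percentile <= 0.60:
        return cost + markup      # cost plus a reasonable markup
    return unrestricted_fee       # whatever the college wants

print(tuition(0.05, 2000))  # 0
print(tuition(0.50, 2000))  # 2750
```

Note how the borders follow the exclusive/inclusive convention given above: a student exactly at the 10-percent mark still studies for free, one exactly at 30 percent still pays only at cost.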

In such a set-up, worthy students will rarely have financial problems; colleges can still earn plenty of money (but with fewer issues of insane surpluses); a very wide admittance would be possible, but the academically less* fit would tend to disappear when they discover that they fail to score well enough to study cheaply, which increases the quality of the graduates; etc. Note especially that while colleges might still have incentives for over-admission and “over-passing”, the students so favored would still need to pay their fees, and these incentives will then be largely countered by incentives for said students to drop out**. To boot, the colleges only have incentives to keep the students on—not to give them better grades than they deserve or to let them graduate before they have reached a certain standard.

*Note that these need not be unfit when it comes to a competitive program. In such cases, the effect is not so much a removal of the unworthy as it is a filtering based on result, where today a filtering based on expectation of result takes place. For instance, instead of admitting those with a GPA of 4.0 and leaving the 3.9s lying, a program could admit the latter too, and then let the students filter themselves out based on actual performance over the first few semesters. (But there might still be some programs where this type of increase is not plausible.)

**From the given program at the given college. It is quite possible that studies are continued with more success in a different program and/or at a different college.

In countries where various forms of public funding pay for significant portions of the cost and tuition is kept very low, this scheme would allow the introduction of higher fees (without negative effects on worthy students) and a corresponding reduction of the cost to the public: Instead of effectively shelling out money to everyone who wants to study, the money is limited to the worthy—or even to no-one, because the worthy are already covered by the fees paid by the unworthy.

Also note that the restriction on costs includable in the two mid-categories gives incentives to keep administration and other overhead down. For instance, if a professor is given a raise, ninety percent of students can be charged extra—but for an administrator, it is only the bottom forty. Ditto if the number of professors respectively administrators per student is increased.

Excursion on actual costs:
Keep in mind that the actual cost of a student is much, much lower than what some U.S. fees could make one believe—this especially when we look at a “marginal”* student or a student bright enough to learn from books (instead of lectures) and to solve problems through own thinking (instead of being led by the hand by TAs). As I have observed, it would sometimes be cheaper for a handful of students to pool their money to hire a dedicated, full-time professor than to go to a U.S. college.

*I.e. an additional student added to an existing class, who will typically add far less to the overall cost than could be assumed by calculating the average cost per student.

To exactly quantify costs is hard to impossible, when looking at e.g. differences in class sizes, salaries of professors, the type of equipment needed or not needed in different courses, what type of work* the students have to present, etc. However, for a good student taking non-wasteful courses, the marginal cost might be a few hundred Euro per semester, and a few thousand should be plenty in almost any setting and even on average.

*Compare e.g. a math course with one or two tests to a writing course with a handful of essays, all of which should be given insightful feedback. (Whether they are given such feedback, I leave unstated.)

Excursion on percentages:
When percentages are used, we can have situations like someone dropping out of the top 10 percent because others dropped out entirely.* Originally, I saw this as negative; however, on further thought, it might work out quite well, seeing that the limit will grow tougher in the more advanced years, stimulating competitiveness and keeping the level of those who graduate even higher. However, some type of fail-safe might be beneficial, e.g. that the percentages are converted to absolute numbers at the beginning of each semester. (If there were a hundred students to begin with, the ten best students are guaranteed top-level status, even if the class has shrunk to ninety at the end of the semester.)

*E.g. because he was the tenth best student in a class of one hundred, and is now the tenth best in a class of ninety.
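The fail-safe could be as simple as fixing the slot counts once per semester (a sketch of the idea only; the 10/30/60 borders are taken from the scheme above, the rounding choice is my assumption):

```python
import math

def tier_slots(enrolled_at_start):
    # Convert the percentage borders into absolute numbers once, at the
    # beginning of the semester; drop-outs during the semester then cannot
    # push a qualifying student out of a better tier.
    n = enrolled_at_start
    return {
        "free":    math.floor(n * 0.10),
        "at_cost": math.floor(n * 0.30) - math.floor(n * 0.10),
        "markup":  math.floor(n * 0.60) - math.floor(n * 0.30),
    }

print(tier_slots(100))  # {'free': 10, 'at_cost': 20, 'markup': 30}
```

With a hundred students at the start, the ten best keep their top-level status even if the class shrinks to ninety during the semester.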

Excursion on choice of college, program, whatnot:
A potentially positive side-effect is that strong students have new incentives to consider less popular colleges and programs. For instance, someone who could be accepted to Harvard, but with a considerable risk of having to pay, might prefer a college where he is almost guaranteed to be a top-10-percenter. Such decisions might in turn have effects like creating a downward pressure on tuition fees of expensive colleges, spreading talent more uniformly, reducing the “networking effect”* of top colleges, etc.

*According to some, the main benefit of going to e.g. Harvard is not the level of the education, but rather the career-boosting contacts made there. Also note that networking is often just a form of corruption—something that damages an employer and/or society for the benefit of the networker. Such damage can e.g. occur when someone is hired because of “whom he knows” rather than “what he knows”.

Excursion on the freedom of the colleges:
One negative effect is that the scheme limits the freedom of colleges regarding pricing, which could have negative market implications and/or be ethically dubious. This complication should be seriously considered before an implementation is attempted.

A reconciliation might be to only put some categories of colleges under the suggested system, including all that are state owned/run, all that have received non-trivial public support within some number of years prior to the “now”, and all that have directly or indirectly benefited from financial aid to their students in the same time frame.

However, if push truly comes to shove, this is one area where even such a strong regulation would be acceptable to me—in light of the catastrophic decline of higher education over the last few decades and the great threat that an even further decline poses.

Excursion on living costs:
In a non-campus system, topics like rent might need additional attention. It might e.g. be necessary to allow some amount of financial aid, preferably in the form of loans, to cover such costs. However, importantly, this would be something between the government and the student—with the college having nothing to gain. Further, it is not a given that such aid would be necessary on a larger scale, especially as societies grow more affluent: For very many, living with the parents, monetary help from the parents, working summers, private loans based on expected future income, or similar, can provide a solution that does not use taxpayers’ money and does not have a major impact on success in college.

Remark concerning “Thoughts around social class”*:
This text is not strictly a part of that text series, but there is some overlap and the implied division of students into more and less worthy categories is highly compatible with an intended future installment.

*See e.g. the last installment published at the time of writing.

Written by michaeleriksson

November 26, 2018 at 7:29 am

Revisiting verbosity based on Wesnoth

with one comment

Since writing a text dealing with verbosity (among other things), I have dabbled with Wesnoth*, which well illustrates the problems with undue verbosity, lack of tempo, and similar:

*See an excursion for some information and a few notes on terminology beneficial for the non-player’s understanding of this text.

  1. Most campaigns contain an undue amount of narration and dialogue*.

    *Which is fixed in advance. Only very rarely can the player influence the development of the dialogue, and then only within a small set of fixed choices.

    Now, a good story can make a campaign more enjoyable*; however, the point of the game is to play the game—if I want to read an extensive story, I can just grab a book.

    *Especially, through adding aspects with no correspondence in the “pure” game, e.g. character background or a romantic sub-theme.

    Worse: Most of the resulting text is pointless. It adds no value to the story or the overall enjoyment; is repetitive; states what should be a given; or is otherwise a waste of time. (That the text is very often poor by criteria relating to prose, effectiveness, story-telling, …, does not help—but that is an unrelated topic.)

    For instance, very many scenarios start with multiple enemy leaders saying variations of “I will crush you, puny humans!” or “Victory shall be ours!”—which reminds me of German sports writers, who do not tire of headings like “X will den Sieg!” (“X wants to win!”). What had they expected that made that news-worthy?!?

    Another complete idiocy is “war council” scenarios where various characters make mostly pointless statements, sometimes leading up to half-a-dozen characters, one after the other, saying “Agreed!” (or something to that effect)—where a simple “(All agree.)” would have done just as well, with a fraction of the player’s time wasted. Usually, the entire council could have been compressed into just a few lines of dialogue or replaced by a simple narrative message.

    The bulk, however, is lost on unduly long narration, mostly amounting to filler.

    To boot, if a campaign is played more than once, the value of the (textual parts of the) story is diminished further (while the non-story parts remain similarly interesting to the first time). What might be acceptable the first time around need not be so the second, third, or fourth time.

    Sometimes, it is so bad that I skip entire sequences of story (which is, fortunately, possible as a lesser evil)—but am then (a) left with no benefit at all from the story, (b) often lack context,* and (c) can miss various hints to optimal game play given in the text**.

    *E.g. in that I do not know why I suddenly have an ally or why I am suddenly trying to defeat a band of orcs, instead of those undead that had hitherto been the main enemy.

    **E.g. that a wooded area contains hidden enemies or that some aspect of the standard game-mechanisms has been temporarily altered.

    Most campaigns would be better by cutting the text in half; some would be better by cutting it to a tenth. (Note that I do not say that the story should be cut—only the text.) Generally, it is important to understand that different types of work require different types of writing—a game is not a novel, a play, or even a comic.

  2. The previous item is made the worse by limitations in the way that the game displays text: A longer piece of narration is displayed with no more than a few lines at a time (the next few lines following after user confirmation) and in an annoying manner, where each line is slowly blended in, one after the other. (Fortunately, this blend-in can be skipped by pressing the space key; however, this risks skipping too far, and a setting to skip the blend-in as a matter of course is not present.) Similarly, dialogue, even single words, is always displayed individually for each character speaking. Both imply that the user (even when wanting to read) has to hit the space key every few seconds; both have negative effects on strategies like getting a cup of coffee between scenarios to read the narration and dialogue at the beginning of the next scenario in a fully relaxed state.

    A particular annoyance with dialogue is that any utterance causes the view of the “board” to be focused on the speaking character, which leads to an additional delay and implies that the focus will usually end up at a different portion of the board than before the dialogue.*

    *Since the original focus is not restored. This is OK for pre-scenario dialogue, but problematic with in-game dialogue: Consider making a move to attack, having that attack interrupted by a triggered dialogue, and then having to scroll back to attempt the attack again… This leads to yet another unnecessary delay.

  3. The problems are not limited to text. For instance, some war-council scenarios contain sequences of half-a-dozen characters moving across the board, saying something, and then moving back across the board. These movements bring no value, appear to be unskippable, and take an excruciating* amount of time, during which the player can do nothing within the game. Still, some campaign makers have deliberately taken the effort to add these “value subtracted” moves…

    *I play with the animation speed increased by a factor of four (and have all unnecessary animations turned off). Even so, such sequences are horribly slow. With default settings, the best bet would be to grab a book until the movements are over—which really shows how redundant they are. (Another interface quirk is that the next faster setting is a factor of eight, which would be beneficial here, but might make other portions of the game move too fast.)

  4. A related scenario-error within regular game play is to involve too many units at the same time. For instance, there are some battle scenarios (e.g. in “Legend of the Invincibles”) with more than a hundred AI-controlled units on the board at the same time (almost all of which are moved every single round)—and where it takes several rounds for the player and the AI-controlled enemy to even make contact.* The ensuing (mostly) unimportant movements can go on for minutes… Even after contact is established, it takes quite a while before the majority of the units are actually involved in fighting—and that often occurs because sufficiently many units have finally been killed off…

    *A better way to handle such large battles is to give the opponents less “starting gold” and more “income” or otherwise delay the “recruitment” (without reducing the total number of units eventually involved). A partial improvement is to reduce distances between opponents, but this could lead to a too fast defeat of some of the enemies or increase the influence of luck.

    In such cases, I have even made my own moves, done something completely different while waiting for the computer to make its moves, and then just checked whether the outcome was sufficiently satisfactory* when it was my turn again. Of course, this work-around is often foiled by some random dialogue in the middle of the battle, e.g. when an important enemy unit dies. I then have to click through the dialogue, restart the battle, and go back to my “something completely different” for another few minutes…

    *With an eye on two things: Firstly, the loss of some specific units can lose the game outright. Secondly, if too large losses of other units occur, an eventual victory would be Pyrrhic. In both cases, it is time to start the scenario over with a better approach.

In the defense of these campaigns, they are contributed by various users and, therefore, rarely written by professionals. Then again, the more “professional” a campaign appears in other regards, the more text there tends to be (both in general and with regard to “pointless” text).

Excursion on Wesnoth, background information, and terminology:
The game is officially called “The Battle for Wesnoth”. It is a turn-based strategy game, mostly played against an AI, which I played very often some years back—before frustration with too great an influence of luck, a poor user interface, and many idiocies in campaigns eventually drove me away. (The issues discussed here relate to literal or metaphorical verbosity—the overall list would be much longer.)

A “campaign” is a series of linked scenarios, roughly equivalent to the overall adventure or war. A “scenario”, in turn, is roughly a sub-adventure or a single battle. A “unit” corresponds to a piece in chess. I have otherwise tried to be sparing with “technical terms”, in favor of what those inexperienced with computer games and/or Wesnoth might find understandable.

Note that some descriptions above have been simplified compared to actual play. (For instance, even the large battle scenarios discussed above will typically start with only a handful of units, and see armies rapidly expand through “recruitment”.)

Those interested can download it for free from the official Wesnoth website, which also provides more detailed knowledge than given here.

Disclaimer: I played using the latest version available in the standard Debian repositories (1.12), which is not the latest version released. However, this should only affect general game-features, not individual campaigns. Further, the user interface has never improved* much in the past, leaving me pessimistic concerning later versions.

*Add more unnecessary or even annoying animations—yes. Tweak the looks of various units—yes. Improve actual usability—no.

Excursion on reading speed:
I suspect that some of the above is worse for those who read or process information faster, e.g. in that the “coffee strategy” will work better for a slower reader, who will hit the space key less frequently and have more time to relax during an individual portion of text. (On the other hand, a slower reader will, obviously, need longer to reach game play, and might grow more frustrated with the length of the delay.)

Excursion on “The Elements of Style”:
“Omit needless words” is likely the most famous claim in that book. Examples like Wesnoth and Der Untergang des Abendlandes really drive home the point. Notably, the main problem with both is the sheer quantity of needless words (and needless movements, etc.). The latter also shines a different light on this recommendation, in as far as “The Elements of Style” was written in a different era, when texts like Spengler’s were far more common than today, and the advice correspondingly more beneficial: Looking at typical writing back then, it was likely the single most important advice to give; today, it is “merely” good advice. For instance, my recent criticism of Stephen King’s novels (as too thick for their own good) is not rooted in individual formulations being unduly long*, but in problems on a higher level, e.g. individual scenes that could be cut out or shortened without loss.

*His sentences are reasonably compact—certainly, more so than my own…

Written by michaeleriksson

November 18, 2018 at 11:43 pm

Conflicting own beliefs and what to do about them

with one comment

In the set of beliefs* held by anyone, there will be occasional real or imagined conflicts (consider also e.g. the concepts of “cognitive dissonance” and “doublethink”). People differ mainly in (a) the degree that they are aware of and (b) how they handle these conflicts. Unfortunately, most people are unaware of most or all conflicts that arise, make no attempts at detecting them, and are prone to just explain away the conflicts that are known—even descending to outright doublethink.** A particular issue with awareness is that a too faulty or incomplete understanding can make such conflicts go undetected.***

*I use “belief” as a catch-all that, depending on context, could include any or almost any belief, idea, opinion, whatnot that implies or would imply something about something else. This includes e.g. “cucumbers are green”, “cucumbers are blue”, “God does [not] exist”, and “I [do not] like chocolate”.

**This includes such absurdities as simultaneously professing to believe in Evolution and Gender-Feminism. Indeed, a great deal of my annoyance with politics/ideology (in general) and Feminism/Leftism/PC-ism (in particular) results from the adherents’ ever-recurring faults in similar directions.

***Consider again Evolution vs. Gender-Feminism: It is, for instance, highly unlikely that evolutionary processes would generate physical differences while keeping mental abilities identical—but exactly that is seen as a given by most Gender-Feminists (and a significant portion of the PC crowd, in general). Similarly, it is highly unlikely that the different roles of men and women in most societies over thousands of generations would have left no trace in form of e.g. natural inclinations. A Creationist–Feminist match-up would be less prone to contradictions.

In many cases, these conflicts are sufficiently trivial that they may be neglected.* For instance, that someone has two favorite dishes, music bands, movie stars, …, rarely has major impact on important decisions.** When it comes to topics that can have a greater impact, especially on others, care should be taken, however. Consider e.g. questions like how to vote in an election, what recommendations to make to others, what agendas to push, …—here it is important to have a sufficiently sound view of the topic; and if beliefs conflict, the view is unlikely to be sufficiently sound.

*A resolution can still bring benefit, e.g. through better self-knowledge, and I would not advise against the attempt.

**However, the resolution is often fairly simple, e.g. that none of two is the favorite and that the word “favorite” is best avoided; or that an opinion has changed over time, while still being professed out of habit.

Giving blanket rules for detection is tricky, but actually reading up* on a topic, gaining an own understanding (as opposed to parroting someone else’s), and deliberately trying to see the “bigger picture” and making comparisons between different fields and ideas, can all be helpful. Above all, perhaps, it is helpful to actually think through consequences and predictions that can be made based on various beliefs, and to look at how they stack up both against each other and against observations of reality. In my personal experience, writing about a topic can be an immense help (and this is one of the reasons why I write): Writing tends to lead to deeper thought, a greater chance of recollection in other contexts, and a thought-process that continues intermittently long after a text has been completed.

*Note especially that information given in newspapers, in school, or by politicians tends to be too superficial or even outright faulty. Wikipedia was once a good source, but has deteriorated over the years (at least where many topics are concerned). The “talk” pages can often contain a sufficient multitude of view-points, however.

If a conflict has been detected, it should be investigated with a critical eye in order to find a resolution. Here there are at least* five somewhat overlapping alternatives to consider: (a) One or both beliefs are wrong and should be rejected or modified. (b) Both beliefs have at least some justification and they should be reconciled, possibly with modifications; e.g. because they cover different special cases. (c) The conflict is only apparent, e.g. through a failure to discriminate. (d) One or both beliefs are not truly held and the non-belief should be brought to consciousness; e.g. because profession is made more out of habit than conviction. (e) The support of both** beliefs is approximate or tentative (awaiting further evidence), and (at a minimum) this condition should be kept in mind, with revisions according to the preceding items often being necessary.*** Note that the above need not result in rejection of one belief—it can equally be a matter of modification or refinement (and it can also happen to both beliefs). This is one reason why investigation is so beneficial—it helps to improve one’s own mind, world-view, whatnot.

*A deeper effort might reveal quite a few more alternatives. I write mostly off the top of my head at the moment.

**Here it has to be both: If one belief is taken as true and only one as approximate, then it would follow that the approximate one is outright faulty (at least as far as the points of conflict are concerned), which moves us to the “One” case of (a).

***For instance, if two physical theories are not perfectly compatible, the realization that physical theories are only approximations-for-the-now (eventually to be replaced by something better) gives room for an “approximate belief” in either or both theories. As long as work proceeds with an eye on the assumptions used, with the knowledge that the results might not be definite, and while being very careful in areas of known conflict or with poor experimental verification, this is not a major issue. Indeed, such “approximate belief” is par for the course in the sciences. In contrast, if someone was convinced that both were indisputably true, this would be highly problematic.

Again, giving blanket rules is tricky, especially with the very wide variety of fields/beliefs potentially involved and with the variety of the above cures. However, actually thinking and, should it be needed, gathering more information can be very productive. Having a good ability to discriminate is helpful in general; and with (b) and (c) it can be particularly beneficial to look at differences, e.g. if there is some aspect of a case where one belief is assumed to apply that is not present in a case where the other belief is assumed to apply. With (d), it is usually mostly a matter of introspection. (In addition, the advice for detecting conflicts applies to some parts here and vice versa. Often, the two will even be implicit, hard-to-separate, parts of a single process.)

For a specific, somewhat complex example, consider questions around what makes a good or poor book, movie, whatnot—especially, the property of being hackneyed: On the one hand, my discussions of various works have often contained a complaint that this-or-that is hackneyed. On the other, it is quite common for works that I enjoy and think highly of (at least on the entertainment level*) to contain elements of the hackneyed—or even be formulaic. Moreover, I rarely have the feeling that this enjoyment comes despite something being hackneyed—this weakness, in itself, does not appear to disturb me that strongly.

*Different works serve different purposes and should be measured with an eye on the purpose. When I watch a sit-com, depth of character is far less important than how often and how hard I laugh; the romance in an action movie is a mere bonus (or even a negative, if there is too much); vice versa, an action scene in a rom-com is mere bonus; plot rarely makes sense in non-fiction; etc. For more “serious” works, more serious criteria and higher literary standards apply.

Is my explicit complaint compatible with my implicit acceptance? To some degree, yes; to some degree, no.

On the “no” side: I suspect, after introspection, that I do or do not find a certain work enjoyable, thought-worthy, whatnot, based on criteria that are not explicitly known to me.* If I find enjoyment (etc.), I am less likely to look for faults; if I do not, I am more likely to look for faults—but there is no guarantee that my original impression was actually caused by the faults now found. Some will almost certainly have been involved; others need not have been; and there might have been other faults involved that I never grew explicitly aware of.

*There are many aspects of different works that can individually have a large impact, and the end-impression is some form of aggregation over these aspects. For instance, consider the impact of music on movies like “Star Wars” and “Vertigo” or on TV series like “Twin Peaks”—change the music, and the work is lessened. Notably, the viewer is rarely strongly aware of the impact of the music (even be it hard to miss in the aforementioned cases).

On the “yes” side there are at least three things to consider: Firstly, a work can be hackneyed and have sufficient other strengths to outweigh this. Poor works are rarely poor due to one failure—they are poor because they fail on numerous criteria, e.g. (for a movie) being hackneyed and having a poor cast, wooden dialogue, unimpressive music, … Being hackneyed is, alone, not a knock-out criterion—being original is an opportunity to gain points that a hackneyed work simply has not taken. Secondly, different criteria can apply to different works,* and being hackneyed is not necessarily an obstacle for the one work, even though it is for another. Thirdly, if something is known to work well, it can be worth using even if it is hackneyed—“boy meets girl” has been done over and over and over again, but it still works. (See also an excursion below.)

*Partly, as in a previous footnote; partly, with an eye on the expected level of accomplishment. For instance, my very positive discussion of Black Beauty must be seen as referring to a children’s book—had I found the exact same contents in a work with the reputation and target group of e.g. James Joyce’s “Ulysses” (which I have yet to read), I would have been less enthusiastic.

All in all, I do not see a problem with this conflict in principle; however, I do suspect that I would benefit from (and be fairer in detail* by) looking more closely at what actually created my impression and less closely at criteria like “original vs. hackneyed”. The latter might well amount to fault-finding or rationalization. To boot, I should pay more attention to whether specifically something being hackneyed has a negative effect on me (beyond the mere failure to have a positive effect through originality).

*I doubt that my overall assessment would change very much; however, my understanding and explanation of why I disliked something would be closer to the truth. Of course, it might turn out that being hackneyed was a part of the explanation in a given case; however, then I can give that criticism with a better conscience…

Excursion on expectations:
In a somewhat similar situation, I have sometimes complained about a work having set a certain expectation and then changed course. That is an example of another issue, namely the need to discriminate*. There are setups and course changes that are good, in that they reduce the predictability, increase the excitement, whatnot. This includes well-made “plot twists”. There are, however, other types of expectations and course changes that are highly unfortunate—including those that make the reader (viewer, whatnot) set his mind on a certain genre or a certain general development. A course change here is likely to detract from the experience, because different genres are enjoyed in different manners, and because there is often an element of disappointment** involved. Depending on the change, there can also be a delay and reorientation needed that lessens concentration and enjoyment further. Another (almost always) negative type of change is the attempt to rejuvenate a TV series or franchise by sacrificing what once made the series worth watching, by “jumping the shark”, and similar.

*Yes, discrimination is also a sub-topic above; however, here we have a too blatant case to be truly overlapping: There is no need for me to re-investigate my own beliefs—only to clarify them towards others. (Except in as far as I might have suffered from a similar fault-finding attitude as discussed above, but that attitude is just an independent-of-the-topic aspect of an example.)

**Note that this holds true, even when the expected and the delivered are more-or-less equivalent in net value. (However, when there is a significant improvement, the situation might be different: I recall watching “Grease” for the first time, with only a very vague idea of the contents; seeing the first scene; and fearing that I was caught in the most sugary, teenage-girls-only, over-the-top romance known to man—the rest was a relief.)

Excursion on “boy meets girl”:
An additional, but off-topic, complication when considering the hackneyed is that there comes a point of repeated use when the hackneyed does not necessarily register as hackneyed and/or is so central to a genre that it is hard to avoid. Consider the typical “boy meets girl” theme. This, in itself, is so common and so basic to movie romance that it rarely registers as hackneyed. In contrast, the rarer “childhood friends fall in love” does*. With “boy meets girl”, the question is less whether the theme lacks originality and more whether the implementation is done with sufficient quality** and whether the details are also lacking in originality (is there, e.g., yet another desperate chase to and through an airport at the end?).

*At least to me, which also shows that there can be a large element of subjectiveness involved.

**Oscar Wilde defended against accusations of plagiarism by pointing to the difference between adding and removing a petal when growing tulips: To repeat in a better manner what someone else has already done, is not necessarily a fault.

Excursion on good fiction:
More generally, I am coming to the conclusion that fiction (and art, music, whatnot) either works or does not work—and if the end result works, an author (movie maker, whatnot) can get away with more-or-less anything along the road. This includes the hackneyed, poor prose, absurd scenes, artistic liberties with science, a disregard for convention and expectation, the tasteless, … (But the question of “because or despite?” can be valuable, especially with an eye on the different reactions among different readers.) The proof of the pudding is in the eating—not in the recipe.

Written by michaeleriksson

November 17, 2018 at 2:53 am

Adults say the darnedest things

with one comment

I just re-encountered the fiction (and real-life) cliché of the child–adult exchange “He started it!”–“That is no excuse!”. This is a good example of adults telling children things that simply do not make sense,* and that are likely to leave the children unconvinced: “He started it!” is not just an excuse—it is a perfectly legitimate reason. There might be situations where it can be pragmatically better to turn the other cheek, try to deescalate, find a more constructive solution than retaliation, whatnot; however, that has no impact on the ethics of the issue and expecting a child to understand such matters is highly optimistic.** Furthermore, there are many cases where retaliation in kind is the best solution, especially when boundary pushers and bullies are concerned (which will very often be the case with children): Both being exposed to consequences for inappropriate behavior and having to take a dose of one’s own medicine can have a great positive effect in limiting future inappropriate behavior.

*I suspect that this is partly due to the answer being dishonest, in that the adult is motivated by something unstated. (“What” will depend on context, but a fear of negative consequences from e.g. fights between children could be high on the list, as could a wish to just keep some degree of peace and quiet.)

**And arguments in that direction are usually absent to begin with.

Note how the “adult” reply makes no attempt at providing reasons or actually convincing, and how a discussion of pros and cons is entirely absent—it is just an (invalid) claim that the child is supposed to take at face value “because I said so”. No wonder that children are not more cooperative…

The “because I said so” is, of course, a good example in its own right—the effect of such argumentation is that the child’s rejection of a claim is complemented by a feeling that the adult is an unreasonable dictator. It might or might not create compliance in action, but compliance in thought is not to be expected. Worse, it could have a harmful long-term effect on the relationship. It is true that there might be a point where a child is too young or the situation too critical for a deeper discussion to be beneficial; however, the uses that I have seen (be it in fiction or in real life) would usually have benefited from a motivation.* Consider** e.g. a child’s refusal to do the dishes countered with “because I said so” vs. “we agreed that everyone should take a turn—and today is your day”; the adult’s refusal to play based on “because I said so” vs. “I am sorry, but I am dead tired and need to take a nap”; or even any discussion resulting in “because I said so” vs. “I pay the bills; I make the rules”. The last example might superficially seem to offer no real difference, but most children (above a certain age) will at least be able to see the adult perspective of the bill payer and the hypothetical alternative of buying greater freedom through going hungry and homeless—but not of the more power-based “because I said so”. (Also note that “I am the parent; I make the rules” is closer to the dictator than to the bill payer.) At the same time, I advise against reasonable-sounding arguments that do not make sense on closer inspection or that could back-fire.***

*Generally, even among adults, I recommend that any rule and whatnot be given some form of motivation, so that those affected know why something should or should not be done. This to increase the chance of compliance, to make more informed choices possible (e.g. when dealing with interpretation and special cases), and to allow a critique of the rule with an eye on future improvement.

**I stress that I do not consider the alternative arguments to be silver-bullets—dealing with children is hard and often amounts to a “damned if you do; damned if you don’t” situation. They are, however, improvements.

***E.g. “That is no excuse!” above. A more interesting example stems from my own childhood (pre-VCR): My mother argued that she should watch the news on the bigger color-TV and I a simultaneously broadcast movie on the smaller black-and-white one, because she had not seen the news in a week (due to a study absence). From my perspective, the negative effects of the inferior device on a movie were larger than on the news, and it might be years (not a week) before another opportunity to watch that movie arose. The result? I was left with not only an implicit “because I said so”—but also with the feeling that my mother was dishonest… (Adult me is open to the alternative that she simply had not thought the matter through.)

A sometime reasonable, but more often misguided, argument is “And if your friends all jumped off a bridge, would you follow them?!?” (with many variations). The analogy involved is usually inappropriate (notably regarding dangers) and/or too subtle (the “lemming” aspect). Normally, the only justification is that it came as a response to a weak argument from the (typically?) teenager, e.g. “but all my friends are going”. Here, however, such “smart ass” answers are not helpful. Better would be to evaluate the suggestion (e.g. going to a certain party) on its merits, factoring in both the fact that “all my friends” can seem like a strong argument to the teenager (even when it is not), and that there are at least some cases where the argument has merit through its impact on teenage life* or through giving a different perspective**.

*The degree to which adults should be concerned about this is limited, but it is not something to ignore entirely. There are aspects of popularity and networking that might be largely alien to an adult (and to some teens, including my younger self); however, they are there and showing them some consideration is not wrong.

**Notably, that something is wide-spread and tolerated by other parents could point to a too restrictive own attitude.

Generally, I caution against giving “smart ass” answers to children, and recommend using only factual arguments. For instance, my school class would sometimes be asked to explain/solve/perform/… something that had simply never been taught (especially when teachers changed). Typically, someone would reply with the idiomatic “det har vi inte fått lära oss”, which carries the clear intent of “that has not been taught” (and an implicit “so you cannot fairly require us to know”). Unfortunately, this phrase is vulnerable to the deliberate misinterpretation of “we have not been allowed to learn this” and the answer was invariably along the lines of “Who has forbidden it?”. The effect on the class was never positive… To boot, this answer is doubly unfair in that (a) the students cannot be expected to guess what the next teacher considers “must haves” when the previous teacher saw things differently, and (b) traditional schooling severely limits the time, energy, and (often) interest available for own learning in addition to the official curriculum. (Note that both, even taken singly, invalidate the potentially valid angle that this answer does have—that learning should not be limited to school and that teachers usually indicate the minimum to learn.)

In a bigger picture, adults often impose constraints or obligations on children that make little sense. For instance, what is the point of a child making his own bed, should he not see a benefit for himself in doing so? There is no automatic advantage in a made bed and if no-one else is hurt by it… Indeed, apart from when I receive visitors (actual reason) or change the sheets (trivial extra effort), it might be more than twenty years since I, as an adult, made my bed.

Excursion on women as perpetrators:
While errors like those above are by no means limited to women, they do appear to be considerably more likely from women. It is conceivable that at least some of the problem stems from an arbitrary imposition of some irrational values that often occur among women (e.g. that any and all violence no matter the reason is evil, or a wish for orderliness-for-the-sake-of-orderliness).

Excursion on fairness:
Much of the above is related to the feeling of being unfairly treated. A fair treatment is by no means a guarantee for a happy and well-behaved child; however, the opposite will make things worse. Where fair treatment might be important to most adults (at least when on the receiving end…), it is paramount to most children.

Written by michaeleriksson

November 13, 2018 at 2:08 am