Latest Articles

  • The estimated reading time for this article is about 3 minutes.

    Jon Stewart

    Jon Stewart appeared to be backing the racist, debunked idea that COVID-19 was made in a Wuhan science lab on the very first Late Show filmed in front of a full audience in more than 400 days.

On the one hand, this is just a drop in the ocean of the “people being wrong in public” problem that is the scourge of our Information Economy times. Further, speaking unpopular or even wrong ideas is (in the U.S.) a fiercely protected right. Lastly, history has shown us innumerable examples of smart people holding odious beliefs. Humans are complicated, yet we seem to pretend that they are not.

If you believe that the Chinese government manufactured COVID-19, there is little I can do to change your mind. But here is Rebecca Watson speaking about this very topic and why the consensus of virologists who have studied this pathogen sees no evidence of human manufacture.

    I am old enough to know this will change almost no one’s belief on this, but it is there for those with an open mind.

But this particular person espousing this particular conspiracy theory is disquieting. Stewart was, during the Bush years, a fact-checking juggernaut. Sure, he had a research staff then, and yes, I was happy to hear neo-conservative propaganda trashed. But it seemed then and seems even now that Stewart is a pro-little-guy and pro-truth crusader. His decades-long work to secure benefits for 9/11 first responders is a testament to this.

Now, take that guy off the air, fire the fact-checking team, and stick him on a farm for several years, and you have a very ordinary rich guy free to consume media that feeds his preconceptions, just like Mike the MyPillow guy or Elon Musk or the former president. Then recall that, like all public figures, Stewart likes an audience and that “hot takes” are the currency of attention. Should we be that surprised to see him digging in on a controversial position?

I hope that Stewart’s performance was a Kaufman-esque stunt, that Stewart wanted to point out how critical media literacy is in a world where monied interests lie to the public. But I am not holding out a lot of hope for this outcome.

Much is made on the political Right of how the Left is turning into Thought Police who mandate a party line (which is an irony so rich I trust I need not point it out). As Stewart gets called out for supporting a debunked idea promulgated mostly by zealot supporters of the former president, it is important to remember the challenge of epistemology. How do we know what we know? We all have to defer to the considered, professional opinions of those technocrats who study the subjects that we do not. Do these experts get it wrong sometimes? They do, but at a much lower frequency than non-experts do; if they did not, there would be no reason to have experts at all. An expert opinion can be and should be challenged in good faith. But most of the time, the experts are closer to the truth than anyone else. Reality is not obliged to agree with our preconceptions or crazy ideas.

    Still, this example of a dude being wrong in public hurts.


  • The estimated reading time for this article is about 6 minutes.

    Escape Room Roundup

Even before the COVID-19 lockdown, my family and I enjoyed doing escape rooms. We are lucky enough to live in a place with easy access to commercial escape rooms like Escapology, plus a few other smaller ones. However, we are not master puzzle breakers. Our track record for solving the puzzles is rather low. Still, the experience of doing escape rooms is so enjoyable for all involved that a string of losses, though galling, is not a deal-breaker.

We are a natural market for “board game” versions of escape rooms, and we have gone through multiple titles from each of the following systems: Exit: The Game, Unlock!, and Deckscape. In this post, I would like to offer my impressions of each system and its suitability for casual family gaming.

I will forgo any pseudo-scientific rating system and instead lay bare my biases. Each system has been judged by my experience playing the game with three people, one of whom is a pre-teen. How well each delivered a memorable and pleasant gaming experience is my primary criterion.

    One last note: my family does not play with timers. These introduce FAR TOO MUCH drama into a process that is sufficiently fraught for the players. If you too are looking for casual fun, do discard any timing device. The game ends when the last puzzle is solved or your players are uninterested in continuing.


Deckscape

Deckscape, as the name suggests, is merely a deck of cards! Puzzles are both visually based (meaning you have to find hidden figures in the cards) and logic based. No additional device is needed.

By far, this is the best series for casual escape room fun. No, these games will not impress your hard-core gamer friends, but they are all replayable and the stakes never feel too high. Aside from logic and math, you rarely need any outside knowledge to complete the puzzles. I only wish there were more of these!

The only drawback to this system is the lack of an in-game hint system. However, I find these puzzles rarely necessitate one.

    Great starter titles from this series: Test Time and The Fate of London.


Unlock!

These games require a tablet or mobile device to play. It is possible to disable the timer for most of these games, although a few specific titles do have timed events (like Tombstone Express and The Nautilus’ Traps), and I do not recommend those specifically.

My family has had the best luck with this series. There are many titles to choose from, with a variety of fantasy settings that should appeal to many. Some real-world knowledge (especially for the Arabian Nights themed title) is very helpful for solving the puzzles, though this is not typically the case.

    The app has a hint system, which works well. Some of the “figure out the weird device” type puzzles are less interesting to me, but I know these appeal to some (as this kind of thing is all over Interactive Fiction games).

    Great starter titles from this series: The House on the Hill and The Formula.


Exit: The Game

The Exit series has the most titles of any of the offerings mentioned in this post. Each title is a self-contained, card-based game with additional custom “feelies,” which makes this series extremely popular. The system consists of a riddle deck, a solutions deck, and a hints deck. There are 10 puzzles per game. Puzzles are visual, logical, and spatial, and may require some crafting skills, like precision cutting and measuring. By far, this series has the most creative puzzles and the most distinctive puzzle-solving experiences. Why is this not my favorite series?

Although we have done three or four of the Exit titles, we have never completed any of them. We always quit before the end. Frequently, we get dead-ended in the game, where it is unclear what the puzzle even is. In the last one we did, we got to a point where we needed a hidden feelie to solve the puzzle, but the game never told us to obtain it. We just took the thing anyway and moved on.

Many of the puzzles require more careful crafting of cards or feelie items than my family (or I) is interested in doing. I’m not terrible at rotating objects in my head, but I am terrible at crafting in general, and worse at doing so under pressure. When you are a parent, you cannot give any one task 100% of your attention, but the Exit series, which often has a lot of “game state” to manage, requires exactly that.

One last point: the production values of Exit games are very high. Each title is gorgeous. Although I find some of the puzzles rather opaque, I am inclined to blame not the game designers but my own limitations for this. I wish I could enjoy this series more.

Great starter title from this series: The Abandoned Cabin. If you do not like this one, do not bother with the rest of the series.

    Honorable Mention: Journal29

One last entry that I recommend as an excellent family activity is Journal 29, a book-based “escape room,” or rather a series of puzzles. The premise is that a group of researchers working on a secret project has disappeared, leaving behind one last journal filled with clues to suggest the nature of their work and the reason for their disappearance. This will be very familiar to X-Files fans.

You do need an internet-enabled device to solve the puzzles, some of which require GPS look-ups or special media files from journal29. Most of the puzzles are visual and logic based. Some may require light googling.

Even so, I found these puzzles incredibly approachable (only 4 out of some 60 utterly stumped me). Doing these puzzles with your family over the course of several sessions is very enjoyable.

    I hope this helps you find the right escape room experience for your family. These games are a great way to keep your brain engaged while providing a unique (mostly non-digital) shared experience.


  • The estimated reading time for this article is about 5 minutes.

    Because I design and implement web applications, the issue of how people discover existing features in them and how easily they can accomplish their tasks is a daily concern. The most important rule of iterative interactive design is to believe user complaints. This may seem so obvious as to go without calling it out, but it should be the mantra of all app designers. It is not.

    I will quickly confess that I do not hold my user experience or design skills in the highest regard. But I have learned enough to know how grievous many of my earlier designs are and how hard it is to produce “invisible design.”

Given this professional concern, you can imagine that when I run into badly designed user interfaces outside of my job, it rather sends me around the bend.

    Let’s talk toasters and stoves.

    I have a lovely toaster. Here is a close up of the controls on it:

    My toaster controls are confusing without glasses

    Aesthetically, it is a handsome bit of sculpture. It’s got that retro, Art Nouveau style that an aging, mid-century man like me enjoys. However, I have old man eyes. In the morning, when I am most likely to use the toaster, my eyes are particularly useless.

Given that, let’s look at the temperature controls on this unit. There are two dial controls. Dials are a fine way to translate heat intensity to a physical motion. However, look carefully at the position indicator on the dial. Do you see it? I suspect most of you can when you are looking at this photo, but please note that the small black dot is easily lost in the glare of the “chrome” bits of the design. Under lower light conditions (such as those commonly found in the morning, when 100% of my breakfasts occur), it is possible to carelessly misread this indicator so that the control is set 180 degrees further into the “dark toast” settings than I want.

Hold that thought, because I have another kitchen appliance with similar controls that presents similar problems.

    Here are the controls for two burners on my stove top:

    My stove is trying to kill me

Again, the aesthetics of this stove are pleasing to me. It too has a mid-century modern appearance with stainless steel bits. The dial controls are arranged intuitively enough for me. Unlike the toaster’s designers, the stove’s designers made the dial indicator a line rather than a dot.

Since it is a gas stove, the 11 o’clock position causes a spark that sets the gas on fire. As you rotate the dial anti-clockwise past that point, the burner attenuates from full gas flow to its lowest setting (which is around 3 o’clock). A good amount of the time, I am cooking things with this dial at the 6 o’clock position.

When you are paying attention, these stove controls are great and easy to use. The trouble comes at the end of dinner prep, when you transition from cook to waiter. At least for me, my attention shifts away from cooking concerns after I dump the food into a serving container. I am hungry and prefer my food eaten hot. Many is the time I forget, in my rush to eat, to turn off the burner, especially when the dial is in the 3 o’clock or 6 o’clock position.

When a burner is set to simmer, there is hardly any flame visible. When the dial is at 6 o’clock, the indicator appears similar to the off position.

Either way, from my table I cannot see the flames of the burner, and the dials do not stand out.

For a long time I thought there was little to be done to help me with these problems. However, my wife is currently taking a design course, and discussing some of the challenges she was working through got me thinking again about my appliance UX problem. What is the core problem with the toaster and stovetop controls? I submit that these controls do not offer clear enough indicators of their current state when viewed from normal, but sub-optimal, positions.

    Taking a page out of early rocket design, it occurred to me that painting one hemisphere of the dial control would be enormously helpful. Observe:

    Never burn your toast again, sleepyhead

    And also:

    I dare you to ignore this indicator

Now, before I get cancelled, let me assure you that this is a design idea. I am not literally going to take a highlighter and color these dials. Such a solution would cause “domestic inquietude”. However, the principle of making the two hemispheres of a dial visually distinct has been used in other places to great success. In the future, I hope appliance designers consider how their controls present in less-than-perfect conditions.

The title of this post is a nod to the very excellent book The Design of Everyday Things by Don Norman. If this post tickled your fancy, read what a UX pro has to say about our designed world.

    • Tags:
    • ux


  • The estimated reading time for this article is about 5 minutes.

Runnel is my answer to an increasingly broken Internet. It’s a program I did not want to write, that I tried not to write, but finally had to. The result is a hacked-together, light-weight web app that reads the directory in which you keep your MP3s, extracts ID3 tags from each mp3 file, and presents a sorted list to the user, who may add songs to ONE, GLOBAL PLAYLIST for listening.

No persistent databases. No user accounts. No connection to outside services that may close up shop in the future. Runnel never modifies your mp3 collection at all. It never writes files outside of its own directory. It never “phones home” to tell me what people using my app are doing with it. It is just a single-threaded Mojolicious server that uses Bootstrap for styling, a small table-sorting library (which I might replace), and small bits of playlist code that manage a single MediaStream object. It works on desktop and modern mobile devices (sorry, Apple Newton users).
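Runnel itself is a Mojolicious (Perl) app, but the core scan-and-sort idea is small enough to sketch. The following is a hypothetical Python rendition of that idea, not Runnel’s actual code; for brevity it reads only the legacy 128-byte ID3v1 tag at the end of each file (real-world tagging usually means ID3v2, which is far more involved):

```python
import os
from pathlib import Path

def read_id3v1(path):
    """Read the 128-byte ID3v1 tag from the end of an mp3 file.

    Returns a dict with title/artist/album, or None if no tag is present.
    ID3v1 layout: 'TAG' + title[30] + artist[30] + album[30] + year[4] + ...
    """
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        if f.tell() < 128:
            return None
        f.seek(-128, os.SEEK_END)
        tag = f.read(128)
    if tag[:3] != b"TAG":
        return None

    def field(start, length):
        # Fields are fixed-width, NUL-padded, conventionally Latin-1.
        return tag[start:start + length].split(b"\x00")[0].decode("latin-1").strip()

    return {
        "title": field(3, 30),
        "artist": field(33, 30),
        "album": field(63, 30),
    }

def scan_library(root):
    """Walk a directory of mp3s; return track metadata sorted by artist, then title."""
    tracks = []
    for path in Path(root).rglob("*.mp3"):
        meta = read_id3v1(path)
        if meta is None:
            # Fall back to the filename when no tag is found.
            meta = {"title": path.stem, "artist": "", "album": ""}
        meta["path"] = str(path)
        tracks.append(meta)
    return sorted(tracks, key=lambda t: (t["artist"].lower(), t["title"].lower()))
```

Hand the result of scan_library to a template and you have the sorted track list a UI like Runnel’s would render, with nothing persisted and nothing written back to the collection.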

    Like all new programs, Runnel has bugs. When they bother me enough, I fix them. I use Runnel every day to play music from my iPad or iPhone to bluetooth speakers around my house. Could the interface be better? Yup. Could it have a lot more features? Uh huh! Am I going to waste time adding those? Heck to the No.

While the tech that went into Runnel might be interesting to some (it’s mostly vanilla JavaScript with modules), this post is about why I had to write what should already exist in better forms somewhere on the Internet. Actually, due to discoverability problems in general (thanks to search engines’ dependence on sponsored ads), there may well be dozens of projects like this. But let this rant start with the large-scale issues before raking any particular actor over the coals.

I started using the Internet fairly late among my peers. I bought my first computer with a modem in 1993 or 1994, and with it I taught myself to program in C. Back then, I also discovered Bulletin Board Systems, on which I played online DOOR games like Legend of the Red Dragon and found mountains of UFO conspiracy text files. I was living the 2020 pandemic lifestyle 30 years before y’all.

Younger readers of this blog (who themselves are only conjectured to exist) may not believe it, but simple mp3 files were once A Big Deal. Until their arrival, high-quality stereo music files existed (for the most part) only as uncompressed audio files (WAV, AIFF, etc.). A five-minute song might produce a file 20 Megabytes (MB) in size. To translate that into a modern equivalent, it felt something like a 20 Gigabyte (GB) file per song would today. Audio CDs hold about 640 MB and contained around twelve songs without much room to spare. So exciting a technological breakthrough were mp3 files that people started “ripping” their audio CDs and sharing the resulting mp3 files. This idea literally became the core business idea for one of the early Dotcom companies, called Napster.
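For the curious, the compression win is easy to sketch. The numbers below are my own illustrative assumptions (full CD-quality PCM and a typical 128 kbps mp3), not figures from the post:

```python
# Back-of-the-envelope math for why mp3 mattered.
# CD-quality PCM: 44,100 samples/sec x 2 channels x 16 bits per sample.
SAMPLE_RATE = 44_100      # samples per second
CHANNELS = 2
BYTES_PER_SAMPLE = 2      # 16-bit audio

def pcm_size_mb(seconds):
    """Size of uncompressed CD-quality audio, in megabytes."""
    return SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * seconds / 1_000_000

def mp3_size_mb(seconds, kbps=128):
    """Size of an mp3 at a given constant bitrate, in megabytes."""
    return kbps * 1000 / 8 * seconds / 1_000_000

five_minutes = 5 * 60
print(round(pcm_size_mb(five_minutes), 1))   # ~52.9 MB uncompressed
print(round(mp3_size_mb(five_minutes), 1))   # ~4.8 MB at 128 kbps
```

At CD quality, uncompressed audio runs to roughly 10 MB per minute, while a 128 kbps mp3 needs under 1 MB per minute: a better than ten-to-one reduction, which is what made sharing songs over dial-up thinkable at all.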

I can imagine that if you had not heard this story before, it may strike you as a pretty thin concept for a company, but please be assured that Napster was so big a deal that the entire recording industry lined up against it.

My point is that sharing mp3s was a very popular use of the Internet in the late nineties and early 2000s. Because of this, there were many proprietary and open-source programs designed for streaming large collections of mp3 files. I recall running a few of them back in the day.

Due to massive improvements in web technology and internet connectivity, tech companies started moving away from producing downloadable software and instead created web-based applications. Why give your grubby customers your precious code when you could just sell access to it? These companies were collectively called “Web 2.0,” and that is where things on the Internet took a bad turn.

In 2021, we find ourselves largely dependent on cloud-based services (which are largely rebranded Web 2.0, but metastasized) that publish our microblogs, our spreadsheets, our photos, our music collections, and even our health as measured (questionably) by consumer electronics.

I am no luddite, but our relationship to tech has in my lifetime gone from one where people were owners of software to one where the selling of people’s personal data, as obtained from nearly all uses of software, drives a huge swath of the world economy. The richest American, Jeff Bezos, is not worth $200 billion because he sells books. Google is not one of the largest tech companies because they help you find recipes for chicken tikka masala. Facebook makes no money from keeping you in touch with your family. In each instance, these companies use your data for profit (either to drive their own advertising or to sell to third-party advertisers).

    OK. We are not likely to fix this trend with a blog post. But what can we do?

As consumers, we can roll the clocks back a few decades to where we run at least some of our own software. Learn to code, or find code on GitHub that meets your personal needs. Rent a virtual machine on Linode, or host your web site off a Raspberry Pi at home from your always-connected internet access point. None of these solutions is as convenient as the cloud-based ones, but when enough people have the same problem, someone starts working on a solution. That’s the Internet I remember, and it was a far healthier place for it.


  • The estimated reading time for this article is about 7 minutes.

When you are lost in the wilderness, most survival guides will tell you to find an elevated position (like a hill or mountain) from which to survey your location. Often getting above the overwhelming details of your surroundings will help you find recognizable landmarks so that you will no longer be lost. Although I am a computer engineer by training and practice, I have long been a student of history, which is the way a mere mortal can rise above his or her temporal context to obtain a wider perspective.

Even though I have been an agnostic atheist since adolescence, I have had many experiences with various flavors of Christianity. My mother was a believer (although her faith was decidedly not based on a reading of scripture). I read a lot of C.S. Lewis. I was baptized around the age of 10, so it is still a memory I retain. I attended Boston College, where I was taught a version of Western Philosophy that included noted Christian writers like St. Augustine. All this is to say that whether or not I would have chosen to study Christianity, I have nonetheless been steeping in it for decades.

To be completely candid, I am somewhat hostile towards this faith. Familiarity breeds contempt, as they say. What wisdom age has brought me suggests it is not religion that greatly affects people’s behavior and attitudes. People do what they want and justify their actions with whatever fiction is most convenient. However, some truly contemptible behaviors have worn the cloak of random Biblical passages as absolution and vindication for actions no religion on Earth sanctions. How religion conned the world into believing that only faith could foster ethics and morality is a feat of advertising well worth studying. History has always been a more reliable (if descriptivist) teacher of human behavior.

A few years ago, I heard an interview with Bart Ehrman on Fresh Air. He was promoting his book /Misquoting Jesus/. Something about this topic clicked with me. As I have indicated, I was no stranger to theologians, but few of them were atheists. The best analogy for Ehrman is that he does for historical-critical New Testament studies what Carl Sagan did for space science, which is to say Ehrman brings the consensus view of his discipline to a lay audience. I will admit to being a bit of a fan boy of his (I joined his blog twice, which I have never considered doing for technical blogs). To date, I have read one other Ehrman book, /Jesus Interrupted/, but I expect I will read his Heaven and Hell book at some point. This fall, I found another NT scholar whose writing I enjoy, Mark Goodacre, whose primer, /The Synoptic Problem: A Way Through the Maze/, is a clarifying treatise on why scholars believe in a lost NT source called Q and why other scholars (rightly, it seems to me) find this hypothesis untenable.

As in all academic fields, there are heterodox views in NT studies, and YouTube is the place to find those in abundance. One of my favorite speakers on the subject is Robert M. Price (also a Lovecraft scholar). Price represents the notion that there was no actual historical Jesus and that the cult fabricated him out of similar well-known regional myths. This idea is called Mythicism, and it has been soundly rejected by the academic consensus.

While I find a lot of what Price and his fellow mythicists say attractive, I do not find this line of thinking compelling or productive. Mythicism is a “just-so” story that fails to account for historical evidence of Jesus found in Biblical and other contemporaneous documents. Mythicism also fails the “sniff test.” Just looking at the three synoptic gospels (Mark, Matthew, and Luke), all of which were written after Jesus’ death by different authors, none of whom actually knew Jesus personally, there are points of agreement about who Jesus was and what he did. More interestingly, there are incidents in these gospels that no writer trying to make converts would make up. These “difficult reading” passages include a story about the family of a young Jesus worrying that he was crazy (which would be a totally normal human conclusion for a mother to reach about a son who starts spouting apocalyptic, Jewish fundamentalist sermons in the street!). The historicity of Jesus is a settled question in NT academia. As a starting point for understanding what the consensus view has to offer, it is more convenient to accept this point than to fruitlessly debate it.

    NT studies include trying to reconstruct how an obscure Jewish apocalyptic street preacher from a small rural village in Judea grew into an organization that dominated (and often decided) the lives of millions of people across the globe. Understanding this process is not an act of faith, but a duty of the intellectually curious. Regardless of my personal animosity toward the faith, to ignore Christianity’s impact is disingenuous.

Of course, there are theological questions that NT studies bumps into. One of the more curious and fundamental questions occurs fairly early in the reading of the gospels and the letters of Paul (the earliest post-Jesus proselytizer we have records of). The faith promises some kind of salvation (the exact nature of which is beyond the scope of this post) to those who believe. What exactly is it that the faithful should believe? If you look at the gospel of Mark (believed to be the earliest gospel), Jesus wants his audience to be more faithful to the Jewish law than the Pharisees. This is a call to a kind of strict fundamentalism that is no longer a part of mainstream Christianity, yet it is canonically a part of the NT. Paul argues that the Jewish law should not apply to recent Gentile converts and that salvation was based entirely on the belief that Jesus was bodily resurrected by God. This question of faith does not get settled in the NT, which again is a curious historical oddity.

In case you did not know, the New Testament is a collection of 27 books (“three [like the Trinity] to the third power; it’s a miracle,” as Ehrman often quips), all written some years after the death of Jesus. For some of these books, we believe we know who the author was. Seven of the letters attributed to Paul are believed to have been written by the apostle himself, for example. The other letters attributed to Paul in the NT are not believed by scholars to have been authored by him. All of the Gospels are anonymously authored. The names given to them are the work of later compilers. The gospels were not the first books of Christianity, but the work of later first-century AD Greek-speaking Christians trying to create a narrative of Jesus from oral traditions and perhaps written sources no longer available to modern scholars. There is an apocalypse, a style of book that contains criticisms of contemporaneous life dressed in mystical language. We know that the author of this book was named John, but this is not the same John who wrote the gospel. The rest of the NT consists of letters from early church leaders in which the doctrines of Christianity were being worked out. That is to say, the letters are the beginning of doctrinal thinking and not completely worked-out policy statements. Remember this point when someone starts citing Scripture at you to justify their personal biases.

    Aside from looking at all the places where NT sources contradict each other (which is a lot of fun), learning about the struggles of the young church has softened my views towards (most) Christians.

    Understanding the context in which the books of the NT were written and why they were written humanizes Christianity. Christianity becomes an evolving project of centuries guided by human minds trying to understand human suffering, mortality, and how to address both.

    Faith that claims to have all the answers remains anathema to me. Faith that encourages questions is one I can live with. Ultimately, I remain disinclined towards magical thinking.
