History refers to the study, documentation and narration of the past. Knowing and understanding the past is essential when envisioning, planning and building the future. The way history is told affects the ways in which the future can be imagined.
Since permacomputing envisions a long-term future for computing, it needs to tell the history of computing in ways that make permacomputing and similar alternative ways of thinking more relevant.
Problems of mainstream computing history
A lot of things get eliminated from the mainstream narrative for various reasons:
- It is essentially "winners' history":
- The developments in the US are overemphasized in comparison to what happened in the rest of the world, sometimes even eliminating prior art: Vannevar Bush's "Memex" vision was not that original (see Paul Otlet), Douglas Engelbart was not the first to invent a mouse (see Telefunken's Rollkugel), and so on.
- The history of computer networking is told in ways that eliminate non-Internet networks, some of which were still quite prominent in the 1980s. BBSes are sometimes mentioned as a side curiosity because they are part of the "consumer history", but what about BITNET, DECnet, etc.? Even Minitel is scarcely mentioned, even though it had millions of users already in the 1980s, perhaps because it was French and therefore deemed irrelevant.
- The history of personal computing very much centers around a Californian narrative where young entrepreneurs (such as Steve Jobs) were the heroes who liberated the world from the evil mainframe culture. This is sometimes intertwined with a more general "hacker mythos", even though the hacker culture's approach to liberation was often largely non-commercial.
- It is also very much "consumer history", especially from the 1980s onward. Consumer-grade hardware and its applications (especially games) get a lot of love, and even many obscure platforms from small countries are documented. However, it is often very difficult to even find mentions of prominent institutional or scientific projects, strange non-US hobbyist subcultures, etc.
- There have been conscious attempts to make earlier developments irrelevant or obsolete, especially in Internet history:
- The "Internet years" idea in the late 1990s ("one year in cyberspace is equivalent to ten years in meatspace" etc.) was perhaps invented because researchers did not want to do their homework. The pre-WWW Internet was so long ago in Internet years that it was in a different era that didn't need to be studied.
- "Social media" was defined in a way that made it possible to start its history from the 2000s (again, eliminating BBSes, Usenet, IRC, etc.)
- In general, each new user generation wants to pretend it invented more things than it actually did.
- Local histories are understudied, especially those of non-Western countries. The same applies to minorities, women, lower social classes and many other groups that don't get to be as loud as affluent white Californian males.
Problems in how the story is told
- The overarching story is one of economic growth and maximization. Hardware systems are divided into "generations" (sometimes only a few years long), each intended to obsolete the previous one. Big companies and their business "achievements" (such as the establishment of monocultures) get a lot of praise.
- This "chain of obsolescence" narrows the technological history down to a one-dimensional "highway of progress" where there are only two possible directions ("forward" and "backward"). This makes it difficult to envision other directions and represent them in ways that don't sound "backward".
- The concept of "retro" is used to separate some parts of history and technology into a different world that is only relevant to personal memories. This world is also the place for "backwards" ways of thinking (such as the appreciation of small and efficient program code that can't be justified from business perspectives, or the acknowledgement of the benefits of earlier communications systems in comparison to modern social media).
Ideas and examples
- Siliconization is a concept used in Romania to refer to how local technocultural practices ("șmecherie") were replaced by an imported "Silicon Valley" model in the 1990s.
- Eriksson and Pargman have suggested the use of counterfactual history as a tool for imagining computing futures. It is often difficult for students and others to even imagine a computing world that is not built around Moore's law, so imagining, for example, how computing might have evolved in a low-coal world can help make this kind of conceptual leap.