We live in a world of planned obsolescence. You'd be hard pressed to find a consumer-grade technology manufactured today that will still be working in 10 years. The miniaturization of circuits is another shame. What used to be repairable by hand is now destined for landfill. The built-in schematics of old have long since given way to the soldered RAM and warranty void stickers of today. If you've ever torn apart a TV built in the 80s you'll know what I'm talking about.
When I first started writing this article in 2023 I had just watched the Silo series on Apple TV (and Severance before that). Silo is a great bit of Sci-Fi, a rare better-than-the-book series, and does an excellent job of world building. Both Silo and Severance were made with very custom, retro computing machines. In Severance the design is a bit more artificial and nefarious, but in Silo the computers are tools designed to last a couple hundred years. Both shows use computers with CRTs and mostly character-based displays.
The computers in Silo really got me thinking about what a computer designed to survive for over a hundred years might look like.
Of course, "computer" used to be a very specific term, applied to a very small band of machines. Now, many refrigerators could be called a computer. Good luck using a refrigerator to keep a personal diary. The better term, as it applies here, would be general purpose computing machine: a device one can use to solve largely unanticipated problems and, perhaps, even execute code and somehow interface with the outside world.
There's a certain "Turing Completeness" (not in the literal sense, since all computers are Turing Complete) when it comes to an Android phone as compared to an iPhone. For example, say that you're trapped in a room and the only way to open the door is to insert a USB drive containing a text file that says "open sesame" (like something out of a Resident Evil game). If you've got a USB-C thumb drive, formatted as FAT32, and an Android phone, you can pull this off. If you have an iPhone, you're out of luck. You're on some adventure, you come upon a problem, and you're stuck. Full stop.
The ideas described in this document attempt to define some sort of device that is "Turing Complete" enough to get around as many of these "full stop" situations as possible. Back to the Android example: if your thumb drive is USB-A, or is formatted as ExFAT, or you don't have a text editor installed, then you have hit a full stop.
Where does this fascination come from? I suspect it comes from being born in the 80s, at a time when computers were new and magical, a tool that could solve all problems. Then came decades of Star Trek and space survival movies, movies where the lowly hacker walks up to the fortress, jacks into the serial port, and lets himself in. RoboCop, Terminator 2, and so on.
Another issue with many modern devices is their requirement for internet access. When I was in high school, sometime in the early 00s, a programmer I was job-shadowing lamented the cycle of computers forever alternating between thin and thick clients. A thin client is a machine that's little more than a keyboard and display and depends on a remote server to function, while a thick client is something like a laptop running Linux or Windows XP. In a way we're now transitioning back towards thin clients. Just look at Windows 11, which has issues during install without internet connectivity. Or the Adobe software empire, which now requires subscriptions to the online Creative Cloud service. Modern computing involves software hamstrung by subscriptions.
Note that the computers in Silo were technically thin clients. But they didn't have to be!
Most of these issues stem from corporations trying to wring every dime they can out of consumers. But I'm sure that, between things like the right to repair movement and an eventual crash in the technology investment markets, we'll return to a time where thick clients reign supreme. Companies like Framework are even helping to make repairability a consumer-desirable feature.
But I digress. What might a general purpose computing machine look like, one that's designed to last over a hundred years, one that would even be useful in some sort of post apocalyptic environment?
Well, such machines obviously need to work without internet access. This immediately disqualifies modern versions of Windows. macOS is slightly better in this regard. While the OS installer has plenty of hidden surprises, in my experience it always at least succeeds with an install even without internet access. macOS also doesn't complain about licensing and serial numbers, though it will absolutely complain about running on non-genuine Apple hardware. It's possible to slap together a Hackintosh in some situations, but as the name implies, it's a hack. Apple has always balked at standards, e.g. the new Mac Studio uses proprietary removable storage, so good luck swapping that out in a hundred years.
It's also important that a computer contain all of the tools necessary to modify its own software, software to replicate itself to other computers, and software to help repair itself or other computers in the case of a catastrophic event. A hundred years is a long time and there will be unanticipated problems along the way: classes of bugs like the Y2K problem, the need to reduce the OS's strain on battery usage, the need to write new hardware drivers. Rebuilding the kernel is a must.
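To make the Y2K class of bug concrete, here's a hypothetical Python sketch of the classic two-digit year assumption failing across a century boundary (the names and scenario are made up for illustration):

```python
def parse_two_digit_year(yy: int) -> int:
    """A common pre-Y2K convention: assume every two-digit year
    belongs to the 1900s. Fine in 1985, wrong in 2000."""
    return 1900 + yy

# A loan issued in 1995, due 5 years later, with years stored as two digits.
issued = parse_two_digit_year(95)  # 1995, correct
due = parse_two_digit_year(0)      # 2000 intended, but we get 1900

print(due - issued)  # -95, not 5: the loan appears 95 years overdue
```

A machine meant to outlive its builders needs the source, tooling, and documentation to patch this kind of buried assumption whenever it surfaces.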
While a Windows or macOS computer could ship with the entire repository and tooling required to build the OS, something only available to Microsoft and Apple employees today, I suspect such repositories would span multiple terabytes and be largely impractical. Plus, I guarantee building the entirety of Windows or macOS requires several versions of Visual Studio or Xcode and an assortment of runtimes.
Linux and BSD machines have famously made it easy to recompile and install new kernels locally. Some Linux distributions only ship source code and require all binaries to be compiled locally. And while Linux and BSD are great candidate operating systems for the post apocalyptic computer, they aren't necessarily the best choice, but more on that later.
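As a rough sketch of what "recompile the kernel locally" involves on Linux, here's the classic build sequence wrapped in Python for consistency with the other examples in this article (the source tree path is a stand-in for wherever your kernel source lives):

```python
import subprocess

KERNEL_TREE = "/usr/src/linux"  # stand-in path for your kernel source tree

def run(cmd):
    """Run a build step inside the kernel source tree, failing loudly."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, cwd=KERNEL_TREE, check=True)

run(["make", "defconfig"])        # start from a sane default configuration
run(["make", "-j4"])              # compile the kernel image
run(["make", "modules_install"])  # install loadable modules (needs root)
run(["make", "install"])          # install the new kernel (needs root)
```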
Portability is probably another desirable aspect of such computers. If people migrate around in 100 years then they'll need portable computers. If people don't migrate, well, they can keep a portable computer in the same place for a long time. So a massive stationary machine probably isn't the best 100 year computer. I suppose I could call my 24 year old fully functional iMac G3 a portable computer.
Nations rise and fall and standards change. It's not guaranteed that a power grid supplying alternating current at 60Hz will be in our future. For that reason the post apocalyptic computer should be able to accept power from a variety of inputs, across a wide range of voltages and unregulated currents. Such a machine would need some sort of battery, a circuit to smooth out bumpy electricity, and a pair of alligator clips that works with unanticipated power sources.
A battery is probably the biggest nightmare. It would need to be something that doesn't degrade too quickly. It would also need to be re-creatable in the future once failure inevitably happens. The now banned mercury oxide batteries that powered many 70s era cameras still don't have a perfect replacement after all. The ability to charge a computer using solar panels would certainly be convenient.
The computer would also need to come with ample documentation, both in print and digital. Printed documentation is important for failure scenarios, like when the device doesn't boot: an explanation of how the battery works, how the power inputs work, instructions for bypassing a dead battery, how to turn the computer on, what sort of sensors it has, how to recover from a failed boot partition, and so on. Bootstrap stuff.
Digital documentation is needed for everything else. Manuals for programming. How to build the kernel. How to debug rare boot problems for a second computer. How to network with other devices. How to use all of the sensors. How the LCD is attached to the bezel. How to build replacement components or even a new computer from scratch using entirely commodity parts.
In so many Sci-Fi series, like Star Trek or Silo, it's important for a community to be self-sufficient: producing food, recycling waste, etc. This includes repairs. Can you imagine what Star Trek would look like if they couldn't leave the ship to repair it from the outside? Or how long would a Silo last if one couldn't fix things? Computers are a bit more constrained than a starship, but ideally, if a non-vital component breaks, one could remove it and work on it while reading repair docs from the still-functioning computer.
The software on these computers would be extensive as well. A wide array of programming language compilers: C, C++, Rust, etc. Scripting languages as well: Bash and Python both come to mind. Each of these requires a compiler or interpreter, programming manuals, and all source code for the compilers and interpreters.
Any software included on the machine should have the source code to build it. Every single aspect of the computer should be reconfigurable. When the computer generates some sort of media to install itself on other computers, such as by creating USB boot drives, it should be able to replicate the current state of the packages as they run on the computer. Assuming the possessor of the computer has made improvements to the software, it would be a shame to lose them upon replication.
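As one hedged sketch of what "replicate the current state" might mean, here's how installed package selections could be captured on a Debian style system (dpkg is assumed to be present; the manifest filename is made up):

```python
import subprocess

# Capture the installed-package list (Debian-style; assumes dpkg exists).
selections = subprocess.run(
    ["dpkg", "--get-selections"],
    capture_output=True, text=True, check=True,
).stdout

# Install media could replay this later with "dpkg --set-selections"
# followed by a package-manager install pass.
with open("package-manifest.txt", "w") as f:
    f.write(selections)

print(f"Recorded {len(selections.splitlines())} package selections")
```

Locally patched packages would also need their modified sources copied along, not just their names.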
The computers also need the ability to roll back the system to a known good state. This can be achieved by using a recovery partition, but it seems impractical to keep such a partition up to date with improvements made to the system along the way. Something like ZFS snapshots might be more practical: experiment with changes and improve the system; in case of catastrophe, roll back; in case of success, commit a new filesystem snapshot.
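As a hedged sketch of that workflow (the pool and dataset names here are made up, and the zfs commands are run via Python only to keep one language across this article's examples):

```python
import subprocess
from datetime import datetime, timezone

DATASET = "tank/system"  # hypothetical pool/dataset name

def zfs(*args):
    subprocess.run(["zfs", *args], check=True)

# Commit the current, known-good state as a snapshot.
stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
zfs("snapshot", f"{DATASET}@good-{stamp}")

# ... experiment with risky changes to the system ...

# Catastrophe: roll the dataset back to the snapshot.
# (-r also destroys any snapshots taken after it.)
zfs("rollback", "-r", f"{DATASET}@good-{stamp}")
```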
Bit-rot is a phenomenon where individual bits of data held in long term storage can become corrupted over time. This is often attributed to cosmic rays but really anything that physically exists will have small changes randomly made to its atoms. For this reason the storage of the computer would probably need to remain largely uncompressed. Errors in uncompressed data are easier to recover from.
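Uncompressed storage alone doesn't tell you when a bit has flipped, though. A simple defense, sketched here under the assumption of a plain directory tree, is to keep a checksum manifest and periodically re-verify it:

```python
import hashlib
import os

def sha256_of(path):
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root):
    """Map every file under root to its current checksum."""
    return {
        os.path.join(dirpath, name): sha256_of(os.path.join(dirpath, name))
        for dirpath, _, filenames in os.walk(root)
        for name in filenames
    }

# Re-run build_manifest() periodically and diff against a stored copy:
# any file whose hash changed without being edited is a bit-rot suspect.
```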
Security is another awkward concept. Some forms of encryption can result in massive failures in the case of bit-rot. Full disk encryption also makes it difficult to restore a computer in a disaster scenario. I'm not dying on this hill but a hundred year computer doesn't use full disk encryption. When it comes to longevity, a computer will likely change hands many times. It's then questionable whether user authentication is relevant. Or, put another way, is it okay if the security mechanism is that you've locked the computer in a box instead of locking it with a login prompt?
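To see why bit-rot and encryption mix badly, here's a small demonstration using Python's third-party cryptography package: a single flipped bit in AES-CBC ciphertext garbles an entire 16-byte block of the recovered plaintext (plus one bit of the next block), where the same flip in an unencrypted file would cost one bit:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, iv = os.urandom(32), os.urandom(16)
plaintext = b"page one of the survival manual." * 2  # 64 bytes, block-aligned

enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = bytearray(enc.update(plaintext) + enc.finalize())

ciphertext[20] ^= 0x01  # simulate bit-rot: flip a single bit in block 2

dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
recovered = dec.update(bytes(ciphertext)) + dec.finalize()

# Block 2 of the plaintext is now random garbage, and block 3 has
# one flipped bit; blocks 1 and 4 survive intact.
print(recovered)
```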
Physical moving parts are often the first components of a computer to fail. Spinning drives therefore aren't the best storage medium. Hinged displays, like with modern laptops, are another point of failure. Having a machine with a stationary display may be the most resilient.
Before discussing user interface I'd like to take a quick step back. Computers from decades past largely had text based interfaces. If you're old enough, or a computer programmer, you've seen your fair share of full screen text interfaces. Computers were slow too, of course, so while text interfaces were the standard it's not like those machines were fast enough to draw complex graphics anyway.
Fast forward a few decades and computers are now exponentially more powerful than they used to be. They can do all sorts of fancy user interface magic, such as transparent window shadows and glossy window backgrounds that blur the layers behind them, overlaid on top of something else in another window that's animated and rendering text. Most of the CPU power of modern machines is squandered to ensure that overcomplicated UI elements with subtle shading and rounded rectangles render at a quick enough pace.
At the end of the day we don't actually need these crazy complicated user interfaces. I'd argue that for 90% of the most common user actions the Windows 98 UI is better than the Windows 11 UI. In fact old operating systems often draw updates to the screen with less latency than newer operating systems. Over time we've added so many layers of software abstraction that it's resulted in large binaries that consume astronomical amounts of disk space and memory and come with poor performance. Check out Software disenchantment for a nice rant about how software has gotten slow and sloppy.
All this to say that we can do just fine with far less CPU and memory than we have today. The hundred year computer could be served by either a simple graphical user interface or a text interface.
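Text interfaces also rest on decades-old, well understood tooling. Here's a minimal sketch using Python's standard curses module (Unix-like systems only; the menu items are invented):

```python
import curses

def main(screen):
    curses.curs_set(0)  # hide the cursor
    screen.clear()
    screen.addstr(0, 0, "HUNDRED YEAR COMPUTER", curses.A_BOLD)
    screen.addstr(2, 0, "1. Read documentation")
    screen.addstr(3, 0, "2. Rebuild kernel")
    screen.addstr(4, 0, "q. Quit")
    while screen.getkey() != "q":  # block until 'q' is pressed
        pass

curses.wrapper(main)  # handles terminal setup and teardown safely
```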
Going way back to the Turing Completeness of a computer, in particular regarding how it interacts with the world, it's important that such devices can communicate and consume information about the world over many mediums. What's the ultimate device from Sci-Fi for doing this sort of thing? Why, the humble Star Trek Tricorder of course! While it's fiction, there's certainly a lot we can take from it for inspiration.
Things like Bluetooth and WiFi are useful in today's world, but these complex protocols will likely be extinct in the future. Other forms of radio are certainly going to be more universally beneficial. The post apocalyptic computer will definitely need Software Defined Radio (SDR), a hardware radio whose properties can be changed by software. Using something like an RTL-SDR dongle I can listen to AM and FM radio, CB radio, and plenty of folks using amateur radio, though it sadly doesn't allow for transmission. Of course, there is more feature-packed SDR hardware out there, and while the SDR devices of today need to abide by FCC transmission rules, those same rules won't apply in a post apocalyptic scenario (or will at least change over the next 100 years).
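As a hedged sketch, assuming the third-party pyrtlsdr package and an RTL-SDR dongle are available, tuning in and estimating signal power looks roughly like this:

```python
import numpy as np
from rtlsdr import RtlSdr  # third-party pyrtlsdr package

FREQ = 100.1e6  # tune near the FM broadcast band

sdr = RtlSdr()
sdr.sample_rate = 2.048e6  # samples per second
sdr.center_freq = FREQ
sdr.gain = "auto"

samples = sdr.read_samples(256 * 1024)  # complex IQ samples
sdr.close()

# A crude signal-strength estimate: mean power of the IQ samples, in dB.
power_db = 10 * np.log10(np.mean(np.abs(samples) ** 2))
print(f"Mean power at {FREQ / 1e6:.1f} MHz: {power_db:.1f} dB")
```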
The computer needs speakers. It should probably also have a 3.5mm audio jack. Other communication devices and sensors would be beneficial: infrared RX / TX, audio RX (mic), visible light RX (camera), Geiger counter. Older ports like serial and parallel are going to be easier to develop with than, say, USB. Powering the device from a traditional power source would be better served by a barrel jack than today's USB-C standard. GPIO ports would also be useful for breadboard development. Having a breadboard built in might be even better.
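To give a flavor of GPIO work, here's a hedged sketch using the RPi.GPIO library (assuming Raspberry Pi style hardware; pin 17 is an arbitrary choice) that blinks an LED wired up on a breadboard:

```python
import time
import RPi.GPIO as GPIO  # assumes Raspberry Pi style GPIO hardware

LED_PIN = 17  # arbitrary BCM pin number wired to an LED on the breadboard

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    for _ in range(10):
        GPIO.output(LED_PIN, GPIO.HIGH)  # LED on
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)   # LED off
        time.sleep(0.5)
finally:
    GPIO.cleanup()  # release the pins on exit
```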
When it comes to keyboards, every key needs to be labeled. Blank keycaps are a definite no-go; nobody is going to know what those keys do in the future. Having dedicated number keys and F1-F12 keys is ideal, but if a Fn key is required, then the modifier keys should all have properly colored labels. Any key mapping should be done at the hardware level, not in software, to ensure key purpose remains consistent. If pressing Fn+3 sends an F3 while booted into the operating system then it should also send an F3 in the BIOS.
As far as CPUs go, the x86 architecture has probably run its course, especially with regard to power consumption. You never know what sort of energy crisis may be waiting for us down the road. ARM is interesting from a low power consumption perspective. RISC-V is interesting for its open source nature and the ability to procure schematics. However, even simpler architectures might be better if a simpler operating system is chosen.
Operating system choice has many effects on such a computer. I believe it comes down to the question of whether the OS should be complex, like a BSD or Linux, or simple. A complex OS comes with complexity of maintenance: a computer won't last for a century if some misconfigured service fills the disk with logs. A complex kernel can be hard to reason about and make changes to, the distance from kernel code to driver code can make things hard to grok, and power consumption is going to be higher. It is overkill if the post apocalyptic computer only needs 1% of the functionality provided by the OS, and overkill translates into failure points.
Simpler operating systems get a bit more interesting. There are actually purpose built OSes that relate to a lot of the ideas I've mentioned here: DuskOS and CollapseOS. While geared towards the collapse of society (aka post apocalyptic), they're also interesting in that they're built around the Forth language. This is reminiscent of old 8-bit era computers like the TRS-80, which made it very easy to write and run BASIC programs.
In a world increasingly dominated by planned obsolescence and disposable technology, the idea of a general-purpose computing machine designed to last a century feels both radical and necessary. Drawing inspiration from Sci-Fi, retro computing, and a deep frustration with modern constraints, we can imagine a machine that prioritizes repairability, self-containment, and the ability to modify and replicate itself.
Such a computer would embrace simplicity over excess, durability over convenience, and openness over restriction. It would rely on robust commodity hardware, energy efficiency, and a software ecosystem without proprietary constraints. The challenges of power sources, bit-rot, and simplified communication protocols all need thoughtful solutions, but the goal remains the same: a machine that endures, adapts, and continues to function even in the face of technological, economic, or societal collapse.
Whether this vision takes form as a movement, a niche project, or a future necessity, the idea remains compelling. As technology continues to cycle between open and controlled, thick and thin clients, repairability and disposability, we may yet see a return to computing that values longevity and resilience over profit-driven impermanence. The hundred-year computer isn't just a dream; it's a challenge to rethink the way we build and use technology in an uncertain future.