I highly recommend his book on Wolfenstein 3D (mentioned at the end of the article). Even though it's technical, it is not dry or boring. And there are lots of old-school tricks and optimizations, like 64 unrolled functions for scaling wall textures, storing the sprites "sideways", managing the wacky graphics card, and hacking the graphics modes to even be able to display something like a game in the first place.
Yeah, that's what's great about retro systems. If you gather the tools, you can rebuild them.
I recently recompiled the good old Uplink game for fun. All I needed was Visual Studio 6.0. The devel archive contained all the deps necessary to build the game. It was a bit tricky to do it right, but after an hour of fixing deps (I got rusty) I made it, and the game works fine :) I even fixed some minor glitches and added little improvements here and there.
I think that's only with Linux's package management mess?
Most commercial embedded/windows stuff I have seen usually vendored their dependencies in a thirdparty/ folder AND fixed the toolchains they use. So the setup is always reproducible.
Dunno, but for example I tried to compile Stalker SoC and it was a very messy adventure. It took me a whole day to build it (dependency problems, compile errors), and it was such a disappointment because the game just crashed. Maybe I will try again one day.
I learned to program in Borland Turbo C++. One thing from back then that I really miss is how easy it was to do some complex things. I was able to draw to the screen by just calling geometric shape functions, and it made pictures! I found out that if I drew a shape, then XORed it away and drew a new but slightly different shape, I could make animations. So making little sprites that looked like they were running, out of only 1000 lines of C++ code, was awesome. Some friends and I got together and made a Final Fantasy-like game using these tricks: hand-crafted sprites and a game world that you could walk across. Every map was a whole screen, you would go to an adjacent map when you hit the sides, and on every step, if you rolled a natural 1, you would go into a fight with some enemies.
This was all pretty tedious and everything, but it was a lot of fun for high schoolers. And if we had known more computer science and software engineering we would probably have done more, and it would have worked better. But unlike today, we didn't have to learn SFML or ActiveX or OpenGL just to start playing and get stuff working; we could just call circle().
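The XOR animation trick described above can be sketched in a few lines. This is a pure-Python illustration of the idea, not the Borland BGI calls (which exposed it via an XOR put/write mode); the flat framebuffer and `xor_blit` helper here are hypothetical:

```python
# XOR drawing: blitting a sprite with XOR, then XOR-ing the same sprite
# again at the same spot, restores the background exactly -- so a sprite
# can be moved without saving whatever pixels were underneath it.

WIDTH, HEIGHT = 8, 4

def xor_blit(fb, sprite, x, y, w):
    """XOR a list of (dx, dy, color) pixels into a flat framebuffer."""
    for dx, dy, color in sprite:
        fb[(y + dy) * w + (x + dx)] ^= color

framebuffer = [0] * (WIDTH * HEIGHT)
framebuffer[2 * WIDTH + 3] = 7          # a background pixel the sprite overlaps

sprite = [(0, 0, 5), (1, 0, 5), (0, 1, 5)]  # a tiny 3-pixel sprite, color 5
before = list(framebuffer)

xor_blit(framebuffer, sprite, 3, 2, WIDTH)  # draw: sprite appears
assert framebuffer != before
xor_blit(framebuffer, sprite, 3, 2, WIDTH)  # draw again: sprite erased
assert framebuffer == before                # background fully restored
```

The visible downside, which old XOR animations all shared, is that where the sprite overlaps the background the colors combine (7 ^ 5 = 2 above), giving that characteristic flicker-and-tint look.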
I suspect that if we wanted to find an environment for kids to do simple shape programming, we could find something like that.
The difference seems to be that there's been a greater divergence between "professional" tools and "kids/intro tools" whereas Turbo C++ (or Turbo Pascal in my days) were kinda both.
There has been a lot of effort put into making kids' versions of technology, which I think is probably the right move for elementary-school-aged kids. But once you get into middle and high school, I think as a kid there is a "coolness" factor to using the same stuff that the pros use. Since at that point you are trying to be an adult, you are trying to do stuff the right way, and hey, you might be getting a real job in the field in just a couple of years anyway. So there is some value, I think, in giving a tool a friendlier environment and onboarding process.
When I was teaching programming my go-to was the JavaScript canvas API. It’s 2D only, and very simple. And being on the web, once a student has made something we can host it for them and they can show their friends.
I have a ~20 line html harness which sets up a page with a full screen canvas element and gives you global window width & height variables. That’s all you need to get started. And it’s real JavaScript - so students learn a useful programming language as a result, and the advanced students can go nuts and add sprites, sound and networking if they really want to.
I tell people to leave the html file alone and edit the javascript, using the canvas API to make it draw whatever they want. Canvas supports text, shapes and images and translations / transformations. If they want, I show them how to animate their work as well.
Another option is to use the p5.js library. They also have a nice online editor, at https://editor.p5js.org/, which makes it easy for students to get up and running quickly.
A kid in the early to mid-80's could get enough working that they could imagine it possible to match a store bought game.
Once hardware capabilities went up, so did the need for more specialized skills and longer development cycles. Even though it might have still been possible to draw a box on the screen with a single command, the relationship of that box to a valuable outcome was a lot less obvious.
I'll most likely pick FreeBASIC or Python for teaching kids simple graphics programming.
FreeBASIC has a QB-compatibility mode, so using old QB tutorials shouldn't be a problem.
I've been struggling with getting my ten-year-old across this gap. He's outgrown Scratch/Roblox and I don't think PICO-8 is quite the right set of abstractions (no built in entity system, seriously?); we're working with Godot right now and he's making progress, but it's definitely a lot more of a learning curve figuring out how to do stuff in a tool that has everything.
I’ve got a 15 yo who decided to start programming in Python using Pythonista on his iPhone. He refuses to take any input from me; just wants to learn on his own. Pythonista comes with some nice game-programming modules. So far he’s shown me a Pong game, 2048 clone, air hockey, and more.
Would processing[0] be a good fit? It's designed to be easy to use and learn but powerful enough for professional use. Very quick to get cool stuff moving on a screen and the syntax is Java with a streamlined editing environment.
I think something like BASIC or Pascal is right. The "hard" part, I think, that Borland did so well was making it so easy to just get things right. A neo Turbo Pascal where you could type draw(x,y,z,r,"red") and see a red circle on the screen is the ideal, without having to mess with a very complicated workflow like in Unity or Unreal.
LOGO is a great take on LISP-like languages overall, unfortunately it uses dynamic scope. Of course this only comes up in larger programs but I do wonder if anyone has made a lexically-scoped LOGO variant and what it might be like to teach coding in it.
that looks cool to me, thanks for sharing. I think something that works well for kids is instant feedback on their code/tweaks, looks like this has it.
PICO-8 sits in an interesting spot where it's not exactly a "toy" language/environment the way Scratch is, nor is it especially geared at learning, but on the other side of things it is based on a real programming language (lua) and there's an interesting scene for it where people flex pushing it to its limits making demos, demakes, etc, many of which are far beyond the NES-level capabilities it's meant to have.
I kind of envy you for having access to Borland Turbo C++ and learning resources for it as a kid. The closest I could get to it was reading a review on my local computing press. Even assuming I could magically get a copy, I still wouldn't know what to do with it without reading material. And even if I had the reading material, I'm not sure how much I would make out of it with my fledgling knowledge of English at the time.
Borland Turbo C++ (and Pascal as well) had great help documentation. Every function was thoroughly explained and there were lots of examples. I learned C just by reading the Turbo C++ help docs. I miss that time.
I wouldn't have had access either if it were not for the local pirate scene. Even owners of local software houses doing professional accounting systems didn't mind copying a few disks of software they had paid for to a kid who knew how to ask. If you got something new, you'd immediately share it with your colleagues.
And everybody kept in mind that if we ever started making money from our hobby, one of our first investments would be into buying properly licensed copies of the tools we used.
Computer Science classes in Indian schools (fifth to eighth grades in my case) taught programming using Borland Turbo C++. That was back in the 2000s, but I wouldn't be surprised if they still use it today.
Modern 2D graphics are not based on plotting pixels to a framebuffer, they have "textures" or "surfaces" as a native part of the system and any compositing is done as a separate step. So if anything making "simple sprites" has become a bit easier since you can just think of any composited surface as a more generic version of a hardware sprite.
I think this is simpler from the POV of making a game engine from scratch, or a game with complex effects and graphics. But is it simpler from the POV of a high schooler who just wants to get some flat-colored shapes on the screen?
Even rendering "flat colored shapes" efficiently can be a bit non-trivial if you expect pixel-perfect results, like you'd get by plotting to an ordinary framebuffer - the GPU's fixed rendering pipeline is not generally built for that. The emerging approach is to use compute shaders, and these are not yet fully integrated with existing programming languages - you can't just edit ordinary C++/Rust code and have it seamlessly compile for CPU and GPU rendering. But we're getting closer to that.
I'm curious: how does the texturing work in the SDL backend? I did not read the code, but am curious now.
Back in the day I heard from a friend who claimed to have written a 2D framework that was faster than Direct2D; that was a pretty early version of Direct2D, I think.
SDL is a cross platform library, so it works by just using whatever native rendering context is available, abstracted to a common API. In more general terms AFAIK it uses the geometry API under the hood and just pushes triangles/quads to the GPU whenever possible.
I was referring to 2D. For 3D, SDL2 has OpenGL and Vulkan APIs.
SDL3 is going to have a general purpose 3D API with its own shader language AFAIK.
In my experience, if what you want is to just get a window open and render some sprites, have some basic low level stuff handled and not have your hand held, SDL2 is ridiculously easy for it. There's also Raylib[0] and SFML[1], neither of which I've used but I hear good things about.
Same here, going into UNIX back in the early 1990's, after using the Borland IDEs across MS-DOS and Windows 3.x (and being aware of their OS/2 versions), felt like time travel to the genesis of programming, CP/M style.
Thankfully a professor pointed us to XEmacs, with which I managed to get my Borland experience back; it became my UNIX companion until KDevelop, Eclipse, and NetBeans came to the rescue.
> One thing from back then that I really miss is how easy it was to do some complex things.
This might be my biggest disappointment with "modern" programming. I want direct access to the hardware with stuff like $100 1GHz 100+ core CPUs with local memories and true multithreaded languages that use immutability and copy-on-write to implement higher-order methods and scatter-gather arrays. Instead we got proprietary DSP/SIMD GPUs with esoteric types like tensors that require the use of display lists and shaders to achieve high performance.
It comes down to the easy vs simple debate.
Most paradigms today go the "easy" route, providing syntactic sugar and similar shortcuts to work within artificial constraints created by market inefficiencies like monopoly. So we're told that the latency between CPU and GPU is too long for old-fashioned C-style programming. Then we have to manage pixel buffers ourselves. We're limited in the number of layers we can draw or the number of memory locations we can read/write simultaneously (like how old arcade boxes only had so many sprites). The graphics driver we're using may not provide such basic types as GL_LINES. Etc etc etc. This path inevitably leads to cookie cutter programming and copypasta, causing software to have a canned feel like the old CGI-BIN and Flash Player days.
Whereas the "simple" route would solve actual problems within the runtime so that we can work at a level of abstraction of our choosing. For example, intrinsics and manual management of memory layout under SSE/AltiVec would be replaced by generalized (size-independent) vector operations on any type, with the offsets of variables within classes/structs decided internally. GPUs, FPUs and even hyperthreading would go away in favor of microcode-defined types and operations on arbitrary bitfields, more akin to something like VHDL/Verilog running on reprogrammable hardware.
The idea being that computers should do whatever it takes to execute users' instructions, rather than forcing users to adapt their mental models to the hardware/software. Cross-platform compilation, emulation, forced hardware upgrades that ignore Turing completeness, vendor/platform lock-in and planned obsolescence are all symptoms of today's "easy" status quo. Whereas we could have the "simple" MIMD transputer I've discussed endlessly in previous comments that just reconfigures itself to run anything we want at the maximum possible speed. More like how a Star Trek computer might run.
In practice that would mean that a naive for-loop on individual bytes written in C would run the same speed as a highly accelerated shader, because the compiler would optimize the intermediate code (i-code) into its dependent operations and distribute computation across a potentially unlimited number of cores, integrating the results to exactly match a single-threaded runtime.
The hoops we have to jump through between conception and implementation represent how far we've diverged from what computing could be. Modern web development, enterprise software, a la carte microservice hordes like AWS that eventually require nearly every service just to work, etc etc etc, often create workloads which are 90% friction and 10% results.
Just give me the good old days where the runtime gave us everything, no include paths or even compiler flags to worry about, and the compiler stripped out everything we didn't use. Think C for the Macintosh mostly worked that way, and even Metrowerks CodeWarrior tried to have sane defaults. Before that, the first fast language I used, called Visual Interactive Programming (VIP), gave the programmer everything and the kitchen sink. And HyperCard practically made it its mission in life to free the user of as much programming jargon as possible.
I feel like I got more done between the ages of 12 and 18 than all the years since. And it's not a fleeting feeling.. it's every single day. And forgetting how good things were in order to focus on the task at hand now takes up so much of my psyche that I'm probably less than 10% as productive as I once was.
Microcode? I don't think that's how modern μarch works. You can definitely make modern compute accelerators more like a plain CPU and less bespoke, and this is what folks like Tenstorrent and Esperanto Technology are working on (building on RISC-V, an outstanding example of "simple" yet effective tech) but a lot of the distinctive featuresets of existing CPUs, GPUs, FPUs, NPUs etc. are directly wired into the hardware, in a way that can't really be changed.
After seeing it mentioned on here a bunch of times over the years, I finally read Masters of Doom a couple of months back. Great book!
The games they were producing were incredibly exciting to play at the time, but it’s even more inspiring looking back at the history to see what a handful of scrappy kids could create.
Installing to the C drive! Luxury! We swapped those floppies back and forth out of the A and B drives because we had no hard drive and it kept prompting for the part of the compiler swapped out! And we enjoyed it! Kids these days don’t believe you when you rant about it!
Likely just full height, not double height. What has become the normal height for a 5.25" drive is half height. I've seen plenty of full height 5.25" hard drives, but I don't think I've seen double height (would take 4 bays in a modern computer case, if modern computer cases had 4 bays)
I was going to say "my current desktop has 4 CD-capable bays", but I checked and it's actually only 3 bays followed by unused space (and then 2.5" and 3.5" bays below that).
Even if you didn't have that 3.5" drive for your computer, perhaps you can take solace in the fact that you understand intuitively why the disks were called "floppy"!
Mmmm...back when your computer had a smell. You mentioned that hard drive and the memories of computer aroma flashes in my snoot. I had a friend that would seal up his Nintendo to trap the aroma.
I think RHIDE still has source code available, even if it's 23 years since the last update?
The real problem with implementing such a terminal-mode GUI is that Unix/Linux terminals clung to a terminal standard which doesn't provide all the features of an MS-DOS text-mode program. In particular, it can't recognize the Alt key by itself, or Alt+Letter key combinations, or Ctrl+Shift+Left/Right keys.
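The Alt-key problem can be shown concretely. This sketch models the common ESC-prefix convention used by most terminal emulators (the `encode_alt` helper is hypothetical, not any terminal's actual API):

```python
# Why classic terminals can't cleanly report Alt+letter: most emulators
# encode Alt+<key> as ESC followed by the key's bytes, which is
# byte-for-byte identical to pressing Esc and then the key.
ESC = b"\x1b"

def encode_alt(key: bytes) -> bytes:
    """Encode Alt+key using the common ESC-prefix ('meta sends escape') convention."""
    return ESC + key

alt_x = encode_alt(b"x")     # what the terminal sends for Alt+X
esc_then_x = ESC + b"x"      # what it sends for Esc, then X
assert alt_x == esc_then_x   # the application cannot tell them apart
```

Applications like Emacs and vim can only disambiguate with a timing heuristic (a short timeout after a lone ESC), which is why DOS-style Alt-activated menus never mapped cleanly onto classic terminals.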
If you have nostalgia for this, check out Free Pascal. It's an open-source Pascal/Delphi clone, and includes a clone of the blue DOS Turbo Pascal editor that will give you the vibe you crave.
I used it for a few of the Advent of Code days this year, but man the nostalgia wore off. Both the limitations of the IDE and the verbosity of Pascal weren't "fun" to operate in daily coming from the modern world. But definitely sweet memories.
Borland C++ was extremely good: a C++ compiler, standard library, and IDE with debugger that fit in about five megabytes. With the cozy yellow-on-blue Borland colour scheme.
I was quite surprised by how bad MFC was in comparison. Then OWL was replaced by VCL, still quite good, while MFC stayed as bad as ever.
While Borland had an approach that we could get nice high level frameworks in C++ (a sentiment that Qt also shares), Microsoft C++ folks seemed keen in keeping their beloved MFC as low level as possible.
I read somewhere that MFC originally was similar in concept, but faced too much resistance internally and was rewritten, hence the outcome.
And to this day Microsoft hasn't been able to deliver a C++ framework that is as nice to use as those from Borland.
> I read somewhere that MFC originally was similar in concept,
This is what AFX was. They reused some of the core pieces, so the AFX name persisted into MFC.
> but faced too much resistance internally and was rewritten, hence the outcome.
The concern was that developers who had just ascended the Win16 API learning curve would now have to ascend another totally different learning curve to understand the framework. MFC developed from a (supposedly) nice object oriented framework into a way to avoid explicitly passing handles to API calls. (It also replaced message cracking and a few other things.)
By the time Visual C++ rolled around, Microsoft started adding higher level abstractions to MFC and building it back out a bit, but the underlying damage was done.
> And to this day Microsoft hasn't been able to deliver a C++ framework that is as nice to use as those from Borland.
ATL was supposedly quite nice, as was the mostly unsupported WTL derivative (that supported complete app development).
Yeah, that was the thing with AFX; I had kind of forgotten it, thanks.
ATL was, and is, anything but nice, unless one is deeply in love with COM without tooling. To this day Visual Studio still doesn't offer anything to alleviate the pain of dealing with COM, IDL, and generating C++ stubs, because just like with MFC and AFX, it seems there are many internal feuds against having nice tools.
The only time Microsoft finally created something nice for using COM from C++ (C++/CX), those internal feuds managed to kill it and replace it with a developer experience just as bad as ATL (C++/WinRT), and then, when bored, they left the project to play with Rust/WinRT.
It is an excellent book that takes you back through game development in the old days. I like the book very much; I finished reading it while waiting in line at the cafeteria years ago.
Archive: wolfsrc.zip
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of wolfsrc.zip or
wolfsrc.zip.zip, and cannot find wolfsrc.zip.ZIP, period.
The speed you could get on those "box character TUIs with light up Alt-key shortcuts" was insane - I can remember going to Fry's Electronics, where everything was managed by some (probably mainframe) computer that was interacted with via terminals (later a terminal emulator on windows) and the employees who knew it could get you a printed out quote for the cage before the thing had even finished drawing the first screen.
Doom is actually quite easy to build with modern tooling because the source code is already structured to keep the platform-specific parts isolated and the source is mostly vanilla C89 (Doom was developed on NeXT workstations and then 'downported' to DOS).
For instance here's my WASM port of the Doom shareware version. This is a fork of doomgeneric, which itself is a fork of fbDoom - but midway through I noticed that all the abstractions added by fbDoom and doomgeneric are actually not very helpful for a WASM port that should run in browsers, and that I probably would have been better off forking the original source code instead: