(Comments)

Original link: https://news.ycombinator.com/item?id=38858012

Following the news of the recent passing of Niklaus Wirth, many people shared their thoughts about him and their experiences with his work. Some mentioned his lasting contributions to computer science, including Pascal, Oberon, and his principle of minimalism. Others highlighted his influence on programming languages and on educators in the field. Several also expressed appreciation for his book "Algorithms + Data Structures = Programs" and for Pascal's easy-to-understand pointer notation. Overall, Wirth's legacy is seen as significant and deeply appreciated within the programming community.

Related articles

Original text
Niklaus Wirth has died (twitter.com/bertrand_meyer)
1681 points by aarroyoc 14 hours ago | 306 comments


Besides his contribution to language design, he authored one of the best puns ever. His last name is properly pronounced something like "Virt" but in the US everyone calls him by "Worth".

That led him to quip, "In Europe I'm called by name, but in the US I'm called by value."



The joke goes back to Adriaan van Wijngaarden introducing Wirth at a conference in the 1960s. I'd love to see a video of the audience reaction to that one.

https://en.wikiquote.org/wiki/Niklaus_Wirth

https://lists.racket-lang.org/users/archive/2014-July/063519...



Today I learned (from the Wikiquote page) what an obviously socially witty person he seems to have been!

> Finally a short story for the record. In 1968, the Communications of the ACM published a text of mine under the title "The goto statement considered harmful", which in later years would be most frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template: we would see all sorts of articles under the title "X considered harmful" for almost any X, including one titled "Dijkstra considered harmful". But what had happened? I had submitted a paper under the title "A case against the goto statement", which, in order to speed up its publication, the editor had changed into a "letter to the Editor", and in the process he had given it a new title of his own invention! The editor was Niklaus Wirth.

It is refreshing to see the old-fashioned trope of the genius computer scientist / software engineer as a "foreigner to the world" being contested again and again by stories like this.

Of course people like Niklaus Wirth are exceptional in many ways, so it might be that the trope has/had some grain of truth, that just does not co-correlate with the success of said person :)

And of course people might want to argue about the differences between SE, CS and economics.

After all that rambling... RIP and thank you Niklaus!



ACM Historian JAN Lee said he was the origin of that joke, if I recall that conversation correctly.


The joke really only works if you use his first name! The complete joke is that "by value" means pronouncing first and last name to sound like "Nickles Worth".


"worth" alone still means "value" though


I always (and still) understood it to just need the surname.


Said the same here:

https://news.ycombinator.com/item?id=15361069

at the end of the comment.



Looks like he had a good sense of humor too.


He was the go to guy for witty comments.


Dijkstra would consider that one harmful. In America we consider Wirth the to go guy for witty takeaways.


Sweet! Annoyed I didn’t think of that one


Yogi Berra would love you.


worthy takeaways


Yes, Macs do have some worth, though they are harmful.

We should take them away.



Goto the top of the class.


Best CS pun ever. Thanks for sharing !!!


I saw it stated as pronounced as Veert, somewhere, maybe in his Wikipedia or Wikiquote pages.


It's pronounced with an ɪ like in "wit".


Okay, either me or the ref. may have been wrong. but I distinctly remember "Veert", because it was non-intuitive to me (as the way to pronounce Wirth), as a guy who didn't know any German at the time, only English. So the ref., probably.


Besides all his innumerable accomplishments he was also a hero to Joe Armstrong and a big influence on his brand of simplicity.

Joe would often quote Wirth as saying that yes, overlapping windows might be better than tiled ones, but not better enough to justify their cost in implementation complexity.

RIP. He is also a hero for me for his 80th birthday symposium at ETH where he showed off his new port of Oberon to a homebrew CPU running on a random FPGA dev board with USB peripherals. My ambition is to be that kind of 80-year-old one day, too.



> not _better enough_

Wirth was such a legend on this particular aspect. His stance on compiler optimizations is another example: only add optimization passes if they improve the compiler's self-compilation time.

Oberon also, (and also deliberately) only supported cooperative multitasking.



>His stance on compiler optimizations is another example: only add optimization passes if they improve the compiler's self-compilation time.

What an elegant metric! Condensing a multivariate optimisation between compiler execution speed and compiler codebase complexity into a single self-contained meta-metric is (aptly) pleasingly simple.

I'd be interested to know how the self-build times of other compilers have changed by release (obviously pretty safe to say, generally increasing).
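
A minimal sketch of that gate, with hypothetical file and compiler names: time a self-build with and without the candidate pass, and keep the pass only if the rebuild gets faster.

    import subprocess, time

    def self_compile_seconds(compiler):
        # time one self-compilation of the (hypothetical) compiler source
        start = time.perf_counter()
        subprocess.run([compiler, "compiler_source.pas"], check=True)
        return time.perf_counter() - start

    baseline = self_compile_seconds("./compiler_old")    # without the new pass
    candidate = self_compile_seconds("./compiler_new")   # with the new pass
    print("keep the pass" if candidate < baseline else "reject the pass")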



Hmm, but what if the compiler doesn't use the optimized constructs, e.g. floating point optimizations targeting numerical algorithms?


Life was different in the '80s. Oberon targeted the NS32000, which didn't have a floating point unit. Let alone most of the other modern niceties that could lead to a large difference between CPU features used by the compiler itself, and CPU features used by other programs written using the compiler.

That said, even if the exact heuristic Wirth used is no longer tenable, there's still a lot of wisdom in the pragmatic way of thinking that inspired it.



Speaking of that, if you were ever curious how computers do floating point math, I think the first Oberon book explains it in a couple of pages. It’s very succinct and, for me, one of the clearest explanations I’ve found.


Rewrite the compiler to use an LLM for compilation. I'm only half joking! The biggest remaining technical problem is the context length, which is severely limiting the input size right now. Also, the required humongous model size.


Simple fix: floating-point indexes to all your tries. Or switch to base π or increment every counter by e.


That’s not a simple fix in this context. Try making it without slowing down the compiler.

You could try to game the system by combining such a change that slows down compilation with one that compensates for it, though, but I think code reviewers of the time wouldn’t accept that.



probably use a fortran compiler for that instead of oberon


You cannot add a loop skew optimization to the compiler before the compiler needs a loop skew optimization. Which it would not need at all, because it is loop skew optimization (it requires matrix operations) that needs a loop skew optimization.

In short, the compiler is not an ideal representation of the user programs it needs to optimize.



Perhaps Wirth would say that compilers are _close enough_ to user programs to be a decent enough representation in most cases. And of course he was sensible enough to also recognize that there are special cases, like matrix operations, where it might be wirthwhile.

EDIT: typo in the last word but I'm leaving it in for obvious reasons.



Wirth ran an OS research lab. For that, the compiler likely is a fairly typical workload.

But yes, it wouldn’t work well in a general context. For example, auto-vectorization likely doesn’t speed up a compiler much at all, while adding the code to detect where it’s possible will slow it down.

So, that feature never can be added.

On the other hand, it may lead to better designs. If, instead, you add language features that make it easier for programmers to write vectorized code, that might end up being better for them: they would have to write more code, but they would also have to guess less about whether their code would end up being vectorized.



perhaps you could write the compiler using the data structures used by co-dfns (which i still don't understand) so that vectorization would speed it up, auto- or otherwise


Supported cooperative multitasking won in the end.

It just renamed itself to asynchronous programming. That's quite literally what an 'await' is.
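
A small asyncio illustration of that point: each `await` is a cooperative yield, and the two tasks only interleave at those yield points.

    import asyncio

    async def worker(name):
        for i in range(3):
            print(name, i)              # runs uninterrupted until the next await
            await asyncio.sleep(0)      # cooperative yield to the event loop

    async def main():
        await asyncio.gather(worker("a"), worker("b"))

    asyncio.run(main())                 # a 0, b 0, a 1, b 1, a 2, b 2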



It hasn't won. Threads are alive and well and I rather expect async has probably already peaked and is back on track to be a niche that stays with us forever, but a niche nevertheless.

Your opinion vs. my opinion, obviously. But the user reports of the experience in Rust are hardly even close to unanimous praise, and I still say it's a mistake to sit down with an empty Rust program and immediately reach for "async" without considering whether you actually need it. Even in the network world, juggling hundreds of thousands of simultaneous tasks is the exception rather than the rule.

Moreover, cooperative multitasking was given up at the OS level for good and sufficient reasons that I see no evidence that the current thrust in that direction has solved. As you scale up, the odds of something jamming your cooperative loop monotonically increase. At best we've increased the scaling factors, and even that just may be an effect of faster computers rather than better solutions.



in the 02000s there was a lot of interest in software transactional memory as a programming interface that gives you the latency and throughput of preemptive multithreading with locks but the convenient programming interface of cooperative multitasking; in haskell it's still supported and performs well, but it has been largely abandoned in contexts like c#, because it kind of wants to own the whole world. it's difficult to add incrementally to a threads-and-locks program

i suspect that this will end up being the paradigm that wins out, even though it isn't popular today



I was considering making a startup out of my simple C++ STM[0], but the fact that, as you point out, the transactional paradigm is viral and can't be added incrementally to existing lock-based programs was enough to dissuade me.

[0] https://senderista.github.io/atomik-website/



Sounds nifty. Did this take advantage of those Intel (maybe others?) STM opcodes? For a while I was stoked on CL-STMX, which did (as well as implementing a non-native version of the same interface).


nice! when was this? what systems did you build in it? what implementation did you use? i've been trying to understand fraser's work so i can apply it to a small embedded system, where existing lock-based programs aren't a consideration


It grew out of an in-memory MVCC DB I was building at my previous job. After the company folded I worked on it on my own time for a couple months, implementing some perf ideas I had never had time to work on, and when update transactions were


that's exciting! i just learned about hitchhiker trees (and fractal tree indexes, blsm trees, buffer trees, etc.) this weekend, and i'm really excited about the possibility of using them for mvcc. i have no idea how i didn't find out about them 15 years ago!


Meanwhile, in JS/ECMAScript land, async/await is used everywhere and it simplifies a lot of things. I've also used the construct in Rust, where I found it difficult to get the type signatures right, but in at least one other language, async/await is quite helpful.


Await is simply syntactic sugar on top of what everybody was forced to do already (callbacks and promises) for concurrency. As a programming model, threads simply never had a chance in the JS ecosystem because on the surface it has always been a single-threaded environment. There's too much code that would be impossible to port to a multithreaded world.
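
The same sugar relationship can be sketched outside JS too; here is a rough Python asyncio analogue, with a hypothetical fetch() standing in for an async operation.

    import asyncio

    async def fetch():                  # hypothetical async operation
        await asyncio.sleep(0.1)
        return 42

    def callback_style():               # attach the continuation by hand
        loop = asyncio.new_event_loop()
        task = loop.create_task(fetch())
        task.add_done_callback(lambda t: print("got", t.result()))
        loop.run_until_complete(task)
        loop.close()

    async def await_style():            # the same continuation as straight-line code
        print("got", await fetch())

    callback_style()
    asyncio.run(await_style())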


Not in the Java, .NET, and C++ case, as it is mapped to tasks managed by threads, and you can even write your own scheduler if so inclined.


Also (AFAIK) not in JavaScript. An essential property of cooperative multitasking is that you can say “if you feel like it, pause me and run some other code for a while now” to the OS.

Async only allows you to say “run foo now until it has data” to the JavaScript runtime.

IMO, async/await in JavaScript are more like one shot coroutines, not cooperative multitasking.

Having said that, the JavaScript event loop is doing cooperative multitasking (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Even...)



It has mostly won for individual programs, but very much not for larger things like operating systems and web browsers.


In the last 15 to 20 years asynchronous programming --- as a form of cooperative multi-tasking [1] --- did gain lots of popularity. That was mainly because of non-scalable thread implementations in most language runtimes, e.g. the JVM. At the same time the JS ecosystem needed to have some support for concurrency. Since threads weren't even an option, the community settled first on callback hell and then on async/await. The aforementioned reason for asynchronous programming's alleged win is currently being reversed: the JVM has introduced lightweight threads that have the low runtime cost of asynchronous programming and all the niceties of thread-based concurrency.

[1]: Asynchronous programming is not the only form of cooperative programming. Usually cooperative multi-tasking systems have a special system call yield() which gives up the processor in addition to io induced context-switches.



In .NET and C++, asynchronous programming is not cooperative: it hides the machinery of a state machine mapping tasks onto threads, it gets preempted, and you can write your own scheduler.


But isn't the separation of the control flow into chunks, whether delimited by async/await or by the separation between call and callback, a form of cooperative thread yielding on top of preemptive threads? If that isn't true for .NET, then I'd be really interested to understand what else it is doing.


Mostly won for CRUD apps (yes and a few others). Your DAW, your photo editor, your NLE, your chatbot girlfriend, your game, your CAD, etc might actually want to use more than one core effectively per task. Even go had to grow up eventually.


It's moving in more and more.

A core problem is that it's now clear most apps have hundreds or thousands of little tasks going, increasingly bound by network, IO, and similar. Async gives nice semantics for implementing cooperative multitasking, without introducing nearly as many thread coherency issues as preemptive.

I can do things atomically. Yay! Code literally cooperates better. I don't have the messy semantics of a Windows 3.1 event loop. I suspect it will take over more and more into all walks of code.

Other models are better for either:

- Highly parallel compute-bound code (where SIMD/MIMD/CUDA-style models are king)

- Highly independent code, such as separate apps, where there are no issues around cooperation. Here, putting each task on a core, and then preemptive, obviously wins.

What's interesting is all three are widely used on my system. My tongue-in-cheek comment about cooperative multitasking winning was only a little bit wrong. It didn't quite win in the sense of taking over other models, but it's in widespread use now. If code needs to cooperate, async sure beats semaphores, mutexes, and all that jazz.
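
A tiny asyncio sketch of that "atomic by default" property: two tasks bump a shared counter with no mutex, because control only changes hands at the awaits.

    import asyncio

    counter = 0

    async def bump(n):
        global counter
        for _ in range(n):
            counter += 1            # the read-modify-write can't be interrupted
            await asyncio.sleep(0)  # control is only handed over here

    async def main():
        await asyncio.gather(bump(1000), bump(1000))
        print(counter)              # 2000, with no locks in sight

    asyncio.run(main())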



Async programming is not an alternative to semaphores and mutexes. It is an alternative to having more threads. The substantial drawback of async programming in most implementations is that stack traces and debuggers become almost useless; at least very hard to use productively.


Indeed; however, the experience with crashes and security exploits has proven that scaling processes, or even distributing them across several machines, scales much better than threads.


preemptively scheduled processes, not cooperatively scheduled


Ah, missed that.


async/await has the advantage over cooperative multitasking that it has subroutines of different 'colors', so you don't accidentally introduce concurrency bugs by calling a function that can yield without knowing that it can yield

i think it's safe to say that the number of personal computers running operating systems without preemptive multitasking is now vanishingly small

as i remember it, oberon didn't support either async/await or cooperative multitasking. rather, the operating system used an event loop, like a web page before the introduction of web workers. you couldn't suspend a task; you could only schedule more work for later



(To set the tone clearly - this seems like an area where you know a _lot_ more than me, so any questions or "challenges" below should be considered as "I am probably misunderstanding this thing - if you have the time and inclination, I'd really appreciate an explanation of what I'm missing" rather than "you are wrong and I am right")

I don't know if you're intentionally using "colour" to reference https://journal.stuffwithstuff.com/2015/02/01/what-color-is-... ? Cooperative multitasking (which I'd never heard of before) seems from its Wikipedia page to be primarily concerned with Operating System-level operations, whereas that article deals with programming language-level design. Or perhaps they are not distinct from one another in your perspective?

I ask because I've found `async/await` to just be an irritating overhead; a hoop you need to jump through in order to achieve what you clearly wanted to do all along. You write (pseudocode) `var foo = myFunction()`, and (depending on your language of choice) you either get a compilation or a runtime error reminding you that what you really meant was `var foo = await myFunction()`. By contrast, a design where every function is synchronous (which, I'd guess, more closely matches most people intuition) can implement async behaviour when (rarely) desired by explicitly passing function invocations to an Executor (e.g. https://www.digitalocean.com/community/tutorials/how-to-use-...). I'd be curious to hear what advantages I'm missing out on! Is it that async behaviour is desired more-often in other problem areas I don't work in, or that there's some efficiency provided by async/await that Executors cannot provide, or something else?



> I ask because I've found `async/await` to just be an irritating overhead

Then what you want are coroutines[1], which are strictly more flexible than async/await. Languages like Lua and Squirrel have coroutines. I and plenty of other people think it's tragic that Python and JavaScript added async/await instead, but I assume the reason wasn't to make them easier to reason about, but rather to make them easier to implement without hacks in existing language interpreters not designed around them. Though Stackless Python is a CPython fork that adds real coroutines, also available as the greenlet module in standard CPython [2], amazing that it works.

[1] Real coroutines, not what Python calls "coroutines with async syntax". See also nearby comment about coroutines vs coop multitasking https://news.ycombinator.com/item?id=38859828

[2] https://greenlet.readthedocs.io/en/latest/
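
For the curious, a minimal greenlet sketch (assuming the greenlet package is installed): any plain function can hand off control and be resumed later, with no async coloring involved.

    from greenlet import greenlet

    def task_a():
        print("A: start")
        gr_b.switch()               # hand control to B
        print("A: resumed")

    def task_b():
        print("B: start")
        gr_a.switch()               # hand control back to A

    gr_a = greenlet(task_a)
    gr_b = greenlet(task_b)
    gr_a.switch()                   # prints: A: start, B: start, A: resumed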



Bliss 36 and siblings had native coroutines.

We used coroutines in our interrupt-rich environment in our real-time medical application way back when. This was all in assembly language, and the coroutines vastly reduced our multithreading errors to effectively zero. This is one place where C, claimed to be close to the machine, falls down.



interesting, i didn't even realize bliss for the pdp-10 was called bliss-36

how did coroutines reduce your multithreading errors



well some of the things i know are true but i don't know which ones those are; i'll tell you the things i know and hopefully you can figure out what's really true

yes! i'm referencing that specific rant. except that what munificent sees as a disadvantage i see as an advantage

there's a lot of flexibility in systems design to move things between operating systems and programming languages. dan ingalls in 01981 takes an extreme position in 'design principles behind smalltalk' https://www.cs.virginia.edu/~evans/cs655/readings/smalltalk....

> An operating system is a collection of things that don't fit into a language. There shouldn't be one.

in the other direction, tymshare and key logic's operating system 'keykos' was largely designed, norm hardy said, with concepts from sigplan, the acm sig on programming languages, rather than sigsosp

sometimes irritating overhead hoops you need to jump through have the advantage of making your code easier to debug later. this is (i would argue, munificent would disagree) one of those times, and i'll explain the argument why below

in `var foo = await my_function()` usually if my_function is async that's because it can't compute foo immediately; the reasons in the examples in the tutorial you linked are making web requests (where you don't know the response code until the remote server sends it) and reading data from files (where you may have to wait on a disk or a networked fileserver). if all your functions are synchronous, you don't have threads, and you can't afford to tie up your entire program (or computer) waiting on the result, you have to do something like changing my_function to return a promise, and putting the code below the line `var foo = await my_function()` into a separate subroutine, probably a nested closure, which you pass to the promise's `then` method. this means you can't use structured control flow like statement sequencing and while loops to go through a series of such steps, the way you can with threads or async

so what if you use threads? the example you linked says to use threads! i think it's a widely accepted opinion now (though certainly not universal) that shared-mutable-memory threading is the wrong default, because race conditions in multithreaded programs with implicitly shared mutable memory are hard to detect and prevent, and also hard to debug. you need some kind of synchronization between the threads, and if you use semaphores or locks like most people do, you also get deadlocks, which are hard to prevent or statically detect but easy to debug once they happen

async/await guarantees you won't have deadlocks (because you don't have locks) and also makes race conditions much rarer and relatively easy to detect and prevent. mark s. miller, one of the main designers of recent versions of ecmascript, wrote his doctoral dissertation largely about this in 02006 http://www.erights.org/talks/thesis/index.html after several years working on an earlier programming language called e based on promises like the ones he later added to js; but i have to admit that, while i've read a lot of his previous work, i haven't read his dissertation yet

cooperative multitasking is in an in-between place; it often doesn't use locks and makes race conditions somewhat rarer and easier to detect and prevent than preemptive multitasking, because most functions you call are guaranteed not to yield control to another thread. you just have to remember which ones those are, and sometimes it changes even though your code didn't change

(in oberon, at least the versions i've read about, there was no way to yield control. you just had to finish executing and return, like in js in a web page before web workers, as i think i said upthread)

that's why i think it's better to have colored functions even though it sometimes requires annoying hoop-jumping



> async/await guarantees you won't have deadlocks

You will get them in .NET and C++, because they map to real threads being shared across tasks.

There is even a FAQ maintained by the .NET team regarding gotchas like not calling ConfigureAwait with the right thread context in cases where it needs to be explicitly configured, like a task moving between foreground and background threads.



And these fancy new names aren't there just for hiding the event loop? :)


Sort of and sort of not.

The key thing about 2023-era asynchronous versus 1995-era cooperative multitasking is code readability and conciseness.

Under the hood, I'm expressing the same thing, but Windows 3.1 code was not fun to write. Python / JavaScript, once you wrap your head around it, is. The new semantics are very readable, and rapidly improving too. The old ones were impossible to make readable.

You could argue that it's just syntactic sugar, but it's bloody important syntactic sugar.



I never left 1991 and I haven't seen anything that has made me consider leaving ConcurrentML except for the actor model, but that is so old the documentation is written on parchment.


Exactly. The way I think about it, the "async" keyword transforms function code so that local variables are no longer bound to the stack, making it possible to pause function execution (using "await") and resume it at an arbitrary time. Performing that transformation manually is a fair amount of work and it's prone to errors, but that's what we did when we wrote cooperatively multitasked code.
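
A loose sketch of that transformation: the async def keeps its locals across pauses for free, while the hand-rolled cooperative version (hypothetical names) has to hoist its state out of the stack and expose an explicit resume step.

    import asyncio

    async def countdown(n):
        while n:
            print(n)
            n -= 1
            await asyncio.sleep(0)   # pause point; the local n is preserved

    def make_countdown(n):           # the manual, pre-async equivalent
        state = {"n": n}
        def step():                  # call repeatedly from your scheduler
            if state["n"] == 0:
                return False
            print(state["n"])
            state["n"] -= 1
            return True
        return step

    asyncio.run(countdown(3))
    step = make_countdown(3)
    while step():
        pass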


Coroutines are better than both. Particularly in reasoning about code.


if the implied contrast is with cooperative multitasking, it's exactly the opposite: they're there to expose the event loop in a way you can't ignore. if the implied contrast is with setTimeout(() => { ... }, 0) then yes, pretty much, although the difference is fairly small—implicit variable capture by the closure does most of the same hiding that await does


Not asking about old JavaScript vs new JavaScript. Asking about explicit event loop vs hidden event loop with fancy names like timeout, async, await...


do you mean the kind of explicit loop where you write

    for (;;) {
        int r = GetMessage(&msg, NULL, 0, 0);
        if (!r) break;
        if (r == -1) croak();
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
or, in yeso,

      for (;;) {
        yw_wait(w, 0);
        for (yw_event *ev; (ev = yw_get_event(w));) handle_event(ev);
        redraw(w);
      }
async/await doesn't always hide the event loop in that sense; python asyncio, for example, has a lot of ways to invoke the event loop or parts of it explicitly, which is often necessary for integration with software not written with asyncio in mind. i used to maintain an asyncio cubesat csp protocol stack where we had to do this

to some extent, though, this vitiates the concurrency guarantees you can otherwise get out of async/await. software maintainability comes from knowing that certain things are impossible, and pure async/await can make concurrency guarantees which disappear when a non-async function can invoke the event loop in this way. so i would argue that it goes further than just hiding the event loop. it's like saying that garbage collection is about hiding memory addresses: sort of true, but false in an important sense
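
one such explicit invocation, sketched with a hypothetical coroutine: synchronous, non-asyncio code can spin up a loop just long enough to drive a single piece of asyncio-based code to completion

    import asyncio

    async def read_telemetry():      # hypothetical coroutine from an asyncio stack
        await asyncio.sleep(0.1)
        return b"\x00\x01"

    def poll_once():                 # ordinary synchronous code
        loop = asyncio.new_event_loop()
        try:
            return loop.run_until_complete(read_telemetry())
        finally:
            loop.close()

    print(poll_once())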



What worries me is we may have a whole generation who doesn't know about the code you posted above and thinks it's magic or worse, real multiprocessing.


okay but is that what you meant by 'hiding the event loop' or did you mean something different


I always knew my experience with RISC OS wouldn't go to waste!


>Supported cooperative multitasking won in the end.

Is this the same as coroutines as in Knuth's TAOCP volume 1?

Sorry, my knowledge is weak in this area.





Thanks, will check that.


The quick answer is that coroutines are often used to implement cooperative multitasking because it is a very natural fit, but it's a more general idea than that specific implementation strategy.


interesting, i would have said the relationship is the other way around: cooperative multitasking implies that you have separate stacks that you're switching between, and coroutines are a more general idea which includes cooperative multitasking (as in lua) and things that aren't cooperative multitasking (as in rust and python) because the program's execution state isn't divided into distinct tasks

i could just be wrong tho



Yeah thinking about it more I didn’t intend to imply a subset relationship. Coroutines are not only used to implement cooperative multitasking, for sure.


Note that Oberon descendants like Active Oberon and Zonnon do have preemptive multitasking.


In the past, this policy of Wirth's has been cited when talking about go compiler development.

Go team member Robert Griesemer did his PhD under Mössenböck and Wirth.



Do you happen to remember where he said that? I've been looking for a citation and can't find one.

I think that some of the text in "16.1. General considerations" of "Compiler Construction" are sorta close, but does not say this explicitly.





> ... his 80th birthday symposium at ETH where he showed off his new port of Oberon to a homebrew CPU running on a random FPGA dev board with USB peripherals.

This was a fantastic talk. https://www.youtube.com/watch?v=EXY78gPMvl0



Thank you for sharing. I was there and didn’t expect to see this again. :)

He had the crowd laughing and cheering, and the audience questions in the end were absolutely excellent.



Always glad to be of service.

I think I last watched it during the pandemic and was inspired to pick up reading more about Oberon. A demonstration / talk like that is so much better when the audience are rooting for the presenter to do well.



According to his daughter (she runs a grocery store, and my wife occasionally talks to her), he kept on tinkering at home well past 80.


A Wirthwhile ambition. :)

Sorry, couldn't resist.

I first wrote it as "worthwhile", but then the pun practically fell out of the screen at me.

I love Wirth's work, and not just his languages. Also his stuff like algorithms + data = programs, and stepwise refinement. Like many others here, Pascal was one of my early languages, and I still love it, in the form of Delphi and Free Pascal.

RIP, guruji.

Edited to say guruji instead of guru, because the ji suffix is an honorific in Hindi, although guru is already respectful.



i hope you are! we miss you


I'm a former student of his. He was one of the people that made me from a teenager that hacked on his keyboard to get something to run to a seasoned programmer that thinks before he codes.

Even before I met him at the university I was programming in Oberon because there was a big crowd of programmers doing Wirth languages on the Amiga.

He will be missed.



I'm also a student of his, and later met him socially on a few occasions as a graduate student (in a different institute).

Undergraduate students were all in awe of him, but I got the impression that he did not particularly enjoy teaching them (Unlike other professors, however, he did not try to delegate that part of his responsibilities to his assistants). He seemed to have a good relationship with his graduate students.

In his class on compiler construction, he seemed more engaged (the students were already a bit more experienced, and he was iterating the Oberon design at the time). I remember an exchange we had at the oral exam — he asked me to solve the "dangling ELSE" problem in Pascal. I proposed resolving the ambiguity through a refinement of the language grammar. He admitted that this would probably work, but thought it excessively complex and wondered where I got that idea, since he definitely had not taught it, so I confessed that I had seen the idea in the "Dragon Book" (sort of the competition to his own textbook). Ultimately, I realized that he just wanted me to change the language to require an explicit END, as he had done in Modula-2 and Oberon.

Socially, he was fun to talk to, had a great store of computer lore, of course. He was also much more tolerant of "heresies" in private than in public, where he came across as somewhat dogmatic. Once, the conversation turned to Perl, which I did not expect him to have anything good to say about. To my surprise, he thought that there was a valid niche for pattern matching / text processing languages (mentioning SNOBOL as an earlier language in this niche).



Yes, the Amiga was one of the platforms where Modula-2 had quite a crowd, more so than on the PC, as we got spoiled with Turbo Pascal instead.


which languages? I've just restored an Amiga 500 with Workbench 2.1 and I'd love to honor his memory.


Modula2 was available and got used on Amiga. Silly teenager me found such high level languages "cheating" at the time.


Lords of the Rising Sun was written in Modula2 on the Amiga

https://www.google.com/search?q=lords+of+the+rising+sun

My understanding (please correct me) is that Turbo Pascal on PC was actually Modula2 ?



No, Borland did have a Modula-2 compiler (which Martin Odersky, of Scala fame, actually worked on), but they decided to focus on Turbo Pascal and sold it.


In the recent discussion here about Turbo Pascal, commenters said it was written in assembly. That seems to be supported by https://en.wikipedia.org/wiki/Turbo_Pascal


At least several Pascal, Modula-2, and Oberon-2 compilers.

My very first compiled programming language was Pascal. I got the free "PCQ Pascal" from the Fish disks as I wasn't able to get the C headers from Commodore which I would have needed for doing proper Amiga programming. Likewise later Oberon-A although I don't remember where I got that from.

There were also commercial Modula-2 and Oberon-2 compilers. I just found that the Modula-2 compiler was open sourced some years back. https://m2amiga.claudio.ch/

Aminet has directories for Oberon and Modula-2 related programs: https://aminet.net/dev/obero and https://aminet.net/dev/m2



begin

this is terrible news;

is there a better source than twitter (edit: https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html thanks to johndoe0815);

wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain, and moore seems to have given the reins at greenarrays to a younger generation;

young people may not be aware of the practical, as opposed to academic, significance of his work, so let me point out that begin

the ide as we know it today was born as turbo pascal;

most early macintosh software was written in pascal, including for example macpaint;

robert griesemer, one of the three original designers of golang, was wirth's student and did his doctoral thesis on an extension of oberon, and wirth's languages were also a very conspicuous design inspiration for newsqueak;

tex is written in pascal;

end;

end.



> wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand;

And yet far from the last. Simple, correct, and beautiful software is still being made today. Most of it goes unnoticed, its quiet song drowned out by the cacophony of attention-seeking, complex, brittle behemoths that top the charts.

That song never faded, you just need to tune in.



who are today's new great minimalists?


In no particular order: 100r.co, OpenBSD (& its many individual contributors such as tedu or JCS), Suckless/9front, sr.ht, Alpine, Gemini (&gopher) & all the people you can find there, Low Tech Magazine, antirez, Fabrice Bellard, Virgil Dupras (CollapseOS), & many other people, communities, and projects - sorry I don't have a single comprehensive list, that's just off the top of my head ;)


i... really don't think kris de decker is on niklaus wirth's level. i don't think he can write so much as fizzbuzz

fabrice bellard is wirth-level, it's true. not sure about tedu and jcs, because i'm not familiar enough with their work. it's absurd to compare most of the others to wirth and hoare

you're comparing kindergarten finger paintings to da vinci

you said wirth was "far from the last" apostle of simplicity. definition of apostle: https://en.wikipedia.org/wiki/Apostle



I'd also mention https://www.piumarta.com


ooh, yeah, now that's a possibility


I would add Jochen Liedtke (unfortunately he passed away already more than 20 years ago) as inventor of the L4 microkernel.

Several research groups continued work on L4 after Liedtke's death (Hermann Härtig in Dresden, Gernot Heiser in Sydney, a bit of research at Frank Bellosa's group in Karlsruhe and more industrial research on L4 for embedded/RT systems by Robert Kaiser, later a professor in Wiesbaden), but I would still argue that Liedtke's original work was the most influential, though all the formal verification work in Sydney also had significant impact - but that was only enabled by the simplicity of the underlying microkernel concepts and implementations.



agreed, though i think l4 was more influential than eumel (which is free software now by the way) even though eumel preceded l4


uxn ftw


uxn is cool but it's definitely not the same kind of achievement as oberon, pascal, quicksort, forth, and structured programming; rek and devine would surely not claim it was


to quote gp, "beautiful software is still being made today". It's not a competition.


you don't get to be described as an 'apostle of simplicity' just because you like simplicity. you have to actually change the world by creating simplicity. devine and rek are still a long way from a turing award


From your Wikipedia link about the meaning of the Word Apostle:

“The term [Apostle] is also used to refer to someone who is a strong supporter of something.[5][6]“

So I would call many people and myself (as someone who started studying Computer Science with Assembler and Modula-2) Apostle of simplicity.

No need for techno-classism.



you don't get to dictate who does or doesn't get recognized for creating awesome works that influence and inspire others. take your persistent negativity elsewhere.

btw, uxn is absolutely the exemplification of "software built for humans to understand" and simplicity. I mean...

> the resulting programs are succinct and translate well to pen & paper computing.

> to make any one program available on a new platform, the emulator is the only piece of code that will need to be modified, which is explicitly designed to be easily implemented

https://100r.co/site/uxn_design.html

how one can frame this as trivial is beyond me.



i don't think uxn is trivial, i think it's a first step toward something great. it definitely isn't the exemplification of "software built for humans to understand"; you have to program it in assembly language, and a stack-based assembly language at that. in that sense it's closer to brainfuck than to hypertalk or excel or oberon. it falls short of its goal of working well on small computers (say, under a megabyte of ram and under a mips)

the bit you quote about uxn having a standard virtual machine to permit easy ports to new platforms is from wirth's 01965 paper on euler http://pascal.hansotten.com/niklaus-wirth/euler-2/; it isn't something devine and rek invented, and it may not have been something wirth invented either. schorre's 01963 paper on meta-ii targets a machine-independent 'fictitious machine' but it's not turing-complete and it's not clear if he intended it to be implemented by interpretation rather than, say, assembler macros

i suggest that if you develop more tolerance for opinions that differ from your own, instead of deprecating them as 'persistent negativity' and 'dictating', you will learn more rapidly, because other people know things you don't, and sometimes that is why our opinions differ. sometimes those things we know that you don't are even correct

i think this is one of those cases. what i said, that you were disagreeing with, was that uxn was not the same kind of achievement as oberon, pascal, quicksort, forth, and structured programming (and, let me clarify, a much less significant achievement) and that it is a long way from [meriting] a turing award. i don't see how anyone could possibly disagree with that, or gloss it as 'uxn is trivial', as you did, unless they don't know what those things are

i am pretty sure that if you ask devine what he thinks about this comment, you will find that he agrees with every word in it



There wasn't a single place I asserted that uxn is specifically novel or unprecedented. In fact, Devine's own presentation[0] about uxn specifically cites Wirth and Oberon, among countless other inspirations and examples. I'm saying it's awesome, accessible, simple and open.

I don't need to "develop more tolerance for differing opinions" - I have no problem with them and am completely open to them, even from people who I feel are communicating in an unfriendly, patronizing or gatekeeping manner. rollcat shared some other people and projects and you took it upon yourself to shoot down as much as possible in that comment - for what purpose? No one said Drecker is "on Wirth's level" when it comes to programming. We don't need him to write FizzBuzz, let alone any other software. I'm sorry you don't recognize the value of a publication like Low-Tech Magazine, but the rest of us can, and your need to shoot down that recognition is why I called your messages persistently negative.

Further, when I give kudos to uxn and recognize it as a cool piece of software, there's absolutely no point in coming in and saying "yeah but it's no big deal compared to ____" , as if anyone was interested in some kind of software achievement pissing contest. The sanctity and reverence for your software idols is not diluted nor detracted from by acknowledging, recognizing and celebrating newer contributors to the world of computing and software.

I have to come back and edit this and just reiterate: All I originally said was "uxn ftw" and you found it necessary to "put me in my place" about something I didn't even say/assert, and make it into some kind of competition or gatekeeping situation. Let people enjoy things. And now, minimizing this thread and never looking at it again.

[0] https://100r.co/site/weathering_software_winter.html



Yeah, these younguns have a lot to learn. :-) The notion that there's something innovative about using a small VM to port software is hilarious. BTW, here is a quite impressive and effective use of that methodology: https://ziglang.org/news/goodbye-cpp/

Dewey Schorre and Meta II, eh? Who remembers such things? Well, I do, as I was involved with an implementation of Meta V when I was on the staff of the UCLA Comp Sci dept in 1969.



nice! which direction did meta-v go meta in?

i recently reimplemented meta-ii myself with some tweaks, http://canonical.org/~kragen/sw/dev3/meta5ixrun.py

i guess i should write it up



I use Suckless's terminal, st. It's great. To install it, I compile it from source. It takes a couple seconds. Take a look at their work.


do you think writing st is an achievement that merits a turing award

because i've also written a terminal emulator, and it compiles from source in 0.98 seconds https://gitlab.com/kragen/bubbleos/blob/master/yeso/admu-she... screenshot of running vi in it at https://gitlab.com/kragen/bubbleos/blob/master/yeso/admu_she...

(steps to reproduce: install dependencies; make; rm admu-shell admu-shell-fb admu-shell-wercam admu admu.o admu-shell.o admu_tv_typewriter.o; time make -j. it only takes 0.37 seconds if you only make -j admu-shell and don't build the other executables. measured on debian 12.1 on a ryzen 5 3500u)

i wrote pretty much all of it on october 12 and 13 of 02018 so forgive me if i don't think that writing a terminal emulator without scrollback is an achievement comparable to pascal and oberon etc.

not even if it were as great as st (and actually admu sucks more than st, but it can still run vi and screen)



No better source yet, I think.

But it is the real account of Bertrand Meyer, creator of the Eiffel language.



Niklaus Wirth's death was also announced (by Andreas Pirklbauer) an hour ago on the Oberon mailing list:

https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html



thank you

dang, maybe we can change the url to this instead? this url has been stable for at least 14 years (http://web.archive.org/web/20070720035132/https://lists.inf....) and has a good chance of remaining stable for another 14, while the twitter url is likely to disappear this year or show different results to different people



yeah, and i hope meyer would know

but still, it's twitter, liable to vanish or block non-logged-in access at any moment



Since Twitter is suppressing the visibility of tweets that link outside their site I think it would be perfectly fair to block links to twitter, rewrite them to nitter, etc. There also ought to be gentle pressure on people who post to Twitter to move to some other site. I mean, even I've got a Bluesky invite now.


Please don't make dang do more work.

https://news.ycombinator.com/item?id=38847048



bluesky seems like the site for people who think that the problem with twitter was that the wrong billionaire gets to decide which ideas to suppress

(admittedly you could make the same criticism of hn; it certainly isn't decentralized and resilient against administrative censorship like usenet was)



Well I didn't mean to just endorse Bluesky but call it out as one of many alternatives.

I'm actually active on Mastodon but I am thinking about getting on Instagram as well because the content I post that does the best on Mastodon would fit in there.



Do you know about Pixelfed?


it won't surprise you to learn that i like mastodon but haven't used it in months


Martin Odersky, creator of the Scala language and Wirth's student, also seems to believe it: https://twitter.com/odersky/status/1742618391553171866


I've been a massive fan of the PhD dissertation of Wirth's student Michael Franz since I first read it in '94. He's now a professor at UC Irvine, where he supervised Andreas Gal's dissertation work on trace trees (what eventually became TraceMonkey)


thank you, i definitely should have mentioned franz's work, even though i didn't know he was gal's advisor

perhaps more significant than tracemonkey was luajit, which achieves much higher performance with the tracing technique



> tex is written in pascal;

Just thought about that when Donald Knuth's Christmas lecture https://www.youtube.com/live/622iPkJfYrI led me to one of his first TeX lectures https://youtu.be/jbrMBOF61e0 : If I install TeX on my Linux machine now, is that still compiled from the original Pascal source? Is there even a maintained Pascal compiler anymore? Well, GCC (as in GNU compiler collection) probably has a frontend, but that still does not answer the question about maintenance.

These were just thoughts. Of course researching the answers would not be overly complicated.



> If I install TeX on my Linux machine now, is that still compiled from the original Pascal source?

If you install TeX via the usual ways–TeX Live and MikTeX are the most common—then the build step runs a program (like web2c) to convert the Pascal source (with changes) to C, then uses a C compiler. (So the Pascal source is still used, but the Pascal "compiler" is a specialized Pascal-to-C translator.) But there is also TeX-FPC (https://ctan.org/pkg/tex-fpc), a small set of change (patch) files to make TeX compilable with the Free Pascal compiler (https://gitlab.com/freepascal.org/fpc/).

For more details see https://tex.stackexchange.com/questions/111332/how-to-compil...



> Is there even a maintained Pascal compiler anymore?

Of course

https://www.freepascal.org/



Nice. Although "latest news" are from 2021.


It's not like Pascal changes a whole lot.


> wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand

Absolutely!

And equally important was his ability to convey/teach CS precisely, concisely and directly in his books/papers. None of them have any fluff nor unnecessary obfuscation in them. These are the models to follow and the ideals to aspire to.

As an example see his book Systematic Programming: An Introduction.



> wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain

Also Alan Kay still with us.



in the neat/scruffy divide, which goes beyond ai, wirth was the ultimate neat, and kay is almost the ultimate scruffy, though wall outdoes him

alan kay is equally great, but on some axes he is the opposite extreme from wirth: an apostle of flexibility, tolerance for error, and trying things to see what works instead of planning everything out perfectly. as sicp says

> Pascal is for building pyramids—imposing, breathtaking, static structures built by armies pushing heavy blocks into place. Lisp is for building organisms—imposing, breathtaking, dynamic structures built by squads fitting fluctuating myriads of simpler organisms into place.

kay is an ardent admirer of lisp, and smalltalk is even more of an organism language than lisp is



I had wanted to interview Val Schorre [1], and looked him up on a business trip because I was close. Died 2017, seems like a righteous dude.

https://www.legacy.com/us/obituaries/venturacountystar/name/...

[1] https://en.wikipedia.org/wiki/META_II



yeah, i wish i had had the pleasure of meeting him. i reimplemented meta-ii 3½ years ago and would recommend it to anyone who is interested in the parsing problem. it's the most powerful non-turing-complete 'programming language' i've ever used

http://www.canonical.org/~kragen/sw/dev3/meta5ixrun.py

(i mean i would recommend reimplementing it, not using my reimplementation; it takes a few hours or days)

after i wrote it, the acm made all old papers, including schorre's meta-ii paper, available gratis; they have announced that they also plan to make them open-access, but so far have not. still, this is a boon if you want to do this. the paper is quite readable and is at https://dl.acm.org/doi/10.1145/800257.808896



It is a good paper, and I give much respect for ACM opening up their paywall of old papers. They even stopped rate limiting downloads. I'd like to think my incessant whining about this had some effect. :) It is such a wonderful thing for curious people everywhere to be able to read these papers.

I haven't reimplemented meta-ii, I will.

You might like https://old.reddit.com/r/rust/comments/18wnqqt/piccolo_stack...

And https://www.youtube.com/@T2TileProject/videos



thanks! i like lua a lot despite its flaws; that's what i wrote my own literate programming system in. lua's 'bytecode' is actually a wordcode (a compilation approach which i think wirth's euler paper was perhaps the first published example of) and quite similar in some ways to wirth's risc-1/2/3/4/5 hardware architecture family

i hope they do go to open access; these papers are too valuable to be lost to future acm management or bankruptcy



Are you familiar with the 'Leo' editor? It is the one that comes closest to what I consider to be a practically useful literate programming environment. If you haven't looked at it yet I'd love it if you could give it a spin and let me know what you make of it.

https://leo-editor.github.io/leo-editor/



What are the lua's flaws in your opinion? Sincere question.


there are a lot of design decisions that are pretty debatable, but the ones that seem clearly wrong to me are:

- indexing from 1 instead of 0;

- the absence of a nilness/nonexistence distinction (so misspelling a variable or .property silently gives the wrong answer instead of an exception);

- variables being global by default, rather than local by default or requiring an explicit declaration;

- printing tables by default (with %q for example) as identities rather than contents. (you could argue that this is a simplicity thing; lua 5.2 is under 15000 lines of code, which is pretty small for such a full-featured language, barely bigger than original-awk at 6200 lines of c and yacc plus 1500 lines of awk, and smaller than mawk at 16000 lines of c, 1100 lines of yacc, and 700 lines of awk. but a recursive table print function with a depth limit is about 25 lines of code.)

none of these are fatal flaws, but with the benefit of experience they all seem like clear mistakes



As far as I know, Henry Baker is still with us. I had a dream where I interviewed Wirth for like 20 hrs so we could clone him with an LLM. We need to grab as much video interviews from folks as possible.


henry baker has made many great contributions, but last time i talked to him, he was waiting for somebody to start paying him again in order to do any more research

but i'm sure he'd agree his achievements are not in the same league as wirth's



I wasn't trying to compare them in any other way other than, that Henry Baker is still with us.


cool, sorry if i came across as confrontational


> wirth was the greatest remaining apostle of simplicity, correctness, and software built for humans to understand; now only hoare and moore remain…

No. There is another.

https://en.m.wikipedia.org/wiki/Arthur_Whitney_%28computer_s...



Some would dispute the "built for humans to understand". Whitney's work is brilliant, but it's far less accessible than Wirth's.


The point of Whitney's array languages is to allow your solutions to be so small that they fit in your head. Key chunks should even fit on one screen. A few years ago, Whitney reportedly started building an OS using these ideas (https://aplwiki.com/wiki/KOS).


I'm aware of the idea. I'm also aware that I can read and understand a whole lot of pages of code in the amount of time it takes me to decipher a few lines of K for example, and the less dense codes sticks far better in my head.

I appreciate brevity, but I feel there's a fundamental disconnect between people who want to carefully read code symbol by symbol, who often seem to love languages like J or K, or at least be better able to fully appreciate them, and people like me who want to skim code and look at the shape of it (literally; I remember code best by its layout and often navigate code by appearance without reading it at all, and so dense dumps of symbols are a nightmare to me)

I sometimes think it reflects a difference between people who prefer maths vs languages. I'm not suggesting one is better than the other, but I do believe the former is a smaller group than the latter.

For my part I want grammar that makes for a light, casual read, not having to decipher. I want to be able to get a rough understanding with a glance, and gradually fill in details, not read things start to finish (yes, I'm impatient)

A favourite example of mine is the infamous J interpreter fragment, where I'd frankly be inclined to prefer a disassembly over the source code. But I also find the ability to sketch out such compact code amazing.

I think Wirth's designs very much fit in the "skimmable and recognisable by shape and structure" category of languages. I can remember parts of several of Wirth's students' PhD theses from the 1990s by the shape of procedures in their Oberon code to this day.

That's not to diminish Whitney's work, and I find that disconnect in how we process code endlessly fascinating, and regularly look at languages in that family because there is absolutely a lot to learn from them, but they fit very different personalities and learning styles.



> I sometimes think it reflects a difference between people who prefer maths vs languages. I'm not suggesting one is better than the other, but I do believe the former is a smaller group than the latter.

This dichotomy exists in mathematics as well. Some mathematicians prefer to flood the page with symbols. Others prefer to use English words as much as possible and sprinkle equations here and there (on their own line) between paragraphs of text.

The worst are those that love symbols and paragraphs, writing these dense walls of symbols and text intermixed. I’ve had a few professors who write like that and it’s such a chore to parse through.



i keep hoping that one day i'll understand j or k well enough that it won't take me hours to decipher a few lines of it; but today i am less optimistic about this, because earlier tonight, i had a hard time figuring out what these array-oriented lines of code did in order to explain them to someone else

    textb = 'What hath the Flying Spaghetti Monster wrought?'
    bits = (right_shift.outer(array([ord(c) for c in textb]),
                              arange(8))).ravel() & 1
and i wrote them myself three months ago, with reasonably descriptive variable names, in a language i know well, with a library i've been using in some form for over 20 years, and their output was displayed immediately below, in https://nbviewer.org/url/canonical.org/~kragen/sw/dev3/rando...

i had every advantage you could conceivably have! but i still guessed wrong at first and had to correct myself after several seconds of examination

i suspect that in j or k this would be something like (,(@textb)*.$i.8)&1 though i don't know the actual symbols. perhaps that additional brevity would have helped. but i suspect that, if anything, it would have made it worse

by contrast, i suspect that i would have not had the same trouble with this

    bits = [(ord(c) >> i) & 1 for c in textb for i in range(8)]
however, as with rpn, i suspect that j or k syntax is superior for typing when you're immediately evaluating expressions rather than writing a program to maintain later, because the amount of finger typing is so much less. but maybe i just have a hard time with point-free style? or maybe, like you say, it's different types of people. or maybe i just haven't spent nearly enough time writing array code during those years
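
For what it's worth, the point-free numpy version and the list comprehension above do produce the same bit sequence; a quick check, assuming only that numpy is installed:

    import numpy as np

    textb = 'What hath the Flying Spaghetti Monster wrought?'

    # the point-free numpy version: shift every code point right by 0..7, keep the low bit
    a = (np.right_shift.outer(np.array([ord(c) for c in textb]),
                              np.arange(8))).ravel() & 1

    # the explicit list comprehension: same bits, character-major, LSB first
    b = [(ord(c) >> i) & 1 for c in textb for i in range(8)]

    assert list(a) == b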


The k solution is ,/+|(8#2)\textb

k doesn't have a right shift operator, but you don't need that, you can use the base encoding operator instead

Personally I think this is clearer than both the array-ish python and the list comp.

https://ngn.codeberg.page/k/#eJwrSa0oSbJSCs9ILFEA4gyFkoxUBbe...
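
To unpack that a bit, here is one way to read ,/+|(8#2)\textb step by step, translated into numpy. This is just my reading of each verb, not an authoritative account of k semantics:

    import numpy as np

    textb = 'What hath the Flying Spaghetti Monster wrought?'
    codes = np.array([ord(c) for c in textb])

    # (8#2)\textb : encode each code point in 8 base-2 digits; for a vector
    # argument this yields one row per bit position, most significant first
    digits = np.array([(codes >> (7 - i)) & 1 for i in range(8)])   # shape (8, len)

    # |  : reverse the rows, so the least significant bit row comes first
    lsb_first = digits[::-1]

    # +  : transpose, giving one 8-bit row per character (LSB first)
    per_char = lsb_first.T                                          # shape (len, 8)

    # ,/ : flatten into one long bit vector, matching the earlier list comprehension
    bits = per_char.ravel()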



I'm sitting here with the K reference manual, and I still can't decode this.

I'm wildly guessing that your approach somehow ends up doing something closer to this:

    textb.bytes.flat_map{_1.digits(2)}
Which, I must admit, took me embarrassingly long to think of.


thank you very much! which k implementation does this page use?

oh, apparently a new one called ngn/k: https://codeberg.org/ngn/k



As much as it's a struggle to read, you have to love that the people implementing this are dedicated enough to write their C as if it were K.


I think the K would likely be both simpler and harder than your first example: it reads very straightforwardly in a single direction, but the operators read like line noise. My Numpy is rusty, but I think this is the Ruby equivalent of what you were doing?

    textb = 'What hath the Flying Spaghetti Monster wrought?'

    p textb.bytes.product((0...8).to_a).map{_1>>_2}.map{_1 & 1}
Or with some abominable monkey patching:

    class Array
      def outer(r) = product(r.to_a)
      def right_shift = map{_1>>_2}
    end

    p textb.bytes.outer(0...8).right_shift.map{_1 & 1}
I think this latter is likely to be a closer match to what you'd expect in an array language in terms of being able to read in a single direction and having a richer set of operations. We could take it one step further and break the built-in Array#&:

    class Array
      def &(r) = map{_1 & r}
    end

    p textb.bytes.outer(0...8).right_shift & 1
Which is to say that I don't think the operator-style line-noise nature of K is what gives it its power. Rather that it has a standard library that is fashioned around this specific set of array operations. With Ruby at least, I think you can bend it towards the same Array nature-ish. E.g. a step up from the above that at least contains the operator overloading and instead coerces into a custom class:

    textb = 'What hath the Flying Spaghetti Monster wrought?'
    
    class Object
      def k = KArray[*self.to_a]
    end
    
    class String
      def k = bytes.k
    end

    class KArray < Array
      def outer(r) = product(r.to_a).k
      def right_shift = map{_1>>_2}.k
      def &(r) = map{_1 & r}.k
    end

    p textb.k.outer(0...8).right_shift & 1
With some care, I think you could probably replicate a fair amount of K's "verbs" and "adverbs" (I so hate their naming) in a way that'd still be very concise but not line-noise concise.


that all seems correct; the issue i had was not that python is less flexible than ruby (though it is!) but that it required a lot of mental effort to map back from the set of point-free array operations to my original intent. this makes me think that my trouble with j and k is not the syntax at all. but conceivably if i study the apl idiom list or something i could get better at that kind of thinking?


I think you could twist Python into getting something similarly concise one way or another ;) It might not be the Python way, though. I agree it often is painful to map. I think in particular the issue for me is visualizing the effects once you're working with a multi-dimensional set of arrays. E.g. I know what outer/product does logically, but I have to think through the effects in a way I don't need to do with a straightforward linear map(). I think I'd have been more likely to have ended up with something like this if I started from scratch, even if it's not as elegant:

    p textb.bytes.map{|b| (0...8).map{|i| (b>>i) & 1} }.flatten
EDIT: This is kind of embarrassing, but we can of course do just this:

    textb.bytes.flat_map{_1.digits(2)}
But I think the general discussion still applies, and it's quite interesting how many twists and turns it took to arrive at that (with the caveat that Integer#digits doesn't zero-pad, so unlike the earlier versions this doesn't emit exactly 8 bits per character)


well, simplicity anyway, arguably (like moore) to an even higher degree than wirth


It's OK on Hacker News to dis a reputable news source now?


Niklaus Wirth was also responsible for changing the title of Dijkstra's paper to "Goto Statement Considered Harmful".

https://en.wikipedia.org/wiki/Considered_harmful#cite_ref-6



Relevant excerpt of Dijkstra's own account (from EWD1308 [1]):

Finally a short story for the record. In 1968, the Communications of the ACM published a text of mine under the title "The goto statement considered harmful", which in later years would be most frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template: we would see all sorts of articles under the title "X considered harmful" for almost any X, including one titled "Dijkstra considered harmful". But what had happened? I had submitted a paper under the title "A case against the goto statement", which, in order to speed up its publication, the editor had changed into a "letter to the Editor", and in the process he had given it a new title of his own invention! The editor was Niklaus Wirth.

[1] Transcription - https://www.cs.utexas.edu/%7EEWD/transcriptions/EWD13xx/EWD1... PDF - https://www.cs.utexas.edu/%7EEWD/ewd13xx/EWD1308.PDF



And it continues to this day!


A sad day. He was a titan of computing and still deserved even more attention than he got. If his languages had been more prevalent in software development, a lot of things would be in better shape.

After playing around a bit with Basic on the C64/128, Pascal became the first "real" programming language I learned, in the form of UCSD Pascal on an Apple II at my school as well as Turbo Pascal 3.0 on an IBM PC (no AT or any fanciness yet). Actually a Portable PC with a built-in amber CRT.

When I got my Amiga 500, Modula-2 was a very popular language on the Amiga, and the M2Amiga system was actually the most robust dev environment. I still think fondly of that time, as Modula-2 made it so easy to develop structured and robust programs. The module concept was quite ahead of its time, while the C world kept recompiling header files for many years to come. Today, Go has picked up a lot from Modula-2, one reason I immediately jumped onto it. Not by chance: Robert Griesemer was a student of Wirth.

During the '90s, while MS-DOS was still in use, Turbo Pascal was still the main go-to language on the PC for everyone, as it was powerful yet approachable for non-fulltime software developers. It picked up a lot of extensions from Modula-2 too and also had a nice object system. It peaked with versions 6 and 7. Probably to this day my favorite development environment, partially because of the unmatched speed of a pure character-based UI. Turbo Pascal combined a nice development environment with a language that struck a great compromise between power and simplicity.

Unfortunately, I was only vaguely familiar with his later work on Oberon. I ran the Oberon system natively on my 386 for some toying around. It was extremely impressive with its efficiency and full GUI in the era of DOS on the PC. A pity it didn't receive more attention. It probably would have been very successful if it had gained traction before the late '80s; in the early '90s, of course, Windows came along.

From a purist's point of view, the crowning achievement was of course when he really earned the job title of "full stack developer": not only designing Oberon the language and the OS, but the CPU to run it as well. Very impressive and of huge educational value.

END.



Prof Wirth was a major inspiration for me as a kid. I eagerly read his book on Pascal, at the time not appreciating how unusual it was for its elegance and simplicity. I also followed with interest his development of the Oberon language and Lilith workstation. When I was 13, he gave a talk not too far away, I think it might have been Johns Hopkins, and my dad took me to it. It was a wonderful experience, he was very kind and encouraging, as I think the linked photo[1] shows.

[1]: https://mastodon.online/@raph/111693863925852135



Great story. Thanks for sharing.


Wirth was the chief designer of the programming languages Euler (1965), PL360 (1966), ALGOL W (1966), Pascal (1970), Modula (1975), Modula-2 (1978), Oberon (1987), Oberon-2 (1991), and Oberon-07 (2007). He was also a major part of the design and implementation team for the operating systems Medos-2 (1983, for the Lilith workstation), and Oberon (1987, for the Ceres workstation), and for the Lola (1995) digital hardware design and simulation system. In 1984, he received the Association for Computing Machinery (ACM) Turing Award for the development of these languages.


I still wonder what the tech world would’ve been like today if Wirth had had the marketing sense to call Modula “Pascal 2”


…also if he hadn’t insisted on uppercase keywords.


That wasn't the issue many make it out to be, thanks to programmers' editors.

We already had autoformatting tools in the early 1990's, Go did not invent them.



I'm kind of a fan of Lola, an easy-to-learn HDL which was inspired by Pascal/Oberon vs. Verilog (inspired by C) and VHDL, inspired by Ada.

I like Wirth's whole software stack: RISC-5 (not to be confused with RISC-V) implemented in Lola, Oberon the language, and Oberon the environment. IIRC Lola can generate Verilog - I think the idea was that students could start with an FPGA board and create their own CPU, compiler, and OS.

I also like his various quips - I think he said something like "I am a professor who is a programmer, and a programmer who is a professor." We need more programmer/professors like that. Definitely an inspiration for systems people everywhere.



Also collaborated with Apple on Object Pascal initial design, his students on Component Pascal, Active Oberon, Zonnon, and many other research projects derived from Oberon.


For those who don't know, Pascal was what a lot of the classic Mac software was written in, before Objective-C and Swift. It grew into Delphi, which was a popular low-code option on Windows.


I wouldn’t describe Delphi as low code, it is an IDE. Wikipedia also describes it like this[1] and does not include it in its list of low code development platforms[2].

[1]: https://en.m.wikipedia.org/wiki/Delphi_(software)

[2]: https://en.m.wikipedia.org/wiki/List_of_low-code_development...



It was a RAD platform though. From following your links:

> Low-code development platforms trace their roots back to fourth-generation programming language and the rapid application development tools of the 1990s and early 2000s.

> Delphi was originally developed by Borland as a rapid application development tool for Windows as the successor of Turbo Pascal.



It still is, and got a new release last month.


I wouldn’t know, I was like a Borland fan…


It's a shame that Pascal was largely abandoned (except for Delphi, which lived on for a while); I believe several Pascal compilers supported array bounds checking, and strings with a length field. In the 1980s this may have been considered overly costly (and perhaps it is considered so today as well), but the alternative that the computing field and industry picked was C, where unbounded arrays and strings were a common source of buffer overflow errors. Cleaning this up has taken decades and we still probably aren't done.

Better C/C++ compilers and libraries can help, but the original C language and standard library were certainly part of the issue. Java and JavaScript (etc.) may have their issues but at least chasing down pointer errors usually isn't one of them.



The industry picked C when Pascal was still widely supported, not as a result of it being abandoned.


A side effect of UNIX adoption: C was already in the box, whereas anything else would cost money, and no famous dialect (Object Pascal, VMS Pascal, Solaris Pascal, UCSD Pascal) was portable.

Unfortunately Pascal only mattered to legions of Mac and PC developers.



Delphi still lives on, to the extent that there is enough people to sell conference tickets in Germany, and a new release came out last month.


AFAIK, even Photoshop was originally written in Pascal.


The Photoshop 1.0.1 source code is available from the Computer History Museum: https://computerhistory.org/blog/adobe-photoshop-source-code...

Comments: https://news.ycombinator.com/item?id=17132058



It was, according to Sean Parent (Adobe employee) in an interview about Pascal (around 8:03): https://adspthepodcast.com/2023/12/29/Episode-162.html


Pascal was the second language after Basic. I was always interested in learning Modula, but picked up Delphi instead.


Pascal was the second language I learned after Fortran. I didn't particularly like Fortran but Pascal really hit home and motivated me to learn C.


Love motivated me to learn Pascal; money motivated me to learn C.


I learned Pascal and MODULA-2 in college, in my first two programming semesters. MODULA-2 was removed shortly afterwards but Pascal is still used in the introductory programming course. I'm very happy to have had these as the languages that introduced me to programming and Wirth occupies a very special place in my heart. His designs were truly ahead of their time.


I had Pascal and some Modula as well (on concurrent programming course).

I learned C++ later on my own, as a Pascal with bizarre syntax. I always felt like the semantics of C++ were taken entirely from Pascal. No two languages ever felt closer to each other to me. Like one was just a reskin of the other.



I have already told this story multiple times: when I came to learn C, I already knew Turbo Pascal, from 4.0 up to 6.0. Luckily, the same teacher who was teaching us about C also had access to Turbo C++ 1.0 for MS-DOS.

I adopted C++ right away as the sensible path beyond Turbo Pascal for cross-platform code, and never saw a use for C's primitive and insecure code, beyond being asked to use it in specific university projects and some jobs during the dotcom wave.

On Usenet C vs C++ flamewars, there might be still some replies from me on the C++ side.



I learned C that way (the algorithms class was in C), and even had a little printout table of the different syntaxes for the same constructs (here's how you write a for, an if, a record, declare a variable, etc.). At the time I remember thinking that the C syntax was much uglier, and that opinion has stayed with me since -- when I learned Python everything just seemed so natural.


I started my first company based on Delphi, which itself was based on Turbo Pascal. Wirth was a great inspiration, and his passing is no small loss. May his work keep inspiring new programmers for generations to come.

One of his quotes: "Whereas Europeans generally pronounce my name the right way ('Ni-klows Wirt'), Americans invariably mangle it into 'Nick-les Worth'. This is to say that Europeans call me by name, but Americans call me by value."



He was indeed! I wrote my bachelor's thesis on bringing modularity to a language for monitoring real-time systems, and his work, especially on MODULA-2, was a huge source of inspiration.


Wirth must have adopted the quote (how could he not), but it actually goes back to a clever line by someone introducing him at a conference.

https://news.ycombinator.com/item?id=38858993



What conference was it? Edit: Nvm, saw your other comment[0].

[0]: https://news.ycombinator.com/item?id=38858993



From a comment I left on Mastodon:

He gave a talk at the CHM (he was inducted as a fellow in 2004). I got to talk with him and was really struck that someone who had had such a huge impact was so approachable. When another person in the group challenged Modula-2, he listened respectfully, engaged on the assumption that the speaker's premise was true, then nicely dissented based on objective observations. I hope I can always be that respectful when challenged.



A sad day for the history of computing, and the loss of a great language designer who influenced many of us toward better ways of approaching systems programming.


It is sad, but the guy had a long and fulfilling life many can only dream about. I would raise a toast to that. Hopefully he is in a coding Valhalla.


Not just coding: he was also interested* in hardware and built whole machines.

(* might Carver Mead describe him as a metaphorical "tall, thin, person"?)



he very well might, dave. i miss talking to you


I am not very sad. Death is part of life.

I'm much more sad when life sort of decays (Alzheimer's, dementia, or simply becoming slow/stupid/decrepit), ends early, or when life is simply wasted.

He was about to turn 90.

He led a long, impactful, fulfilling life.

That's a life to celebrate.



This is beautiful phrasing, very much how I think about life myself. Let's hope the last days of Mr Wirth were free from physical pain. Thinking of my grandpa who died all of a sudden, apparently without serious physical impairments or aches, at age 90, after a happy, well-lived, ethical life.

Heaven is happier by one person now for sure, again. And maybe some compilers over there also need tinkering. Rest in peace, Mr Wirth.



Pascal was my first "real" language after Basic, learned it in the late 80s, wrote a couple of small apps for my dad in it.

Learned most of it from a wonderful book whose name I have forgotten, it had a wrench on its cover, I think?

Anyway, still rocking Pascal to this day, since I still maintain 3 moderately complex installers written with InnoSetup, which uses RemObjects Pascal as a scripting language.

4 years ago, a new guy on our team, fresh from school, who never even knew this language existed, picked up Pascal in a week, and started maintaining and developing our installers much further. He did grumble a bit about the syntax but otherwise did a splendid job. I thought that was a tribute to the simplicity of the language.



> Pascal was my first "real" language after Basic, learned it in the late 80s

Me too, word for word. I spent a few years in my pre-teens immersed in the Turbo Pascal IDE, which was a full-on educational tool of its own that explained everything about the language. I moved on to C after that, but I still get a nostalgic vibe from looking at Pascal syntax. It was a great foundational experience for me as a programmer.



Pascal was my second language, after BASIC. I was about twelve, and pointers took me a little while to understand. But the first hurdle was not having line numbers. It seemed weird.

In the end, it was definitely worth the effort, and I learnt good habits from it. I used it in college, and I suppose I kinda still do, because I do a lot of PL/SQL.

He was hugely important for generations of coders.

RIP.



now would be a good time to read this in his memory: https://cr.yp.to/bib/1995/wirth.pdf

Also, his Oberon system provides a rich seam to mine. This, from a symposium held at ETH Zurich on the occasion of his 80th birthday in 2014, is a whirlwind retrospective. "Reviving a computer system of 25 years ago" https://www.youtube.com/watch?v=EXY78gPMvl0

One of the greats.



When I first got to play with Turbo Pascal (3.something?), I was more impressed by the concise expression of the language in the EBNF in the manual than by Turbo Pascal itself. It was what made me interested in parsers and compilers, and both Wirth's approach to them and the work his students undertook in his research group have been an interest of mine for well over 30 years since.


R.I.P. Niklaus Wirth. Your ideas, languages and designs were the inspiration for several generations of computer scientists and engineers. Your Lilith computer and Modula-2 language kindled a small group of students in Western Siberia's Akademgorodok to create KRONOS - an original RISC processor-based computer, with its own OS and Modula-2 compiler, and lots of tools. I was very lucky to join the KRONOS team in 1986 as a 16-year-old complete beginner, and this changed my life forever as I became obsessed with programming. Thank you, Niklaus.


I hold an old print of his Pascal language report near and dear on my bookshelf. He bootstrapped Oberon with one peer in 1-2 years.

His preference for clarity over notational fanciness inspired so many of us.

The Pascal family of languages is not only syntactically unambiguous to the compiler, it is also clear and unambiguous to humans. The Carbon successor to C++ strives for the same, IIRC.



After having read some of the comments on Pascal here -- fellow HNers, what's your view on Pascal as a teaching/introductory language in 2023, for children aged 10+? Mostly thinking of FreePascal, but TurboPascal in DOSBox/FreeDOS/SvarDOS is also a possibility.

I'm also thankful for references to "timeless" Pascal books or online teaching materials that would be accessible for a 10+ year old kid who is fine with reading longer texts.

(My condolences are below, fwiw. His death is, interestingly, a moment of introspection for me, even if I'm just a hobbyist interested in small systems and lean languages.)



Niklaus Wirth is most famous for Pascal, but his best language is his last, namely Oberon, which is both smaller and more capable than Pascal. If you are interested in a freestanding compiler (separate from the Oberon operating system), have a look at OBNC.

https://miasap.se/obnc/



I just needed a feature of Pascal yesterday in one of my Rust libraries: ranged integers. I know, you can go about it in different ways, like structs with private fields and custom constructors, or just with a new generic type. But, having the ability to specify that some integer can only be between 15..25 built-in is a fantastic language feature. That's even so with runtime bounds checking disabled because the compiler would still complain about some subset of violations.
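
For illustration, here is a minimal sketch of the "private field plus checked constructor" workaround mentioned above, written in Python for brevity. The names are made up for this example, and unlike Pascal's built-in subrange types (e.g. 15..25) the check only happens at runtime:

    class Ranged:
        """An integer constrained to a closed range, checked at construction."""
        def __init__(self, value, lo=15, hi=25):
            if not lo <= value <= hi:
                raise ValueError(f"{value} not in {lo}..{hi}")
            self._value = value   # "private" field: only set via the checked constructor

        def __int__(self):
            return self._value

    x = Ranged(20)        # fine
    # Ranged(42)          # raises ValueError at runtime, not at compile time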

What an innovator and a role model. I hope I can be as passionate about my work in my 80s as he was.



Not only did Pascal (Turbo Pascal, more precisely) teach me about systems programming with a safer language, it was also my first foray into type-driven programming: learning to use the type system to express conditions that cannot occur.

Ranged numerics and enumerations were part of that.



I owe a debt of gratitude to him and the Pascal programming language. Sincerest condolences to those he left behind.


The greatest of all quiche eaters has just passed away. May he rest in peace. https://www.pbm.com/~lindahl/real.programmers.html But seriously, PASCAL was the first programming language I loved that's actually good. Turbo Pascal. Delphi. Those were the days. We got a better world thanks to the fact that Niklaus Wirth was part of it.


One of the titans of the era, Pascal greatly contributed to my love of programming and my eventual career. Rest in peace Dr. Wirth.


I don't see obituaries yet. In the meantime: https://en.wikipedia.org/wiki/Niklaus_Wirth


I loved the book Compiler Construction. Wirth's emphasis on minimalism is a huge inspiration.


I haven't read that one yet, but "Algorithms + Data Structures = Programs" is just an absolutely beautiful gem of a book. It embodies his principles of simplicity and clarity. Even though it's outdated in many places, I adored reading it.


Speaking of which, that book is one of the very few sources I could find that talks about recursive descent error recovery and goes further than panic mode.
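
For anyone curious what this kind of recovery looks like in practice, here is a rough, self-contained sketch of stop-symbol/synchronization-set recovery in a tiny recursive-descent parser, in Python. It only illustrates the general idea; it is not the specific scheme from the book:

    # Toy grammar: expr = term {'+' term} ; term = digit | '(' expr ')'
    def tokenize(src):
        return [c for c in src if not c.isspace()] + ['$']   # '$' marks end of input

    class Parser:
        def __init__(self, tokens):
            self.toks, self.pos, self.errors = tokens, 0, []

        def peek(self):
            return self.toks[self.pos]

        def advance(self):
            self.pos += 1

        def error(self, msg, sync):
            # record the error, then skip tokens until one we know how to resume at
            self.errors.append(f"{msg}, got {self.peek()!r}")
            while self.peek() not in sync:
                self.advance()

        def expect(self, tok, sync):
            if self.peek() == tok:
                self.advance()
            else:
                self.error(f"expected {tok!r}", sync)

        def expr(self, sync):
            self.term(sync | {'+'})
            while self.peek() == '+':
                self.advance()
                self.term(sync | {'+'})

        def term(self, sync):
            if self.peek().isdigit():
                self.advance()
            elif self.peek() == '(':
                self.advance()
                self.expr(sync | {')'})
                self.expect(')', sync)
            else:
                self.error("expected digit or '('", sync)

    p = Parser(tokenize("1 + (2 + ) + 3"))
    p.expr({'$'})
    print(p.errors)   # one error reported; parsing resumes and the rest is consumed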


That's the other one I keep hearing about, must read it.


There's also an interesting book "A model implementation of standard Pascal" - not by Wirth - that implements an ISO Standard Pascal compiler in Standard Pascal, written as a "literate program" with copious descriptions of every single aspect. Basically, the entire book is one large program in which non-code is interspersed as very long multiline comments.

I couldn't find it available for download anywhere, but Internet Archive has a version of it in the library that can be checked out for free: https://archive.org/details/modelimplementat0000wels/page/n9...



Here's my transcription of that book: https://2k38.be/misp/

compiler.pas can be compiled with a modern Pascal compiler, but the resulting compiler cannot compile itself. I don't know if that's caused by a transcription error, a bug in the modern compiler or a bug in the Model Implementation.

I would love it if somebody gets this working. I don't think I myself will continue with this project.



RIP, and thanks for helping indirectly to put me on my career path.

I learned pascal fairly late in the grand scheme of things (basic->6502 assembly->C and then eventually pascal) but it was used for the vast majority of my formal CS education first by instruction, then by choice, and eventually in my first real programming job. The later pascal dialects remain IMHO far better than any other languages I write/maintain/guide others in using. Like many others of his stature it was just one of his many hits. Niklaus Wirth is one of the giants I feel the industry stands on, and I thank him for that.

"All those moments will be lost in time, like tears in rain..."



Compiler Construction was and is one of my all-time favorite books on the matter. You can't put it down once you start, it is that good.

https://people.inf.ethz.ch/wirth/CompilerConstruction/Compil...



I'm happy to say I got to meet him, thanks to Charles Simonyi. May his memory be a blessing.


I was at the session he did at CERN back in 2004, but sadly never managed to talk to him.


Wirth made one of the most critical observations in the whole history of computing: as hardware improves, software becomes more complicated to compensate, slowing things down even further.


This is a huge loss in computer science. Everyone interested in computing, no matter if using other languages than Pascal or derivatives, should read his "Algorithms + Data Structures = Programs" book. R.I.P.


ADSP podcast (named after Wirth's book) just had an episode on the history of Pascal https://adspthepodcast.com/2023/12/29/Episode-162.html




Rest in peace.


I really appreciate his work. He had a full life. Since yesterday, without knowing it, I have been studying a section of a book detailing the code generation of one of the first Pascal compilers for the CDC 6400.


My very first language was Pascal. I have since forgotten it, but distinctly remember the feeling computers are fun! And the red pascal book. Thank you Niklaus, for all the fun and impact you had on subsequent languages.


Rest in peace. I owe a lot to his work.

"Algorithms + Data Structures = Programs" was a seminal book for me when I was learning about software development and it has influenced how I think about programming. Also, Pascal (in its various dialects) was my main language for many years on multiple platforms (CP/M, MS-DOS, Windows).



I bought the Art of Computer Programming volume 4A a few years ago and didn't even start reading it. 1-3 I read when I… god, my youngest child is almost that age now.

I think tonight is the time to start on 4A, before we lose Knuth too.

And as I picked it down I noticed that, almost by coincidence, AoCP stood next to Wirth's PiM2. It wasn't intentional but it feels very right. There's a set of language books that end with Systems Programming with Modula 3, the Lua book. Thinking Forth, PiM2, then a gap, then the theory-ish section starts with five volumes of Knuth. Sigh.



Here's a fantastic interview with him (in case you speak German): https://de.wikipedia.org/wiki/Datei:ProfessorNiklausWirth.we...


Barely a week ago I was quoting from a paper of his here on HN, "A plea for lean software". It's been discussed here before:

https://news.ycombinator.com/item?id=27661559

RIP



RIP King. 2nd language I learned was Pascal (Turbo 5 then 6) in high school. Tried UCSD P-System from a family friend with corporate/educational connections on 5.25" but didn't have a manual, and this was before the internet. I could/should have tried to use the library to get books about it, but gave up.

Fond memories; I feel like the 90s kids were the last ones to really get to appreciate Pascal in a "native" (supportive, institutional) setting.

I also loved learning Oberon/Bluebottle (now A2 I guess), which I was so fascinated with. I think that and Plan 9's textual/object interface environments are super interesting and a path we could have taken (may converge to someday?)



Pascal pointer notation was so logical. Dereferencing a pointer was just a caret to the right of the variable.

A double pointer was just two carets. And so on.

There was a strict symmetry about the whole thing. C broke that, especially with struct pointers.



RIP. After Basic (on a commodore), I learned Pascal (turbo pascal) in high school of all places.


He was one of the most influential personalities in our field, most modern programming languages descend from and contain many of his ideas.


Coming from ZX Spectrum at home and seeing the beauty of Turbo Pascal on an IBM PC-compatible has greatly contributed to my love of programming. R.I.P., Professor Wirth.

