(comments)

Original link: https://news.ycombinator.com/item?id=39004526

Based on the given material, here are some notable differences and connections:

1. On the idea that simplicity is the hallmark of a good programming language, the discussion covers languages such as Go, D, Assembler, C++, Rust, and Clojure. While Go appears simple, D and Assembler, with their nested functions, show syntactic complexity. To judge simplicity, the author suggests looking back at past syntax and considering factors such as ease of learning, consistency, understandability, compactness, orthogonality, predictability, minimalism, modularity, intuitive use, and efficient execution. These qualities are essential for producing elegant designs that stand the test of time. Moreover, simplicity is not merely an opinion or a subjective factor, but an important component of a programming language. One counterexample given is that Pascal took only eight months to develop and was widely adopted within three months of release. By contrast, C# (pronounced "C sharp") took ten years and, despite Microsoft's backing, has still not been universally accepted. Similarly, C++ took about five years to create but has kept evolving for nearly forty years. Hence simplicity matters when choosing a programming language.

2. Regarding Go specifically, the author points out that although it looks simple, its simplicity is relatively superficial. Its fundamentals derive from Newsqueak, a language created as an alternative to SNOBOL4's sprawling architecture and intended to address the limitations of Prolog and Lisp. Although Newsqueak failed to win users, it informed Go, whose development team consisted of members with practical engineering expertise. The author holds that the litmus test of a programming language's simplicity is its compilation speed; languages such as Oberon, Go, and D are characterized by fast compilation. This test alone is insufficient, however; what matters more is how easily complex expressions can be understood, and the ability to understand what others have written.

3. The author raises concerns about Rust and stresses that the real challenge lies not in writing concise code but in pairing it with the appropriate infrastructure. They suggest that if Rust had a built-in interpreter, in line with traditional languages such as OCaml, it might become more comfortable to use. In addition, the author recommends...


Original text
Niklaus Wirth, or the Importance of Being Simple (acm.org)
267 points by madmax108 1 day ago | 111 comments










The article mentions Philippe Kahn (Borland co-founder) as a student of Wirth; I had never heard that before. The podcast [1] confirms it. Borland's decision to buy Compass Pascal may well have been influenced by Kahn's early impressions.

[00:07:12] ... I remember you had a choice between two programming language and on one side they taught Fortran and Fortran is the language of science, or it was the language of scientists.

[00:07:40] And then there was this new class that was started by this Professor Niklaus Wirth about Pascal. And it was, I think the first or second year it was taught. There were a lot of people in the Fortran class and not that many people in the Pascal class. So I said, oh, I'll go to the Pascal class.

[00:07:59] And that's how I met Professor Wirth. And that was great. That was my favorite class from that moment because he's such a, such an enlightened person and a clear thinker that it was a great, great experience for me.

[1] https://ethz.ch/en/news-and-events/eth-news/news/2022/05/we-...



> in the sense that the limitations and exclusions of the language design were precisely what made compact implementations possible and widely successful.

All of the Pascals that were widely successful extended the language in key ways. I was an initial fan of Pascal, until I discovered that a large amount of my time was spent trying to work around limitations of the language.

With C, those workarounds disappeared and I was far more productive.

(I know C compilers always have numerous extensions, most of them of dubious value, but still, plain C remains a far, far more useful language than the Pascal of "Pascal User Manual and Report". Which is why C buried Pascal.)



As I said in another forum earlier this week,

"A lot of the hate centered around how it wasn’t a systems programming language, when it wasn’t supposed to be one. It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification."

After BASIC, I learned on Pascal (Apple and IBM). It was invaluable to clarify in my mind how programming structures work, as opposed to the (fun) chaos of early BASICs. I really didn't need much more than the I/O available in standard Pascal at the time. And it hid details like endianness that I was not yet ready to handle.

Were there problems? Of course. Among others, the vendors should have done a lot more to standardize extensions, the P1 parameterized array bounds should have been in the initial spec, and while the P-machines had many virtues, performance was not one of them. Far too many of the early implementations just ran the P-code in an interpreter.



While there may be general agreement that Pascal is a great teaching language and a weak systems language, I believe that early versions of the Mac operating system (System N, not OS X) relied heavily on a modified version of Pascal. Perhaps Steve did not get the memo.


Because Clascal/Object Pascal had an enormous number of non-standard extensions to make it a much more systems- and apps-oriented language than the 1973 version. I believe Bill Atkinson was involved in the decision. Same story with Turbo Pascal and Delphi.

Walter's complaint is partially that it needed extensions to do anything non-educational on 1973 hardware. This is true, but to me it's vaguely nonsensical, as the language clearly had not been designed for that and was labelled as such.



It's been well over 40 years since I wrote Wirth Pascal code, but one of the problems was that one couldn't write a line without a line ending. That's not an activity confined to systems programming!


Sure. But for the output of the tiny calculator class project we did, it just didn't matter.


Yes, Pascal is good for that purpose.


Many of the decisions for Pascal seem aimed for a teaching language as opposed to a production language. For instance, in a production language, optimisation of generated code is worth a longer compile cycle, but in a teaching language (where programs are repeatedly resubmitted until they barely run, and then are never touched again) short* compile cycles are everything and quality of generated code is an afterthought.

* and cheap: I remember in the days of 30 engineers sharing a VAX that if one person was compiling at a time everything was snappy, but (especially near the end of quarters!) if ten people tried compiling at once interactive latency suffered greatly. Classroom use must've been even worse.



Here's what Wirth said about it in retrospect, and yes, it was explicitly designed as a teaching language with a syntax suitable for a recursive-descent one-pass compiler: http://youtu.be/5RyU50qbvzQ


Haha, so true! The Caltech PDP-10 slowed to an agonizing crawl the week before finals. Even though playing games on it was banned for that week.


> Many of the decisions for Pascal seem aimed for a teaching language as opposed to a production language.

Because it was. The fact that it could be extended in so many ways into a production language shows it had 'good bones', but many practical issues, like I/O, were left as an 'exercise for the student'.

There seems to have been a lot of revisionist history around Wirth's passing, with people using Pascal's limitations as an indictment of his PLT creds, virtually all of which ignores that he was an academic working in an academic environment on topics that interested him, at the virtual beginning of programming, on very, very limited machines. It's like calling Watt a hack because he didn't also add a supercharger and emissions controls to the steam engine.



> For instance, in a production language, optimisation of generated code is worth a longer compile cycle, but in a teaching language (...) short* compile cycles are everything and quality of generated code is an afterthought.

I don't think this is a valid take. It sounds like an attempt to rationalize away awful build times from some systems language. C++ has atrocious build times, but in contrast C compiles almost instantly. Other production languages such as C# and Java also have fast build times.

I don't think long build times are correlated with optimization or production-ready code. They are, however, linked with attempts to get compilers to do a lot of work just to get our code to generate machine code, and some of that work is due to aspects of a programming language that suck and require extra work just to pull off an optimization.



> but in contrast C compiles almost instantly

was that the case 30 years ago?



For some compilers, yes.

Also, code was a lot shorter in those days.



Original Pascal was a teaching language. That's all.


Are you saying that Pascal dialect had WriteLn but not Write?


No, it has write(), but some versions on some OSs needed an eol to flush the buffers. In general the file ops were deliberately underspecified [1], but it was 1973 and the variety of OSs was much weirder than today.

[1] https://www.standardpascaline.org/The_Programming_Language_P...



> It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification.

It's more the other way around: learning to drive on a trench digger. Then you find out that what you really need on the places people actually drive, like a freeway, is a car.



Agreed. Pascal was quick to understand, and I got into a productive flow.

Then suddenly I realized there was stuff I couldn't do with it. That was the last day I used Pascal - and I remember it as my dearest experience in programming!



> It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification.

Correct. But then you get hired for a trench-digging job, and told that you have to use the car, because reasons. And so you wind up really hating the car, when the problem was the other people who decided that it was the right tool for that job.



> But then you get hired for a trench-digging job, and told that you have to use the car, because reasons.

That's not a problem caused by the car. That's a problem caused by someone looking at a worksheet, seeing "dig a trench" on it, and promptly saying "this looks like a job for my trusty car.".



Pascal also has had numerous extensions.

No one used the Pascal of "Pascal User Manual and Report" from 1976.

Strangely, compiler extensions are only cool when talking about C; in fact, most C developers have no idea what ISO C is actually all about.



I did, in 1986 through 88. The only extension it had was separate compilation. (This was on PDOS running on a 68000, for the morbidly curious.)

And I agree with WalterBright. Unextended C was far more usable than unextended Pascal.



From the looks of Small-C and RatC, not really that usable.

During the 1980s there were already several usable Pascal alternatives; plus, Modula-2 came out in 1978 exactly to fix the shortcomings of Pascal without extensions, and the largely ignored ISO Extended Pascal came in 1991, ratifying many of the common 1980s extensions.

Using original Pascal was really for when there was no alternative, like myself trying to fit some form of Pascal compiler into 48 KB on top of the Timex 2068 ROM, which naturally wouldn't fit.

Neither did the Dartmouth BASIC JIT compiler, for that matter, or one of those Small-C / RatC alternatives.



I remember the Modula-2 crowd being rather bitter that the advent of C++ sunk M-2. One of the M-2 compiler guys ruefully said to me "I backed the wrong horse." (Zortech C++ had taken the PC market by storm then.) The success of ZTC++ is why Borland changed direction from Pascal to C++. I heard rumors that Microsoft had been internally developing its own OOP language, but abandoned it and switched to C++ also after the success of ZTC++.


Borland did keep supporting Pascal in addition to C++ then, no? They introduced Object Pascal with Turbo Pascal 5.5 IIRC, and later kept extending that with Delphi. As far as I know, their strategy was playing both Pascal and C++ up through the bitter end with Delphi and C++ Builder. I don't know which of them was more successful, C++ or Pascal/Delphi, but Delphi was certainly popular enough in its heyday.

Obviously, Object Pascal was a very far cry from the Pascal User Manual and Report, and even in its extended version it was strictly less powerful than C++, so your point still stands. But it was more beginner-friendly. In my opinion, the moment you got beyond the basics (no pun intended), it was friendlier than BASIC or Visual Basic, while far more powerful. That was quite a sweet spot.

Once I felt comfortable with C++, I couldn't go back to Pascal. Even with the extensions, there were some things that were too painful for me, like the lack of RAII, the lack of generics, and short strings by default. Perhaps some of these are addressed by Free Pascal or modern versions of Delphi nowadays, but that ship has sailed. I feel like Pascal with its extensions worked great in the 1980s and 1990s, and I have very fond memories of my time using it, but it just doesn't have much to offer anymore. There are other beginner-friendly languages out there that are powerful enough and have better tooling and far larger communities. And it was all about the tooling and community all along.



However, Wirth himself recognized the problems of Pascal, and his later languages are basically improved versions of Pascal -- Modula, Modula-2, and Oberon. But these languages didn't even really displace Pascal itself, let alone C -- maybe if he had named them in a way that made it clear to outsiders that they were Pascal improvements, they would have had more uptake.


One could argue that C's success is largely because it was even simpler than Pascal and more generic --- a notable example is that Pascal has I/O built into the language, while C defines it as part of the standard library (which could even be absent.)


From a compiler writer's perspective, C is much more complex than Wirth's Pascal.

Pascal's builtin I/O was a major impediment to its usability.

However, one really great feature of Wirth's Pascal is nested functions with access to outer scopes, which D wholeheartedly embraces. They really are slick. I use them heavily.



> one really great feature of Wirth's Pascal is nested functions with access to outer scopes, which D wholeheartedly embraces

Can you give an example? What does the function access in the outer scope? Is it like an environment-capturing closure?



It can access the variables in an outer scope:

    int moon(int s)
    {
        // sum() can read moon's parameter `s` from the enclosing frame
        int sum(int x) { return s + x; }
        return sum(3);
    }
An extra parameter is passed to the nested function, called a "static link", which is a pointer to the stack frame of the statically enclosing function. This is in addition to the usual "dynamic link" which is a pointer to the stack frame that called the nested function.

Nested functions can also be nested, and can access variables in enclosing functions by walking the static links back.

The neato thing about this is it makes stack frames work exactly like members of a struct. In fact, a pointer to a nested function is analogous to (and binary interchangeable with) a pointer to a stack object.
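
For readers who don't know D, here is a rough Go analogue of the example above (an illustrative sketch, not from the thread): the anonymous inner function reads `s` from the enclosing function's frame, much like a Pascal nested procedure.

    // Sketch: a Go closure accessing its enclosing function's parameter.
    func moon(s int) int {
        sum := func(x int) int { return s + x }
        return sum(3) // yields s + 3
    }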



My first (and only) serious compiler had such nested functions, though without stack frames. Instead my VM had two stacks: one for function arguments and locals, and the other for return addresses, same as Forth.

I had no stack frame at all. Instead, the compiler kept track of the stack offset of every accessible local (relative to the top of stack). That way the implementation of my nested functions was kind of trivial: there was no difference between true locals like `x` and locals from an outer scope like `s`, except of course for the possibility of shadowing.

One reason this was not special is that internally, it was nested scopes all the way down: there was one scope per local variable, so merely declaring two variables already means dealing with nested scopes. Once that was out of the way, adding nested functions was really easy (I believe it added less than 20 lines to my compiler).

Nowadays I think I would use a single stack instead, but if my language is simple enough I’ll probably skip the frame pointer and just keep track of offsets like I did this first time.
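
To make that bookkeeping concrete, here is a minimal sketch (in Go, with invented names, so a reconstruction rather than the actual compiler): one entry per declared local, and access resolved to an offset relative to the top of the stack, with shadowing falling out of the backwards search.

    // Minimal sketch of top-of-stack-relative local resolution,
    // as described above. All names are invented for illustration.
    type compileScope struct {
        names []string // one entry per declared local, innermost last
    }

    func (s *compileScope) declare(name string) {
        s.names = append(s.names, name)
    }

    // offset reports how far below the stack top the named local
    // lives; searching from the end resolves shadowing to the
    // innermost declaration.
    func (s *compileScope) offset(name string) (int, bool) {
        for i := len(s.names) - 1; i >= 0; i-- {
            if s.names[i] == name {
                return len(s.names) - 1 - i, true
            }
        }
        return 0, false // undeclared
    }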



if your outer function f called a function 'quicksort' which called a function 'partition' which called your nested function 'compare', how did 'compare' get access to the variables of f from its statically enclosing scope? how did it know how many locals 'quicksort' and 'partition' had pushed onto the operand stack?


Err… I think recursive inner functions would just blow up… oops.

I guess it’s a good thing I didn’t advertise inner functions and only used them in the standard library…



well, that's the problem the static link solves; it's not such a difficult thing to implement if you let procedure values (function pointers) be two words instead of one. gcc emits trampoline code onto the stack at runtime to support this without breaking abi compatibility, so the pointer to the nested function actually points to the trampoline, which just supplies the static link and calls the real nested function


Almost, the nested function goes out of scope too if the outer function finishes.


True. D handles that complication by allocating such scopes on the GC heap, if the nested function survives the termination of the outer function.
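
Go does the same, for what it's worth. A small sketch (not from the thread): the captured variable is moved to the garbage-collected heap by escape analysis because the closure outlives the enclosing call.

    // Sketch: in Go, as in D, a captured variable that outlives its
    // enclosing function ends up on the GC heap.
    func counter() func() int {
        n := 0 // escapes to the heap: the closure below survives counter()
        return func() int {
            n++
            return n
        }
    }

    // usage: next := counter(); next() == 1, next() == 2, ...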


Nice tribute to Wirth. I just have some feedback :-)

> ...modular languages offer one implementation for each interface.

Unfortunately, this is totally incorrect. Modular languages absolutely allow any interface to have multiple implementations which can be chosen amongst. This corresponds to how a Java 'interface' can have multiple implementations. In fact programming to the interface and swapping the implementation is one of the main selling points of interfaces.
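
As a concrete illustration of that selling point, here is a sketch in Go (with invented names): one interface, two implementations, and a call site that only sees the interface.

    // Sketch of "program to the interface, swap the implementation".
    // All names are invented for illustration.
    type Store interface {
        Get(key string) (string, bool)
    }

    type memStore map[string]string

    func (m memStore) Get(key string) (string, bool) {
        v, ok := m[key]
        return v, ok
    }

    type emptyStore struct{}

    func (emptyStore) Get(string) (string, bool) { return "", false }

    // greeting depends only on the interface; either implementation
    // can be swapped in without changing this code.
    func greeting(s Store) string {
        if v, ok := s.Get("greeting"); ok {
            return v
        }
        return "hello"
    }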

> Some of those constraints remained long after advances in hardware and software made the insistence on one-pass compilation seem obsolete.

With compile speeds nowadays we can only wish that this insistence was obsolete. It's needed now more than ever with the slow-as-molasses compilers (apart from a notable few) of today.



He is talking in the context of modules as in Modula-2 and Oberon, not language types or the split between interface and implementation, which goes towards your point, as Modula-3 and Ada allowed various interface module/package declarations for the same implementation.

Still not the same as the Objective-C/Java interfaces that later became more widely known, or Standard ML modules and CLU clusters, all of them type-system based.



It's not clear to me that he means only Modula-2 and Oberon when he says 'modular languages', especially as just in the previous paragraph he says: '...stopped caring much for purely modular languages, including Ada as it was then'


what he says is correct about languages like modula-2, oberon, ada, ocaml, c, and c++, at least within a single compilation. you can have multiple implementations of the interface described in a single .h file, but you have to choose at most one of them every time you go to compile. this is distinct from how things like java work; java doesn't have this problem

c++ compilation is mostly slow not because c++ compilers use lalr parsers (they don't) but because c++ programs commonly wedge lots of the implementation into the "interface", i.e., the .h file, and use recursive textual #include

non-c++ compilers are super fast



Recent and related:

Closing word at Zürich Colloquium (1968) - https://news.ycombinator.com/item?id=38883652 - Jan 2024 (28 comments)

Niklaus Wirth, 1934-2024: Geek For Life - https://news.ycombinator.com/item?id=38871086 - Jan 2024 (61 comments)

Niklaus Wirth has died - https://news.ycombinator.com/item?id=38858012 - Jan 2024 (403 comments)



Perhaps part of the issue with the "evolution of the programming language field in recent years" is that 'simplicity' is a high cost optimization. Most every 'simple' system I've been a part of building started life as an oversized beast, laboriously wrestled into a more streamlined form.

Making complicated things is cheaper, easier, and lets you say 'yes' more often to smart, persuasive, people. Simple takes a lot of time, effort, and "getting to no" with people that have good reasons for the things they want to add.



Making complicated things may only be cheaper initially.

I think the problem is that to learn to make simple things you first need to learn to make complicated things. This is my story, and pretty much the story of every person I have ever talked to who learned to make simple things. Some people get it faster, some slower, and some never learn to appreciate simplicity, but everybody first had to learn to make before they learned to make simple.

So now realize that we have been through decades of exponential growth in the number of developers, so that at any point in time people with experience were greatly outnumbered by people with little experience, and the answer becomes easier to formulate.

Simplicity isn't something that is easy to teach. The best way to do it that I have found is by mentoring and working with people, but there are only so many people you can meaningfully mentor on a daily basis. People with experience are so outnumbered (and also promoted up if they are any good) that it is very hard for the average apprentice programmer Joe to find his master.



  A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system. - Gall's Law


No, that's not true. Easily disproven by example: I have many times seen smart engineers with little experience design and immediately implement an extremely complex solution to a very simple problem.

Don't believe everything people say. Just because somebody called something "a law" does not mean it is true.



They might have designed and implemented it, but does it work? :)


Sometimes it does. Depending on definition of "work":)


A working "complex solution to a very simple problem" is not a contradiction, as it can be "simpler" than a simple working solution to a complex problem.


I think this is true, and in agreement with the parent comment. A working complex system evolves out of a smaller working simple system. You have to climb the hill from one side. Getting to a complex, working system is the 'learning' part. If you want to be able to do the same things with a simpler system, then you have to do the work to collapse it back down to its fundamentals.

I don't think it's something that can be 'taught' directly. I suspect it requires the experience of building the simple and then complicated things in order to understand the problem enough to make the complicated thing simpler.



The Go programming language is not only partly inspired by Wirth's work (Robert Griesemer, one of the "founding fathers" of Go, studied at ETH Zürich), it also has this goal of simplicity: it started out with fewer (but powerful and unfortunately often underestimated) features than other languages, and while it has added more complexity over the years (modules, generics, ...), this has happened at a much slower rate than with other languages.


> The Go programming language is not only partly inspired by Wirth's work

There are very few similarities between Go and Wirth's languages. R. Griesemer's doctoral supervisor was H. Mössenböck, not N. Wirth.



Look at this document by Mössenböck https://www.astrobe.com/CPIde/CP-Lang.pdf on Component Pascal, which is essentially a modified Oberon-2 (a Wirth language)

> Except for some minor points, Component Pascal is a superset of Oberon-2. Compared to Oberon-2, it provides several clarifications and improvements.

The similarities are hard to overlook: replace the curly braces with Pascal-style BEGIN / END syntax and you are more than halfway there.

Of course Go has added features that were not present in Oberon / Component Pascal; the implicit interfaces come to mind, and Go channels too. But inherently it's a Wirthian language.

Also, Griesemer is not Go's only Oberon connection. Rob Pike studied the Oberon system when developing the Acme editor for Plan 9 (source: https://wiki.c2.com/?OberonOperatingSystem).

I honestly think Oberon didn't get named because, at the time Go launched, the Pascal / C conflict was much more present in people's memories, so there was nothing to gain from pointing out these roots.

It is not uncommon for programming languages not to talk about their roots at launch due to politics (Java, for example, traces its object-oriented roots to https://en.wikipedia.org/wiki/Strongtalk, a typed Smalltalk descendant, but was advertised as a cleaned-up C++).



I don't see what all this has to do with Component Pascal (the language report is not by Mössenböck, but by Pfister et al., btw.). You should instead have a look at the Newsqueak publications. There are more similarities between Go and Algol than between Go and Oberon. Newsqueak reused a few features of Pascal, some of which are still present in Go.

The Acme editor has similarities with the original, text-based Oberon system user interface, which unfortunately shares its name with the language but is not the same thing. The user interface of the Oberon system in turn looks like a stripped-down version of the Cedar environment developed at PARC (where Wirth spent his sabbatical).

The object model of Java directly traces back to Simula 67, not to Smalltalk or its descendants (see e.g. https://www.youtube.com/watch?v=ccRtIdlTqlU&t=2444).



You don't have to take my word for it, here's a graph from the Go Programming Language book: https://www.oreilly.com/api/v2/epubs/9780134190570/files/gra...

The most visible Pascal-family influence is the variable declaration syntax ("i int" instead of "int i" as in C et al).
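
For instance (a tiny self-contained illustration, with made-up identifiers):

    package main

    import "fmt"

    var i int // Go: name before type, as in Pascal's "var i: integer"; C writes "int i"

    func scale(x int) int { return 2 * x } // parameter and result types also trail the names

    func main() { fmt.Println(scale(21), i) } // prints: 42 0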



Unfortunately, this graph is at least misleading, if not wrong. The middle path is overly prominent; the detour via Object Oberon (probably so that Griesemer's name is represented) is amusing. Actually there should be a direct line from Newsqueak to Go. Alef's brief intermezzo is mostly historically relevant for Go, not technically. The path from C should actually point primarily to Newsqueak (in addition to Pascal). Technically, there would be a very thick arrow from Newsqueak to Go, a slightly less thick one from C to Go, and a very thin one from Oberon-2 and a few other languages. But as a highly simplified representation of the languages and people directly and indirectly involved, and of the chronological sequence, the graph is OK.


Here's the accompanying text from The Go Programming Language for that line in the graph:

> One major stream of influence comes from languages by Niklaus Wirth, beginning with Pascal. Modula-2 inspired the package concept, Oberon eliminated the distinction between module interface files and module implementation files. Oberon-2 influenced the syntax for packages, imports, and declarations, and Object Oberon provided the syntax for method declarations.

So mostly packages and syntax - not the most exciting stuff, but the package concept contributes to enabling Go's compilation speed, so it shouldn't be disregarded. Ok, you can still argue that many or most of the C/Pascal-family influences were already present in Newsqueak (I'm only familiar with Pascal, C and Go, so I can't judge that), but I think it's understandable that they wanted to have the language the book is about at the focal point of the graph, and not one of its less successful and more obscure ancestors.



it's true that golang is basically newsqueak with structural interface types, but newsqueak is pretty obviously strongly pascal-influenced. i agree that newsqueak shows significant c influence, but in most of the ways that c and pascal differ, it follows pascal. and it seems unavoidable that griesemer spending his grad school at wirth's school hacking on wirth's language oberon would have influenced the taste he applied to golang


If being simple gives us Go, perhaps we should try to drop simplicity for simplicity's sake and accept that complicated things might be needed occasionally?

Go is a modern programming language, which suffers from a lot of issues that were well known at the time of its creation.



> Go is a modern programming language

Is it? What in particular makes it "modern"?



Its limited age compared to the other common ones.


;-)


And yet a lot of modern software is written in it and people seem to be happy using it. Let the music do the talking.


This exact comment could have been written 10 years ago about C and 15 years ago about visual basic.


And before that FORTRAN, COBOL and Assembly.


That's not even the same ballpark. C is timelessness itself. The comment could have been written 30 years ago, or 30 years hence. Nice try!


Its "simplicity" is a reason why it depends so heavily on compile time code generation across the ecosystem.


One might argue that a lot of languages depend heavily on compile time code generation. Just because Rust's macros, or C++'s template metaprograms are sweeping their generated code under a stylish rug doesn't mean it's not happening. Slowly.


I write fiction as a hobby, and "Most every 'simple' system I've been a part of building started life as an oversized beast, laboriously wrestled into a more streamlined form" describes my day-to-day life in both that and my programming job.


Being simple...

And then there comes Rust in all its glory with the "my string constant".to_string() awkwardness, and Golang with datetime-to-string formatting using "2006-01-02 15:04:05.999999999 -0700 MST"...

Modern languages are full of idiosyncrasies that work like putting your left hand in the right pocket, and from behind.



You are confusing "simple" with "easy". When you try to solve hard problems the "easy" way, you are going to have bigger problems later than just awkwardness.


I'm not sure I would call Rust's solution "simple" either.

My sense is that fine-grained, per-object, deterministic memory management simply cannot be made simple or easy, because the very thing you're trying to do is inherently complex and difficult.

I realize it doesn't provide the same level of safety guarantees that you get out of Rust, but I am very sympathetic to Zig's approach of having no implicit default allocator, so that I can instead just use something simple and easy to reason about like an arena allocator in the 90% of cases where that's good enough for my purposes.



> And then there comes Rust in all its glory with "my string constant".to_string() awkwardness

Would you feel better if it was `"my string constant".to_owned_string()`?



You could use the constants that everyone uses, like time.RFC3339 or ISO 8601.

Why defining your own time format is allowed under the hood is quite obviously legacy. At least it's human readable.
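
Concretely, a small sketch (time.RFC3339 and the reference-time layout are real parts of Go's standard library; the date values are made up):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        t := time.Date(2024, time.January, 15, 10, 30, 0, 0, time.UTC)
        // Stock layout constant from the standard library:
        fmt.Println(t.Format(time.RFC3339)) // 2024-01-15T10:30:00Z
        // Custom layouts spell out the reference time,
        // Mon Jan 2 15:04:05 MST 2006, in the desired shape:
        fmt.Println(t.Format("2006-01-02 15:04:05")) // 2024-01-15 10:30:00
    }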



I never liked this platonic view of there being one 'perfect language', maybe not yet invented, that stands above all others. Instead I have always been more of a 'classist' (I couldn't find a better word for it): I believe there is a class of languages, not hard to reach, that for all intents and purposes are all equally good. A bit like the notion of 'Turing completeness', except Turing completeness is way too broad and measures another thing. But I'm betting that the 'best' language has already been achieved, and there are quite a few of them. Which ones, that's up for debate.


Huh, the article mentioned a fact that I could never have expected: Logitech was an indirect offshoot of Wirth's work, since people from ETH had wanted to commercialize Modula-2. [1] Their first product: a Modula-2 development system bundled with a 3-button mouse.

[1] https://web.archive.org/web/20210324044010/http://www.edm2.c...



> Everywhere the company showed their Modula-2 development system, people started making inquiries about the mouse and its availability as a separate item. Logitech scrambled to put together a developer's kit for DOS and started to offer the mouse for sale.

Brilliant. It's very important to notice early enough that the clients prefer buying something other than exactly what you're selling...



> Like a Renaissance man, or one of those 18-th century "philosophers" who knew no discipline boundaries, Wirth straddled many subjects. It was in particular still possible (and perhaps necessary) in his generation to pay attention to both hardware and software. Wirth is most remembered for his software work but he was also a hardware builder

> One of his maxims was indeed that the field remains driven by hardware advances, which make software progress possible.

> One of his maxims for a successful career was that there are a few things that you don't want to do because they are boring or feel useless, but if you don't take care of them right away they will come back and take even more of your time, so you should devote 10% of that time to discharge them promptly.

> Wirth seems to have decided in reaction to the enormity of Algol 68 that simplicity and small size were the cardinal virtues of a language design, leading to Pascal



The litmus test for the simplicity of a programming language design is its compilation speed: if the language compiles fast, it is simple; if it compiles slowly, it is overly complex. Modern programming languages like Go and D have fast compilation, but C++ and Rust compile much more slowly. Go is a direct descendant of Wirth's languages, namely Modula and Oberon, while D is not, although some of its features, like nested functions, are taken directly from Pascal [1]. Interestingly, both were designed by authors with engineering backgrounds, and personally I think the simplicity is not a coincidence, since typical engineers are loath to embrace any form of complexity.

[1]Nested function:

https://en.wikipedia.org/wiki/Nested_function



I think that’s an oversimplification. By that measure any assembler would be simple, yet assembly is not simple at all. Most esoteric languages compile super fast, but are usually complicated by design.

Also, it’s not even a sound measure: There are C compilers that are extremely fast and some that are a lot slower but achieve better performance. Java compiles to byte code and performs compilation to machine code during execution. Interpreted languages sometimes only use an AST for execution and don’t compile to machine code at all.



Simple depends on the context. You may say programming in assembly language is simple, but it is only simple in the context of writing processor instructions; if you think high-level, like accessing fields in a struct, then programming in assembly complects (or weaves) field access with processor instructions, and it turns into a complex thing (from the point of view of accessing fields in a struct).


Assembler is very simple as a language family. It is not simple to use.

The real challenge is combining the two.

I also think focusing on different compilers misses the point, which would perhaps be better expressed as: to what extent would a naive compiler implementation for the language be likely to be fast?

E.g. Wirth's languages can be compiled in a single pass and are fairly easy to compile without even building an AST. Some other languages can fit that too, but for many it's not that you need an AST and multiple passes just to generate good code, but that in some cases it gets increasingly impractical or impossible to compile them at all without much more machinery than Pascal or Oberon needs.



> The litmus test for simplicity of a programming language design is its compilation speed, if the language compile fast it is simple but if the language compile slow it is overly complex

No. OCaml for example has a really fast compiler (like Go), but I would not call it simple. It does have PPX (PreProcessor Extensions) which are like macros, so you can't "blame" the lack of them either.

And everything that uses LLVM is _always_ going to be _slow_, no matter the language.



More to the point, OCaml belongs to the ecosystems that offer the golden route: multiple implementations, allowing you to pick the best one for each workflow.

If Rust had an interpreter in the box alongside the compilers, like OCaml does (there are multiple ones as well), it would already make a big difference in development workflows.



Slightly irrelevant fun fact: the Rust compiler was implemented in OCaml for a long time during the R&D phase of the project.


Is there any particular reason for LLVM being slow? Does it do a lot of complicated optimizations when generating code, or is it designed in a way that makes it slow?


I’ve heard it used to be lean and fast. Then new developers came in, new features were implemented, and it bloated over time. Thus it wasn’t designed in a way that makes it slow. It grew in a way that makes it slow.

Source: hearsay.



> I’ve heard it used to be lean and fast. Then new developers came in, new features were implemented, and it bloated over time.

That's basically the story of every single software project ever made.



I could take an existing language and throw in a lot of rules that would make compilation marginally faster:

1. No variable names etc. with a length of more than 8

2. The order of functions matter now; don’t forward-reference anything

3. You can only nest scopes 16 times, max

Would this make the language simpler for me in a way that I would care about?



This is a personal question related to taste. An impossible question for strangers to answer.


It would be nice to have actual benchmarks of compilation speed for equivalent programs in different languages, rather than just the runtime performance as is typical in language shootouts. Go has, to me, a surprisingly high time to compile a simple hello-world program (about 200 ms, and it generates a 2 MB binary). But I suppose that is a fixed overhead, and perhaps it scales well.

Generally though, I'm disappointed if hello, world takes more than 20ms to compile -- which is of course true of pretty much every popular language.



The problem is that a hello world isn't sufficient to identify compilation speed; you'd need a program with thousands of lines that does barely anything. And then you're fighting I/O as well, although that could be fixed by putting the program in /dev/shm first and running it.


> The litmus test for simplicity of a programming language design is its compilation speed

Simple for the programmer to understand and use (small, orthogonal, consistent) isn't always going to be the same as simple for the compiler writer to implement.



> Go is a direct descendent of Wirth's languages namely Modula and Oberon

This assumption comes up again and again, but the evidence is rather poor. There are very few and only marginal similarities. The most obvious is the receiver syntax of the bound procedures. But this can be found in Mössenböck's Oberon-2, not in Wirth's Oberon. Although Wirth was a co-author, he ignored the innovations of Oberon-2 in his subsequent work. Go has a completely different focus and is essentially a further development of Newsqueak; it's definitely not a "direct descendant" of Modula or Oberon.



Part of the reason it persistently comes up is that Robert Griesemer got his PhD under Mössenböck and Wirth, and people not paying very close attention would probably also see the Oberon-2 connection as a confirmation via an indirect step rather than as an argument against the claim.


Yes, Mössenböck was his PhD supervisor (Wirth was only co-examiner). Personally I consider Oberon-2 a better language, but there are hardly any applications of bound procedures; particularly not in the Oberon systems developed at ETH, and surprisingly few in Linz Oberon either. And Active Oberon followed the more conventional Object Pascal approach.


To take it to the extreme:

Writing binary directly is simple because the compilation time is zero.



> The litmus test for simplicity of a programming language design is its compilation speed

From a compiler's perspective, sure. Not being a compiler, I find that metric not very relevant.

Simplicity of a language is better gauged by how easy it is to express complex things in it, and how easily one person can comprehend what another wrote.



> From a compiler's perspective, sure. Not being a compiler, I find that metric not very relevant.

I'm afraid you're just showing a bit of survivorship bias. I can assure you that compilation speed is a critical trait of any programming language, and the only people who don't understand this are those who benefit from all the work invested by predecessors who did.

Think about it for a moment: what would your turnaround speed be if every single time you had to build your project to test a change you had to brace yourself for a wait of over one hour? After a week of enduring that workflow, how much would you be willing to pay to drive that build time down to 10 minutes, and how much more would you fork out to need only 1 minute for an end-to-end build?



Compilation speed can be an important trait of a programming language (or more precisely, of a dev env / build chain). I remember writing code in M68000 assembly; the compile step was lightning fast because you didn't need one. I also remember going nearly cross-eyed tracing code flow down narrow columns of vaguely varied yet similar-looking statements -- hours upon hours!

If your daily task build is taking over an hour on modern hardware, it's likely you have organizational problems masquerading as tech debt. No language choice will prevent that; good technical leadership can help mitigate it.



I wouldn’t call D simple. The list of features seems to be endless.


Thankfully C++ modules are on the right path to improve the story on the C++ side.

Using C++23 standard library, alongside modules, and cargo/vcpkg binary caches for 3rd party libs is quite fast.

Rust, well, until cargo offers the option to use binary libraries, it will always lag behind what C++ tooling is capable of. Maybe if sccache becomes part of it.



Naturally I meant conan/vcpkg.


> Using C++23 standard library, alongside modules, and cargo/vcpkg binary caches for 3rd party libs is quite fast.

I don't think your assertion makes sense. The only thing that conan/vcpkg brings to C++ is precompiled dependencies which you don't have to build yourself. You get the same result if you build those libs yourself, package them in a zip file somewhere, and unpack that zip file into your project tree whenever you have to bootstrap a build environment. The problems that conan/vcpkg solve are not build times or even C++; they are tied to the way you chose to structure your project.

With C++, you get a far greater speedup if you onboard a compiler-cache tool like ccache and organize your project around modules that don't needlessly cause others to recompile whenever they are touched.



The Go compiler is fast because it doesn't allow circular references.


As a nice example, Malbolge compiles instantly.


I can't believe I missed the news of his death. Pascal was my language of choice for many years. RIP, Niklaus.


Makes me think of Rich Hickey and Clojure when I think of simplicity in this context! But, good article!


Niklaus Wirth could afford to be simple; he lived in simpler times, when demand was much lower than today and was chased by much less investment.

Change was measured in years; compute and storage options were limited. I wonder how many of his OSes / programming languages spanned multiple heterogeneous compute architectures.

Don't get me wrong, I love the guy... but I wonder what kind of impact he would have had if he were in his prime today...



it's unclear whether this comment is describing the extreme opposite of the truth out of ignorance or for satirical purposes








