(comments)

Original link: https://news.ycombinator.com/item?id=40163405

This user shares their experience of exploring computers and coding from childhood onward. As a kid they enjoyed breaking and building websites, and once cracked a piece of Windows software by changing the instruction that compares the software key with the user-entered key. Although frustrated by old technology and its limitations, they recognize the advantages modern computers offer and encourage curiosity about technology. They mention a fascination with hex editing, in particular changing an instruction from "equal to" to "not equal to" to successfully crack software. The user suggests younger generations lack curiosity and determination, attributing this to the convenience of modern technology compared with the scarcity of resources in earlier eras. They express gratitude for having grown up with a Commodore 64 despite its limitations, and stress the importance of preserving old technology and its history for future generations. Finally, they applaud the open-sourcing of the DOS software repository, acknowledging its importance for understanding technological progress and historical context.

Related articles

Original text


When I was nine years old, I liked poking around with a hex editor on my dad’s PC.

I didn’t speak English and MS-DOS wasn’t yet localized to Finnish in 1989, so I decided to try translating it myself with a dictionary by manually finding and replacing strings in the SYS/COM files. The result worked and my dad was suitably impressed, if probably a bit peeved that nothing worked anymore in the shell as expected (since I had replaced all the basic command names too — “dir” became “hak” and so on).

It’s pretty cool to see those strings again in src/MESSAGES.

At the same time, it feels a bit sad that today’s kids can’t get the same feeling that the computer is really theirs to modify. Modern operating systems don’t run binaries tampered with a hex editor. Most kids are on operating systems like iOS where they can’t even run a C compiler.

They can play with code in various sandboxes locally and on the web, but the computer fundamentally belongs to someone else today.



My favorite hex editor hack was when I cracked a piece of Windows software simply by changing the "Equal to" instruction to "Not Equal to" at the point where it compares the software key with the user-entered key.
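The crack described above usually comes down to a single byte: on x86, the short-jump form of JE (jump if equal) is opcode 0x74 and JNE (jump if not equal) is 0x75. A minimal sketch of such a patch in Python, using an invented toy "binary" and offset (in practice you would locate the jump with a disassembler):

```python
# Sketch of the classic one-byte crack: invert a JE (0x74) short jump
# into JNE (0x75) at a known offset. The toy binary and offset below
# are invented for illustration.
JE, JNE = 0x74, 0x75

def flip_jump(image: bytes, offset: int) -> bytes:
    """Return a copy of `image` with the JE at `offset` turned into JNE."""
    patched = bytearray(image)
    if patched[offset] != JE:
        raise ValueError("no JE opcode at that offset")
    patched[offset] = JNE
    return bytes(patched)

# Two NOPs, then "JE +5", then a RET.
toy = bytes([0x90, 0x90, JE, 0x05, 0xC3])
patched = flip_jump(toy, 2)
assert patched[2] == JNE        # the key check is now inverted
assert patched[:2] == toy[:2]   # everything else is untouched
```

After the patch, the "wrong key" branch and the "right key" branch trade places, which is why any key except the correct one now passes the check.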


> At the same time, it feels a bit sad that today’s kids can’t get the same feeling that the computer is really theirs to modify.

Kids with a hacker mentality (let's face it, even in the 80s those of us who hacked around with DOS etc. were the teeny tiny minority) have more options than ever before, including but not limited to FreeDOS, Linux, or a bunch of others https://en.wikipedia.org/wiki/Comparison_of_open-source_oper...

Finding it is also super easy if you have the curiosity (and of course a PC and an internet connection).



You said the magic keyword: "curiosity", when it comes to computers and tech at least. Something that I find severely lacking among - for lack of a better term - Gen Z.


The youth of today simply don't have it in them! Unlike us back then, they're just not as cool, not as strong, not as smart...

This complaint is as old as mankind and has always been wrong. It seems to be a feature of human thinking that we glorify the memory of our own youth.



Many replies here argue that there are plenty of choices today for the hacker-inclined, many more than 40 years ago, which is true, of course.

But I think what your post implies is that regular computers back then invited you to hack them. It might have awoken a curiosity that could have otherwise remained latent in many of us.

An anecdote from the other side of the fence: Apple used to ship an app called ResEdit whose sole purpose was to let you hack your Mac. Change strings, menu shortcuts, images, sounds. I had my own localized and rebranded version of Eudora and many tiny modifications to games. Can you imagine a first-party hacking tool like that today from the fruit company?



I think it's rose-tinted glasses to a degree, plus survivorship bias - the learning curve was very steep, documentation was lacking or absent, and there was no one to ask for help. I was glad I could run some games on my C64 and never really got farther with it due to these limitations.


The DOS era was a bit before my time; I had a C64 as a kid, but I managed to break it before I got to have too much fun [1].

That said, I got into web dev when I was a pretty young kid, about 9 years old, and I would have fun hacking together different websites. Eventually I noticed the “edit” button on top of the browser and learned you could mess with other people’s sites as well. I had lots of fun breaking stuff.

Computers are so cheap now, I think it’s relatively easy for most families to have a computer that can be programmed. For that matter, the Raspberry Pi is cheap and has hundreds of resources available to play with, many of which are kid friendly.

[1] Before you ask, I'm not 100% sure what I did. I was playing around with some code I found in a manual my dad gave me when I was 7-8, and I must have done something bizarre because it stopped booting.



The C64 breaking was almost certainly not your fault - they had a bunch of commonly occurring faults, and a number of the chips, especially those Commodore fabbed themselves, were notorious for giving up eventually. The PLA failing was a particularly common occurrence, but RAM and other glue logic sometimes fail too. These days there's a vibrant hobbyist community and a lot of the chips have modern replacements available. I'm still grateful for having grown up with a C64; it was a fun way to learn the ropes.


I wouldn't cry too long for the curious children of today. There are more frontiers and available resources for kids today than we ever had tinkering with an isolated TI-99/4A or VIC-20. We could share a cassette with a friend and if we were super lucky there was a local user group that met in a church basement right after the AA meeting.


I remember editing command.com too, with debug.exe, to change some of the standard messages, or even command names, as you did.

Another cool DOS thing was that you could type Alt+255 (last 8-bit character code) as part of a filename. That appeared like a space character, so made it effectively "hidden" to those who did not know about the trick, because they could not type its name to list or edit it.
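The reason the Alt+255 trick works is that byte 255 in the IBM PC's code page 437 is a no-break space (U+00A0): it draws as a blank, but it is a different character from the ordinary space (0x20), so typing a normal space never matches the filename. A small Python check of that mapping:

```python
# Byte 0xFF (Alt+255) in code page 437 is a no-break space. It looks
# identical to a space on screen, but compares unequal to " ", which is
# what made filenames containing it effectively "hidden" under DOS.
hidden = b"\xff".decode("cp437")
assert hidden == "\u00a0"   # no-break space
assert hidden != " "        # not the ordinary space character
print(repr(hidden))         # shows the invisible character explicitly
```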



I used to do this on Windows XP to make my desktop look cleaner, all the files were ALT+255 repeated N times, or, more accurately, repeated a random number of times.


I did that with GTA, translating it to Portuguese. That was when I learned that I could overwrite the strings with the hex editor, but not insert anything, because it would stop working. And thus began my dive into computers.

Great memories, thanks for making me remember it.

* Actually, now that I really think about it, it wasn't with a hex editor, it was with Edit! Fun times.
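The overwrite-but-don't-insert constraint these translation stories run into comes from the fact that offsets into the binary are fixed: a replacement string must occupy exactly the same number of bytes as the original, padded or truncated as needed. A hypothetical sketch of such an in-place patch:

```python
# In-place string translation in a binary: the new text must not be
# longer than the old, or every offset after it would shift and break
# the program. The message and translation here are made up.
def patch_string(image: bytes, old: bytes, new: bytes) -> bytes:
    """Replace `old` with `new`, padded with spaces to the same length."""
    if len(new) > len(old):
        raise ValueError("replacement longer than original: would shift offsets")
    padded = new + b" " * (len(old) - len(new))   # keep length identical
    offset = image.index(old)
    return image[:offset] + padded + image[offset + len(old):]

image = b"\x00\x01Bad command or file name\x00\x02"
patched = patch_string(image, b"Bad command or file name",
                       b"Comando invalido")       # shorter, so it fits
assert len(patched) == len(image)                 # file size unchanged
```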



I remember trying to rename LILO (of "Loading Linux" fame) to PIPO [1] by simply editing the bytes with a hex editor.

Turned out that didn't work, because there was an additional sanity check that halted the boot process if the "LI" bytes were corrupted.

Of course I pushed through and was a happy user of PIPO for some years, until GRUB came along.

[1] https://en.m.wikipedia.org/wiki/Pipo_de_Clown



When I was running Gentoo, I wanted to replace the GNOME foot that appeared on the dropdown menu with a Gentoo-fish-in-a-wizard-hat icon.

I found documentation suggesting that the icon shown on the menu was set in a certain configuration file, and changed that file.

This meant that, when I was using the normal UI to customize the GNOME topbar, the icon associated with that menu, in the GUI, was the fish-wizard icon. But it did not change the icon displayed in the menu itself.

I always resented that. I still don't like the concept of hiding configuration lest the user change it.



> MS-DOS wasn’t yet localized to Finnish in 1989

That's genuinely something I appreciate today - though it reminds me of putting in the 10th floppy to update Windows 3.x. Everything is relative these days!

> it feels a bit sad that today’s kids can’t get the same feeling that the computer

Couldn't agree more. Trying to get my cousins and nephews interested is, in their terms, "not important".



I really loved the rare cases of software publishers who put the serial number for the software on disk two instead of disk one so you didn’t have to eject the disk you were working from to do the install.


Occasionally I allow my (toddler/preschool) kids to play 'Kindercomp' in DOSbox on my computer. It's got a mode that prints the letters you type across the screen in different colors, which seems to be the fan favorite because it rewards indiscriminate keyboard mashing.

When they get a little older I plan to introduce QBASIC programs that do the same kind of thing, then we can start looking at the code that makes it do that.



MacOS is notorious for this. By default, it would only run binaries signed with an Apple-issued certificate. You can bypass this multiple different ways, of course, but that requires knowing that it can be bypassed in the first place.

Then there are mobile OSes where you don't get to see the binaries at all. Yes you can repack an apk but again, that's a more involved process requiring specific tools and knowledge (and very awkward to do on the device itself), and iOS is completely locked down.



> MacOS is notorious for this. By default, it would only run binaries signed with an Apple-issued certificate. You can bypass this multiple different ways, of course, but that requires knowing that it can be bypassed in the first place.

What do you mean? When I compile something with a myriad of different language stacks or compiler toolchains, I'm not aware of an Apple-issued certificate ever being involved and those binaries run just fine.



Probably because the environment you use to compile it, like the terminal or Xcode, is added to "developer tools" under security settings. Xcode in particular does that for itself automatically.


Some OSs want their binaries to be signed and probably have checksums etc. It would be hard to keep those valid when mucking around with a hex editor.
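The point above about signatures can be made concrete: a code signature covers a cryptographic hash of the binary, and flipping even a single bit changes that hash completely, so the signature no longer verifies. A small sketch, reusing the toy "binary" idea from earlier in the thread:

```python
# Why hex-edited binaries fail signature checks: the signed hash of the
# file changes completely after even a one-bit edit. The bytes below are
# a made-up toy binary, not a real executable.
import hashlib

original = bytes([0x90, 0x90, 0x74, 0x05, 0xC3])
patched = bytearray(original)
patched[2] ^= 0x01   # JE (0x74) -> JNE (0x75): a single-bit change

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(patched)).hexdigest()
assert h1 != h2      # hash mismatch: a signature over h1 no longer holds
```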


> Modern operating systems don’t run binaries tampered with a hex editor.

Luckily that isn't universally true. I had to do a decent amount of binary modifications on Linux to deal with bugs and glibc compatibility issues.



It looks like "brain damaged" was the developer's go-to insult when frustrated :D

2024-04-25 19:35 ~/sort/dl/MS-DOS % grep -nri 'brain[ -]damage' .

./v4.0/src/DOS/STRIN.ASM:70:; Brain-damaged TP ignored ^F in case his BIOS did not flush the

./v4.0/src/DOS/PATH.ASM:24:; MZ 19 Jan 1983 Brain damaged applications rely on success

./v4.0/src/DOS/FCBIO.ASM:28:; MZ 15 Dec 1983 Brain damaged programs close FCBs multiple

./v4.0/src/DOS/FCBIO2.ASM:28:; MZ 15 Dec 1983 Brain damaged programs close FCBs multiple

./v4.0/src/BIOS/MSBIO1.ASM:82:; REV 2.15 7/13/83 ARR BECAUSE IBM IS FUNDAMENTALY BRAIN DAMAGED, AND

./v4.0/src/CMD/PRINT/PRINT_R.ASM:1772: ; See if brain damaged user entered



Super cool to see the MZ initials, which stand for Mark Zbikowski. They are still, to this day, at the beginning of every Windows executable/PE file.
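Those initials are literally the first two bytes of the file: "MZ" (0x4D 0x5A), read as the little-endian 16-bit value 0x5A4D. A quick sketch of checking that magic number:

```python
# The MS-DOS executable header begins with Mark Zbikowski's initials,
# "MZ", which PE files inherit as their first two bytes.
import struct

def is_mz_executable(header: bytes) -> bool:
    """True if the buffer starts with the MZ magic (0x5A4D little-endian)."""
    if len(header) < 2:
        return False
    (magic,) = struct.unpack_from("<H", header)
    return magic == 0x5A4D

assert is_mz_executable(b"MZ\x90\x00")    # typical DOS/PE header start
assert not is_mz_executable(b"\x7fELF")   # Linux ELF binaries differ
```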


> In 2006, he was honored for 25 years of service with the company, the third employee to reach this milestone, after Bill Gates and Steve Ballmer. He retired the same year from Microsoft

Considering he "only" joined MS in 1981 (which was founded in 1975?), I'm surprised no more people between him and Bill stayed at MS.

Also shouldn't Paul Allen still count (from Wikipedia, "Allen resigned from his position on the Microsoft board of directors on November 9, 2000, but he remained as a senior strategy advisor to the company's executives.")?

Edit: wow, never knew he worked at Valve, too.



> ./v4.0/src/BIOS/MSBIO1.ASM:82:; REV 2.15 7/13/83 ARR BECAUSE IBM IS FUNDAMENTALY BRAIN DAMAGED, AND

Hardly surprising. If you grep the leaked NT3.5 sources for the f-word, you will find similar comments directed towards IBM.



Let's bring back those DOS 4.0 period sayings.

Does this mean if you happen to be talking to BillG you could talk about brain damaged programs and he'd nod appreciatively?



Somehow seeing the phrases "brain damaged" and "DOS 4.0 period" makes me think...

Bill Cosby's "Brain Damage" bit, from the album "Himself": 1982

DOS 4.0: 1988

Those comments could also have sat in the tree for a few years.



As Scott mentioned in the blog post, we were able to get this running on one of my original IBM XTs with an original IBM monochrome display adapter and display. It was very cool to be able to switch between a running version of a small game, Turbo Pascal, and a DOS prompt with a single key press.

It is always great to have period software on period hardware!

(added: Short video of it running - https://www.youtube.com/watch?v=YPPNbaQaumk)



> It is always great to have period software on period hardware!

Really is. This is why I keep a load of old hardware around. Stuff like Mac OS 9 should be run on real hardware and same for old MS-DOS.



If not for space considerations, I would be right there with you.

I still have hardware and software to get me back to NT 4.0 or Windows 95 (OSR 2, please, it wasn't tolerable before that). I haven't needed to in a while, but in a previous job, we'd run across old disks in some author's archive and I'd go home to dig around, find a 5.25" drive and rig something up, reach back in time.

I could maybe do Windows for Workgroups 3.11.

If shipping weren't so brutal, I would love to send off my old stuff to someone who would use it. I still have working SCSI equipment! I bet somewhere there is someone stymied on trying to liberate some ancient works but for the necessary hardware/software setup.



Indeed - space is key for being able to collect and restore this kind of stuff. I have most of my working machines lined up along a wall in one of my garage/lab areas (https://youtu.be/XHvdqB6LSg0). My wife has pretty much no idea what those computers are for, and my daughter just wants to play Oregon Trail on them.

They are fun to collect and restore. It is also helpful to be good at replacing capacitors. ;). Those surface mount ones on the Mac mainboards are almost always bad!



SCSI drives almost never go bad. Compared to IDE/SATA, they were significantly better built and had lower failure rates. I still have a few 15k RPM Cheetahs that still work, last I checked :).


In general, it is surprising how many old hard drives still work. I have a good number of old SCSI drives (even Seagate!) that still work 40 years on. The ST225 in one of my XTs still works great as well. I actually have an ST238R still new in box, and I'll be curious to see if it can spin up some day. I suspect the bearings might be stuck after all these years!


I appreciate the work in getting this open sourced, but I find it telling that it had to be done through an outside motivator. There seems to be no internal "ticking clock" to get some of these things out into the open. That's fine - no one is owed the source code for this stuff - but it would be nice if there were more interest on the companies' side in getting some of their formative history out so people can learn from it.


That's valid feedback. There is no clock, but there maybe should be. In this case, yes, Jeff and I had to PUSH. And that's a hassle. I'll ask around.


DOS 5 was when I really got into computers. I spent hours poring over the included manual, learning all the shell commands and how to write .BAT files. Then I discovered QBasic and it changed everything.


It's fantastic work you've done. As someone who works at an older software company (founded in the early 80s), I'm sad that there isn't a push internally for us to make our old software source available, or even just the binaries!

What sort of tactics did you use to convince them? Maybe I can apply them to where I work too...



It might not be a problem for DOS 4, but often the source code of software that was only ever meant to be published as closed source contains source code that was licensed from 3rd parties. This license may not allow publishing the source code.

Doing an investigation of what licensed software was used and possibly trying to get permission from the relevant rights holders (if you can even figure out who owns the rights so many years later) can be a big and expensive task, unfortunately. I understand why companies might not want to take that on (even though it sucks).



For DOS, I believe the core was only ever Microsoft or IBM. Some DOS versions bundled add-ons by third parties, but they are hardly essential for operation - e.g. MS-DOS 6 included DEFRAG and MSBACKUP (both licensed from Symantec) and MSAV (licensed from Central Point Software)

Similarly, with Windows, the third-party components are generally inessentials such as certain device drivers, games, and some optional system components like the ZIP file support in Windows Explorer - you would still have a usable OS with these bits ripped out. Parts of NTVDM are third-party licensed, although I believe that's mainly the software CPU emulator used on RISC platforms; I think x86 was mostly Microsoft's own code.



Agreed.

From MS-DOS 6, remove the defrag, backup and antivirus programs, and DoubleSpace/DriveSpace, and that should I think cover all external code.

If I remember correctly, it didn't include CD-ROM drivers, just MSCDEX to run on top of one... and the network stack was an optional extra. I'm not even 100% sure it includes a mouse driver as standard.

IBM PC DOS 6.3, 7.0 and 7.1 include some additional IBM code: Rexx in place of QBASIC, the IBM E editor, but not much else.



Isn't the zip support in explorer the stuff written by Dave Plummer? I would imagine MS has the rights to that already, and if they don't, I'd imagine Dave would, and I'm sure he'd be fine with it being released.

He has lots of YouTube videos about the zip stuff.



DOS is relatively speaking tiny and actually pretty modular. You can delete a handful of files, mostly binaries and some help files, and that's it, code gone.

From the source it's a little different but there's little integration between the bits.



For some insight, look at how people are combing the source for curse words and devs joking about people being brain damaged. There is no upside for the company, and all it takes is for one unsavory, politically incorrect joke to get missed during sanitization and they are on the cancellation chopping block.


The "cancel" stuff only comes from a tiny minority of vocal extremists. Everyone else is entirely unfazed.

Of all the things people here probably hate about the current "modern" Microsoft and its products, political incorrectness in decades-old code is far down the list or not even a consideration.



It is about risk. These companies are deeply afraid of being outed that way. They do not want to end up in court over something silly. Just remember that a bottle of Windex has the words "do not drink" on it. 99.99% of people out there would never have done it, but there is that small cadre of people who will do it and sue - either for the power of it or for money. Do not underestimate the depths that fools will go to.


Yeah… if folks are offended by some of the comments in the source code here, they really ought to have a look at some of the other popular media from the era to contextualize what was considered acceptable at the time.


I think I lost it when they suggested we stop using the term 'sanity check' or 'sane defaults' because they might offend, well, brain damaged people.

I am close to writing a browser extension that does a find and replace to reverse change these imposed, humourless, coddled changes.



> I am close to writing a browser extension that does a find and replace to reverse change these imposed, humourless, coddled changes.

I will be your first paying customer



Legal also may be concerned that having source makes it easier to detect patent infringement and code copying. Even if you deem the risk zero that that actually happened, why run the risk of somebody claiming you did?

For the company there are as good as zero downsides to not doing anything; doing it brings a few small upsides and a few downsides that are low-risk but potentially very costly (in dollars or reputation) if they happen.

That makes not doing anything the default choice for the company.

For (former) employees who worked on this, the upsides are higher; they’ll get some of their work published for the first time. That’s why we see individuals push for this every now and then.



Those niche people are the developers they want using their software. When that niche is the target audience for a whole product line of yours, pleasing them is a good idea.


The historical value is immense for our species, especially in the far future. There is a moral imperative for this kind of thing to be made available for posterity that, in my opinion, completely overshadows any commercial, copyright or political correctness concerns.

Frankly, there should be regulations guaranteeing source code release after a few decades, with all code, including third-party code, released from copyright protection. In return, companies should be granted protection from any potential legal consequences. It was over 30 years ago. The idea that somebody should be able to sue Microsoft for copying code, or that a third party could sue them for releasing it, or that they should in any way be punished for unsavory language used THIRTY years ago, is clearly utter insanity.



It also sounds like the code didn't come from inside the house... I wonder how many versions of the raw code for these early OSes actually exist? Startups are more concerned with survival than with archiving their code. How many people at their current company are putting in real effort to maintain early versions of their existing code bases?


Most startups today probably preserve the vast majority of their code, due to how prolific git is. Companies that don't use monorepos may have a bit of a problem, if your stack is a mess of microservices, some of which you eventually retire, it's easy for that old code to get lost.

Before the popularization of source control, things were different. A lot of people would just edit code willy-nilly, only keeping very recent backups and the source for major versions, if that. There were no commits and hence no tags, so reproducing the code used to build a specific minor version might be completely impossible. There were no branches, so ports and patches were often made by copying the code tree, doing some changes, compiling and then forgetting about the whole thing. It was entirely possible for a game studio to give their code over to another team or company, have them do a port and never actually care much for the ported code themselves.

Then there's the problem of external components, most of which were licensed, third-party software, not open source libraries. Even now, they may technically still fall under copyright and be impossible to release.



The binaries for the multi-tasking bits did come from an external source; however, the source code is from our corp source code archives team. Even that was a bit less formal back then...


Since 1984 no one was offended by it, but in 2024 it became offensive to insane people.

AFAIK "sanity check" doesn't even have a history like "retarded" (which actually became an offensive slur).

Maybe some folks just feel left out if they can't rage about and feel offended by something these days.

::eyeroll::



As someone who recently asked our QA team to change the name of one of our test suites from "sanity test" to "quick test", maybe I can provide some perspective.

Many software developers like me have faced challenges to our mental health.

Indeed, there has been a time or two when I questioned my own sanity. And perhaps the sanity of some of my colleagues!

It's not so much that there is something inherently wrong with the term "sanity check". We all know what it means. It's just that there are more descriptive and neutral terms available to us, so why not use them?



1. Other terms are neither more descriptive nor more neutral.

2. If someone has a problem with the term "sanity check", they are overly sensitive and they need to adjust to the realities of life. It is unreasonable to expect everyone to coddle them.


sunset() triggers me due to all the times I've seen companies "sunset" APIs only to still have them in production 5 years later.

Let's change it to cancelled()



> It's just that there are more descriptive and neutral terms available to us, so why not use them?

"quick test" does not convey what is usually intended with "sanity test". The intention of the latter is to verify that a base set of assumptions hold true whereas "quick test" can be just about anything.



Walter, you and I have known each other for a long time. Not in person; we've never met. But through our interactions here on HN.

I have a lot of respect for you and for everything you have accomplished.

So I have to ask you directly: Is being "boring" or not the way we should decide how to express ourselves?

Regarding a "quick test" vs. a "sanity test". Instead of judging this on what is "boring", why can't we make a choice on which is more respectful to our peers and colleagues?



Have you ever heard someone get called slow? That's an insult too.

So now you have quick tests. That's disrespectful towards me because I got called slow once.

This can go on forever.



Can we get rid of the "failing" terminology? I think we should call it something more meaningful, such as "it shows there's scope for improvement", or "it highlights areas for more focus".


I like to play with language. English has a million words in it. Why not use it? And why stop there, I insert words from other languages, play with the spelling, use bad puns, allusions to movie dialog, whatever crosses my mind.

I hope the readers/listeners would find it fun, too.

If they mine what I write for an insult, that's on them. I'm not interested in people looking for an insult, or people who get offended on behalf of others. I don't enjoy being around people where I have to walk on eggshells.

For example, I'm bald. I'm not a person experiencing hair loss. I'm bald. I've got a chrome dome. I have to wear a hat when I drive in order to not blind oncoming drivers with the glare from it.

Is changing the words going to change anything? Nope. I'm still just as bald, no matter what words are used.

I think it was Paul Graham who recently tweeted that people should use the word "dig into" rather than "delve into", because the latter is pretentious and the former is simpler. I'm a delver, spelunker, archaeologist, excavator, explorer, etc. Take that, Paul!



If you're writing for the pleasure of writing, that's fine. I think such an approach is great for fiction books and poetry in particular.

If you're writing in a professional context, expect your words to be read by non-native English speakers or just people that are extremely tired and want to get their work done quickly.

I think using uncommon and domain-specific words is fine when nothing else is precise enough, but there's no reason to say "touch base" when "get in contact" works.

People should also be aware of cultural context, if you're being overly sensitive and using flowery corpo-speak, people from other cultures may miss your point entirely. A sentence like "John is doing great work, but I can see some areas for future improvement" reads very differently depending on the culture. "John isn't up to standards and needs to improve if he wants to stay with the company" is much clearer. "John fucking sucks and needs to do better" is overdoing it and using the excuse of being direct to be a jerk.



We should make choices based on communication, first and foremost. Words that do not communicate are not useful.

A "quick test" implies almost nothing, other than it is faster than some other, unnamed test out there. Is a quick test good or is it bad? No way to tell. This term, quick test, it does not inform.

On the other hand, a sanity test, well, it is more evocative. You definitely want to pass a sanity test.

If you reach back to your psychology classes, they talked about a four-part test that is useful for determining whether a behavior is sane or not.

1) Is it abnormal? Unusual? Out of the ordinary? for the situation.

2) Is it unjustifiable, unreasonable given the circumstances?

3) Is it counter-productive? Which is to say, does the behavior serve the individual, or does it in fact make things worse? Or simply do nothing?

4) (and this is where my memory is fuzzy) I think it involves personal distress. For the life of me I can't find it via Google and it annoys me.

Now, interestingly, a sanity test matches the first three! (The fourth, well, there is no "I" in the program to be distressed) In programming, a sanity test looks for something abnormal, not reasonable for the program, and represents a state that won't get us the desired output. In short, it's highly congruent to the other context for sanity.

This term, it communicates, and in a way "quick test" does not.

In general, one of the main critiques of political correctness which few really notice is that the new term is often less specific and less useful - it fails at communicating. Consider when "Oriental" fell off the euphemism treadmill in favor of "Asian." We all knew that the first term referred to a particular part of the world, from the Latin for "east" (ex oriente lux, and so on). "Asian," however, could refer to people in Russia or people in India, but nobody in the US uses it that way (I note that in the UK, India does get the "Asian" pass). So here the new term is less specific and less useful, and is confusing to boot, because we deliberately ignore parts of Asia when using the term "Asian."

If you want something to supplant "sanity test," you gotta work harder for it than "quick test."



We have literally no idea what the "sanity test" in question actually did, and the common use in computing has little connection to any such use in psychology.

https://en.wikipedia.org/wiki/Sanity_check informs me:

"A sanity check or sanity test is a basic test to quickly evaluate whether a claim or the result of a calculation can possibly be true." ...

"In computer science, a sanity test is a very brief run-through of the functionality of a computer program, system, calculation, or other analysis, to assure that part of the system or methodology works roughly as expected. This is often prior to a more exhaustive round of testing. " ...

"In software development, a sanity test (a form of software testing which offers "quick, broad, and shallow testing"[1]) evaluates the result of a subset of application functionality to determine whether it is possible and reasonable to proceed with further testing of the entire application."

and further comments that "sanity test" for some people is interchangeable with "smoke test".

It also adds:

"The Association for Computing Machinery,[8] and software projects such as Android,[9] MediaWiki[10] and Twitter,[11] discourage use of the phrase sanity check in favour of other terms such as confidence test, coherence check, or simply test, as part of a wider attempt to avoid ableist language and increase inclusivity. "

I could not find a definition for "sanity test" in psychology. I know about the cognitive test that Trump made famous with ‘Person, woman, man, camera, TV‘.



I cannot cop to "smoke test" being interchangeable with "sanity test," in any way. Smoke tests are obvious crashes, often before any real input. I had a friend on the build team of Windows NT 5.0 (until it was called Windows 2000) and their version of smoke test was "can it boot up to the login screen," and that's similar to working in electronics, where powering up your machine or circuit is all you do and you hope it doesn't release the magic smoke, before you actually do anything with it.

On that alone, I am very doubtful on that article.

This is also distinguished from what I have taken to calling a "pre-flight checklist" at the start of large programs, making sure databases are connected, requisite tables are present with the correct columns and datatypes, that specific files or directories exist, that files may be of a certain "freshness," and so on.

I'll stand by it: sanity test is a lot more descriptive and useful than "quick test." Those footnotes are only kicking the can down the road of justifiability. But let's examine them according to the aforementioned The PC Replacement is Less Useful criterion.

Recommending just "test" is ... well, double-plus ungood, in the sense that we now have no idea of the qualities of this test and how it could be distinguished from some other test. Hard pass.

"Confidence test" only implies a statistical likeliness, like a confidence interval. It does not imply the This Ought Not to Happen of a sanity test.

"Coherence test" is interesting but ... coherent against what?

If someone wants to sell me on a replacement, they are free to try, but the replacement must be at least as good at communicating what it does as the term sanity test. If this fails, then it will be scoffed at, and should be.



We do not know if what was described as a "sanity test" actually meets your definition of a sanity test, or was used for some other purpose like a smoke test.

A quick look using Google Scholar finds people using "sanity test" for smoke testing, like

"Typical behavior is to allocate any special resource requirements it needs, map the device into virtual address space, initialize the device and perform a brief “sanity test” to ensure that the device appears to be working correctly" at https://onlinelibrary.wiley.com/doi/pdf/10.1002/%28SICI%2910...

or

"The minimum essential test cases that need to be executed to evaluate the essential functionality are known as Sanity Test Cases" https://uksim.info/icaiet2014/CD/data/7910a048.pdf

or

"Sanity test is a brief run-through of the functionality of the software system to assure that the system works as expected." https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6006830

But it's also used for full testing, including human intervention, like

"There will always be a role for manual testing. For one, it is the only real way to sanity-test your automation itself." - https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d...

and

"Sanity test cases which checks basic functionality and are run for pre-system acceptance and when product goes thru major change. These test cases deliver a very high project value to both engineering dept and to customers." https://ijaer.com/admin/upload/06%20Apr_2012_Sheo%20Kumar.pd...

as well as this example, which distinguishes between shallow-and-wide smoke tests and deep-and-narrow sanity tests.

"a smoke test generally consists of a collection of tests that can be applied to a newly created or repaired computer program. This is a “shallow and wide” approach to the application." while "Sanity testing will be performed whenever cursory testing is sufficient to prove that the system is functioning according to specifications. A sanity test is a narrow regression test that focuses on one or a few areas of functionality. Sanity testing is usually narrow and deep. It will normally include a set of core tests of basic GUI functionality to demonstrate connectivity to the database, application servers, printers, etc." - http://archives.christuniversity.in/disk0/00/00/48/68/01/the...

Going with the PC Replacement question: since we don't know the actual goal of what was called a "sanity test", we can't ourselves come up with a better name.

Which is why we should go with the OP's assessment that "quick" is an appropriate term for what they are doing. Since you don't know what that is, it doesn't matter if you buy a new term or not.



> Many software developers like me have faced challenges to our mental health.

It's a stressful industry at times, probably almost everyone has had crap times at one point or another. And then there's Covid, which affected everyone.

Doesn't mean every mention of the word "sanity" needs to be expunged from our language though. :( :( :(

That way lies er... madness. (!) ;)



Words have multiple meanings. In that context, “sanity” means something like “reasonable and rational behavior” (look it up in a dictionary!). It’s counterproductive to forbid the use of words just because someone is somehow unable to differentiate between the multiple meanings words have. That would be insane — as in “highly unreasonable”, obviously.


I've struggled with mental illness all my life and have made several attempts to catch the bus. I'm diagnosed with several mental issues. Somehow I'm still not offended with the phrase "sanity check". What am I doing wrong? Is it possible to learn to get offended by this sort of thing? Will it make me happy?


Software engineering is a field full of neurodiverse people. Trying to police each other's language around mental health in a field where, at least historically, the majority of practitioners are going through some sort of mental health struggle, is borderline rude IMHO.


Without knowing what the test did, "quick test" in no way conveys the same information. Honestly, did you really need to ask them to do that? Please don't be that guy on the team (sorry, meant please don't be "that working set of hands on the team"; hope this illustrates the point somewhat).


And please don't use 'brain dead', which is a mortuary expression. Using 'brain dead' in a negative way may be insensitive to zombies. Also avoid the expression 'zombie process' or the word 'expired'.

You don't want to step on the toes of a zombie do you? (It's probably gross.)



Oh, you had to worry about legal. But since lawyers are concerned with risk, if the source is all proprietary and internal, the risk was lower.

When Mozilla went open source back in the 1990s, Netscape's lawyers required the source first be bowdlerized. (Search for "censorzilla" for some examples.)



Git does not tamper with file encoding. The only way file content modification can happen is automatic CRLF conversion, but that's a local setting; it does not affect the public archive unless a .gitattributes file is present, which is not the case here. It is common sense that one should never enable this setting globally on your system, precisely because it can tamper with content unexpectedly.
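For reference, CRLF handling is governed by the local `core.autocrlf` setting and, when committed, a `.gitattributes` file; a repository that wanted to pin the behavior explicitly (the patterns here are purely illustrative, not from the MS-DOS repo) might include:

```text
# Normalize most text files to LF in the repository
*       text=auto

# DOS-era sources were CRLF; keep them that way on checkout
*.ASM   text eol=crlf
*.BAT   text eol=crlf

# Never convert binaries
*.COM   binary
*.EXE   binary
```

Without such a file, whatever conversion happens is purely a function of each cloner's local configuration.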


This is important. If we're really trying to open source and preserve DOS, we need a better system. Someone at MS needs to create museum.github.com and allow us to browse everything just as it was preserved on disk.


I think it's a mix. We're learning a bit each time. I'm bummed that in the rush to get this finally published we made a few errors. We first found what looked like the source back in September last year.

Having the source on GitHub feels like a nice way for some people to peek through it... but there's a clear compromise between archival and sharing some source.



My brain is rusty, but I feel like MSDOS 5.11 was where things finally just worked. TSRs, memory managers, etc. Moving a lot and not being a packrat I've lost some of that history.

It'd be interesting to see 5.x and 6.x released.



I think the pinnacle goes to MS-DOS 7.1, which while bundled with Windows was also usable as a DOS by itself and contained features like FAT32 support. MS-DOS 8 was the last version that came with the ill-fated Windows Me and significantly neutered.


I think it was 3.3x where things started working. I don't recall 4.x being around much. I do remember 5 and 6. For some reason 4 never made a splash in my circle of friends.


4 was, TBH, appalling for its time.

It took more base memory than ever. It had a complicated code page system that most people didn't want, and a clunky early version of IBM DOSShell that was scorned although it grew into something useful.

But it supported disk partitions over 32MB, and for that reason, it was reluctantly adopted. If you had a 286 or 386, then there were measures that you could take with a memory manager to make it not so bad, but on 8088/8086 class hardware, it didn't leave enough free memory for many big apps to run.



I was kind of happy with MSDOS 2.11, I felt that they'd got the basics in place (in particular hard disk / subdirectory support) and that bloat hadn't started. From memory I used this for years and years although I was young so time didn't rush past so quickly so who really knows. I kept a version of MSDOS 2.11 debug.com around for decades (patched with itself so it wouldn't just do a version check then quit). From memory it was something like 12K bytes whereas debug.exe from MSDOS 6.x was more like 60K bytes.


> My brain is rusty

It is. Because:

> but I feel like MSDOS 5.11 was where things finally just worked

There was no MS-DOS 5.11.

It went 4, 4.01, 5, 6, 6.2, 6.21, 6.22.

IBM had a few extra versions as the divorce was occurring at the time.



So if MS-DOS 4 was released in 1986, and it is now 2024, that's a 38 year gap between release and open source.

That means Windows XP should be open sourced by ... 2039. Not as far away as it seems. I'll add it to my calendar.



I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code, just with added touchscreen nonsense, adware, and overt contempt for the user.

We may see Windows 9x open-sourced. But then again, it's a stretch because Win32 API is still in wide use today. Releasing the sources for 32-bit Windows versions even this old may have an adverse effect on Microsoft's market domination.

But maybe ReactOS will reach beta by 2038. Does this count as an open-source version of Windows XP? :D

If you really wish to look at XP sources and don't care much about the legal aspect of it, you can do so right now. They were leaked.



> Releasing the sources for 32-bit Windows versions even this old may have an adverse effect on Microsoft's market domination.

I disagree that releasing Windows 9x source code would have any impact on MS market domination.

> I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code

Nowadays releasing something NT-based like XP may seem crazy. But in 15 years it will be so far away from future Windows that it won't be that crazy.



> But in 15 years it will be so far away from future Windows, that it won't be that crazy.

It's not like the NT kernel will be going away from current Microsoft products anytime soon.



> I doubt Microsoft would ever open-source any NT Windows versions because the current ones are based on the same code, just with added touchscreen nonsense, adware, and overt contempt for the user.

Initiatives like MinWin and OneCore, the secure kernel, Device Guard, ... caused lots of rewrites and moving of code around.



All open-source projects that deal with reimplementing parts of Windows, particularly Wine and ReactOS, consider those leaked sources radioactive and would not accept any patches if there's even a slightest suspicion that the patch author gleaned anything from those sources. Those same sources officially released under an open-source license would change that.


I wouldn’t assume Microsoft execs view increased capabilities to run windows programs in Linux as a bad thing, when they think about the matter at all. They would certainly prefer that such a capability be developed by someone else, so they don’t have to support it.


Around the time Windows 2000 came around.

Up to Windows 3.11 it was a GUI on top of DOS. Windows 95, 98, and Me used DOS to boot, and it was still possible to stop the boot process at a DOS prompt (although in Me this was no longer official). Finally, Windows 2000 had nothing to do with DOS, as it is NT-based.



Windows 2000 was part of the professional NT line, though, and was the companion of Me for the millennium releases. As far as I know, 2000 wasn't marketed to home users. I think what the comment you replied to is saying is that the transition away from DOS wasn't completed for both professional and home markets until XP, which unified everything under NT for all markets.


Around the year 2000, I was studying computer science at a university. Most of their PCs ran Windows 3.1. I was using it at home. But one day, Microsoft sent me an offer: I could purchase the student release of Windows 2000 workstation for a mere $25.00. I went for it, and found it better than the Windows NT naysayers at school said. I don't know why I was contacted. Probably because of other Microsoft programs I'd bought at the student bookstore.


Windows 2000 was a pretty great OS. Used to enjoy using a Litestep shell instead of explorer. While it wasn't great for a lot of games, many did run fine. I liked it a lot better than OS/2 that I ran previously.

I generally ran 2-4x the amount of RAM as most did. Still do. Pretty sure this made a lot of the difference.



Hey, Litestep, what a blast from the past :)

I ran it until it wouldn't run sensibly anymore in Windows 10. I then ditched Windows for Linux soon after. I can recommend KDE Plasma if you want something that's sorta configurable enough, like Litestep was.



WindowBlinds is a window decoration customizer; LiteStep does nothing of the sort :) LiteStep completely replaces explorer.exe as the shell host, and you can then customize what functions you want to have in your UI. The windows themselves would stay looking the same.


Windows 2000 Pro was what I used at home for a long time and it was great. NT 3 and 4 were absolutely terrible which might explain your NT naysayers at school. I never once had to reapply a service pack in Win2k


Still remember the first time I touched Windows NT 4. Half an hour into work experience: opened up a printer dialog, set a setting that hard-crashed the PC; then slowly every other PC in the building as soon as they tried to print (i.e. just as they had _finished_ whatever they were working on, but often just before they _saved_ it).


Can confirm. I upgraded my 98 box to 2000 and never did get some of my hardware working. When I told people I was using 2000 everybody assumed I had stolen it from work. I didn't. My friend stole it from work and shared it with me ;-)


Kind of, for a very long while. You then had a descendant, SFU, from some SP of NT4 through XP / Server 2003, then a further one, SUA, until Windows 8 / Server 2012, with some code flowing between various companies. I think SFU still used the POSIX NT subsystem core. Probably also SUA, although I'm less sure. Not really the case for WSL1, though (although the core NT kernel was probably more ready to support it, thanks to its history).


I'm more interested in them open-sourcing something from the 3.x/9x line.

NT seems to have been far more studied, and of course there were the infamous leaks of those along the way.



MS-DOS 4 was reportedly a bad release overall and was not in wide circulation; in all my days I think I only came across it once. This is why DOS 3.3 and 5.0 were much more commonly found in circulation together.

I'm sure the source for 4 will make for some interesting bug hunting. Anyone remember the MUF list? "Microsoft's Undocumented Features".



This is "multitasking dos 4" though, which isn't the same as the much-reviled ms-dos 4. As I understand it, it's a lot closer to dos 3 than it is to dos 4 in terms of functionality.

I wouldn't expect this to understand extended partitions, much less large partitions (that dos 4 uses)



>I wouldn't expect this to understand extended partitions, much less large partitions (that dos 4 uses)

Most of the source code (everything outside of -ozzie) is for regular DOS 4.0 and supports 32-bit sector numbers. They planned to add it in the multitasking version as well [1], but from reading IBMDSK.ASM it isn't there yet.

Also that driver talks directly to the hard disk controller instead of going through the ROM BIOS, and will only support XT drives, not IDE/ATA. Apparently the goal was to be able to do background I/O on an XT, where there is no BIOS support for that.

[1] see driver docs at https://raw.githubusercontent.com/microsoft/MS-DOS/main/v4.0...



No, you are thinking of 4.01; this is 4.0.

Those are very different operating systems, this is DOS 3.2 + later abandoned very crude multitasking features. Roughly.

And this matters because DOS 3.3 was a milestone.

4.01 comes from 4.00 which has nothing to do with 4.0 (yay for versions).



>And this matters because DOS 3.3 was a milestone

DOS 3.3 couldn't understand large partitions, except for Compaq DOS 3.31. But regular DOS 3.3 couldn't. I don't think DOS 3.2 could even understand extended partitions/logical drives, much less large disks.

Still -pretty neat!



“DOS 3.3 couldn't understand large partitions -except for Compaq dos 3.31.”

This is not accurate. Several OEMs added proprietary variations of FAT which supported larger partitions. For instance I run Zenith MS-DOS 3.30+ which has this ability on a Zenith Z-161 XT compatible luggable.

Compaq’s 3.31 added FAT16B support which allowed larger partitions and was the standard for larger partition support going forward in standard MS-DOS.



OpenDOS isn't open source, it's source-available. The license reads more like trial software:

"Caldera grants you a non-exclusive license to use the Software in source or binary form free of charge if your use of the Software is for the purpose of evaluating whether to purchase an ongoing license to the Software. The evaluation period for use by or on behalf of a commercial entity is limited to 90 days; evaluation use by others is not subject to this 90 day limit but is still limited to a reasonable period"



The whole OpenDOS thing is pretty questionable, too. CP/M is open source, as are its derivatives. Cool so far. But is DR-DOS a derivative of it? Or is it bound by the 'non-commercial' license of the '90s, which (a) was revoked and (b) isn't exactly open source (it limits distribution) in the first place?

Microsoft's releases have the benefit of being unambiguous.



I feel like FreeDOS has been able to run just about everything for 20 years or longer. If your goal is to run DOS software, your use case was probably already adequately covered by free software.

An interesting thing about DOS is that the OS wasn't very involved. Programs did a lot of things we now think of as the realm of an OS, like talking directly to I/O addresses or installing interrupt handlers. I feel like a DOS implementation doesn't even need to do a lot of things, maybe part of why DOS 4 is "good enough".



> stupid double stack

DoubleSpace/DriveSpace?

Mostly, you are absolutely right, yes. MS-DOS 5.0 was the peak, and then it started to acquire bloat as MS bundled stuff to compete with DR-DOS 6 and DR-DOS 7.

But the thing is that by modern standards, the bloat is tiny. :-)

I gave MS Office 97 a bad review at the time because it was several times bigger than Office 95, had virtually no new functionality, but introduced a pointless new file format just to get people to upgrade so they could read the files sent to them by companies with the newer version.

But for a decade now, Word 97 has been my go-to version. With all the service releases installed, it works great, it's tiny by modern standards -- I think a full install of Word with all the optional filters and things totals 14MB -- and it's lightning fast on 21st-century hardware.

Word 95 is even smaller and quicker, but it can't read or write the file format that everyone else, from Pages to Wordpad, uses. So it's crippled: everything has to go through LibreOffice first, to convert it to a .DOC format anything else can view, import, edit, or print.

Time changes the meaning of bloat somewhat.



Bloat is partially in the eye of the beholder, and 6.22 wasn't terribly large by any means, but the things it added were not incredibly stable, and it showed.

It's also amazing how long things like DOC to DOCX took to really "take hold" in the industry at large. I still get DOC files now and then.



No, still files called `.DOC` but with a different internal format.

There have been at least 3 successive MS Word file formats:

Word for DOS, WinWord 1/2/6/95: old .DOC format

(I think this also applies to Classic MacOS Word 1-5.)

Word 97, 2000, XP, 2003: new .DOC format

(Classic MacOS Word 98, X and 2001 were based on a port of the Windows codebase. Mac OS X Word 2004 and later are OS X-only, but are still based on the Windows codebase and use the same file formats.)

Word 2007-365: new Zipped XML format, .DOCX



It could use RTF, but it used DOC as the default format. Word 97 introduced a new DOC format, different from the old one.

The latter is more capable and compatible with modern things. Open Word and see how it calls DOC "Word 97-2004".



Are there specific parts of the DOS API that existing emulators like DOSBOX don't handle accurately enough?

I don't understand if this source can be usefully integrated into modern DOS preservation projects.



It's been a long time since I messed with things, but something like "net drives" didn't work in FreeDOS back then. This was useful because it meant you could access your host file system from within a PC emulator. It's entirely possible that works on FreeDOS now.


This is almost completely unrelated to your comment, but it sparked a fun memory. At one point I had a system set up that would boot DOS via iPXE with iSCSI drives. I thought it was almost magical how DOS had no clue it was using a network drive. I still don't know exactly how it worked, but I suspect iPXE was patching the BIOS.


DOS was pretty reliable about using BIOS interfaces for drives; if you imitate the BIOS interface it’ll just work.

The problems came from limitations of the BIOS interface (especially size)



If running software is what matters, there's dosemu2 and dosbox-x.

For the actual hardware or PCem, FreeDOS exists and is alive. DR-DOS has also been open sourced.



DR-DOS hasn't been open sourced. Caldera did release the source for the kernel and a few other bits, but the license only allowed free use for evaluation purposes. After 90 days (for a company) or "a reasonable period" for non-commercial entities you were required to buy a license.

Bryan Sparks did open-source CP/M a little while back, but AFAIK he hasn't said anything about DR-DOS so far.



That's the DR-DOS/OpenDOS Enhancement Project. It's a set of patches for the Caldera OpenDOS 7.01 kernel.

The license file inside the original Caldera OpenDOS 7.01 source archive says:

"Caldera grants you a non-exclusive license to use the Software in source or binary form free of charge if (a) you are a student, faculty member or staff member of an educational institution (K-12, junior college, college or library), a staff member of a religious organization, or an employee of an organization which meets Caldera's criteria for a charitable non-profit organization; or (b) your use of the Software is for the purpose of evaluating whether to purchase an ongoing license to the Software. The evaluation period for use by or on behalf of a commercial entity is limited to 90 days; evaluation use by others is not subject to this 90 day limit but is still limited to a reasonable period."

So that website is incorrect when it says OpenDOS was released under an open-source license. Not surprising though - most websites discussing OpenDOS make this error. Possibly because at the time I believe Caldera did actually talk about open-sourcing DR-DOS, they just failed to actually follow through.

If he still has the source code, what's needed is for Bryan Sparks to release it under some regular open-source license, like Microsoft have done here.



I heard there was some resolution re: copyright mess in the last few years, but I currently cannot find anything about it.

To the point I might have dreamed it. Odd.



I'd like to see NTVDM open sourced... It's been leaked and there are unofficial builds to get support into 64-bit Windows through the emulation code. Could be a huge boost for general support.

Not sure how much work it would take for Linux or even just wine. But might displace DOSbox.



Ah, the good ol' days of configuring AUTOEXEC.BAT and CONFIG.SYS to squeeze out a few more KB of RAM :) And setting IRQs for some weird sound card that just doesn't work!
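For anyone who never had the pleasure: a typical memory-squeezing CONFIG.SYS of the era looked something like this (paths and driver order varied per machine; this is purely illustrative, not a canonical config):

```text
REM Enable extended memory (XMS)
DEVICE=C:\DOS\HIMEM.SYS
REM Open up upper memory blocks; no expanded-memory emulation
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Load DOS itself into the HMA and use upper memory blocks
DOS=HIGH,UMB
REM Push drivers into upper memory to free conventional RAM
DEVICEHIGH=C:\MOUSE\MOUSE.SYS
FILES=30
BUFFERS=20
```

The whole game was freeing as much of the first 640KB as possible, since that was all most programs could see.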


I was having a chat with one of the young guys in the office yesterday. He was complaining that his first PC had Windows 7 and was slow because he only had 2GB of RAM. And I was thinking: gosh, he's probably never typed "dir" or "c:" in his life... I feel sooo old :/