(comments)

Original link: https://news.ycombinator.com/item?id=40884356

A user proposes a scenario in which the Associated Press (AP) adopts new guidance to avoid counting centuries when reporting on historical events. Because the change is easy to understand and requires minimal effort, it could lead to broad adoption across English-language publishing. The example given is the Celsius temperature scale, in which zero marks the freezing point of water and helps convey significant changes in temperature. The user rebuts the argument that people would not readily adapt to the idea, arguing that existing systems can drive gradual change without disruption. The discussion then turns to counting methods, in particular the debate between zero-indexed (counting from zero) and one-indexed (counting from one) systems and their effects on human cognition. The user closes by stressing the importance of understanding the underlying principles rather than expecting change.


Original text


Author makes a good point. "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebek's Jeopardy in 1984 illustrates how confusing this can be:

https://www.youtube.com/watch?v=KDTxS9_CwZA

The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.



Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.



The publishing industry already has style guides for large swaths of the industry.

Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.

The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.

I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.

If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.



Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing"



Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temp is to understand the weather and I like the 0-100 range for F as that’s the typical range for weather near me.

For scientific work I obviously prefer kelvin.

Celsius is nearly useless.



For me the best feature of Celsius, the one that makes it much better for weather, is the zero at the freezing point of water. Everything changes in life when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.



95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it or you belong to a sub-5th percentile which couldn't figure it out; take your pick.



> sub-5th percentile which couldn't figure it out

Ironic, given that one of the prime arguments in favor of metric is that it is easier.

Why do non-US people even care? And do y'all care that you are wrong? The US has recognized the SI. Citizens continue to use measurements they are comfortable with, and it does not hurt anyone. We are also not the only nation that has adopted SI but not made it mandatory. The UK is an obvious example.

Again, I'm back to 'why does anyone else even give a shit'? Aren't there more interesting things to ponder?



I agree. For ambient temp, F is twice as accurate in the same number of digits. It also reflects human experience better; 100F is damn hot, and 0F is damn cold.

Celsius is for chemists.



There's very little difference between e.g. +25°C and +26°C, not sure why you would need even more accuracy in day-to-day life. There are decimals if you require that for some reason.

Celsius works significantly better in cold climates for reasons mentioned in another comment.



If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments? The decimals are used, because the change between 25C and 26C is actually pretty big :)

In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot. And that's a difference of about 0.5C. I'm not arguing that Fahrenheit is better, but I definitely do prefer it for setting my thermostat (which is a day-to-day thing), but then again I grew up using it so that could be why I prefer it too.



> If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments?

Probably because they were made for US and changed the labels? I've never seen a thermostat with 0.5 C increments in Europe.

> the change between 25C and 26C is actually pretty big

I would maybe be able to tell you if it's 23 or 27, certainly I can't tell 1 C difference.



The difference between -1 C and +1 C is VASTLY more important in daily life than the difference between 26.5 and 27 C.

Farmers, drivers, people with gardens need to know if it will get subzero at night.

Nobody cares if it's 26.5 C or 26 C.



At sea level, yes :)

I do agree, though I live in Europe and C is the norm. I could never wrap my head around F.

That said, I think 0 is more important in daily life, below or above freezing. How much is that in F again?



My parents care a lot about "przymrozek" - which is when it gets sub-zero C at night and you need to cover the plants and close the greenhouse doors and put a heater there so the plants survive. They give warnings on the radio when this happens outside of the regular winter months.

There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.

I'd say it's way more important a distinction than anything that F makes obvious.



You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12 point “wind levels” to m/s. It took everybody a few weeks to adjust but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature, and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F simply because I never experience that temperature (and my kettle doesn’t have a thermometer).

That said I wished USA would move over to the unit everyone else is using, but only for the reason that everyone else is using it, that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.



> only for the reason that everyone else is using it

That is an honest answer, which is refreshing. Beside that, there is not really any particular reason that the US has to make SI mandatory. We adopted SI nearly 50 years ago, we just did not make it mandatory. The US has a bit of national identity which leans towards rebelling, so making SI mandatory would probably be contentious anyway. And it's just not worth the argument, since it buys us very little of actual value.



Temperature is easy, probably the easiest unit to convert... Everyone would get used to it pretty soon after they started using it regularly. There would be some legacy systems out there which would be annoying to convert (which is already the case), but within a generation nobody would bother with Fahrenheit at all.

I think the hardest unit to convert is probably length, as there is not only a bunch of legacy systems and equipment out there, but Americans are very accustomed to fractional sub-units as opposed to the decimal cm, mm, etc. I'm not sure e.g. the building industry would ever stop saying e.g. four and five eighths. Personally I hate fractional lengths when using American tools. E.g. I'm used to an 11 mm wrench being smaller than a 13 mm wrench. I need to stop and think before I know which is smaller, five eighths or three quarters.



> american tools

That's an interesting way to phrase it. I, and everyone I know, have both metric and SAE tools. At least for wrenches & sockets.

> I need to stop and think before I know which is smaller a five eights or a three quarters.

I'm with you there. I've gotten in the habit of just mentally converting every SAE size to 32nds. I wouldn't really mind losing SAE, but that is not happening. What really makes my blood pressure go up is Ford ... they mix metric and SAE fasteners on their cars. WTF! Pick one! Subaru is at the other end, easy to work on because 10 & 12mm wrenches will work for maybe 9 out of 10 bolts or nuts.



I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.



People normally just use the subunit which doesn't divide. E.g. height is usually referred to in cm. If accuracy is important they use millimeters. Road signs for cars use km but downtown wayfinding signs for pedestrians use meters.

I agree it is really nice to use base-12 until it breaks, but it breaks much worse than metric. If you have to divide into 32nds, everything about feet and inches is much worse (in metric we would just use millimeters). The worst offenders are wrenches, which don't order intuitively. In metric, if your 13 mm wrench is too big, you just grab an 11 mm wrench. In inches, if your 13/16th inch wrench is too big, do you grab the 5/8th or the three-quarters next?



US residents...

If you, as a US citizen, settle abroad, be prepared to run into a wall with Fahrenheits. People in the rest of the world don't have an intuitive grasp of whether 50 degrees Fahrenheit is warm or cold.



> US residents

Yeah, that's the right terminology. I knew when I said 'citizens' it wasn't quite right, but I blanked on the right answer. 'Residents' is pretty obvious.

> be prepared to run into a wall with Fahrenheits

I agree it's worth knowing just enough about celsius to use it casually when you are traveling. e.g. I just remember 20 is room temperature and every 5C is about 10F. Close enough. And remembering '6' is enough to remember how km and miles are related.

Anyone who is settling abroad ought to be able to pick up intuitive celsius in a couple days. When everyone around you uses the same measuring unit, you adapt pretty quickly IME.



What the hell are you talking about. If it's 0°C outside (or below that), I know that it's high time to put winter tires on because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, but apparently that's +32 °F. Good luck remembering that.

+10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot, put as little clothing as society permits and stay out of direct sun.

Same with negatives, but in reverse.

Boiling water is +100°C, melting ice is very close to 0°C. I used that multiple times to adjust digital thermometers without having to look up anything.

It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.



You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.

40 F is a good time for getting winter tires on.

As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.

The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermostat to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.



Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.

I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.



> Well, it's been working fine for me for about 15 years, let's agree to disagree here.

If you want to convince yourself, go out on the road in non-winter tires when it is sub-40F, find an open space where you can experiment, and then do a panic stop. Like you might have to do if someone jumps out in front of you.

That is what convinced me to not wait until it was freezing before I put on cold weather tires.



Winter tyres have less to do with freezing water and more to do with the way the compound in summer tyres hardens and loses elasticity, and therefore grip, at lower temperatures, around 7 degrees Celsius.



> After their 16th birthday, the person is going through their 17th year.

While that is true, does it not illustrate exactly the problem? Nobody ever says someone is in their 17th year when they are 16. That would be very confusing.



"You can change the world if you make it easier to meet a need enough people have"

True and should not be forgotten in this debate.

But clear communication is a need many people have.



Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.

Same thing for "this weekend" when. Not spoken during a weekend.



> I think it's more doable to learn to just live with that than to reprogram mankind.

Why not just fix the calendar to match what people expect?

There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.



I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.



> I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless

no it won't lol, people will pay just as much through the new dating system as they would through the old.



People pay as much for art because they are the rare combination of educated person with money which values the aesthetics and artifacts of an era, or as something to signal their wealth to others, or as a way to launder money.



Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?

(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)



"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)

Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).



Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.



settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).



The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.

All these systems used the Julian system for months and days, but differed in terms of the year and (partially) in the first day of the year.



The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.



There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.

It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.



Yes but, is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.

Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.



If only—I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies)



> It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.

Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.

https://en.wikipedia.org/wiki/Holocene_calendar



We are all de facto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I'm fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.



What does that even mean? Do we allow for the distortion due to the shift from the Julian to Gregorian calendars, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and reference our normal counting system rather than getting hung up about the precise number of days since some arbitrary epoch.



> What does that even mean?

It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".



> whether that date actually existed or not is irrelevant.

No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?



0 CE = 1 BCE

10 C = 50 F = 283.15 K

1 = 0.999…

Things can have more than one name. The existence of the year 0 CE is not in question. What’s in question is whether that’s a good name for it or not.



The first day of the 1st Century is Jan 1, 1 AD.

The point is that some days got skipped over the centuries, but there's no need to make the Centuries have weird boundaries.



> The first day of the 1st Century is Jan 1, 1 AD.

That's not what the poster I originally responded to is saying. He's saying the 1st Century should start on a nonexistent day.



You can make this work by having the 1st century start on the last day of 1 BC. Think of it as an overlap if you like; it doesn't really matter.

That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.



In Icelandic, 1-based "counting towards" is used almost everywhere. People do indeed say "the first decade of the 19th century" to refer to the 18-aughts, and the 90s is commonly referred to as "the tenth decade". This is also done with age ranges: people in their 20s (or 21-30 more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is more rare among young folks): "að ganga fimm" (or going 5) means 16:01-17:00.

Speaking for myself, this doesn't become any more intuitive the more you use it; people constantly confuse decades and get insulted by age ranges (and freaked out when suddenly the clock is "going five"). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don't think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.



And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).



Doesn't matter, we can just agree the first century had 99 years, and be done with it.

We have special rules for leap years, that would just be a single leap-back century.

At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kind of uses we put centuries to (not doing math, but talking roughly about historical eras) it's inconsequential anyway.



I have a solution that would work in writing, but not sure how to pronounce it:

1700s means 1700–1709

1700ss means 1700–1799

To go one step further:

2000s means 2000-2009

2000ss means 2000-2099

2000sss means 2000-2999
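
A minimal sketch (Python, with a hypothetical helper name) of how the proposed suffix notation could be expanded mechanically; the parsing rule is taken from the comment above, not from any existing convention:

    def expand_year_suffix(label: str) -> tuple[int, int]:
        # One trailing 's' = a decade, 'ss' = a century, 'sss' = a millennium,
        # per the proposal above (hypothetical notation, not a standard).
        digits = label.rstrip("s")
        span = 10 ** (len(label) - len(digits))  # 10, 100, or 1000 years
        start = int(digits)
        return start, start + span - 1

    print(expand_year_suffix("1700s"))    # (1700, 1709)
    print(expand_year_suffix("1700ss"))   # (1700, 1799)
    print(expand_year_suffix("2000sss"))  # (2000, 2999)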



So shouldn't this be the "0-episode"? ;-)

(0, because only after the first question, we have actually 1 episode performed. Consequently, the 1-episode is then the second one.)



There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.



These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?



There was an airport-novel series about a future where people's surnames are the company they work for. It was called Jennifer Government.

Some of the characters in Death Stranding, namely the main one, have a given-name, profession, employer convention -- as in Sam Porter Bridges.



Death Stranding's naming is not too far from very common naming conventions throughout history; it's a nicely subtle touch.

Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.



Depends on the language. Century being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.

In Polish: [lata] tysiącsiedemsetne (6 [+2] syllables) vs osiemnasty wiek (5 syllables).



What about languages that don’t have an equivalent to “the Xs” for decades or centuries?

Also, 1799 is obviously more than 1700, as well as 1701 > 1700 – why should the naming convention tie itself to the lesser point? After one's third birthday, the person is starting their fourth year and is not living in their third year.

I feel this is relevant https://xkcd.com/927/



> Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".

Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.

If we want a better system we'll need to either abandon the day or the Gregorian (Julian + drift) calendar.



It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.

We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!

At least this is how I justify to my students that zero-indexing makes sense. Everyone's fought the x-century vs x-hundreds before so they welcome relief.

Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643
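
A tiny sketch (Python, illustrative function names only) of the arithmetic behind the zero-indexed convention described above, next to the traditional one-indexed convention:

    def century_zero_indexed(year: int) -> int:
        # Zero-indexed: 1700-1799 -> 17, matching the "1700s" label
        # (the same way a baby is 0 years old during its first year).
        return year // 100

    def century_one_indexed(year: int) -> int:
        # Traditional convention: 1701-1800 is the 18th century.
        return (year - 1) // 100 + 1

    print(century_zero_indexed(1776))  # 17 -> "the 1700s"
    print(century_one_indexed(1776))   # 18 -> "the 18th century"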



> We do the same with people's ages.

No, we don't.

When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.



> No, we don't.

But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).



“Half one” is archaic English, and common German, for 12:30. Similarly “my 27th year” just sounds archaic to me: I wonder if you went through a bunch of 19th century writing if you’d see ages more often be “Xth year” vs “X-1 years old”.

There may be something cultural that caused such a shift, like a change in how math or reading is taught (or even that it’s nearly universally taught, which changes how we think and speak because now a sizeable chunk of the population thinks in visually written words rather than sounds).



In the UK yes, I think not in AmE? At least I'm pretty sure they don't say 'quarter to' or 'quarter past', and do say 'a half after'.

(I had some confused conversation with a bus driver once. Bizarre experience to have so much language barrier between two EFL speakers, in English!)



I had this exact topic with an Irish coworker who lives in Germany and has trouble conveying the right time. For me as a German, „half one" is half of one, so 12:30. Same for „Dreiviertel eins" -> „three-quarter one" being 12:45 and „Viertel eins" -> „quarter one" being 12:15. To be fair, the logic behind this is also under constant confusion, as some parts of Germany rather use „viertel vor" or „viertel nach" -> „quarter to" / „quarter after" and have no understanding of the three-quarter business.



The Irish like to say "half one" meaning "half past one". In my native timekeeping parlance "half een" means 12h30. Germanic/Dutch origin.

So whenever I talk time with the locals here I repeat the time back in numerical style to avoid confusion.

"The shop opens tomorrow at half ten".

"Thanks, store opens at nine thirty. See you then."

"No..."



I think of the age number "practically" as the number of "birthday celebrations" I have experienced, excluding the actual day of birth. That's the same as the amount of completed years I've lived on this earth, and one less than the year I'm living in, because that year is not yet completed. (Except of course on birthdays)

But I think this also illustrates just how averse our culture is to using zero-indexing in counts: The age number absolutely is zero-indexed - a baby before the first birthday is zero years old. But no one calls it that; instead we drop the year count entirely and fall back to the next-largest nonzero unit, i.e. we say the baby is so-and-so-many months old. And for newborns not yet a month old, we count in weeks, etc.

I think, culturally, it's not that surprising as this method of counting is older than the entire concept of "zero". But I think it shows that there is little hope of convincing a large number of non-nerd people to start counting things with zeros.



That's not really indexing from 0 though. It's just rounding the amount of time you've lived down to the nearest year. You get the same number, but semantically you're saying roughly how old you are, not which year you're in. This becomes obvious when you talk to small children, who tend to insist on saying e.g "I'm 4 and a half". And talking about children in their first year, no one says they're 0. They say they're n days/weeks/months old.



In an indirect manner, we do mark having lived the 27th year in the following forms, we just don’t say it exactly the way you phrased it:

1. On your 26th Birthday, when you say you turned 26 what it means is that you have now lived 26 years. People generally understand this, even if they are going to be spending the next year saying they are 26.

2. It is not uncommon for people to demarcate their age on their birthday in revolutions around the Sun, as a kind of meme. “I’ve now traveled around the Sun twenty-six times.” or something like that, when reflecting on their lives on their Birthday.

The colloquial usage is our legally-defined age. A shortcut for our laws to take, the age-gating ones anyway. It hasn’t replaced our cultural understanding of what the first year of our life actually was.



> When we refer to 'the first year of life', we mean the time from birth until you turn 1.

Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.

So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.



> Sure, but no one ever uses that phrasing after you turn one.

Heck, few people say anything about 'the first year of life' even when talking about someone that young. It is too imprecise, because things change so rapidly. In my experience the most common convention is to use months to describe age before someone turns 2.



On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.



The number we put on the cake represents the number of "years old" (i.e. the number of birthday anniversaries) not the number of birth days someone had (obviously). Zero year-olds are 0, one year-olds are 1, ...



> On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

If you are going to be that pedantic, I would point out that one only has one birthday.

(Well, unless one's mother is extremely unlucky.)



yes, birthday and birth day are different things. Just like everyday and every day have different meanings, and it isn't confusing (to most people).



My preference is semi-compatible with both conventions:

First = 0
Second = 1
Toward = 2
Third = 3
…

This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.



Except we do, as soon as we need the next digit.

In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

We talk of a child in their 10th year as being age 10. Might even be younger. Try asking people if advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.

So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).

It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.



> In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

That’s not the case, though. They can vote (and drink, in quite a few countries) when they are at least 18 years old, not when they are in their 18th year (who would even say that?)

People are 18 years old (meaning that 18 years passed since their date of birth) on their 18th birthday. There is no need of shoehorning 0-based indexing or anything like that.

> Most will say yes.

Most people say something stupid if you ask tricky questions, I am not sure this is a very strong argument. Have you seriously heard anybody talking about a child’s “5th year of development”, except maybe a paediatrician? We do talk about things like “3rd year of school” or “2nd year of college”, but with the expected (1-indexed) meaning.

> So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

It’s really not. To have experienced a full year, you need a year to have passed, which therefore has to be the first. I think that’s a cardinal versus ordinal confusion. The first year after an event is between the event itself and its first anniversary. I am not aware of any context in which this is not true, but obviously if you have examples I am happy to learn.

> It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

Right. I know it is difficult to admit for some of us, but we are not computers and we do not work like computers (besides the fact that computers work just fine with 1-indexing). Some people would like it very much if we counted from 0, but that is not the case. It is more productive to understand how it works and why (and again cardinals and ordinals) than wishing it were different.



> who would even say that?

Writers.

And yes, cardinal versus ordinal is my point. The farther from the origin, the less people are likely to want them different.



What? No. When you are 0, it is your first year. When you are 21, you have begun your 22nd year. In the US you are legal to drink in your 22nd year of life.

You are correct that nobody says "22nd year" in this context, but nobody says "21st year" either. The former is awkward but the latter is just incorrect.



> nobody says "21st year" either

On the contrary, enough people say it, it's a quora question:

https://www.quora.com/What-does-it-mean-to-be-in-your-twenty...

Authors love phrases like this. Which, in turn, comes from another ordinal/cardinal confusion stemming back to common law:

"A person who has completed the eighteenth year of age has reached majority; below this age, a person is a minor."

That means they completed being 17, but that's just too confusing, so people think you stop being a minor in your 18th year.



It's just not true. You've completed being 17 years old on your 18th birthday, when you enter your 19th year and can count 18 years under your belt.

Consider a newborn. As soon as they're squeezed out they are in their first year of life. That continues until the first anniversary of their decanting, at which point they are one year old and enter their second year of life.

There is nobody, nobody, who refers to a baby as being in their zeroth year of life. Nor would they refer to a one-year-old as still being in their first year of life as if they failed a grade and are being held back.

The pattern continues for other countable things. Breakfast is not widely considered the zeroth meal of the day. Neil Armstrong has never been considered the zeroth man on the moon nor is Buzz Aldrin the first. The gold medal in the Olympics is not awarded for coming in zeroth place.



> It's just not true.

No one's saying it's true! All that's being claimed is that writers will often use phrases like "became an adult in their 18th year" or "was legally allowed to drink in their 21st year".

It's completely incorrect, but some people use it that way, and ultimately everyone understands what they actually mean.



The top response in your Quora link is that your 21st year "means you’re 20. You have had your 20th birthday, but not yet your 21st." That is the conventional definition.

People commonly make the mistake of thinking otherwise, but that's all it is. A mistake.



If you point at year long intervals, then those will be year long intervals indeed.

Nevertheless the traditional "how old are you" system uses a number 1 less.



Yeah we do, because their 'first year' isn't their age. We do their age in (also zero-indexed) months/weeks/days.

In Indian English terms, we do 'complete' age - aiui more common in India is to one-index, i.e. you're the age of the year you're in, and to disambiguate you might hear someone say they're '35 complete', meaning they have had 35 anniversaries of their birth (36 incomplete).



Your metaphor is comparing apples and oranges. When we count life, it's "one year old" or "aged one year," both of which mark the completion of a milestone. Using the term "18th century" is all-encompassing of that year, which is a different use case. When one recollects over the course of someone's life, like in a memoir, it would be normal to say "in my 21st year", referring to the time between turning 20 years old and 21 years old.



I don't think it is a mistake for Lua. The convention to zero-index arrays is not sacrosanct, it's just the way older languages did it (due to implementation details) and thus how people continue to do it. But it's very counter-intuitive, and I think it's fair game for new languages to challenge assumptions that we hold because we're used to past languages.



Zero-based arrays are counter-intuitive for a while, but if you deal with a lot of data, you typically realize that it's a small price to pay to make manipulation much easier in many contexts. For instance, if you have a ring buffer of size N and an unwrapped position P, the wrapped position is:

Zero-based: P % N

One-based: ((P - 1) % N) + 1

It might seem trivial, but each +/-1 is an opportunity for confusion and a bug nest. With zero-based arrays, it's often the case that the only required +/-1's are when producing and consuming human-readable one-based text.

The next stop on the zero-based epiphany train is the realization that a convenient way to store a range is a { first, first_past } tuple. The size of the range is (first_past - first). The whole-array range is { 0, size }, while a simple empty range is { 0, 0 } (zero is often the default initialization, simplifying things further.)

Both elements are indices, so they can be similarly manipulated, compared and range-checked, making many 'if' clauses easier to think about and verify. If there is a bug, it often ends up being harmless because of the arithmetic properties of this scheme.

Once you start dealing with multiple ranges, the advantages are even more obvious. Two ranges are adjacent iff (first_a == past_b || first_b == past_a). The intersection of two ranges is { max(first_a, first_b), min(past_a, past_b) }, which is nonempty iff they overlap. An array of M adjacent ranges is stored as a uniform (M+1)-tuple.

This realization has become so second-nature for me that I'm probably overlooking four or five even better examples here.
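
A minimal sketch (Python, illustrative names only) of the two ideas above: the zero-based ring-buffer wrap versus its one-based equivalent, and half-open (first, first_past) ranges whose size, adjacency, and intersection need no +/-1 adjustments:

    def wrap_zero_based(p: int, n: int) -> int:
        return p % n                      # zero-based: P % N

    def wrap_one_based(p: int, n: int) -> int:
        return ((p - 1) % n) + 1          # one-based: the extra -1/+1 dance

    # Half-open ranges stored as (first, first_past) tuples.
    def size(r):
        return r[1] - r[0]

    def adjacent(a, b):
        return a[0] == b[1] or b[0] == a[1]

    def intersection(a, b):               # empty (first >= first_past) iff no overlap
        return (max(a[0], b[0]), min(a[1], b[1]))

    whole = (0, 10)                       # the whole array of 10 elements
    empty = (0, 0)                        # a simple empty range
    print(size(whole), size(empty))       # 10 0
    print(adjacent((0, 3), (3, 7)))       # True
    print(intersection((0, 5), (3, 9)))   # (3, 5)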



> "it's just the way older languages did"

It's a C family (predecessors and descendants) idiosyncrasy that very unfortunately got out of hand. Most other old languages had either 1-based indexing or were agnostic. Most notably FORTRAN, which is the language for numerical calculations, is 1-based.

The seminal book Numerical Recipes was first published as 1-based for FORTRAN and Pascal, and they only later added a 0-based version for C.

Personally, coming from Pascal, I think the agnostic way is best. It is not only about 0-based or 1-based but that the type system encodes and verifies the valid range of index values, e.g. in Pascal you define an array like this:

    temperature = array [ -35 .. 60 ] of real;
You will get an immediate compile-time error if you use
   temperature[61];
At least with Turbo Pascal you could choose if you wanted run-time checks as well.

I have a hard time wrapping my head around the fact that this feature is pretty much absent from any practically used language except Ada.



It's not counter-intuitive at all, it only seems that way because people are now used to languages with zero-based indexing. That's almost entirely because of the C language, which used pointer offset arithmetic with its arrays.

Outside of that machine context, where an array is a contiguous block of RAM that can be indexed with memory pointers, there's no particular reason to do offset indexing. 1-based indexing - "first element, second element" - works just fine and is perfectly intuitive.

Different types of indexing can make sense in different situations. Some languages even allow that. In Ada, for example, arrays can start at whatever index you define.



There are reasons unrelated to pointer implementations such as the interval argument from Dijkstra's article or conversions between flat and multidimensional array indexes. There's a reason why most mathematical sequences start at zero: it leads to simpler expressions. Vec (and especially matrix) indexing should have been zero-based.



> 1-based works just fine

It really doesn't. You can make it work obviously, but you end up with much less elegant code, with +1 and -1 all over the place. E.g. for accessing row i of a matrix you get [width*(i-1)+1, width*i+1) instead of the far saner [width*i, width*(i+1)).

Generic code also becomes much more awkward.



Both are less elegant in different scenarios. In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about the e.g. the zeroth year of a company's operation.

Neither is a true one-size-fits-all solution. They're different kinds of indexes that serve different purposes. The choice of zero-based everywhere is an engineering tradeoff, nothing more.



> In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about the e.g. the zeroth year of a company's operation.

Perhaps, but this is extremely rare compared to tasks that are far more elegant with 0-based indexing. Also the worst you can get there is a single +1 in the display code, while trying to shoe-horn algorithms into 1-based code can get much more awkward.



It is the best choice, but the difference is mostly in working manually with slices, offsets, and array windows; for indexing and iterating they are mostly the same, with maybe a small benefit for mathy notation (the reason why Julia is 1-indexed).



Another example of confusing numeric systems emerges from 12-hour clocks. For many people, asking them to specify which one is 12AM and which one is 12PM is likely to cause confusion. This confusion is immediately cleared up if you just adopt a 24-hour clock. This is a hill I'm willing to die on.



A few months ago, my girlfriend and I missed a comedy show because we showed up on the wrong day. The ticket said Saturday 12:15am, which apparently meant Sunday 12:15am, as part of the Saturday lineup. Still feel stupid about that one.



> just adopt a 24-hour clock. This is a hill I'm willing to die on.

I don't know if I feel that strongly about it but I tend to agree. I see more value in adopting a 24 hour clock than making SI mandatory. AM/PM is silly.



You usually know it from context, and if not, 12 noon or 12 midnight is quite common.

But I do wish people would stop writing schedules in the 12-hour system. You get weird stuff like bold meaning PM, etc., to compensate for the space inefficiency of the 12-hour system.



I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (ie, we lump together 1901 and 1999 under the name ”the 1900s” despite their sharing only numerical similarity), and was interested until I learned the author’s real, much less interesting intention



A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.



I would also accept that the 1st century has one less year than future centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.



> It didn't bother anyone except Lua programmers, I'm pretty sure.

What's this reference to? Afaik, Lua uses `os.time` and `os.date` to manage time queries, which is then reliant on the OS and not Lua itself



Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.

A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.



I thought the article was going to argue against chunking ideas into centuries because it's an arbitrary, artificial construct superimposed on fluid human culture. I could get behind that, generally, while acknowledging that many academic pursuits need arbitrary bins other people understand for context. I did not expect to see arguments for stamping out the ambiguities in labelling these arbitrary time chunks. Nerdy pub trivia aside, I don't see the utility of instantly recalling the absolute timeline of the American revolution in relation to the Enlightenment. The 'why's— the relationships among the ideas— hold the answers. The 'when's just help with context. To my eye, the century count labels suit their purpose for colloquial usage and the precise years work fine for more specific things. Not everything has to be good at everything to be useful enough for something.



This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]

In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.
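
A small sketch (Python, hypothetical helper) of the backward BC arithmetic described above: the nth century BC runs from 100n BC down to 100(n-1)+1 BC:

    def century_bc_range(n: int) -> tuple[int, int]:
        # Inclusive (earliest, latest) years of the nth century BC.
        return 100 * n, 100 * (n - 1) + 1

    earliest, latest = century_bc_range(3)
    print(f"{earliest} BC to {latest} BC")   # 300 BC to 201 BC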



Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you can have pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.

For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".



Oh you're right, I tripped up. "The last quarter of the second millennium BC" means about minus 1250 to minus 1001.

I often get excited by some discovery sounding a lot older than it actually is, for reasons like this.



I do tend to say "the XX00s", since it's almost always significantly clearer than "the (XX+1)th century".

> There’s no good way to refer to 2000-2009, sorry.

This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.

People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".



How about the 20-ohs?

Think of how individual years are named. Back in for example 2004, "two thousand and four" was probably the most prevalent style. But "two thousand and .." is kind of a mouthful, even if you omit the 'and' part.

Over time, people will find a shorter way. When 2050 arrives, how many people are going to call it "two thousand and fifty"? I'd almost bet money you'll hear it said "twenty fifty". Things already seem to be headed this way.

The "twenty ___" style leads to the first ten years being 20-oh-this and 20-oh-that, so there you have it, the 20-ohs.

(Yes, pretty much the same thing as 20-aughts, gotta admit)



I wonder at what point we can just assume decades belong to the current century. Will "the twenties" in the US always primarily mean Prohibition, flappers, and Al Capone or will it ever mean this decade?



I say give it 11 years or so for 2020s kids to start coming of age, and "twenties babies" will refer to babies born in the 2020s and not centenarians.



You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future then I guess "the 2000's" is no longer suitable but how often does that happen in everyday conversation?



Immigrating from a country that uses "1700s", it probably took a decade before I had internalized to subtract 1 to get the real number.

I will resent it till I die.



Here we say something like "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.

The 20xx vs 200x does indeed leave some room for ambiguity in writing, verbally most people say 20-hundred-era vs 20-null-null-era.



I find it weird when people take a long time for these little things. My wife still struggles with the German numbers (85 = fünfundachtzig) and the half thing with time (8:30 = halb neun) even though I managed to switch over to those very quickly. I think it depends on the person how hard it is



> This leaves ambiguous how to refer to decades like 1800-1809.

There is the apostrophe convention for decades. You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context. (The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.) If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)

There is also the convention of replacing parts of a date with "X" characters or an em dash ("—") or an ellipsis ("...") in fiction, like "in the year 180X". It is less neat but unambiguous about the range when it's one "X" per digit. (https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples. A few give you the century, decade, and year and omit the millennium.)

Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.



I got into a fight recently with a philosophy teacher about that. I changed the dates like the OP to be clearer in my writing, and she took it very seriously; it was a big fight about clarity vs. tradition, but really superficial and mean on both sides. Now I wish I had been* more articulate and had a good debate. I wrote it my way on the final exam and passed, so she had to deal with it, I guess.

* Sorry, I don't know how to write that in the past tense, like "haber sido" in Spanish, my main language.



The same off-by-one annoyance under discussion bites the author in this very article and he didn't even notice: He calls 1776 the 76th year of the 18th century. But it's not! It's the 77th year of that century!



I didn't make the convention. The 18th century started on January 1, 1701 and ended on December 31, 1800.

That's because the purported year of the birth of Christ was the "first year of the lord", or AD 1, and that's when the first century started. In turn that's because at the time Latin had a word for nothing but not a word for zero, so you couldn't count years starting at zero.

That also means that those old enough to have partied on December 31, 1999 were technically partying for the beginning of the last year of the second millennium.



Only American Christians say 'year of our lord' after the date, if that's what that means to you and is how you want to think of it that's fine, but be aware nobody else is doing that, even though they're working in AD/BC.



"Year of the lord" (not "our" lord, please double check what I wrote) is the literal translation of Anno Domini and it's the reason why years are counted from one instead of zero. I quoted the expression, wrote "lord" in lowercase and added "purported" to make it clear that the religious reference was only for historical reasons, I don't know what else I could have done. If I wanted to imbue some religious meaning I probably would have said "Jesus" or "the Christ".

I am also not American and not a native English speaker. In fact Latin languages say in expanded form "après Jésus-Christ", "dopo Cristo", "despues de Cristo" (abbreviation is only used in writing), so even speakers who are not religious very much know the reference even if they couldn't care less. We're stuck with it.

I agree that it's more of a "who wants to be a millionaire" quirk than something that actually matters, but this case of correcting someone was one of the few cases where it matters.



That is what they say though.

I am Christian in a very literal sense, would use AD dating etc. and have no problem with it whatsoever, I just don't feel the need to suffix (or sometimes prefix) it with 'year of our lord' the way some Americans do; I find it quite jarring.

It's like feeling the need to say 'day of rest' or 'sabbath' every time you say a particular day of the week, or some other (possibly areligious) descriptor of something whenever it's mentioned.



I was referring to 1701 to 1800 inclusive, which is not the standard way people refer to centuries and is not what other people will mean or will think you mean, in general.



You didn't read far enough because he specifically notes this. The 18th century started on January 1, 1701, therefore 1776 is indeed the 76th year of that century, not the 77th, as the year 1700 is part of the 17th century.



This was confusing to me as a kid, especially as we entered the 21st. I also still remember learning about the Dutch golden age in elementary school, but can't remember if it was the 1600s or 16th century.

I'm running into a similar issue recently. Turns out that many people saying they are '7 months pregnant' actually mean they are in the 7th month, which starts after 26 weeks (6 months!)



As a kid I came across a book titled “Scientists of the 20th century”, and I was intrigued how the authors knew about future scientists.



> Did the American revolution happen before, during, or after the Enlightenment?

I’ve no idea. When did the American revolution happen?

Not everyone’s cultural frame of reference is the same as yours. I can tell you when the Synod of Whitby happened, though.



It's tiresome when people seem to think it's necessary to be annoyed that others make reference to their own cultural frame of reference in their writing.

Even more tiresome when they feel the need to comment about it.

Most people here, even non-Americans, likely have at least a rough idea of when the American Revolution was. And those who don't will either just gloss over it and think no more of it, or find the answer on the internet in a shorter amount of time than it took me to type this sentence. And then there are people like you. Look at the completely useless subthread you've spawned! Look at the time I've bothered to waste typing this out! Sigh.



Also the author does give the year in the next paragraph. So no googling required.

I also didn't know what the date of the American revolution was, but I understood it was just an example.

> if you’re like me, you’ll find the question much easier to answer given the second version of the sentence, because you remember the American revolution as starting in 1776, not in the 76th year of the 18th century.



Yes, that's entirely the point.

The article is "Here's a thing I don't understand! Let's say how silly it is by comparing it to a thing I do understand."

I mean, thanks for that. I could write an article about "Why do people insist on quoting the American revolution as a reference date when they could just say 'late 18th century'". Which would make the same point and get us precisely as far along as this article did.

> Look at the time I've bothered to waste typing this out!

I feel the same way, dude. I feel exactly the same.



The American and French Revolutions are a pretty big deal on the road to modern democracy, as well as being tied to 1700s Enlightenment ideals. Everyone educated should know this.



Of course they are important, but so are many other things, and speaking e.g. from a European POV, a lot of other events are simply much more salient and commonplace; the same is probably even more true for other continents (would a random, reasonably educated American or European person know when the Meiji Restoration happened, or when Latin America became independent?). You can't expect everyone to have memorised all the important dates.



America was a backwater at the time and therefore the best place to experiment with European enlightenment ideals. Which it did, and was a direct factor in the French Revolution. I also learned about numerous revolutions in Latin America from Mexico to Bolivar to San Martin over the early 1800s.

The events that directly affect the modern world should be covered in school. I’d say revolutions that created large modern states would be among them.



Of course, we learn about the American Revolution in schools, but people aren't going to remember every date they were taught in schools. The founding of Rome or the Punic Wars are also hugely important for today's world, but not everyone can place them.

The reason most US Americans probably can place the American Revolution is, I assume, because it's so often commemorated there. In Germany, people would be much more likely to remember the years 1933, 1949 and 1989, because of how often they're referenced.



> I’d say revolutions that created large modern states would be among them.

The Russian revolution as well. And the Chinese one. It's quite difficult to make sense of the late 20th century (yes, I know) without them. Or the early 21st.



I slightly misspoke above. The important part in my opinion is the milestone of democracy. That it happened nearby wars is somewhat incidental and common.

What I mostly remember about the HRE is that it was neither Holy, nor Roman, nor an Empire. :D My first guess of mid-1600s is not too far off.

(Westphalian sovereignty is interesting… reading about it now.)

Those governments have been replaced multiple times over the centuries however, so achievements at the time were not stable.



Swede here. Both the American and the French revolutions were taught when I went to school in the ninth and tenth decades of the twentieth century. As GP points out, they're both fairly significant events that had side effects relevant even to us up here.

I would be extremely surprised if Sweden was unique among European nations in this regard.



I grew up in Scotland. We studied American History in school. In fact we didn't study English History, except as it pertained to Scottish History, so to this day I'm hazy on Magna Carta, turbulent priests and so on.



A pretty big deal in America. I don't think knowledge of the exact date of the American Revolution is a requirement for education outside America. At least no more than "17something...ish".



"17something...ish" is enough to answer (or at least make a high confidence guess at) the original question (was the American Revolution contemporary with the enlightenment?)



For that matter, a lot of historical dates we consider important are only important to us because A) we're westerners and B) we got them drilled into us by textbooks and classes.

The remaining majority of the world (the west is a minority) sincerely couldn't care less about the American or French or Industrial Revolutions or Columbus (re)discovering America or the Hundred Years War or the Black Death or the Fall of Rome or whatever else.

Kind of like how we as westerners generally couldn't care less about Asian, African, Middle Eastern, Indian, or Polynesian histories.

The culture we grow up in and become indoctrinated by determines what is important and what is not.

And just so we're clear, this bit of ignorance is perfectly fine: Life is short, ain't nobody got time for shit that happened to people you don't even know who lived somewhere you will never see.



Your problem is that, instead of encouraging people to learn, you're just dismissing people who don't know one very specific historical fact as "uneducated". People in Europe generally know that America is a democracy and many probably also know that it was the first modern one, but to an average European it doesn't matter that much that it happened in the 1770s as opposed to, say, the 1650s.



If you didn’t know it, why not learn it now? Here’s your opportunity, one of today’s 10k.

Arguing you shouldn’t need to learn it isn’t impressive in any shape or form. It’s middle-brow level (non)curiosity, and less than welcome here.



Let me put it this way: Can you really blame someone for not knowing a historical fact that is completely irrelevant to their life, especially when they probably have more pressing concerns to learn and care about?

We only have so many hours in a day and so many days in a lifetime, while knowledge is practically infinite.



You could say that about anything.

I limited my initial statement to educated folks, and presumably those who would like to be one.

/history/democracy/milestones -> relatively important.



Absolute rubbish. There are plenty of facts that it would be quite surprising for an educated person not to know. The date of the American Revolution is not one of those facts, except for Americans.

Just like it would be unusual for an educated person not to know when the Battle of Hastings was... unless they aren't British.

And that's about as close a country to America as you can get. Do you think the educated in Singapore or Kenya learn about the American Revolution? Hell, even in the UK we did not spend a single history lesson on it. I'm not exaggerating.



It’s not entirely surprising your country would prefer to forget that war. Disappointing perhaps but not surprising.

In my opinion, a battle is generally not important compared to a milestone of democracy. I learned about the Magna Carta in 1215 and saw a copy once and appreciate that knowledge.

But sure, keep arguing for ignorance and dismissing education if you’d like—we’ll enjoy the show here in posterity.



> It’s not entirely surprising your country would prefer to forget that war. Disappointing perhaps but not surprising.

Maybe not, but that's hardly the only thing from history that we didn't learn about. I think you're massively underestimating how much history there is. Most of the world has millennia of history. It's not like America where you can cover the entire history of the country.

Also history is an optional lesson past age ~14. You can choose to do Geography instead. But I'm not going to call you uneducated for not knowing what a medial moraine is.



As very broad subjects? Yes. But any particular factoid about them is practically irrelevant for most people who don't actually have anything to do with that factoid.

Hell, I would even go as far as to say the American Revolution is irrelevant even for most Americans because it has nothing of practical value. We (Americans) all know about it to varying degrees, but again that is due to growing up and being indoctrinated in it.



This is such a boring dismissal of a very interesting subject - what information is important to whom and why.

A pretty important reason for learning about the history of democracy is to learn how and why to preserve it.



This is the most boring knuckledragging subthread I’ve ever read here. Arguing we shouldn’t bother knowing anything but what’s underneath our nose, without qualifications. Not even a theoretically useful philosophy, such as solipsism is being espoused here.

All the while masquerading as intelligent conversation. It’s only interesting until you follow it to its conclusion. Anti-intellectualism in a nutshell, and you should be embarrassed to be seen in the vicinity. :D



Is it nice to learn about the American or French Revolutions? Absolutely. But time is a limited resource and must be rationed according to each individual's needs.

Most people are far too busy living lives to have time to spare spelunking into tomes; they usually don't even understand what democracy actually is either, but they get by fine in life just the same.

The reality is that no man can learn all there is to know; it's physically impossible, and lines in the sand must be drawn. Given that, which side of the line a certain historical factoid falls on will vary: some will consider the American Revolution important, some will not, and some will be indoctrinated by their society, and all of that is fine.

Also, I'm going to cite the HN Guidelines here for your reference going forwards:

>Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

>Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
