(comments)

Original link: https://news.ycombinator.com/item?id=38078063

Overall, these comments reveal a range of perspectives on upgrading to an M3 MacBook: differences in workloads and priorities, concerns about security vulnerabilities and resource consumption, debates over the value and necessity of upgrading, and discussion of the implications of holding off and of potential future advances. The arguments presented vary greatly in their reasoning and rhetoric, ranging from advocacy to criticism, skepticism, humorous observation, and philosophical reflection. Ultimately, each individual decides for themselves whether investing in a particular hardware solution is justified based on their unique needs, budget, and preferences.

Related articles

Original text
Apple unveils M3, M3 Pro, and M3 Max (apple.com)
948 points by ehPReth 1 day ago | 1004 comments

Related ongoing threads:

Apple unveils the new MacBook Pro featuring the M3 family of chips - https://news.ycombinator.com/item?id=38078065

Apple supercharges 24‑inch iMac with new M3 chip - https://news.ycombinator.com/item?id=38078068



A few things I noticed, as I'm seeing the variety of SKUs becoming more complex.

- Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

- Just as the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like the base M1/M2 before it).

- The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

- The M3 Pro actually has more E-cores than the Max (6 vs. 4). Interesting to see them take this away on a higher-specced part; seems like Intel wouldn't do this.



I'm wondering now if Apple tracks this sort of stuff in their released machines. I know my iPad always asks me if I want to send stats to Apple (and I always say no). So let's say that enough people do, then do they have a good idea of how often all the performance cores are used? Max memory B/W consumed? Stuff like that. Back when I was at Intel there were always interesting tradeoffs between available silicon/thermal/margin resource and what made it into the chip. Of course Intel didn't have any way (at that time) to collect statistics so it was always "... but I think we should ..." not a lot of data.


Apple shares anonymous usage data by default on all their operating systems and users are asked again on every major update.

Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

But I suspect at high-end they only really care about the performance of a few dozen professional apps e.g. Logic or Final Cut. And at the low-end it's likely just efficiency.



> Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

A 95% opt-in rate is INSANELY high for any type of usage-stat opt-in, everything above 50% is usually outstanding.

What is known about "similar defaults"?



Apple enjoys a level of consumer trust that essentially no other technology business, and almost no other business at all, can match. Whether that's justified or not is a matter of opinion.


It seems like the comment above is describing opt-out, and that it pesters you to opt back in if you opt out.


That's not how it works. You get asked the question again on every update, regardless of what you chose the last time.

So there are people who were opted-in that change their minds. My friends and family opt-in rate is



It honestly doesn’t matter. We’re talking about hundreds of millions of devices sending data in either case. A hundred million more provides no additional value.


Major updates are infrequent, maybe once a year if you always update, so it's not pestering you. And the UI makes it very easy to skip, unlike some designs.


Unless there is a flurry of network vulnerability updates, in which case a bespoke fork in the road is set for them.


Security/minor updates don't prompt for this AFAIK


It’s a step in a setup wizard. Whilst it’s explicitly asked, and far from dark pattern territory, it’s designed in such a way that I wouldn’t be surprised by a 95% opt-in rate.


I would be VERY surprised.

To someone with experience in that area of UX, a 95% opt-IN rate is ridiculously high.

A 95% consent-rate would already be hard to achieve as opt-OUT.

For opt-in a 95% rate would require both attention AND consent from 95% of the audience at this stage in the setup wizard.

I highly doubt that it can achieve 95% attention, let alone 95% consent.



But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.


I was actually more genuinely interested to learn about the "similar defaults" mentioned in the OP, the 95% comment was just a side-note to a huge overestimation on how easy consent is achieved.

> But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.

Thing is, you don't even have 100% of the users' attention in this case. The user wants to use the device, you're just standing in the way.

The scenario is this: You force the user to take a decision between option A and B. Regardless of his decision he will achieve his immediately desired outcome (move to the next screen / use the device).

Getting 95% to vote for 'A' would require some quite aggressive dark pattern, to the point that option 'B' would need to be almost invisible and actively discouraged.

Even if the UI were a pre-checked checkbox and the user just had to select "Next" to continue (= opt-out), your consent rate would not be 95%. As mentioned, everything beyond 50% is already outstanding.

Or, let's rephrase: if Apple had a 95% opt-in rate, they wouldn't bother chasing consent again on every software update.



To add to this, it's not like a mailing list, either. Marketing opt-in is lower because it's annoying. A lot of people don't want emails.

Anonymized stats from your machine? Most normal people (who don't use computers like we do) do not care and just click the most affirmative option so that they can move forward.



I think that was kind of the OP's point. "Pro" users are significantly more likely to opt out in this scenario (unless they aren't Pro users and just want the Pro machine for conspicuous consumption), making for a much more dramatic swing in the usage data that is collected.


The word Pro in the product name really doesn't separate consumers as well as you might think.

Every college kid has a MacBook Pro, yet they are by definition not Pros.



It’s more like 15% opt in. I know because it controls dev access to analytics on their apps.


Wait telemetry is opt-out?

And I've never heard people complain?

Genuinely surprised as it seems to be quite a commonly controversial thing amongst devs.



It's not exactly 'opt-out', they ask you on first boot or after major upgrades, and you either select "Share with Apple" or "Don't Share with Apple". It's just that the "Share" button is coloured blue so looks more default since it's more prominent (at least on iOS, I think it's basically the same on macOS).

It's not like it's enabled by default and you have to know to go and find the setting to turn it off or anything..



It’s opt-out, but it’s not enabled silently. It’s a pre-ticked checkbox on a screen you have to review when you first set up the machine (and when you do a major-version OS upgrade).

IMO that’s quite different to something that’s just silently on by default, and requires you to set an environment variable or run a command to opt out.



On a phone there is no box at all. It's two options to select. The opt-in is highlighted, but there is no "next" button -- you have to select an option.


I don't think it's pre-checked, is it? I thought it was Yes/No buttons


No the default action is to do nothing (ie do not install the OS). You have to actively consent or reject.


Yeah, that's kind of surprising, given that Apple is often hailed as a privacy champion.


It’s not really opt-out or opt-in: it’s an explicit, informed choice you have to make when you start up your Mac after first purchase or major upgrade.


Well, Apple generally has so much info about your every step people stopped caring a long time ago.


I think you are talking about Google, not Apple.


No, both of them actually. Don't trust them too much.

This calls out some soft spots that were exposed during the Hong Kong riots: https://www.youtube.com/watch?v=nQ9LR8homt4



> asks me if I want to send stats to Apple (and I always say no)

so you like them enough to pay them thousands for the premium product, but not enough to tell them how much CPU you use?



I have no idea what information they’re collecting on me, and it seems very few people do (given that nobody was able to answer the above question).

Could be “how much CPU does this user use?” but could also be “when prompted with a notification that a user’s iCloud backup storage is low, how long did they hesitate on the prompt before dismissing? How can we increase their odds of upgrading?”

Also, my willingness to provide information does not correlate to how much I “like” a company’s products. If I buy a sandwich from a deli, and they ask for my email for their newsletter or something, I won’t give it. That doesn’t mean I don’t like their company or their sandwich. Could be the best sandwich in the world, they don’t need my email.



In addition to the reduced memory bandwidth, the M3 pro also loses 2 performance cores for only 2 more efficiency cores.

M2 pro: 8 performance cores + 4 efficiency cores.

M3 pro: 6 performance cores + 6 efficiency cores.

Not a great trade... I'm not sure the M3 pro can be considered an upgrade



Depends. Is it faster? Then it's an upgrade.

Has the CPU industry really managed to pull off its attempt at a BS coup that more cores always === better?

I thought we'd learned our lesson with the silly MHz myth already?



I guess we'll have to wait for benchmarks but I did find this interesting:

Apple's PR release for M2 pro: "up to 20 percent greater performance over M1 Pro"

Apple's announcement for M3 pro: "up to 20 percent faster than M1 Pro" (they didn't bother to compare it to M2 pro)



Sure, that's the title, but at least in this PR they immediately show a graph with a comparison to both.

Presumably it makes more marketing sense to compare to the M1 family up front because most people that bought an M2 last year are probably not going to be upgrading to M3. They are speaking to the people most likely to upgrade.



fwiw, i can't remember the last time i saw a company go back more than a generation in their own comparison. Apple is saying as much as they're not saying here. M2->M3 may not be a compelling upgrade story.


The vast majority of Mac users go years between upgrades. For any other vendor it might seem weird to show several comparisons going back multiple generations (M1 and x86), but for the macOS ecosystem it makes perfect sense since only a very tiny slice of M2 users will be upgrading.


and what makes you think windows users update their devices every single generation?


Windows has distinct groups: the people who buy whatever costs $700 at Costco every 10 years / when it breaks don’t care but there’s also a vocal enthusiast community who do upgrade frequently. That group gets more attention since it’s a profitable niche and gaming generates a lot of revenue.


I used to buy a $700 Windows laptop every 18 months in the 2000s. Then I got fed up with them just falling apart and switched to Macbooks. My 2013 purchase is still alive and being used by the kids.


In the 2000s, I went through a wide variety of PC laptops (Lenovo, Toshiba, Dell, Alienware, Sony, etc.) all within the range of $1200-$6500 and they all died within 3 years (except for the cheapest one which was a Lenovo with Linux). Some died within a year.

When my first Macbook lasted for more than 3 or 4 years I was surprised that I was upgrading before it died. I went through many upgrades with almost zero issues (one HDD failure, one battery failure). I still have a 2012 Macbook Pro that I've since installed Linux on.

When I bought the first touchbar Macbook (late 2015?) I spent around $6k maxing out the options, and I was surprised at how totally trash it was. Hardware QC issues were shocking: particles under the screen from manufacturing, keys stuck within the first hour of usage, external monitor issues, touchbar issues...

I haven't bought a laptop since.



Did you buy a MacBook for $700? That was a pretty low price back then, which meant you were buying devices made to a price. Buying a MacBook is one solution; another would have been to spend more money on a higher-quality Wintel system.


Yeah, the quality of PC laptops has improved but that really just means you can get closer to equivalent quality at equivalent pricing. I've heard people claim to have saved a ton but every single time I used one there was some noticeable quality decrease, which I find kind of refreshing as a reminder that the market does actually work pretty well.


Did you treat the MB differently because you paid more? If so, that may have yielded longer life in addition to quality design, etc.


> and what makes you think windows users update their devices every single generation?

They don't, but the difference is that Windows users generally don't know or care about processor generations. In contrast, it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.

You can test this by asking Windows users what CPU they have. For the few who know and who have an Intel CPU, you can ask what their Brand Modifier¹ (i3/i5/i7) is. If they know that, you can ask what the 5-digit number following the Brand Modifier is — the first two digits are the Generation Indicator¹. I'd be surprised if more than 0.01% of Windows users know this.

¹ Intel's name
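For illustration, the generation can be read mechanically off the model string, which is part of why the scheme is so opaque to users. A toy sketch (the helper name and regex are mine, not from Intel's documentation; note that older model numbers have 4 digits, where only the first digit is the generation):

```python
import re

def intel_generation(model: str) -> int:
    """Extract the generation from an Intel Core model string, e.g. 'i7-8550U'.

    4-digit model numbers (e.g. 8550) carry the generation in the first digit;
    5-digit ones (e.g. 10210) carry it in the first two digits.
    """
    m = re.search(r"i[3579]-(\d{4,5})", model)
    if not m:
        raise ValueError(f"unrecognized model string: {model}")
    digits = m.group(1)
    return int(digits[:1]) if len(digits) == 4 else int(digits[:2])

print(intel_generation("i7-8550U"))   # prints 8  (8th gen)
print(intel_generation("i5-10210U"))  # prints 10 (10th gen)
```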



Intel's CPU naming strategy used to drive me nuts when trying to talk to anyone at work who knew "just enough to be dangerous." Why is X so slow on this machine, it's got a [6-year-old, dual-core] i5! It runs fine on my laptop and that's only a [1-year-old, quad-core] i3!


like everything you said could apply to nvidia gpus as well


man, that's a whole lot of mental gymnastics to justify scummy benchmark practices from apple.


How are they scummy? The M3 vs. M2 performance improvements they showed looked pretty modest.

My interpretation while watching the event is that this is a company persuading x86 holdouts to upgrade to Apple Silicon, and maybe some M1 users as well.



Windows users buy whatever, from so many brands, that it doesn't matter how often they upgrade; they're unlikely to buy from the same vendor again anyway (which is what would make a comparison to that vendor's older generations meaningful in the first place).


It’s absolutely not, and that’s fine. The video has statements that the machines are made to “last for years” and that they want to save natural resources by making long-lasting machines.

I’m currently at 4 to 5 years on laptops and 3 to 4 years on phones, and even then I hand them over to kids/friends/family who get a bit more use out of them.



> they want to save natural resources by making long-lasting machines.

Apple always comes from a position of strength. Again, they're saying as much as they're not saying.

Also, if they really cared about long lasting machines: slotted ram and flash please, thanks!



Huh. So they used to do this, but looking at the M series chips it seems like the architecture assumes the CPU-GPU-RAM are all on the same chip and hooked into each other, which enables zero copy. Someone more well versed in hardware could explain if this is even possible.

Expandable internal storage would be nice, yeah. But I get the sealed, very tightly packed chassis they’re going for.



> get the sealed, very tightly packed chassis they’re going for

The Dell XPS 17 is only 0.1 inch thicker yet has fully replaceable RAM and 2(!) m2 slots. I’m pretty sure what Apple is going for is maximizing profit margins over anything else..



I have an XPS 15. And while I liked that I could bring my own SSD and RAM, the build quality is nowhere near a Macbook Pro... like not even in the same galaxy. I had to have it serviced multiple times within the first few weeks. It had to be sent to Texas, and when it returned, one WiFi antenna wouldn't plug into the card, and the light on the front was permanently broken. I could have demanded Dell fix it - and I'd have been even more weeks without my main work laptop. So, by pure numbers/specs? Sure. By real world quality, no way would I favor Dell.


The issue is often comparing apples (heh) to oranges.

I understand the desire for slotted RAM, but the major limiting factor for nearly 10 years was CPU support for more than 16GB of RAM. I had 16GB of RAM in 2011, and it was only in 2019 that Intel's 9th-gen laptop CPUs started supporting more.

The Dell XPS 17 itself has so many issues that if it were a MacBook people would be champing at the bit, including not having reliable suspend, and memory issues causing BSODs. Reliability of these devices, at least when it comes to memory, might actually be worse and cause a shorter lifespan than if it had been soldered.

Of course it always feels good to buy an underspecced machine and upgrade it a year later, which is what we're trading off.

But it's interesting that we don't seem to have taken issue with BGA CPU mounts in laptops the way we did for memory. I think this might be because Apple was one of the first to do it, and we feel a certain way when Apple limits us but not when other companies do.



There’s a lot of flat-out wrong information in this post. For one, even the low-power (U-series) Intel laptop CPUs have supported 32GB+ of memory since at least the 6th generation[1]. Many machines based on these CPUs unofficially support more than that. I have a Thinkpad with an i7-8550u and 64GB of DDR4, and it runs great.

On top of that, the higher-power laptop SKUs have supported 64GB or more since that time as well.

Secondly, it’s silly to claim that having RAM slots somehow makes a computer inherently more unstable. Typically these types of issues are the result of the manufacturer of the machine having bugs in the BIOS/EFI implementation, which are exacerbated by certain brands/types of memory. If you don’t want to mess around with figuring that stuff out, most manufacturers publish a list of officially-tested RAM modules which are not always the cheapest in absolute terms, but are always night-and-day cheaper than Apple’s ridiculous memory pricing.

[1] https://www.intel.com/content/www/us/en/products/sku/88190/i...



Sorry, you're entirely mistaken, there is no business laptop that you could reasonably buy with more than 16G of RAM. I know because I had to buy a high end workstation laptop (Dell Precision 5520 FWIW) because no other laptop was supporting more than 16G of RAM in a thin chassis.

No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu lifebook supported a CPU that was permitting greater than 16GiB of memory.

I know this because it was something I was looking at intently at the time and was very happy when the restrictions were lifted for commercially viable laptop SKUs.

Citing that something exists presupposes availability and functionality. No sane person is going to be rocking into the room with a Precision 7520 and calling it portable. The thing could be used as a weapon and not much else if you had no power source for more than 2hrs.

Also, socketed anything definitely increases material reliability. I ship desktop PC's internationally pretty often and the movement of shipping unseats components quite easily even with good packing.

I'm talking as if I'm against socketed components; I'm not. But don't pretend there are no downsides and infinite upgradability as the upside; that's disingenuous. In my experience there are some minor reliability issues (the XPS 17 being an exceptional case, and one I was using to illustrate that we sometimes cherry-pick what one manufacturer is doing, believing there were no trade-offs to get there), and some limitations on the hardware side that cap your upgrade potential beyond whether anything is soldered.



> Sorry, you're entirely mistaken, there is no business laptop that you could reasonably buy with more than 16G of RAM.

> No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu lifebook supported a CPU that was permitting greater than 16GiB of memory.

Here are the Lenovo PSRef specs for the Thinkpad T470, which clearly states 32GB as the officially-supported maximum, using a 6th or 7th gen CPU:

https://psref.lenovo.com/syspool/Sys/PDF/ThinkPad/ThinkPad_T...

This is not a behemoth of a laptop; I'm writing this on a T480 right now, which supports 32GB officially and 64GB unofficially, and it weighs 4lbs with the high-capacity battery (the same as the T470).

I can't tell if you're trolling or what, but if you're serious, you clearly didn't look hard enough.

Edit: since you mentioned Latitudes, Elitebooks, and Fujitsu lifebooks:

- Dell Latitude 7480 (6th gen CPUs) officially supports 32GB: https://www.dell.com/support/manuals/en-us/latitude-14-7480-...

- HP Elitebook 840 G3 (6th gen CPUs) officially supports 32GB: https://support.hp.com/us-en/document/c05259054

- For Lifebooks, I couldn't find an older one that supported 32GB, but this U937 uses 7th gen CPUs, and has 4GB soldered and one DIMM slot which supports up to 16GB. This is a total of 20GB, again, breaking the 16GB barrier: https://www.fujitsu.com/tw/Images/ds-LIFEBOOK%20U937.pdf

I believe these are all 14"-class laptops that weigh under 4 pounds.



One more thought: you might be getting confused here with the LPDDR3 limitation, which was a legit thing that existed until the timeframe you're thinking of.

Any laptop which used LPDDR3 (soldered) typically maxed out at 16GB, but as far as I'm aware, this was due to capacity limitations of the RAM chips, not anything to do with the CPUs. For example, the Lenovo X1 Carbon had a 16GB upper limit for a while due to this. I believe the 15" MacBook Pro had the same limitation until moving to DDR4. But this is entirely the result of a design decision on the part of the laptop manufacturer, not the CPU, and as I've shown there were plenty of laptops out there in the ~2014-2016 timeframe which supported 32GB or more.



Intel actually has this documented all on one page: https://www.intel.com/content/www/us/en/support/articles/000...

DDR4 support was introduced with the 6th gen Core (except Core m) in 2016, LPDDR4 support didn't show up until (half of) the 10th gen lineup in 2019. It's just another aspect of their post-Skylake disaster, wherein they kept shipping the same stuff under new names for years on end before finally getting 10nm usable enough for some laptop processors, then a few years later getting it working well enough for desktop processors. In the meantime, they spent years not even trying to design a new memory PHY for the 14nm process that actually worked.



They haven't had slotted RAM or storage on their MacBooks since 2012 (the Retina MacBooks removed the slotted RAM, afaik). It might save on thickness, but I'm not buying the slim-chassis argument being the only reason, since they happily made their devices thicker for the M-series CPUs.


> It might save on thickness, but I'm not buying the slim-chassis argument being the only reason

Soldered memory allows higher bus frequency much, much easier. From a high frequency perspective, the slots are a nightmare.



Yup. I’ve been looking at the Framework laptop, and it’s barely any thicker than the current MacBook Pro.


I have no excuse for flash, but memory can't really be slotted anymore since SODIMM is crap. High hopes for CAMM making its way into every other machine in 2024!


Given that there is a legally mandated 2-year warranty period at least in Europe, I would be surprised if any laptops weren’t made to “last for years”.

The problem with Apple, however, is that their hardware will long outlive their software support. So if they really want to save natural resources by making long-lasting machines, they should put much more effort into sustained software support.



> i cant remember the last time i saw a company go back more than a generation in their own comparison

Apple likes doing that quite frequently while dumping their "up to X% better" stats on you for minutes.



The majority of MacBooks out there are still Intel-based. This presentation was mostly aimed at them and M1 owners.


Nvidia did it when they released the RTX 3080 / 3090 because the RTX 2000 series was kind of a dud upgrade from GTX 1060 and 1080 Ti


Given how strongly they emphasised the performance over the Intel base - who have now had their machines for 4 years and are likely to replace them soon (and may be wondering whether to stay with Apple or switch over to PCs) - it is pretty obvious that they also want to target that demographic specifically.


That’s not what it says. Actual quote:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.



Ok, so then the M3 pro is up to 1.3/1.2=~8% faster than the M2 pro? I can see why they wouldn't use that for marketing.
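The arithmetic behind that estimate, spelled out (using only the "up to" figures quoted from Apple's press releases above; since both chips are measured against the same M1 Pro baseline, the implied generational gain is the ratio of the two multipliers):

```python
# Both multipliers are relative to the same baseline (M1 Pro),
# so the implied M2 Pro -> M3 Pro gain is their ratio.
m2_pro_vs_m1_pro = 1.20  # "up to 20 percent greater performance over M1 Pro"
m3_pro_vs_m1_pro = 1.30  # "up to 30 percent faster than M1 Pro"

implied_gain = m3_pro_vs_m1_pro / m2_pro_vs_m1_pro - 1
print(f"implied M2 Pro -> M3 Pro gain: {implied_gain:.1%}")  # prints ~8.3%
```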


Depends who they are marketing to I think is the point. If the biggest group of potential buyers are not M2 users, then it makes sense not to market to them directly with these stats.

I've got an M1 Max 64GB and I'm not even tempted to upgrade yet, maybe they'll still be comparing to M1 when the M10 comes out though.



I'm also far from replacing my M1. But if someone from an older generation of Intel Macs considers upgrading the marketing is off as well.


Plausibly they thought the market is saturated with M1s and targeted this to entice M1 users to switch.


> Has the CPU industry really managed to pull off its attempt at a BS coup that more cores always === better?

I thought this at first then I realized the cost-performance benefit gained from adding more cores often outweighs just improving the performance of single cores. Even in gaming. I think this is what led AMD to create their Ryzen 9 line of CPUs with 12 cores in 2019.

That being said, I abhor the deceptive marketing which says 50% more performance when in reality, it's at most 50% more performance specifically on perfectly parallel tasks which is not the general performance that the consumer expects.





> Depends. Is it faster?

The devil tends to be in the details. More precisely, in the benchmark details. I think Apple provided none other than the marketing blurb. In the meantime, embarrassingly parallel applications do benefit from having more performant cores.



Heh, I recall seeing many posts arguing against benchmarks when all Macs equipped with an M2/8GB/256GB SSD scored much, much lower than the M1/8GB/256GB SSD. People said the synthetic benchmarks were not representative of real world use and you'd never notice the difference. 'Twas a battle of the optimists, pessimists, and realists. In reality, 'twas just Apple cutting costs in their newer product.


> Heh, I recall seeing many posts arguing against benchmarks (...)

It's one thing to argue that some real-world data might not be representative all on itself.

It's an entirely different thing to present no proof at all, and just claim "trust me, bro" on marketing brochures.



oh absolutely, I can't wait to see the benchmarks. Per the (non-numerical data) benchmarks in the video tho - it is faster. So... until other evidence presents itself, that's what we have to go on.


I find that frustrating with how intel markets its desktop CPUs. Often I find performance enhancements directly turning off efficiency cores...


Faster than what? M1 Pro? Just barely.


Reference should be M2 pro


I suspect it's about equal or perhaps even slower.


Based on what? The event video says it's faster.


M2 Pro was about 20-25% faster than M1 Pro, M3 Pro quotes a similar number. It has faster cores but a weaker distribution of them. Seems like a wash, but we'll see exactly how close when benchmarks are out.


2.5x is "just barely"? lol k.


> 2.5x is "just barely"? lol k.

That's only rendering speed, and M3 Max vs M1 Max (not Pro). M3 Pro is only 30 percent faster:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.



20%


Let me re-write your post with the opposite view. Both are unconvincing.

I thought we'd learned our lesson with the silly cores myth already?



I think you're misreading the comment you're replying to. Both "more cores is always better" and "more MHz is always better" are myths.


Yup, exactly what I was saying.


Yes, but the number of cores in similar CPUs does provide a good comparison. For example, with the base M2 Pro at 6 P cores and the base M3 Pro at 5 P cores, one would want ~20% faster cores to compensate for the loss of one core in parallel-processing scenarios where things scale well. I don't think M3 brings that. I am waiting to see tests to understand what the new M3s are better for (probably battery life).


That's... the same view, just applied to a different metric. Both would be correct.

Your reading comprehension needs work, no wonder you're unconvinced when you don't even understand what is being said.



That makes less sense because the MHz marketing came before the core count marketing.

I agree with GP that we should rely on real measures like "is it faster", but maybe the goal of exchanging performance cores for efficiency was to decrease power consumption, not be faster?



Probably a balance of both tbh, as it appears to be both faster AND around the same performance per watt.


The new efficiency cores are 30% faster than M2's, and the performance ones 20% faster, so let's do the math:

    M2: 8 + 4

    M3: 6*1.2 + 6*1.3 =
        7.2 + 7.8
That’s nearly double the M2’s efficiency cores, a little less on the performance ones.

They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.



You're not considering the difference in performance between the p and e cores. The math should be something more like:

  M2 pro = 8*3 + 4 =28 (the *3 representing that the performance cores contribute ~3x more to total system performance than the efficiency cores)

  M3 pro = 6*3*1.15 + 6*1.3 =28 (apple claims 15% more performance for the p cores not 20%)
> They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.

They don't claim either of those things. They claim the performance is 20% faster than the M1 pro. Interestingly, they made that exact same claim when they announced the M2 pro.

Energy efficiency might be better, but I'm skeptical till I see tests. I suspect at least some of the performance gains on the p+e cores are driven by running at higher clock rates and less efficiently. That may end up being more significant to total energy consumption than the change in the mix of p/e cores. To put it another way, they have more e cores, but their new e cores may be less efficient due to higher clock speeds. Total energy efficiency could go down. We'll just have to wait and see but given that apple isn't claiming an increase in battery life for the M3 pro products compared to their M2 pro counterparts, I don't think we should expect an improvement.
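The two back-of-the-envelope models in this subthread can be put side by side. A sketch (the ~3x P-to-E weighting and the 15%/30% per-core speedups are the commenters' assumptions, not measured numbers; none of this replaces real benchmarks):

```python
def throughput(p_cores, e_cores, p_speed=1.0, e_speed=1.0, p_weight=3.0):
    """Crude throughput score: each P core counts ~3x an E core (assumed)."""
    return p_cores * p_weight * p_speed + e_cores * e_speed

# M2 Pro: 8 P + 4 E at baseline per-core speed.
m2_pro = throughput(8, 4)
# M3 Pro: 6 P + 6 E, with assumed +15% P and +30% E per-core speedups.
m3_pro = throughput(6, 6, p_speed=1.15, e_speed=1.30)

print(round(m2_pro, 1), round(m3_pro, 1))  # prints 28.0 28.5 -> roughly a wash
```

With an unweighted model (every core equal), the same inputs would instead show a sizable gain, which is the disagreement between the two comments above.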



If you wanted to be even more accurate, you'd also have to take into account that most tasks are executed on the E cores, so having more of those, or faster, will have a much greater impact than any improvement on the P cores. It's impossible to estimate the impact like this - which is why Apple's performance claims[1] are based on real-world tests using common software for different workloads.

In summary, there is supposedly improvement in all areas so the reduced P core count doesn't seem to be a downgrade in any form as the OP suggested.

[1] https://www.apple.com/nl/macbook-pro/



I wouldn't trust Apple's marketing on that if it's where you got those numbers from


Depends on what you consider an upgrade. As M3 cores perform better than M2 cores, I expect the M3 configuration to perform similar to the M2 one, even though it trades performance cores for efficiency cores. Apple apparently believes that its users value improved efficiency for longer lasting battery more than further improved performance.


E cores are ~30% faster and P about 15%. So the question would be how much the Es assist when Ps are maxed on each chip. In any other situation, more/better E cores should outperform and extend battery. I’m not saying that means you should want to spend the money.


I love Apple's E cores. It just sucks that the M3 pro gains so few given the reduction in P cores.

Apple's E cores take up ~1/4 the die space of their P core. If the M3 pro lost 2 performance cores but gained 4-8 efficiency cores it'd be a much more reasonable trade.
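A quick perf-per-area check of that trade, taking the ~1/4 die-area figure above and assuming (as elsewhere in the thread) that an E core delivers roughly 1/3 of a P core's performance:

```python
# Relative numbers only: both the area and performance ratios are this
# thread's rough estimates, not measured die figures.
P_AREA, P_PERF = 1.0, 3.0
E_AREA, E_PERF = 0.25, 1.0

p_per_area = P_PERF / P_AREA  # 3.0 perf units per area unit
e_per_area = E_PERF / E_AREA  # 4.0 -- E cores win on throughput per mm^2

# Dropping 2 P cores frees 2.0 area units, enough room for 8 E cores:
freed_area = 2 * P_AREA
e_cores_that_fit = int(freed_area / E_AREA)  # 8
```

Which is the point: by these ratios, a 2-P-for-8-E swap would be area-neutral and throughput-positive, so gaining only 2 E cores looks stingy.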



I’m sure the difference is GPU.


I’d like to see that. Good point about die space.


Could you not resolve these questions with benchmarking?


Functionally, how does this impact observed performance on heavy loads like code compilation or video manipulation? I suspect not much, and these are the low/mid-tier priced machines we're talking about.

If you bought a $2k M2 machine and traded it for a $2k M3 machine, you may gain better battery life with no concessions, except for benchmark measurements (that don't affect your daily work).



These are not low/mid tier machines when talking about "consumer-grade".


Yeah.

$2K-3K is what my 3090/7800x3D sff desktop cost (depending on whether you include the price of the TV/peripherals I already own).



Within the MacBook Pro lineup, they are objectively the low and mid-grade pricing tiers.


Indeed, but that's a bit of an oxymoron as any Macbook Pro is not a "low/mid-tier priced machine"


We all know what is meant by “low/mid-tier”. This is pointless pedantry. Next someone is going to come by with the throwaway comment complaint about how OpenAI isn’t “open”.


Fair enough, I was just arguing even Mac users might not have the cash or the patience to commit into another machine.

We've seen the same with Nvidia's GPUs going from the 10 to 20 series. If people don't perceive higher gains without compromises, they won't buy it.



Then why do they come with (low end) consumer level storage and memory capacity?


Different people have different needs. I certainly need a MacBook Pro for my work, but I use next to no storage. I’ve never purchase beyond the minimum storage for an Apple computer. I did however up the processor on my current MacBook Pro.

Minimum 8GB RAM is more universally egregious but I’m not going to sit here and justify my own exception whilst discounting the possibility that 8GB works for others.



The cost for adding an extra 8GB would be insignificant for Apple, though. The only reason they don’t is to upsell higher tier models


It would make them less money. /thread

To be fair, while 8GB is annoying, I bought the M1 MacBook Air when it came out and it's remarkably resilient. I've only had it freeze a few times due to too little RAM.

I've also been using many different programs. I just have to be a tad mindful about closing tabs (especially Google tabs) and programs.



This makes going Mac Mini M2 Pro over iMac M3 feel real compelling. The respective prices of these models are in fact the same, so if you happen to have a good monitor already... (also the iMac M3 curiously doesn’t even have a Pro option.)


> Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)

I believe this is due to the TB4 spec requiring support for two external displays on a single port. The base spec M series SoCs only support one external display.

I’d expect the ports to work identically to a TB4 port in all other aspects.



I really, really wish they would fix this silly display scan-out limitation. I gave them a (frustrating) pass on the M1 given it was the first evolution from iPhone/iPad where it wouldn't have mattered. But seems silly to have made it all the way to the M3. Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I'm sure there is some kind of technical explanation but both Intel and NVIDIA seemed to managed 3+ scanouts even on low end parts for a long time.



The technical explanation is that on the base M1/M2 SoC there is one Thunderbolt bus that supports 2 display outputs.

On the MacBook Air one output is connected to the internal display leaving one output for an external display.

(The Mac Mini that uses the same SoC is limited to 2 external displays for the same reason)

To support more displays they would have to add support for a second Thunderbolt bus to the base SoC.
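That budget can be written out explicitly. The per-SoC numbers here just summarize this subthread's description (one Thunderbolt bus with 2 display outputs on the base SoC), not official Apple specs:

```python
# Display-output budget on base M1/M2 SoCs, per this subthread:
# one Thunderbolt bus, supporting 2 display outputs total.
DISPLAY_OUTPUTS_PER_BUS = 2

def external_displays(tb_buses, internal_display):
    # On laptops the built-in panel consumes one of the outputs.
    used_by_panel = 1 if internal_display else 0
    return tb_buses * DISPLAY_OUTPUTS_PER_BUS - used_by_panel

macbook_air = external_displays(1, internal_display=True)   # 1 external
mac_mini    = external_displays(1, internal_display=False)  # 2 external
```

This is why the same base SoC supports one external display in the Air but two in the Mini.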



There's no reason a whole Thunderbolt bus is needed for every two displays. It's just Apple's decision to build their GPU that way.

And to not support industry standard NVIDIA GPU on ARM Macs, too. 1 GPU typically supports 5 output over as little bandwidth as PCIe x1.



Not with Nvidia, no; they support 4 displays, and always have. The NVS810 8x display card uses two GM107 GPUs.

AMD is 6 displays. You rarely see this on consumer boards, but the ASRock 5700 XT Taichi for some inexplicable reason did expose all six, with four DisplayPorts to boot. I do not think there have been 4-DisplayPort or six-output consumer cards since.



There are a couple of 900-, 10-, 20-, and 30-series NVIDIA cards with 5 outputs; 700-series and below had up to 4. IIUC it's more like a maximum of (x px, y px) with up to N independent clocks without external adapters, or something along those lines.


Just because there are X outputs on GPU, doesn't mean it will work with all of them at the same time


I was driving 5 displays from a GTX 970 at one point, for no particular reason. They just work. But for some reason (segmentation?) NVIDIA's brochure pages sometimes disagree with or contradict the products in the market.


Is this an actual hardware issue, though? One issue is that macOS has never supported DisplayPort MST (Multi-Stream Transport), ever, as far as I can tell. MST allows multiple display streams to be sent natively over a single connection for docks or daisy-chaining monitors. Back on Intel Macs, if you had a dock with 2 displays or daisy-chained 2 together, you would get mirrored displays. With the exact same Mac and displays in Boot Camp, MST would work perfectly. One display per Thunderbolt 4 port is the worst!


You can get multiple displays from a single port, the hubs are just expensive.


You can't do it with a base model M chip. It's not supported on the Mac unless you go with DisplayLink, and DisplayLink has weird issues on the Mac, like no HDCP support and screen recording being enabled, that make it a really bad experience.


Right, but why can't you disable the internal display to run 2 external displays? That wouldn't be an unreasonable compromise, but it seems not to be possible.


M1/M2 only has 1 native HDMI pixel pipe in any form, I think? Apple uses the HDMI PHY to drive the screen on tablets, and the screen on laptops. Base-tier M1/M2 also only have a single displayport pixel pipe, and Pro/Max get +1/+2 respectively.

The base-tier chips are designed as high-volume tablet chips first and foremost, with ultramobility crossover capability.

Using DisplayLink or certain kinds of Thunderbolt multi-monitor is possible by running outside the pixel pipe, or by running multiple monitors on a single pixel pipe (this is not MST, which is multiple pixel pipes sharing a single connection). But yeah, it's ugly, especially on a base-tier processor, with this eating cycles and dumping heat. You're running the hardware encoder at least.

Discord had this weird error if you tried to enable screen/audio capture: it tries to launch something and fails, and the solution is that you need to manually install Airfoil, because it's an audio capture module that Discord licensed. You don't have to fully install it, but the audio driver is the part Discord uses, and that has to go in first (it has to be allowed as a kext, i.e. non-secure mode). Theoretically a kernel-level capture like that could be a ton faster than userland; I think that's the on-label use of Airfoil.



Allow the user to turn off the internal display in favor of 2 external displays. That would be a usable docked configuration.


you are right, but apple won't do this.


independent repair technician demo video to mux MBA internal and external display?


>I'm sure there is some kind of technical explanation

I'm sure it's a marketing explanation: they make bigger margins on more expensive machines, and they need some feature differentiators to nudge people to move up. 2 extra displays is a poweruser/pro feature.

They make their own silicon, it's not like they're shy about designing hardware, if they wanted to stuff features into the lower end books they easily could.



> Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I mean, it almost certainly is? I would guess a majority of the low-end SKUs are rarely if ever attached to one external display. Two would be rarer still.



At a ~recent work place the entire floor of developers had (Intel) MacBook Pros with dual 24" monitors.

Some of them were trying out Apple Silicon replacements, though I'm not aware of what they did monitor wise. Probably used the excuse to buy a single large ultrawide each instead, though I don't actually know. ;)



Which workplaces are these that buy low-end laptops for their employees but shell out for dual monitor workstations?


Is a $1,599 laptop a low-end laptop? An M3 MacBook Pro 14" that costs $1,599 can only drive a single external monitor according to the spec, while a $1,000 Dell XPS 13 can drive 4 monitors via a single Thunderbolt 4 dock that also charges the laptop!

Honestly, I'm an accountant, and everyone in my office uses 2-3 monitors with a $1,200 business ultrabook.



I think this use case is probably not the majority.


So? Intel doesn't seem to have any issues supporting it regardless of that.


External displays can be used for multiple generations of laptop hardware. Unlike CPUs, displays are not improving dramatically each year.

MacBook Air is a world-leading form factor for travel, it's not "low-end".

An MBA with extra storage/RAM can exceed the price of a base MBP.



We’re still talking the low end of this product line. If you’re buying two monitors for your employees, I’m not sure you’re skimping on the cost between an M3 and an M3 Pro.


As stated, it's not about cost.

The travel form factor of MBA is not available for MBP, for any price.



What's Apple's high-end laptop product line?


> low-end laptops

Heh, that's not how I would describe MacBook Pros. ;)



> low-end laptops

You're saying they're low-end because they're Intel? If you've got your MacBook connected to two monitors, you're not very concerned about battery performance.

So isn't Intel silicon competitive speed-wise? I thought the M[0-4]s were OK but sort of hypey as to being better in all regards.



I have worked in plenty i5-i7 windows/linux laptops before and a macbook m1 air with 16gb of ram is miles better in everything. Nothing like them.

And even if you do not care about battery, you still care about throttling.



Not a chance. Moving from an Intel MacBook Pro to an Apple Silicon MacBook Pro was absolutely revolutionary for me and my very pedestrian ‘interpreted language in Docker web developer’ workloads.

I’d seriously consider not taking a job if they were still on Intel MacBooks. I appreciate that an arch switch isn’t a piece of cake for many many workloads, and it isn’t just a sign of employers cheaping out. But for me it’s just been such a significant improvement.



I work at Motorola and we get M1 airs unless you specifically request a Linux laptop. I wouldn't call it low end though. Low end is an Intel i3.


More like cheap out on monitors such that devs want two crappy monitors instead of one crappy monitor


What dev shop gives their engineers base model machines?


Doesn't need to be a dev shop. Go into any standard office and most productivity office workers will be running dual monitors now.

But with the general power of the base model Apple Silicon I don't think most dev shops really need the higher end models, honestly.



Where are you getting that impression from the parent post? Maybe they were on a 2, 3, or 4 year upgrade cycle and still had a bunch of Intel MBPs when Apple Silicon hit the market. That'd be extremely typical.

What dev shop immediately buys all developers the newest and shiniest thing as soon as its released without trialing it first?



We stuck with Intel MBPs for awhile because people needed machines, but the scientific computing infrastructure for Apple silicon took more than a little bit to get going.


Yeah, they were running Intel Macbook Pros because that's what everyone was used to, and also because production ran on x86_64 architecture.

At least at the time, things worked a bit easier having the entire pipeline (dev -> prod) use a single architecture.



Yeah, that was my experience. The early M1 adopters at my previous company definitely ran into some growing pains with package availability, etc.

(Overall the transition was super smooth, but it wasn't instant or without bumps)



Huh? He was talking about dual monitor situations being a problem.

If the company bought Pro or Max chips and not base models, it wouldn’t be a problem.



Intel has supported three external displays on integrated graphics since Ivy Bridge in 2012.


I’m not sure what that has to do with it being a niche use-case or not.


Niche or not, being more than a decade behind the competition is gauche.


On one somewhat niche feature, on the lowest SKU in that particular product lineup.

I can pick areas where Apple is beating Intel. Different products have different feature matrices, news at 11.



They also don’t show any signs of catching up to the Raspberry Pi’s on GPIO capabilities.


They did with https://ark.intel.com/content/www/us/en/ark/products/series/... but sadly seem to have killed off that product line.


That was Intel, not Apple.

It does seem like a shame, though—Intel’s IOT department seems to try lots of things, but not get many hits.



Apple does not compete on checkboxes. If they deemed it necessary to remove, there's a reason. Not saying I agree; just that's how they operate. If there isn't a need to support 3 displays then they won't, regardless of whether the "competition" did it years prior.


> there’s a reason. Not saying I agree, just that’s how they operate.

Almost always it’s maximizing profit margins rather than anything else.



>there’s a reason

they operate 100% on profitability, not what's technically feasible. They are extremely focused on making money. Yes, there is a reason after all.



Exactly my point. It’s technically feasible to do many things. Apple will do what Apple does. Try to upsell you into the higher tier hardware.


If that were true Apple would have stopped bragging about battery life.


The longer battery life is genuinely useful to a wide range of people in a way that being able to connect 38 external monitors is not.

I recently went on a 5-day trip and forgot to bring the charger for my M2. The first day I thought I'd have to rush around and find one. By the fourth day I still had 8% and then finally realized I could charge it via USB-C instead of magsafe.



> connect 38 external monitors

Just 2 would be enough, which seems like a basic feature their competitors are capable of supporting at very low cost.

They are in fact competing on checkboxes; specifically, they are probably using this limitation to upsell their more expensive models.



Can you not connect 2 monitors on a Mac?


Not on those with a non-pro M chip.


Even if you use one of those Thunderbolt/USB-C expansion dongles?


It has nothing to do with niche use-case or not. This is a regression compared to their own Intel Macbooks.


Well, the number with two screens would be zero, because you can't do it. The fact that 0% of these laptops drive two screens doesn't mean people don't want to do it; they're just unable to.


It’s a bit funny though that their competitors don’t seem to have any issues supporting this on pretty much all of their products.


Display pipelines are expensive and take area.


Easy to say but hard to prove. How much more expensive would an MBP be if they supported it? How many fewer units would they shift?

Those are harder questions to answer. We could assume Apple crunched the numbers. Or perhaps they just stuck to the status quo.

Only an insider or decision maker (maybe that’s you) knows.



I think their assumption is that if you’re the kind of pro that needs that many monitors, you’ll upgrade to the better chips they sell.

But it’s a frustrating limitation and remains one of the only areas their old intel based laptops were better at.



For the past 3 years, including with the latest laptops, "better chip" means a 14" M* Pro starting at $1,999; the $1,299 M1/M2 Air or $1,599 MacBook Pro does not support that, while you can find dual external display support on $600 Windows laptops, or on Intel MacBooks since at least 2012. By any standard this is an embarrassment and a regression.


An assumption they are so unsure about, that they kind of force that decision on their users.


I mean they are physical things and you can look at how big they are. But sure the rest of how that factors into cost and sales is harder to figure out, yes.


It’s a money thing. Apple wants to upsell. The production cost would be negligible, but now you have to buy the next level of the product.


The CEO is a supply chain guy. They've been optimizing their profit margins ruthlessly since he took the helm. I don't think any savings are too small, particularly if comparatively few users are affected and it motivates upselling.

I think it's weird though how far people go to defend Apple. It's objectively making (some) users worse off. Apple clearly doesn't care and the people defending them also clearly don't. But the affected users do care and "but money" isn't really a good excuse for them. It also doesn't solve their problem of not being able to use two external monitors anymore without spending significantly more money.



Unless you’re Intel?


It's because they don't want to put a Thunderbolt controller on the right side of the computer


Is this a change to the spec, or did they skirt around that previously, because I didn't think they supported more than one screen per port on the M1/2?


> seems like Intel wouldn't do this

Wouldn’t do what? Intel has more E-cores than P-cores on most of their range, and especially on the higher end e.g. on raptor lake S the i9 all have 16 E and 8 P, the i7s have 8:8, only the lower end of the i5 (below 13500) have more P than E cores. And the i3 have no E cores.

The story is somewhat similar on mobile (H and HX), a minority of SKUs have more P than E, and none of them in P and U.

In fact that was one of the things which surprised me when Intel started releasing asymmetric SMT, they seemed to bank heavily on E cores when mobile and Apple had mostly been 1:1 or biased towards P cores.



I think you confirmed what you were replying to. Intel makes the numbers get bigger as you go up, regardless of whether that makes the most sense.


Oh yeah I misread the comment.

Although that’s not quite true either e.g. on raptor lake H, the upper i5 (13600H) has 8 E cores while the low range i7 (13620H) has 4, but the i7 has 6 P-cores versus 4. The base frequencies also get lower as you move from i5 to i7. And you have less GPU EU (80 -> 64).



Well, when your P is still quite E, I guess it’s a different equation :).


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Is this because they are not populating all the memory channels, or just using lesser memory ICs?

If it's the former... that's annoying. It just makes their products worse for the sake of artificial segmentation and very little cost savings.
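For a rough sanity check, peak DRAM bandwidth is just bus width times transfer rate. Assuming LPDDR5-6400 in both generations (an assumption; Apple doesn't publish this), the quoted figures line up with a narrower bus rather than slower memory:

```python
# Peak DRAM bandwidth in GB/s = (bus width in bits / 8) * MT/s / 1000.
def peak_bw_gbs(bus_width_bits, mega_transfers_per_s):
    return bus_width_bits / 8 * mega_transfers_per_s / 1000

# Hypothetical bus widths that match the marketed numbers:
m2_pro_bw = peak_bw_gbs(256, 6400)  # 204.8, marketed as "200 GB/s"
m3_pro_bw = peak_bw_gbs(192, 6400)  # 153.6, marketed as "150 GB/s"
```

If that's right, the M3 Pro dropped from a 256-bit to a 192-bit memory bus, i.e. fewer populated channels rather than slower ICs.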



The new M3 14" MBP seems to be a red herring - why does it even exist? Why not just refresh the MBA instead?

An obvious rule of thumb is for "Pro"-branded laptops to only use "Pro"-branded chips. It's what they follow for the iPhone lineup, but I suppose iPad Pros also use non-Pro chips. Just seems like a very confusing SKU to create, and definitely something Steve wouldn't approve of.



It replaces the 13 inch macbook pro with m2. Apple always has a “pro” macbook at that price point and it is one of the better selling macbooks, because not all “pro” users have a need for cpu grunt. A lawyer, for example, probably wants a “pro” class of hardware but doesn’t need more than an 8 gb m1. You could argue they should get a macbook air, but this 14 inch macbook pro is effectively that but with a better screen and more ports, which is exactly what that kind of buyer needs.


I feel there is an obvious appeal to the MacBook Pro 14"/16" with M3. It has a good display, lots of battery life, and plenty of performance.

I'm more confused about the "M3 Pro" variant. Its performance either seems to be overkill or not enough. A more sensible lineup to me would be:

M3 - 2 thunderbolt ports, dual monitor support, memory up to 8-24gb (2x4, 2x6, 2x8, 2x12, 2x16). In the MacBook Pro, always comes equipped with second tier upgrades.

M3 Max - 3 thunderbolt ports, quad monitor support, 32-128gb (8x4, 8x6, 8x8, 8x12, 8x16).

Then again this wouldn't let Apple upsell people on basic functionality like dual monitor support so they'll never do this.



About the M3 pro, I’ve heard a theory it’s most likely due to lower yields by TSMC and M2 pro and max being too similar.

Now it's clear: if you really need perf, you get an M3 Max.



I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

And yet, the MBA's screen in comparison is serviceable and nice, but nothing outstanding. That's the case for the MBP 14 (when the 16 is just too large and bulky).



Absolutely love my 14” M2 pro and use it daily for coding. Perfect size/weight for the backpack, and endless battery at the local coffee shop.


I find it to be the perfect size actually. Easily in a backpack and is light, and can use it on the couch, etc. comfortably. I’d never buy a 16” laptop.


> I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

Absolutely not... I've been working for 10 years on 13/14" screens and never _felt_ that way. I get this is personal ;)



I find the 14" perfect, but I also find a tiling window manager (universally) vital.


The most popular Macbook Pro?

Look, I'm a 16" guy myself, I even carried one of the 17" cafeteria trays back in the day… but it's clearly the sweet spot for _most_ people.



> The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

That's not super surprising to me. Apple loves to charge stupid prices for storage and memory. Maybe it's worth it for lots of people to have the convenience of built-in storage at the lower levels, but I have to imagine that most people who would want 8TB of SSD would rather just get an external solution for... much less.



Yeah I can imagine that’s an incredibly niche setup. Maybe if you were editing on the go or similar, but even then, TB drives seems like a more pragmatic choice.


It was pretty hard to saturate the memory bandwidth on the M2 on the CPU side (not sure about the GPU).


The GPU can saturate it for sure.

Llama.cpp is a pretty extreme CPU RAM bus saturator, but I dunno how close it gets (and it's kind of irrelevant, because why wouldn't you use the Metal backend?).
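For context on why bandwidth matters so much here: token generation is largely memory-bound, since each generated token streams roughly the whole model through the memory bus, so bandwidth sets a hard ceiling on tokens/sec. A back-of-envelope sketch (the 4 GB model size is an illustrative assumption for a ~7B 4-bit quantized model):

```python
# Bandwidth-bound ceiling: tokens/s <= bandwidth / bytes-read-per-token,
# where bytes-per-token is roughly the weight size for a dense model.
def max_tokens_per_s(bandwidth_gb_s, model_size_gb):
    return bandwidth_gb_s / model_size_gb

m3_pro_ceiling = max_tokens_per_s(150, 4.0)  # 37.5 tok/s
m3_max_ceiling = max_tokens_per_s(400, 4.0)  # 100.0 tok/s
```

Real throughput lands below these ceilings, but the ratio between chips tracks the bandwidth ratio, which is why the 200 to 150 GB/s drop is a real regression for this workload.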



Well, Metal can only allocate a smaller portion of “VRAM” to the GPU, about 70% or so; see: https://developer.apple.com/videos/play/tech-talks/10580

If you want to run larger models, then CPU inference is your only choice.
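The practical effect of that cap can be sketched quickly. The 0.70 here is a rough stand-in for Metal's recommendedMaxWorkingSetSize, which varies by machine:

```python
def gpu_working_set_gb(installed_ram_gb, fraction=0.70):
    # Metal exposes roughly this fraction of unified memory to the GPU;
    # CPU inference can address more, which is the point made above.
    return installed_ram_gb * fraction

# Illustrative configurations:
print(gpu_working_set_gb(36))   # ~25 GB usable by the GPU
print(gpu_working_set_gb(128))  # ~90 GB
```

So on a 36 GB machine, a model that fits comfortably in RAM may still exceed what Metal will allocate, pushing you to CPU inference.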



Aren't these things supposed to have cores dedicated to ml?


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Just contrasting this with the recent Threadripper announcements from AMD: apparently their Pro variants top out (theoretically at least) at around 325GB/s, and non-Pro versions are half of that, so from that perspective alone the M3 Max might be better?

I always have the naive assumption that keeping the beast, i.e. the CPU, fed with data matters more for overall performance than clock rates etc.



I think what Apple is pushing for is computing efficiency. It still gets faster but with much less power. Focusing on performance solely would be the wrong way to evaluate these chips. https://www.tomsguide.com/news/apple-m3-chip


I think it's a bit more nuanced than that.

There's a reason they didn't just stick an A series chip in their laptops and call it a day - they want more performance even if it comes at the cost of efficiency. It's probably better to say that Apple is pushing performance within a relatively restricted power envelope.

Just to illustrate my point - if m3 had exactly the same performance as m1, but with 1/2 the power draw, I don't think many people would have been happy even if it would have been an amazing increase in computing efficiency.



This drives me crazy. Apple plays the market like Nintendo. Pick something that no one cares about, do it better than anyone else, and make a big deal about it.

I dream of a world where instead of a marketing company becoming a top 3 tech company, a tech company would have. Wonder what they would have done for their laptop...

Or maybe this is just an inevitable outcome of capitalism/human biology where a veblen goods company will become a top player in a market.

(I have my own Google and M$ complaints too)



So Apple is the most successful company because they prioritize things that no one cares about?

I dunno; if there were a marketing company that could design the likes of the M-series chips along with the mobile versions, develop a full technology stack from programming language and compiler through custom chips, and ship whole devices at unimaginable scale, it would make me wonder what technology companies were doing.

What other “tech” company really compares from a hardware perspective? Samsung? Dell? AMD? Love them or hate them, there’s no denying that Apple has serious technical chops. One day people will only hate Apple for reasonable things, today’s not that day apparently.



Apple develops its own OS. Apple develops its own development stack, frameworks, etc. Apple develops its own CPU/GPU architecture. Apple develops its own battery architecture. Apple develops its own tooling to manufacture a lot of their products. Apple develops its own tooling to dispose of their products.

There are very few companies that have as much first party Tech in their products from start to finish.

I think Apple underprioritizes advanced functionality, but if they're not a tech company then it's hard to see what is.



It's probably fairer to say "Apple builds products focused on things I don't care about."

Obviously, other people care.



The 2x USB/Thunderbolt ports are on the same side. :(


The SKUs are becoming more complex because Apple is probably learning why Intel/AMD have so many SKUs. Making complex chips at scale results in a range of less-than-ideal chips. This drives the need to segment and bin chips into different SKUs to reduce losses, rather than trying to sell one SKU and throwing away the anomalies.


Wish that you could get the 16-core CPU with the smaller Max GPU, but alas I just ordered one anyway.


The max is probably only going to be in desktops, so better to use the die area for other things than E cores


One thing that left me a bit confused was the comparison against Intel Macs. Although I am still using an Intel 16-inch MacBook, I really wanted to see how the M3 fared against the M2, not Intel and M1. I think it's no surprise the M3 exceeds the Intel Core i7-9750H in basically all of Apple's own benchmarking. My real question, which will probably be answered next week, is how it compares to the generation right before it.

My work laptop is a 14-inch MacBook Pro and I've been impressed with the battery life considering all of the containers that run (on Kubernetes!) as part of my dev workflow. I guess Apple deciding to compare Intel and M1 was to try to convince existing MacBook users to upgrade to the latest generation CPUs.



I imagine the amount of people upgrading from an M2 Mac will be close to 0, while there's a lot of Intel stragglers out there.


Intel straggler here. And I'll remain one until Linux runs flawlessly on the newer generation. I have an older ThinkPad as my main workhorse and a now quite old MacBook Air for on the road; its battery has been shot for a long time, but it still works quite well when plugged in, and I'm too much of a miser to have it replaced. But if there is a way to run a full Linux distro on the newer hardware then I'll probably buy one. I'm under no illusion that Apple makes their hardware to cater to me, though, and that's perfectly fine.


Check out asahilinux.org - I've had Arch Linux ARM running as my main OS on an M1 Air for a year now.


I recommend sponsoring Hector Martin here: https://github.com/sponsors/marcan

Yes, he's continuing to work on the long tail of drivers needed for Linux on Apple silicon. Looks like lately he's working on speakersafetyd, a daemon that will monitor audio levels and keep the builtin speakers from being damaged: https://github.com/marcan?tab=overview&from=2023-09-01&to=20...



You shouldn't link to Asahi Linux sites or individuals here - they (Asahi) don't like that.


They're a small team, they can only handle/reverse-engineer the M1 as of now.

edit: looks like they are make bigger strides than I thought on M2 as well https://github.com/AsahiLinux/docs/wiki/Feature-Support#m2-p...



How many hours of battery life do you typically get when running Linux on the M1?


I'm still holding out for HDMI output support so I can use it with the dock at work. But otherwise the project looks really good.


Replace it yourself using iFixit instructions and parts. Just screws and plugs, easy peasy.


> it's battery has been shot for a long time but it still works quite good when plugged in and I'm too much of a miser to have it replaced

Consider a battery pack? I have a Bluetti K2 and get far more hours running heavy workloads than people report on M1. It's never bothered me carrying it around, though some might care I guess.

My only regret is that it's slightly over the amount that is legal on planes in most jurisdictions.



Guessing using two smaller capacity battery packs instead isn't workable?


I'm curious. I run Ubuntu in a VM Fusion VM on my (Intel) Mac. Would that not work on a M-series Mac?


Absolutely! VMware Fusion supports Apple Silicon (as do several other VM packages, like Parallels).


I don't know if this is at all helpful, but I have Ubuntu running nicely inside Orbstack on a Mac Studio. Obviously it's all running inside MacOS though, but it works.


Intel straggler here! When M1 hit I was certain I would get an M-series Apple machine in a year or two, once they ironed out the transition. It was just leaps and bounds ahead of others, and the pricing was great.

Now I think AMD and Intel have caught up enough, Apple didn't keep the momentum, and they do standard Apple price discrimination, so I think I'll take a gamble on the Framework AMD version. I really like the idea behind a self-repairable, upgradeable device and want to support them. If I was able to work on my overheating Apple Intel i9 PoS for the past few years, I'll be better off with anything, so I might as well support the stuff I like.



I have a 12th gen Framework 13", 13" M1 Air, and a 15" M2 Air.

I use the Framework laptop for work because I need to use Linux.

The Framework laptop is mediocre just like pretty much all PC laptops. The hinges are awful, if you pick up the laptop upright, about 50% of the time the screen falls flat 180 degrees.

The trackpad is arse in Linux.

If you're lucky you can probably get 5 hours battery life, but on a realistic workload you're looking at 2-3 hours.

The keyboard is pretty nice, but I wish ctrl/fn were swapped like Apple's and it had the inverted-T arrow keys (or at least I wish someone would make a swappable keyboard for the Framework).

The speakers are bloody awful.

Display/Webcam/Mic are fine.

I would like more ports over modular ports, but I appreciate the design that went into the modular ports.

Speaking of modular ports, sometimes they abruptly stop working and require removing and reseating.

All these small nits really add up, and it just feels like a mediocre experience. It's my work laptop, but I try my best to avoid using it in favor of my PC with WSL2 or either Air, though I also try not to mix work and personal.

Both the 13" M1 Air and 15" M2 Air are just amazing compared to the Framework, and I suspect compared to PC laptops in general. They have their drawbacks - price (gouging in some ways), fewer ports, can't drive dual displays - but their trackpad, finish, speakers, etc. are just amazing. I personally prefer macOS to Linux for a desktop experience as well.



Thing is I use integrated keyboard/trackpad maybe a few times a week in conference rooms, same for battery - 5 hours is plenty for presentations and meetings.

I want a portable workstation that I can occasionally use as a laptop, so build quality and laptop stuff isn't that big of a deal to me. I'm always using a screen via USBC + dedicated keyboard and mouse. Performance and noise are a factor - I'm hoping that AMD versions deliver on that.

I'm leaning towards framework because if my current MBP dies I can't do anything about it since it's been out of warenty for years. And upgrading it eventually with next gen CPU without having to change storage/RAM, etc. sounds nice.



> Now I think AMD and Intel caught up enough

Interesting. I don't really like Apple, mainly because of how they handle the app store vendor lock in stuff, and I hate macOS. But I use an M2 MBP purely because I can't find any other laptop that has the same fast performance, long battery life, quiet fan noise, no heat.

Can you recommend an AMD/Intel or anything that comes close? I'd switch in a heartbeat. The closest that comes to mind is the ThinkPad X13s.



I just bought a Thinkpad T14S Gen3 after evaluating a bunch of notebooks.

Compared to the ThinkPad, my MacBook Air excels in a couple of areas: an ambient light sensor, port quality (i.e. how recessed the USB-C port is and how much strain it can take), audio output quality (Macs have powered headphone jacks for high-impedance headphones), and of course speaker quality, which on Macs is second to none.

The ThinkPad by comparison has poor-quality speakers and no ambient light sensor, so it doesn't auto-adjust the screen or keyboard backlight. Its USB-C ports are also not very strong, so any strain or wiggle on them will cause a disconnect - they certainly feel very fragile.

I hear the ThinkPad Z13/Z16 is more comparable to a MacBook, but again it doesn't have little details like an ambient light sensor, which seems an odd omission in a luxury laptop, and price-wise it's practically the same.

That said, the new AMD Ryzen 7840HS and 7940HS chips are pretty competitive with an equivalent Apple M2:

https://browser.geekbench.com/processors/amd-ryzen-9-7940hs

https://browser.geekbench.com/macs/mac14-15



And GeekBench is a pretty bad benchmark.

There's also the 7945HX which is a 16 core CPU, but only comes with big dedicated GPUs, sadly.



I use my laptop as a portable workstation so I rarely use built in keyboard, touchpad, battery, etc. From what I see AMD 7840 CPUs deliver similar performance to M2.

I think I'll get a Framework 16 or 13 with AMD in a few months; I'll make sure the Linux drivers are in order before ordering.



> I don't really like Apple, mainly because of how they handle the app store vendor lock in stuff, and I hate macOS. But I use an M2 MBP purely because I can't find any other laptop that has the same fast performance, long battery life, quiet fan noise, no heat.

Buying their products for the same reason everyone else does is "liking Apple".



You can like or use a product without liking the company making it or its practices


True. eg "For my last project, I used JavaScript, in anger."


Your only relationship to the company is via the goods and services it sells to you. If you buy their products, you like the company and their practices. The product is the culmination of all of their practices.


"I don't like being a galley slave but if I stop rowing, they whip me"

"Rowing for the same reason everyone else rows is 'liking slavery'".

Counter-proof by reductio ad absurdum.



So you admit of being a prisoner in an ecosystem built to ruthlessly exploit you...?


Wasn't AMD pretty much right behind in single-thread perf and a bit behind in peak perf per watt, but always ahead in multi-core perf options?

I'm very happy with my Ryzen 7 6800U.

Matte screen? Check.

Lots of full perf cores? Check.

USB A and C? Check.

Really good battery even though I'm plugged in all the time? Check.

X86-64? Check.



The big thing I like about the M's is how quiet they are. I hate fan whirr in my old age.


I think the point is more that "is it worth buying this new m3 or should I buy a secondhand m2 for half the price?"


more like "a hundred bucks less". The M1's go for maybe half... if you're lucky. Most are more.


Because there's a newer model, which is now true for the M2 ones too?


Or should I buy a brand new M2 Mac Mini for half the price of iMac M3 if I already own a monitor. Or a M2 Pro for the same price where iMac M3 doesn’t even have a counterpart.


I don't think you're going to find any M2s for half the price. They hold their value pretty solidly.


They won't if the M3 turns out to be a significant upgrade.


That’s honestly just not how Apple hardware resale value has ever worked. The new M3 hardware could double the performance of the previous generation and they would only drop in price by maybe a few hundred.


One big outlier; the recent Intel Macs have dropped far more in value than expected.


And that outlier is the last /significant/ difference in performance.

M1->M2 wasn't that big, generally seen as a small incremental improvement at best, and the last few Intel updates barely moved the needle between them.



I’m an Intel strangler because it’s the only way I can have Mac and Windows on the same machine. Am I the only person that really needs this? Parallels has trash performance.


Hilarious typo — I picture someone throttling an Intel machine (which sadly throttles you right back).


Ha I just noticed it!


I bought a cheap Windows crapbook just to run the two Windows-only apps that I need (terrible apps for alarm systems and photovoltaic inverters).

Apple Silicon laptops are so much better than Intel-based machines, there is really no comparison.



I'm still running Windows on my Mac mini. I used to use Bootcamp but don't do so much Windows stuff now so can get away with VMWare. After June next year I can drop Windows completely so will likely upgrade then.

I'm not sure how VMWare compares to Parallels but performance isn't as good as Bootcamp. I wouldn't want to use it all day long.



Parallels is overpriced rubbish with stupid limitations. I'm switching to VMware as of right now.


Parallels used to be the front-runner, with better support for new macOS features; VMware used to lag behind with weird issues. That's why I switched from VMware Fusion to Parallels Desktop a few years back. What are these limitations you speak of? And what makes it rubbish?


8 GB max of RAM is a software limitation not present in the more expensive tier. I suspect it is not a technical limitation, just greed.


Probably the biggest complaint is that Fusion is now free (for personal use) while Parallels is a subscription.

Also you can get other free vm managers on Macs now since it ships with a hypervisor and vm framework.



I got a NUC for whenever I need windows and just RDP in it. It mostly collects dust.


You can run ARM Windows under Parallels, which is good enough for me.


Ah just updated my comment. I really dislike how slow Parallels is.


For gaming sure... but anything else... hypervisors are very good these days.


I need it for windows only CNC software. It’s pretty terrible for that purpose.


Funny, I never notice that.


have you tried using VMware Fusion at all?


my intel died last week. had to get an m2 pro and THIS drops today. kinda salty that my intel couldn't have held out till the M3 was available


Apple has a 14 day return policy. Don't need a reason.

https://www.apple.com/shop/help/returns_refund



This is what returns are for! It is definitely worth the hassle for a better chip


I bought a top-spec Intel Mac a month before the M1 came out. Just sold that computer for 1/3 of its original price :\


I don't recall if the M1 was a secret/surprise (I doubt it).

Before buying any Apple hardware, it's worth looking up a couple of sites that suggest when the next release cycle will be for a given item and what specs are expected in the next release. The one I use is https://buyersguide.macrumors.com/



M1 was not a surprise.

After years of waiting for a non-butterfly keyboard, they released a working Intel based MBP, but almost immediately made it worthless by announcing the migration to Arm. They even "rented" prototypes using A12Z chips for developers to use to prepare their apps nearly a year in advance. Hardly a stealth endeavour.



1/3 is pretty good for a used computer, TBH.

I usually get like 20% or less... sometimes they're hard to even give away.



Fair, this one was in literally perfect condition though. No scratches / damage and a brand new battery.


It's like buying a car. The moment your Mac walks out of the store, the ghost of Steve Jobs comes and personally curses it to 2/3 of its value that very instant.


Except with cars, that loss is against the sticker price. So if you get a good enough deal on your new car, it can in fact lose nothing when you drive it away (I got close to 1/3 off a new car in 2018, using a web-based buying agent, in the UK).

It's very hard to get anything like that much off a Mac, since Apple appears to have pretty tight control over prices, even as charged by third parties.



“Last week” is well within the 2-week return window.


unless you went second hand, you should still be covered with the 30 day return policy no?


wow it's 14! definitely need to hurry up :)


Intel straggler here. My laptop just completed 5 years. It says battery needs replacing which I’ve been putting off for a year. The thing is I have 32G which is quite good IMO so I’ll probably use it for two years at least.


You're just hurting yourself. Even the M1 Pro is like 3x faster than that Intel laptop. Plus so many tiny things like sleep/wake, power management, heat.. I really can't stress enough what a tectonic change the M1 was over Intel.


Depends on what they're doing with the machine. For casual use, Intel is perfectly fine and you get some x86 benefits too. I have an M1 Pro for work and a personal M2 Mac mini, and I'm typing this on a 16" Intel MBP; I watched the presentation live and I'm still not convinced I need to upgrade my casual browsing and light programming machine this year. The battery is at 500 cycles and 84% health, and still lasts 4 hours.


Exactly! I have a 2012 MBP and a 2015 MBP that are still kicking and work perfectly for my kids playing Roblox or Minecraft and any web browsing we want to do with a proper keyboard.


I've got an M1 mini already, so it's good to have the Intel MBP around for occasional x86-only stuff. Built-in USB-A is also convenient.


It really, really was. I’d been used to barely-noticeable performance gains for a decade or more. The responsiveness under load of the M1s made them feel like a decade of performance improvements overnight.


This was especially true of the Apple/Intel thermal envelope. I don't know of any other manufacturer that purposefully ran their CPUs to the point of throttling just to keep fan noise down, and in such thin machines with wafer-thin heatsinks for design aesthetics. I think they made the last few generations of Intel chips they used look worse than they were (to be fair, Intel did have poor TDP until the recent 12th gen).


But how will I ever survive without my touchbar?


Maybe an aftermarket keyboard? I know you're not serious, but some aftermarket keyboards have little OLEDs inside the keys. Also, the Apple Vision Pro headset might somehow provide that one-extra-awkward-step-to-accomplish-a-task interface you seem to be humorously craving.


I'm one of them. For work, come and grab my 2019 16in from my cold dead hands - it may not have the battery lifetime of my private 2022 M2 MBA, but at least I don't have to fight weird issues with x86 Docker images (especially anything involving a JDK runtime, i.e. Tomcat, tends to act up). And no, converting these images to ARM isn't an option, the source image doesn't come from us, and we need to reproduce the exact software environment to reproduce bugs.

And for me as a Samsung phone user, I'm pretty annoyed that I have to drag out a win10 machine every time I want to update the firmware of my phone because Odin is only available on Windows and UTM can't use Rosetta to emulate a Windows VM at any acceptable speed or stability.



2019 16-incher here too. These are great machines, just eclipsed by all the Apple Silicon hype. Yes, they lose to Apple Silicon on performance, power use, and heat, but they're still amazing x86 laptops compared to whatever other x86 machines are out there.


I have 1 of these for work and 1 for home. I'm waiting 5 years from release, so about 2024-2025. Computers got really good around 2017, and the only reason to upgrade is the heat and fan. I use remote VMs anyways, so chrome is really my limiter.


So an occasional firmware upgrade is holding you back? If you install the guest drivers, Windows is super fast under UTM.

The Docker thing is a non-issue if you're using OrbStack, for example.



For years now, Apple's biggest competitor is Apple from five years ago. One of their biggest threats is that sales flatline because people are still using their perfectly good laptops and phones from several years ago. And that's a hard problem to deal with when you advertise your products as high-end goods that will last a long time.


They have solved this, though, by periodically introducing incompatible macOS updates: eventually you can no longer install the current macOS, then after a while you can't install the latest versions of applications, and you start to feel a stronger need to upgrade...


Six years of OS support, plus security updates past that.


Not as long as Windows support but still good enough.


This shouldn't be a problem, and the answer is to refresh devices less frequently.


The whole company is sized to a specific revenue stream, and this revenue stream requires a specific sales volume. Apple could get off Mr. Bones' wild ride, but this would require remaking how the whole company works:

- less frequent laptop releases mean lower sales (do you want this 2020 Mac or this 2023 Dell?)

- lower sales mean lower revenue, lower revenue means lower costs

- so now Apple has to spend less on R&D and at the same time convince its board of directors that lower sales don't mean Apple is failing



It objectively is a problem for Apple as a corporation though, as they are expected by their shareholders to continue to grow and increase profits year over year.


Only in a modern capitalist society, etc. One thing Apple has diversified into in recent years is its non-hardware offerings: iCloud earns them billions, and Apple TV is churning out massive productions - a 3.5-hour Scorsese flick in cinemas, for example.


Why would they do that? It wouldn't solve the problem @rueeeeru stated. They need to remain competitive and you can't do that by sitting on your backside for four or five years between products.


The comparison with M2 is not very useful at this point. Throughout the presentation, the gains were something like 15-20% in some areas.

Even if they design their own chips, Apple cannot achieve a revolutionary performance gain year over year, which is why we'll see something like this between any two generations for a long time. And since their idea is for people to use Macs for many years, it makes sense to compare against older generations to try to persuade people to upgrade.



Not just that, but if you're not running ARM images then it's running a VM for your docker images to boot. It's just mind-blowing how many containers I have running and my laptop is cool to the touch, and at full battery.


You’re running a VM either way on a Mac. Docker without a VM is only available on Linux.

EDIT I get your intent though because it runs better when the images match the host.



Yep! But at least it can leverage hyperkit and vastly reduce overhead when running images of matching architecture.


hyperkit does not run on ARM Macs

  $ brew install hyperkit
  hyperkit: The x86_64 architecture is required for this software.
  Error: hyperkit: An unsatisfied requirement failed this build.


Oh I didn't realize, thanks for pointing that out. I guess they'll have to rely on the new virtualization framework going forward.


And on Windows while running Windows containers.


Before anyone else gets confused...Windows containers are only for Windows applications. If you have a Linux environment in your Docker image, then it's running on a VM. https://learn.microsoft.com/en-us/virtualization/windowscont...


Of course, what is there to be confused about?

It says it clearly in the name: Windows containers.



MacOS can run MacOS containers natively, but I understand that's not much help for most people.

https://macoscontainers.org/



Yeah, but that also requires disabling SIP, so you still might want to run it in a VM.


Aw. I wish there was a way to isolate corpoware crap like Citrix into its own little jail. Kinda like {tool,distro}box on Linux.


That's interesting. It's only at version 0.0.1, though, but it seems like a possible drop-in replacement.


Without network, IPC, PID, and cgroup namespace replacements it's not even close. How does Windows Server do it? By using parts of Hyper-V.


This isn't as much the case anymore. There are typically arm64 variants available for the more popular images, and anything you build locally is native, unless you're trying to build x86 software.

Redis, Memcached, PostgreSQL, MySQL- it's all native arm64 images now.



Even using ARM images you still need a VM because the MacOS kernel is not a Linux kernel, and containers rely on features of that kernel.


That only really helps if you also want to deploy on arm64. Developing on one architecture and deploying on another would kind of destroy one of the advantages of containers.


> if you're not running ARM images


The point was that there is rarely a reason to not run ARM images, since they’re widely available.

If I’m running M3 images on my Ryzen, performance is gonna be horrible, but why would I?



Today? Sure. Thus my statement, which came with a caveat about not running ARM images, the implication being that it's a rare thing today.


Do note that cool to the touch is a bad thing if the internals are otherwise hot


If the all-aluminum chassis remains cool to the touch despite boiling internals, that's impressive in its own right :-)


I was more surprised not to see any AI-specific benchmarks—sure, the most popular open-source models are from avowed competitors, but there should have been a way to define a re-training task that would be relevant for the wave of ML programmers.


They didn't improve the neural engine. M3's NPU is half the speed of the A17 Pro in iPhone 15 Pro.


I had the same thought. It seems like the improvements there weren't worth talking about.

AFAICT, the biggest benefit is just the unified memory model at this point.



It won’t excel at this. It is a mobile GPU and won’t be able to put up remotely similar numbers to a desktop GPU with massive amounts of power and cooling.


There were comparisons to the M1 and M2 in the presentation, and indeed the linked website.


The M3 chips' efficiency cores are 30% faster than the M2's, and the performance cores are 15% faster. The overall performance is still impressive, and there is no alternative to the M series on performance per watt. But others have caught up; for example, https://www.amd.com/en/products/apu/amd-ryzen-9-7940hs is very good.


I note that if those performance numbers are correct it'll still be a lot slower than Intel's Core i9-13980HX laptop CPU. The M2 Max was between 50% and 80% of the speed of the 13980HX on most benchmarks. A 15%-30% uplift will get it closer to the Intel part but still not reach it.


Yeah, but the problem is that 13980HX has 24 cores, 32 threads and incredible 157W Turbo Power. That means it will be really slow when on battery. And when connected to the mains, it'll be as loud as an airplane taking off.

The perf per watt claims still apply.



Good point.

However, when they’re that close in performance, I think power usage should be taken into account. It’s a laptop CPU after all. Maybe I’d rather have 80% of the performance at 4x the battery life



Did you account for the M3 Max having four extra performance cores (12 versus 8 for the M2 Max)?


> But others have caught up

You mean others now got to use TSMC's 4nm process? These new M3 chips are probably on the 3nm process, Apple is still a generation ahead here.

It looks like Apples aim is to always stay a generation ahead of its competition, I wonder how long they can keep that up since they aren't running their own fabs.



Oryon seems to have caught up in performance despite being on N4 (which is just 5nm++), but it's an ARM design from the guys who made M1 in the first place.

M3 seems like it generally underwhelms. A17 had like a 3% increase in IPC. They didn't discuss battery much and I suspect that's because ramping the clockspeeds to over 4GHz isn't so good for the battery benchmarks.

The most worrying part is the transistor count. M3 Max gets quite a bit higher transistor count, but M3 is only 37B transistors while M1 was 33.7B. Apple and/or N3 absolutely suck here.

Looks like I might be keeping my M1 system for yet another upgrade cycle.



the backstory here is that TSMC N3 is a trainwreck, it's now separated into two different nodes, N3B (for bad) and N3E (enhanced). N3E gets the promised step that was originally for N3, but it only enters volume production next year. Supposedly it will actually bring costs down (and yields up) because this is where some additional steps go EUV. Both TSMC and Samsung have been fucking around with their marketing around nodes to try and say they're first in volume 4nm production, but both are having problems with the final bosses of FINFET at 3nm and after this both TSMC and Samsung do GAAFET and solve a different set of problems. Past that lies... nothing. Hyper-NA seems dead.

In the meantime, M3 is on N3B despite very low yields, which surely applied design pressure to keep size down, and the power gains are not as good, and also the density is worse than promised. Apple also surely feels pressure to keep prices high (bait-tier M3 base option with 8GB lol) and honestly they probably are going to be tough to justify on a performance/efficiency basis compared to very fierce ARMv8 competition (we are now testing the thesis there's no difference lol). Apple still has advantages but man do they take you to the cleaners for the result, a loaded apple laptop is obscene. I chose to go for an older loaded M1 Max instead of waiting for M3, because I could actually get a nice laptop that wouldn't impose limits on a prosumer etc. 1TB is all anyone can afford still and that's really silly.

(SSD prices in particular are absolutely inexcusable lol. Mandate an M.2 NVMe 3.0/4.0/+ port please, EU, it's time. Don't care how it works - slot it into the side or whatever if you want, it can be single-sided 2230 or 2242 (or caddy-loading, the Icy Dock standard lol) - but it's time.)

https://global.icydock.com/products-c5-s48-i0.html

https://global.icydock.com/vancheerfile/images/mb873mp-b_v2/...

I also wonder whether losing a bunch of the PA Semi team to whatever startup (it may have been Nuvia or Tenstorrent lol) hurt Apple's velocity on A16/A17. There were a lot of Apple silicon designs before M1, after all. But certainly TSMC is a big part of the problem here.

I think they'll hustle to refresh it and do M3+ as a fast follow in 6-12 months with N3E, the cost economics are very favorable to jump as soon as there's the volume. That doesn't mean MBP gets refreshed immediately though, they'll ramp on the phones (iphone 16/16 pro, etc) and base-tier M3+ or whatever first.



>the backstory here is that TSMC N3 is a trainwreck, it's now separated into two different nodes, N3B (for bad) and N3E (enhanced).

Without N3B, there will never be N3E.

It was the same with N7. And none of this is new for a new node generation.



Apple signed a deal with TSMC to purchase nearly all of their available 3nm chips for the next year. Ethical or not, they positioned themselves to ensure almost no competitor could develop on 3nm until they did first, since no fab on earth has TSMC's scale. They could ride this plan for years if it pays off.


How is Apple buying all the production slots for a process in any way unethical? It's not like they're buying them and then burying the chips in a landfill. They paid TSMC's asking price for production slots. AMD or Intel could have bought those same slots but didn't. TSMC has limited capacity at 3nm, it was up for sale, Apple bought it. Where's the ethics question?


>How is Apple buying all the production slots for a process in any way unethical?

This argument has come up on HN for more than 5 years. Intel's fabs used to sell their industry-best node only to Intel itself, which charged a premium for those newer CPUs. I guess that was unethical too.



I've seen arguments for 'business ethics' from the losing side before. It's sometimes part of a media campaign; it's likely cheaper than legal action.


How is Google buying search defaults on all iPhones in any way unethical? It's not like they're buying them and then burying the searches in a landfill. They paid Apple's asking price for search defaults. Microsoft or Brave could have bought those same search defaults but didn't. Apple has a limited amount of search defaults on iPhones, it was up for sale, Google bought it. Where's the ethics question?


not only is outbidding the competition not unethical (as a sibling notes), apple actually is very involved in the early node work etc. a lot of this work is literally done for apple, it is "exclusive games" in the "this game would not have been made without the sponsorship" sense. this literally would not have been brought to market on the same timelines if Tim Apple wasn't signing a couple billion dollars a year to TSMC right upfront.

Apple pays lavishly to support TSMC's early node research, and they get their say in what happens in the R&D process, and very early insight into the node and their say on how it would work for them as they do their rollout. TSMC gets carried through the research phases much faster than their competitors can do, and it's led them to be on an absolute tear starting with 7nm. And they absolutely cannot fill the same level of demand with the same level of R&D funding from any of their competitors.

It's been a healthy, productive long-term partnership: TSMC is maybe the only supplier Apple can't boss around, and Apple is certainly a client that is always too big to fire. Doesn't mean every Apple product is good (and TSMC can still flub, and their competitors are catching up a little bit), but Apple can move whatever they need to lol, they are masters of supply chain management. They can cover TSMC's mistakes if needed, and they have insight into exactly what is happening as the node is developed and how they need to maneuver their product stack around to exploit it.

Engineers study designs, CEOs study logistics. Also true of NVIDIA btw lol, they are very logistics-oriented because they make up such a large marketshare. How many companies on the planet are ordering big bulk runs of GDDR? Well, if we are ordering 20% of the planet's GDDR on a fixed timetable then maybe we can get a custom version, micron, right? (9 months later, GDDR5X/6X is born lol)

It is an interesting contrast to Intel - this is almost the same kind of synergistic relationship as intel's own fab and IP side have historically had together. Did intel fail because they had a tight fab-design coupling, or did they fail because they had a rotted internal culture and then the fab slipped a bunch?



Not running their own fabs makes it easier, right? They presumably pay a premium for being the first to use a generation, but not anywhere near as much as the full cost of developing it, since TSMC can sell that capacity to everybody else when they are done with it.


They are literally on the 3nm process. They say so in the video.


OK, so not "probably" - they are on 3nm. Anyway, my point that they are a generation ahead still stands.


> But others have caught up

Then you proceed to link to one that... hasn't? (it's good, yes, but it's not caught up at all)



I still know a number of my colleagues who haven't jumped onto the Apple Silicon machines despite our 2019 Intel machines being out of warranty (and thus eligible for replacement) for two years now


I'm still using a 2015 macbook for my personal daily driver. I have to plug it in regularly, but otherwise it works just fine for everything except video editing.


I thought the same, but ended up upgrading to an M1 Air due to a hardware failure.

Sure it feels 100x faster. But more than that it’s DEAD SILENT (no fan in the machine!) and completely cool.

The temperature/noise, if I had experienced it myself first, would have probably gotten me to upgrade.

I use a 2019 Intel MBP at work. It’s much faster than my old 2015 too, but with the additional heat and noise I didn’t really want one.

I would have taken the noise/heat of the M1 + 2018 or 2019 performance. Instead I got heat/noise of the M1 and far better performance, for a fraction of what my 2015 cost (unadjusted) new.

Amazing upgrade.



I got an M2 Pro Mini a couple of weeks ago to replace my 2018 i7 MBP, and while it is obviously snappier - you can feel it's more powerful without running benchmarks - the main difference is that it is silent. I've only gotten it warm to the touch rendering video with DaVinci Resolve, when the CPU temperature quickly went to 75 °C.


Any reason for not upgrading?


Not OP, but

The new Magic Keyboard (or Scissor 2.0) just doesn't suit me. 1.0mm of key travel is so much worse than the 1.3/1.5mm of the old scissor keys on my 2015 MBP.

I actually don't need the seamless, ultra-large trackpad, which registers false positives from time to time. That never happened at a sane trackpad size.

My workload is memory-limited and rarely CPU-limited. Upgrading wouldn't bring many benefits unless I had more memory, and memory upgrades are expensive.

Did I mention the keyboard or trackpad?

I just had the battery swapped on this MBP earlier this year; hopefully it will last another 4-5 years, or until I can't update Safari anymore. Although I guess I could still use Firefox.



>it works just fine for everything except video editing

would be my guess



> Any reason for not upgrading?

Why is this sort of question always framed as if people have to provide a justification for not buying a new model, as if spending money on the new shiny without any reason is normal or desirable?

We live in a day and age where hardware bought a decade ago still packs enough punch to run most of today's software without any hiccup. Why would anyone waste their cash to replace something that works without having any compelling reason?



Some people have experienced downsides you might want to know about.

For example as you passed 2016-2019, not only did you have the butterfly keyboard mess but each generation reportedly got hotter and thus louder.

My 2015 was quieter/cooler than my 2019.

So if you’re happy it may turn out that even though the newer machine has better performance it feels like a downgrade for other reasons.

(I don’t think that’s the case here)



The 2019 16" no longer used the butterfly keyboard; they have a physical Esc key and something Apple called the "Magic Keyboard", actually a scissor-based mechanism that's really pleasant to type on and doesn't stop working with a tiny bit of dust under the keycap.

https://www.macrumors.com/guide/butterfly-keyboard-vs-scisso...



2019 MBP mostly fixed the issues introduced in 2016. But lack of USB and HDMI was enough for me to pass on it. Also, the 2015 one is prettier.


Ah I miss the lit up Apple logo.


Yeah I would definitely miss that, as well as the shape.


The new Magic Keyboard, i.e. the new scissor keyboard, has a key travel of 1mm; for me, I wouldn't describe that as really pleasant.


I'm rocking a 2017 MacBook Air and it's great--really the only downside is that Apple stopped supporting it in macOS past Monterey. Almost all of my past and present Apple hardware has long outlasted software support, which is my biggest complaint with the company.


Because the productivity gains from upgrading can be quite large relative to the cost of upgrading, esp when you factor in the average salary in this community.

I spend 8-10 hours on my Macbook every day. The amount of time I've saved / productivity I've gained by things just running faster and by being more mobile (much longer battery life) is huge compared to the $2000 price tag.

Frugality is good but there are some things in life (depending on your personal circumstances) where it does in fact make sense to upgrade for clear benefits.



Really depends on what you're doing.

I spend 8-10 hours a day coding on my 2018 MBP (web apps - Postgres and Rails or Python) but almost none of that is really CPU-bound in my case. The meat of my work, the actual coding and iteration, is not limited by the aging CPU.

The one thing that's painfully slow is rebuilding Docker images, but we don't do that too often. Less than once per week.

I actually am upgrading soon, but it is not going to make an amazing difference for me in terms of productivity in my current work.



> Because the productivity gains from upgrading can be quite large relative to the cost of upgrading, esp when you factor in the average salary in this community.

You see, this is simply not true. At all. By far.

I have a cheap Intel laptop released 8-10 years ago. It shipped with 8GB of RAM and 4 cores. I bought it on a clearance sale for around $500. I use it still to this day to work on webapps, including launching half a dozen services with Docker Compose. The only time I experience any type of slowdown is when I launch IntelliJ.

I also have new kit, including a M2 MacBook.

There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing. The only issue I have with my old laptop is battery life, and that's just because I don't bother replacing it.

Please do point out a single concrete example of "productivity gains" that I would get by spending $2k on a new laptop.



> There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing.

[…]

> The only time I experience any type of slowdown is when I launch IntelliJ.

I can't tell if you're a serially dishonest interlocutor, or whether your fetish for making very emphatic generalisations with lots of intensifiers makes you seem like one, but once again this is very weak reasoning. You have yourself pointed out something you cannot do with your Intel laptop which you could with an upgrade.

> Please do point out a single concrete example of "productivity gains" that I would get by spending $2k on a new laptop.

You can run IntelliJ smoothly and have no battery life issues. (Literally from your own post… it's just so sad to see this utter lack of self awareness.)



The only downside would be struggling with 8GB, which should be upgradable just as well. A 10-year-old laptop would have a CD/DVD tray - that can be replaced by an SSD caddy for 4TB of goodliness (SATA, but still good enough).

My spouse has a 12-year-old laptop that has had pretty much everything (but the soldered GPU) upgraded - CPU, memory, HDD->SSD, a second SSD in the optical bay, WiFi (to support 5GHz), keyboard (replaced), fan & heatsink, battery (replaced; might rebuild one with LG's 18650 MJ1 cells). Unfortunately pre-Sandy Bridge memory is capped at 8GB, so it shows its age - still an amazing thing.



I love responses like this, we should think hard first why NOT to upgrade, instead of doing reverse.


Why do you have an M2 Macbook?


It is an honest question.

For some background in my thinking, it is because today's announcement really focused on Intel users. My semi-educated guess is that Apple did a whole bunch of user studies and realized there are a lot of people out there, like the OP, who haven't upgraded yet, hence the focus. As a result, I'm genuinely curious why this person hasn't upgraded.

And for my own personal experience, the upgrade/switch from intel to m*, is night and day better ergonomics as a developer. It isn't just some shiny new toy or a waste in cash. For the same reason professional mechanics in F1 don't use shitty tools to work on their cars. Or tour de france racers aren't using 30lbs Huffy bikes.

TLDR: I don't give a f'ck if you don't happen to upgrade, that's your choice. I'm just curious about why.



    For the same reason professional mechanics in F1 
    don't use shitty tools to work on their cars.
They also probably don't buy new wrenches every time new wrenches are released, if their current wrenches are completely sufficient and not holding them back in any way.


> if their current wrenches are completely sufficient

That's a fantastically entirely subjective opinion.



Sorry you're being downvoted for pointing out specious arguments.


You... don't think that mechanics on a racing team are qualified to know if their current wrenches are sufficient?


I suspect that OP thinks, as I did, that you've constructed an inane straw man.


User latchkey, the one you're agreeing with, is the one who very literally claimed that a developer using an Intel laptop is quite equivalent to an F1 mechanic using "shitty tools" or racing the Tour de France in a 30lb Huffy.


I'm not agreeing with latchkey's statement about developers and "shitty tools". I'm agreeing with them that when you say this…

> if their current wrenches are completely sufficient

… you are not making an honest argument, because it is entirely subjective as to whether their current wrenches are "completely sufficient".

The dog I have in this fight is not upgrade cycles or Intel vs. M1, it's "argue the fucking point without descending into high school rhetoric and logical fallacies".



A wrench has a finite set of objective qualities. Grip, length, strength, weight and maybe some special-case properties like being non-magnetic or spark resistant.

It's surprising to me that you think that cutting-edge racing mechanics don't have objective criteria for these things and that it's all some sort of subjective dark art. But it's a bad analogy to begin with and it's not my analogy.



I totally agree it's a bad analogy, and whether I succeed in defending it or you succeed in knocking it down, it doesn't really help us understand each other in greater fidelity.

The one thing I do admire about the person who offered it is that they are at least trying to persuade by offering different lenses through which to interpret their perspective, instead of repeatedly shouting THERE IS NO GOOD REASON TO UPGRADE LMAO.



> The one thing I do admire about the person who offered it is that they are at least trying to persuade by offering different lenses through which to interpret their perspective, instead of repeatedly shouting THERE IS NO GOOD REASON TO UPGRADE LMAO.

That nails it on the head.



> That's a fantastically entirely subjective opinion.

That's the point: objectively, there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

In fact, it boggles the mind how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools", as if MacBook Pros packing an Intel core 7/M1/M2 suddenly became shitty laptops just because Apple released a new one.



> That's the point: objectively, there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

Again, what you mean to say is that _you_ cannot think of a reason that would make _you_ upgrade from a 4 year old MacBook to a new M3 one.

> objectively

Do you understand that what you say is literally, definitionally, subjective? It's one thing to make primitive and clumsy generalisations, but quite another to be confusing subjectivity and objectivity.

> it boggles the mind

Starting to believe there isn't a lot of mind to boggle here…

> how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools"

I haven't noticed anyone making this argument, but I know many people who upgrade their tools -- whether computers or otherwise -- to the latest and greatest whenever they can, because working faster and more efficiently is a concrete benefit, and it really would take an inestimable moron to, say, argue that late Intel-era MacBooks can do the same things that M-series MacBooks can.



    I haven't noticed anyone making this argument
Yeah, you haven't read this thread.

Not that you missed anything of value. A previous poster, latchkey, quite literally made that argument:

    "the upgrade/switch from intel to m*, is night and day 
    better ergonomics as a developer. It isn't just some 
    shiny new toy or a waste in cash. For the same reason 
    professional mechanics in F1 don't use shitty tools 
    to work on their cars. Or tour de france racers aren't 
    using 30lbs Huffy bikes"
As to this assertion:

    it really would take an inestimable moron to, say, 
    argue that late Intel-era MacBooks can do the same
    things that M-series MacBooks can. 
In terms of raw performance and power efficiency, obviously the Apple Silicon laptops trounce the Intel-based Mac laptops.

But if you spend some time learning about our industry you'll realize that not all development workflows are identical, and not all have the same bottlenecks, and for many tasks an Intel-powered Mac is not a bottleneck. Surely you can understand that, or aspire to understand that.

I would certainly agree with a more generalized and reality-based version of what you and the other poster seem to be attempting to say: If your current hardware is bottlenecking you in any way, you should most definitely address that if at all possible. A hardware upgrade that unbottlenecks you and improves your developer ergonomics will almost certainly pay for itself in the long run. That is sane and profitable advice and something I've always done.



Thanks, I had missed that. It contains the phrase "don't use shitty tools", but I'll leave it to you to decide whether OP honestly recapitulated the same argument in their passing reference. The two seem somewhat different to me.

> As to this laughable claim […]

This is a response to a specific point which rewmie has made several times. They seem to genuinely believe there is literally no difference between M-series and Intel chips:

> There is absolutely nothing I can do with my M2 laptop that I cannot do well with my cheap old Intel laptop. Nothing.

> there is absolutely no concrete reason that justifies replacing a MacBook bought in the past 3 or 4 years with the M3 ones. None at all.

> it boggles the mind how anyone could justify replacing any MacBook pro with a M3 one by claiming "pros don't use shitty tools", as if MacBook Pros packing an Intel core 7/M1/M2 suddenly became shitty laptops just because Apple released a new one

I likely disagree with your position, and believe you have made some bad faith arguments, but you're at least compos mentis.

> But if you spend some time learning about our industry

Whoops.

> you'll realize that not all development workflows are identical, and not all have the same bottlenecks, and for many tasks an Intel-powered Mac is not a bottleneck. Surely you can understand that, or aspire to understand that.

Would you mind restating what you believe my argument to be? Because this reads as a patronising non-sequitur to me, and I'm sure you're not intending for it to land that way.

(If you are pushed for time, I'll do it: nearly everyone spending thousands of dollars to upgrade their computer has what they consider to be a good reason for doing so, whether that reason be boosting their self-esteem by having the latest toy, or a mild performance boost in their day-to-day work. You may not find their interpretation of "a good reason" to be persuasive, but there are likely to be many areas of your personal spending which they would see as imprudent or rooted in tenuous reasons. This thread is full of people incapable of understanding the reasons others have for upgrading and making emphatic sweeping statements. Everyone is different. News at 11.)



Replying to this one since I think we reached max nesting. Regarding as to why somebody might not be in a hurry to upgrade a 2015 Mac to an M2:

https://www.cpu-monkey.com/en/compare_cpu-apple_m2_8_gpu-vs-...

To put it in fully objective terms, a lot of development tasks (for many people) are still dominated by single-core performance.

The M2 has roughly 2x the single-core performance, which is going to be absolutely awesome if you're spending a lot of time waiting for the CPU. But if that's not really a bottleneck, and the things you do already complete at a speed that doesn't disrupt your flow state or otherwise consume significant amounts of your day, the upgrade buys you much less.

I'm working (on my 2018 MBP) on some Python software that does science stuff. The single core perf delta between my CPU and the M2 is even smaller for a lot of tasks, more like 50% instead of 100%. And I'm not doing anything that would really benefit from more than 6 cores.

I'm currently planning an upgrade, but it's just not a pressing need as $2K-$3K is a significant investment for me at the moment.

    I can't imagine an F1 mechanic not taking an interest 
    in the latest marginally improved wrench
F1 teams have mandated cost caps. I'm not entirely sure if that includes tooling, but even if not, budgets are not infinite and there is a time cost required to research and acquire new tools. Time and money spent getting wrenches are time and money not spent elsewhere. So I would think there is a constant pressure (like in any business) to identify real bottlenecks, not just spend unlimited amounts of money on increased capabilities that may or may not have any bearing on actual performance. Presumably this is why a developer might choose a regular M2 or M3, but not necessarily the maxed-out M3 Max with 192GB of RAM and 8TB SSD for $10,000 or whatever (I know I'm exaggerating). Yes it's more performance, no it won't matter for many workloads.


There is no daylight between us on any of these points.

My position is not that there aren't good reasons to have not upgraded from a 2015 Mac, or that I'm having trouble imagining what they are, but rather that it's a reasonable question to ask of someone in this specific forum.

> F1 teams have mandated cost caps…

We're not really arguing the point here. OP was not trying to pass an exam about the specific details of how F1 teams operate.



     "There is absolutely nothing I can do with my M2 
     laptop that I cannot do well with my cheap old Intel laptop"
Well, I took that one in good faith and interpreted it to mean that the old Intel laptop was perfectly adequate for their personal needs.

The alternative interpretation, that they believed there was no objective difference in capability between Intel and Apple Silicon laptops, was so absurd I couldn't imagine anybody expressing it or believing it. I think I made the correct interpretation but it was definitely an extrapolation on my part and definitely fits the HN guideline of "assume best intentions."

To be clear, the Apple Silicon laptops certainly trounce the Intel MBPs and I think most developers will find them well worth the upgrade for most things -- I just didn't like the assertion that anybody still using an Intel Mac was equivalent to somebody riding the Tour de France in a Huffy.



> I took that one in good faith

I tried to, but found it hard given that OP also challenged people to provide "concrete reasons" to upgrade, and said things like "there is absolutely no reason". Everything OP says indicates to me that they actually meant this as evidence for their generalisation.

> The alternative interpretation, that they believed there was no objective difference in capability between Intel and Apple Silicon laptops, was so absurd I couldn't imagine anybody expressing it or believing it.

I agree it's a head scratcher… and yet here it is, before our very eyes, time and time again. I even recapitulated the argument in more reasonable terms ("I think what you meant to say is…"), but they seem resolute in their belief that there are no reasons to upgrade from a "late 2010s" MacBook to a new one.

> I just didn't like the assertion that anybody still using an Intel Mac was equivalent to somebody riding the Tour de France in a Huffy.

Heh, yeah that gave me pause too. I actually think that the example of the F1 mechanic slices the other way entirely: I can't imagine an F1 mechanic not taking an interest in the latest marginally improved wrench, given the narrow margins by which they succeed or fail in competition against other teams, and other mechanics.

You are right that many Intel machines are still highly capable. One could buy an Intel Mac Pro until earlier this year, for example.

But the trigger for Mr/Mrs/Mx "No difference between Intel Macs and the M-series" was another commenter benignly asking someone why they hadn't upgraded ("Any reason for not upgrading?") from a 2015 Intel MacBook Pro.

I said this elsewhere, but it seems like a fair question to ask someone on a computer/programming forum, particularly when the machine in question is close to EOL and has been blown away by a new technology. Don't get me wrong, if this was someone using a 2006 Core Duo in 2012, I'd think it was much of a muchness, but the M-series does change things somewhat.



This isn't Huffy bikes vs F1 racers. Unless your workload is heavily CPU bound. And even then it's probably more like a 20yo F1 car vs a new one.

We also live on a finite planet. And the energy savings for many desk jockeys are unlikely to be worth it for a few decades more, if one considers the literal tons of material and energy that go into manufacturing.



That's totally not my experience at all.


> And even then it's probably more like a 20yo F1 car vs a new one.

This thread is literally about the decision to buy an M3-based MacBook Pro to replace M2/M1/Intel MacBook Pros. We're talking about hardware launched in the past 4/3/2/1 years.

That's hardly "20yo" anything.

Also, you failed to provide any concrete, objective reason to buy a M3. None at all. Is it that hard to put together any argument to justify the move?



> That's hardly "20yo" anything.

My car comparison was trying to propose an alternative metaphor since comparing a top-of-the-line racing car to a child's bicycle struck me as absurdly out of proportion. Cars are generally maintained and kept in service longer than computers, so I picked 20y out of thin air.

> Also, you failed to provide any concrete, objective reason to buy a M3. None at all. Is it that hard to put together any argument to justify the move?

My point is that for most people there is no justification to move. Unless one has a device beyond repair, one so old its software cannot be kept up to date, or the very rare need for the latest performance, stick with what you have.



I've been holding off on upgrading some older Intel Mac minis I have while waiting for the memory situation to improve, but so far it hasn't.

Ideally, I'd consolidate these older systems into one new Mac mini, or even a Mac Studio.

I'd like at least 64 GB of memory, at a reasonable price.

The latest Mac mini maxes out at only 32 GB of RAM, if I'm remembering it right.

I think the latest low-end Mac Studio could be upgraded to 64 GB of memory, but the last time I priced it, this upgrade cost more than I'd been expecting. It also put the overall cost above what I'd prefer to pay.

While I'd like to keep using a Mac, it's looking more and more like I'd be better off just building a PC, where I could likely get comparable enough processing performance, but far more memory (and storage) at a lower cost.



> My semi-educated guess is that Apple did a whole bunch of user studies and realized there are a lot of people out there

Apple has telemetry from macOS. So they knew exactly what percentage of users are still on Intel Macs.

And it's lower-hanging fruit to go after them than to try to convince existing Windows users.



Telemetry doesn't answer the important "why" question.


> better ergonomics

The newer 16 inch MacBook Pros are half a pound heavier than the Intel one.



I'm weird, I sit on the floor on a cushion with my back against a wall. I have a folding table over my lap that the laptop sits on. The keyboard actually works unlike my old Intel ones with the crappy butterfly. I hardly travel these days, but throwing it in a backpack isn't the end of the world.

That said, I was actually thinking ergonomics in terms of performance of development. The thing is so fast that commands complete faster than I can deal with them. My IDE can keep up with me. I can run a ton of apps and it doesn't slow down or glitch. It doesn't get nearly as hot and there is rarely fan noise. The screen is higher quality. The speakers sound better. Magsafe is back! The button for my fingerprint works very well. No more stupid touch bar. Function keys!

I could keep going...



The honest reason is that there is no practical reason to upgrade. The computer works, and despite the various FUD you might read, the attack surface for external attacks is quite small for personal computers.

That said, if anyone would like to send me $4000, I will absolutely upgrade to a new 14" Macbook in a heartbeat.



Because hardware vendors don't provide security updates forever, and they refuse to open-source enough of their code that other people can do it for them.


They mentioned their coworker and most companies have upgrade policies. Just fill in a form every x years and you get a shiny new laptop (and depending on where you work, you’ll get to keep your old one as a gift)


> They mentioned their coworker and most companies have upgrade policies.

Upgrade policies aren't driven by new requirements or performance improvements. Some companies have mandatory hardware replacement policies which mostly serve to let their tech support staff standardize on a small number of devices. Getting an M3 MacBook Pro replacement just because your employer doesn't want to maintain an Intel MacBook Pro is hardly indicative that an M3 is worth spending money on, let alone worth replacing an M2 or even M1 MacBook Pro.



> Why is this sort of question always framed as […]

OP didn't frame it any way at all as far as I can tell, but either way it seems like an entirely reasonable question to ask of someone on a forum which is largely comprised of computer and programming enthusiasts who has not upgraded their daily driver for nearly a decade.

> as if spending money on the new shiny without any reason is normal or desirable?

Every single person who spends their money on "the new shiny" has a reason. You may not find the reason edifying, but that's irrelevant to your stated argument.

> Why would anyone waste their cash to replace something that works without having any compelling reason?

As you were doubtless aware when you specifically constructed a straw man argument predicated on an entirely false premise and laden with your own subjective judgements about "waste" and things that "work" and "compelling" reasons, nobody does this.

I suspect what you really mean is that you believe people upgrade their machines without what _you_ consider to be a good reason. You think people are too quick to upgrade when their machine isn't the very latest, or when it's got a dent, or when it's slowing down a little.

If you'd written what you really believe -- that people should not upgrade as rapidly as they do -- you'd probably have pulled on the thread for a further 0.02s and realised that everyone has different values and priorities, and you likely "waste money" in others' eyes across multiple line items of your annual budget. So it's terrific luck, really, that the internet's various competing interpretations of a "compelling reason" can't stop you from spending your money however you'd like.



> (..) it seems like an entirely reasonable question to ask of someone on a forum which is largely comprised of computer and programming enthusiasts who has not upgraded their daily driver for nearly a decade.

Are professionals expected to mindlessly throw money around at the new shiny without having absolutely no compelling reason to do so?

I think my post was rather straight-forward: people buy things only when they feel there is a clear upside to it. If you made that purchase 2 or 3 or 4 years ago, you need a very good reason to just throw it away and buy a new replacement. You need to at least make a valid case for it, otherwise you are just wasting your hard-earned money for nothing at all.

> Every single person who spends their money on "the new shiny" has a reason.

Why was OP framing that question on whether no reason was needed then, and instead people had to justify why weren't they wasting their money on the new shiny? Why is being new and shiny such a strong rationale that the onus of not buying is placed on not buying?

These are simple questions. In fact, all it would take is provide a single compelling reason why it would be a good idea to waste money on a M3 Macbook Pro when you already own a M2/M1 Macbook Pro, or even a late 2010s Macbook Pro. Hell, why on earth would you even waste money on a M3 Macbook Pro if you already have a M2 Macbook Air?

If you cannot answer this question, why would it be anything than absolutely foolish to pretend that people should justify not buying a M3?



> Are professionals expected to mindlessly throw money around at the new shiny without having absolutely no compelling reason to do so?

Once again you're loading an incredibly tawdry straw man argument here with your own inane value judgements. The only difference is that this time you've undermined your argument with a typo: it's otherwise as self-evidently vacuous as your original comment.

Just look at this epistemological nightmare you enumerated with apparent sincerity:

> why on earth would you even waste money on a M3 Macbook Pro if you already have a M2 Macbook Air? If you cannot answer this question, why would it be anything than absolutely foolish to pretend that people should justify not buying a M3?

Putting aside haplography (I guess if your argument is just begging the question a dozen times it gets hard to write coherently), it seems that you're literally incapable of considering that other people have fundamentally different values and priorities to you.

Read this sentence you wrote:

> In fact, all it would take is provide a single compelling reason why it would be a good idea to waste money on a M3 Macbook Pro when you already own a M2/M1 Macbook Pro, or even a late 2010s Macbook Pro

It is axiomatic that there can be no "compelling reason why it would be a good idea [sic]" to "waste" money on an M3 MacBook Pro. It's a waste of money, so there cannot be a good reason. What you presumably intend to write is: "I cannot think of a single compelling reason for a person to upgrade to an M3 MacBook Pro if they already own an M2, M1, or late-2010s MacBook Pro."

And that's it. You can't think of a reason. People in this thread have given you both examples of reasons to upgrade, and clear-eyed explanations of why your inability to suspend your disbelief in this area is not the incisive general argument you think it is.

Much of the work I personally do will be made significantly faster by upgrading from the M1 to the M3 Max, which I will upgrade to. I upgraded to the M1 from an Intel Core i9.

You might think that this is a compelling reason -- wanting one's work to be faster and more efficient. You might not. It doesn't matter. It's a good enough reason for me to upgrade, and that's the rub. Everyone has a reason to upgrade, you just disagree with how compelling those reasons are. And again, the great news for everyone else is that your handwringing serves only to make you seem enormously judgemental and narrow-minded. You remain free to spend your money as you wish.



It would depend on workload. On my old intel MacBook, I was looking at an hour or so to build, and it could only complete one build on the battery if that. Testing took a similarly absurd amount of time.

The M1 dropped both times by somewhere in the region of 30 minutes, could do multiple rounds on a single charge, and didn't make a tonne of noise while doing so.

The amount of time savings you get from the improved CPU perf is quantifiable, and you can assign a monetary amount to that time.

Now if your use case is not performance (cpu, battery, etc) limited then of course there's no reason to upgrade, ever really, but that would apply to any laptop or pc not just Macs.



Because capitalism, that’s why. Capitalists have convinced people it is a moral imperative to continue to spend constantly.


The mode of production doesn't affect the fact that you have to do production. If nobody's continually demanding laptops from the laptop maker, they will stop making laptops.

https://en.wikipedia.org/wiki/Paradox_of_thrift



If people don't need new laptops because their current ones already do everything they need, then reducing laptop production is good. Fewer resources and less pollution spent on things people don't need.


Not if it meant there are none when they do need a replacement.

Similarly, buying cheap used cars only works because someone else bought them new.



> sir this is a wendy's


I am sure that 16yo girl buying an iPhone is thoughtfully postulating about the juxtaposition of morality and capitalism.

And not because it's shiny, fun and lets her socialise with her friends.



> thoughtfully postulating about the juxtaposition of morality and capitalism

I don't know how you managed to read the comment you responded to as suggesting that.



Fiscal prudence?


To me, "personal daily driver" sounds like where you'd do online banking. A MacBook from 2015 can't run any OS newer than Big Sur, which is EOL right about now. And it sounds really imprudent to do online banking from an insecure device.


It should still be able to run an up to date web browser though, right?

If one is that concerned about someone exploiting an OS level security flaw to exfiltrate their online banking credentials (wildly unlikely), they should just be doing that stuff in a VM or similarly isolated environment anyways.



> It should still be able to run an up to date web browser though, right?

For a while, yes, but the browser being up-to-date doesn't make an EOL OS safe to expose to the Internet.

> If one is that concerned about someone exploiting an OS level security flaw to exfiltrate their online banking credentials (wildly unlikely), they should just be doing that stuff in a VM or similarly isolated environment anyways.

Just doing sensitive stuff in a VM isn't good protection at all, since a malicious host can trivially compromise the guest.



> A MacBook from 2015 can't run any OS newer than Big Sur

It can, Ubuntu runs just fine on it



You're right, I should have been more precise. But you still won't get security updates to firmware anymore that way.


If this is your personal threat model, I commend you on an exciting life well-lived that appears to entail sophisticated personal protection of the GPG keys and Bitcoin you need to run your business empire securely.


It can run the latest OS with the open core project


My 2015 MBP is supported in macOS Monterey.


The "Pro" makes a difference there. The Air and Pro from 2015 both got Monterey, but the regular MacBook from the same year didn't.


Many US bank websites have so few features I'm not even sure what hacking mine could get someone. They can transfer from my checking to my savings account?


The problem from another angle: I wouldn't trust anything made in the last decade for my airgap box.


I assure you, I take security quite seriously. The version of MacOS I'm using is nowhere near the top security risk.


x86-64 docker containers?

https://github.com/docker/roadmap/issues/384 is still open. :(



I don't know exactly why this bug is still open, but you can use x86-64 images on an ARM64 Mac:

https://docs.docker.com/desktop/release-notes/#4250

I have been using it for a few months (in beta) and it works great!
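For anyone who hasn't tried it yet, here's roughly what running an amd64 image on an Apple Silicon Mac looks like, assuming Docker Desktop 4.25+ with the Rosetta emulation option enabled in Settings:

```shell
# Pull and run an x86-64 image on an ARM64 Mac by forcing the platform.
# Inside the container, the architecture reports as x86_64, not aarch64.
docker run --rm --platform linux/amd64 alpine uname -m
```

You can also pin a whole image to x86-64 in a Dockerfile with `FROM --platform=linux/amd64 ...` if it has to stay on that architecture.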



Ah, just out of beta a few days ago. I'll try it out!


In case you are not already using orbstack, that might give you even more battery life (not affiliated, just a fan).


A container that is idle and doesn't serve any requests really only has memory in its footprint. There is no reason that tens or hundreds of processes not occupying CPU and I/O time would affect battery life in a significant manner.


Bro, where do you think the containers and scheduler run? There is a whole Linux underneath, running all the time.


Only one kernel and scheduler for all the containers, that doesn't have a lot to do if most of the processes are idle.

And I am not your bro.



Your whole argument is that containers will not occupy CPU or I/O, which is not true: you're running a full-fledged VM, not just a kernel.

And presumably the person running k8s is probably not running them to have them idle. They are technically knowledgeable enough to be aware of how many resources they consume, and to be interested in such a comparison.



A Linux VM with only Kubernetes/Docker and most workloads idling doesn't use a lot of resources except memory, probably less than the typical open browser tab full of unoptimized JS.

When you have Kubernetes on your laptop, it's to test your code alongside a set of other microservices functionally representative of a prod deployment. That doesn't mean your containers will have much load apart from your own occasional testing.



It shows the comparison right on the chart; what else do you need? But you should get the M3 most of the time if you want to use it for the longest period.


Out of curiosity, how much RAM do you have on your MacBook to run containers?


How much RAM do you want to give the containers on your MacBook?

I'm being facetious, but it's an unanswerable question.



However much you set on the slider, so not quite so unanswerable.


I mean the person I replied to asked the unanswerable question. How can we say how much memory they need in their computer 'for containers' without knowing where they want to set those sliders (and how many of them there are), and then it's not really worth asking, or it would be a question about runtime overhead or something.
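For what it's worth, you don't have to guess where the slider ended up: Docker can report how much memory its VM was actually given from the CLI (the value is in bytes):

```shell
# Show how much memory the Docker Desktop VM has been allocated.
docker info --format '{{.MemTotal}}'
```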


I think this is squarely aimed at people who are holding onto their Intel-based Macs with an iron grip. There are always people out there that don't ever want to move to another architecture. I saw it going from Motorola 68030s to PowerPC. I saw people not wanting to upgrade from PowerPC to Intel. Now we're still seeing the people who don't want to migrate to Apple Silicon. They may have legit reasons and what-not. But time is ticking.

So I think it's mostly aimed at the Intel hold-overs.



Intel ride or die here. My 2019 i9 mbp is trucking along still - and this time of year the heat helps keep the room hospitable.

Was looking towards M3 for a big leap, but apart from heat and power (I use my MBP plugged in 95% of the time) there still isn't that compelling a reason to deal with some of the issues (thunderbolt / multiple displays) for my use case.

At 4 grand (sterling!) for comparable spec to my intel mbp, I just can't bring myself to take a plunge.



Nobody has to upgrade. m68030 Macs still worked after the PowerPC transition. I used my Mac mini G4 for years after the Intel transition, and still use both an m68030 Mac and that G4 mini with NetBSD now.

Currently I'm running Sonoma on a 2010 MacBook Pro. I'd love an ARM-based Mac, but can't afford one yet. I'd have to disagree about the idea that "time is ticking"...



True. But I was just offering a possible reason why Apple was hitting that comparison so hard in the presentation.

