lukev
paulryanrogers
> to purchase a machine that feels like it really belongs to me
How true is this when the devices are increasingly hostile to user repair and upgrades? macOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
lukev
Of course I wish the hardware were somehow more open, but to a large extent, it's directly because of hardware-based privacy features.
If you allowed third-party components without restraint, there'd be no way to prevent someone from swapping in a compromised component.
Lock-in and planned obsolescence are also factors, and I'm glad the EU (and others) are pushing back on them. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
nkmskdmfodf
> to a large extent, it's directly because of hardware based privacy features.
First, this is 100% false. Second, security through obscurity is almost universally discouraged and considered bad practice.
orf
One of the most underrated macOS features is the screen sharing app - it’s great for seamless tech support with parents.
It works via your keychain and your contacts, and the recipient gets a little notification to allow you to view their screen.
That’s it - no downloads, no login, no 20 minutes getting a Remote Desktop screen share set up.
traceroute66
> I wish the hardware were somehow more open
Some of us are old enough to remember the era of the officially authorised Apple clones in the 90's.
Some of us worked in hardware repair roles at the time.
Some of us remember the sort of shit the third-party vendors used to sell as clones.
Some of us were very happy the day Apple called time on the authorised clone industry.
The tight-knit integration between Apple OS and Apple Hardware is a big part of what makes their platform so good. I'm not saying perfect. I'm just saying if you look at it honestly as someone who's used their kit alongside PCs for many decades, you can see the difference.
amelius
> My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
Yeah, but this is hacker news.
ezfe
You can buy most parts officially from Apple - I just bought a new set of keycaps to replace some on my MacBook Air. Couldn't do that 5 years ago.
You can install whatever OS you want on your computer - Asahi Linux is the only one that's done the work to support that.
You can disable the system lockdowns that "tighten the screws" you refer to and unlock most things back to how they used to be.
talldayo
> You can buy most parts officially from Apple
But very distinctly, not all. Apple deliberately makes customers buy more than what they need while refusing to sell board-level ICs or allow donor boards to be disassembled for parts. If a $0.03 Texas Instruments voltage controller melts on your Macbook, you have to buy and replace the whole $600 board if you want it working again. In Apple's eyes, third party repairs simply aren't viable and the waste is justified because it's "technically" repaired.
> You can install whatever OS you want on your computer
Just not your iPhone, iPad or Apple Watch. Because that would simply be a bridge too far - allowing real competition in a walled garden? Unheard of.
> You can disable the system lockdowns that "tighten the screws" you refer to and unlock most things back to how they used to be.
And watch as they break after regular system upgrades that force API regressions and new unjustified restrictions on your OS. Most importantly, none of this is a real option on Apple's business-critical products.
arzke
> How true is this when the devices are increasingly hostile to user repair and upgrades?
Not sure what you mean exactly by this, but to me their Self Service Repair program is a step in the right direction.
sqeaky
It was mandated by right-to-repair laws, it provides the absolute minimum, and they've attempted to price out people who want to do repairs. The only way it could be more hostile to users is by literally being illegal.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to Framework, or building your own PC, or even repairing a laptop from another OEM.
rad_gruchalski
What you see as hostile to repair, I see as not worth stealing. What you see as macOS dictating what you can run from where, I see as infiltration prevention.
justin66
They certainly are worth stealing. They get parted out and Apple's hostility towards making parts available means those stolen parts are worth more.
talldayo
What you see as anticompetitive payment processing on iOS, others may see as a friendly and harmless business model. HNers, be respectful when criticizing bigger companies like John Deere and Apple - it's important you don't hurt these companies' feelings and scare them off.
syndicatedjelly
> macOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure, have rules to make sure the store is high-quality - kinda like how Costco doesn't let anyone just put garbage on their shelves.
heavyset_go
> Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want.
Try sharing a binary that you built but didn't sign and notarize, and you'll see the problem.
It'll run on the machine that it was built on without a problem, the problems start when you move the binary to another machine.
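For concreteness, this is a sketch of the usual Developer ID flow that makes a standalone binary run cleanly on other machines. It assumes a paid Apple Developer account; the identity string and keychain profile name are placeholders:

```shell
# Sign with a Developer ID certificate and the hardened runtime
# (both are prerequisites for notarization).
codesign --sign "Developer ID Application: Example Corp (TEAMID1234)" \
    --options runtime --timestamp ./mytool

# The notary service takes archives, not bare executables.
zip mytool.zip mytool

# Upload to Apple and block until a verdict comes back.
xcrun notarytool submit mytool.zip --keychain-profile "notary-profile" --wait

# Check what Gatekeeper will say. Note that a bare Mach-O binary can't
# have the notarization ticket stapled to it, so the recipient's
# machine verifies the ticket online.
spctl --assess --type execute --verbose ./mytool
```

Without those steps, a copy downloaded on another Mac carries the quarantine attribute and Gatekeeper refuses to run it by default.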
superb_dev
Apple also left a very convenient hole in their boot loader to allow running another OS. Linux works pretty well these days
schaefer
* on M1 and M2 variants.
bogantech
* As long as you don't want to use any external displays
nsonha
Really? I got a bunch of errors upgrading my Arch-based Asahi and now Chromium doesn't work anymore. Oh, and no external display, or speakers.
jeffybefffy519
Considering you need an Apple ID to log into the hardware, I'd argue Apple gatekeeps that ownership pretty tightly.
lukev
This isn't true.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
ale42
It's optional and very easy to skip. Not like the requirement for a MS account on Windows 11, which is also skippable but not by the average user.
sroussey
I have the same problem with graphics cards (not upgradable—and cost more than the pc they are in!)
Same with server parts using HBM—won’t let me upgrade memory there either.
That said, the apple ssd situation is abysmal. At least with memory they have reasons.
nkmskdmfodf
> I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
You either have very low standards or very low understanding if you think a completely closed OS on top of completely closed hardware somehow means it 'really belongs' to you, or that your data/privacy is actually being respected.
eilefsen
"Completely closed OS" is not accurate. Apple releases a surprising amount of source code.
fsflover
The closed part has full control over your system, so the released code is useless for privacy/ownership.
asp_hornet
What's the alternative? Linux? Maybe OP likes that their OS doesn't crash when they close their laptop lid.
seandoe
Crash? I understand people's gripes with ui, hardware compatibility, etc, but stability? All my Linux machines have always been very stable.
WuxiFingerHold
It's not that bad anymore (e.g. with System76), but I understand the point.
I disagree with OP celebrating Apple as the least evil of the evils. Yes, there aren't many (if any) alternatives, but that doesn't make Apple great. It's just less shitty.
stouset
You hit the nail on the head. And it’s something virtually everyone else replying to you is completely missing.
Apple isn’t perfect. They’re not better at privacy than some absolutist position where you run Tails on RISC-V, only connect to services over Tor, host your own email, and run your own NAS.
But of all the consumer-focused hardware manufacturers and cloud services companies, they are the only ones even trying.
lavela
You miss the point. It's not that I exercise authority over my system in every detail all the time, but I want the ability to take that authority over the aspects that matter to me in a given circumstance.
siajdsioajd
They just have really good marketing. You fell for their pandering. If you really care about privacy use Linux. But Apple ain't it. Closed source and proprietary will never be safe from corporate greed.
bilbo0s
Linux doesn't give you privacy, guy.
If you're using the web, your privacy is about your browser and your ISP, not your OS.
At times, it's even about how you use your browser. No browser will save you from telling Google too much about yourself by using Gmail, viewing YouTube videos, and using Search. The AIs and algorithms collating all that information on the backend see right through "incognito" mode.
Telling people they can get security and privacy by using Linux, or windows, or mac just betrays a fundamental misunderstanding of the threat surface.
alexlll862
You missed the point completely. The problem with a user hostile closed OS like Windows is that they collect a lot of data from your computer even if you never open a web browser. You have no clue what they collect and what they do with the data
geysersam
If you're so focused on privacy why don't you just use Linux? With Linux you'll actually get real privacy and you'll really truly own the system.
Apple takes a 30% tax on all applications running on their mobile devices. Just let that sink in. We are so incredibly lucky that never happened to PC.
EthicalSimilar
As much as anyone can say otherwise, running Linux isn’t just a breeze. You will run into issues at some point, you will possibly have to make certain sacrifices regarding software or other choices. Yes it has gotten so much better over the past few years but I want my time spent on my work, not toying with the OS.
Another big selling point of Apple is the hardware. Their hardware and software are integrated so seamlessly. Things just work, and they work well. 99% of the time - there’s always edge cases.
There’s solutions to running Linux distros on some Apple hardware but again you have to make sacrifices.
jwells89
Even on the machines most well-supported by Linux, which are Intel x86 PCs with only integrated graphics and Intel wifi/bluetooth, there are still issues that need to be tinkered away like getting hardware-accelerated video decoding working in Firefox (important for keeping heat and power consumption down on laptops).
I keep around a Linux laptop and it's improved immensely in the past several years, but the experience still has rough edges to smooth out.
microkrat
I have used several distributions and daily-driven Linux for long periods (2-3 years at a time) since 2008. Even today multimedia apps have issues; these can be solved by going through online forums, but it's always a frustrating start. Usually software upgrades will re-introduce these issues and you'll need to follow the same steps again.
snoman
Which Linux?
amelius
> Private Cloud Compute
That's just security theater. As long as nobody can look inside their ICs, nobody knows what's really happening there.
theshrike79
Oh? https://www.theregister.com/2024/10/25/apple_private_cloud_c...
> "Today we’re making these resources publicly available to invite all security and privacy researchers – or anyone with interest and a technical curiosity – to learn more about PCC and perform their own independent verification of our claims."
https://security.apple.com/documentation/private-cloud-compu...
There are also a million dollars of bounties to be had if you hack it
amelius
They mean that security researchers can look at the code, not the hardware at the transistor level.
ants_everywhere
They've certainly engaged in a lot of privacy theater before. For example
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
https://www.wired.com/story/apple-differential-privacy-short...
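For readers unfamiliar with the jargon: the "privacy loss parameter" is the ε of differential privacy, and a larger ε means less noise and a weaker guarantee. A minimal sketch of the Laplace mechanism, with purely illustrative values (not Apple's actual parameters or implementation):

```python
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Add Laplace noise with scale sensitivity/epsilon to true_value."""
    scale = sensitivity / epsilon
    # The difference of two independent exponential samples is
    # Laplace-distributed with the same scale.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise

# Larger epsilon -> smaller noise scale -> more information leaked per report.
for eps in (0.5, 2.0, 8.0):
    scale = 1.0 / eps  # with sensitivity = 1
    print(f"epsilon={eps:4.1f}  noise scale={scale:.3f}")
```

The criticism in the article is essentially that Apple chose ε values larger than researchers consider acceptable, and that re-spending the budget daily compounds the leak over time.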
kalleboo
Does that mean you just don't bother encrypting any of your data, and just use unencrypted protocols? Since you can't inspect the ICs that are doing the work, encryption must all also be security theater.
IOT_Apprentice
Actually Apple has stated they are allowing security researchers to look at their infrastructure DIRECTLY.
saagarjha
They haven't done this.
amelius
That doesn't mean they get to know what happens inside the ICs.
Looking at a bunch of PCBs doesn't tell you much.
lukev
That's a fine bit of goalpost shifting. They state that they will make their _entire software stack_ for Private Cloud Compute public for research purposes.
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
davidczech
The first release set should be downloadable now for inspection. (It's binaries only, source is released for select components)
davidczech
That could be said of any device you own, ever.
victor106
I agree 100% with this.
Amongst all the big tech companies Apple is the closest you will get to if you want Privacy.
riazrizvi
The approach that the big platforms have of producing their own versions of very successful apps cannibalizes their partners. This focus on consumer privacy is Apple's killer competitive advantage in this particular area, IMO. If I felt they were mining me for my private business data I'd switch to Linux in a heartbeat. This is what keeps me off Adobe, Microsoft Office, Google's app suite, and apps like Notion as much as possible.
the_king
The single core performance looks really fast.
Chip | Geekbench Score (Process)
---- | ------------------------
M1 | 2,419 (5nm)
M2 | 2,658 (5nm)
M3 | 3,076 (3nm)
M4* | 3,810 (3nm)
In my experience, single-core CPU is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release. M4 benchmark source: https://browser.geekbench.com/v6/cpu/8171874
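A quick sanity check on the table's numbers, to see the generation-over-generation gains the press release glosses over:

```python
# Geekbench 6 single-core scores from the table above
scores = {"M1": 2419, "M2": 2658, "M3": 3076, "M4": 3810}

chips = list(scores)
for prev, cur in zip(chips, chips[1:]):
    gain = scores[cur] / scores[prev] - 1
    print(f"{prev} -> {cur}: +{gain:.1%}")

print(f"M1 -> M4: +{scores['M4'] / scores['M1'] - 1:.1%}")
```

Which works out to roughly +10%, +16%, and +24% per generation, and about +57% cumulative over the M1.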
toxicdevil
These numbers are misleading (as in not apples to apples comparison). M4 has a matrix multiply hardware extension which can accelerate code written (or compiled) specifically for this extension.
WithinReason
Technically it is Apples to Apples
grecy
So you're saying Apple added something to the design to make it do things faster...which is literally the definition of improvement.
When a company adds a supercharger to a car does it not count as faster?
When I add more solar panels to my roof does it not count as more power?
Surely doing this kind of thing is exactly what we want companies to be doing to make their products faster/better.
miahi
It will be faster only for code that uses/is optimized for that specific extension. And the examples you give are not really correct.
If you add a supercharger you will get more power, but if the car's transmission is not upgraded, you might just get some broken gears and shafts.
If you add more solar panels to your roof, you might exceed the inverter power, and the panels will not bring benefits.
It's true that you'll benefit from the changes above, but not by themselves - something else needs to change so you can benefit. In the case of the M4 and these extensions, the software needs to be changed, and it also needs a use case for these extensions.
chipdart
> So you're saying Apple added something to the design to make it do things faster...
I think the whole point is that microbenchmarks provide no data on overall performance. They just test a very specific and by no means common use case.
crest
Beware that some of the Geekbench numbers are the result of the benchmark suddenly gaining support for streaming SVE and SME just when Apple implements them.
I'm not doubting the numbers represent real peak throughput on M4. The timing just lines up suspiciously well. Also, they don't take advantage of fully SVE2-capable ARM cores to compare how much a full SVE2 implementation would help, especially at accelerating more algorithms than those that neatly map to streaming SVE and SME.
The single-core performance gains of M4 variants over their predecessors are distorted, because streaming SVE and SME are apparently implemented by combining what used to be the AMX units of four cores.
drunkenmagician
Processor roadmaps are multi-year plans; I doubt Apple aligned their roadmap to Geekbench supporting SVE. More likely a happy alignment of plan(et)s.
giobox
> I feel like Apple kind of buried this in their press release
The press release describes the single core performance as the fastest ever made, full stop:
"The M4 family features phenomenal single-threaded CPU performance with the world’s fastest CPU core"
The same statement is made repeatedly across most of the new M4 lineup marketing materials. I think that's enough to get the point across that it's a pretty quick machine.
the_king
Exactly my point. Saying something is the fastest ever is marketing code (at least to me) for minor improvement over the previous generation.
If you're 30% faster than the previous generation, I'd rather see that because my assumption is it's 5%.
zamadatix
The article has all of the "x times faster than M1" notes but the video shows graphs with the M3 whenever they do that and it is usually ~1.2x in the CPU on that. I think it's probably a smart move this page (and the video) focused so much on 2x or greater performance increases from the M1 generation. After all, so what if it's 20% faster than the M3? As in: how many customers that weren't already interested in just buying the latest thing before reading your marketing material are you going to convince to upgrade from the M3 just because the M4 is ~20% faster vs trying to convince M1 users to upgrade because it's over twice as fast.
bluSCALE4
Yeah, better than the glaring, 10x better than i7 Intel Mac. Like that's even a valid point of reference.
sfmike
I maybe don't understand, but isn't an Intel at 5.5 GHz faster in terms of bits processed than the 4.4 GHz M4? Wouldn't that be fastest, as more data can be processed?
zamadatix
GHz represents the number of cycles per second, not the number of bits actually processed per second. On different CPUs the same instruction can take a different number of cycles, a different number of the instruction can be in flight at the same time, a different number of several different instructions can be issued at once, a different amount of data can be pulled from cache to feed these instructions, a different quality of instruction reordering and branch prediction, and so on.
As an example a single thread on an x64 core of an old Pentium 4 661 @ 3.6 GHz benchmarks at 315 with PassMark while a single x64 core of a current 285k @ 5.7 GHz turbo benchmarks at 5195. Some of that also comes down to things like newer RAM to feed the CPU but the vast majority comes down to the CPU calculating more bits per clock cycle.
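The comparison can be made concrete: dividing benchmark score by clock speed gives a crude points-per-GHz figure - not literally "bits processed", but a rough proxy for work done per cycle:

```python
# (PassMark single-thread score, approx. clock in GHz) from the comment above
pentium4_661 = (315, 3.6)
core_ultra_285k = (5195, 5.7)

def points_per_ghz(score, ghz):
    return score / ghz

old = points_per_ghz(*pentium4_661)
new = points_per_ghz(*core_ultra_285k)
print(f"Pentium 4 661: {old:.0f} points/GHz")
print(f"285K:          {new:.0f} points/GHz")
print(f"ratio:         {new / old:.1f}x")
```

Even normalized for clock speed, the modern core does on the order of 10x more work per cycle - which is why a 5.5 GHz Intel chip isn't automatically faster than a 4.4 GHz M4.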
resource_waste
lol
After decades of Apple, people still believe them.
tomcam
I don't know much about modern Geekbench scores, but that chart seems to show that M1s are still pretty good? It appears that the M4 is only about 50% faster. Somehow I would expect more like a 100% improvement.
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
matthew-wegner
Some of it depends on which variant fits you best. But yeah, in general the M1 is still very good--if you hear of someone in your circle selling one for cheap because they're upgrading, nab it.
On the variants: An M1 Max is 10 CPU cores with 8 power and 2 efficiency cores.
M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, but it also has 50% more of them. Add in twice as many efficiency cores, that are also faster for less power, plus more memory bandwidth, and it snowballs together.
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
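The "snowballing" above can be roughed out with simple arithmetic. The 0.3 weight for an efficiency core relative to a performance core is a made-up illustration value, not a measured one:

```python
# Core counts from the comment: M1 Max = 8P + 2E, M4 Max = 12P + 4E,
# with each M4 P-core ~1.5x faster than an M1 P-core.
per_core_gain = 1.5
e_weight = 0.3  # assumed E-core throughput relative to a P-core

p_ratio = (12 * per_core_gain) / 8
print(f"P-core throughput alone: {p_ratio:.2f}x")  # 2.25x

m1_total = 8 + 2 * e_weight
m4_total = (12 + 4 * e_weight) * per_core_gain
print(f"rough multicore estimate: {m4_total / m1_total:.2f}x")
```

Per-core gain and core-count gain multiply rather than add, which is why the multicore delta outruns the single-core delta.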
Aurornis
> it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move,
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
mjlee
Apple claim up to 1.8x in the press release. They're cherry picking so 50% in a benchmark seems about right.
tomcam
Appreciate the sanity check.
giantrobot
The M1 was pretty fast when it debuted. If you own an M1 Mac its CPU has not gotten any slower over the years. While newer M-series might be faster, the old one is no slower.
The M1s are likely to remain pretty usable machines for a few years yet, assuming your workload has not or does not significantly change.
herpderperator
No CPU gets slow after any amount of years. That's not how it works. I think what you're trying to say is: Software gets more resource-intensive.
the_king
That's only single core. I think Logic is pretty optimized to use multiple cores (Apple demoed it on the 20 core Xeon Mac Pro back in 2019).
But if the M1 isn't the bottleneck, no reason to upgrade.
tomcam
Very good to know, thanks.
KptMarchewa
Those kinds of improvements are a thing of the past. Modern hardware is just too good.
q7xvh97o2pDhNrh
Absolutely incredible to see Apple pushing performance like this.
I can't wait to buy one and finally be able to open more than 20 Chrome tabs.
KptMarchewa
For a year, maybe. Web apps will use the opportunity and get slow enough that you'll get the old experience on new hardware soon.
KeplerBoy
I wonder how reliable Geekbench tests are. AFAIK it's the most common benchmark run on Apple devices, so Apple has a great interest in making sure their newest chips perform great on the test.
I wouldn't be surprised to hear that the Geekbench developers are heavily supported by Apple's own performance engineers and that testing might not be as objective or indicative of real-world perf as one would hope.
zamadatix
Wouldn't it be more surprising to you if Apple had been selling 4 generations of the M series to great acclaim on the performance, but it all turned out to be a smoke-and-mirrors show because the hardware is optimized for one particular benchmark they didn't even reference in their comparisons?
From the data side: the M4 hasn't made it to the charts yet, but the M3 already holds 4th place in PassMark's single-thread chart (https://www.cpubenchmark.net/singleThread.html) and also tops Cinebench 2024 (https://www.cpu-monkey.com/en/cpu_benchmark-cinebench_2024_s...).
The only areas the M series "lags" is in the high end workstation/server segment where they don't really have a 96+ core option or in spaces where you pop in beefy high end GPUs. Everything else the M4 tends to lead in right now.
KeplerBoy
Okay, the linked benchmarks proved me wrong. I trust cinebench numbers a whole lot more than a geekbench score.
My bad.
And just to be clear: I didn't speculate that Apple tunes its chips to Geekbench, I speculated that Geekbench was overly optimized towards Apple's latest chip.
chipdart
> (...) but it all turned out to be a smoke and mirrors show because the hardware is optimized for one particular benchmark they didn't even reference in their comparisons?
I know iOS developers who recently upgraded their MacBooks and they claim they now feel more sluggish. I wouldn't be surprised if it was due to RAM constraints instead of CPU though.
So, take those artificial benchmarks with a grain of salt. They are so optimized that they are optimized out of the real world.
seec
It's kinda true but also not at all. In general, having better single-thread performance means a more reactive UI and a snappier feel, because blocking operations get executed more quickly. On the other hand, a lot of modern software has been extremely optimized for multi-threading, and not all categories of software benefit that much from a faster UI thread. If parallelization isn't too expensive, throwing more cores at something can make it actually faster.
And the big thing you leave out is that it all depends on how well the software is optimized, how many animations it uses, and things like that. My iPhone has MUCH better single-thread performance than my old PC, yet it feels much slower for almost everything.
And this is exactly how I feel about Apple Silicon Macs. On paper, impressive performance. In actual practice it doesn't feel that fast.
choilive
They also explicitly called it out in their announcement videos that the M4 has the fastest CPU cores on the market.
jcmontx
> "up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro"
I insist my 2020 Macbook M1 was the best purchase I ever made
AdamJacobMuller
Yep.
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade, except the increase in memory is great - I don't want to have to shut down apps to be able to load some huge LLMs - and I dinged the top case a few months ago, and now there's a shadow on the screen in that spot in some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
jader201
> I've never kept any laptop as long as I've kept the M1.
My 2015 MBP would like to have a word.
It’s the only laptop purchase I’ve made. I still use it to this day, though not as regularly.
I will likely get a new MBP one of these days.
qubitcoder
You'll be glad you did. I loved my 2015 MBP. I even drove 3 hours to the nearest Best Buy to snag one. That display was glorious. A fantastic machine. I eventually gave it to my sister, who continued using it until a few years ago. The battery was gone, but it still worked great.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
ptmcc
My 2015 15" MBP is also still kickin, is/was an absolutely fabulous unit. Was my work machine for 3-4 years, and now another almost-6 years as my personal laptop. My personal use case is obviously not very demanding but it's only now starting to really show its age.
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
The longevity of Macbooks is insanely good.
0wis
If we are going this way… I still use a mid-2012 MBP as my main workstation.
Last one with upgrade capabilities, now it has two fast SSDs and maximum Ram. I changed the battery once.
Only shame is that it doesn’t get major MacOS upgrades anymore.
Still good enough to browse the web, do office productivity and web development.
12 years of good use, I am not sure I can get so much value anywhere now
oceanplexian
I still have my 2015, and it lived just long enough to keep me going until the death of the touch bar and horrible keyboard, which went away when I immediately bought the M1 Pro on release day.
grahamj
I loved my 2015 MBP, probably the best machine Apple made, overall, until arguably the 2019 16" (read: after the butterfly keyboard debacle)
Traded it for an M1 Air in 2021 and was astonished at how much faster it was. It even blew away my 2019 16" from work.
You're going to be even more blown away!
chrisweekly
My wife still uses my 2012 MBP 15 retina as her daily driver. The battery's terrible but everything else works fine.
alfiedotwtf
The 2015 MacBook Pro is the Nokia 3310 of our generation.
JohnBooty
My 2015 MBP would probably have been totally fine for development... except for the Docker-based workflows that everybody uses now.
Rebuilding a bunch of Docker images on an older intel mac is quite the slow experience if you're doing it multiple times per day.
touristtam
How do you justify this kind of recurring purchase, even with selling your old device? I don't get the behaviour or the driving decision factor past the obvious "I need the latest shiny toy" (I can't find the exact words to describe it, so apologies for the reductive description).
I have either assembled my own desktop computers or purchased ex corporate Lenovo over the years with a mix of Windows (for gaming obviously) and Linux and only recently (4 years ago) been given a MBP by work as they (IT) cannot manage Linux machines like they do with MacOS and Windows.
I have moved from an Intel i5 MBP to an M3 Pro (?) and it makes me want to throw away the dependable ThinkPad/Fedora machine I still use for personal projects.
AdamJacobMuller
It's really very easy, honestly.
My laptop is my work life and my personal life.
I spend easily 100 hours a week using it not-as-balanced-as-it-should-be between the two.
I don't buy them because I need something new, I buy them because in the G4/Intel era, the iterations were massive and even a 20 or 30% increase in speed (which could be memory, CPU, disk -- they all make things faster) results in me being more productive. It's worth it for me to upgrade immediately when apple releases something new, as long as I have issues with my current device and the upgrade is enough of a delta.
M1 -> M2 wasn't much of a delta and my M1 was fine. M1 -> M3 was a decent delta, but, my M1 was still fine. M1 -> M4 is a huge delta (almost double) and my screen is dented to where it's annoying to sit outside and use the laptop (bright sun makes the defect worse), so, I'm upgrading. If I hadn't dented the screen the choice would be /a lot/ harder.
I love ThinkPads too. Really can take a beating and keep on going. The post-IBM era ones are even better in some regards too. I keep one around running Debian for Linux-emergencies.
szundi
There are two things I've always spent money on if I felt what I had wasn't close to the best achievable: my bed and my laptop. Even my phone can be a 4-year-old iPhone, but the laptop must be the best and fast. My sleep is also pretty important. Everything else is just "eco".
szundi
In my country you can buy a device and write it off in 2 years, with the VAT reimbursed, then scrap it from the books and sell it tax-free to people who would otherwise pay a pretty hefty VAT. This cuts your loss of value roughly in half.
undefined
nwhnwh
Consuming... for some people, is done for its own sake.
qubitcoder
Apple has a pretty good trade-in program. If you have an Apple card, it's even better (e.g. the trade-in value is deducted immediately, zero interest, etc.).
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box; you seal up the old device and drop it off at UPS.
I also build my desktop computers with a mix of Windows and Linux. But those are upgraded over the years, not regularly.
LeafItAlone
>I've never kept any laptop as long as I've kept the M1
What different lives we live. The first M1 came out in November 2020, not even four years ago. I've never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades.)
AdamJacobMuller
My work computer is my personal computer. I easily spend 100+ hours a week using it.
lqet
> I've never kept any laptop as long as I've kept the M1.
I still have a running Thinkpad R60 from 2007, a running Thinkpad T510 from 2012, and a modified running Thinkpad X61 (which I rebuilt as an X62 using the kit from 51nb in 2017, with an i7-5600U processor, 32 GB of RAM and a new display) in regular use. The latter required new batteries every 2 years, but was my main machine until 2 weeks ago, when I replaced it with a ThinkCentre. During its time as my main machine, each of these laptops was actively used around 100 hours per week, and was often running for weeks without shutdown or reboot. The only thing that ever broke was the display of the R60, which started to show several green vertical bars after 6 years, but replacement was easy.
garyrob
"I've had this laptop since release in 2020 and I have nearly 0 complaints with it."
Me too. Only one complaint. After I accidentally spilled a cup of water into it on an airplane, it didn't work.
(However AppleCare fixed it for $300 and I had a very recent backup. :) )
samtheprogram
If you don’t have AppleCare, it costs $1400+. M2 Pro here that I’m waiting to fix or upgrade because of that.
What’s more annoying is that I’d just get a new one and recycle this one, but the SSD is soldered on. Good on you for having a backup.
Do not own a Mac unless you bought it used or have AppleCare.
flemhans
Mine fell off from the roof of a moving car at highway speeds and subsequently spent 30 mins being run over by cars until it was picked back up. Otherwise no complaints.
halfmatthalfcat
My 2019 i9 going strong as ever. With 64gb ram, really don’t need to upgrade for at least a couple more years.
AdamJacobMuller
I had the 2019 i9. The power difference and the cooling difference is astounding from the 2019 to the M1 (and the M1 is faster).
I actually use my laptop on my lap commonly and I think the i9 was going to sterilize me.
KptMarchewa
That was the worst laptop I've ever had. Not only did it turn the jet engines on when you tried to do anything more demanding than moving the mouse around, it throttled thermally so much that you literally could not move the mouse around.
shade
I have the OG 13" MBP M1, and it's been great; I only have two real reasons I'm considering jumping to the 14" MBP M4 Pro finally:
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how DisplayLink works. :(
Not quite sold yet, but definitely thinking about it.
bombcar
The M1 Max supports more than one external display natively, which is also an option.
KptMarchewa
I don't think it's niche. It's the reason why I and multiple of my coworkers waited till M1 Pro with buying one.
I'm definitely still happy with it, but job offers upgrade to M4 so... why not?
stetrain
Yep. That's roughly 20% per generation improvement which ain't half-bad these days, but the really huge cliff was going from Intel to the M1 generation.
M1 series machines are going to be fine for years to come.
Cthulhu_
It feels like the M1 was the revolution and subsequent ones evolution: smaller fabrication process for improved energy efficiency, more cores for more power, higher memory (storage?) bandwidth, more displays (that was a major and valid criticism of the M1, even though in practice more than one external screen is a relatively rare use case for <5% of users).
Actually, wasn't the M1 itself an evolution/upscale of their A-series CPUs, which by now they've been working on since before 2010? The iPhone 4 was the first one with their own CPU, although the design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom-designed by Apple.
drewbitt
And my 2020 Intel Macbook Air was a bad purchase. Cruelly, the Intel and M1 Macbook Air released within 6 months of each other.
rconti
In early 2020, I had an aging 2011 Air that was still struggling after a battery replacement. Even though I "knew" the Apple Silicon chips would be better, I figured a 2020 Intel Air would last me a long time anyway, since my computing needs from that device are light, and who knew how many years the Apple Silicon transition would take anyway?
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
cantsingh
Exact same boat here. A friend and I both bought the 2020 Intel MBA thinking that the M1 version was at least a year out. It dropped a few months later. I immediately resold my Intel MBA seeing the writing on the wall and bought a launch M1 (which I still use to this day). Ended up losing $200 on that mis-step, but no way the Intel version would still get me through the day.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
chrizel
Oh yes, my wife bought a new Intel MBA in summer 2020... I told her at the time that Apple was planning its own chip, but that it couldn't be much better than the Intel one, and surely Apple would increase prices too... I was so wrong.
ElCapitanMarkla
Yeah I’m in the same boat. I had my old mid 2013 Air for 7 years before I pulled the trigger on that too. I’ll be grabbing myself an M4 Pro this time
JohnBooty
Amen. I got a crazy deal on a brand new 2020 M1 Max MBP with 64GB/2TB in 2023.
This is the best machine I have ever owned. It is so completely perfect in every way. I can't imagine replacing it for many many years.
markus_zhang
Congratulations, just curious what is the deal?
giik
At the end of 2023, B&H Photo Video was selling the M1 Max 16" 64GB/2TB for $2,499. It's the lowest I've ever seen it anywhere, and I got one myself.
undefined
leokennis
I still use my MacBook Air M1 and given my current workloads (a bit of web development, general home office use and occasional video editing and encoding) I doubt I’ll need to replace it in the coming 5 years. That’ll be an almost 10 year lifespan.
_19qg
It's a very robust and capable small laptop. I'm typing this on an M1 Macbook Air.
The only thing to keep in mind is that the M1 was the first CPU in the transition from Intel CPUs (+ AMD GPUs) to Apple Silicon. The M1 was still missing a bunch of things from the earlier machines, which Apple added over time via the M1 Pro and other chips. In particular, the graphics part was sufficient for a small laptop, but not for much beyond that. Better GPUs and media engines were developed later. Today, the M3 in a Macbook Air or the M4 in the Macbook Pro has all of that.
For me the biggest surprise was how well the M1 Macbook Air actually worked. Apple did an outstanding job in the software & hardware transition.
slmjkdbtl
I switched from a 2014 MacBook Pro to a 2020 M1 MacBook Air. Yeah, the CPU is much faster, but the build quality and software are a huge step backwards. The trackpad feels fake and not nearly as responsive, and the keyboard also doesn't feel as solid. But now I'm already used to it.
seec
They also feel very bulky/inelegant while still being fragile for the most part, and not really hitting workstation-level territory.
I don't understand how people are enamored with those things. Sure, it's better in some ways than what came before, but it's also very compromised for the price.
BrentOzar
The M4 Max goes up to 128GB RAM, and "over half a terabyte per second of unified memory bandwidth" - LLM users rejoice.
manaskarekar
The M3 Max was 400GBps, this is 540GBps. Truly an outstanding case for unified memory. DDR5 doesn't come anywhere near.
Rohansi
Apple is using LPDDR5 for M3. The bandwidth doesn't come from unified memory - it comes from using many channels. You could get the same bandwidth or more with normal DDR5 modules if you could use 8 or more channels, but in the PC space you don't usually see more than 2 or 4 channels (only common for servers).
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
rbanffy
> (only common for servers).
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now is it showing signs of age (as in, not being faster than the average laptop).
astrange
> In fact, every computer without a discrete GPU uses a unified memory model these days!
On PCs some other hardware (notably the SSD) comes with its own memory. But here it's shared with the main DRAM too.
This is not necessarily a performance improvement, it can avoid copies but also means less is available to the CPU.
binary132
I read all that marketing stuff and my brain just sees APU. I guess at some level, that’s just marketing stuff too, but it’s not a new idea.
manaskarekar
Yes, it's just easier to call it that without having to sprinkle asterisks at each mention of it :)
And yes, the impressive part is that this kind of bandwidth is hard to get on laptops. I suppose I should have been a bit more specific in my remark.
throwaway48476
High end servers now have 12 ddr5 channels.
oDot
Isn't unified memory* a crucial part in avoiding signal integrity problems?
Servers do have many channels but they run relatively slower memory
* Specifically, it being on-die
sunshowers
Yeah memory bandwidth is one of the really unfortunate things about the consumer stuff. Even the 9950x/7950x, which are comfortably workstation-level in terms of compute, are bound by their 2 channel limits. The other day I was pricing out a basic Threadripper setup with a 7960x (not just for this reason but also for more PCIe lanes), and it would cost around $3000 -- somewhat out of my budget.
This is one of the reasons the "3D vcache" stuff with the giant L3 cache is so effective.
Tepix
For comparison, a Threadripper Pro 5000 workstation with 8x DDR4 3200 has 204.8GB/s of memory bandwidth. The Threadripper Pro 7000 with DDR5-5200 can achieve 325GB/s.
And no, manaskarekar, the M4 Max does 546 GB/s, not Gbps (which would be 8x less!).
metadat
I was curious so I looked it up:
https://en.wikipedia.org/wiki/DDR5_SDRAM (info from the first section):
> DDR5 is capable of 8GT/s which translates to 64 GB/s (8 gigatransfers/second * 64-bit width / 8 bits/byte = 64 GB/s) of bandwidth per DIMM.
So, for example, if you have a server with 16 DDR5 DIMMs (sticks), that equates to 1,024 GB/s of total bandwidth.
DDR4 clocks in at 3.2GT/s and the fastest DDR3 at 2.1GT/s.
DDR5 is an impressive jump. HBM is totally bonkers at 128GB/s per stack (HBM is the memory used in the top-end Nvidia datacenter cards).
Cheers.
reliabilityguy
> So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
Not quite; it depends on the number of channels, not the number of DIMMs. An extreme example: put all 16 DIMMs on a single channel and you will get the performance of a single channel.
sroussey
Yes, and wouldn’t it be bonkers if the M4 Max supported HBM on desktops?
jsheard
It's not the memory being unified that makes it fast, it's the combination of the memory bus being extremely wide and the memory being extremely close to the processor. It's the same principle that discrete GPUs or server CPUs with onboard HBM memory use to make their non-unified memory go ultra fast.
smith7018
I thought “unified memory” was just a marketing term for the memory being extremely close to the processor?
vid
It's not "DDR5" on its own, it's a few factors.
Bandwidth (GB/s) = (Data Rate (MT/s) * Channel Width (bits) * Number of Channels) / 8 / 1000
(8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
This is still half the speed of a consumer Nvidia card, but the large amount of memory is great, if you don't mind running things more slowly and with fewer libraries.
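As a sanity check, the formula above is easy to wrap in a small Python helper (a sketch; the 8800 MT/s, 64-bit, 8-channel figures are just the ones from the example above, not any particular shipping part):

```python
def mem_bandwidth_gb_s(data_rate_mts: float, channel_bits: int, num_channels: int) -> float:
    """Peak bandwidth in GB/s: transfers/s * bits per transfer / 8 bits per byte."""
    return data_rate_mts * channel_bits * num_channels / 8 / 1000

# The worked example from the comment above:
print(mem_bandwidth_gb_s(8800, 64, 8))  # -> 563.2
```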
wtallis
> (8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
sliken
Fewer libraries? Any that a normal LLM user would care about? Pytorch, ollama, and others seem to have the normal use cases covered. Whenever I hear about a new LLM seems like the next post is some mac user reporting the token/sec. Often about 5 tokens/sec for 70B models which seems reasonable for a single user.
cjbprime
Right, the nvidia card maxes out at 24GB.
manaskarekar
Thanks, but just to put things into perspective: this calculation counts 8 channels, which is 4 DIMMs, and that's mostly desktops (not dismissing desktops, just highlighting that it's a different beast).
Most laptops will have 2 DIMMs (probably soldered).
Y-bar
> This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
garciasn
We run our LLM workloads on a M2 Ultra because of this. 2x the VRAM; one-time cost at $5350 was the same as, at the time, 1 month of 80GB VRAM GPU in GCP. Works well for us.
alfonsodev
Can you elaborate, are those workflows in queue or can they serve multiple users in parallel ?
I think it’s super interesting to know real life workflows and performance of different LLMs and hardware, in case you can direct me to other resources. Thanks !
garciasn
Our use case is atypical, based on what others seem to require. While we serve multiple requests in parallel, our workloads are not 'chat'.
bushbaba
About 10-20% of my company's GPU usage is inference dev. Yes, a horribly inefficient use of resources. We could upgrade the 100-ish devs who do this work to M4 MBPs and free up GPU resources.
Smart move by Apple.
manaskarekar
If the 2x multiplier holds up, the Ultra update should bring it up to 1080GBps. Amazing.
SirMaster
There isn't even an M3 Ultra. Will there be an M4 Ultra?
charlescurt123
Comparing a laptop to an A100 (312 teraFLOPS) or H100 (~1 petaFLOPS) server is a stretch, to say the least.
An M2 is, according to a Reddit post, around 27 TFLOPS.
So less than 1/10 the raw compute, let alone the memory.
What workflow would use something like this?
hajile
They aren't going to be using fp32 for inferencing, so those FP numbers are meaningless.
Memory and memory bandwidth matter most for inferencing. 819.2 GB/s for the M2 Ultra is less than half that of an A100, but having 192GB of RAM instead of 80GB means they can run inference on models that would require THREE of those A100s, and the only real cost is that it takes longer for the AI to respond.
Three A100s at $5,300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
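The arithmetic in this subthread checks out; a quick sketch using the numbers quoted in the comments above (the $5,300/mo A100 rental and $5,350 M2 Ultra prices are the thread's figures, not official pricing):

```python
# Figures quoted in this thread (assumptions, not official pricing):
a100_rent_per_month = 5300   # USD per rented 80GB A100
m2_ultra_price = 5350        # USD, one-time purchase

# Three A100s rented for two years:
rental_total = 3 * a100_rent_per_month * 24
print(rental_total)                    # -> 381600, i.e. "over $380,000"

# How many Ultra machines that money buys outright:
print(rental_total // m2_ultra_price)  # -> 71, roughly the "72 machines" figure
```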
kristianp
> comparing a laptop
Small correction: the M2 Ultra isn't found in laptops, its in the Studio.
Der_Einzige
Right now, there are H100 80GB instances you can rent for $0.90 per hour.
losvedir
I'm curious about getting one of these to run LLM models locally, but I don't understand the cost benefit very well. Even 128GB can't run, like, a state of the art Claude 3.5 or GPT 4o model right? Conversely, even 16GB can (I think?) run a smaller, quantized Llama model. What's the sweet spot for running a capable model locally (and likely future local-scale models)?
brandall10
You'll be able to run 72B models w/ large context, lightly quantized with decent'ish performance, like 20-25 tok/sec. The best of the bunch are maybe 90% of a Claude 3.5.
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
SkyMarshal
Lots of discussion and testing of that over on https://www.reddit.com/r/LocalLLaMA/, worth following if you're not already.
bufferoverflow
Claude 3.5 and GPT 4o are huge models. They don't run on consumer hardware.
moffkalast
Well it's more like pick your poison, cause all options have caveats:
- Apple: all the capacity and bandwidth, but no compute to utilize it
- AMD/Nvidia: all the compute and bandwidth, but no capacity to load anything
- DDR5: all the capacity, but no compute or bandwidth (cheap tho)
Dibby053
Why was this downvoted?
moffkalast
To quote an old meme, "They hated Jesus because he told them the truth."
jjcm
For context, the 4090 has 1,008 GB/s of bandwidth.
spacedcowboy
... but only 1/4 of the actual memory, right ?
The M4-Max I just ordered comes with 128GB of RAM.
Inviz
I have M3 Max with 128GB of ram, it's really liberating.
sfn42
I have 32gb and I've never felt like I needed more.
umanwizard
Having 128GB is really nice if you want to regularly run different full OSes as VMs simultaneously (and if those OSes might in turn have memory-intensive workloads running on them).
Somewhat niche case, I know.
moffkalast
Obviously you're not a golfer.
shiroiushi
No one needs more than 640kB.
thimabi
At least in the recent past, a hindrance was that MacOS limited how much of that unified memory could be assigned as VRAM. Those who wanted to exceed the limits had to tinker with kernel settings.
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
culi
you'd probably save money just paying for a VPS. And you wouldn't cook your personal laptop as fast. Not that people nowadays keep their electronics for long enough for that to matter :/
flkiwi
The weird thing about these Apple product videos in the last few years is that there are all these beautiful shots of Apple's campus with nobody there other than the presenter. It's a beautiful stage for these videos, but it's eerie and disconcerting, particularly given Apple's RTO approach.
brailsafe
Incidentally, when I passed through the hellscape that is Cupertino/San Jose a few years ago, I was a little shocked that as a visitor you can't even see the campus; it's literally a walled garden. When I was initially curious about the campus design during its construction, I assumed that at least one part, maybe the orchard, would be accessible to the public. Based on the surrounding urban development, though, the city isn't exactly interested in being livable.
reaperducer
I used to think the videos with all of the drone fly-bys were cool. But in the last year or so, I've started to feel the same as you. Where are all the people? It's starting to look like Apple spent a billion dollars building a technology ghost town.
Surely the entire staff can't be out rock climbing, surfing, eating at trendy Asian-inspired restaurants at twilight, and having catered children's birthday parties in immaculately manicured parks.
flkiwi
Oh I think they're very well done and very pretty! But lately this discomfort has started to creep in, as you note. Like something you'd see in a WALL-E spinoff: everyone has left the planet already but Buy n Large is still putting out these glorious promo videos using stock footage. Or, like, post-AI apocalypse, all the humans are confined to storage bins, but the proto-AI marketing programs are still churning out content.
astrange
The neighboring city charges $100k per newly constructed unit for park maintenance fees. So there actually are a lot of nice parks.
davidczech
I think it’s usually filmed on weekends
monocasa
You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
hammock
The Apple brand is - foundationally - pretty solitary.
Think about the early ipod ads, just individuals dancing to music by themselves. https://www.youtube.com/watch?v=_dSgBsCVpqo
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
Cthulhu_
If only in some shots. But they are such a valuable company that they simply cannot afford the risk of, e.g., criticism over the choice of people they display, or inappropriate outfits or behaviour. One blip from a shareholder can cost them billions in value, which pisses off other shareholders. All of their published media, from videos like this to their conferences, is highly polished, rehearsed, and designed by committee. Microsoft and Google are the same, although at least at Google there's still room for some comedy in some departments: https://youtu.be/EHqPrHTN1dU
filoleg
> You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
matrix87
> the concept of technology working for and with the people that use it
> liminal space horror
reminds me of that god awful crush commercial
theshrike79
Easier to track continuity between takes if you don't have people in the background.
hoherd
I interviewed there in 2017, and honestly even back then the interior of their campus was kind of creepy in some places. The conference rooms had this flat, bland beige that reminded me of exactly the kind of computers the G3 era was trying to get away from, except the size of a room, and you were inside it.
saagarjha
The Mac mini video from yesterday has employees: https://www.apple.com/105/media/us/mac-mini/2024/58e5921e-f4...
flkiwi
That by itself raises an interesting editorial question. Apple (like most big companies) doesn't do things randomly re: high impact public communications like this. I'm curious what made the Mac mini a product that merited showing people doing things, with a computer that is tied to one location, vs. a Macbook Pro's comparative emptiness, for a computer that can go anywhere and be with anyone. It could be as simple as fun vs focus.
a012
I imagine the Mac mini is really small now. If it could be powered via USB PD, I think it'd be no problem to put it in a backpack with a keyboard and trackpad and carry it between home and the office. I notice many people just bring their MBP to the office and plug it into a big screen. The downside is that you can't work anywhere like you can with an MBP, but the usability is mostly the same (to me).
kristianp
> MacBook Pro with M4 Pro is up to 3x faster than M1 Pro (13)
> (13) Testing conducted by Apple from August to October 2024 using preproduction 16-inch MacBook Pro systems with Apple M4 Pro, 14-core CPU, 20-core GPU, 48GB of RAM and 4TB SSD, and production 16-inch MacBook Pro systems with Apple M1 Pro, 10-core CPU, 16-core GPU, 32GB of RAM and 8TB SSD. Prerelease Redshift v2025.0.0 tested using a 29.2MB scene utilising hardware-accelerated ray tracing on systems with M4 Pro. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
So they're comparing software that uses raytracing present in the M3 and M4, but not in the M1. This is really misleading. The true performance increase for most workloads is likely to be around 15% over the M3. We'll have to wait for benchmarks from other websites to get a true picture of the differences. Edit: If you click on "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on pro:
14-inch MacBook Pro with M4 4.5x
14-inch MacBook Pro with M3 3.8x
13-inch MacBook Pro with M1 2.7x
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload, of course. No benchmarks yet, but this article gives some tables of comparative core counts, max RAM and RAM bandwidths: https://arstechnica.com/apple/2024/10/apples-m4-m4-pro-and-m...
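Those percentages fall directly out of Apple's quoted multipliers (a quick check; the 4.5x/3.8x/2.7x figures are the ones listed above, all relative to the same baseline):

```python
# Apple's quoted compile-speed multipliers relative to a common baseline.
m4_pro, m3_pro, m1_pro = 4.5, 3.8, 2.7

print(round((m4_pro / m1_pro - 1) * 100))  # -> 67 (% faster than M1 Pro)
print(round((m4_pro / m3_pro - 1) * 100))  # -> 18 (% faster than M3 Pro)
```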
LeifCarrotson
I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
jsheard
> I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
That's an architectural limitation of the base M4 chip, if you go up to the M4 Pro version you can get up to 48GB, and the M4 Max goes up to 128GB.
FireBeyond
The "base level" Max is limited at 36GB. You have to get the bigger Max to get more.
latortuga
The new mac mini also has an M4 Pro that goes up to 64GB.
redundantly
The M4 tops off at 32 GB
The M4 Pro goes up to 48 GB
The M4 Max can have up to 128 GB
ldoughty
It seems you need the M4 Max with the 40-core GPU to go over 36GB.
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for ~>36-48GB memory, here's the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but less GPU cores. Not sure how much those GPU cores factor in to performance, I only really hear complaints about the memory limits... Something to consider if looking to buy in this range of memory.
Of course, a lot of people here probably consider it no big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't get a big 6-figure salary :-)
brailsafe
Somehow I got downvoted for pointing this out, but it's weird that you have to spend an extra $800 USD just to be able to surpass 48GB, and "upgrading" to the base-level Max chip decreases your RAM limit, especially when the M4 Pro on the Mac mini goes up to 64GB. That's a shitload of cash to put out if you need more RAM but don't care about more cores. I was really hoping to finally upgrade to something with 64GB, or maybe 96 or 128 if it came down in price, but they removed the 96 and kept 64 and 128 severely out of reach.
Do I get 2 extra CPU cores, build a budget gaming PC, or subscribe to creative suite for 2.5 years!?
SparkyMcUnicorn
It doesn't look this cut and dry.
M4 Max 14 core has a single option of 36GB.
M4 Max 16 core lets you go up to 128GB.
So you can actually get more ram with the Pro than the base level Max.
undefined
post-it
I haven't done measurements on this, but my Macbook Pro feels much faster at swapping than any Linux or Windows device I've used. I've never used an M.2 SSD so maybe that would be comparable, but swapping is pretty much seamless. There's also some kind of memory compression going on according to Activity Monitor, not sure if that's normal on other OSes.
lynguist
No it's true.
Apple has hardware accelerated compressed swapping.
Windows has compressed swapping.
And Linux is a mess. You have to manually configure a non-resizable compressed zram, or use it without compression on a non-resizable swap partition.
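For reference, the manual zram setup being complained about looks something like this (a sketch; it needs root, and the device names and available compression algorithms vary by kernel and distro):

```shell
# Load the zram module and create one device
modprobe zram num_devices=1

# Pick a compression algorithm and a fixed size (set before first use)
echo zstd > /sys/block/zram0/comp_algorithm
echo 4G   > /sys/block/zram0/disksize

# Format and enable it as high-priority swap
mkswap /dev/zram0
swapon -p 100 /dev/zram0
```

Compare that with macOS and Windows, where compressed swap just works out of the box with no configuration.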
thimabi
Yes, other M.2 SSDs have comparable performance when swapping, and other operating systems compress memory, too — though I believe not as much as MacOS.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM so as not to need swapping at all.
post-it
Degrading the SSD is a good point. This is thankfully a work laptop so I don't care if it lives or dies, but it's something I'll have to consider when I eventually get my own Mac.
fckgw
On the standard M4 processor. If you move to the M4 Pro it tops out at 48GB, and the M4 Max goes up to 128GB.
41995701
Weird that the M4 Pro in the Mac mini can go up to 64GB. Maybe a size limitation on the MBP motherboard or SOC package?
_diyar
Probably just Apple designing the pricing ladder.
donavanm
It looks like different versions of the 'Pro' based on core count and memory bandwidth. I'm assuming the 12-core Mini M4 Pro has the same memory bandwidth/channels enabled as the 14-core MBP M4 Pro, enabling the 64GB. My guess would be that it's related to binning and/or TDP.
Tepix
The 96GB RAM option of the M3 Max disappeared.
Octoth0rpe
The max memory is dependent on which tier M4 chip you get. The M4 max chip will let you configure up to 128gb of ram
MaxDPS
It looks like the 14-core M4 Max only allows 36GB of RAM. The M4 Pro allows for up to 48GB. It's a bit confusing.
MaysonL
Interesting tidbit: MacBook Airs also now start at 16GB. Same price!
throw0101a
> All MacBook Pro models feature an HDMI port that supports up to 8K resolution, a SDXC card slot, a MagSafe 3 port for charging, and a headphone jack, along with support for Wi-Fi 6E and Bluetooth 5.3.
No Wifi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, improved OFDMA):
* https://en.wikipedia.org/wiki/Wi-Fi_7
* https://en.wikipedia.org/wiki/Wi-Fi_6E
The iPhone 16s do have Wifi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
cojo
I was quite surprised by this discrepancy as well (my new iPhone has 7, but the new MBP does not).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks WiFi 7. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have WiFi 7 access points). I download terabytes of data in some weeks for the work I do, and not having to Ethernet in at a fixed desk to do so efficiently would be a big enough win that I will wait another year before shelling out $6k “off-cycle”.
Big bummer for me. I was looking forward to performance gains next Friday.
pazimzadeh
they hold their value well so you could buy it this year and sell it next year when you buy the new one. you'd probably only lose ~$500
cojo
Good point! I hadn’t looked at how resale value holds up. Maybe I will do that after all… thanks for the suggestion!
404mm
The lack of WiFi 7 is a real bummer for me. I was hoping to ditch the 2.5GbE dongle and just use WiFi.
mort96
Hm why? Is 6E really so much worse than 7 in practice that 7 can replace wired for you but 6E can't? That's honestly really weird to me. What's the practical difference in latency, bandwidth or reliability you've experienced between 6E and 7?
404mm
I don’t have any 6E devices so I cannot really tell for sure, but from what I read, 6E gets you to a bit over 1Gbit in real-world scenarios. 7 should be able to replace my 2.5GbE dongle, or at least get much closer to it. I already have WiFi 7 Eero routers on a 2.5GbE wired backbone.
canucker2016
Yeah, I thought that was weird. None of the Apple announcements this week had WiFi7 support, just 6E.
https://www.tomsguide.com/face-off/wi-fi-6e-vs-wi-fi-7-whats...
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
ygouzerh
It looks like only a few people are using WiFi 7 for now. Maybe they are going to include it in the next generation when more people use it.
throw0101a
> It looks like only a few people are using WiFi 7 for now.
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
sroussey
Yeah, this threw me as well. When the iMac didn’t support WiFi 7, I got a bit worried. I have an M2, so I'm not going to get this, but my spouse needs a new Air, and I figured everything would have WiFi 7 by then. Now I don’t think so.
carstenhag
Faster is always nice, makes sense. But do you really need WiFi 7 features/speed? I don't know when I would notice a difference (on a laptop) between 600 or 1500 Mbit/s (just as an example). Can't download much anyhow as the storage will get full in minutes.
throw0101a
> But do you really need WiFi 7 features/speed?
One of the features is preamble puncturing, which is useful in denser environments:
* https://community.fs.com/article/how-preamble-puncturing-boo...
* https://www.ruckusnetworks.com/blog/2023/wi-fi-7-and-punctur...
MLO helps with resiliency and the improved OFDMA helps with spectrum efficiency as well. It's not just about speed.
iknowstuff
Call of Duty is 200GB
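To put rough numbers on the "do you really need it" question: a sketch of idealized transfer times, ignoring protocol overhead and real-world airtime contention:

```python
def transfer_minutes(size_gb, link_gbps):
    """Minutes to move size_gb gigabytes over an idealized link_gbps link."""
    return size_gb * 8 / link_gbps / 60

# The 600 vs 1500 Mbit example above, plus a 2.5GbE-class link,
# against a 200 GB game download.
for rate in (0.6, 1.5, 2.5):
    print(f"200 GB at {rate} Gbit/s: {transfer_minutes(200, rate):.0f} min")
```

At 0.6 Gbit/s that's about 44 minutes versus roughly 11 at 2.5 Gbit/s, which is where the "replace my wired dongle" argument comes from.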
RobinL
Can anyone comment on the viability of using an external SSD rather than upgrading storage? Specifically for data analysis (e.g. storing/analysing parquet files using Python/duckdb, or video editing using divinci resolve).
Also, any recommendations for suitable SSDs, ideally not too expensive? Thank you!
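One way to answer the viability question empirically is to benchmark the candidate drive yourself. A minimal stdlib-only sketch that times a sequential read on any mounted path (caveat: the OS page cache will inflate the result unless the file is much larger than RAM or caches are dropped first):

```python
import os
import time

def sequential_read_mb_s(mount_path, size_mb=256, chunk=1 << 20):
    """Write a scratch file on mount_path, then time a sequential read of it."""
    scratch = os.path.join(mount_path, "throughput_scratch.bin")
    block = os.urandom(chunk)
    with open(scratch, "wb") as f:
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force the writes to actually hit the drive
    start = time.perf_counter()
    with open(scratch, "rb") as f:
        while f.read(chunk):
            pass
    elapsed = time.perf_counter() - start
    os.remove(scratch)
    return size_mb / elapsed
```

For parquet/duckdb scans or Resolve footage, anything sustaining a couple of GB/s sequential (which the NVMe-in-enclosure replies below report) is unlikely to be your bottleneck.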
muro
Don't bother with Thunderbolt 4; go for a USB 4 enclosure instead. I've got a Jeyi one. Any SSD will work; I use a Samsung 990 Pro inside. It was supposed to be the fastest you can get, and I get over 3000MB/s.
Here is the rabbit hole you might want to check out: https://dancharblog.wordpress.com/2024/01/01/list-of-ssd-enc...
radicality
Though TB5 should be better. I think you can already find some of these on Aliexpress.
pier25
It's totally fine.
With a TB4 case with an NVMe drive you can get something like 2300MB/s read speeds. You can also use a USB4 case, which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
spopejoy
I'm a little sus of OWC these days: their drives are way expensive, never get any third-party reviews or testing, and their warranty is horrible (3 years). I previously swore by them, so it's a little disappointing.
pier25
The only OWC product I own is a TB4 dock and so far it has been rock solid.
trogdor
> Also, any recommendations for suitable ssds, ideally not too expensive?
I own a media production company. We use Sabrent Thunderbolt external NVMe TLC SSDs and are very happy with their price, quality, and performance.
I suggest you avoid QLC SSDs.
joshvm
Basically any good SSD manufacturer is fine, but I've found that the enclosure controller support is flaky with Sonoma. Drives that appear instantly in Linux sometimes take ages to enumerate in OSX, and only since upgrading to Sonoma. Stick with APFS if you're only using it for Mac stuff.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
rbanffy
The USB-C ports should be quite enough for that. If you are using a desktop Mac, such as an iMac, Mini, or the Studio and Pro that will be released later this week, this is a no-brainer - everything works perfectly.
rbanffy
Sorry. No Studios or Pros this turn. I’m as disappointed as everyone else.
thejazzman
I go with the Acasis Thunderbolt enclosure and then pop in an NVMe of your choice, but generic USB drives are pretty viable too... Thunderbolt can be booted from, while USB can't.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except thickness).
AlphaWeaver
I've used a Samsung T5 SSD as my CacheClip location in Resolve and it works decently well! Resolve doesn't always tolerate disconnects very well, but when it's plugged in things are very smooth.
radicality
Hopefully in the next days/weeks we’ll see TB5 external enclosures and you’ll be able to hit very fast speeds with the new Macs. I would wait for those before getting another enclosure now.
Afaik the main OEM producer is Winstars, though I could only find sketchy-looking AliExpress sellers so far.
tomrod
This is the first compelling Mac to me. I've used Macs for a few clients and muscle memory is very deeply ingrained for linux desktops. But with local LLMs finally on the verge of usability along with sufficient memory... I might need to make the jump!
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
aidenfoxivey
You totally can after a little bit of time waiting for M4 bringup!
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
p_j_w
Asahi doesn't work on M3 yet after a year. It's gonna be a bit before M4 support is here.
quux
IIRC one of the major factors holding back M3 support was the lack of a M3 mini for use in their CI environment. Now that there's an M4 mini hopefully there aren't any obstacles to them adding M4 support
umanwizard
"a little bit of time" is a bit disingenuous given that they haven't even started working on the M3.
(This isn't a dig on the Asahi project btw, I think it's great).
__MatrixMan__
I miss Linux, it respected me in ways that MacOS doesn't. But maintaining a sane dev environment on linux when my co-workers on MacOS are committing bash scripts that call brew... I am glad that I gave up that fight. And yeah, the hardware sure is nice.
tomrod
IIRC brew supports linux, but it isn't a package manager I pay attention to outside of some very basic needs. Way too much supply chain security domain to cover for it!
__MatrixMan__
It does, but I prefer to keep project dependencies bound to that project rather than installing them at wider scope. So I guess it's not that I can't use Linux for work, but that I can't use Linux for work and have it my way. And if I can't have it my way anyway, then I guess Apple's way will suffice.
BenFranklin100
Off topic, but I’m very interested in local LLMs. Could you point me in the right direction, both hardware specs and models?
doctoboggan
In general for local LLMs, the more memory the better. You will be able to fit larger models in RAM. The faster CPU will give you more tokens/second, but if you are just chatting with a human in the loop, most recent M series macs will be able to generate tokens faster than you can read them.
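If it helps with sizing, here's a rough rule of thumb as a sketch. The 4-bit default and the 1.2x overhead factor (KV cache, runtime buffers) are assumptions; actual needs vary with quantization and context length:

```python
def model_ram_gb(params_billion, bits_per_weight=4, overhead=1.2):
    """Approximate RAM to run a quantized model: weights plus ~20% overhead."""
    return params_billion * bits_per_weight / 8 * overhead

# A 4-bit 70B model wants roughly 42 GB; an 8B model fits in ~5 GB.
for size in (8, 70):
    print(f"{size}B @ 4-bit: ~{model_ram_gb(size):.0f} GB")
```

This is why 70B-class models are comfortable on the 64GB and 128GB configs but out of reach on a 36GB machine.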
int_19h
That also very much depends on model size. For 70B+ models, while the tok/s are still fast enough for realtime chat, it's not going to be generating faster than you can read it, even on Ultra with its insane memory bandwidth.
BenFranklin100
Thanks to both of you!
touristtam
Have a look at ollama? I think there is a vscode extension to hook into local LLM if you are so inclined: https://ollama.com/blog/continue-code-assistant
noman-land
Get as much RAM as you can stomach paying for.
w10-1
macOS virtualization of Linux is very fast and flexible. Their sample code shows it's easy without any kind of service/application: https://developer.apple.com/documentation/virtualization/run...
However, it doesn't support snapshots for Linux, so you need to power down each session.
darylteo
I've been lightly using ollama on the m1 max and 64gb RAM. Not a power user but enough for code completions.
lowbloodsugar
You can spin up a Unix OS. =) It’s even older than Linux.
umanwizard
NextSTEP which macOS is ultimately based on is indeed older than Linux (first release was 1989). But why does that matter? The commenter presumably said "Linux" for a reason, i.e. they want to use Linux specifically, not any UNIX-like OS.
lowbloodsugar
Sure. But not everybody. That’s how I ended up on a Mac. I needed to develop for Linux servers and that just sucked on my windows laptop (I hear it’s better now?). So after dual booting fedora on my laptop for several months I got a MacBook and I’ve never looked back.
tomrod
BSD is fun (not counting MacOS in the set there), but no, my Unix experiences have been universally legacy hardware oversubscribed and undermaintained. Not my favorite place to spend any time.
d1str0
Check out Asahi linux
opjjf
It seems they also updated the base memory on the MacBook Air:
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
electriclove
Wow, I didn't expect them to update the older models to start at 16GB and no price increase. I guess that is why Amazon was blowing the 8GB models out at crazy low prices over the past few days.
bronco21016
Costco was selling MB Air M2 8 GB for $699! Incredible deal.
I’ve been using the exact model for about a year and I rarely find limitations for my typical office type work. The only time I’ve managed to thermally throttle it has been with some super suboptimal Excel Macros.
porphyra
I'm waiting for the 16 GB M2 Air to be super cheap to pick one up to use with Asahi Linux!
__rito__
I was seeing $699 MB Air M1 8 GB on Amazon India a week ago.
bhouston
But no update to a M4 for the MacBook Air yet unfortunately. I would love to get an M4 MacBook Air with 32GB.
I believe the rumor is that the MacBook Air will get the update to M4 in early spring 2025, February/March timeline.
ant6n
The big question for me is whether they will have a matte option for the Air. I want a fanless machine with a matte screen.
Unfortunately Apple won’t tell you until the day they sell the machines.
davio
1TB+ iPad Pro can be a fanless machine with a matte screen
nsbk
This is the machine I'm waiting for. Hopefully early 2025
rbanffy
There are still a couple days left this week.
brewmarche
Given that the Mini and iMac have received support for one more additional external display (at 60Hz 6K), I hope we’ll see the same on the MBA M4.
yurishimo
It'll be interesting to see the reaction of tech commentators about this. So many people have been screaming at Apple to increase the base RAM and stop price gouging their customers on memory upgrades. If Apple Intelligence is the excuse the hardware team needed to get the bean counters on board, I'm not going to look a gift horse in the mouth!
nsteel
But still just 256GB of SSD storage, and £200 for the upgrade to 512GB (plus a couple more GPU cores that I don't need). Urgh.
DrBenCarson
It’s stationary. Just get a Thunderbolt NVMe drive and leave it plugged in
jq-r
Why buy a laptop then if you're lugging all those external hard drives?
jsheard
Every M-series device now comes with at least 16GB, except for the base iPad Pro, right?
fckgw
Correct, every Mac computer starts at 16GB now. The 256GB/512GB iPad Pro is 8GB; the 1TB/2TB models are 16GB.
alsetmusic
Ohh, good catch. Sneaking that into the MBP announcement. I skimmed the page and missed that. So a fourth announcement couched within the biggest of the three days.
hiatus
If only they would bring back the 11" Air.
FireBeyond
Well, the issue for me with memory on these new models is that the base Max ships with 36GB and NO option for more. Getting more memory is gated behind a $300 CPU upgrade (plus the memory cost).
aorth
Four generations into the new platform and there is no answer from anyone else in the industry. Incredible.
jsnndjxjd
There are both more powerful and more battery-efficient offerings available.
The Apple ARM laptops are just at an arbitrary point along the power/efficiency scale.
If it happens to match your needs: great.
But it's not like it's ahead of the industry in any way ^^
aorth
It's suspicious that your account was created 1 day ago and you say there are more powerful and more battery-efficient offerings available, but don't give an example.
P.S. writing to you from a six year old ThinkPad running Linux. I'm not an Apple fanboy. It is my opinion that Apple's products are leagues ahead of the rest of the industry and it's not even close: performance, efficiency, and build quality are incredible.
preisschild
I switched my work M1 Max with an AMD Ryzen 9 7940HS Framework 16 running Linux and am happier since.
I guess it depends on the person which computer is better for them.
greggroth
I'm torn between the new M4 MBP and a Framework laptop with Linux for my personal computer. Can you share some deciding factors for you? Does it mostly come down to the OS? How is the battery life with the Framework?
ElCapitanMarkla
I wish I could get that option but I'm still waiting for an NZ release :(
tucosan
Why are you happier exactly?
BossingAround
Intel's Core 5/Core 7/Core 9 are an answer.
aorth
I dunno. Four years ago the MacBook M1 Pro was able to compile WebKit nearly as fast as the Mac Pro, and nearly twice as fast as the Intel-based MacBook Pro, and still had 91% battery versus 24% in the Intel-based MacBook Pro. Incredible.
https://techcrunch.com/2020/11/17/yeah-apples-m1-macbook-pro...
zuhsetaqi
But now it's four years later.
I really respect Apple's privacy focused engineering. They didn't roll out _any_ AI features until they were capable of running them locally, and before doing any cloud-based AI they designed and rolled out Private Cloud Compute.
You can argue about whether it's actually bulletproof or not, but the fact is, nobody else is even trying; everyone else has lost sight of privacy-focused features in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.