When it was introduced, Apple said the trash can was a revolution in cooling design.
Then they said they couldn't upgrade the components because of heat. Everyone knows that wasn't true.
By the time Apple said they had issues with it in 2017, AMD were offering 14nm GCN4 and 5 graphics (Polaris and Vega) compared to the 28nm GCN1 graphics in the FirePro range. Intel had moved from Ivy Bridge to Skylake for Xeons. And if they wanted to be really bold (doubtful, as the move to ARM was coming) then the 1st gen Epyc was on the market too.
Moore's Law didn't stop applying for 6 years. They had options and chose to abandon their flagship product (and most loyal customers) instead.
The biggest issue was actually that the Mac Pro was designed specifically for dual GPUs: in the era of SLI this made some sense, but once that technology was abandoned it became a technological dead-end.
If you take one apart you'll see why: you could never have swapped the components around to make it dual-CPU instead; it really was "dual GPU or bust".
Somewhat ironically, in today's ML ecosystem that architecture would probably do great. Though I doubt it could possibly do better than what the M-series is doing by itself using unified memory.
I'll admit that while I've used the trash can, I've never taken it apart myself. But I can't imagine it would have been impossible to throw 2x Polaris 10 GPUs on the daughterboards in place of the FirePros.
For what is essentially a dead-end technology, though, I'm somewhat doubtful people would have bought it, since the second GPU would sit idle and add massively to the cost.
Upgrading the CPU would have been much easier, though, I think.
Apple even in 2017 had the money and engineering resources to update or replace their flagship computer - whether with a small update to Skylake & Polaris and/or a return to a cheesegrater design as they did in 2019.
But they chose not to. They let their flagship computer rot for over 2000 days.
Aside from the GPU mess, the 2013 was a nice machine, basically a proto-Mac Studio. Aside from software, the only thing that pushed me off my D300/64GB/12-core as an everyday desktop + front-end machine is that there's no economically sensible way to get 4K video at 120 Hz: an eGPU enclosure plus a decent AMD GPU would cost as much as a Mac mini. So I'm slumming it in Windows for a few months until the smoke clears from the next Mac Studio announcement.
At which point I'll decide whether to replace my Mac Pro with a Mac Studio or a Linux workstation; honestly, I'm about 60/40 leaning towards Linux at this point, in which case I'd also buy a lower-end Mac, probably a MacBook Air.
I'm in the Linux desktop / Mac laptop camp, and it works well for me. Prevents me getting too tied up in any one ecosystem so that I can jump ship if Apple start releasing duds again.
And to take the analogy even further, I'm sure there will be a subset of people who develop really strong opinions about a particular toolchain or workflow. Like how we have people who specialize in 70s diesel trucks or 90s-00s JDM sports cars, there'll likely be programmers who are SMEs at updating COBOL to Rust using Claude.
You mean Metro/Modern? I never see it anymore, and to be honest I prefer the 10 look over Aero, despite being a Millennial vaporwave fan. I haven't spun up 7 in years and years, and I don't miss it at all. But OS X 10.4's Aqua was the peak.
But also, why wouldn't UI changes be possible if the source was open? I remember WindowBlinds and patched uxTheme.dll in the XP days, and that was /without/ source being available. So in this hypothetical, what's stopping hackers from backporting the things they like about 7 to 10 or adding more rounded translucency?
Because Windows isn't really an OS anymore, but a "platform" to deliver advertisements and lock you into Microsoft services. The OS core itself is fairly solid (and has been since Vista/7) but it's all of the crud shoved on top which really ruins everything.
The LTSC IoT releases are easy to find (wink-wink) and don't have 80% of the annoyances, including constant "feature upgrades" - still not Linux, but better than consumer Windows.
I'll correct myself: it sounds good for about 5 seconds before you think about it and realize it's an unworkable idea which creates more problems than it solves.
I also did that for some time. I just stopped perceiving clocks as having a single point that is "up" and mentally rotated them all the time. The hours just lost their meaning beyond their numerical value.
>I don't want sunrise to happen at 21:00, noon at 3:00, and sunset at 9:00
But it will happen regardless of what you think about it; the only choice you have is to pretend it's happening at a "different time" because you assign a different number to it.
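A minimal sketch of that point (Python with the standard-library zoneinfo): the instant is the same either way, only the number we attach to it changes. The zone and the 21:00 UTC "sunrise" instant are illustrative assumptions, not taken from the thread.

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # Hypothetical sunrise instant at a place where it falls at 21:00 UTC.
    sunrise = datetime(2024, 6, 21, 21, 0, tzinfo=timezone.utc)
    print(sunrise.isoformat())                                     # 2024-06-21T21:00:00+00:00
    print(sunrise.astimezone(ZoneInfo("Asia/Tokyo")).isoformat())  # 2024-06-22T06:00:00+09:00
    # Same moment in time; a permanent-UTC world just calls it "21:00"
    # while local time calls it "06:00".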
Another falsehood programmers believe about time. A stopped clock is only right twice a day if it is a 12-hour clock, and only if it's not stopped at a leap second or at a skipped time during the shift from standard to daylight time.
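To make the DST part of that concrete, here's a minimal sketch (Python, zoneinfo) that counts how many times in a local calendar day a stopped 12-hour clock face actually matches the wall clock. The date, zone, and faces are my own illustrative assumptions.

    from datetime import date, datetime, time, timedelta, timezone
    from zoneinfo import ZoneInfo

    def matches_on_local_day(stopped_face: str, local_day: date, tz: ZoneInfo) -> int:
        """Walk the local day minute by minute (stepping in UTC so DST gaps
        and overlaps are handled) and count matches against the stopped face."""
        t = datetime.combine(local_day, time(0), tzinfo=tz).astimezone(timezone.utc)
        count = 0
        while t.astimezone(tz).date() == local_day:
            if t.astimezone(tz).strftime("%I:%M") == stopped_face:
                count += 1
            t += timedelta(minutes=1)
        return count

    tz = ZoneInfo("America/New_York")
    # 2024-03-10: clocks jump from 02:00 to 03:00, so a face stuck at 02:30
    # only lines up once that day (at 2:30 PM).
    print(matches_on_local_day("02:30", date(2024, 3, 10), tz))  # 1
    print(matches_on_local_day("08:15", date(2024, 3, 10), tz))  # 2 (the usual case)

On the fall-back day the opposite happens: a face like 01:30 lines up three times, since 1:30 AM occurs twice.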
Haven't met him personally, but it's nice to hear he's as regionally popular as he is. I'm just going off the fact I've seen him around and he's often documenting nearby streets. It's good stuff and super informative.