A thermally constrained system design doesn't lend itself to upgrades, and unrealized processing potential is also a shame. But while the form factor may run hot, it sure looks cool.
It’s been a trusted workhorse for many years. Ever since I bought it used in September 2011, I’ve successively upgraded many of its elements: adding expansion cards, updating the graphics subsystem, expanding its RAM, migrating its mass storage to high-performance WD Raptor HDDs (though not, at least yet, to SSDs), and more. But as of September 2018, when MacOS 10.14 exited beta (obsoleting Mac OS X 10.11 in the process), it’s no longer supported by Apple software updates.
That August 2019 article kicked off a multi-post series of articles on what I at the time thought might be its successor(s), a series of HP PCs that were particularly amenable to “Hackintosh” conversions courtesy of their Mac-compatible hardware building blocks. Reality has a habit of undershooting the hype, however; even with a still-vibrant enthusiast community backing it, the “Hackintosh” concept has been perpetually plagued by fragile software workarounds that tend to break every time Apple pushes an operating system or application update.
Recent O/S advancements such as System Integrity Protection have further added to the “Hackintosh” challenge, especially in conjunction with silicon advancements such as the T2 security chip. At this point, Apple has nearly completed its migration from Intel-based Macs to those based on its own Arm-derived SoCs; whenever the company turns its attention away from x86-based software, the “Hackintosh” will inevitably fade away with it. And since the HP PCs I was planning on using are relatively geriatric, they’re not candidates for Windows 11 either, further constraining their remaining useful lives. But hey, there’s still Linux! (More realistically, they’re destined for charity donation while Windows 10 is still supported, to be honest.)
So, what am I replacing the MacPro3,1 with instead? Meet my just-received birthday present, a MacPro6,1, alongside a legacy (non-Retina) 27” Thunderbolt display that I bought from a moving-away neighbor last year, shown on my downstairs workbench for setup purposes:
This is the longest-lived computer model in Apple’s history to date; it shipped essentially unchanged for six years, from 2013 to 2019. My particular unit, based on a six-core Intel Xeon CPU, had an original MSRP of $3,999 when equipped with 16 GBytes of RAM and a 256 GByte SSD; my wife bought my like-new-condition unit, upgraded to 32 GBytes of RAM and a 512 GByte SSD, from LA Computer Company for $1,099. Check out all those backside ports (legacy Thunderbolt 2, unfortunately, versus newer Thunderbolt flavors):
And check out, too, its sci-fi interior visible after outer case removal:
You might be surprised to sense my enthusiasm about a product that prompted an uncharacteristic public apology from Apple executives four years after its introduction. Part of the reason, admittedly, is the low price my wife paid for it, both in absolute terms and relative to its original MSRP. And part of the reason is its greater-than-expected user upgradeability (which I’ll discuss next), versus the unfairly short shrift I gave it back in August 2019. But the criticism isn’t completely meritless either, an allusion to this post’s title that I’ll elaborate on shortly.
Upgrades first: it turns out (as well-documented here) that the RAM is incredibly easy to update, with the SSD only a bit more challenging. My system’s 32 GBytes of RAM was supposed to consist of two 16 GByte DIMMs, with the two remaining unpopulated slots capable of boosting total system memory to 64 GBytes. LA Computer Company ended up shipping it with four 8 GByte DIMMs instead, but two 16 GByte replacements are headed my way as I type this; the company’s support is always excellent (I’m a repeat customer). And as it turns out, it’s (unofficially) possible to boost total system memory to 128 GBytes, although doing so comes with a clock speed tradeoff: whereas “official” DIMMs (up to 16 GBytes each) run at 1867 MHz, 32 GByte DIMMs run at 1066 MHz in four-to-eight-core systems and 800 MHz in twelve-core Mac Pros. The capacity-versus-bandwidth tradeoff makes for interesting benchmarks.
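To put rough numbers on that tradeoff, theoretical peak memory bandwidth scales linearly with the transfer rate. Here’s a back-of-the-envelope sketch, assuming the Xeon E5 v2’s quad-channel, 64-bit-per-channel DDR3 memory controller (DDR3-1866 stands in for the “1867 MHz” figure quoted above):

```python
# Back-of-the-envelope DDR3 peak-bandwidth arithmetic for the Mac Pro 6,1.
# Assumptions: quad-channel Xeon E5 v2 memory controller, 64-bit (8-byte)
# transfers per channel; rates are the three the article mentions.
BYTES_PER_TRANSFER = 8  # one 64-bit DDR3 channel
CHANNELS = 4            # four DIMM slots, one per channel

def peak_gbytes_per_s(mt_per_s):
    """Theoretical peak bandwidth (GBytes/s) at a given rate in MT/s."""
    return mt_per_s * 1e6 * BYTES_PER_TRANSFER * CHANNELS / 1e9

for rate in (1866, 1066, 800):
    print(f"DDR3-{rate}: {peak_gbytes_per_s(rate):.1f} GBytes/s peak")
# DDR3-1866: 59.7 GBytes/s peak
# DDR3-1066: 34.1 GBytes/s peak
# DDR3-800: 25.6 GBytes/s peak
```

So a maxed-out 128 GByte configuration would give up roughly 40-60% of peak bandwidth versus the officially supported DIMMs, which is why the benchmarks are interesting.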
The CPU can even be updated, although the process is a fair bit more involved; I’ve already picked up a used twelve-core Intel Xeon E5-2697 v2 processor on eBay for $70. The GPU, on the other hand, is a different story, and marks the transition to the “but” part of this writeup. First off, I should say GPUs, because there are two of them, proprietary in design and sourced from AMD. And why two? Because at the time, Apple had decided to place a big bet on the OpenCL API that it had originally developed internally and then turned over to the industry for management by the Khronos Group (most notably the longstanding maintainer of OpenGL).
OpenCL enables operating system and application software alike to heterogeneously leverage all function-appropriate processing resources available in a system: CPUs (including big.LITTLE and other multi-core configurations), GPUs (such as the dual-graphics-board configuration in my new toy), special-purpose coprocessors, and so on. Industry adoption has been only modest, however. The professional CAD, graphics, and still-image and video processing applications that commonly run on high-end systems such as the Mac Pro are particular success stories, but mainstream application uptake has been more muted, in no small part because only a subset of apps can effectively leverage GPUs’ massively parallel but not-general-purpose acceleration potential. And adoption largely stalled at the version 1.2 level, which dates from more than a decade ago; subsequent OpenCL versions haven’t received comparable uptake, and Khronos responded with a notable “reset” in the latest v3.x API versions, which make v1.2 the mandatory baseline and nearly everything beyond it optional.
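To make the programming model concrete, here’s what a minimal OpenCL compute kernel looks like, a vector add written in standard OpenCL C, paired with a pure-Python model of what each work-item computes. (Actually running the kernel requires an OpenCL runtime plus host-side boilerplate, context, command queue, and buffer setup via a binding such as pyopencl, all omitted here.)

```python
# A trivial OpenCL C kernel: the runtime launches one "work-item" per
# output element, and each work-item finds its element via its index in
# the global NDRange. This data-parallel structure is what maps so well
# to GPUs' wide execution arrays.
VADD_KERNEL = """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *c)
{
    size_t i = get_global_id(0);
    c[i] = a[i] + b[i];
}
"""

def vadd_reference(a, b):
    """Pure-Python model of the kernel: one 'work-item' per element."""
    return [x + y for x, y in zip(a, b)]

print(vadd_reference([1.0, 2.0, 3.0], [10.0, 10.0, 10.0]))
# [11.0, 12.0, 13.0]
```

The same kernel source can, in principle, be dispatched to a CPU, a GPU, or both, which is exactly the heterogeneity Apple was betting on with the dual-GPU design.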
Even more notably, from an Apple-centric standpoint, both OpenCL and OpenGL were deprecated beginning with 2018’s MacOS 10.14 “Mojave”, in favor of Apple’s proprietary Metal API. That said, both APIs still exist for legacy software compatibility, even running natively on Apple’s newer Arm-based SoCs and the systems containing them. But any OpenCL (or OpenGL, for that matter) development within Apple, save for maintenance, seemingly ceased long ago.
When Apple issued its “apology” to the professional community via a tech news outlet briefing in 2017, one particular quote stood out to me:
“I think we designed ourselves into a bit of a thermal corner, if you will,” one of Apple’s top executives reportedly said.
To be precise, the second-generation “trash can” (more officially, “cylinder”) Mac Pro I now own, quoting from Wikipedia’s entry, employs “an overhauled case design, a polished reflective aluminum cylinder built around a central thermal dissipation core and vented by a single fan, which pulls air from under the case, through the core, and out the top of the case.” Unfortunately, although its thermal capabilities were adequate for the initial silicon and broader hardware allotment, it couldn’t keep pace with newer, higher power and hotter-running chips. Wikipedia again: “The cylindrical thermal core was unable to adapt to changing hardware trends and left the Mac Pro without updates.” Analogies to the G4 Cube are apt.
Circling back to graphics, I’m essentially stuck with what I’ve got. The internal graphics cards, as previously mentioned, not only run hot but are also proprietary in form factor. And although I could conceivably supplant them with an eGPU, Apple’s operating systems officially support eGPUs only over Thunderbolt 3 and 4. Hacks exist to circumvent this limitation, but…well…they’re hacks (see my earlier “Hackintosh” comments). And earlier-generation Thunderbolt bandwidth limits might end up overly constraining the eGPU’s incremental performance potential anyway.
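To put that bandwidth concern in numbers, here’s a rough comparison of nominal link rates. These are raw link figures only; usable PCIe throughput tunneled over Thunderbolt is lower still, and the PCIe entry assumes a full 16-lane Gen 3 slot for reference:

```python
# Nominal link bandwidth an eGPU would see over each interconnect,
# versus a GPU in a direct PCIe 3.0 x16 slot (~15.75 GBytes/s).
links_gbit_s = {
    "Thunderbolt 2": 20,   # two 10 Gbit/s channels aggregated
    "Thunderbolt 3/4": 40,
    "PCIe 3.0 x16": 126,   # 8 GT/s x 16 lanes, 128b/130b encoding
}

for name, rate in links_gbit_s.items():
    print(f"{name}: {rate} Gbit/s (~{rate / 8:.1f} GBytes/s)")
```

In other words, a Thunderbolt 2-attached eGPU would have roughly one-sixth the link bandwidth of a directly slotted card, on top of being unsupported by Apple in the first place.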
That all said, I’m hopeful that my new toy will serve me at least as long as its predecessor did; even though its elderly CPU and lack of a TPM preclude native Windows 11 support via Boot Camp, for example, Parallels and its ilk aspire to provide virtualized support for years to come. The wild card, of course, is how long Apple will continue to support x86-based Macs in its operating systems. For precedent, the company dropped PowerPC support with 2011’s Mac OS X 10.7 “Lion”, released six years after the transition to Intel systems was first announced. But I’d expect (or at least hope for) a longer deprecation cycle for x86-based software this time around, given the much larger installed base.
Thoughts? Questions? Sound off in the comments!
This article was originally published on EDN.
Brian Dipert is Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.