FOMO Reading

Read not to contradict and confute; nor to believe and take for granted; nor to find talk and discourse; but to weigh and consider. Some books are to be tasted, others to be swallowed, and some few to be chewed and digested: that is, some books are to be read only in parts, others to be read, but not curiously, and some few to be read wholly, and with diligence and attention.

– Francis Bacon, The Essays [source]

A professor of mine once remarked on the vast expanse of knowledge contained within a university library and how each of us would hardly make a dent in it despite our best efforts. His point was not to discourage us as students from reading, but rather to acknowledge the sheer scope of what was available to read. The message was to read efficiently, so as to gain a diverse and interdisciplinary appreciation of the corpus of a university library. The scarcity of our time to read demands a strategic and purposeful approach – Bacon’s quote being an apt companion in navigating that path.

But making those choices – to skim or skip books altogether – has always been hard for me to implement in practice. I’ll often read a book cover-to-cover only to walk away with a few salient points to carry forward. Why do I persist in behavior that I know full well is sub-optimal for making the most of scarce free time? I’m probably the last person who should be objectively evaluating my own behavior, but I would guess it has something to do with loss aversion: “the pain of losing is psychologically about twice as powerful as the pleasure of gaining, and since people are more willing to take risks to avoid a loss, loss aversion can explain differences in risk-seeking versus aversion.” [source]

Or to put it in terms the average millennial would understand? FOMO: Fear Of Missing Out. Could that be why I subscribe to RSS feeds and never let an article go unread (at least the headlines)? Or why I stopped using Twitter for much of anything once I lacked the time to be a Twitter completionist (reading every tweet in my stream)? The irony is, of course, that by drenching my mind in the quotidian world of tech news and tech culture, I miss out on some of the deeper learning opportunities afforded by books. Books that can, in turn, be skimmed or quickly read through to allow for more time to read a greater variety of books overall.

I can’t directly change the behavioral tendency toward loss aversion/FOMO that I exhibit, but perhaps with an awareness of it I can disarm it of its potency when I stop to think about its effects. So the next time I feel the need to read the entirety of a book I can pretty well guess won’t be jam-packed with insight, or to postpone starting one that is information-dense, I can think of Bacon’s advice and first bite off just enough to get a good taste of the full-course meal to follow.

Back to the Mac – Part II

As a continuation of my last post, I will cover the remaining aspects of the new 5K iMac and conclude my thoughts on the initial experience. First up: … wait, why can’t I remember what the first item is?


Memory (RAM)

As the only upgradeable part of the 27” iMac (and not even a user-serviceable option on the 21.5” model), the memory is the one component where I can go with the minimum for now and upgrade at a later date without voiding the warranty by opening the case. The included 8GB should be OK for now, but if I find myself running into memory limits with Final Cut Pro X or Logic Pro X projects in the future, it’s a problem that is easy (although not inexpensive) to fix. Also, DDR3 is the older standard at this point – but apparently not enough of a concern for Apple, who still use it (in its low-power variant) in their recently revamped MacBook Pro laptops.


Fusion Drive (Solid State Drive + Hard Drive Disk)

This is something I’ve commented on before, and it’s worth revisiting three years later now that I’m putting my money where my mouth was (and still is). Having been a staunch advocate of solid state drives for years now, buying a new computer with a hybrid solution (NOT pure solid state storage) is an interesting step for me. And again, while I haven’t spent much time with the system yet, I would echo Anandtech’s review of Apple’s Fusion Drive:

For the first time since late 2008, I went back to using a machine where a hard drive was a part of my primary storage – and I didn’t hate it. Apple’s Fusion Drive is probably the best hybrid SSD/HDD solution I’ve ever used, and it didn’t take rocket science to get here. All it took was combining a good SSD controller (Samsung’s PM830), with a large amount of NAND (128GB) and some very aggressive/intelligent software (Apple’s Core Storage LVM). Fusion Drive may not be fundamentally new, but it’s certainly the right way to do hybrid storage if you’re going to do it. – Anandtech.com

The Fusion Drive in the computer I purchased is largely the same as what Anandtech reviewed three years ago – a 128GB SSD paired with a 2TB traditional spinning hard drive. What’s changed is that the SSD is a lot faster now (NVMe PCIe vs. AHCI). But while the Fusion Drive has largely kept a good balance of speed and capacity at a reasonable price, the continuing decline in pure SSD storage prices (sometimes as low as $0.25 per GB) means that balance gets less compelling as time goes on.
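
Conceptually, the “intelligent software” part of a Fusion Drive boils down to promoting frequently-accessed data to the SSD tier while cold data settles onto the hard drive. Here’s a toy sketch of that idea in Python – emphatically not Apple’s actual Core Storage logic, just the concept:

    # Toy illustration of a tiered "fusion" volume: hot blocks migrate to
    # the SSD, cold blocks stay on the HDD. This is NOT Apple's actual
    # Core Storage algorithm - just a sketch of the general idea.
    from collections import Counter

    SSD_CAPACITY_BLOCKS = 4  # tiny number to keep the example readable

    access_counts = Counter()

    def record_access(block):
        access_counts[block] += 1

    def ssd_resident_blocks():
        """The hottest blocks (by access count) live on the SSD tier."""
        hottest = access_counts.most_common(SSD_CAPACITY_BLOCKS)
        return {block for block, _ in hottest}

    # Simulate a workload: blocks A and B are hit constantly, the rest rarely.
    for block in ["A", "B", "A", "C", "A", "B", "D", "E", "A", "B", "F"]:
        record_access(block)

    print(ssd_resident_blocks())  # A and B (plus runners-up) end up on the SSD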

Long term, I can upgrade this iMac with an external Thunderbolt 2 enclosure housing a high-capacity SATA III SSD (e.g. Samsung 850 Pro 2TB) as a replacement boot drive someday. (Thunderbolt 2 has an effective throughput of 20Gbps bi-directional, which is more than sufficient for a SATA III SSD. However, it is not the PCIe powerhouse of USB-C/Thunderbolt 3 at 40Gbps that is already in the latest MacBook Pros and rumored for the next iMac refresh.) It’s a trade-off in future-proofing, but a budget-conscious one at this point.
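
To put rough numbers on those interfaces, here’s a quick back-of-the-envelope sketch (raw link rates only; protocol overhead trims real-world throughput):

    # Back-of-the-envelope comparison of raw interface bandwidth.
    # Real-world throughput is lower once protocol overhead is factored in.
    GBPS_TO_MBPS = 1000 / 8  # gigabits per second -> megabytes per second

    interfaces = {
        "SATA III": 6,              # Gbps: what a 2.5" SSD like the 850 Pro uses
        "Thunderbolt 2": 20,        # Gbps, bi-directional
        "USB-C/Thunderbolt 3": 40,  # Gbps
    }

    for name, gbps in interfaces.items():
        print(f"{name}: {gbps} Gbps = {gbps * GBPS_TO_MBPS:.0f} MB/s raw")

    # A SATA III SSD tops out around 550 MB/s in practice, so Thunderbolt 2's
    # ~2,500 MB/s of raw headroom is more than enough for this setup.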

Overall, the storage situation for this iMac is a lot different from what I originally pictured. With a little over 2TB of storage (2TB + 128GB), I have enough storage space to comfortably edit moderately large amounts of video, audio, and photos – albeit not at blazing fast speeds (thank goodness I lack a 4K video camcorder). But if I were to opt for pure SSD storage directly from Apple, that’s a price premium I simply can’t justify despite my preference for SSDs. As a reference point, how much do you think Apple charges to upgrade the new MacBook Pro models from the default 256GB of NVMe PCIe flash storage (a fast SSD) to the 2TB SSD option? Try an eye-watering $1,400, which is almost as much as I paid for the entire iMac itself.
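
To make that premium concrete, here’s the arithmetic on the upgrade using the figures above (retail SSD pricing obviously fluctuates):

    # Rough cost-per-gigabyte math for Apple's SSD upgrade vs. retail SSDs,
    # using the figures cited above.
    upgrade_price = 1400      # USD for the 256GB -> 2TB build-to-order option
    added_gb = 2048 - 256     # additional capacity gained: 1,792 GB

    apple_per_gb = upgrade_price / added_gb
    retail_per_gb = 0.25      # USD/GB at the low end of retail SSD pricing

    print(f"Apple upgrade: ${apple_per_gb:.2f}/GB")               # ~$0.78/GB
    print(f"Retail SSD:    ${retail_per_gb:.2f}/GB")
    print(f"Premium: {apple_per_gb / retail_per_gb:.1f}x retail")  # ~3.1x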


Screen (the “5K” in 5K iMac)

I almost left this one out, but it’s really the central feature of the iMac line: a gorgeous 27” Retina display. Apple marketing does a good job of selling this feature, and to their credit it is difficult to find a suitable comparison in the all-in-one desktop market with similar specs and quality (the $3,000 Surface Studio being the sole competitor). This second revision of the 5K display uses what Apple calls a “Wider Color Palette”, which for the photography/videography aficionados means a P3-based gamut wider than the typical sRGB. You need content captured on a P3-capable device (such as the iPhone 7) to see the wider gamut in action, but the difference is noticeable in terms of more accurate color reproduction (reds in particular stand out more).

As I tend to pay closer attention to the aspects of computers directly related to performance (CPU, GPU, storage), it’s tempting to gloss over the 5K screen as just a really nice built-in display. But that’s really selling this retina display short, so let me elaborate. For starters, 5K displays have a resolution of 5,120×2,880 pixels, or about seven times the pixels of a standard 1,920×1,080 (1080p) display. That’s an incredible amount of additional information to show on the screen at one time. At a screen size of 27″, rendering everything at the 1:1 scale of a 1080p display would result in ludicrously tiny fonts and user interface elements. Instead, Apple has long taken the approach with their “retina” displays (starting in 2010 with the iPhone 4) of using 4 physical pixels in place of 1 to give everything on the screen an incredible level of sharpness and fine detail (especially obvious with text).
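
If you want the pixel arithmetic spelled out, a quick sketch:

    # The pixel-count arithmetic behind the "about seven times" claim.
    retina_5k = 5120 * 2880   # 14,745,600 pixels
    full_hd   = 1920 * 1080   #  2,073,600 pixels

    print(f"5K:    {retina_5k:,} pixels")
    print(f"1080p: {full_hd:,} pixels")
    print(f"Ratio: {retina_5k / full_hd:.1f}x")  # ~7.1x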

An aside: As long as developers include appropriate retina-class high-resolution assets in their applications, supporting retina on macOS is easy to implement. This was a bit of a problem early on when retina came to the Mac in 2012 with the MacBook Pro, but adoption was relatively quick (aided by the transition on iOS two years prior). Worst case, some parts of an application would look “blurry” because of their lower-resolution images and text. But otherwise the quad-pixel 200% scaling was – and still is – so much better than the HiDPI scaling mess of Windows (better in Windows 10, but still hit-or-miss). Unfortunately, Apple has started to mess with that formula on some recent Macs, whose default resolutions aren’t quad-pixel multiples, to give the user more “room” on the screen for more windows. It’s a trade-off that offers more screen real estate at the expense of crystal clear text and UI elements. (You could always set this manually in display settings on any retina Mac, but never before was it the default “best” setting.) It isn’t terribly noticeable thanks to the scaling process macOS performs to render what’s on the screen, but I’m just glad my 5K iMac has a big enough native resolution that I can have my cake (the effective “room” of a previous-generation 27″ iMac’s 2560×1440) and eat it, too (quad-pixel scaling for crisp text and images).
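
Here’s a sketch of how that scaling shakes out on a 5K panel. It’s my simplified model of the process, though the “looks like” resolutions are the standard macOS options:

    # Simplified model of macOS HiDPI scaling on a 5120x2880 panel:
    # macOS renders at 2x the "looks like" resolution into a backing store,
    # then downsamples to native pixels if the two don't already match.
    NATIVE = (5120, 2880)

    def backing_store(looks_like):
        w, h = looks_like
        return (w * 2, h * 2)

    # Default on the 5K iMac: "looks like" 2560x1440 -> 5120x2880 backing
    # store, an exact match for the panel. No downsampling; perfectly crisp.
    print(backing_store((2560, 1440)), "vs native", NATIVE)

    # A "more room" mode such as 2880x1620 -> 5760x3240, which must then be
    # scaled down to 5120x2880. Still sharp, but no longer pixel-perfect.
    print(backing_store((2880, 1620)), "vs native", NATIVE)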


Miscellaneous

As I mentioned, the Thunderbolt 2 standard has been eclipsed by Thunderbolt 3/USB-C. And if the latest MacBook Pro revision is any indication, the next iMac might require dongles and adapters for anything besides a USB-C/Thunderbolt 3 connector. The iMac I purchased also has a gigabit Ethernet port, an SDXC card slot, regular USB 3.0 ports (compatible with just about every USB device to date), and a 3.5mm headphone jack. It wouldn’t surprise me if, in the interest of pushing toward the future of computing, Apple deprecates all but the USB-C/Thunderbolt 3 connector on the next iMac – making #donglelife a thing on the desktop.

In terms of physical dimensions, I still maintain that a svelte desktop is much more of a nicety than the functional or ergonomic advantage it would be on a laptop or mobile device (which you carry with you). Nevertheless, it remains impressive just how much Apple’s industrial design team has done to fit a gorgeous screen – and, you know, an entire computer as an added bonus – into a tapered aluminum shell. The “lampshade” iMac G4, with its round base and crane-like arm holding the display mid-air, will always have a special place in Apple’s history of impressive designs, but the “chin” design of every iMac since seems to have settled on the Platonic ideal of the desktop PC.


Initial Impressions

So, with the specs and features outlined (in many more words than are probably necessary), I’ll revisit my experiences with this new iMac in the future once I get a chance to put it through its paces (Final Cut Pro X and Logic Pro X, I’m looking at you both). Of particular interest will be how the Fusion Drive performs over time as I start loading more content onto it (beyond the 128GB of PCIe flash storage). Hard drives perform increasingly slowly as more content gets added (and data fragments over time), so that’s a potential pitfall. And with APFS (Apple File System *ding*) coming in the next version of macOS, I’ll be really interested to see how it works with the Fusion Drive (if at all – the beta builds don’t support Fusion Drives at this point).

But even at this initial point, I can say that I’m glad I moved away from the Hackintosh – this is such a better experience overall.

(Although in a twist of fate, while troubleshooting another graphics card I found out that a BIOS setting had been preventing discrete cards from working – and I verified that my old workhorse AMD Radeon 6870, with its native macOS support, is still functional. Which means the Hackintosh isn’t totally dead after all, since the GPU was the tipping point for my decision to leave the Hackintoshing world for good. So, if I ever decide to resurrect the Hackintosh in Victor Frankenstein fashion… No! Don’t give in to that siren song yet again!)

Back to the Mac – Part I

The time had finally arrived. Anticipation for the event was at fever pitch. And more than anything, I was ready to enter a new era of personal computing…

The delivery of the future of my creative computing endeavors marked a milestone. It was laid upon our front porch by the FedEx delivery person with the significance of the final spike driven into the First Transcontinental Railroad. There was no mistaking it – this was an historic moment.

OK, back to reality. I got a new 5K iMac and have been looking forward to it for some time. It is not an inexpensive computer, but thanks to its refurbished status and relative age (a late 2015 model, although currently the latest and greatest from Apple) it didn’t break the bank.

My goal with getting an actual Mac instead of building a Hackintosh was to trade the inconvenience and anxiety of a custom-built machine for the peace of mind that my computer would be fully supported by Apple for years to come. And although it’s been less than 48 hours with the new machine, I am already enjoying the difference tremendously.

I plan to blog more about my time using the device beyond just these initial impressions, but for this post (and the next) I want to focus on the rationale behind my hardware selection. One important caveat to my discussion: Apple is rumored to be on the verge of releasing a new iteration of the iMac (it having been over a year since the last update). I hope to avoid buyer’s remorse, but there will always be a newer model coming ‘soon.’

So, my justification for purchasing this refurbished iMac must compete with the unknown variable of what’s next on Apple’s product road-map for the iMac. It’s a trade-off for taking the known over the potential benefits of the unknown. But as with much in the technology world, price is critically important.

So enough with the prologue, and on to the specs! (For the real geeks, here’s the nitty-gritty spec list).

Apple 27-inch Retina 5K iMac

  • Intel 3.3GHz (3.9 GHz Turbo) Quad-Core i5 CPU
  • 8GB DDR3
  • 128GB SSD + 2.0TB 7200RPM HDD (Fusion Drive)
  • AMD Radeon R9 M395 with 2GB VRAM
  • USB 3.0 and Thunderbolt 2
  • 802.11ac Wireless, Bluetooth 4.0

Processor (CPU)

At first blush, it’s a reasonable machine in terms of specs. The CPU was a bit of a compromise (in an ideal world an i7-6700K clocked at 4.0GHz would have been my first choice), but otherwise a fine quad-core choice for CPU-heavy tasks. It’s much better than the many Macs wielding only dual cores, and it has the thermal headroom to run full-tilt through video encoding tasks that would be thermally limited on smaller form-factor devices (e.g. laptops). And in a recent revelation (for me) that challenges the marketing appeal of an i7 over an i5 for heavily threaded workloads (such as video encoding), there’s this from Anandtech comparing Intel’s newest-generation flagship Core i5 and i7 parts:

Both the HandBrake tests essentially mirror what we saw in [Cinebench]15 – the Core i5-7600K is there or thereabouts when frequency is the main factor, and when we stick a register-heavy threaded situation in the path, the effect from not having hyper-threading compared to the Core i7-7700K is relatively muted – in this case the i7 is only +20% performance over the i5, despite costing nearly 50% more. Our Hybrid test is somewhat similar to the HandBrake HQ test, showing the fact that heavy threads reduce the efficacy of hyper-threads. – Anandtech.com

Translation – you don’t get as much bang for your buck from the more expensive i7 model with hyper-threading as you’d think. The i5 can be the sweet spot even for some heavily-threaded use cases, depending on the workload.
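
A quick sanity check on that claim, plugging in the rough numbers from the quote (normalized, so the exact prices don’t matter):

    # Perf-per-dollar sketch using the quote's rough figures: the i7-7700K
    # is ~20% faster in heavy threaded encodes at ~50% higher cost.
    i5_perf, i5_price = 1.00, 1.00   # normalized baseline
    i7_perf, i7_price = 1.20, 1.50

    print(f"i5 perf/dollar: {i5_perf / i5_price:.2f}")   # 1.00
    print(f"i7 perf/dollar: {i7_perf / i7_price:.2f}")   # 0.80 - worse value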

Graphics Card (GPU)

Despite AMD’s new Polaris architecture and accompanying GPUs launching this past summer, the current iMac has a GPU based on 28nm process technology (Pitcairn/Tonga/Fiji). It therefore doesn’t benefit from the architectural changes in Polaris, and it also lacks the newest generation’s cooler-running 14nm process technology. However, the importance of the GPU in a Mac (not used for gaming in my case) really comes down to a few key points:

  • User Interface (UI) responsiveness in macOS
  • Ability to effectively drive a high-resolution (5K) display
  • External monitor support (not applicable in my use-case)
  • Thermal performance (GPUs tend to generate the most heat of any part of the computer)
  • Application support / utilization (GPU acceleration)

For comparison, the GPU in the model I have is second from the top in terms of ability. The top-end version (AMD Radeon R9 M395X with 4GB VRAM) is a pricey upgrade for only the most demanding application needs (and frankly isn’t a great gaming GPU either). The extra VRAM could be useful for games or external monitor support (more frame-buffer to fill), but the GPU I have should be sufficient for my modest video / podcast editing needs.
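
For a sense of scale on “more frame-buffer to fill”, here’s a rough estimate assuming standard 32-bit color (actual VRAM use is far higher once window surfaces, textures, and compositing buffers pile on):

    # Rough size of a single 5K frame at 4 bytes per pixel (32-bit color).
    width, height, bytes_per_pixel = 5120, 2880, 4
    frame_mib = width * height * bytes_per_pixel / 2**20

    print(f"One 5K frame: ~{frame_mib:.0f} MiB")         # ~56 MiB
    print(f"Share of 2GB VRAM: {frame_mib / 2048:.1%}")  # ~2.7%

    # Driving the display itself is cheap; it's textures, effects, and
    # compositing in GPU-accelerated apps that actually fill VRAM.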

The macOS UI will run well on the upgraded GPU I have (vs. the base model M380 or M390), and it will be plenty for driving just the one 5K display (no external monitors). Thermal performance for the M300 series is much improved over last generation’s M200 series (which saw thermal throttling limit performance in GPU-intensive tasks), despite being on the same 28nm process. And really, who likes to hear computer cooling fans spinning at high RPM? While I won’t get the full benefits of an M395X with 4GB VRAM in the applications that can use it, the scenarios that would optimally use those resources (heavy effects and filters in Final Cut Pro X or Photoshop) aren’t really anything I plan to do.

In Part II, I’ll wrap up my thoughts on the memory, storage, screen, miscellaneous items, and conclusion. To be continued…

The Siren Song of the Hackintosh

Over eight years ago, I built my first Hackintosh – a commodity PC configured to run Apple’s Macintosh OS X operating system. Since that time, I’ve experimented with a wide variety of Hackintosh configurations, all of them lasting for variable periods of time before the configuration became too unstable to make maintaining it worth the effort.

The appeal of building a Hackintosh lies in the thrill of getting something to work in a way it was never intended to. But as I mentioned in my last post, that thrill can come at the cost of time spent tinkering to maintain a Hackintosh. My most recent Hackintosh build was perhaps the easiest to configure and the most stable (for a time) of all my Hackintoshing experiences. Alas, in the end the tinkering took its toll and I was left with an unusable macOS install.

In summary, my build consisted of a Gigabyte Z170 motherboard, an Intel Core i7-6700K quad-core @ 4.0GHz (Skylake) CPU, 16GB of RAM, a 256GB SSD (along with various other hard drives for storage), and an NVIDIA GeForce GTX 960. Not too shabby, as Hackintoshes go. In previous builds, I had been using a much older AMD Radeon 6870 graphics card that – through the luck of Apple’s component choices – was natively supported in macOS. That’s a much bigger deal than it sounds, as proper graphics acceleration (meaning: the graphics card is able to correctly do its job) is one of the most finicky parts of any Hackintosh build.

As with most Hackintosh builds, there was some initial tinkering to figure out exactly how to get the system to boot properly and install macOS. (For the record, it ended up being an XHCI handoff setting that needed to be enabled.) But the biggest change this time around was the graphics card. My venerable AMD Radeon 6870 had finally bitten the dust, leaving me without my natively-supported graphics card. Surely, I thought, I could navigate the waters of NVIDIA web drivers to successfully install a newer graphics card, right?

Returning to the theme of this post: after some research and troubleshooting I did get it working, but a slight change was all it took to topple my short-lived success. I downloaded and installed the NVIDIA web drivers, as you do for a Hackintosh build (NVIDIA provides updated drivers for macOS in support of their Quadro series cards – which just so happen to also support the GeForce line of cards in macOS). So far so good.

After installing the drivers, the system required a reboot to start using them. So I obliged. Unfortunately, the toggle setting to use the NVIDIA web driver instead of the basic macOS driver (which has terrible screen-tearing, visual artifacts, and redraw times slow enough that you can literally follow a screen redraw with your eyes) didn’t stay selected. About 20 more minutes of research, a kext update, and another reboot later, I had a fully-accelerated GPU working in macOS. Yay!

Or so I thought. Once I took the crucial step of cloning my drive (using Carbon Copy Cloner) to a new SSD, in order to temporarily re-purpose the original SSD, I broke my Hackintosh. A cloning operation will move all the officially-sanctioned macOS system files, but not the custom Hackintosh code. That’s certainly understandable, and something I was prepared to fix by re-running the post-install steps I had originally performed to get it working. But alas, the one part that did not work – after hours of fruitless troubleshooting – was that graphics card. “No GPU-acceleration for you!”

With an essentially unusable Hackintosh at this point, I had come to a moment of crisis. With any Hackintosh, the key is backups and ‘recoverability’ if you break something – it just comes with the territory. But since my cloning backup failed to restore it to working order (meaning any future recovery attempts would likely run into a similar problem), I knew I would be relying on painstaking clean installs followed by Time Machine restores. Not ideal. But the biggest challenge was that I had lost my advantage of a natively-supported GPU and would likely be wrestling (unsuccessfully, so far) with GPU-acceleration/driver issues at potentially any turn with updates or changes to the system.

In the end, it was a lot of wasted time. Or at least it was looking like it would be more trouble than it was worth. It had its fun moments, to be sure. And my masochistic relationship with Hackintoshing over the years is a testament to my willingness to subject myself to these challenges. But at the end of this most recent endeavor – and eight years of prior experiences – I think I’m finally ready to throw in the towel and leave the Hackintoshing to those with the will and free time I no longer have. A sad day, true. But one that reflects the reality that I’d much rather spend my precious free time actually doing the creative work (video editing, podcasting, etc.) I was building a Hackintosh for in the first place.

Anyone who has read my previous musings on Hackintoshing projects will cry foul at this point – I’ve “thrown in the towel” on Hackintoshing on more than one occasion, only to come crawling back to it like some technological addiction. But today marked more than just a symbolic declaration of independence from my Hackintoshing past. For today I received (and unboxed) my 100% genuine, Apple-built Macintosh: a 5K iMac. (I’ll share more about that new development in another blog post.) But at this point I think it is finally safe to say – the era of the Hackintosh (at least for me) is finally over.