Going SSD: Breathing new life into a 2012 MacBook Pro

I’m not the first to extol the virtues of upgrading one’s computer to a solid-state drive (SSD). But man, does it improve your computing experience.

First, some context: I use my Mac a lot. I constantly push it to keep up with my digital projects that range from simple word processing (like writing this blog and some academic essays for university), to really complex tasks like desktop app development via a virtual machine using C# in Visual Studio, manipulating 3D geometry in ARCHICAD, SketchUp, Rhino and Grasshopper, drafting and graphic design with AutoCAD and Illustrator, and some heavy Photoshop work; even the occasional render with ARCHICAD’s CineRender. Then there’s the side stuff like some motion graphics work and video editing for SKKSA.

The tasks this mid-2012 MacBook Pro has handled have been quite diverse. But the machine was showing its age: boot times were extremely slow, ARCHICAD took forever and a day to load up, and it was generally a very rough experience toward the end days of the “old” MacBook Pro ride. There was also an annoying bug where stock Apple apps would crash after a few hours (I think it had something to do with ARCHICAD or its BIM Server component…).

Rather than forking out the Apple tax for a completely new machine – and my Mac is indeed old enough to justify a full upgrade, having been in service since the beginning of 2013 – I’ve managed to extend the longevity of an already solid machine by upgrading my RAM earlier this year (to 16GB) and now adding an SSD as the boot drive.

Here’s the run-down. I found a good deal on an SSD from a local Apple service center in Durban. The RAM came first, though: earlier this year, on the recommendation of my good friend Bryan, I upgraded from the stock 8GB to 16GB while I was in Cape Town. According to Bryan, macOS (or OS X, whichever you prefer) loves to be fed more RAM. This OS is RAM hungry, and my first upgrade definitely showed it: keeping an eye on Activity Monitor, it was evident that the system wanted to push past the old 8GB ceiling – I was frequently hitting 11GB after the upgrade.

Adding RAM allowed me to do more with my Mac. But adding the SSD… that made this thing feel like a new beast altogether. SSDs, as you know, read data from flash storage; there’s no spinning-platter hard drive that needs to spin up before you can access data. To give a simple example: it used to take over two minutes for my MacBook to fully boot and be ready to use – sometimes longer, as the Finder and other startup items took a long time to initialize. With the new SSD installed, my boot time is 20 seconds. That’s 20 seconds from the moment I press the power button to having a fully ready system waiting for my commands. This kind of speed means a heck of a lot to me, as I spend most of my life on this machine.

The Setup

I opted to remove my SuperDrive (the CD/DVD drive); I reasoned that I hardly use optical media these days. So I moved my existing 750GB hard drive over to that bay, and installed the new 280GB SSD in the original hard drive slot.

macOS Sierra now boots from the SSD, and I’ve got all my apps installed on this drive as well. My documents, iTunes library, pictures and other media sit on the old hard drive, which is permanently attached to the system (think of it as an always-connected external hard drive). I used a few Terminal commands to create symlinks (symbolic links) on the SSD that point to the corresponding folders on the HD. This ensures that Time Machine backs up all my data correctly. Having my larger media files on the HD lets me take advantage of all that space, whilst still having the lean speed of the SSD for booting the system and launching apps.
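For the curious, the commands look something like this – a minimal sketch using the Movies folder as an example, and assuming the old drive shows up as a volume called “Macintosh HD 2” (the names here are illustrative, not a record of my exact setup):

    # Move the folder onto the old hard drive, then leave a symbolic link
    # in its place on the SSD. "Macintosh HD 2" is an assumed volume name.
    mv ~/Movies "/Volumes/Macintosh HD 2/Movies"
    ln -s "/Volumes/Macintosh HD 2/Movies" ~/Movies

The same pattern works for Music, Pictures and Documents; apps simply follow the link as though the folder still lived on the SSD.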

Going forward I might use the SSD to store ongoing project files for quick access, and then move them over to the HD once those projects are done.

Going SSD on my Mac was indeed like night and day. I’m still amazed how snappy ARCHICAD is now; AutoCAD is operating like a dream, as are my other creative apps like Photoshop and Illustrator.

I’m still setting up a few minor apps, and I need to redo my Boot Camp partition for Windows 10. But overall the system is functional, and I can’t wait to really put it through its paces as my Mac accompanies me on my Master of Architecture journey starting next month.

Platform Wars are a Waste of Time

Mac vs. PC. iOS vs. Android. Automatic transmission vs. manual transmission.

Since the dawn of technology, the platform wars have raged. The decision to use one system over another has somehow become suggestive of a person’s character. If you’re a Mac user, you’re suddenly labelled an “Apple sheep”. If you’re a diehard Windows person, you’re painted as someone who does “real computing” – “uncreative”, a “numbers person.”

These labels serve no purpose other than to perpetuate a divide within technological circles, oft exploited by marketing teams to promote one platform over another. They’ve been used to attack not just the flaws of a platform, but the people using these tools. Most frustratingly, they obscure the fact that, whatever the characteristics of a particular platform, technology today has become so advanced that it is at times indeed indistinguishable from magic.

Here’s the thing: technological progress has been so dramatic over the past few years that it really doesn’t matter what platform you use. Especially in creative fields like design, cinema and photography: most applications are cross-platform, and the platforms themselves proffer enticing options whether you’re on macOS or Windows.

I grew up on Windows, and have programmed some significant (well, significant for me) projects. My preferred platform for the past 8 years has been macOS. I have very personal reasons – as many people do regarding the tools they choose to get the job done or to unwind with. These range from certain intricacies of macOS – the way applications are managed, the overall user interface, window management – to the robust industrial design of Mac hardware and a trackpad that has yet to let me down, one that means I don’t always have to rely on an external mouse for design-related tasks (and that even augments my mouse when designing on macOS). There’s also the comfort factor: I’ve grown very used to the Mac way of doing things. The list goes on, but it is indeed very specific to my own use case. The beauty is that I’ve been able to install the “best of both worlds” on my MacBook: I can enjoy the things I love about macOS, like the Finder and the Alfred search extension, whilst simultaneously using Visual Studio on a Windows installation through VMware to develop Windows desktop apps critical to the operation of SKKSA.

Look, we’re all entitled to our own opinions, and technology is as opinionated a field as you can get. Our lives are intricately entwined with the devices and platforms we use daily to live, to exist. So it makes sense that one becomes vehemently passionate about their platform of choice. But when that passion extends to bashing others for their choice of platform – especially without reasonable experience of said platform on which to base those opinions – it becomes a serious problem. In fact, it often says more about that person than about the platform they’re attacking, or about the character they ascribe to the people who chose it. If anything, it represents a juvenile, immature mindset; a rather closed, small-minded view of the vastness that is modern technology.

We should be excited and grateful that we have choice. More than one platform means that the developers of these tools are constantly competing to make their product better. This only benefits us, the end users.

Platform wars are a waste of time because they distract us from the beauty that is our modern, advanced systems. They distract us from actually focussing on collaborating, on creating, and on using our incredible tools to help make the world a little bit better.

Architecture + Innovation

Following what I wrote recently about the “PC takeover” of architecture, as posited by renowned Zaha Hadid Architects partner Patrik Schumacher, my interest in his sentiments was renewed when Tesla CEO Elon Musk recently announced a revolutionary new roof system.

This kind of technology is the innovation that is sorely lacking in the profession of architecture. Technical prowess has been stripped from the profession as the architect loses sight of its core aims – utilitas, firmitas, venustas (function, structure and beauty) – aims as old as Vitruvius himself. These are the pillars upon which our profession is built, yet we somehow seem to forget this as we take on ever more abstract roles as politician, social justice warrior, philosopher, bureaucrat…

Our lofty goals of achieving social justice, of shaking the foundations of dogmatic political practices and ushering in an era of collectivism, of social coherence and aesthetic and cultural harmony through our designed environments, appear noble pursuits. And no doubt they are essential, for we are in a unique position as a practice that situates itself at the intersection of the humanities and the sciences. We can balance these precarious entities through our designed interventions and an intellectual prowess powered by years of poring over precedent, theory, political studies and the philosophies that empower us as architects.

However, the technological agency that lies at the heart of our profession – the technological agency that binds the trifecta of utilitas, firmitas, venustas – is the very thing being rapidly marginalised in contemporary practice. We are being sidetracked by ever more ambiguity rather than pouring our collective talents into actually innovating the architectural technology that ultimately transforms our abstract world into the physical manifestations that form our built environments.

Musk’s development of a unique solar roofing system is exactly the kind of architectural innovation that is being “outsourced” to those outside our field. Yes, I acknowledge that as architects we are not trained in the minutiae of such technical systems; the kind of product Musk announced is the culmination of a variety of fields (industrial design, electrical engineering, manufacturing…). But we are trained in the field of ideas, and we should be the ones embracing and advocating for such advances. The Tesla + SolarCity roof tile system is inherently architectural. It ticks all of the great Vitruvius’s boxes: it is functional (highly efficient at collecting solar energy and storing it in the Tesla Powerwall), it is incredibly strong – far stronger, in fact, than traditional building materials like terracotta – and it is beautiful. That last one is particularly important: to gain mainstream traction, aesthetics are paramount.

The Tesla roofing system proposes, for the first time, a viable technology for taking buildings off the grid entirely. As architects, we are in the business of consumption – the very act of building requires consuming the earth in order to make space for our creations. The age of sustainable design is well and truly underway. Technical architectural innovation – the proposition, promotion and integration of imaginative technical ideas that further the environmentally-centred design approach that will make or break this era – is sorely needed at a time when the role of the architect as master of information is being challenged from within.

Equipment Doesn’t Define Creativity

There’s a wonderful saying that perfectly captures my thoughts on this topic. Essentially, what I believe – after three years of intensive design instruction in my undergraduate architecture degree, and throughout my various design-oriented ventures for personal work and for SKKSA – is that equipment does not dictate creativity. Indeed, it’s not what you use, but how you use it. This is where the magic happens; this is where art occurs, where the depth of the creative act becomes apparent at the hand of the craftsman.

So before I delve deeper into this topic, here’s the gist of this idea: you wouldn’t compliment a chef’s kitchen utensils if you enjoyed his meal; you would commend his skills at bringing forth a delightful gastronomic experience. Similarly, one shouldn’t say “wow, that’s a great photo. You must have an amazing camera.” Because, like the chef and his delicious meal, a beautiful photograph is the creative proof of the photographer’s skillset: of understanding light, composition, technical dexterity and that unique aspect of the creative process that transcends mere product and renders a piece “art” – judgement and intuition.

One could have the most expensive creative equipment at their disposal, but without the knowledge of how to drive these tools – without intuition and passion and a deeper, rooted understanding of the art form, whether it’s a literary work, a piece of art, a photograph or the design of a building – the resultant work would be mundane, lacking meaning and connectedness to humanity and to society, and could hardly be considered a positive contribution to the world.

All too often, in our consumeristic mindset – driven by the fast-paced nature of technology and society, and by an ever-increasing pressure to constantly produce for insatiable, all-consuming minds – we forget the magic that can arise when we look past the equipment and consider the actual act of creativity. The act of creation, of making something out of nothing, is a rather sacred thing. To render something from the mind into reality is a cornerstone of mankind’s evolution, of our ascent from mere hunter-gatherers purely concerned with survival into creators and thinkers with the potential to build entire cities and venture forth into the stars.

So these platform debates and mock-wars over which brand, product or tool is better are rather meaningless in the grander scheme. Whether you’re Windows or Mac, analogue or digital, it’s the way you use what you have to create that determines your prowess. In the end, not many will care how you created something; it’s the end product that matters to most of society. But it’s up to us, as the creators, to imbue our work with meaning and a rootedness in culture, society and history – in the precedents that provide richness and add dimension – because these are the elements that will ensure longevity in the final product. These, and not the tools used to create them, will immortalise our names and ensure our creations add value to our fellow humans.

Celestial Jukebox in the Sky: The Life and Death of iPod

My iPod Classic

The first Apple device I owned was an iPod. Specifically, the iPod with Video (fifth generation, 60GB). It’s dead now; its hard drive failed some years ago and prompted my “upgrade” to an iPod Classic – the shiny evolution of the device that changed the trajectory of Apple’s fortunes. The quirky click-wheeled ‘pod launched the company on a hyperdrive trip of success that eventually led to last month’s announcement of Apple Watch, the latest darling to enter a strong lineage of beautiful industrial design from the Cupertino behemoth.

iPod is effectively dead right now. Its death occurred on June 29, 2007, the day iPhone – a “revolutionary” device Steve Jobs had touted as a widescreen iPod, a phone and an Internet communicator, all in one – first went on sale. iPhone, and the modern smartphone revolution it inspired, led to smaller device storage capacities and thus the emergence of streaming: instead of keeping music onboard, music could be – had to be – streamed from this mystical thing called the cloud. No longer are we compelled to maintain ginormous music libraries in iTunes; no longer do we have to carry our entire collection in our pockets. Now we can have the entire world’s music library beamed to us wherever we are (provided there’s a network connection). The future is here, folks. This is the stuff writ in science fiction lore for decades: life in the connected network.

But here’s the thing: iPod was personal. iPod reflected who you were – because music is intrinsically personal, emotional, something that appeals to us all at the most basic level. iPod was a tiny mirror of your personality. But with the shift to the cloud, the rise of iPhone and of streaming services like SoundCloud, Spotify, Rdio and iTunes Radio, maintaining a large library has become a chore. The very purpose of iPod, for the mass market, is obsolete. I guess all technology has a shelf life – a fairly short one at that – and it’s impressive that the iPod Classic remained in Apple’s lineup for so long. Its last refresh was in 2009, a mere consolidation of storage capacity to 160GB (the iteration of iPod that I still own).

When Tim Cook announced 1 Infinite Loop’s new creation, Apple quietly pulled the iPod Classic from its lineup. This is a logical move for the company; earlier this year Cook even said in a quarterly earnings call that the iPod business was declining rapidly. Hell, even the accessories category is doing better than the iPod business. And the shift to streaming has caught Apple fumbling to maintain its dominance in the music industry – an industry it helped reinvent over a decade ago with this very device. Apple’s acquisition of Jimmy Iovine and Dr. Dre’s Beats Electronics is indicative of its yearning to pull itself back into the game. iTunes sales are being far outpaced by the growth of streaming subscriptions from rival services. The message from the consumer couldn’t be clearer: people don’t want to own music anymore. The small storage capacity of our beloved smartphones – space that is hotly contested by a multitude of media, from apps to videos to music – is the raison d’être of streaming services. Streaming music means more space on devices for more apps. Add to that the ability to listen to a catalog far greater than the capacity of your device, and the idea of owning an iPod Classic seems unreasonable.

This shift is reminiscent of the music industry’s transitions: from vinyl to tapes, from the Walkman to the Discman and CDs, to MP3s and the iPod. An industry susceptible to change, at the mercy of the never-ceasing flow of technological invention, will always face the challenge of maintaining its emotional connection to the human spirit – emotion is at the core of music, after all. And emotion is what many designers of music services and technologies try to imbue in their creations; creations that, by their very nature, are ephemeral.

iPod changed the game. It reflected who you were through something intangible: music. It did the impossible. It created magic. In a way, devoid of myriad features, the infinite possibilities of a canvas-like multitouch interface and a massive app store – devoid even of any network connections – iPod was the most magical device Apple created. It forged invisible yet strong connections between people through music. And its death, its yielding to a far more exciting, intense era of technological possibility, signals also the death of this singularly beautiful experience: the idea of the focussed technological device, the product that does one thing, but does it insanely great.

So, where to now? What is to become of the iPod line with the death of the Classic, its last direct descendant of the original iPod? Apple still has one more event left this year: its October event, historically dubbed the “music event” – the one reserved for announcing new iPods and a new version of iTunes. In recent years that has been replaced with iterative updates to the Mac line and new iPads. The only iPod getting any love is the touch – and rightly so. It is the only iPod that bridges the origin line (iPod) with the newer kids on the block (iPhone and iPad) through iOS. I don’t think iPod as a brand is dead. A new iPod Classic with solid-state storage and support for high-fidelity music files has been talked about a lot on Apple forums – but only by diehard audiophiles, because that is exactly to whom such a device would appeal. It’s a highly focussed segment of a small market share, hardly something a behemoth like Apple – already deep in existing, well-performing product lines and ventures into entirely new industries (high fashion with Apple Watch) – would bat an eye at. It is for this reason – a pragmatic one, coldly looking at the market-share statistics – that I think iPod Classic is dead; iPod nano, shuffle and touch will continue on to ride out the ever-diminishing sales of a brand that brought a struggling company back from near-extinction, as Apple focusses on pushing people onto iPhones and the iOS ecosystem.

Tony Fadell, “father of the iPod”, put it succinctly when he said in a recent interview with Fast.Co Design:

“It was inevitable something would take its place. You know, in 2003 or 2004, we started asking ourselves what would kill the iPod […] And even back then, at Apple, we knew it was streaming. We called it the ‘celestial jukebox in the sky.’ And we have that now: music in the cloud.”

Like many people, I love music. It’s an incredibly important part of my life. iPod was – no, is – still a fundamental part of my personal tech setup. I am sad to see the Classic go, and have (begrudgingly) come to accept that, logically, there cannot be another Classic; that’s not the direction that the world is moving in. But I hope that the experience, the magic that Apple created with iPod, remains coursing through its DNA as it shifts focus from a consumer electronics company into a lifestyle one.

F.lux makes your computer’s screen a sight for sore eyes

I wish I’d found out about F.lux sooner. After just a week with this little app, it has already transformed the way I work with my Mac during long-haul overnight sessions with looming deadlines.

Being an architecture student, I’m well-versed in the All Nighter. This phenomenon means staring at an LCD screen for hours on end, a concept that would send any optometrist into a fit. But it’s a necessary evil, something we need to do in order to get through a mountain of work.

F.lux makes this ordeal bearable.

I was compelled to download the utility after reading about it on The Sweet Setup. What F.lux does is simple, but incredibly effective. It’s based on intense research, and whilst its method is yet to be scientifically proven, I’ve personally found that it makes staring at the screen late into the night far easier than before.

F.lux adjusts your computer’s display in accordance with the time of day and your local lighting conditions. You just enter your location, and it does the rest. As the sun begins to set, your screen gradually tints to an orange-reddish hue, which means that as you get deeper into the night, you’re not staring into the obnoxious blue glow of a standard computer screen. Of course, this isn’t conducive to graphics work where colour accuracy matters, but F.lux has a series of options allowing customisation: you can, for instance, disable it for an hour, or for a specific app (like Illustrator or Photoshop). From the app’s description:

“f.lux makes your computer screen look like the room you’re in, all the time. […] Tell f.lux what kind of lighting you have, and where you live. Then forget about it. f.lux will do the rest, automatically.”

F.lux goes on to claim that it can even help you sleep better. According to the developer: “We know that night-time exposure to blue light keeps people up late. We believe that f.lux adjusts colors in a way that greatly reduces the stimulating effects of blue light at night.”

Whilst I haven’t noticed changes in my sleeping pattern (all-nighters for days here), I have found that using my Mac at night is now a lot easier.

F.lux is available to download for Windows, OS X and Linux for free. You can even get it for Android, and jailbroken iOS. Your eyes will thank you.

Technology’s Disappearing Act: Apple Watch and the Next Big Tech Era

A curious thing is happening in the technology sphere. Even as we crave mobile devices with larger screens – a consequence of ever more content deferring to the smartphone our hands are seemingly glued to – the very nature of technology is becoming… invisible.

As a long-time admirer and fan of Apple’s work, I might be expected to rave about the new Apple Watch CEO Tim Cook unveiled this week. Here’s the thing, though: I’m still skeptical about this category. Apple has successfully redefined several categories in the past, but the problem with smartwatches – with wearables in general, in fact – is that we don’t exactly know what we want from them just yet. This product type is still largely in its infancy. Can Apple truly change the game when its rules are still being debated?

But with Apple Watch we see this idea of “technology’s disappearing act” come alive. Here is a device that, unlike its competitors (Samsung Gear and Moto 360), actually seeks to address the nature of a watch in this connected age. Apple hasn’t merely taken what works at one scale (the UX of iPhone) and resized it; rather, they’ve first decided to probe the sociocultural implications of what timekeeping means. This, in essence, is what design is all about: understanding a move, critically examining your position and its effect, before executing it.

Our technology is increasingly becoming invisible, permeating almost every facet of our lives – not just entertainment, but work, play, health and fitness. Thus moves like Apple Watch – indeed the entire smartwatch category and wearable tech at large – are an exercise in disrupting a logical path. It’s about re-imagining the way technology fuses with how humans have lived for centuries, because these devices are inherently intimate: they will be more connected to your physical being than any bit of tech before. Following a purely logical trajectory of design will only result in technology that tries to solve problems that don’t actually exist – this is the conundrum I’ve been facing when thinking about this next paradigmatic shift in tech.

Wearable technology, in order to justify its raison d’être, will need to tackle this very issue: how can it enhance human life, rather than try to solve unnecessary, non-existent problems? This is where I think Apple Watch excels. Whilst I am not entirely blown away by its industrial design, I think particular design decisions made by Jony Ive and his crew set this device apart. The digital crown, the deep connection to the art of timekeeping through fun digital watch faces, and the Taptic Engine that gently pulses on one’s wrist, providing a distinctive tactile dimension, are elements that will significantly add to the user experience. I see the digital crown as a new interaction model that will define the smartwatch, just as the click wheel defined iPod. And, of course, wearable tech is intimate; it’s personal. Watches are a reflection of one’s taste, style and fashion sense. How can digital technology – ephemeral in nature, able to rapidly skin new themes and change experiences through the dynamic essence of software – be used as a conduit for letting that personality surface?

The Apple Watch is pretty, and it tackles some serious issues that others have neglected in favour of getting their products out first. But justification for its actual existence is what bothers me, and what I found lacking in Tim Cook’s delivery of his first significant product as CEO. The thing is, unlike iPod, iPhone and iPad before it, this device faces a critical challenge: that of actual necessity. Why would I want a tiny screen strapped to my wrist when a larger iPhone (which I have to carry with me anyway, since Apple Watch is highly dependent on it) provides a less frustrating experience for things like showing photos or sending messages? I think where a device like this will truly excel is in the fitness category, and even that is a fledgling arena.

When Steve Jobs introduced iPad, he spent a significant portion of the keynote, before the unveiling, explaining the iPad’s purpose: a device to fill the gap between smartphones and notebooks, Apple’s response to the then-burgeoning “netbook” craze. But with Apple Watch, Tim Cook launched straight into it after an ode to his predecessor (“one more thing”). After the slick video intro, he returned to the stage, arms held up in triumph. No explanation of the watch’s purpose, its reason – it felt like he was simply relieved to finally release his first defining product as the new CEO. It’s a problem, I think, if we cannot understand the Apple Watch’s significance: what makes it unique, and not just a response to strong competition from Samsung, LG, Motorola and the others? What purpose will define it, apart from its innovative user interface design and wealth of customisation options?

So, invisible tech…

Today, software defines our mobile experience. When the hype around new hardware dies down, it is the software that remains as the defining experiential aspect of a device. And this is where technology is beginning to shrink its physical presence and become a more intimate, invisible force. A device as personal as a wearable offers the opportunity to craft products around experiences that augment daily life, where the physical object recedes into the background, providing subtle feedback on various operations. Apple Watch’s Taptic Engine and digital crown are two elements that come to mind here.

Wearables are going to be the next decade’s smartphone: as content gets bigger and more mobile, the opportunity for daily tasks to be augmented or replaced by digital variants will only grow. It is in this sphere that wearable tech and the permeation of software into everyday objects will allow technology to effectively disappear, receding from objects that exude “tech” into more mundane guises powered by clever engineering and sleek industrial design.