Architecture + Innovation

Following what I wrote recently about the “PC takeover” of architecture, as posited by renowned Zaha Hadid Architects partner Patrik Schumacher, I found myself returning to his sentiments when Tesla CEO Elon Musk recently announced a revolutionary new roof system.

This kind of technology is the innovation that is sorely lacking in the profession of architecture. Technical prowess is being stripped from the profession as the architect loses sight of its core aims – utilitas, firmitas, venustas (function, structure and beauty) – aims as old as Vitruvius himself. These are the pillars upon which our profession is built, yet we somehow seem to forget this as we take on ever more abstract roles: politician, social justice warrior, philosopher, bureaucrat…

Our lofty goals of achieving social justice, of shaking the foundations of dogmatic political practices and ushering in an era of collectivism, of social coherence and aesthetic and cultural harmony through our designed environments appear as noble pursuits. And no doubt they are essential, for we are in a unique position as a practice that situates itself at the intersection of the humanities and the sciences. We can balance these precarious entities through our designed interventions and intellectual prowess, powered by years of poring over precedent, theory, political studies and the philosophies that empower us as architects.

However, the technological agency that lies at the heart of our profession – the agency that binds the trifecta of utilitas, firmitas, venustas – is the very thing being rapidly marginalised in contemporary practice. We are being sidetracked by ambiguity rather than pouring our collective talents into actually innovating the architectural technology that ultimately transforms our abstract world into the physical manifestations that form our built environments.

Musk’s development of a unique solar roofing system is exactly the kind of architectural innovation that is being “outsourced” to those outside our field. Yes, I acknowledge that as architects, we are not trained in the minutiae of such technical systems; the kind of product that Musk announced is the culmination of a variety of fields (industrial design, electrical engineering, manufacturing…). However, we are trained in the field of ideas. We should be the ones embracing and advocating for such advances. The Tesla + SolarCity roof tile system is the kind of product that is inherently architectural. It ticks all of the great Vitruvius’s boxes: it is functional (it is highly efficient at collecting solar energy and storing it in the Tesla Powerwall), it is incredibly strong – far stronger, in fact, than traditional building materials like terracotta – and it is beautiful. This last point is particularly important: in order to gain mainstream traction, aesthetics are paramount.

The Tesla roofing system proposes, for the first time, a viable technology for taking buildings off the grid entirely. As architects, we are in the business of consumption – the very act of building requires consuming the earth in order to make space for our creations. The age of sustainable design is well and truly underway. Technical architectural innovation – the proposition, promotion and integration of imaginative technical ideas that further the environmentally-centred design approach that will make or break this era – is sorely needed at a time when the role of the architect as master of information is being challenged from within.

Superman: Modern-Day Socrates

In Quentin Tarantino’s Kill Bill vol.2, Bill has a very interesting monologue. Perhaps the most famous monologue in the entire two-part saga.

The essence: he slices through the very nature of Superman, and argues that Clark Kent is the image Superman holds of us – as a species, as the human race.

Here’s the speech:

(edited to remove spoilers)

As you know, I’m quite keen on comic books. Especially the ones about superheroes. I find the whole mythology surrounding superheroes fascinating. Take my favorite superhero, Superman. Not a great comic book. Not particularly well-drawn. But the mythology… The mythology is not only great, it’s unique.

[…]

Now, a staple of the superhero mythology is, there’s the superhero and there’s the alter ego. Batman is actually Bruce Wayne, Spider-Man is actually Peter Parker. When that character wakes up in the morning, he’s Peter Parker. He has to put on a costume to become Spider-Man. And it is in that characteristic Superman stands alone. Superman didn’t become Superman. Superman was born Superman. When Superman wakes up in the morning, he’s Superman. His alter ego is Clark Kent. His outfit with the big red “S”, that’s the blanket he was wrapped in as a baby when the Kents found him. Those are his clothes. What Kent wears – the glasses, the business suit – that’s the costume. That’s the costume Superman wears to blend in with us. Clark Kent is how Superman views us. And what are the characteristics of Clark Kent? He’s weak… he’s unsure of himself… he’s a coward. Clark Kent is Superman’s critique on the whole human race.

–Bill, from “Kill Bill vol.2” (2004)

Digital and Analog: A Tale of Two Mediums

I watched Quentin Tarantino’s latest film, The Hateful Eight, recently. The big hype about this picture – apart from it being a Tarantino western, a surefire classic if ever there was one – is that he chose to film it in Ultra Panavision 70, a format that hasn’t been used in mainstream cinema since the ’60s. In fact, The Hateful Eight was filmed with the same lenses used in classics like Ben Hur.

Tarantino is one of the most vocal proponents of shooting on film. He believes that this traditional method is something that should be preserved, and that it adds an ineffable quality to the cinematic experience, one surely lacking from modern digital cinema.

This argument – film vs. digital – is an age-old debate, and it’s not unique to the realm of cinema. As a young architect, I have already faced it, first encountering it in my undergraduate years. Architecture is a field that constantly faces the challenges of technological progress. At either end of the spectrum (construction and design), technological development poses fundamental issues that deeply affect the very core of the practice.

The argument for an analog approach to creativity is that it brings one closer to the work. There is no denying the strength of the connection between brain and hand. So yes, when introducing a new stream of knowledge (in this case, the many aspects of architectural education), beginning with analog methods is critical. Most importantly, it gives the student a better sense of proportion, geometry and scale – aspects that are often distorted if one begins in the virtual sphere.

However, a paradox arises when we face the fact that most architecture students are educated by teachers who were themselves taught in the older craft of analog production. At the higher levels, this method is anachronistic in a highly digital world. Digitally produced work is thus frowned upon, seen as an easy-way-out approach to the creation of architecture. Yet in the “real world”, a digital production environment is critical to the bottom line – to staying relevant in a fast-paced and rapidly developing economy.

Here’s the thing: the virtual world is just another type of canvas. In many cases, digital form-making brings forth accelerated creativity as complex geometries become possible. Proponents of the analog method will argue that digital work lacks the je ne sais quoi of hand-produced work. I would rebut that digital work can be just as expressive as its analog counterpart – it all comes down to the practitioner, to how the creator wields the tools available to him.

By the end of The Hateful Eight, it didn’t really matter to me that the movie was filmed in the Ultra Panavision 70 format. Yes, it looked beautiful. And the cinematography accented an already entertaining and gripping story. But what mattered most was the experience: I cared less about how it was made, and more about how entertained I was – the final goal of any cinema. Likewise, it matters not how a piece of architecture was conceptualised. What matters is that it conveys the right information, describes the idea in the best way possible, and ultimately enriches the viewer or occupant.

Bionic Citizen

Apple Watch

Wearables are the hot topic in technology today. What once seemed like something out of a Star Trek episode is now reality. And whilst Samsung, Motorola and others are veterans (if you could call more than two years in the product category “veteran”), I truly think that Apple’s design pedigree and brand clout will be the final tipping point in solidifying this curious new niche as a distinct product line.

Positioning the Apple Watch as a piece of haute couture is a gutsy move; I have many questions about the longevity of such a device when the ephemeral (technology) is juxtaposed with the eternal (high-end watch design). It seems like a product conceived in contradiction as much as a device seeking to bridge two seemingly disparate factions of society (fashion and technology). However, it represents a progression of technology. A powerful progression, I might add: we are on the cusp of transcending the notion that technology exists as a realm separate from the organic body that is us. The Apple Watch’s ambition to be a fashion piece means it embraces the design traits distinctive of fashion: personality, individuality, intimacy.

But even more than that: wearables are a step closer to the assimilation of the inorganic — the world of binary code and cold, calculated lines of code — with the organic: us, human beings, sentient creatures who have created these strange contraptions. Are we on the brink of becoming bionic? Is it possible that, by following this progression to its logical destination, we can assume there will be a convergence: that these two worlds will properly collide; that we will become bionic creatures, beings made just as much of technology as of blood and muscle?

The singularity theory postulates a point in the distant future – an event horizon – beyond which we cannot predict, using the cognitive skills we have honed over millennia, what will occur next. The assimilation of ourselves with the technology we use can be considered such an event horizon. The current state of technological development suggests this is where we are headed, and I think that if we examine the nature of our current society, we can further understand why this might very well be the eventual outcome for our species.

We live in an increasingly connected world. The Grid rules our lives: we are beings that connect to it daily; it provides us with much of our daily intellectual sustenance. We cannot function without the Grid. Our society is hyper-connected, and devices like smartwatches illustrate how reliant on it we have become. That such a device exists signifies the ever-encroaching grip our digital lives have on us. Our lives are lived as much in the virtual sphere as they are in the realm of reality. When technology begins to disappear yet exerts an even more potent force upon our existence, we step closer towards that event horizon of the bionic being. Technology is beginning to disguise itself. It is no longer taking the visage of a “device that looks like a computer” — that traditional aesthetic is being challenged as microchips become smaller and more powerful, making it possible to embed them in objects we have been accustomed to for centuries.

As design becomes invisible, technology will insinuate itself far more easily into our daily lives, slowly tightening the grip the Grid has on our psyche. As microchips become smaller and more powerful, we will begin to slip subconsciously into a symbiotic reliance on the devices and the virtual networks we’ve created. This is going to change society in incredible ways. Everything from culture to politics to economics to the built environment will be affected. One area I’m particularly curious about is, of course, the urban architectonic. How will our bionic beings navigate the architecture of the future? How will our built structures evolve to support the new lifestyle that will emerge? How can virtual reality coexist with the concrete and steel that is so intrinsic to maintaining our human existence at the base level?

One simple device, oft derided and questioned for its very purpose, has the potential for significant sociocultural impact. For now, though, we play the waiting game: watching, patiently, as technology and society slowly merge, an intricate dance orchestrated by a plethora of parameters and the organic ebb and flow of time…

 

Originality v Hollywood: Dawn of Mediocrity?

A curious phenomenon is occurring at the centre of society’s entertainment universe. Perhaps it’s a sense of potential failure casting a net of fear around what was once a creative powerhouse. Perhaps it’s a descent into mediocrity as our collective society has embraced a sense of complacency, where banality passes for acceptable quality. Whatever it is, there can be no denying it: Hollywood appears to be running out of fresh ideas.

Instead, we’re being treated to the wonders of rehashed entertainment. I’m reminded of a line Nick Offerman’s character, Deputy Chief Hardy, delivers in 21 Jump Street (itself, ironically, a reboot of a popular television series):

“We’re reviving a canceled undercover police program from the ’80s and revamping it for modern times. You see the guys in charge of this stuff lack creativity and are completely out of ideas, so all they do now is recycle shit from the past and expect us all not to notice.”

I feel like this is exactly what an executive-led creative industry is doing. I can almost picture the suits in their corner offices somewhere in Los Angeles, cigars in hand, smug grins on their faces, signing off another reboot, knowing that our pop-obsessive society will eat it all up and fatten the studio’s bottom line. How stupid do they really think we are?

There will come a point, hopefully soon, when cinema audiences will tire of this. When we will finally open our eyes to the fact that it’s the same movie, with the actors du jour fitted snugly into a predictable plot.

Look, don’t get me wrong. I’m just as excited about the new Star Wars as the next fan. Likewise, I can’t wait to see what Marvel has in store with Avengers: Age of Ultron. I’m an (obsessive?) follower of their Agents of S.H.I.E.L.D. series, and an ardent watcher of both Arrow and The Flash, two of DC’s darling television spin-offs. These are all properties based on existing source material, whether comic books or one of the most famous cinematic franchises of all time.

However, I feel that there are talented writers out there with exciting, fresh stories yearning to be unleashed from their paper confines and brought forth onto the reflective silver screens of our cineplexes. These stories are being marginalized when studio execs opt to “play it safe” with rehashes of recently-completed rehashes (I’m looking at you, Spider-Man), with bloated adaptations of beloved source material (The Hobbit), or with the hope of capitalizing on unexpected, explosive success. In the case of this last example, I’m of course referring to the recent news that Lionsgate, chief beneficiary of the young adult dystopian fiction adaptation fad, is considering continuing the Hunger Games stories beyond the books. As a fan of the series, its cast and its wonderful director, I sincerely hope this will not materialize. Whilst it would be great to see more of the world that Katniss inhabits, and whilst the last book left much to be desired in terms of an ending, the stories should just be left alone. Hollywood needs to learn about a story’s limits. It needs to learn how to let go.

At the end of the day, we as cinemagoers make the final decision. We have a choice about what we want to watch. That’s the great thing about cinema: we live in an era of so many possibilities; we’re spoiled for choice, essentially. We can choose whether we feel like watching an inventive story like Birdman, or rekindling some nostalgic feels with a viewing of a Godzilla (or Ghostbusters or Robocop or Terminator) reboot. The thing I truly wish for, though, is for original stories to receive the same level of care and treatment that these existing, beloved properties are currently getting.

Equipment Doesn’t Define Creativity

There’s this wonderful saying that perfectly captures my thoughts on this topic. Essentially, what I believe, after going through three years of intensive design instruction in my undergraduate architecture degree, and throughout my various design-oriented ventures for personal work and for SKKSA, is that equipment does not dictate creativity. Indeed, it’s not what you use, but how you use it. This is where the magic happens; this is the act of art, where the depth of the creative act becomes apparent at the hand of the craftsman.

So before I delve deeper into this topic, here’s the gist of this idea: you wouldn’t compliment a chef’s kitchen utensils if you enjoyed his meal; you would commend his skills at bringing forth a delightful gastronomic experience. Similarly, one shouldn’t say “wow, that’s a great photo. You must have an amazing camera.” Because, like the chef and his delicious meal, a beautiful photograph is the creative proof of the photographer’s skillset: of understanding light, composition, technical dexterity and that unique aspect of the creative process that transcends mere product and renders a piece “art” – judgement and intuition.

One could have the most expensive creative equipment at their disposal, but without the knowledge of how to drive these tools, without intuition and passion and a deeper, rooted understanding of the art form – whether it’s a literary work, a piece of art, a photograph or the design of a building – the resultant work would be mundane, lacking a sense of meaning and connectedness to humanity and society, and hardly a positive contribution to the world.

All too often, in our consumeristic mindset, driven by the fast-paced nature of technology and society and an ever-increasing pressure to constantly produce for insatiable, all-consuming minds, we forget the magic that can arise when we look beyond the equipment and consider the actual act of creativity. The act of creation, of making something out of nothing, is a rather sacred thing. To render something from the mind into reality is a cornerstone of mankind’s evolution, of our ascent from mere hunter-gatherers purely concerned with survival into creators and thinkers with the potential to build entire cities and venture forth into the stars.

So these platform debates and mock-wars over which brand, product or tool is better are rather meaningless in the grander scheme. Whether you’re Windows or Mac, analogue or digital, it’s the way you use what you have to create that determines your prowess. In the end, not many will care how you created it; it’s the end product that matters to most of society. But it’s up to us, as the creators, to imbue our work with meaning and a rootedness in culture, society and history – in the precedents that provide richness and add dimension – because these are the elements that will ensure longevity in the final product. These, and not the tools used to create them, will immortalise our names and ensure our creations add value to our fellow humans.

Celestial Jukebox in the Sky: The Life and Death of iPod

My iPod Classic

The first Apple device I owned was an iPod. Specifically, the iPod with Video (fifth generation, 60GB). It’s dead now; its hard drive failed some years ago and prompted my “upgrade” to an iPod Classic – the shiny evolution of the device that changed the trajectory of Apple’s fortunes. The quirky click-wheeled ’pod launched the company on a hyperdrive trip of success that eventually led to last month’s announcement of Apple Watch, the latest darling to enter a strong lineage of beautiful industrial design from the Cupertino behemoth.

iPod is effectively dead right now. Its death occurred on January 9, 2007, when Steve Jobs announced iPhone, a “revolutionary” device touted as a “wireless communicator, Internet device and widescreen iPod – all in one.” iPhone, and the modern smartphone revolution it inspired, led to smaller device storage capacities and thus the emergence of streaming: instead of keeping music onboard, music could be – had to be – streamed from this mystical thing called the cloud. No longer are we compelled to maintain ginormous music libraries in iTunes, no longer do we have to carry our entire collection in our pockets: now, we can have the entire world’s music library beamed to us wherever we are (provided there’s a network). The future is here, folks. This is the stuff writ in science fiction lore for decades: life in the connected network.

But here’s the thing: iPod was personal. iPod reflected who you were – because music is intrinsically personal, emotional, something that appeals to us all on the most base level. iPod was a tiny mirror of your personality. But with this shift to the cloud, the rise of iPhone and streaming services like SoundCloud, Spotify, Rdio and iTunes Radio, maintaining large libraries is a chore. The very purpose of iPod for the mass market is obsolete. I guess all technology has a shelf life – a fairly short shelf life at that – and it’s impressive that the iPod Classic remained in Apple’s lineup for so long. Its last refresh was in 2009, a mere consolidation of storage capacity to 160GB (that’s the iteration of iPod that I still own).

When Tim Cook announced 1 Infinite Loop’s new creation, Apple quietly pulled the iPod Classic from its lineup. This is a logical move for the company; earlier this year Cook even said in a quarterly earnings call that the iPod business was declining rapidly. Hell, even the accessories category is doing better than the iPod business. And the shift to streaming has caught Apple fumbling to maintain its dominance in the music industry – an industry it helped reinvent over a decade ago with this very device. Apple’s acquisition of Jimmy Iovine and Dr. Dre’s Beats Electronics is indicative of its yearning to pull itself back into the game. iTunes sales are lagging far behind the growth of streaming subscriptions from rival services. The message from the consumer couldn’t be clearer: people don’t want to own music anymore. Small storage capacities on beloved smartphones – space that is hotly contested by a multitude of media, from apps to videos to music – justify the raison d’être of streaming services. Streaming music means more space on devices for more apps. Add to that the ability to listen to a catalog far greater than the capacity of your device, and the idea of owning an iPod Classic seems unreasonable.

This shift is reminiscent of the music industry’s earlier transitions: from vinyl to tape, from the Walkman to the Discman and CDs, and on to MP3s and the iPod. An industry susceptible to change, at the mercy of the never-ceasing flow of technological invention, will always face the challenge of maintaining its emotional connection to the human spirit – emotion is at the core of music, after all. And emotion is what many designers of music services and technologies try to imbue in their creations, creations that by their very nature are ephemeral.

iPod changed the game. It reflected who you were through something intangible: music. It did the impossible. It created magic. In a way, devoid of myriad features, the infinite possibilities of a canvas-like multitouch interface and a massive app store – devoid even of any network connections – iPod was the most magical device Apple created. It forged invisible yet strong connections between people through music. And its death, its yielding to a far more exciting, intense era of technological possibility, signals also the death of this singularly beautiful experience: the idea of the focussed technological device, the product that does one thing, but does it insanely great.

So, where to now? What is to become of the iPod line with the death of its last great descendant, its direct link to the original iPod? Apple still has one more event left this year: its October event, historically dubbed the “music event” – the one reserved for announcing new iPods and a new version of iTunes. In recent years that has been replaced with iterative updates to the Mac line, and new iPads. The only iPod getting any love is the Touch – and rightly so. It is the only iPod that bridges the origin line (iPod) with the newer kids on the block (iPhone and iPad) through iOS. I don’t think iPod as a brand is dead. A new iPod Classic with a solid-state drive and support for high-fidelity music files has been talked about a lot on Apple forums – but only by diehard audiophiles, because that is exactly to whom this kind of device would appeal. It’s a highly focussed segment of a small market share, and hardly something that a behemoth like Apple – already deep in existing, well-performing product lines and venturing into entirely new industries (high fashion with Apple Watch) – would even bat an eye at. It is for this reason – a pragmatic one, coldly looking at the statistics of market share – that I think iPod Classic is dead; iPod nano, shuffle and touch will continue on to ride out the ever-diminishing sales of a brand that brought a struggling company back from near-extinction, as Apple focusses on pushing people onto iPhones and the iOS ecosystem.

Tony Fadell, “father of the iPod”, put it succinctly when he said in a recent interview with Fast.Co Design:

“It was inevitable something would take its place. You know, in 2003 or 2004, we started asking ourselves what would kill the iPod […] And even back then, at Apple, we knew it was streaming. We called it the ‘celestial jukebox in the sky.’ And we have that now: music in the cloud.”

Like many people, I love music. It’s an incredibly important part of my life. iPod was – no, is – still a fundamental part of my personal tech setup. I am sad to see the Classic go, and have (begrudgingly) come to accept that, logically, there cannot be another Classic; that’s not the direction the world is moving in. But I hope that the experience, the magic that Apple created with iPod, remains coursing through the company’s DNA as it shifts from being a consumer electronics company into a lifestyle one.