The Post-Truth Era

To say we’re living in a complex world would be an understatement. Complexity and contradiction are the pervading forces of contemporary society. So it should come as no surprise that something rather peculiar, yet seemingly fitting, would emerge from such a unique epoch as this.

The ancient mathematician Pythagoras once said:

Reason is immortal, all else mortal.

Unfortunately, our era has somehow managed to kill rationality. In its place, we have inserted “feelings”. Welcome to the Post-Truth Era.

Post-truth is becoming the new buzzword in the world of politics – specifically around the 2016 US Presidential election. Trump’s ability to use emotive language to pass known falsehoods off as facts has been at the core of his notorious rise in popularity. The Economist has a wonderful article that examines this phenomenon from a political angle.

However, I feel that this idea of post-truth is infiltrating other parts of our society. I’m certainly not arguing for the abolition of emotion, or for the cultivation of a generation of stone-faced, unemotional robots (although, let’s face it, robots would do a far better job at this civilisation thing than we humans have in the last few decades). But the replacement of all rational thought by pure emotionalism has called into question our ability to think critically, to closely examine what’s being presented to us.

Rationality doesn’t sound like much fun. The word seems to demand that you actually use that computer-thing encased in your skull to do a bit of intellectual work. Emotion, by contrast, conjures soft ideas of pure idealism, of hope, of an essentially cleaner path to seeking truth. And yes, emotion is a crucial part of what makes us human, of what defines our character and our compassion for fellow humans. So it’s perfectly fine in certain, more social situations.

But when it comes to critical matters that affect society – politics, but also ideas, debates and discussions around epistemology, ontological arguments, education, the state of our nation – it’s absolutely crucial that we approach these topics from a critical, rational viewpoint. It’s inevitable that our emotional side will, to some extent, factor into our opinions and our reception of others’ opinions. The challenge lies in listening to the opposing or other view, then processing it with a critical sensibility. Or, at the very least, in analysing it critically before passing any emotional judgements.

A lot of what’s happening in our society – both globally and in the local context (South Africa, and the global south, in my case) – is a direct result of irrationality overtaking our sense of judgement on multifaceted and interlinked issues. It’s when we let our emotions take control that we become vulnerable to the Thought Police (an issue all of its own), who will then proceed to slice and dice our very language until it resembles a form that is emotionally sensitive to every single issue affecting every human being, thus emptying it of any credibility, logic or rationality.

Post-truth operates through a series of logical fallacies that inject emotive propaganda, aimed directly at inciting one to make decisions with their heart and not their head. In our constant effort to seek truth, to understand our world and the complexities and intricacies of our society, we need to actually think first. In this era of digital noise, where we are susceptible to a swarm of emotion, of mindless chatter and the sharing of the minutiae of every person’s daily life, have we become so intellectually drained through technology that we’ve forgotten this very primal human trait?

Truth lies in the world around us.

– Aristotle (384–322 BCE)

Platform Wars are a Waste of Time

Mac vs PC. iOS vs Android. Automatic Transmission vs. Manual Transmission.

Since the dawn of technology, the platform wars have raged. The decision to use one system over another has somehow become suggestive of a person’s character. If you’re a Mac user, you’re suddenly labelled an “Apple sheep”. If you’re a diehard Windows person, you’re cast as someone who does “real computing”: “uncreative”, a “numbers person”.

These labels serve no purpose other than to perpetuate a divide within technological circles, oft exploited by marketing teams to propagate one platform over another. They’ve been used to attack not just the flaws of a platform, but the people using these tools. Most frustratingly, they obscure the fact that no matter the characteristics of a particular platform, technology today has become so advanced that it is sometimes indeed indistinguishable from magic.

Here’s the thing: technological progress has been so dramatic over the past few years that it really doesn’t matter what platform you use. Especially in creative fields like design, cinema and photography: most applications are cross-platform, and the platforms themselves proffer enticing options whether you’re on macOS or Windows.

I grew up on Windows, and have programmed some significant (well, significant for me) projects on it. My preferred platform for the past eight years, however, has been macOS. I have very personal reasons – as many people do regarding their tools of choice for getting the job done or unwinding. They range from certain intricacies of macOS – the way applications are managed, the overall user interface, window management – to the robust industrial design of Mac hardware, and a trackpad that has yet to let me down, meaning I don’t always have to rely on an external mouse for most design-related tasks (it even augments my mouse when I’m designing on macOS). There’s also the comfort factor: I’ve grown very used to the Mac way of doing things. The list goes on, but it is very specific to my own use case. The beauty is that I’ve been able to install the “best of both worlds” on my MacBook: I can enjoy the things I love about macOS, like the Finder and the Alfred search extension, whilst simultaneously using Visual Studio on a Windows installation through VMware to develop Windows desktop apps critical to the operation of SKKSA.

Look, we’re all entitled to our own opinions. And technology is as opinionated a field as you can get. Our lives are intricately entwined with the devices and platforms we use daily to live, to exist. So it makes sense that one becomes vehemently passionate about their platform of choice. But when that passion extends to bashing others for their choice of platform – especially without reasonable experience of said platform on which to base an opinion – then it becomes a serious problem. Such attacks often say more about the attacker than about their target’s platform, or the character supposedly associated with choosing it. If anything, they represent a juvenile, immature mindset; a closed, small-minded view of the vastness that is modern technology.

We should be excited and grateful that we have choice. More than one platform means that the developers of these tools are constantly competing to make their product better. This only benefits us, the end users.

Platform wars are a waste of time because they distract us from the beauty of our modern, advanced systems. They distract us from focussing on collaborating, on creating, and on using our incredible tools to help make the world a little bit better.

Utopian Illusion

“The society we have described can never grow into a reality or see the light of day, and there will be no end to the troubles of states, or indeed, my dear Glaucon, of humanity itself, till philosophers become rulers in this world, or till those we now call kings and rulers really and truly become philosophers, and political power and philosophy thus come into the same hands.”

Plato, The Republic

There are many illusions that exist like a thin veil obscuring the reality of society. Political correctness is just another layer that serves as a distraction from the bleak truths we are sometimes afraid to confront. Chief amongst those illusions is the notion that freedom, equality, liberty – these ideals we hold so dear to our sacred conception of “democracy” – can bring about a utopian society where everyone’s needs are satisfied.

Utopia cannot exist because it is the product of human creation. We are a flawed species, and thus any system we invent will inherently have its problems. Just as there is no ideal form of government or any single, perfect philosophy, there can only be the pursuit of the utopian ideal – never its true attainment.

Things in nature exist in duality; for there to be good, bad must exist as its counter-balance. For there to be day, there must also be night to give it meaning. It’s a sentiment best captured in the ancient wisdom of the Tao Te Ching:

“Everyone recognizes beauty

only because of ugliness

Everyone recognizes virtue

only because of sin”

– Lao Tzu, Tao Te Ching (Verse 2)

We would only be fooling ourselves further if we believed in the illusion of a utopian society. Plato himself, the paragon of Western philosophy, denounced the notion that the democratic state is the apogee of government. Our society is far too complex, nuanced and multifaceted to be easily controlled by a single system.

Once we can accept these limitations and embrace the complexity of modern society – understanding that nothing will be perfect, and nothing can be perfect – we will truly begin to move forward. The stagnation felt by many, as we struggle through a world seemingly wrought with inequality, despair and hopelessness, with unfair economic systems that only widen the class gaps, leads to this yearning for the antithesis of the dim present: the utopian dream.

Utopia, and its sibling, perfection, are ideals to strive toward, not something we can ever truly grasp. It’s like the notion of design as a function of infinity: something with no end in and of itself. Utopia is a state that will forever be held just beyond our grasp, even as we progress towards it.

“One may look for fulfillment in this world

but his longings will never be exhausted

The only thing he ever finds

is that he himself is exhausted.”

– Lao Tzu, Tao Te Ching (Verse 2)

The Business of Aesthetics

Patrik Schumacher, partner at Zaha Hadid Architects, recently took to Facebook to voice his opinions on Alejandro Aravena’s Pritzker Prize win earlier this year. His formidable position in contemporary architectural discourse – his work on parametricism as a style, and his collaboration with the “queen of the curve”, the late, great Dame Zaha Hadid – lends a certain gravitas to his sentiments, and has reignited the debate over architecture’s societal role:

The PC takeover of architecture is complete: Pritzker Prize mutates into a prize for humanitarian work. The role of the architect is now ‘to serve greater social and humanitarian needs’ and the new Laureate is hailed for ‘tackling the global housing crisis’ and for his concern for the underprivileged. Architecture loses its specific societal task and responsibility, architectural innovation is replaced by the demonstration of noble intentions and the discipline’s criteria of success and excellence dissolve in the vague do-good-feel-good pursuit of ‘social justice’.

– Patrik Schumacher, ZHA

Architecture as a practice has long sought to root itself within a societal discourse. And rightly so, for the artefacts it produces stand as anchors in time, reflections of the zeitgeist, and responses to various social flows – the flows of people, of money, of technology and of power.

Yet at its core, I believe, architecture holds firmly to the business of producing artefacts. For, once stripped of all the intellectual mist that surrounds a piece of architecture, what remains – the concrete and brick and mortar that form the geometries so intricately laboured over by the practitioner – lies firmly within the realm of aesthetics. It is an artefact, an object created to appeal, at its very base level, to certain rules of beauty that have been argued over for millennia.

In pop culture, the way something looks is paramount to its success. Let’s not kid ourselves about this. The aesthetic conception pervades contemporary society; it’s the veil that draws one in to whatever intellectual substance (or “abstract beauty”) lies behind it.

Perhaps there is another debate lurking here – what is it that defines beauty? Ideas of cultural bias, historically prejudiced views, mathematical proofs and geometric arguments all play pivotal roles in that discussion. But that’s not the point of today’s post.

My argument is that, in a world that has become susceptible to politically correct language, it is very easy for discourse around architecture to become dramatically defensive and deny the unavoidable (if perhaps harsh) truth: that aesthetics is the name of our game. We should rather embrace this discourse, and begin to tamper with it: to engage in the idea of aesthetics being a crucial part of architecture, and to interrogate its various virtues and disadvantages.

Architects are not politicians. We’re not activists, nor are we philosophers. Yes, we may harbour sentiments that are shared with these other groups, but at the heart of our profession is a desire to shape worlds, through imagination and the pursuit of the creative spirit. We are ever aware of the gravitas that underscores our duty to society, yet that doesn’t mean we can’t also have a little fun too.

Bionic Citizen

Wearables are the hot topic in technology today. What once seemed like something out of a Star Trek episode is now reality. And whilst Samsung, Motorola and others are veterans (if you can call more than two years in a product category “veteran”), I truly think that Apple’s design clout and respected brand will be the final tipping point in solidifying this curious new niche as a distinct product line.

The Apple Watch’s positioning of itself as a piece of haute couture is a bold move; I have many questions about the longevity of a device in which the ephemeral (technology) is juxtaposed with the eternal (high-end watch design). It seems like a product conceived of contradiction as much as it is a device seeking to bridge two seemingly disparate factions of society: fashion and technology. But it also represents a progression of technology – a powerful one, I might add: we are on the cusp of transcending the notion that technology exists in a realm separate from the organic body that is us. The Apple Watch’s ambition to be a fashion piece means it embraces the design traits distinctive of fashion: personality, individuality, intimacy.

But even more than that: wearables are a step closer to the assimilation of the inorganic – the world of binary code and cold, calculated lines of code – with the organic: us, human beings, sentient creatures who created these strange contraptions. Are we on the brink of becoming bionic? Is it possible that, by following this progression to its logical destination, there will be a convergence – that these two worlds will properly collide, and that we will become bionic creatures, beings made as much of technology as of blood and muscle?

Singularity theory postulates a point in the distant future, an event horizon, beyond which we cannot predict, with the cognitive skills we have honed over millennia, what will occur next. The assimilation of ourselves with the technology we use can be considered such an event horizon. The current state of technological development suggests that this is where we’re heading, and if we examine the nature of our current society, we can further understand why this might very well be the eventual outcome of our species.

We live in an increasingly connected world. The Grid rules our lives: we are beings that connect to it daily; it provides a large amount of our daily intellectual sustenance. We cannot function without the Grid. Our society is hyper-connected, and devices like smartwatches illustrate just how reliant we are on it. That such a device exists signifies the ever-encroaching grip our digital lives have on us. Our lives are lived as much in the virtual sphere as in the realm of reality. When technology begins to disappear yet exert an even more potent force upon our existence, we step closer towards that event horizon of the bionic being. Technology is beginning to disguise itself. It no longer takes the visage of a “device that looks like a computer” – that traditional aesthetic is being challenged as microchips get smaller and more powerful, making it possible to embed them in objects we have been accustomed to for centuries.

As design becomes invisible, it will allow technology to ingratiate itself far more easily into our daily lives, slowly tightening the grip the Grid has on our psyche. As microchips become smaller and their power more potent, we will begin to subconsciously slip into a symbiotic reliance with the devices and the virtual networks we’ve created. This is going to change society in incredible ways. Everything from culture to politics to economics to the built environment will be affected. One area I’m particularly curious about is, of course, the urban architectonic. How will our bionic beings navigate an architecture of the future? How will our built structures evolve to support the new lifestyle that will emerge? How can virtual reality coexist with the concrete and steel that is so intrinsic to maintaining our human existence at the base level?

One simple device, oft derided and questioned for its very purpose, has the potential for significant sociocultural impact. For now, though, we play the waiting game: watching, patiently, as technology and society slowly merge – an intricate dance orchestrated by a plethora of parameters and the organic ebb and flow of time…

 

Originality v Hollywood: Dawn of Mediocrity?

A curious phenomenon is occurring at the centre of society’s entertainment universe. Perhaps it’s a sense of potential failure casting a net of fear around what was once a creative powerhouse. Perhaps it’s a descent into mediocrity, as our collective society has embraced a sense of complacency, where banality passes for acceptable quality. Whatever it is, there can be no denying it: Hollywood appears to be running out of fresh ideas.

Instead, we’re being treated to the wonders of rehashed entertainment. I’m reminded of a line Nick Offerman’s character, Deputy Chief Hardy, delivers in 21 Jump Street (ironically, itself a reboot of a popular television series):

“We’re reviving a canceled undercover police program from the ’80s and revamping it for modern times. You see the guys in charge of this stuff lack creativity and are completely out of ideas, so all they do now is recycle shit from the past and expect us all not to notice.”

I feel like this is exactly what an executive-led creative industry is doing. I can almost picture the suits in their corner offices somewhere in Los Angeles, cigar in hand, smug grin on their faces, signing-off another reboot, knowing that our pop-obsessive society will eat this all up and fatten the studio’s bottom line. How stupid do they really think we are?

There will come a point, hopefully soon, when cinema audiences tire of this. When we will finally open our eyes to the fact that it’s the same movie, with the actors du jour fitted snugly into a predictable plot.

Look, don’t get me wrong. I’m just as excited about the new Star Wars as the next fan. Likewise, I can’t wait to see what Marvel has in store with Avengers: Age of Ultron. I’m an (obsessive?) follower of their Agents of S.H.I.E.L.D. series, and an ardent watcher of both Arrow and The Flash, two of DC’s darling television spin-offs. These are all properties based on existing source material, whether comic books or one of the most famous cinematic franchises of all time.

However, I feel that there are talented writers out there with exciting, fresh stories yearning to be unleashed from their paper bindings and brought forth onto the reflective silver screens of our cineplexes. These stories are marginalised when studio execs opt to “play it safe” with rehashes of recently-completed rehashes (I’m looking at you, Spider-Man), with bloated adaptations of beloved source material (The Hobbit), or with the hope of capitalising on unexpected, explosive success. In the case of this last example, I’m of course referring to the recent news that Lionsgate, beneficiary of the young-adult dystopian fiction adaptation fad, is considering continuing the Hunger Games stories beyond the books. As a fan of the series, its cast and its wonderful director, I sincerely hope this will not materialise. It would be great to see more of the world Katniss inhabits, and the last book did leave much to be desired in terms of an ending, but the stories should just be left alone. Hollywood needs to learn about a story’s limits. It needs to learn how to let go.

At the end of the day, we as cinemagoers make the final decision. We have a choice about what we want to watch. That’s the great thing about cinema: we live in an era of so many possibilities; we’re spoiled for choice, essentially. We can choose whether we feel like watching an inventive story like Birdman, or rekindling some nostalgic feels with a viewing of a Godzilla (or Ghostbusters or Robocop or Terminator) reboot. The thing I truly wish for, though, is for original stories to receive the same level of care and treatment that these existing, beloved properties are currently getting.

Equipment Doesn’t Define Creativity

There’s this wonderful saying that perfectly captures my thoughts on this topic. Essentially, what I believe, after going through three years of intensive design instruction in my undergraduate architecture degree, and throughout my various design-oriented ventures for personal work and for SKKSA, is that equipment does not dictate creativity. Indeed, it’s not what you use, but how you use it. This is where the magic happens; this is the act of art, where the depth of the creative act becomes apparent at the hand of the craftsman.

So before I delve deeper into this topic, here’s the gist of this idea: you wouldn’t compliment a chef’s kitchen utensils if you enjoyed his meal; you would commend his skills at bringing forth a delightful gastronomic experience. Similarly, one shouldn’t say “wow, that’s a great photo. You must have an amazing camera.” Because, like the chef and his delicious meal, a beautiful photograph is the creative proof of the photographer’s skillset: of understanding light, composition, technical dexterity and that unique aspect of the creative process that transcends mere product and renders a piece “art” – judgement and intuition.

One could have the most expensive creative equipment at their disposal, but without the knowledge of how to drive these tools – without intuition, passion and a deeper, rooted understanding of the art form, whether it’s a literary work, a piece of art, a photograph or the design of a building – the resultant work would be mundane, lacking any sense of meaning or connectedness to humanity and society, and could hardly be considered a positive contribution to the world.

All too often, in our consumeristic mindset – driven by the fast-paced nature of technology and society, and an ever-increasing pressure to constantly produce for insatiable, all-consuming minds – we forget the magic that can arise when we look beyond equipment and consider the actual act of creativity. The act of creation, of making something out of nothing, is a rather sacred thing. To render something from the mind into reality is a cornerstone of mankind’s evolution, of our ascent from mere hunter-gatherers purely concerned with survival into creators and thinkers with the potential to build entire cities and venture forth into the stars.

So these platform debates and mock-wars over which brand, product or tool is better are rather meaningless in the grander scheme. Whether you’re Windows or Mac, analogue or digital, it’s the way you use what you have to create that determines your prowess. In the end, not many will care how you created it; it’s the end product that matters to most of society. But it’s up to us, as the creators, to imbue our work with meaning and a rootedness in culture, society and history – the precedents that provide richness and add dimension – because these are the elements that will ensure longevity in the final product. These, and not the tools used to create them, will immortalise our names and ensure our creations add value to our fellow humans.