The Dark Knight; Avatar; The Age of Innocence; The Wolf of Wall Street
It has been five years since the release of Side by Side, Chris Kenneally’s documentary on the digital filmmaking revolution, as told by Hollywood’s top directors, cinematographers, editors and executives. The question at the center of the film is the same question facing the world today: What are the consequences of the digital revolution?
Hollywood was a forerunner in adopting digital technology, as studios and filmmakers alike pushed to develop better tools to realize their vision onscreen. As such, Side by Side has become a fascinating time capsule from 2012 when filmmakers were grappling with questions that echo our current dilemmas: With so much digital information, do we have enough time to think through our choices? Can people distinguish between what is real and what is fake? If so, how well? Are we more or less engaged with our lives through digital technology? Is our quality of life made better or worse by this ubiquitous invention? The documentary is a blueprint for digital modernization that takes stock of what we’re gaining as a society, and what we may have lost.
Atonement
There are two definitions of revolution which are, on the surface, at odds. The first sees a revolution as a physical rotation or orbit with a return to the point of departure. The second definition is a permanent, extraordinary departure from one way of life into the unknown. This inherent contradiction in definitions makes it challenging to forecast when you’re in the midst of sweeping change. When you leave the house in the morning are you coming back, or are you leaving forever? Side by Side illustrates how technological revolution is a departure and a round trip at the same time.
At its core, the digital takeover in Hollywood was driven by economics. Traditionally, filmmaking was expensive and labor-intensive. The cost of film stock alone was prohibitive to independent directors. The delays and technical issues that arose on film shoots were often a result of the limitations of physical film. As such, studios and corporations had long been in the business of developing more reliable methods for film production and delivering them to the film community for testing and feedback.
The other driver of the digital takeover was artistic vision. Action films are reliant on visual effects. Directors such as George Lucas and James Cameron were frustrated by the limitations of celluloid. They led the way in developing hardware and software to bring their futuristic visions to the screen. The result has been a permanent departure from making movies in the traditional way, with each advancement in digital technology taking the industry farther afield of historical norms.
Sin City: A Dame to Kill For
Once digital recording passed muster with enough filmmakers, studios pushed to use the technology on all films as a cost-saving measure. This set in motion a disruption of the traditional film production model and permanently impacted every aspect of the process from development to projection. For some in the industry, technological advancement was an inevitable learning process. Each new tool or skill brought people back to their job wiser and better equipped. For others, advancement carried them away from a beloved art form into new territory and sacrificed everything they couldn’t bring with them.
Filmmakers featured in Side by Side have unique processes and points of view, but they all agree on one issue: those who wanted to work in one format or the other had to find each other. A director who wants to shoot on digital isn’t going to work with a cinematographer who only shoots on film. When you apply this notion to society as a whole, the current polarization of America makes sense. Americans best served by digital advancement are largely unconcerned with who is left behind, taking the general view that there is always loss with gain. Meanwhile, Americans ignored or harmed by technological advancement assert that it’s not advancement if it’s not inclusive; that there are costs associated with progress; that sacrificing people for technology still harms those individuals, even if it benefits society as a whole. Like-minded individuals band together, and the digital revolution has thus created two polarized camps. Both want their country to succeed, but they’re pitted against each other because their definitions of success are at odds. The mere existence of digital technology divides us even when our ultimate goal is the same.
Star Wars
In Side by Side, it’s striking that those who advocate for celluloid describe it in futuristic terms. There’s a wonderful stretch of interviews with directors, cinematographers and actors describing a shoot day with film. They note the distinctive sound of the “money” running through the camera that ups the tension on set. Richard Linklater likens it to an athletic event, where participants mentally and physically prepare for a heightened moment of performance and then…Action! Words like “magic” and “leap of faith” are used to refer to the act of recording on film with the same kind of awe one might reserve for flying cars or teleportation. The sentimental language of people who are making a visionary plea is now used to entreat listeners to buy into history. This is a tipping point on the arc of a revolution. Where we once romanticized the future, now we romanticize the past.
The Social Network
Lucas, Cameron, David Fincher, Danny Boyle and Robert Rodriguez all speak convincingly to the massive benefits of digital filmmaking. Lucas describes the antiquated process of color-timing, which has now been replaced by the entirely new art form of digital color grading. Fincher recalls an issue with camera weight when filming the rowing scene in The Social Network, and how a 5.5-pound digital camera made his impossible shot possible. Rodriguez says he wouldn’t have attempted to make the comic book thriller Sin City without the myriad freedoms afforded by digital manipulation; the movie simply wouldn’t exist.
In perhaps the most compelling testimony, Boyle vividly describes how smaller digital cameras interacted with his actors on the streets of Mumbai in Slumdog Millionaire. His DP, Anthony Dod Mantle, was free to roam in and around the sets, improvising with angles and capturing images with a kind of intimacy that was previously unattainable with cumbersome film cameras. Mantle won an Academy Award for Slumdog, the first ever awarded to a film with digital cinematography.
The counterargument to these digital discoveries, however, is stark. Christopher Nolan, Martin Scorsese, Wally Pfister and others are vocal about the loss of realism with so much image manipulation. They discuss the importance of slower pacing during the filmmaking process, and how the encumbrances of physical film force necessary pauses in the creative process. Where filmmakers once shot scenes in two-minute bursts and broke to reload the cameras, now digital cameras run without cutting. People are always “on.” This is frustrating for some actors (Robert Downey, Jr., Keanu Reeves) and welcomed by others (John Malkovich).
Scorsese and Nolan indirectly raise the question of whether there’s enough room to think, focus, and make good decisions on the timeline dictated by digital technology (a question Americans ask daily, both of themselves and their tweeting president). Listening to their reasoning, it seems incredibly foolish to argue with genius, yet five years on we know that’s precisely what studios have done. Scorsese’s last two films, The Wolf of Wall Street and Silence, were a hybrid of film and digital shots. In 2014, Paramount announced it would no longer release movies on film. Undoubtedly other studios will follow suit. Nolan is the high-profile holdout. He will release Dunkirk this year, which Hoyte van Hoytema shot (by all accounts, magnificently) on 65mm film.
Anne V. Coates, the celebrated editor whose career has spanned 70 years, is eloquent on the broader impact of working at digital speed. She makes an excellent case that the automation of the editing process delivers less-considered work and has all but eliminated happy accidents. For example, Lawrence of Arabia (for which she won an Academy Award) includes a scene in which Lawrence blows out a match and then cuts directly to the sunset over the desert. The cut delivers a startling, thrilling visual. Coates observes that a dissolve was originally written in the script and if she’d been editing the film digitally the transition would’ve been added automatically. Instead, she was working with physical film that required manually cutting the film strips and taping them together, so the first edit had the film clips “butted together” without any transition added. When they watched the results of that first cut through the machine…“Magic.”
Lawrence of Arabia
Early adopters of digital technology — Lucas, Cameron, Rodriguez, the Wachowskis, et al. — are known for inventing their worlds; much of their work is futuristic and fantastical. Early defenders of shooting on film — Scorsese, Soderbergh, Nolan — typically apply their vision to the world as it is and explore stories of the past and the present. From one angle, these groups can be boiled down to “fake versus real.” In a fake world, the audience is treated to superhuman visuals and challenged to think beyond corporeal limitations. In realistic films, audiences watch drama or comedy unfold between recognizably limited characters and are offered a touchstone for processing their own lives. Both of these experiences are powerful. Both have value. In 2017, only one is thriving.
Out of Sight
What has the changeover from film to digital cost us in terms of emotional depth? For me, the difference is palpable if not measurable. Even the work of visionaries like Lucas and Cameron has suffered slightly. Some of the most exhilarating moments of Titanic came from the grainy, film-shot underwater footage of the ship itself. The visual experience of watching film, versus digitally-shot footage, is shades closer to real life. Those scenes anchored the film emotionally (if not literally).
Meanwhile, Avatar was a visually stunning experience but it didn’t leave emotional fingerprints the way Titanic did. Similarly, I loved Star Wars before most of today’s technology was available and I don’t like what was done to the original films with the technology that has been developed since. There is an emotional connection to what we recognize as real. From theater to film to television to digital streaming, we’ve stepped farther and farther back from flesh and blood experience, ever-widening the space for others to reach in and manipulate what we see. The more we watch digitally perfected images, the less satisfied we become with real life, and the less prone we are to connect with it emotionally.
In 2017, these shades of the fake/real divide are central to digital’s impact on our political process. While politicians and pundits argue over what is real and what is fake, consumers of the information are less and less able to discern between the two on their own. It’s the information version of photoshopped models. When an altered image is presented to millions of people as real, there is mass diversion from reality. The same holds true for facts. The outcome is a misinformed populace.
Dunkirk
The final issue discussed in Side by Side may be the most salient for American politics in 2017. While the image quality of digital filming can be hashed out by filmmakers and camera developers, the choice to watch a film together in the theater is up to audiences. Michael Chapman’s comment that “cinema was the church of the 20th Century” feels right, and dated. The 21st Century is a world full of worshippers-on-the-go. Only streaming services and online video stores know a subscriber’s true religion.
The loss of a unifying arbiter of culture has untold implications. I suspect it’s responsible for the aggressive reactions I get when I say I don’t watch television. People recount entire shows for me on the spot, as though my reason for not watching is that I think I won’t enjoy it, not that I have limited time. In the midst of this unsettling revolution, people are unconsciously searching for common ground. Someone who doesn’t watch Game of Thrones or Girls is no longer simply missing out on something great. They’re perceived as a threat to the diminishing pool of broadly shared culture that binds us together. On this and so many other levels, fear of the other has defined the digital revolution so far. If Hollywood’s experience is a predictor of our trajectory, then we’ll fight our way out of this polarized state to find common ground again, and we’ll have cultural scars and bruises to show for it.
Critics reviewed Side by Side favorably in 2012 but noted its “inside” and “geek heaven” tendencies. In 2017, it is a film for everyone. We’re savvier by necessity, as digital technology has taken over the most important aspects of our lives: communication, organization, and archiving, or memory. We’re also reengaging vociferously with the political realm after several decades of relative quiet. As noted by Nancy Benac and Ben Nuckols for the Associated Press, “[the] Women’s March on Washington appeared to accomplish the historic feat of drawing more people to protest the inauguration than the ceremony itself attracted.” New forms of digital engagement are clearly having an effect on politics but it’s too soon to draw conclusions about where they will ultimately take us.
The digital revolution is an unfinished story. The internet has usurped much of our physical infrastructure, but a forced takeover doesn’t engender trust. With each incursion into our privacy, and with cyberattacks on the rise, people are increasingly aware of technology’s reach and they don’t like it. When a foreign country can damage our democracy and take away our freedom of choice by influencing our election through digital media, voters may finally see fit to push back. Silicon Valley has been an unapologetic proponent of the digital revolution. Baked into their philosophy is an anti-consumer approach: We tell you what you want. Some call that tastemaking, but the ubiquity of smartphones and computers means that the Facebooks of the world have too great an influence over events as important as our presidential election. In 2017, Silicon Valley has a lot to answer for.
As we grow with this rapidly expanding technology, it’s important to continually redefine our philosophy in a rapidly shifting context. Are we moving forward as a society? Is this technology helping or hurting us? Do the ways that we incorporate it serve our values? …and one question I couldn’t shake while writing this piece: Should we even call Side by Side a “film”?
Side by Side is available to stream on Amazon, Netflix, iTunes, and elsewhere.
George F. Cram (1842–1928) — Cram’s Unrivaled Family Atlas of the World, Chicago IL. Lithograph color print. Diagram of the Principal High Buildings of the Old World
In an article published in The Atlantic this week, Walter Isaacson laid out his vision for “how to fix the internet.” The problem, he says, is Trolls. Bugs. Lack of tracking. He believes anonymity has “poisoned civil discourse, enabled hacking, permitted cyberbullying and made email a risk.” His solution is to do away with anonymity, thereby offering himself as the mouthpiece for every Silicon Valley titan with deep pockets and a hunger for data.
I’ve written on how we civilize technology before, on the challenges we face with each shift forward in technology, whether it’s ships, trains, radio transmitters or nuclear energy. The trajectory involves a series of near-misses while we get the hang of our shiny new toy. When cars were first invented, there were no laws to govern driving. As cars proliferated, accidents increased. Now we legislate everything about car and road safety, down to the driver’s decision to wear a seatbelt. There are fines for not wearing one. If the trouble with internet technology is bad behavior, why not address the behavior?
What Isaacson skims over in his trolling lament is that the worst trolls on the internet are the very people he thinks should solve the trolling problem. Huge media companies like Facebook shamelessly collect their users’ data and sell it. Anonymity is not permitted on Facebook because the company can’t use you, can’t parse your information into demographics and ad bins, if they don’t know who you are. Similarly, the “trust” relationship put in place by search engines like Google is merely a handshake agreement that the company won’t “be evil.” That doesn’t mean Google deletes your search history. As we saw in the 2016 election, “evil” is a word that’s up for interpretation in our society. We, as users of Google, don’t know who is deciding what’s evil at any given time. Isaacson wants users to lose anonymity but notably makes no mention of tech companies and their addiction to opacity. In Isaacson’s future world, users are the biggest losers.
Isaacson offers logical suggestions for what a safe internet might include but how he gets there is the sales pitch of the century. Certainly, it’s important to institute payment for work. We don’t need a new internet for that. I’ve been pitching companies like Medium on this concept for years. “Find a way to pay your writers, even one cent per read, and you will revolutionize the publishing industry.” “A pay model is the only way forward to maintain the integrity of what is published and read.” Medium could institute a pay-model today. What Isaacson misses is that companies and sites most users rely on for information offer their services for free so that they can take whatever consumer data they want in return. The internet hasn’t evolved naturally into a pay model because the people currently making big bucks off of internet technology are also in charge of its design. There are no checks and balances built into the governing of the internet. This does not mean we do away with internet privacy. It means we legislate it.
To revolutionize the internet, the Googles and Facebooks would have to become industry-leading pay-model services. In a pay-model service, user-consumers would lose anonymity to the company offering the service (via their credit card), but maintain privacy in whatever further capacity they wished while using the service. It would be no different from walking into a Starbucks and ordering a latte. Give the barista your own name or someone else’s, pay with cash or credit, hide your face behind sunglasses or don’t…at the end of the day, you’re physically standing in the store and someone can deal with you if you cause a disturbance. As long as you’re a peaceful coffee drinker you still have agency to determine your level of privacy. The same is true of a paying customer online.
Finally, and this is perhaps the most important omission in Isaacson’s piece, there is presently a massive power struggle underway between government and technology elites — specific, powerful individuals within broader industries. Both groups are greedy for data. One group wants to retard technology in order to maintain control over its electorate. The other group wants to advance technology so fast it will maintain control over its creations and, by extension, its users. The electorate and users are one and the same. The bad seeds among us exist whether anonymity is built into the internet or not. They exist in government, they exist in boardrooms and they exist in chatrooms. It is persistent abuses of power that promote toxicity. Unless government and technology elites find a way to work together for the betterment of society as a whole, that toxicity will continue no matter what internet protocols are put in place.
The fundamental growing pain of the Information Age is distrust.
I don’t want medical information from Del Bigtree, producer of Vaxxed and a former producer for the Dr. Phil-created show The Doctors. Sadly, millions of Americans listen to people like Bigtree because faux medical shows run on free television and are endorsed by celebrities like Oprah. For this reason, Vaxxed must be addressed.
I also don’t want medical information from ABC News after listening to the questions posed to Bigtree by their segment reporter during their unedited 10-minute interview prior to the film’s release. She asked general rather than science-based questions and subsequently ran a piece focusing on celebrity-non-medical-professional Robert De Niro. Sadly, millions more Americans get their medical information from ratings-chasing sources such as these.
The confluence of too much information and a massive shift in newspaper revenue streams means many journalists have cut the corner of agnosticism and taken the shortcut to opinion. Opinions sell faster and better than impartial news because they provide an extra service. The public is overwhelmed by the sheer scope of information out there. The layperson’s response to information overload has been to confer trust on opinionated individuals in the media, whether those individuals have any expertise or credentials or not. (Dr. Phil has a master’s degree in experimental psychology. Millions of people are unwittingly participating in his experiments.)
The underlying problem is this: Everything Del Bigtree says in his interview about the way our institutions are supposed to work is correct. His logic about our broken system lends disproportionate weight to his unrelated thoughts about vaccines. Donald Trump is presently enjoying the same path to success. People are habituated to follow the breadcrumbs of rational-sounding speakers, even if their only rational thoughts are to voice obvious grievances. However, it no longer goes without saying — just because people are right about the way the system is broken doesn’t make them right about anything else.
Our refusal as a society to properly fund journalism by embracing “free” information on the internet is directly responsible for proliferating misinformation.
Distrust of our institutions has ultimately fostered an environment where people distrust professionals. The majority of us are not doctors, haven’t attended medical school, and therefore rely on trained doctors for good/best information. When trust in that system breaks down, the next line of defense is journalism. When trust in that system breaks down, whistleblowers come forward. When trust in whistleblowers breaks down, you have millions of people basing important medical decisions on uneducated readings of partial and/or decontextualized information online or on television. In the case of vaccines, this creates unnecessary dangers and has already led to unnecessary deaths.
To be extra clear: shaming people for their refusal to vaccinate is profoundly unhelpful. Shaming people for looking for explanations and answers…also profoundly unhelpful. Shaming people for blatantly not doing their jobs is completely acceptable.
To that end, I’d like to publicly shame the writers at mainstream media outlets who pressured the Tribeca Film Festival to pull Vaxxed from their line-up, not because I think the film has an ounce of validity (…how could I know? I haven’t seen it…), but because we have a problem with people not vaccinating their children. When film critics and science writers suppress a film that illustrates a real problem, namely broken trust in our institutions, they feed the narrative on both sides of the vaccine issue (Andrew Wakefield’s a quack/Andrew Wakefield’s being suppressed) and perpetuate a serious problem. A journalist’s job is to convey the necessary facts in order to resolve the issue. When journalists publicly decline to see a film AND assert it is quackery, they squander what little trust remains in the institution of reportage.
If the answer to our vaccine problem is as simple as debunking a quack doctor, then journalists should sit through a two-hour movie, wade through the information yet again, debunk the father of this misinformation and demonstrate to a skittish public that no stone has been left unturned. Journalists should do this not because Vaxxed has any validity, but because anti-vaxxers think it does, and those people are not vaccinating their children. The number of people who will see Vaxxed is negligible compared to the millions of people who will read a widely shared takedown piece. The stronger the case science journalists and film reviewers make against a film like Vaxxed, the sooner this issue will be resolved.
If journalists can’t make a strong enough case for this problem to be resolved — and I doubt they can because the task is too big; a “strong enough” case today entails renewing people’s trust in the entire healthcare system. We’re that far down the path of suspicion — then the issue should continue to be treated with skepticism while a second case is made for the public to accept and weigh the alternatives: potential return of deadly disease versus potential vaccine-autism links. There is no third option at present. “Waiting” for a different vaccine is equivalent to not vaccinating and carries consequences. You vaccinate or you don’t. Personally, I encourage people to do as much investigation of the diseases they aren’t vaccinating against as they do of the vaccines. That precious airtime spent looking at Robert De Niro’s headshot should be filled with information on what happens when we don’t prevent preventable diseases. (I expect he would agree.)
This issue will continue to worsen until we respectfully acknowledge that people’s trust in their institutions is broken, and behave accordingly. Yelling at people to trust something never works. The vaccine debate, like so many debates cropping up across the country, came about due to systemic distrust. The way forward is for institutions to demonstrate their trustworthiness, not their disdain, and to give the public a free, considered, informed alternative to Dr. Phil and his ilk.
Upstairs at the Last Bookstore (photo credit: E.C. McCarthy)
I’ve always been passionate about supporting fellow writers and artists. I recently received a lovely email from a college student who I babysat for when he was an infant. He asked for advice on everything from publishing to internships. Even though I’m convinced I don’t know a goddamn thing about life (and have extensive proof to back this notion up), I shared a few things I’ve picked up along the way that help me make sense out of being a writer. This email was written specifically for him, and belongs to him, but I thought it might be helpful to other writers who are starting out, or to people who work with writers, are parents to writers, are friends with writers, are in love with writers…
There’s a world of difference between someone who writes and someone who identifies unequivocally as a writer. This letter is for the latter.
Dear D,
Well, this is lovely symmetry. You’re now the age I was when you were born. If I remember correctly, you and I had a few one-way conversations that summer, mostly me reasoning with you to stop crying and fall asleep. If only I’d had the foresight to suggest that someday you’d want my help getting an internship. At least now I can honestly say “He never sleeps.”
This is going to be a long email that won’t make complete sense to you now. I recommend hanging on to it and rereading it in a couple of years. Most of this advice will eventually synthesise. I’m including even the kitchen sink because I wish someone had said these things to me at your age so that the issues were in my peripheral thoughts and not complete surprises when they presented themselves, generally at inopportune moments. Anything here that doesn’t make sense, just push it aside. It may crop up down the road.
Most writers you ask for advice will warn you off becoming a writer. Don’t take it personally but do take it to heart. It’s not rewarding in any traditional sense. It’s lonely and it’s hard work, but it’s incredibly meaningful because of what you’re giving up to pursue it (stability, regular income, a sense of belonging). We live in a culture that romanticises a writer’s life, so you aren’t allowed to complain about your choice even though you’re sacrificing just as much as the people who take jobs they don’t want in order to support families, or themselves. Your life won’t look like a sacrifice because you’re privileging your thoughts over everything else. It’s the “everything else” that you sacrifice.
Don’t underestimate the emotional toll it takes to carry a mountain around on your shoulders. To mitigate the discomfort, do the uncomfortable things like emailing other writers and artists and connecting with people. Build a network of support with people who grasp what you’re trying to do. Also keep in mind that artists can be some of the most fucked up people in the world, so try to be smart about who you trust and don’t beat yourself up when you get burned. It’s going to happen and it will suck. One of my favourite quotes is from Bohumil Hrabal’s I Served The King of England — “He was a gentle and sensitive soul, and therefore had a short temper, which is why he went straight after everything with an ax…” (it’s a great book that I highly recommend, but only after you read Too Loud a Solitude, which is even better.) Some artists (friends, mentors, colleagues) will come after you with an ax, but chalk it up to their having bruised temperaments from a lifetime of being misunderstood (as all writers have been) and get really good at forgiveness. Forgiveness is probably the third most useful skill in this profession, after curiosity and word usage.
Here’s the good news — you can’t screw this up. You can only fail to meet certain expectations that you’ve created in your mind, like getting published or living the life of a writer. Those are broad concepts. Yet, as you’re reading them here, they probably call up specific images because you’ve been thinking about them a lot. The truth is our future life only promises shades of what we imagine. I can’t think of a single time something happened exactly the way I imagined it, and more often things work out completely differently than I thought they would. The imagination is a writer’s bridge to their finished work, but using it to reach practical decisions can blur the line between goals and expectations, which leads to crushing (but avoidable!) disappointment. So, you can’t screw this up because there’s nothing to screw up. The future is a blank. Set yourself a few goals, preferably achievable ones, and then set to work attaining them through curiosity, investigation, skill, and hard work. If you want to write about politics, move to Washington. If you want to write about love, love people…mindfully and willfully, not like it’s expected of you, and not like it’s a given. If you want to write about seeing the world but can’t travel, talk to people who have and write nonfiction. If you want to write about a world that’s better/different/more advanced/more regressed than the one you live in, become knowledgeable in what this one offers. These are all logical choices, but when you begin to write it can feel overwhelming to identify what you want to write about because you haven’t really tried anything yet and you may not know where your true interests lie. See, screwing up is what you’re supposed to do. It’s the currency of writing. The screw-ups become your material. The thing to get good at, then, is failing. (And yeah, it’s painful to write that, because I often wonder if I’m TOO good at failing, and then I write something that speaks to people and it gives me a good enough reason to go out and fail again. ☺)
For good measure, here’s a piece of my advice you can ignore — don’t be afraid of the smiley face. (DO. BE AFRAID. I use it all the time and it’s complete laziness on my part. Inventor of the Smiley Face, I shake my fist at you! I undercut everything I write with the smiley face. ☺)
re: Being terrified
Being terrified is completely normal. You’re taking a big risk. I was just having this conversation with a friend the other day and we disagreed over the concept of bravery (which I mention only to remind you that everything I say here is subjective and easily disagreed with). For me, being scared is inherent in bravery. Understanding what’s at stake and being willing to face that loss repeatedly is imperative to living an honest life. Some people rely on religion for this sort of self-inventory. Personally, I prefer philosophy and science. No matter your chosen paradigm, fear is going to be a part of everything, always. People have ways of convincing themselves it isn’t but writers don’t have that luxury. We live with an excruciating awareness of life’s poverty of reassurance. (It’s why we write.)
re: Publishing now vs later
I’m not a great person to answer this question. Publishing was never on my radar and I have an ambivalent relationship to all things “publishing.” Actually, that’s an understatement. If I’m completely honest with you, I’m disdainful of a lot of publishing practices, but that’s my idiosyncratic view and it doesn’t serve me well. I’m a purist first, careerist last, and it’s highly likely that’s why my novel isn’t currently sitting on a bookshelf somewhere, so…take heed. I didn’t go to school for writing and I send out one story a year. I prefer to publish in unprotected venues (like Medium, Red Lemonade, G+) just to see what sort of readers it brings me into contact with. I meet the most interesting people through the writing I put out there. I’m familiar with what’s going on in the world, publishing-wise, so if you have specific questions down the road feel free to drop me an email. For now, given that you’re pursuing writing in an academic setting, I’d listen to your professors on this one. Further to that, if you’re in school and around these experienced people it makes sense to get the most out of their expertise by following their advice; it doesn’t strike me as the right time to rebel against it. Without meaning to sound pretentiously zen, be where you are. Embrace where you’ve chosen to learn. Anything else is a waste of energy, and writing already requires more energy than most of us have. I’d focus your time in school entirely on developing your writing practice. Having a practice is SO important. Simply put, think of yourself as a word athlete. You have to train your muscles every day. When you don’t, you grow weak. When you do, you become boundless.
On practical matters — an internship.
Do you want one? If you want one, apply for one. Your resume looks great and I’m sure anyone would love to have you. I don’t think you’ll need any help from me (remember, you’re giving them your time for free) but I’ll happily put in a call for you if you don’t hear back from someone. As for whether to apply, it all comes down to what you think you need most. If you get an internship over the summer, then a year from now you’ll be writing about bureaucracy, unpaid work and publishing a literary magazine, so if those things intrigue you then pursue them. To answer your question, I interned at the UN in college. It was an insane time. I ended up with the largest, nicest office I’ve ever had while I harassed the Senate Budget Appropriations Committee for more money, went to State Department briefings and wrote policy papers on the effects of economic aid in developing economies, all while I was an unpaid 20-year-old college student! But that’s a longer story for another time. It’s a great story, actually, and one I should write someday. It makes a great case for getting an internship somewhere completely unrelated to writing, in a field you’d be happy in if writing doesn’t pan out the way you need it to. Just a thought. As an aside, unrelated to anything in your email, I’m always skeptical of people who go to the ends of the earth to find adventure when they aren’t really adventurers. The people who spend three months on an Alaskan fishing boat just to say they did it? I’ve only ever met two people who truly belong on an Alaskan fishing boat, or somewhere comparable, and the rest are sadly misguided individuals. If you don’t want to be an Alaskan fisherman then what the hell are you doing on that boat? (That’s my question; it rarely gets a decent answer.) There’s plenty of adventure to be found in the middle of whatever excites you, so my suggestion is to start exploring things that are interesting to you, the stuff you’ve always wondered about, whatever it may be, and don’t let anyone else paint the picture of what “adventure” looks like. They don’t have to live it, you do.
Money, not publishing, is the thorn in the side of every writer. There’s no money in this work whatsoever. Zero. I’ve been in debt for almost my entire career, as have many of my friends, and it’s exhausting and demoralising. That said, I worked at Apple for a few years and took the time to figure out if I’d be happier with a great paycheck at the top company in the world, building a “music geek team” — ostensibly a dream job. The experience was awesome, but once the learning curve levelled off I thought about writing every day. I was just too tired and distracted to sit down and do it. I made the practical decision to stay at the company for several years to save money so I could eventually pay myself to write, and guess what — the screenplay I just turned in has a character built on my experiences in tech. I traded five years of writing for five years of security and experience, and I don’t know what on earth I’d be writing about now if I hadn’t had that experience. Which loops back to the idea that no matter how hard you try, there’s really no screwing this up.
…Except if you don’t develop a writing practice. That’s the only way to screw up being a writer.
The direct segue from money is to the subject of time. Most people will be profoundly confused by any decision you make that privileges anything over money. My suggestion is to form a relationship with money and understand what your threshold for risk is before you start to feel like your writing process suffers because your time is bound up in the wrong place. (Some of the greatest novels in the world are written on this subject. Bel Ami comes to mind.) Make it a priority to learn how much time you need to write and live above that threshold — is it a job at Starbucks, a tiny apartment and writing every day? A job at Apple and writing a novel every four or five years? The main thing to remember is that a financial goal won’t be met by writing, so be on the lookout for ways you can support yourself and accomplish your writing goals, knowing that the way you support yourself will also likely become your material.
The last thing I’ll say is on your work, which I enjoyed reading. As I mentioned above, it’s your life experience that informs the subject matter of your writing, and both pieces you sent me involve young men who are observing their lives and the people in their immediate vicinity (parents, friends, etc.) I expect this is why your professors aren’t pro-publishing — because right now that experience is not substantially unique, no matter how unique your voice or skill set may be. To me, the greatest books, and my favourite writers, bring me into worlds they’ve seen and experienced, and share their observations of life and human behaviour that come from those places. They make meaning out of the human condition, as all writers are obligated to do, but they make the journey exhilarating with backdrops and characters I wouldn’t otherwise know about. My first novel was about a young woman trying to figure out how to live on her own — not unique. I only gave it to a few people, and someone wanted to make a TV series out of it, but I declined and am very glad I did. I would hate to be known for that piece of work now. You strike me as someone with strong observational skill, and when you turn that outward it will open up so many worlds for you to write about. But, for now, I’d sit tight and develop your writing habits so you have a foundation to stand on when you’re out there having the singular experience of pursuing what interests you.
I really hope some of this is helpful to you. It’s a lot of information. No matter how confidently I state it, it is subjective. Use what works, discard what doesn’t. I think the strangest thing about writing is that you’re always writing about something from the past, but in the name of the future. People don’t understand how unsettling that can be, the mental time-travel of it all. Most people live life looking forward — earning money they’ll receive next week, or planning weddings, holidays, babies. Writers spend as much or more time thinking about the baby that was born, the holiday that was had, and the wedding that happened. To do that effectively without letting life pass you by it’s helpful to see the present moment for what it is: what you’ll be writing about tomorrow. So…make now interesting.
Take care, and keep me posted on how things go. You can email me any time.
The latest outing from James Bond serves up a host of Fleming tropes, from ski slope chases and black tie flirtations to the bad guy who just won’t die. While M and C don’t normally stand for Mouse and Cat, in Spectre perhaps they should. Size matters but smaller is better. I sat down with the Bond franchise’s lesser-known field agent, Millicent Brie-Jones, to chat about her latest role, the cat-and-mouse game, and why the Hollywood wage gap is such a big deal.
EC McCarthy: This is your first franchise film. How did it differ from past roles?
Millicent Brie-Jones: For the first time in my professional career, I’m portrayed in a realistic light. It’s a substantial part. I don’t eat cheese onscreen. Nobody enhanced my ears or overdubbed me in a squeaky voice. I stare Bond down, vulnerable and unarmed. With the sheer force of my gaze I convince him that peace is preferable to violence. I mean, without my character the plot just stops right there in that room. There’s nowhere to go. This is a watershed moment for mice everywhere.
ECM: You shot mostly on location in Tangiers. Do you speak Spanish?
Millicent Brie-Jones in Spectre
MBJ: I didn’t prior to this film, and I was admittedly a bit nervous, but that’s why I do this work! The studio got me a language coach, and Sam [Mendes] did ask me to improvise a bit on our second day, just for coverage. I also stood off-camera for Daniel [Craig]’s stuff, and we bantered to ratchet up the tension. It’s a subtle scene. I was happy with my accent, but in the end it’s the Jaws effect — the less you see of me, the more powerful I am.
ECM: Did you train at all? What was your workout regimen?
MBJ: I like to work out, and I love being outdoors. It’s never been an issue for me, so I just did what I always do.
ECM: What’s an average workout for you?
MBJ: Mostly I run up and down the tree on my property, perhaps increasing the intensity a bit, and lots of pull-ups. [flexing her biceps] Very proud of these. Michelle Obama is my idol.
ECM: In Spectre, it’s implied that a cat…
MBJ: I don’t want to comment on cats.
ECM: There’s been speculation that the cat…
MBJ: The press is always trying to stir up controversy. There’s nothing to talk about.
ECM: Did you meet Schmidt Redgrave [who played the role of Blofeld’s Persian cat]?
Millicent Brie-Jones and Daniel Craig in Spectre
MBJ: It was like so many films I’ve done, where I’m familiar with Schmidt’s work, and such an admirer of his family, but there was no crossover on the schedule. I think he shot exclusively in the desert? Look, I know you’re fishing for a sound bite, but I have to disappoint you. The wage gap isn’t a personal issue, it’s about what’s fair, and implying animosity between professionals does all of us a disservice. What I will say, and I said this to Sam and Barbara [Broccoli], is that I was disappointed there was no onscreen cat and mouse confrontation of any kind. I think audiences are ready to see me and Schmidt go head-to-head. They can handle it. I can do so much more — I’m a black belt, for crying out loud. And I’ve read that Schmidt is a crackerjack archer. This was a missed opportunity, as far as I’m concerned, but I understand the focus has to be on Bond. He sells the tickets.
ECM: Finally, as the lone mouse on set, did you feel welcomed and comfortable?
MBJ: More than [on] any other film I’ve done. The crew was amazing. Nobody freaked out when I hung out on the craft service table. The actors ate with me. It was collegial and I learned a lot. He’ll kill me for saying this, but Daniel drops an unusual amount of food because he talks when he chews. I’ll eat under him any day!
ECM: [laughter] He makes a mess?
MBJ: [chuckling] Raining crumbs.
ECM: Thank you, Millicent, for taking the time to speak with me.
I learned about the righteous deal when I negotiated to buy my first car. It was a five-hour ordeal at the dealership. I met with a rotation of Car Cops: good cop, bad cop, friendly-but-stern cop, calculator-wielding cop, they’re-going-to-fire-me cop, and finally contrite cop. Contrite cop acknowledged that my negotiating skills (which included a late-in-the-game “memory lapse” over how many months I had left on the leased Jetta I was selling them) won me a new car at close to base price. I wasn’t in it to gouge them. I wanted what was fair and told the floor salesman up front what I was willing to pay. He assured me that that was all I’d have to pay. Then six men spent five hours trying to break me. “In the car business, we say you got ‘the righteous deal,’” contrite cop told me. It was the proudest moment of my negotiating career.
I applaud Jennifer Lawrence for speaking up today on her experience with gender discrimination. I also completely understand why she didn’t speak up sooner, and I support the decision made by many women to remain silent about the double standards they contend with daily. The nature of the wage gap is such that merely beginning a conversation about changing our expectations and standards feels subversive. Lawrence’s situation is unique, as she repeatedly mentions, because her decision to speak up comes from a position of considerable power. Nonetheless, it’s a generous contribution to solving an industry-wide problem, one she could easily have addressed privately with her agents. Her essay demystifies the wage discussion and sets an invaluable precedent for women working in Hollywood.
I’m older than Lawrence and have been working in and around Hollywood for long enough to remember when there were no role models for what I do. Only Nora Ephron and Nancy Meyers were writing and directing studio comedies when I came up. That fact remained true for years. When I say there were “no role models” I mean this: Nora and Nancy were anomalies. There was no sense patterning your career after them. The job of being Nora Ephron was already taken. The absence of women at the top of the creative side of the business made it abundantly clear that those two women got to make movies because of who they were, not because women had any rightful place in the director’s chair. Today, thankfully, there are more women in those jobs than ever before, and young women have comparatively more opportunities to see women at work behind the camera. This is progress.
Lawrence is right when she suggests that her male counterparts are respected for getting better deals for themselves. It’s a man’s game and men begrudgingly respect other men who beat them in competition. In general, men don’t lose to women with the same equanimity. In fact, in most instances losing to women brings out the basest qualities of bad-loserdom in men, including name-calling. Lawrence mentions the “spoiled brat” tag as one she wanted to avoid. Personally, I think “crazy” is more potent, and “difficult” and “nightmare” are reputations that stick. Brattiness can be outgrown but crazy/difficult/nightmare are terminal traits of people you can’t trust and don’t want to be near. That women are routinely called crazy in Hollywood is, in my mind, a highly effective method for marginalizing them.
As for straight talk, Lawrence’s experience of giving direct feedback and essentially being told to “calm down” is pervasive. The unspoken expectation for women is that they should mother their work relationships and creative projects, and selflessly donate their time. If something comes back to them in any form — money, an agent, an opportunity — then they’re expected to feel grateful, not deserving and highly skilled. The upshot is that women are rightly confused as to the real value of their work. They contribute the same work as a guy, but for less money and with the added burden of social cues which actively dissuade them from confidently communicating their opinions. It’s a mindfuck, and sadly a lot of men aren’t aware of how they perpetuate it.
As long as women struggle to negotiate for themselves, studios will profit from paying them less than their male peers. It’s worth mentioning the Hollywood Diversity Report that came out this year which assessed the 2012–13 production year and found the executive ranks of TV networks and studios to be 71% male, and at film studios the number was an astounding 100% male. This means that if male executives independently decided to end the wage gap, it would be gone. Women may be slow to value themselves, but men are failing women too, possibly intentionally. (How’s that for blunt?) As Lawrence sees it, the onus is on her to negotiate for herself. She’s fortunate to have a powerful team of negotiators to help her make that happen. Most women begin their careers negotiating for themselves, without anyone to advise them, and my personal experience is that this is a losing battle. Every single time I’ve asked for equality, I’ve lost. There is no shortage of opportunities for me to write either on spec or for under-the-table money, and men later put their names on my work. This gives me zero negotiating power in the long run. The deck is stacked so many ways in favor of the wage gap.
Lawrence mentions her desire to be liked, and that it’s a part of her personality she’s trying to change. I also have a desire to be liked. I don’t think that’s a bad thing. I get a natural high off of creative collaboration, and at the heart of those relationships is a genuine like and respect for the person or people I’m working with. These days I find I’m less and less able to respect the guys who can’t or won’t see the discrimination I’m dealing with. I mostly attribute their complacency to busyness, but I harbor a fear that they turn a blind eye to my situation so they won’t have to do anything about it. In short, their inaction makes me not like them, which is a state of affairs I’m wholly uncomfortable with. I don’t like not liking people.
I take heart in the cultural changes that are underway right now. As I read Lawrence’s essay, every frank conversation I’ve ever had with a guy in charge that cost me an opportunity made me feel retroactively empowered rather than foolish. It’s a huge relief when someone speaks the truth about the inequality that women face. I’m grateful every time someone acknowledges the problem, even more so when someone begins a national discussion. These are not easy things to do.
On a positive note, there is an upside to gender discrimination. When you give a group of talented people very little to work with, over time they become adaptable, resourceful and more creative. Women writers, directors, actors and comedians are kicking ass right now because they have an astounding work ethic, and they’re exceptionally nimble. There’s a richness and depth to their creativity that comes from time spent watching and waiting in the wings. The next step is encouraging women to sit down with the guys and negotiate a righteous deal.
Living in Asia in the late 90s, I spent time in countries that were then considered “developing” economies. Textbooks were filled with prognostications about the potential growth and downfall of these places, but no bar chart captured the terrifying hilarity of driving an hour outside of Seoul at high speed in a brand new sedan on unpaved roads, with only potholes and feral animals to navigate by. Technology was tangibly out of sync with infrastructure. When something blocked the road, drivers veered onto the front steps of houses to get around it. Parking was wherever you felt like it, and parked cars were often rendered inaccessible due to other people’s feelings about parking. Disagreements were resolved the old-fashioned way, with pointing, yelling, and threat of fists. Over time, enough pedestrians were casualties and enough expensive tires were blown in potholes that laws became necessary, as did the paving of roads. The automobile is no less amazing because society set a speed limit. We mitigate and retard technology where it threatens and outpaces us. This is how we civilize our innovations.
The most poignant irony of the Information Age is the internet’s role in restructuring our relationship to politics. In Citizenfour, Edward Snowden avowed his intent to end the tyranny of the snooping government, but technocratic paternalism is equally invasive and it’s built into the digital realm. Complicated legal documents pop up at the outset of a business relationship and people with no legal background are conditioned to move ahead with a trust-us, one-click “Agree.” Our relationship to intelligent technology is best portrayed by the routine updates we tacitly agree to without reading or understanding what they entail. I Agree to whatever you’re about to load onto my phone or into my computer, agree to what you think is best for this device and my use of it, agree without stipulation, agree without working knowledge, agree because not agreeing seems time-wasting and foolish and questioning is beyond my technical ability. I always agree with you because everyone else is agreeing with you so it must be okay. I always agree with you because I don’t know why I should disagree.
This habitual agreement has proved deadly to the exchange of real information. The technocracy devised the fastest, most appealing method for securing a user, and internet users subsequently became desensitized to the act of giving away their rights. The repetitive process has numbed healthy suspicion of any organization that demands legal agreement to a loss of personal agency. Those internet service agreements are not there to protect individuals; they are documents created by expensive legal teams to ensure a company has no responsibility to the consumer. If these statements aren’t disturbing enough, stretch them to apply to the government in the shocking months and years after 9/11. The PATRIOT Act was the federal government’s service agreement, and the majority of the American people agreed to it without understanding what they were signing away.
Fourteen years on, perhaps the greatest misstep in rectifying our mistake is to begin with privacy. Loss of privacy is an end result. Privacy can be protected, it can be violated, but it cannot be given. That notion is a falsehood born of Victorian manners — I’ll give you some privacy — which preempt uncomfortable directives: Leave the room. Get off the line. Turn your head. Don’t read my emails. I need my privacy. The sci-fi notion of “mindreading” is terrifying precisely because it violates the only space on earth that belongs entirely to us. When we communicate with people, through talking, writing, or touch, we consciously extend that private space to include others. A violation of private space is a form of mindreading. In building society around the digital world, we’ve ceded a massive amount of the private space we need to move in safely. The only recourse when you learn your boyfriend has read your journal is to hide it in a new place, but the only recourse when you discover people can hack your emails is to stop writing anything sensitive or private at all. By necessity, we’ve retreated inward. Our truly private worlds are almost entirely interior now. That loss of intimacy has already alienated us from one another. Unable to safely extend a hand or share a thought, our knowledge of people stops with avatars and public text. We can’t know people’s deeper feelings and they can’t know ours. There’s nowhere safe to talk. We are alienated.
In Citizenfour, Glenn Greenwald asked Edward Snowden why he would risk imprisonment — the obliteration of privacy. In doing so, Greenwald identified the one circumstance where personal agency is taken away. That the cyber debate revolves around the give and take of privacy tells us that we’re already in a prison of sorts. To get out, we need to reestablish laws and agreement. Not the tacit agreement of accepting free stuff in exchange for unknown costs but overt agreement and expectation of legal recourse if our rights are violated. As political theorist Stephen Krasner observed in the early 1980s: “The Constitution is a document more concerned with limiting than enhancing the power of the state.” Modern lawmakers violated this precept into extinction with the USA PATRIOT Act. There’s no current expectation that the present government will give up the Patriot Act of their own volition, and no reason to believe the public has the will to make them. This is where most people drop out of the resistance movement and succumb to prison life.
The other misstep in solving the puzzle is a myopic focus on the future. Pew Research Center’s Net Threats survey asked over 1400 technology experts to predict “the most serious threats to the most effective accessing and sharing of content on the Internet.” With so much focus on forecasting, we’re overlooking a wealth of facts in the present. Ask a South Korean mother living 20 miles from the DMZ in 1997 what the most serious threat to her children’s lives was and most Americans would have predicted a doomsday fear of war with the north. However, it’s just as likely she would have said: “See that black sedan driving 50mph over my front doormat…?” Attention-grabbing headlines often obliterate imminent dangers. Public discussion leapfrogs over what we could solve today because no one wants to dig in and do the unglamorous work of painting a dotted line down the center of the road. (Put another way: Why isn’t Pew asking these 1400 experts to identify today’s most solvable problem and offer a specific solution? That’s 1400 solutions right there.)
If technology is responsible for creating a state of alienation then the government is guilty of capitalizing on that alienation. When politicians appeal to people’s confusion over new technology, they perpetuate a dangerous myth that people can separate themselves from the digital age. Lindsey Graham’s opinion on cyber surveillance is useless if he doesn’t understand how Americans use email or why they might be upset that those emails are intercepted and read by government officials. Perhaps he’d like to turn his diary over to the CIA and see how that feels. His vote on privacy legislation would certainly be made with the necessary wisdom.
America is a world leader in computer technology and innovation. Every member of Congress, and certainly the next president, should be knowledgeable about computer technology. America’s elite governing body must be prepared to debate cyber. My 90-year-old grandmother has been sending emails for years and she has a Facebook account. If United States senators can’t keep up with her computing skills then they don’t belong anywhere near the Capitol. The most important action Americans can take is to vote for cybersmart House and Senate representatives in upcoming elections.
As backwards as Washington seems, cybersmart politicians do exist. It’s clear from Hillary Clinton’s decision to house computer servers in her home during her tenure at State that she’s knowledgeable about cyber. Despite her public statement, Clinton’s use of personal servers has nothing to do with convenience and everything to do with security. Clinton owns her data. She also possesses depth of knowledge about what goes on in the intelligence community. I expect that is what drove her to take control of her privacy. If she wants to do the country a great service, in or out of the White House, she should make cyber legislation her top priority and level the playing field for citizens everywhere. It would unite the country to speak plainly about the state of our internet. Honest talk about cyber surveillance from a public figure who can speak to both sides of the debate would be a huge step forward for the country.
What will hopefully become apparent, to decision makers and citizens alike, is that both sides of the ideological struggle derive their power from the online participation of citizens. The present situation has left people with nowhere to turn for trustworthy leadership. The conditions that permitted fascism’s spread after World War I — post-war malaise, financial struggles, political distrust — tamp down people’s natural resistance to incremental loss of agency. The circumstances that facilitated the rapid creation of totalitarian governments in previously liberal, rational societies are cropping up exactly one century later. The situation is again ripe for machtergreifung, or power-grab.
Democratic European societies once made a desperate attempt to escape their status quo by funding unstable third parties with disastrous consequences. We are now seeing many radical ideas thrown into the mix, some backed by logical process, others attempting to shake people out of rhetoric fatigue. Reboot the Government! Reboot the Bible! Reboot the Brain! Drop one letter from those slogans and we’re deep in A.I. territory. Bill Gates, Elon Musk, Stephen Hawking and their ilk proclaim their fear of the dark side of artificial intelligence with increasing regularity. We should be afraid too. There’s no precedent for the power vacuum created by a flaccid Congress and a disproportionately wealthy technology sector. This situation could pave the way for the first artificially intelligent leader. The engineering is getting there, and the rest would be…history.
What if there were a way to influence the past and change the future? With every choice we make — voting for president, purchasing a stock, getting married — we hold an entrenched view that possibilities evolve with time. We discuss the future in predictive terms (likelihood of, on target for, could go either way if…) and plan accordingly. To the extent that future outcomes don’t fall in line with our expectations we infer that we lacked information, were poor readers of probability, or experienced a devilish bit of bad luck.
There’s also a sense of momentum as we approach a crossroads where probability becomes inevitability. Expectations take over. This is evident in the person who doesn’t vote because their preferred candidate is almost certainly going to win, or the person who marries despite back-of-the-church jitters because halting a wedding feels impossible. We rationalize away those chances even though they exist right up to “I do.” Would we feel differently about those discarded chances if they were sent to us from the future?
John Cusbert, Research Fellow at the Future of Humanity Institute at Oxford University, challenges our foregone conclusion about chanciness. In his paper “Backwards causation and the chancy past”, Cusbert asserts that chanciness isn’t tethered to time in a linear fashion, and that future outcomes can possibly affect chanciness in the past. This is not to say that all chanciness originates in the future, but theoretically some of it could.
I discovered Cusbert’s paper just as I finished rewatching Christopher Nolan’s excellent space epic Interstellar and the two works independently made sense out of each other. Cusbert provides a framework for what happens in time’s physical dimension in the film, while Interstellar plays out a dramatized version of Cusbert’s backwards causation scenario. The implications for everyday life are extraordinary, and also very fun to consider.
First, a bit of housekeeping. Backwards causation of chance is only possible if we unlink time and chance. Cusbert does an excellent job of explaining the whys and hows, but his conclusion is the jumping off point for this piece. To wit: It is false to assume that chances are defined at times.
Thus, imagine Time and Chance as two objects held up in the air by you (the universe). When you hold them together they exhibit certain properties (perhaps they’re magnetically attracted), and when you move them apart they exhibit other properties (perhaps one grows smaller without the other’s reflected heat). Whatever their properties, Time and Chance are separate entities, bound by the laws of the universe, which interact with each other in noticeable ways that affect our lives.
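For readers who want the unlinking in shorthand, here is a rough notational sketch of how I read it; the symbols are mine, not Cusbert’s, and they only gesture at his formal argument.

```latex
% A rough sketch (my notation, not Cusbert's).
% Conventional view: every chance is indexed to a time, so chance inherits
% time's one-way ordering:
\[
  \mathrm{Ch}_{t}(A) \;=\; \text{the chance, at time } t,\ \text{of outcome } A .
\]
% Unlinked view: a chance is indexed to a point $p$ in a causal ordering
% that need not coincide with temporal order:
\[
  \mathrm{Ch}_{p}(A), \qquad p \ \text{not necessarily a time} .
\]
% Backwards causation of chance is then the claim that some $p$ lying
% temporally after $A$ can still help fix $\mathrm{Ch}_{p}(A)$.
```

Nothing in the sketch asserts that backwards causation happens; it only shows what the claim would mean once chance is no longer defined at times.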
Now the fun part…hunting for backwardly caused chance in the lives of Interstellar’s Astronaut Cooper and his daughter Murph.
Assumption #1 — Cooper will pilot the Endurance
Cooper will pilot the Endurance because he pilots the Endurance. It is a property of time that the past cannot be changed.
Chance #1 — Cooper may or may not make himself stay on earth
When Cooper travels into a black hole near the end of the film, he encounters a physical dimension of time. The tesseract is a construct of Murph’s bedroom during the week before Cooper left earth on the Endurance. This stretch of time is in the past but within the tesseract it is also a fragmented, nonconsecutive part of the present.
Present Cooper desperately communicates with Past Murph using gravity to knock books to the ground. The past cannot be changed, but Cooper hasn’t realized this yet and is backwardly causing chances to make himself stay. From the tesseract in the present, there is zero probability of those chances working, but they’re chances in the past until Past Cooper leaves earth. They’re also chances in the present until Present Cooper gives himself the coordinates to NASA. Chanciness is chancy. It doesn’t dictate an outcome; it only offers the possibility of one. For a brief window of time, Cooper’s dropped books and coded messages are backwardly caused chances that his past self ignores and Past Murph puzzles over.
Assumption #2 — Cooper will send himself on the mission
Once Cooper realizes that he sent himself on the NASA mission, and that he needs to go on the mission in order to arrive at the present moment, he locates the night of the dust storm in the tesseract and gives his past self the coordinates to NASA in binary through the falling dust. This is a fascinating moment that seems to be filled with chance — Cooper could decide not to send himself the coordinates, leaving his past self unaware of the nearby NASA outpost from which his departure from earth becomes inevitable. However, in the present, Cooper begins to grasp that he has a chance to help Murph and civilization on earth by bringing himself to the tesseract, so he doesn’t hesitate to send his past self the coordinates. Therefore, there is no chancy element to this event whatsoever: Past Cooper already received the message from Present Cooper, found NASA and left earth.
Chance #2 — Cooper may or may not increase the chances of saving the people on earth
Once Cooper realizes he can’t change the past but he might be able to change the future, he interprets his purpose in the tesseract as being “the bridge” to Present Murph. He encodes quantum data in a wristwatch in Past Murph’s bedroom for Present Murph to find decades later. That he chooses the wristwatch and that he encodes the data are two ways he’s backwardly creating chanciness. She might not find the watch and she might not be able to use the data. Neither outcome has occurred yet for Cooper or Murph.
Chance #3 — Murph may or may not find Cooper’s quantum data
A ticking hand on an old watch in an abandoned bedroom in a house where she is not welcome…these are seemingly insurmountable odds against Present Murph finding the data. But the tesseract offers an emotionally significant time for both father and daughter, which enables Present Cooper to weight the chanciness heavily in favor of Murph’s eventual discovery of the watch.
Artificially intelligent robot TARS is with Cooper in the tesseract, trying to parse his logic:
TARS: “Cooper, what if she never came back for it?”
COOPER: “She will. She will.”
TARS: “How do you know?”
COOPER: “Because I gave it to her.”
TARS is unable to match Cooper’s innate confidence that emotional attachment is a powerful enough influencer of probability to overcome long odds. Cooper’s love for his daughter made him give her a watch as a way to keep himself close. Murph’s love for her dad will draw her back to the watch he gave her, years later. Murph’s inquisitive nature, nurtured by her dad, will likely cause her to recognize his message encoded in the second hand. It’s not a given that Murph will find the data. It is chancy. The tesseract might belong to descendants of the civilization that Dr. Brand is starting on a new planet, and maybe their only requirement in bringing Cooper into the tesseract is that he send his past self to NASA and successfully pilot Dr. Brand through space. Cooper’s extra help for Murph is chancy and unproven. Even so, Cooper is powerfully assured that his plan worked, because the tesseract closes once he finishes encoding the quantum data. At that same moment across spacetime, we see Present Murph recognize her father’s message in the wristwatch in her childhood bedroom. The future is changed for father and daughter through backwards causation of chance.
*
Could chance be a type of emotive gravity? Emotions certainly influence our decision-making. Could chance be the force that pulls present-time Cooper in line with past time inside the tesseract, acting on him to respond in lockstep with a past he’s already lived? Cooper exhibits a spectrum of emotions during his time in the tesseract. He is distraught when he first arrives and doesn’t understand the system. He’s calmest when he realizes he has an opportunity to transmit useful information across spacetime.
The moment Cooper is no longer controlled by past events, he regains control of his emotions.
Similarly, young Murph is most distressed by Cooper’s highly emotional, ghostly communication through falling books, likely because she is powerless to use the information to convince her father to stay on earth. She is calmest when she recognizes his calmly-sent data decades later, even though her circumstances are considerably more fraught and dangerous. Both father and daughter are calmest when they aren’t trapped by inevitability and have a future-oriented purpose. They’re calmest when they have chances to make informed choices.
One of many interesting definitions Cusbert puts forth in his paper is that “[it’s] essential to chance that a system’s chance properties be among its physical properties: this is what distinguishes chances from other kinds of objective probabilities (such as logical and evidential probabilities).” In the context of Interstellar, gravity is the only force Cooper can use to physically communicate across space-time and cause chanciness. However, the past chances Cooper physically sets up are too weak to make a difference. Without Murph caring that her dad is gone, without Cooper caring whether he saves Murph’s life, without a powerful love and emotional bond between them, the wristwatch would be just another object in a house of objects that is tossed away after decades of no use. Time and gravity need emotion to effectively communicate possibility.
Yet, emotion isn’t powerful enough to change the past. If it were, there’d be nothing constant in our lives. We would have no history. Who doesn’t have an important decision they’d do over? It’s difficult to watch Cooper fight his past, seemingly able to make different choices if only he’d calm down. But of course, he can’t calm down. He’s in a state of agony at being separated from his daughter. Within the tesseract, Cooper’s actions aren’t chancy because his love for Murph is constant. The emotional pull is unwavering and it exists uniformly across space-time. It makes Cooper behave predictably in line with the past. Perhaps emotive gravity is what pulls time powerfully in one direction. Of those two objects you hold in the air, Time and Chance, it would be incredible if Chance were the more powerful of the two.
Cusbert’s theoretical reasoning uses coin tosses, time shifts and algebra to illustrate what Christopher and Jonathan Nolan portray through space travel, tesseracts and a father-daughter bond. The fictional story applies workable science to the real world, then adds the notion that love is the determining factor in backwards causation of chanciness. This is especially pertinent to examinations of modern crises. Insofar as love is absent, or not evident, there is no benevolent force steering our lives, and a sense of hopelessness and doom pervades our outlook for the future.
It was chance that I found Cusbert’s paper. I wasn’t looking for it. It is one of millions of papers on the internet. It was also chance that I read his paper at a time I was considering time, as opposed to last summer before Interstellar was released. By chance, the publication date of Cusbert’s paper, printed on the front page, is a highly significant date for me, which mildly disposed me toward reading it rather than passing it over. (I am someone who attributes compelling qualities to coincidence; when I meet someone with my same name I am affected.) None of these chancy elements are gravity-related, but rather are familiar examples of chance that moves linearly with time. Cusbert doesn’t suggest that all future outcomes determine all past chanciness, just as Interstellar doesn’t suggest that future beings control the present through spacetime. However, both works offer compelling reasons to reconsider our long-held view that future outcomes are caused by past and present possibilities alone. By entertaining the notion that chance could come to us from the future, we have yet another reason to listen to our hearts and learn to better read our emotions.
There’s a gut-wrenching scene at the climax of the World War II biopic The Imitation Game. Alan Turing and the codebreakers at Bletchley Park decrypt a German cable and suddenly they know the enemy’s plan to attack Allied ships and, incredibly, all attacks for the foreseeable future. Their celebration is short-lived. Turing grasps the ephemeral nature of their discovery and has a sickening epiphany: To win the war they can’t tip off the Germans that they’ve decoded Enigma. Instead they must simulate ignorance by choosing strategic victories and sacrificing the rest of their men. Panic sets in. One of the codebreakers has a brother serving aboard a targeted convoy. He begs his colleagues to use what they know to spare his brother’s life but Turing is resolved. Their secret must be concealed at the highest cost. The ensuing choices haunted the intelligence community long after the war was won.
Over the last 14 years, Americans have been conscripted into an information war. Individual privacy is now incidental to the objectives of government and technocratic elites, and vulnerable to the exploits of criminals and extremists. The battle for control over the digital space is a gloves off, civil-liberties-be-damned free-for-all. To reestablish trust in our oldest institutions it’s necessary to parse the steps that led to the present situation and decrypt the objectives of contemporary leaders and policymakers.
RED FLAGS
Nearly 100 years after Nazism flourished in Germany, the question is still asked with incredulity: Why did German citizens permit and participate in genocide? There will never be a satisfactory answer to the moral question of why, but there is a clear beginning in the circumstances of how. The rise of fascism in post-World War I Europe began with a confluence of domestic troubles in Italy: a financial crisis, concomitant economic hardship, grief over millions of Italian war casualties, widespread dissatisfaction with political parties that failed to deliver on promises, and a perceived threat to financial security from a foreign (Communist) ideology.
Onto this stage stepped Benito Mussolini, a staunch nationalist and war veteran whose preoccupation with violence inspired the formation of an army of uniformed “Blackshirts” — unemployed youth, funded by the middle and upper classes, who assassinated opposition leaders, suppressed and destroyed opposition newspapers, and eventually marched on the capital to take power in 1922. “A Brief History of the Western World” summarizes Italian fascism thus:
“In the beginning, as Mussolini himself admitted, [fascism] was largely a negative movement: against liberalism, democracy, rationalism, socialism, and pacifism…[Italians] had been cast adrift, let down by failed hopes of progress and happiness. Faceless in a mass society, they also felt alienated from themselves. The Fascists found an answer to this emptiness by arousing extreme nationalism….The fascist myth rejected the liberal reliance on reason and replaced it with a mystical faith. Stridently anti-intellectual, it held that the “new order” would spring from the conviction of the “heart.” Fascists therefore looked upon intellectuals as…suspicious characters…. Most ordinary Italians accepted Fascism with enthusiasm. The individual who formerly felt alone and unneeded, enjoyed a new sense of “belonging.”
The rise of fascism in Italy took less than six years from invention to political dominance. Fostered by comparable conditions in neighboring countries, the ideology spread across Europe and fatefully intersected with the political ascent of Adolf Hitler in Germany. The Germans have a word for Hitler’s rise to Fuehrer: Machtergreifung — Macht, meaning power, and ergreifen, to grab or seize. Like Mussolini, Hitler headed up a violent army of unemployed youth and committed illegal acts to dissuade and undermine his opponents, but it was the power vacuum created by ineffective German leadership that paved the way for the Third Reich and Nazism.
*
Flag of the Soviet Union
A second world war and one Pax Americana later, the world was pumped with Cold War adrenalin. In 1962, nuclear superpowers bumbled their way into a stand-off and lucked their way out of the unthinkable during thirteen days of diplomatic posturing over Cuba. The rapid advancement of nuclear technology meant there was no room for error, yet error upon error was made. In effect, American leadership failed the test but passed the class. America and the Soviet Union skated by on their shared basic values, but the crisis taught no lessons on how to face an adversary with profoundly different goals, specifically those rooted in tribal conflict and revenge.
In the aftermath of America’s nuclear showdown, political theorist Graham Allison published his seminal work “Conceptual Models and the Cuban Missile Crisis.” It would form the foundation of American foreign policy. Allison defined three distinct methods for understanding policy outcomes: the rational policy model (foreign governments behave rationally in relation to their goals), the organizational-process model (the military typically wants X, the bureaucracy typically wants Y, and historically they have relationship n to each other, so the outcome will predictably be z), and the bureaucratic politics model, where shapeshifting factors such as interpersonal conflicts, bureaucratic inertia, and availability of resources act on each other to influence foreign policy outcomes. Government elites strongly favored the bureaucratic model as conventional wisdom that would shape American foreign policy for decades to come.
Political theorist Stephen Krasner reassessed Allison’s models, first in 1972, and later at the height of the “first” Cold War. He was troubled that President Kennedy, subcabinet members, and scholars from top public policy programs in the 1960s wholly adopted the bureaucratic approach, where outcomes were viewed as an evolving compromise of inputs. Krasner identified the fundamental flaw in the model as giving elite decision-makers a blanket excuse for their failures. Specifically, he reframed bureaucratic-politics thinking as a biased framework for blaming policy errors on the “self-serving interests of the permanent government,” where elected officials were viewed as powerless to corral the government “machine.” He summarized the infinite loop of accountability thus:
Bureaucracy is a machine, and “[machines] cannot be held responsible for what they do, nor can the men caught in their workings.”
This is a stunning double entendre for the Information Age.
DIGITAL DICTATORSHIP AND WARRING ELITES
Rights and privacy are dictated by an elite group of decision makers who control the laws (Government) and the digital infrastructure (Technocracy). Internet usage and hardware purchases now constitute a “vote.” The government and technology sectors each employ roughly 1% (3–4 million people) of the American population. The percentage of top-level decision-makers, technicians and analysts within those fields is assumed to be less than .01% of the American public and is therefore elite. The technocratic elite lumps Anonymous hackers in with tech CEOs, and the government elite includes members of all branches of government and political influencers with monetary or legislative sway. Since both elites invest billions of dollars successfully marketing themselves to society, the benefits they provide are widely known and will not be discussed here. Instead, the focus is the encrypted cost of advancement. Decoding the costs reveals which services and policies are truly beneficial, and to whom.
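To give those percentages a sense of scale, here is a minimal back-of-the-envelope sketch. Only the 1% and .01% estimates come from the paragraph above; the population figure is my own assumption for illustration.

```python
# Rough scale check on the "elite" estimates above.
# Assumption (mine): a U.S. population of roughly 320 million in 2015.

us_population = 320_000_000   # approximate 2015 U.S. population
sector_share = 0.01           # ~1% of the population employed in each sector
elite_share = 0.0001          # the "< .01%" decision-maker estimate

workers_per_sector = us_population * sector_share
top_decision_makers = us_population * elite_share

print(f"Workers per sector: ~{workers_per_sector:,.0f}")            # ~3,200,000
print(f"Top-level decision-makers: fewer than ~{top_decision_makers:,.0f}")  # ~32,000
```

On those assumptions, a group smaller than a mid-sized city sets the terms for everyone else’s digital life.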
*
The Technocracy
The history of the government’s relationship with computer technology is long and complicated. Perhaps only one fact is universally accepted: Al Gore did not invent the internet. Contrary to popular folklore, he never even claimed to. Gore’s words were twisted; the transcripts are widely available, and he was subsequently defended by two of the “fathers of the internet” as deserving “significant credit for his early recognition of the importance of what has become the Internet.” The urban legend illustrates the strange paradox of the Age of Information: even with unprecedented access to the truth, millions of people are often misinformed.
Internet development began in the 1960s, its core protocols took shape in the mid-1970s, the network was commercialized through the 1980s, and it came into its own in the early 1990s with the introduction of the World Wide Web, now the universally accepted infrastructure for data exchange on the internet. The web is credited to Tim Berners-Lee’s 1989 proposal at CERN; it was developed over the next few years and made free to the public in 1993. Anecdotally, this snippet from the then-definitive International Law Anthology, enumerating the issues confronting global governing bodies at the time, reveals the digitally unsophisticated world that received the new technology:
Global Communications: The earliest topics in this burgeoning field were international postal services and the laying of submarine cables. The invention of radio, television, and facsimile and modem communications technology, have led to explosive growth in this area of international regulation. Jamming and counter-jamming of another nation’s radio wave frequencies, channel regulation, remote sensing, and stationary satellite transmission are matters of intense interest. There is a move toward international broadcast standards and transmission quality. But there are also countervailing pressures against freedom of information, with some nations (and religious groups) desiring the suppression of international telecommunications relating to the advocacy of war or revolution, criticism of governmental officials or policies, regulation of commercial messages, and materials depicting real or fictional violence or pornography. — Anthony D’Amato, “Domains of International Law,” International Law Anthology
It reads like a mid-century newspaper clipping but that passage was published in 1994. Bill Clinton was president.
Twenty years later, Laura Poitras’s Oscar-winning documentary CITIZENFOUR is more than an exceptional historical record. The film is also a primer on technocratic culture and ideology. In June 2013, after months of anonymous communications, National Security Agency contractor Edward Snowden sat down face-to-face with Poitras and The Guardian journalist Glenn Greenwald in a Hong Kong hotel room. Snowden spoke eloquently and fluently about the values at the root of his dangerous undertaking to leak classified documents detailing secret surveillance programs run by the United States government.
From CITIZENFOUR:
Glenn Greenwald: So, why did you decide to do what you’ve done?
Edward Snowden: For me, it all comes down to state power against the people’s ability to meaningfully oppose that power. I’m sitting there every day getting paid to design methods to amplify that state power. And I’m realizing that if the policy switches that are the only thing that restrain these states were changed you couldn’t meaningfully oppose these. You would have to be the most incredibly sophisticated technical actor in existence. I’m not sure there’s anybody, no matter how gifted you are, who could oppose all of the offices and all of the bright people, even all of the mediocre people out there with all of the tools and all of their capabilities. And as I saw the promise of the Obama Administration be betrayed and walked away from and, in fact, actually advance the things that had been promised to be curtailed and reined in and dialed back, actually got worse. Particularly drone strikes…That really hardened me to action.
GG: If your self interest is to live in a world in which there is maximum privacy, doing something that could put you in prison in which your privacy is completely destroyed as sort of the antithesis of that, how did you reach the point where that was a worthwhile calculation for you?
ES: I remember what the internet was like before it was being watched and there has never been anything in the history of man that’s like it. You could have children from one part of the world having an equal discussion where they were granted the same respect for their ideas in conversation with experts in the field from another part of the world on any topic anywhere any time all the time, and it was free and unrestrained and we’ve seen the chilling of that, the cooling of that, the changing of that model toward something in which people self-police their own views and they literally make jokes about ending up on “the list” if they donate to a political cause or if they say something in a discussion. It’s become an expectation that we’re being watched. Many people I’ve talked to have mentioned that they’re careful about what they type into search engines because they know it’s being recorded and that limits the boundaries of their intellectual exploration. I’m more willing to risk imprisonment, or any other negative outcome personally than I am willing to risk the curtailment of my intellectual freedom, and that of those around me whom I care for equally as I do for myself. Again, that’s not to say that I’m self-sacrificing because I feel good in my human experience to know that I can contribute to the good of others.
[transcription from video]
It’s striking that Snowden didn’t say “privacy” in his mission statement. Greenwald framed the debate with the question many of us would ask after hearing that we’re being surveilled, and subsequent news reports by outlets across the globe widely referred to “privacy.” It’s unclear whether Greenwald and Poitras heard more of Snowden’s thoughts in which he raised the issue of privacy himself, but on camera he never says the word. He advocated an unmonitored internet from the vantage point of someone who is highly skilled at protecting his own privacy. He recollected the realization, at his NSA desk, that before too long he — a member of the tech elite — would be technologically outpaced and unable to protect his privacy. The technocracy was losing ground to the government.
Society owes Edward Snowden an enormous debt for his decision to blow the whistle on the NSA at great personal risk. To be clear: he enabled a profoundly necessary conversation to begin. However, his poetic description of the unrestrained nature of intellectual advancement is technocratic rhetoric for a digital utopia that never existed. As compelling and passionate as he is, Snowden made several incorrect assertions that should be dispelled in the interest of productive discussion.
First, there have been many inventions in the history of man like the internet, including the space shuttle, the airplane, the telephone, and the galleon, all of which brought people together across vast distances at previously unmatched speeds to have discussions and exchange knowledge. Mankind went through periods of adjustment to those profound changes in infrastructure, and we will navigate this one as well. Innovation is not unprecedented. This invention will mature beyond its makers, and it must assimilate to the needs of civilization, not the other way around.
Second, the children can still spend their days online talking to experts as equals if they want to (though it’s doubtful they do). Invoking chilled children and cooled innocence is misleading rhetoric when it’s primarily adults who spend their time staring at a screen. Further, the tech industry pushes expensive gadgets and software for kids but, as highlighted by the New York Times’ “Steve Jobs Was a Low-Tech Parent,” many technocrats strictly limit gadget exposure for their own families because they’re aware of the harmful effects of internet and technology use on young minds. Teenagers are a more complicated issue with regard to internet freedom, which is especially clear in the case of ISIL’s recruiting techniques, but Snowden wasn’t referring to Muslim children discussing ideas with expert terrorists across the globe. He wasn’t lamenting privacy incursions on thugs. In fact, he didn’t acknowledge the grey areas of internet freedom at all.
The most important falsehood in Snowden’s statement, and the core message of the technocratic ideology, is that the internet was once and should always be free. This is a seductive idea, especially to people with good computing skills and entrepreneurial leanings, but it is patently untrue. Getting online requires expensive hardware and infrastructure that is designed and sold by the same community that dominates the internet through technical expertise.
For the last 20 years the technology industry has hard-sold hardware to citizens, corporations and governments alike, along with software that seamlessly replaced or supplanted infrastructure for everything from financial transactions and brick-and-mortar stores to research and even face-to-face meetings. The technocracy orchestrated one of the greatest heists in history by amassing “free” content from writers and established media publications trying to maintain their brands with a millennial generation that wasn’t taught to pay people for their time, research, and intellectual work. As a final insult to “freedom,” tech companies undertook the systematic repackaging of users’ private information as data useful for advertising, which they bundle and sell to whomever they choose at a profit. (The word “user” rather than “customer” has always implied a barter arrangement, but it is rarely spelled out exactly what is being given and gotten. You open a social media account once, perhaps only use it for an hour or a day, but the service provider owns your personal information forever and can sell it many times over.)
In 2015, Apple, Microsoft, Google, IBM and Samsung all rank in the top ten of Forbes’ World’s Most Valuable Brands, and 11 more technology companies are in the top 100. Six of the world’s 20 richest billionaires are computer technology elite. All of that free internet has paid for mansions and private educations. There’s nothing wrong with companies and people making money off of this invention — America is a proudly capitalist society — but perpetuating myths about intellectual freedom while raging against government misuse of personal data is hypocritical and misleading.
If it appears I’ve misinterpreted Snowden’s meaning entirely, breathe easy. It’s clear that Snowden’s “free internet” refers to freedom of thought, communication and information, not freedom of goods and services. However, the cyber conversation can’t bifurcate those billions of dollars from the billions of devices and trillions of gigabytes of data. Doing so hides the massively lucrative business objectives behind fun, sometimes addictive, products. If technocrats truly want a free, unrestrained internet they’re now rich enough to forgo that pile of money, make cheap hardware, set chaos-legitimizing rules (First Rule of Internet: There are no rules) and enforce the entropy. I doubt they’d have billions of takers, and no one would be typing their credit card number into a chaos box.
*
Screenshot from the Department of Justice website
The Government
Spying, surveillance and covert activity have always been part of America’s security and defense apparatus; that activity just wasn’t legal. Illegality was at the heart of clandestine work, making it extremely risky and therefore far more considered by those commissioning it and those undertaking it. The legalization of amoral behavior came about in the weeks after 9/11 because, ostensibly, the president and his cabinet wanted the freedom to openly plan illegal activity without fear of legal repercussions. The PATRIOT Act inoculated government officials against risk and, many would say, against ethical pause. What followed was a confident, risk-free expansion of intelligence infrastructure with no heeded oversight and no endgame.
A nation that was once gripped by the unraveling of Richard Nixon now shrugs off revelations of CIA agents breaking into Senate Intelligence Committee computers in 2014. Government workers have spied on elected officials before, but today the public digests these incidents with a vague assumption that all criminal behavior by the government has a footnoted legal justification somewhere. These stories translate as infighting among elites. Fourteen years of the PATRIOT Act have conditioned Americans to expel what little outrage they can muster in a matter of days and then go limp. The groups taking legal action against injustices are typically news or special-interest organizations with a financial or moral dog in the fight and powerful legal teams to back them. (The latest New York Times op-ed piece from Wikipedia’s Jimmy Wales and the AP’s lawsuit against the State Department over Hillary Clinton’s records are two cases in 2015 alone.) Even with funded legal representation, there’s a pervasive sense that their effort is futile. For all of the flagrant rights abuses, the government’s tracks are papered over by the PATRIOT Act.
One way to step off the merry-go-round is to take a page from Alan Turing’s estimable problem-solving approach and look at what isn’t happening in our everyday lives. Government elites have made several huge assumptions on our behalf and, in light of Edward Snowden’s unspooling NSA leaks, it’s worth revisiting their decisions now that we can see the results. The government uses negative hypotheses to great effect (if we don’t renew the PATRIOT Act…), and so can the people whose rights are in the balance.
What isn’t being done with NSA-collected data?
Potentially, the important stuff. Through indiscriminate data-collection, the NSA is extensively aware of wrongdoing by the American people, corporations, government agencies and officials. We don’t need Edward Snowden’s evidence to know this is true. Daily news stories show that digital communications include sexually harassing emails in the workplace, threats of murder or violence, faxed paper trails of embezzlement, proof of premeditated theft, telephonic recordings of gender and race discrimination, and documented personal indiscretions by public officials. The American government inadvertently nets evidence of myriad criminal acts, both domestic and foreign. It then employs people to sift through these stores looking for some lawbreakers, but not others. When intelligence officers stumble upon criminal or threatening activity that doesn’t serve their objectives, do they look the other way to conceal their methods? It’s conceivable, even probable, that actual lives have been lost to inaction rooted in concealment. What happens in situations like these? What do the numbers look like on paper — lives lost or ruined versus casualties from terrorist attacks? The legal ramifications are mind-boggling, but the ethical question is straightforward: Is a government obligated to protect its people or its objectives?
What else isn’t being done with NSA surveillance data? For all of the time spent sweating over Apple’s Xcode, the U.S. government didn’t stop the Tsarnaev brothers, the French government didn’t stop the Charlie Hebdo murderers, and the U.K. government isn’t stopping thousands of teenagers from leaving the country, unaccompanied, to join ISIL. Most disturbing was the story of three teenaged girls who left the U.K. in February and may have been aided by a western spy in transit, forcing us to question why governments aren’t helping their most vulnerable citizens return to safety (and whether they may be using them as unsuspecting spy assets instead). With the Snowden data we have proof that our individual rights, and lives, are considered a worthy sacrifice to what the government deems “the greater good.” When spy agencies might be risking the lives of teenagers in the name of future terrorist attack victims, it’s clear government objectives no longer align with the values of the citizens they work for.
What if we don’t have the internet?
When Lindsey Graham weighed in on Hillary Clinton’s email debacle on Meet the Press with an I’ve-never-sent-an-email statement, he pumped a figurative fist of defiance. He’s a loud, proud Luddite in the new millennium. However, ask him where he does his banking, whether he gets money from the ATM, uses a cellphone, watches cable television, or has ever read the news online and he’ll be forced to admit he’s got a digital footprint. His televised statement gives him credibility with the anti-technology demo, the people who are done with all the smart talk and just want to love America with all of their hearts [see: Fascism, precursor to]. The only people alive today who aren’t consciously reliant on cyber technology are toddlers. The rest of the modern world communicates regularly online and is increasingly aware that public officials lack cyber expertise.
But what if we did live in Lindsey Graham’s la-la-land and didn’t have a digital footprint? A world without the internet is inconceivable today, but that world existed only two decades ago. In that time we traded more than just privacy for this infrastructure. What we save in time and gain in information should be weighed against what we spend in dollars to participate in the digitized world.
A sliver of the data shows that in 2014, 177 million smartphones were sold in North America, amounting to $71 billion in sales. Globally, 1.3 billion smartphones were sold. Add to that PC, tablet and cellphone sales, software sales, internet and cellphone service contracts…Americans pay a lot of money to go about their daily lives. This is not to suggest we should shun progress and innovation, but we should know what we’re getting for our money. We aren’t getting shiny new laws for the digital infrastructure we depend on. Our brightest technological minds unwittingly innovated a cyber-police state, and elected officials aren’t knowledgeable enough, or confident enough, to walk back what technology wrought. For a country that leads the world in cyber technology, many of our legislators are tech-dumb to the point of ridiculousness. The fatal mistake would be to insist we can separate ourselves from the infrastructure of modern society by never sending an email. Politicians like Graham sell that idea because it sounds freeing [See: Paternalism, Fascism’s sweet-faced uncle named], but they’re diverting attention from the pressing issue of lawmaking because they clearly have no idea where to begin. The gridlock in Congress might not be gridlock at all. Perhaps our representatives are simply confused about how to hit “Send.”
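Here is a minimal sketch of the arithmetic behind those figures, using only the numbers cited above; the per-device average and the global extrapolation are my own illustrative calculations, not reported statistics.

```python
# Back-of-the-envelope spend implied by the 2014 smartphone figures above.

na_units = 177_000_000          # smartphones sold in North America, 2014
na_revenue = 71_000_000_000     # North American smartphone sales, in dollars
global_units = 1_300_000_000    # smartphones sold worldwide, 2014

avg_price = na_revenue / na_units
# Rough extrapolation only: average prices differ widely by region.
implied_global_spend = global_units * avg_price

print(f"Average price per device (N. America): ~${avg_price:,.0f}")                      # ~$401
print(f"Implied global spend at that price: ~${implied_global_spend / 1e9:,.0f} billion")  # ~$521 billion
```

Even before adding PCs, tablets, software and service contracts, the device bill alone runs to hundreds of billions of dollars a year.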
Finally, who doesn’t control personal data?
If the answer to this question isn’t obvious yet, then it’s worth stepping into the nearest bathroom and checking out the wall above the sink. (Or ask Hillary Clinton. She gets it.) In military jargon, intelligence refers to strategically useful information. Information becomes intelligence when it has an application, and that application is determined by whoever finds, reads, assesses and controls the information. To grasp how important this seemingly obvious statement is, consider the juxtaposition of Director of National Intelligence James Clapper and former NSA contractor Edward Snowden, two men inside the same intelligence apparatus, in control of the same information, who found starkly different uses for it.
From this we must conclude that, within the government, a select group of officials and contractors control our information and they each have specific objectives in mind. Then we must acknowledge that almost none of us can articulate what those individuals’ objectives are so we don’t know if we agree with them. As internet-reliant citizens, we play the odds every time we connect digitally, not knowing which side of the numbers game we’re on. To use the analogy of WWII Britain, are we the majority at home or the unsuspecting brothers on targeted convoys? None of us can answer this question because the government elite draws up the map in secret. To the extent that events unfold in a manner we agree with and our lives aren’t negatively affected, we can only say we got lucky.
Loading screenshot of Google’s Virtual Library project
HOW WE CIVILIZE TECHNOLOGY
Living in Asia in the late 90s, I spent time in countries that were then considered “developing” economies. Textbooks were filled with prognostications about the potential growth and downfall of these places, but no bar chart captured the terrifying hilarity of driving an hour outside of Seoul at high speed in a brand-new sedan on unpaved roads, with only potholes and feral animals to navigate by. Technology was tangibly out of sync with infrastructure. A blocked road sent drivers veering onto the front steps of houses. Parking was wherever you felt like it, and parked cars were often rendered inaccessible due to other people’s feelings about parking. Disagreements were resolved the old-fashioned way, with pointing, yelling, and the threat of fists. Over time, enough pedestrians became casualties and enough expensive tires were blown in potholes that laws became necessary, as did the paving of roads. The automobile is no less amazing because society set a speed limit. We mitigate and retard technology where it threatens and outpaces us. This is how we civilize our innovations.
The most poignant irony of the Information Age is the internet’s role in restructuring our relationship to politics. Snowden avowed his intent to end the tyranny of the snooping government, but technocratic paternalism is equally invasive and it’s built into the digital realm. Complicated legal documents pop up at the outset of a business relationship and people with no legal background are conditioned to move ahead with a trust us one-click “Agree.” Our relationship to intelligent technology is best portrayed by the routine updates we tacitly agree to without reading or understanding what they entail. I Agree to whatever you’re about to load onto my phone or into my computer, agree to what you think is best for this device and my use of it, agree without stipulation, agree without working knowledge, agree because not agreeing seems time-wasting and foolish and questioning is beyond my technical ability. I always agree with you because everyone else is agreeing with you so it must be okay. I always agree with you because I don’t know why I should disagree.
This habitual agreement has proved deadly to the exchange of real information. The technocracy devised the fastest, most appealing method for securing a user, and internet users subsequently became desensitized to the act of giving away their rights. The repetitive process has numbed healthy suspicion of any organization that demands legal agreement to a loss of personal agency. Those internet service agreements are not there to protect individuals; they are documents created by expensive legal teams to ensure a company has no responsibility to the consumer. If these statements aren’t disturbing enough, stretch them to apply to the government in the shocking months and years after 9/11. The PATRIOT Act was the federal government’s service agreement, and the majority of the American people agreed to it without understanding what they were signing away.
Fourteen years on, perhaps the greatest misstep in rectifying our mistake is to begin with privacy. Loss of privacy is an end result. Privacy can be protected, it can be violated, but it cannot be given. That notion is a falsehood born of Victorian manners — I’ll give you some privacy — which preempt uncomfortable directives: Leave the room. Get off the line. Turn your head. Don’t read my emails. I need my privacy. The sci-fi notion of “mindreading” is terrifying precisely because it violates the only space on earth that belongs entirely to us. When we communicate with people, through talking, writing, or touch, we consciously extend that private space to include others. A violation of private space is a form of mindreading. In building society around the digital world, we’ve ceded a massive amount of private space just to move through it safely. The only recourse when you learn your boyfriend has read your journal is to hide it in a new place, but the only recourse when you discover people can hack your emails is to stop writing anything sensitive or private at all. By necessity, we’ve retreated inward. Our truly private worlds are almost entirely interior now. That loss of intimacy has already alienated us from one another. Unable to safely extend a hand or share a thought, our knowledge of people stops with avatars and public text. We can’t know people’s deeper feelings and they can’t know ours. There’s nowhere safe to talk. We are alienated.
When Glenn Greenwald asked Edward Snowden why he would risk imprisonment — the obliteration of privacy — Greenwald identified the one circumstance where personal agency is taken away. That the cyber debate revolves around the give and take of privacy tells us that we’re already in a prison of sorts. To get out, we need to reestablish laws and agreement. Not the tacit agreement of accepting free stuff in exchange for unknown costs, but overt agreement and an expectation of legal recourse if our rights are violated. As Stephen Krasner observed: “The Constitution is a document more concerned with limiting than enhancing the power of the state.” Modern lawmakers violated this precept into extinction with the PATRIOT Act. There’s no indication that the present government will give up the PATRIOT Act of its own volition, and no reason to believe the public has the will to make it. This is where most people drop out of the resistance movement and succumb to prison life.
The other misstep in solving the puzzle is our obsession with predicting the future. Pew Research Center’s Net Threats survey of over 1400 technology experts asked them to predict “the most serious threats to the most effective accessing and sharing of content on the Internet.” But with so much emphasis on forecasting, we’re overlooking today’s storm. If you’d asked a South Korean mother living 20 miles from the DMZ in 1997 what the most serious threat to her children’s lives was, most Americans would have expected her answer to be a doomsday scenario of war with the North. However, it’s just as likely she would have said: “See that black sedan driving 50mph over my front doormat…?” Attention-grabbing headlines often obscure imminent dangers. Public discussion leapfrogs over what we could solve today because no one wants to dig in and do the unglamorous work of painting a dotted line down the center of the road. (Why isn’t Pew asking these 1400 experts to identify today’s most solvable problem and offer a specific solution? That’s 1400 solutions right there.)
If technology is responsible for creating a state of alienation, then the government is guilty of capitalizing on that alienation. When politicians appeal to people’s confusion over new technology, they perpetuate a dangerous myth: that people can separate themselves from the digital age. Lindsey Graham’s opinion on cyber surveillance is useless if he doesn’t understand how Americans use email or why they might be upset that those emails are intercepted and read by government officials. Perhaps he’d like to turn his diary over to the CIA and see how that feels. Then his vote on privacy legislation would certainly be made with the necessary wisdom.
America is a world leader in computer technology and innovation. Every member of Congress, and certainly the next president, should be knowledgeable about computer technology. America’s elite governing body must be prepared to debate cyber. My 90-year-old grandmother has been sending emails for years and she has a Facebook account. If senators can’t keep up with her rudimentary computing skills then they don’t belong anywhere near the Capitol. The most important action Americans can take is to vote for cybersmart House and Senate representatives in upcoming elections.
As backwards as Washington seems, cybersmart politicians do exist. It’s clear from Hillary Clinton’s decision to house computer servers in her home during her tenure at State that she’s knowledgeable about cyber. Despite her public statement, Clinton’s use of personal servers has nothing to do with convenience and everything to do with security. Clinton owns her data. She also possesses depth of knowledge about what goes on in the intelligence community, and I expect that is precisely what drove her to take control of her privacy. If she wants to do the country a great service, in or out of the White House, she should make cyber legislation her top priority and level the playing field for citizens everywhere. It would unite the country to speak plainly about the state of our internet. Honest talk about cyber surveillance from a public figure who can speak to both sides of the debate would be a huge step forward for the country.
What will hopefully become apparent, to decision makers and citizens alike, is that both sides of the ideological struggle derive their power from the online participation of citizens. The present situation has left people with nowhere to turn for trustworthy leadership. The same kinds of conditions that permitted fascism’s spread — post-war malaise, financial struggles, political distrust — tamp down people’s natural resistance to an incremental loss of agency. The circumstances that facilitated the rapid creation of totalitarian governments in previously liberal, rational societies are cropping up again a century later. The situation is again ripe for Machtergreifung.
Democratic European societies once made a desperate attempt to escape their status quo by funding unstable third parties with disastrous consequences. We are now seeing many radical ideas thrown into the mix, some backed by logical process, others attempting to shake people out of rhetoric fatigue. Reboot the Government! Reboot the Bible! Reboot the Brain! Drop one letter from those slogans and we’re deep in A.I. territory. Bill Gates, Elon Musk, Stephen Hawking and their ilk proclaim their fear of the dark side of artificial intelligence with increasing regularity. We should be afraid too. There’s no precedent for the power vacuum created by a flaccid Congress and a disproportionately wealthy technology sector. This situation could pave the way for the first artificially intelligent leader. The engineering is getting there, and the rest would be…history.
CONCLUSION
At the end of The Imitation Game, when the Germans have been defeated and the war is won, the British codebreakers sit around a table to be dismissed. They are solemn and alienated from one another because of secrecy, spying, suspicion, and lying, though each believes their transgressions were the morally responsible thing to do. They’re ordered by their government to keep yet another secret — deny everything they know and deny they know each other. The path they’re on has no exit and no truth. They’re in a prison of past decisions and will be for the rest of their lives. However, the circumstances that created their prison are the opposite of America’s situation today. In WWII the British government was desperate. The enemy was winning. Its strategy wasn’t clandestine by design but by circumstance, and the British public was spared the burden of deciding whom to sacrifice.
Today we’re faced with governments and corporations that spy, lie, classify decision-making, and manipulate online users. These conditions are self-perpetuating. There is no definitive endgame in the shapeshifting political narratives and money-making schemes except to exert more power over the online space. To reclaim the space for public privacy, we must take the messages we’re being sent and decrypt the puzzle ourselves. Whether your bias is to fault the system or the individuals who make decisions within it, both are responsible for mistakes, and both hold the keys to solving the puzzle. The trick is to look at what isn’t there, and to ask why something is free.
At the Oscars last weekend, Sean Penn presented Mexican director Alejandro Inarritu with the Best Picture Oscar and a joke. “Who gave this son of a bitch his green card?” His comment hurt people but it was important that he said it. With a seemingly off-the-cuff remark he reminded billions of people worldwide that for the fifth year in a row America’s most celebrated film came about because America is both a temporary and permanent home to talented hard-working foreigners. Acknowledging Inarritu without acknowledging how he came to make his much beloved film is the ugly habit that perpetuates a damaging fiction of American life.
Rudy Giuliani tried his hand at the same topic last week. His swipe was serious where Penn’s was a joke, but both men drew similar reactions in the media. Giuliani blessed us with something approximating the clichéd second-act plot twist of a romantic comedy when he announced at a GOP fundraising dinner that President Obama doesn’t love America. “He doesn’t love you, and he doesn’t love me.” Giuliani clarified in a follow-up interview that Obama is a patriot, but that doesn’t mean he loves his country. (Relationships are so complicated.)
Giuliani and Penn are both savvier than their comments suggest. There’s a political message buried in the birthright narrative that Americans are finally on track to demystify. Beneath the fabled veneer of an all-American childhood is the reality that there is no uniformity to the American way of life. This is a vast, complex, multicultural democracy. Between cities, towns, states, time zones, and even between parents and children, there are stark differences in American upbringings and American lives. Anyone who defines America by his own experience is describing a culture of one.
When Giuliani stated that “[Obama] wasn’t brought up the way you were brought up and I was brought up…” he got one important fact right: Giuliani and the President have different backgrounds. Giuliani was born and raised in New York. Obama belongs to a group of Americans (and other nationalities) who spent part of their childhood as expatriates. The term Third Culture Kids (now Third Culture Individuals, or TCIs) arose in the 1950s to describe American children of expatriate military, foreign service, missionary, and business families. Modern surveys of TCIs paint an interesting portrait of the “global citizen,” with hallmark traits of linguistic adeptness, creativity, and excellent observational skills, but the salient characteristic of a TCI is multiculturalism.
There are various interpretations of multiculturalism (as one would expect), but the predominant tenet is to preserve and respect cultural and ethnic differences. Coexistence is the goal, rather than dominance by one culture. In extreme situations, rejection of multiculturalism becomes the justification for genocide. In more moderate societies, cultural intolerance plays out in the economic realm, when minorities suffer without access to the same opportunities as the majority, the armed, or the wealthy. Although the world has always struggled to accommodate cultural differences, modern civilization presents a unique confluence of culture, technology, and mobility. In 2015, a person can physically travel anywhere in the world within 24 hours and can virtually connect with nearly 50% of the world’s population in mere seconds via the internet. This unprecedented proximity of cultures is not optional food for thought. The interconnected world requires multicultural leadership.
Thus, multiculturalism is the key to America’s future. America’s power as a world leader is predicated on a thriving world to lead. (Translation for Team Giuliani: You can’t be great if you’re the only country at the table.) Leadership means fully grasping your own potential, the potential of those you lead, and the potential of those you compete against. A quarterback is nothing without his team, and a great quarterback is the first to acknowledge the talent and efforts of his opponents after the game. Why? The point of competition is to test your skills, not to marginalize others. A victory over the weak is profoundly unexceptional. Collaboration is the unspoken foundation of competition. The ethics of sportsmanship assume that we make each other better by giving our best effort and playing our hardest and fairest. Countries are no different from teams in this regard. America needs the world as much as the world needs us.
As America’s quarterback, Obama has struggled to find his footing. He took office with the expectation that people were rolling up their sleeves beside him, yet his message of multiculturalism and his invitations for all views at the table were met with increasingly virulent suspicion from opponents and supporters alike. The people who voted him into office were unprepared for the “otherness” of his ideology and they reacted by withdrawing. Obama was equally at a loss for how to allay people’s fears. He doggedly stuck with the tactics that won him the presidency — the explanations of inclusion and the promises of cultural prosperity — but the through line of the story wasn’t conveyed: that change begins at the top, but real change happens within the people. This dynamic is at the heart of America’s culture crisis. If we resolve it, we will be exceptional for doing so.
In the meantime, the communication breakdown has spiraled into the worst gridlock in congressional history, and Obama has lost the trust of his constituents. The quarterback is nothing without his team… It’s irrefutable that Obama loves America. He has served as our president for six years, and counting. (If that service isn’t enough for Giuliani, then he’s taking the definition of love to a whole new level. I kind of want to go there just to see what I’ve been missing.) However, Obama likely sees one intractable barrier to America’s limitless potential: just like the rest of the world, America is a population of Rudy Giulianis. Not the Giuliani who makes racially insensitive remarks, or spouts provocative political rhetoric, but the broadly drawn Giuliani who is fearful and suspicious of “other” and demands reassurance that he is exceptional. There’s a Giuliani in each of us and he is at odds with multiculturalism.
The most powerful act of cultural evolution Americans can hope to achieve today is to embrace their diversity. A global community resides within American borders. Acceptance of American diversity as our new millennium identity is a conscious act of self-education, of stepping beyond familiar terrain and learning about the people who reside steps, minutes, or miles away. To fully grasp our potential we need to know each other. With one click we can learn about, connect with, and even see people far beyond our own borders.
We have in Obama a president who is uniquely suited to help us balance a multitude of views. A TCI president was an intelligent choice to grow America’s position as world leader in the Information Age. Obama possesses both an abiding love of this country and a deep understanding of the riches of the world at large. His espousal of multicultural views is exceptional in the canon of presidential rhetoric. However, for this president to be effective Americans must actively embrace diversity on an individual level with self-directed hiring practices, awareness of conflicts, and learned skill at resolving intercultural differences. No policy will make the difference. Obama can only lead by example.
Sean Penn knew there was a high probability he’d be handing the Best Picture Oscar to Alejandro Inarritu, his Mexican friend and colleague. I expect the “green card” comment was a calculated statement, not a flippant joke. Ever the activist, Penn took it on the chin for progress and invited people to hate him for the sake of opening a dialogue. The hurt people feel at the mention of “green card” is not because Penn said the words, but because of what those words have come to signify in daily American life: other, different, less than, less American. Wasn’t brought up the way you were brought up and I was brought up. Doesn’t love you, and doesn’t love me. That story is out of date. The new romantic tale of America is one where we take our love to the next level and learn to embrace who we want to be: a society of tolerant, peacefully coexisting people who draw on our vast, diverse strengths, each of us different, all of us equal.