Tag Archives: internet

Magic and Loss

Magic and Loss: The Internet as Art by Virginia Heffernan
My rating: 3 of 5 stars

I decided to read this after hearing the author on the Recode Media podcast and reading some of her shorter pieces in the Times over the years. There’s a lot to think about in Magic and Loss – I enjoyed the lucid language and often insightful commentary. The homage to the death of the telephone was wonderful, and the quick take on ‘science’ writing in the mainstream media was funny – but there was also a share of flimsy moments (did she really just try to summarize a billion photographs on Flickr by talking about the style of two users?). I found myself occasionally waiting for more substantive technical discussion (maybe I’m conditioned to expect it in any writing about the internet), but I guess the ‘internet as art’ premise doesn’t leave room for grubby engineering stuff. Unfortunately, the end of the book veered into esoteric academia. It’s impressive to see someone as equally versed in obscure YouTube clips as in Wittgenstein, but wrapping the book’s closing chapters in personal academic history (something about tweeting at a physics professor?) left me feeling disconnected. I may eventually give this book another try, but next time I’ll go for the text (instead of the audiobook) so I can pause and follow up on the many arcane references.

on ‘The End of Absence’

“I fear we are the last of the daydreamers. I fear our children will lose lack, lose absence, and never comprehend its quiet, immeasurable value.”  –  The End of Absence

Many children this winter, especially in Boston, are having days off from school because of the weather. They’re being ‘absent.’ I used to love being ‘absent,’ on snow days. There was a peculiar isolation in it, a kind of detachment that’s almost impossible to reproduce now. This winter, those kids in Boston are having an entirely different ‘absence.’ They’re not absent in the way that I used to be absent.

The End of Absence by Michael Harris is another book about the internet and how modern technology is changing the human experience. I keep reading books like this. Most of them take a pessimistic view of what it all means, and the fact that I spend many evenings reading them sits somewhat at odds with the fact that I spend all my days getting paid to embrace the very technology they lament. That contradiction will have to wait for another blog post.

So, is this particular work saying something of significance that other books like ‘The Circle,’ ‘The Shallows,’ or ‘You Are Not a Gadget’ haven’t said already? Maybe, maybe not. They’re all reminders that this isn’t a localized phenomenon – everybody’s feeling it.

The book starts with a summary of ‘kids these days,’ laments how no one reads anymore, and guesses that, given the changing nature of communication and availability, neuroplasticity will turn our brains to puddles. The internet has led us to a permanent state of ‘continuous partial attention,’ and we should be adequately concerned. One dramatic statistic claims that if you’re over thirty, you’re probably having just as many electronic interactions as physical ones. This is particularly difficult, because if you’re over thirty, you’re also old enough to remember when this wasn’t even possible, and to be bewildered at what things have become.

So, what are the products of ‘continuous partial attention’? We’re confessing a lot of stuff, writes the author: “it often seems natural, now, to reach for a broadcasting tool when anything momentous wells up.” Why does that matter? Because it’s apparently made us all think we’re celebrities. The findings of a study of 3,000 parents in Britain were cited:

“the top three job aspirations of children today are sportsman, pop star, and actor. Twenty-five years ago, the top three aspirations were teacher, banker, and doctor.”

The technology enables our banalities to become public performance, so public performers we (or our children) want to be.

In addition to our newly permanent residence in a virtual confessional booth, we’re also all experts now. The expression of public opinion is no longer filtered, edited, and perfected by trained editors before presentation. Some validations are in place to prevent outright falsehoods from spreading in places like Wikipedia and Yelp, but those forums are just too big to moderate efficiently. Bullshit abounds. Bullshit is what happens when someone is forced to talk about something they don’t know anything about, and it exists everywhere now that everyone is encouraged to be an ‘expert’ and rewarded for their ‘competence’ with likes, comments, re-tweets, etc.

Bullshit proliferation leads into the next problem created by the ‘end of absence’ – authenticity. The author makes an interesting point about how ‘young, moneyed people’ have made the ‘re-folking’ of life a priority – think Mumford & Sons. The IFC show Portlandia has been awkwardly successful at satirizing and celebrating this kind of ‘return to roots’ culture, where after decades of fast food, people now want to know what kind of farm their dinner was raised on; or, in the midst of the digital technology era, ‘steampunk’ advocates rebel and hold intensely serious seminars. The fetishization of the ‘authentic’ – record players and ‘old-fashioned’ moustache wax – is ‘the exception that proves the rule,’ according to the author.

Between all our confessing, expertise-sharing, and bullshit spewing, we hardly have the attention for anything else. In the chapter on ‘Attention,’ and its recent universal obliteration, the author documents his attempt to read ‘War & Peace’ with the tone of someone trying to swim to the moon. He eventually finishes reading the novel, but not without claiming that he’s alienated himself from everyone and everything he knows in the process.

A few more chapters, about the erosion of our ability to memorize and the ‘permanent bathhouse’ state of mind afflicting online romance-seekers, lead up to the book’s final act – the author attempts a temporary return to absence. His phone duct-taped to a table, internet connection severed, kooky old neighbors visited for coffee – he makes a valiant effort to go back in time, to when people could be ‘unavailable.’ No one ends up homeless or murdered, but the experiment reads dangerously close to an irrevocable shattering of domestic tranquility between the author and his partner.

Following the toe-dip experiment in returning to absence, the book’s final lesson is this:

“Just as Thoreau never pretended that cutting out society entirely was an option— and never, as a humane person, wanted to be entirely removed— we shouldn’t pretend that deleting the Internet, undoing the online universe, is an option for us. Why would we, after all, want to delete, undo, something that came from us? It bears repeating: Technology is neither good nor evil. The most we can say about it is this: It has come. Casting judgments on the technologies themselves is like casting judgment on a bowl of tapioca pudding. We can only judge, only really profit from judging, the decisions we each make in our interactions with those technologies.”

– The End of Absence 

Two Documentaries on ‘Hacking’

BBC – How Hackers Changed the World

While Anonymous was given most of the treatment in this piece, I didn’t feel like their story was the strongest thread in the overall hacker narrative. LulzSec appears to be the group that did the most actual damage, while WikiLeaks is the most ethically challenging.

Anonymous comes across as a bunch of people posting on message boards who all showed up to protest the Church of Scientology once. Occasionally they were able to DDoS some government websites. They are presented as large and formidable because allegedly ten thousand people participated in the Scientology demonstrations – but 10k isn’t that much, in the grand scheme. Ten thousand people shop at the Gap, ride the metro, and buy hot dogs every day. Boring and mundane stuff also attracts many people. Crowd size alone doesn’t confer meaning or importance.

LulzSec, on the other hand, appeared able to cause more disruption with a much smaller and more focused group. Anonymous seemed too large to manage any kind of cohesive campaign, and the premise that it could operate without leadership doesn’t really have a historical precedent – anarchistic groups or societies have not accomplished much in the world compared to groups with leaders and structure.

Discovery Channel – Secret History of Hacking

I think the order in which I watched these two documentaries made a difference in how I reacted to them. Watching the BBC doc first, and being exposed to Anonymous before Wozniak and Cap’n Crunch, had a different effect than if I had watched them in reverse. I think I had less sympathy for Anonymous’s cause without the historical context of the hackers who came before them.

The hackers of earlier decades, who began with ‘phone phreaking,’ seemed to have a more innocent purpose. In the BBC doc, hearing the stories of Anonymous, LulzSec, and WikiLeaks, I didn’t detect any of the harmless, curious aim that the original hackers had.

The earlier hackers also seemed to push the technology further. They subverted the intended use of the technology in a way that was unprecedented. Instead of disrupting or harming the way other people used it, they just found a way to use it more freely themselves. They were apolitical.

One interesting moment came when an early hacker spoke about the initial perception of computers – ‘why would anyone want one of these in their house?’ seemed to be the prevailing sentiment back in the ’70s, when computers were large and not very powerful. It makes me wonder what technology is in its infancy today that people are having the same reaction to. 3D printers? Self-driving cars? Google Glass? What else are people shunning for its perceived uselessness that might someday dominate markets?

Two Bloggers Blogging

I stumbled twice today on articles by bloggers about blogging.

First, in the Washington Post, Barry Ritholtz celebrated his 30,000th blog post. Yes, thirty thousand. If you’re having trouble comprehending that volume, you aren’t alone. He writes about finance on his personal blog and also contributes to several papers. Today was the first time I’ve ever read anything by him. Anyway, it’s taken me seven years to amass a paltry 172 entries in Brian Writing. At my current output, it will take me ONE THOUSAND two hundred and fifty years (1,250 years) to catch up with Mr. Ritholtz. I better get going.
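The arithmetic behind that number is simple enough to sketch in a few lines of Python (a throwaway calculation, with my post counts hard-coded and the rate rounded about as generously as I rounded it above):

    # Back-of-envelope: how long until Brian Writing hits 30,000 entries?
    my_posts, my_years = 172, 7               # my archive to date
    ritholtz_posts = 30_000

    rate = my_posts / my_years                # ~24.6 posts per year
    years = (ritholtz_posts - my_posts) / rate
    print(f"{years:,.0f} years to catch up")  # ~1,214; call it 1,250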

Second, my LinkedIn feed promoted a post by Richard Branson on his blogging tendencies. Instead of celebrating the milestone of an umpteen-thousandth post, Mr. Branson offers a general treatise on the writing habit. The Virgin Group founder says that topics can be found in anything (I agree) and also praises the art of delegation – although he insists his posts are self-written, a team of ‘content’ people helps him generate ideas. Apparently, among the committee’s ideas is to format each post with no fewer than three or four large, carefully composed promotional photographs of the author: talking on phones, wearing leather jackets, gazing up at the sky, or pensively stroking his beard.

I enjoy writing in this blog for one of the same reasons that propelled Mr. Ritholtz to churn out thirty thousand entries. He says: ‘Writing is a good way to figure out what you think. Often, I have no idea what I thought about a subject until I begin to write about it.’

Well put, Ritholtz. It’s also a good way to make fun of billionaires and their library of self-stock imagery.

on Walter White and ‘Offline’ Identity

I’m apologetically writing this well after it originally aired, but I’ve been watching Breaking Bad for the first time. (Spoilers will be small and few, out of respect for the uninitiated.)

Instead of offering my own full-throated gushing about how amazing the show actually is, I’ll simply quote from, and agree with, these words from the AV Club’s reviews of the episodes ‘ABQ’ and ‘Full Measure’ –

“…this show has been one of serialized drama’s greatest accomplishments.  Television itself suddenly seems to have an expanded horizon of possibilities — for characterization, for juxtaposition, for thematic depth.  Whatever happens from this hellish moment, the long descent to this point, with all its false dawns and sudden crashes, was singularly awe-inspiring, uniquely cathartic. People living through a golden age often don’t know it.”

“Extraordinary flowerings of art, technology, culture, or knowledge are obscured by intractable problems, crises, declines in other parts of the society… It’s easy to look at television, with its 500 channels worth of endless crappy versions of the same empty ideas, and conclude that everything’s gone to shit… Ironically, this pronouncement coincides with the greatest flowering of televised drama and comedy in the medium’s history.”

There are many qualities that make Breaking Bad an incredible viewing experience, the first of which is Bryan Cranston’s boundless performance in the lead role. His acting is the only reason I’m able to think of this show in such a realistic context, and to analyze his character as if it were an actual person in the same world that I live in. I could offer unending praise for the acting, the brilliant camera work, the dialogue, etc. But I just want to focus this post on one specific thing that’s caught my attention as I set out to finish the series over the next few weeks… (let’s be real… days).

Walter White’s defining characteristic is arguably his squabble with identity – is he the murderous meth-cooking gangster boss Heisenberg? Or is he the doting father, soft husband, and nerdy brother-in-law?  Is there room in a single fictitious character for both? (Yes.) Is there room in a real human being for both? (I believe so.) Maybe the show’s finale will answer some of these questions definitively, but I haven’t reached that point, so I’m still undecided on the matter. I’m willing to guess that there will still be plenty of room for interpretation on Walter’s moral character, even after the last episode’s credits roll.

The question of his identity seems so important because of other things happening in our culture right now. This is the age of Facebook, where the most private lives of the most everyday people are just as public as any royal’s. The first season of Breaking Bad aired in 2008, the heady days in which ‘social media’ became a phenomenon too large for anyone to ignore. The show continued playing out on the screen while, in the audience’s living rooms, internet technologies connected the personal lives of everyone around the world at a breakneck pace – most intensely, the lives of comfortably wealthy Americans, especially those with an interest in the sciences or technology.

If the impetus for Walter’s entire journey is his need for money – in Season 1, funding cancer treatment was his reason for embarking on a criminal campaign – how could someone of his cultural demographic overlook the most money-making industry of this decade, the internet? When Breaking Bad premiered, and for years before, the American economy had been driven by the high value of software and computer technology. But Walter isn’t part of that America, somehow.

By all accounts, Walter White – caucasian middle-class scientist, teacher, and 2004 Pontiac Aztek driver – is the incarnate persona of a modern American internet user. If you knew a man with Walter’s pedigree, you would expect him to spend his time off in some dorky enterprise like geocaching, or beta testing Google Glass. His chemically laced resume screams “Googler.”

But in which episode did we ever see Walter crack open a laptop? Somehow, all this fancy new ‘social’ technology has overlooked him. Instead of the positive social incubator it is intended to be, the technology figures only as a potential intrusion into the fragile anonymity Walter maintains as a criminal.

The show doesn’t completely leave the internet out of its narrative – Walter Jr. raises money for his dad’s cancer by setting up a donation website, Skyler does her money-laundering research on Wikipedia – but it rejects the idea, so often presented in today’s culture, that all of this online transparency could prevent someone from taking fuller measures to hide their deviant intentions.

In the world of Breaking Bad, Walter is not persuaded by these popular new gadgets to connect in a positive way to his community, as much as Facebook would like to “make the world a more open place,” and Google would like everyone to follow its corporate motto, “Don’t be evil.” Silicon Valley’s utopian rhetoric falls limply on Walter/Heisenberg’s deaf ears.

I might be overly sensitive to this idea, working as I currently am for a company, ID.me, whose purpose is to enable an individual’s authority over their own identity on the internet. In this field, as it exists now, all roads are converging on transparency. There is no accommodation for subversive duality in the minds of those leading the development of digital identity. On Google, Facebook, ID.me, and anywhere else you want to be yourself online, you only get one persona, and it’s intended to comprise your whole self.

Popular opinion has recently treated privacy as debatable, far from an ‘inalienable right,’ and the public parade of social media is driving the idea further.  The notion that governments and neighbors can snoop and sneak through a citizen’s life, online, is commonplace.

The narrative of Breaking Bad indirectly comments on the situation: it says Yes, a person may keep part of their life private… but they might be a drug kingpin. And with its morally circuitous characters, it also diffusely challenges the evolving concept of identity, by illustrating – No, the depths of a person probably cannot be summarized by a few photographs they post to their ‘wall.’

In Reality, Googling

The line of audience members queuing up for their turn to throw a question at Eric Schmidt, Google’s Executive Chairman, seemed oddly like an inefficient search engine. There were so many things un-Googley about it, like having to wait for someone else to finish before I could ask a question, and having to get up out of my seat to get in line.

Otherwise, the hour that Schmidt spent discussing his latest book, “The New Digital Age,” with co-author Jared Cohen covered much ground and put a human face on a company that often seems much more robotic than peopled.

The book was just released in paperback, plastered with glowing reviews from statesmen like Bill Clinton, Henry Kissinger, and Tony Blair. In it, Schmidt and Cohen attempt to map out a future shaped by what they label humanity’s greatest experiment to date with ‘anarchy’ – the internet.

The forum at Sixth & I Historic Synagogue in Washington was largely open to audience participation and effectively managed by a moderator who was prone to poking fun at the speakers – ‘No matter how many billions of dollars he has, Eric is still a dork,’ he quipped, after telling a story about Schmidt’s peculiar interest in flak jackets.

Topics ranging from gender equality in the workplace to the role of technology in societies at war to the responsibilities of parents in the realm of online privacy were all touched on. Hard-working women were given ample credit for helping Google achieve the success it has, and Schmidt, when asked how the public sector might follow the same course, responded simply: ‘Promote them!’

He circled around several points of praise for promoting women in the workplace but, coming up short on actual advice for the public sector, retreated to saying, ‘The fact that there’s a conversation about this right now is a start.’

It wasn’t the only topic that would prompt the ‘…it’s good to talk about…’ refrain. Inevitably, the conversation turned to government surveillance. Schmidt began to outline the international reactions to the idea, saying that if you ask a citizen in Germany about government internet snooping, you’ll get a totally different answer than you would from a citizen of Britain or the United States. ‘The fact that we’re having this conversation is a start,’ he said again.

It’s a reasonable answer, and it’s fantastic that this discourse is taking place so amicably between citizens and government, but Schmidt’s story of ‘to each their own’ fell short of making a real statement.

‘Beware the myth of the single omnipotent decision maker,’ Schmidt related when asked about his philosophy on leadership. He went on to describe a room full of people sitting around a table and shooting down each other’s ideas as the most effective way to come to a solution, lambasting the idea of a heroic individual effort in reaching profound conclusions. His regard for collaborative decision making might explain his reticence on American leadership in the debate about government snooping – perhaps it’s better to wait and see what everyone else thinks, too.

Cohen, a younger Google employee and the leader of the ‘Google Ideas’ branch of the company, took over when an audience member began to inquire about online privacy. ‘It’s the parents’ responsibility,’ he began, ‘to talk about privacy before they even discuss the birds and the bees’ with their children. I felt like this was a punt, much the same kind of argument that pro-gun advocates make when claiming that it’s the shooter who is to blame in a killing, not the gun.

One of the final questions of the session was the most interesting – a man asked whether, in this age of information inundation, tools like Google are doing anything to help filter the signal from the noise, or whether they are actually making it harder to sift through unnecessary information – and again, from Cohen, a punt: ‘It’s a human problem, not a technology one.’

on Efficiency, Depression, Happiness, and Beer

The Google Ngram viewer charts the incidence of terms in 5.2 million books dating back centuries. It’s a pretty amazing tool.

Choosing the words ‘efficiency’ and ‘depression,’ I graphed their usage over the last 500 years. I found a close correlation between the terms, with both beginning to rise around 1750 and sharply peaking around 1925. As a control I also included ‘weather,’ which showed little correlation to either efficiency or depression. Something to think about as we continue to make ourselves faster, better, and stronger!
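If you’d rather pull the raw numbers than squint at the chart, the viewer sits on top of a JSON endpoint that can be queried directly. Here’s a minimal Python sketch – with the caveat that the endpoint is unofficial and undocumented, so the URL and parameters below are assumptions that could break without notice:

    # Query the (unofficial) JSON endpoint behind the Google Ngram Viewer.
    import requests

    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": "efficiency,depression,weather",  # comma-separated terms
            "year_start": 1514,   # 'the last 500 years'
            "year_end": 2008,
            "corpus": 15,         # id of the English (2012) corpus
            "smoothing": 3,
        },
    )
    resp.raise_for_status()

    for series in resp.json():
        # 'timeseries' holds one relative-frequency value per year in range
        ts = series["timeseries"]
        print(series["ngram"], "peaks around", 1514 + ts.index(max(ts)))

Plotting those three series should reproduce the viewer’s chart; the endpoint and corpus id are the fragile parts, which is why the official viewer remains the stable interface.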

****EDIT****

To avoid being too much of a downer on a Friday, I want to also include my findings on the relationship between happiness and beer. If this doesn’t demonstrate true progress, I don’t know what will – we’ve almost reached equilibrium!

[Ngram chart: ‘happiness’ vs. ‘beer’]

I don’t expect to break any ground with these findings; I just wanted to share what is a very fun set of data to play around with. For more nuanced and careful analysis, see the book ‘Uncharted: Big Data as a Lens on Human Culture’ by Aiden and Michel.

Three Books About Computers

I’ve been reading some more essays on software engineering and computer programming lately, from the three following books. Here’s a brief synopsis and some of my thoughts on each:

Program or Be Programmed: Ten Commands for a Digital Age – Douglas Rushkoff

The back jacket of this book describes Douglas Rushkoff as an author and media theorist – not as a programmer, which should be a yellow flag for anyone coming to this text looking for pragmatic programming advice.

That said, he offers an easily digestible summary of trends in internet technology, and where he thinks society as a whole would benefit most if certain standards of thought were subscribed to in the future. Many of his concepts are agreeable, if a little alarmist. (Which is okay, because I think I might be turning into a bit of an alarmist myself.)

I think the most important message Rushkoff is trying to send is one that communicates the significance of the current moment in the history of human communication and cognition. He offers ten bullet points for how to get out in front of the avalanche.

“We are not just extending human agency through a new linguistic or communications system. We are replicating the very function of cognition through external, extra-human mechanisms. These tools are not mere extensions of the will of some individual or group, but tools that have the ability to think and operate.”

His response to these ‘extra-human mechanisms’ is ten ‘commands’ for how to navigate the new normal. Instead of letting programs impose their logic onto our own biological cycles, he warns, we must continue to be in charge. A few of his ‘commands’ –

  • Time: We Must Not Always Be On.
  • Place: Live in Person. There is No Substitute for In-Person Interaction.
  • Discrete: Everything Is a Choice. You May Always Choose None of the Above.
  • Identity: Always Be Yourself. Accountability Must Exist Online.
  • Openness: Share. Don’t Steal.

Rushkoff makes several statements about why it is so important to be careful in how society progresses alongside internet technology, but the one that struck me most was an analogy that might be stretching the limits of fair comparison – he compares people not learning programming to people over the last few decades not understanding how their cars work.

He lists all the gripes about the modern automobile situation in the United States – sprawl, environmental hazards, driver stress, accidents, etc. – and proposes that all these negative things could have been avoided if people had spent more time wondering how cars work and what they do, instead of just getting in and driving them.

Therefore, he says, everyone should know how to program, so that sixty years from now we aren’t stuck in a debate about how to fix all the problems that poor programming has caused, or how to catch up with the other nations who have advanced past America in their programming ability, etc.

His argument isn’t ridiculous – he’s saying that by blindly accepting the technologies, we blindly accept the risks and dangers they bring with them – but these seem like two different animals.

I like the idea, and I agree with some of the sentiment, especially that automobile culture has created an environment with just as many new problems as solutions to old ones – but I disagree on where to lay the blame.

Not everyone is meant to be a programmer, just like not everyone should be a Navy SEAL, a mural painter, or a Tour de France cyclist. It requires a person with a unique set of skills and personality.

The automobile problems society has now seem to me as much the result of poor land-use decisions, mismanaged federal funding, and bad urban planning as of ‘people not knowing how their cars worked.’ How can that claim even be justified? How can you measure what would be different if people had ‘known how their cars worked’?

If Rushkoff is arguing we should all learn to program just to check off the box and have our asses covered later, in case someone needs to take the fall for writing bad programs – why is he choosing the programming layer, specifically, as the thing society must learn?

There are so many components of modern technology that make the internet work, from the electrical engineering in circuit boards to the network infrastructure of fiber-optic cable, that equating programming with world-saving feels like a weak-handed grab at one small piece of something that happens to be currently fashionable. Programming just seems like an easy target because it’s the most popular.

Hackers & Painters: Big Ideas from the Computer Age – Paul Graham

Unlike Rushkoff, Graham hasn’t written this book as a treatise for an entire society to absorb and go forth with. Graham’s ideas don’t feel forced, or directed at a readership who wouldn’t already understand many of the concepts. He’s not trying to push the envelope as much as make the envelope shinier and easier to read.

In a masterfully clear writing style, Graham lets his intellect run wild through a variety of topics. He observes ‘Why Nerds Are Unpopular’ in one chapter. He expounds in another that startups are the safest bet for wealth generation. Choosing an operating system, finding the ‘perfect’ programming language, and why income inequality is a positive thing in society are all topics treated with their own essays.

Graham doesn’t make any bold statements in these essays that share Rushkoff’s sentiment that programming should be a giant rainbow on which every type of person in the world can dance – on the contrary, Graham comes across as more likely to believe that programming should only be done by people who want to do it and understand how to do it, and that everyone else should stick to the things they are good at.

In his ‘Nerds’ essay, he outlines the formative experiences of programmers (nerds) in school, and how their outcast separation from others lays the foundation for them to be successful when they enter the ‘real’ world.

The most interesting chapter for me was the exploration of wealth inequality in America. Graham argues that the only logical way to reduce income inequality is to take money from the rich – and that to do so damages the entire economy.

If inequality were solved by taking money from the wealthy and handing it over to the poor, Graham states, the wealthy would have none left to invest, startups would disappear, technological growth would halt, and society would stagnate.

Graham’s proposed alternative is not to attack wealth itself, but the corruption that it so often enables. If you kill corruption, the wealthy will continue to grow small businesses. If you kill wealth, for the sake of redistribution, then everything that wealth enables dies alongside it.

His argument earns its credence from his own experience in capital growth – Graham built one of the internet’s first e-commerce platforms, which was later acquired by Yahoo!, leaving him with great riches.

In turn, he founded Y Combinator, a place for startups to network, incubate, and find investors. He supports his inequality argument by outlining the way his new wealth helped to create more where it didn’t previously exist.

Close to the Machine: Technophilia and Its Discontents – Ellen Ullman

Ullman, author of ‘The Bug’ (a finalist for the PEN/Faulkner Award), drapes her tales of computer life in savory detail and lucid prose. She finds a way of paying computers their due respect, while concurrently reminding the reader that programs are unnatural, dumb, and fragile tools, driven by the peculiarities of the programmer’s mind.

Unlike Rushkoff, Ullman finds solace in placing a distance between herself and the goal of a program. Where Rushkoff wants to insert himself and his ethics and his sense of right and wrong into programs, Ullman wants to back far away.

Describing a project she worked on in which she was tasked with building a registry for AIDS patients, she speaks of the system users and their “fleshy existence” as a distraction, something that must be ignored in order to create the program.

She writes of coding the AIDS registry:

“Real, death-inducing viruses do not travel here. Actual human confusions cannot live here. Everything we want accomplished, everything the system is to provide, must be denatured in its crossing to the machine, or else the system will die.”

Neither Rushkoff nor Graham seems as ready as Ullman to portray themselves as freaks – while Graham makes ample reference to the history of Nerdism, and takes up personal residency there, he hints at none of the fear or anxiety that Ullman experiences as she contemplates what motivates the programming mind.

“I’m upset, so I’m taking apart my computers,” she writes. “If I were a poet, I would get drunk and yell at the people I love. As it is, I’m gutting my machines… there’s a perverse comfort in broken machinery.”

Ullman’s focus on physicality between person and machine is unique. Her relationships with the objects carry more significance than any that Graham or Rushkoff mention, and she finds contrast between that ‘closeness’ and the dislocation felt by many who live their day to day lives online.

In one of the book’s most poignant moments, Ullman tours real estate that she and her sister inherited from their deceased father. One of her newly inherited tenants, a purse-shop owner on Wall Street in Manhattan, explains to her that his business is failing “because of the modems.” According to the shop owner, the Wall Street managers who used to stop in and buy bags are all telecommuting from Connecticut now, while the remaining customers are ‘of a different class.’

Ullman recoils at the idea that her profession – programming, building applications for remote communication – is the root of failure in the businesses housed by her father’s buildings. She mourns what feels like the evaporation of her father’s generation’s work.

“In my world, it was so easy to forget the empty downtowns. The whole profession encouraged us: stay here, alone, home by this nifty color monitor. Just click. Everything you want – it’s just a click away. Everything in my world made me want to forget how – as landlord, as programmer, as landlord/programmer helping to unpeople buildings like my very own – I was implicated in the fate of Morty and the bag shop.”

Her resistance to hail computers as mankind’s last savior (a line tiptoed by Rushkoff and Graham) is all the more authoritative given her experience as a programmer.

She writes of her pedigree – “I have taught myself six higher-level programming languages, three assemblers, two data-retrieval languages, eight job-processing languages, seventeen scripting languages, ten types of macros, two object-definition languages, sixty-eight programming library interfaces, five varieties of networks, and eight operating systems.”

In contrast, Graham spends a lengthy chapter huzzahing the merits of a single programming language, LISP, while Rushkoff mentions nary a single language under his own belt.

the Fictionalizations of ‘the Google’

I had a colleague a few years ago who joked about how his aging parents always referred to Google, the search engine, as “the Google,” as if the internet giant had become an entity of such massive, generic proportion that it deserved its own “the,” like “the city,” or “the ocean,” or “the internet.” The Google.

Popular culture has been producing fictionalized narratives about what life at Google might be like, to complement the hordes of reportage documenting the reality of the company. For an account of how it came to be, and an outsider’s view of the founders, Ken Auletta’s non-fiction book “Googled” tells a fascinating story.

But the real story of Google is about the people who work there, and what they are trying to accomplish. There are plenty of imaginary guesses as to what that’s like – in ‘The Internship,’ actors Owen Wilson and Vince Vaughn actually have the blessing of Google’s marketing department to use the company’s real logo, and refer to it by name, in their imaginary take on what it’s like to work for the massive company.

The American author Dave Eggers has recently published “The Circle,” his take on life at Google, (or maybe some combination of Google and Facebook) and how the company is changing the world, but without the happy rainbows and moon-glow sheen of the Wilson/Vaughn film.

Of the two accounts, is one more accurate than the other? I would need first hand experience to answer that with any authority. My best guess is that Eggers is reaching closer to Google’s heart than Vaughn and Wilson.

At Eggers’s Google (he calls it ‘the Circle’) the campus glistens and sprawls, the office parties are legendary, and the ‘Circlers’ on staff are all brilliant, young intellectual heavyweights. But eventually, the vilification of privacy becomes overwhelming, the expectations of world-saving become untenable, and the marriage of life and work becomes suffocating.

Eggers’s Google follows these guiding principles, echoing Orwell’s Big Brother:

“SECRETS ARE LIES. SHARING IS CARING. PRIVACY IS THEFT.”

The Circle has incredible ambition – an imaginative product called ‘TruYou,’ which is your real identity, everywhere online; ‘SeeChange,’ a YouTube-like network of tiny cameras placed everywhere in the world, broadcasting everything to satisfy anyone’s curiosity; and ‘Transparency,’ which puts the cameras on individual persons, worn as necklaces, making their every movement a publicly broadcast act. Numerous other realistic inventions are sprinkled throughout the story, introduced as positive societal game-changers but simmering beneath the surface with totalitarian terror.

As Eggers describes these fictional innovations, without diving into technological reality, they actually seem very close to the realm of possibility – or at least near the trajectory we can expect to see over the next few decades.

The story follows the path of Mae, an ambitious young woman drawn to the company by its promise of involvement, optimism and excellence. The journey she takes is one that moves from initial bewilderment at The Circle to a creeping acceptance and incapacitating servitude, while she alienates and betrays every real relationship in her life along the way.

The ugly consequences of The Circle’s mission to publicize everything are highlighted by the revulsion felt by Mae’s ex-boyfriend, who chastises her:

 “Every time I see or hear from you, it’s through this filter. You send me links, you quote someone talking about me, you say you saw a picture of me on someone’s wall… It’s always this third-party assault. Even when I’m talking to you face-to-face you’re telling me what some stranger thinks of me. It becomes like we’re never alone. Every time I see you, there’s a hundred other people in the room. You’re always looking at me through a hundred other people’s eyes.” 

At Vaughn and Wilson’s Google, in “The Internship,” the company is nothing more than a place for two aging slackers to take a second shot at being financially responsible adults capable of earning a living – it just so happens this place is also Google, where everyone who wears the logo must be disruptively smart and attractive.

‘The Internship’ doesn’t touch on a single thing that Google actually does, or how its real products and technology are used by the world, until a thrown-together final scene that vaguely hints Google can help a small pizza shop – yet this is the fictionalization the corporation gave its real blessing to, with ample permission to display the bright and shiny logo in nearly every scene, from the extensive coverage of the ‘nap stations’ on campus and the ample free food and snacks to the team-building trips at San Francisco strip clubs.

The British film critic Mark Kermode described The Internship as “one of the most witless, humourless, vomit-inducingly horribly self-satisfied, smug, unfunny comedies I have ever seen.”

So which of these representations is the real Google? Hmm, I don’t know…. Maybe you can Google it.

When I was a kid, I remember having playful arguments with friends during our imaginative games that were settled by how many multiples of a number we were better than one another – ‘I’m a million times taller’ or ‘I’m ten million times faster!’

One day, my Dad explained the number ‘googol’ to me, and I felt like a huge cloak had been lifted from the possibilities of the universe. It was the biggest number ever! In my imagination, I could be GOOGOL times faster!

So now, along with all the other cosmic and intricate coincidences that fill up my life, I’m an adult, and Google is still the easiest way to end an argument.

on The Facebook Effect

I decided to read David Kirkpatrick’s book, The Facebook Effect, because I wanted to rationalize my somewhat recent decision to ignore a product that has become one of the most widely used in the world, achieved staggering valuations, etc. So here is my rambling reaction to the book, and my latest thoughts on Facebook in general:

There are reasons I want to like Facebook. I love sharing photos, reading opinions, and the little dopamine spritz that comes with any online interaction. Mark Zuckerberg even seems like a decent guy, at the very least a champion of my generation in leadership and business acumen. When I go all the way back to 1984 to compare our lives’ paths, starting with his birthday about 3 weeks before my own, it’s impossible not to be awed. Although we probably took the same spelling lessons in 4th grade, and maybe asked similar questions in high school government classes – years later while I was either underpaid or unemployed, Mark Z. was turning down multi-billion dollar offers for the company he founded.

But Facebook’s core principle – sharing everything with everyone, and in turn letting everyone I know share what they know about me with everyone else I know – is something I still can’t fully accept as a socially responsible or pleasant practice. Maybe it’s the introvert in me. Maybe it’s because transparency and openness don’t translate well into hierarchical organizations, which most workplaces are. Since work is sadly where I, like most 9-to-6 adults, do the majority of my socializing, it just doesn’t make sense to conflate the professional space with the casual, friendly mentality of ‘friending.’ It was no accident that Facebook never organized its growth by expanding to corporate usage the way it initially did at universities. I enjoyed using Facebook while I was a student, but in the world of colleagues, bosses, and board members, the share-all mentality took on a different patina.

I think one of the great things Facebook does is let people use their real identity online, an option that had been missing before. Unfortunately, identity gets stretched thin, defined too easily by relationships, associations, and Likes. Identity is a complicated construction. We aren’t just books we’ve read or bands we like. All the empty database fields in the universe couldn’t capture who I really am. I could go on about what constitutes a person’s identity, but I’m fairly certain there’s more to it than what can be captured on a profile page.

facebook, 2010

A favorite feature of Facebook, often used by its champions as an argument in its defense, is the presentation of ‘news that matters to me.’ The News Feed, as it was called when released, initially caused a fury of opposition, agitating users who didn’t want their activity broadcast to everyone they knew. Since the initial outcry, a few minor policy changes have placated the chorus of dissent, and everyone who continues to use the service considers the News Feed a useful part of the product.

The information you receive through your mapped relationships on the site is likely to carry significance because it’s coming from the people you’ve indicated are important to you, and not an impersonal corporation, or a profit-driven media conglomerate. Unfortunately, what this creates is the possibility of an echo chamber – you’re only going to find stories that agree with your own perspective.

Friendship, as C.S. Lewis defined it, is a conspiratorial act. To be friends with someone is to implicitly agree that it’s ‘us’ against ‘them.’ So what happens when we all become ‘us’? If everyone is friends with everyone, which is Mark Z.’s ‘openness’ philosophy taken to its logical extreme, who is left to be the ‘them’ we check our behavior and beliefs against? In a world of truly open friendships, how does the formal, technically bureaucratic process of sending ‘requests’ and ‘accepting’ people fit in? If we benefit from all being connected, as Mark Z. prophesies, why the need to draw documentary lines between the connections?

Maybe this odd conception of friendship on Facebook is one of the reasons I don’t feel awkward exposing my thoughts in blog posts, but do in Facebook status updates. With a blog, there is no expectation of friends to read or comment on my writing – this post is intended for anyone who is interested enough to seek it out, not because they know who I am, but because they’ve identified a topic I’ve written about as interesting or useful to them.  Is my blog more transparent than a Facebook profile? Perhaps. Does that translate to mean Facebook is a safer, less revealing place for my online identity?  No, I don’t think so.

The book describes movements like the anti-FARC uprising in Colombia and Egyptian political dissent as empowered by the connections created online. I think it’s great that in some places this communication tool has served an ideological purpose, but I don’t believe it’s the magical democratic bullet that some have made it out to be. The power of organization offered by Facebook can help small causes receive greater attention. But on a grand scale, the expectation of transparency seems intrusive.

The most fundamental act of democracy, the vote, is anonymous and private – and designed to be that way for a reason. In essence, each ‘Like’ I create on Facebook is a kind of ‘vote’ – not for an election, but for popularity, relevancy, or brand loyalty. And each one of those ‘Likes’ – whether for Barack Obama or Coca-Cola – is an action my ‘friends’ (but for many people, anybody) can see, comment on, and either celebrate or criticize. Facebook hardly provides the voting-booth curtain we expect when we cast our ballots.

Amid questions of Facebook’s usefulness, its violations of privacy, regulatory concerns, and its potential infringement on the operations of governments, the executives of companies like Sony, Microsoft, the Washington Post, and Reuters all provide quotes heaping compliments on Facebook, and the author offers these endorsements as indicators of the product’s credibility and usefulness for everyone. Lest we forget, these corporate heads are in the business of selling targeted advertisements, a practice so enhanced by Facebook that when the company began to implement it, revenues soared and made multi-billionaires of the founders.

Since I stopped using the service a few years ago, there have been occasional twinges of nostalgia and moments of regret, when appealing people ask to ‘friend’ me or when I read about an interesting new feature release (graph search seems to be a tremendous improvement to the site as a whole). I’ve recently started using it again, in a very limited capacity, with the sole purpose of interacting with family members. I’m somewhat nervous that the wheels have started turning, and that I will find other compelling ways to begin reusing the site.

My main takeaway from the book, aside from the inspiring story of its founders’ vision and dedication, and the mind-boggling numbers behind its user base and financial war chest, is that Facebook is impossible to ignore. Whether I’m personally using it or not, it is certainly going to be around for a while, and I will probably end up in the minority if I continue to play deaf, dumb, and blind to it. In the end, whether it’s evil or holy, enabling or destructive, its effects will be felt by everyone who uses the internet. Perhaps – and I’d be surprised if I’m the first to admit it – my biggest problem with Facebook is that I didn’t think of the damn thing myself.