In 2007, Twitter leapt into geek consciousness at SXSW Interactive. Monitors had been placed in the halls of this tech conference, displaying what people were tweeting about. I thought it was an interesting curiosity, like watching telegrams in real time. Little bursts of text scrolled across the screen, as people shared opinions about the workshops that they were in.
Imagine: prior to this epochal event of just five years ago, we had no easy way of getting real-time information from our friends, unless of course we talked to them. And when we went to events, we were fully present, listening to speakers without constantly checking our electronic devices. We paid attention, more or less. Or nodded off. Or wandered away in search of something more interesting, guided only by instinct.
I blame Margie Newman. An early adopter of social media, she sold me on Twitter, pitching it (she does PR) as a great way of getting relevant information delivered right to your iPhone. I followed her as she shared interesting articles about digital marketing, as well as her daily coffee runs.
And it was good, this small-scale tool, this curiosity that entertained and informed me during odd moments during the day. It was a niche web service, used by the super-geeky and wired.
The fact that it failed on an almost daily basis (the Twitter fail whale became an Internet meme) confirmed that it was not ready for the mainstream. No one talked about its potential – just keeping Twitter working seemed like a feat.
I found a use for it. I tweeted the news that I was writing a book, Murder in Ocean Hall. I tweeted how many words I had written. Friends replied that they were looking forward to reading the book. This type of public accountability ensured that I would finish the book, no matter what. And it built an audience for my novel.
While I was engaged in my archaic writing activity, Twitter took off. Tweetups were held. Celebrities began using it. Tweets floated down from space.
The potential of Twitter was fully realized, as it tied the world together in 140-character chains. Which was wonderful. It’s a great and easy tool that’s perfect for the non-geeky.
Twitter was big, mainstream and it even worked. The fail whale was no more.
But as Twitter and social media took off, several insidious psychological elements of the experience became apparent.
The idea of “followers” is contrary to our democratic ideals. We don’t slavishly follow our leaders, as if they’re Eva Perón. We’re not Argentina.
Early Twitter was a free exchange of information among equals. But now Twitter has become about popularity contests and numbers of followers. It’s about sucking up to some pseudo-famous person for a bit of recognition.
This is galling. It’s like recreating the world of high school cliques in cyberspace. I see the Washington Post’s top tweeps and I’m not sure what’s more absurd, a dead-tree paper defining Internet success or the dubious nature of their picks.
Social Media Expert is an Oxymoron
How can you be an expert in a medium that is so new? More than any other medium, social media is guided by zeitgeist and the fickle whims of the public. You cannot make a video go viral; only millions of people and their individual choices can do that.
You can train people to use social media but, in the end, it’s about knowing your audience and connecting with them in an authentic manner. And then hoping for the best.
An expert who promises some magic formula is not to be trusted. And if someone labeled a “guru” or a “rockstar” comes waltzing into your office, hold on to your wallet, for you’re about to make an expensive mistake.
The overhyped aspects of social media irritate me so much that I included a social media guru in my latest novel, Don’t Mess Up My Block. She writes only in lower-case and is constantly “crushing it” in everything she does.
The Collapse of Attention Spans
I worry about Millennials. Growing up digital once seemed like a gift. It may be a curse. I’m fortunate that I was young before the onslaught of attention-stealing electronic media. As a child, I could sit and read for hours. I still can, but it’s more of an effort now. I have to put the iPhone and iPad out of reach.
In the brilliant book Cognitive Surplus, Clay Shirky makes the point that the rise of the West can be traced to coffeehouses. Caffeine allowed us to concentrate and keep working. But if social media is destroying our ability to focus, then how can civilization progress?
The Melding of Work, Life and Home
Facebook comes with one major misgiving for me – it mixes together people from different spheres in my life. Coworkers, high school friends, college buddies, family members and professional contacts are all thrown together in a boiling stew of political theories, recipe suggestions, kid photos, medical problems and news of the weird. It’s disturbing, this mix of people from different times and places, jumbled together and all sharing way too much information.
We used to keep different parts of our life separate, like Don Draper. I’ve known people who have quit Facebook entirely, unable to tolerate this blurring of social and professional lines.
So, What Do You Do?
My advice is to use only the tools that you enjoy. You don’t have to be on Facebook or use Twitter. For me, it’s Twitter, Flickr, Instagram and Facebook (with misgivings). I’m interested in tech, photography and events in DC.
Not on my list: Foursquare (why let people know where I’m at?), Pinterest (it’s just for girls, right?), Tumblr (don’t understand) and YouTube (cat videos).
You could spend all your time tweeting, retweeting, sharing links, rating videos and commenting on life around you. Or you can get out there and live.
After all, on your deathbed, you’re probably not going to say, “I wish I had tweeted more.”