A teenager trying to take a selfie in front of the Queen in Belfast. Photograph: Peter Macdiarmid/PA

‘Pics or it didn’t happen’ – the mantra of the Instagram era

How sharing our every moment on social media became the new living

Our social networks have a banality problem. The cultural premium now placed on recording and broadcasting one’s life and accomplishments means that Facebook timelines are suffused with postings about meals, workouts, the weather, recent purchases, funny advertisements, the milestones of people three degrees removed from you. On Instagram, one encounters a parade of the same carefully distressed portraits, well-plated dishes and sunsets gilded with smog. Nuance, difference, and complexity evaporate as one scrolls through these endless feeds, vaguely hoping to find something new or important but mostly resigned to variations on familiar themes.

In a digital landscape built on attention and visibility, what matters is not so much the content of your updates as the fact that they exist at all. They must be there. Social broadcasts are not communications; they are records of existence and accumulating metadata. Rob Horning, an editor at the New Inquiry, once put it in tautological terms: “The point of being on social media is to produce and amass evidence of being on social media.” This is further complicated by the fact that the feed is always refreshing. Someone is always updating more often or rising to the top by virtue of retweets, reshares, or some opaque algorithmic calculation. In the ever-cresting tsunami of data, you are always out to sea, looking at the waves washing ashore. As the artist Fatima Al Qadiri has said: “There’s no such thing as the most recent update. It immediately becomes obsolete.”

Why, then, do we do it? If it’s so easy to become cynical about social media, to see amid the occasionally illuminating exchanges or the harvesting of interesting links (which themselves come in bunches, in great indigestible numbers of browser tabs) that we are part of an unconquerable system, why go on? One answer is that it is a byproduct of the network effect: the more people who are part of a network, the more one’s experience can seem impoverished by being left out. Everyone else is doing it. A billion people on Facebook, hundreds of millions scattered between these other networks – who wants to be on the outside? Who wants to miss a birthday, a friend’s big news, a chance to sign up for Spotify, or the latest bit of juicy social intelligence? And once you’ve joined, the updates begin to flow, the small endorphin boosts of likes and re-pins becoming the meagre rewards for all that work. The feeling of disappointment embedded in each gesture, the sense of “Is this it?”, only advances the process, compelling us to continue sharing and participating.

The achievement of social-media evangelists is to make this urge – the urge to share simply so that others might know you are there, that you are doing this thing, that you are with this person – second nature. This is society’s great phenomenological shift, which, over the last decade, has occurred almost without notice. Now anyone who opts out, or who feels uncomfortable about their participation, begins to feel retrograde, Luddite, uncool. Interiority begins to feel like a prison. The very process of thinking takes on a kind of trajectory: how can this idea be projected outward, towards others? If I have a witty or profound thought and I don’t tweet or Facebook it, have I somehow failed? Is that bon mot now diminished, not quite as good or meaningful as it would be if laid bare for the public? And if people don’t respond – retweet, like, favourite – have I boomeranged back again, committing the greater failure of sharing something not worth sharing in the first place? After all, to be uninteresting is a cardinal sin in the social-media age. To say “He’s bad at Twitter” is like saying that someone fails to entertain; he won’t be invited back for dinner.

In this environment, interiority, privacy, reserve, introspection – all those inward-looking, quieter elements of consciousness – begin to seem insincere. Sharing is sincerity. Removing the mediating elements of thought becomes a mark of authenticity, because it allows you to be more uninhibited in your sharing. Don’t think, just post it. “Pics or it didn’t happen” – that is the populist mantra of the social networking age. Show us what you did, so that we may believe and validate it.

Social media depends on recognition – more specifically, on acts of recognition. The thing itself is less interesting than the fact that we know someone involved, and if it is interesting or important, we can claim some tenuous connection to it. We enact the maxim of the great street photographer Garry Winogrand: “I photograph to find out what something will look like photographed.” We document and share to find out how it feels to do it, and because we can’t resist the urge. Otherwise, the experience, the pithy quote, the beautiful sunset, the overheard conversation, the stray insight, is lost or seems somehow less substantial.

Sharing itself becomes personhood, with activities taking on meaning not for their basic content but for the way they are turned into content, disseminated through the digital network, and responded to. In this context, your everyday experiences are limited only by your ability to share them and to package them appropriately – a photograph with a beautiful filter and a witty caption, or a tweet containing an obscure movie reference that hints at hidden depths. For some users, this process is easy: snap a photo, add a caption saying what you are doing, send it out on one or several networks. For others, it can lead to paralysing self-consciousness, a sense that no social broadcast is good enough, that no tweet or Facebook status update reflects the mix of cool, wit, and elan that will generate feedback and earn the user more social capital. Along the way, we have developed an ad hoc tolerance for these gestures, as well as a shared familiarity and understanding. Who hasn’t stopped an activity mid-stride so that a friend can send out some update about it? Who hasn’t done it themselves?

On the social web, the person who doesn’t share is subscribing to an outmoded identity and cannot be included in the new social space. If not off the grid, they are simply not on the grid that matters – they may have email but not Facebook, or they may be present but not using the network enough. (The prevailing term for this is “lurker”, an old message-board term, slightly pejorative, describing someone who reads the board but doesn’t post.) It is not uncommon to ask why a friend is on Twitter but rarely tweets, or why she often likes Facebook statuses but never posts her own. Why is she not busy accumulating social capital? Still, being in the quiet minority is far better than not participating at all. Worst, perhaps, is the person whose frequent tweets and updates and posts earn no response at all. In the social-media age, to strive for visibility and not achieve it is a bitter defeat.


We become attuned to the pace and rhythm of sharing and viewing, building an instinctual sense of the habits of our followers and those we follow, those we call friends and those we just stalk. We develop what some social scientists have termed “ambient awareness” of the lives of those in our social graphs and we intuit, Jedi-like, when they have been absent from the network. Our vision becomes geared towards looking at how many likes or comments a post has received, and when we open the app or log on to the network’s website, our eyes dart towards the spot (the upper righthand corner, in Facebook’s case) where our notifications appear as a number, vermilion bright. We might also receive email alerts and pop-ups on our phones – good news can arrive in a variety of ways, always urging you to return to the network to respond.

The problem with alerts is that, like our updates, they never end. They become a way to be permanently chained to the network. We are always waiting to hear good news, even as we ostensibly are engaged in something else. Just as urban spaces threaten to do away with silence or with stars – the city’s sound and light, its primordial vibrancy, become pollutants – notifications crowd out contemplation. They condition us to always expect something else, some outside message that is more important than whatever we might be doing then.

The writer and former tech executive Linda Stone calls this phenomenon “continuous partial attention”. She differentiates it from multitasking, though there is some similarity. Continuous partial attention, she says, “is motivated by a desire to be a live node on the network. Another way of saying this is that we want to connect and be connected. We want to effectively scan for opportunity and optimise for the best opportunities, activities, and contacts, in any given moment. To be busy, to be connected, is to be alive, to be recognised, and to matter.”

Psychologists and brain researchers have begun studying these problems, with some dispiriting conclusions: multitasking is largely a myth; we can’t do multiple things at once, and when we try, we tend to do a poorer job at each of them. Frequent interruptions – such as your phone starting to vibrate while you are reading this paragraph – make it harder to return to the task at hand. In fact, office workers experience an interruption about every three minutes. It could be an email popping up or a friend coming by your desk. But it can take more than 20 minutes to shake off an interruption and settle back into the work. That means that many of us are being interrupted too often to regain focus, with the result that our work and mental clarity suffer. On the other hand, some of these same studies have found that when we expect interruptions, we can perform better, as we train ourselves to become more single-minded and to complete a task in a limited period.

You can turn off Twitter’s email alerts or tell your smartphone to stop pushing Facebook updates or the latest news from Tumblr. But alerts are the clearest symbol of the call and response, the affirmation and approval, that tie a social network together. They let us know that we are being heard, and if we do not have them forced on us, we still have to reckon with them when we log in to the app or on to the network’s website. It is important not only to have them but also to have them in sufficient number – or at least some amount that, we tell ourselves, justifies the update. Four people clicked “like”; that’s enough, I guess. After posting an update, we might return to the network several times over the next hour, hoping for some validating reply.

The window for this kind of response is painfully brief. We know fairly quickly whether our beautifully filtered photo of a grilled cheese sandwich or our joke about a philandering politician was a dud. According to a 2010 study by Sysomos, only 29% of tweets receive a response – a reply, retweet, or favourite – while 6% are retweeted. A full 92% of retweets happen within the tweet’s first hour, meaning that if 60 minutes have passed and no one has picked up on your tweet, it has likely disappeared into the ether. Even when they do appear, likes and favourites have been mostly drained of meaning – a sign of approval and popularity, sure, but also now a rather conventional way of telling a friend that he was heard. The favourite has become a limp pat on the back.

This ephemerality contributes to social media’s tendency towards self-consciousness and the constant calibrating of one’s public persona. We know that we do not have much time – or many characters – and that we had better make it count. “If I don’t get more than 10 faves in [the] first three minutes after tweeting something, I’ll probably delete it,” one amateur comedian told the Wall Street Journal. Otherwise, the tweet hangs there, a minor emblem of its author’s unsatisfied ambition.

What that comedian really fears is the loss of followers and social capital. We take it for granted, perhaps, that social media comes with metrics. We are constantly told how many people are following us, how many approved of an update, how many people follow those people. Metrics help create the hierarchies that are embedded in all social networks, and that often replicate offline hierarchies. If you do not know immediately how popular someone is on social media, the answer is only a click away.

This hovering awareness of rank and privilege helps drive the insecurity and self-consciousness that result from an environment suffused with the language of PR, branding, and advertising. Describing his experience on Twitter, the satirist Henry Alford writes that “every time someone retweets one of my jokes, it sets off a spate of fretting about reciprocity … If the person is a total stranger whose feed I do not follow, then I will look at this feed and consider climbing aboard. I’ll look at the ratio of how many tweets to how many followers that person has: if it exceeds 10 to 1, then I may suddenly feel shy. Because this person is unknown to me, I will feel no compunction to retweet a post of hers, though I may be tempted to ‘favourite’ (the equivalent of Facebook’s ‘like’ button) one.”

Twitter superstar Rob Delaney. Photograph: Sarah Lee

Alford is demurring here. What he really means is that someone with a tweets-to-followers ratio of 10 to 1 is probably an unknown, one of the innumerable Twitter users whose many tweets go pretty much ignored, and not one of the journalists, comedians, or writers who probably belong to his intended audience. (He goes on to mention, happily, that one of his jokes was recognised by the comedians Merrill Markoe and Rob Delaney, the latter a Twitter superstar.) There is nothing wrong with that, of course, except that Alford’s own admissions speak to the difficulty of negotiating the odd social pressures and anxieties that come with every utterance being public. Is he on Twitter to promote himself, to meet people, or to endear himself to colleagues – or is there a conflicting mix of motivations? In Alford’s case, his concerns about visibility and reciprocity intensify when he is responded to on Twitter by someone he knows: “Suddenly the pressure mounts. I’ll proceed to follow her, of course, if I don’t already. Then I’ll start feeling very guilty if I don’t retweet one of her posts.” Each exchange requires a complex cost-benefit analysis, one that, for anyone who has experienced this, may seem wildly disproportionate to the conversation at hand. Just as metadata (that is, the number of retweets or likes) can matter more than the message itself, this process of meta-analysis, of deciphering the uncertain power dynamic between two people, can seem more important than the conversation on which it is based.


Maybe Alford would be more comfortable with photographs, which, in their vivid particularity, seem to demand less of a response. They can live on their own. We don’t need to justify them. Photographs “furnish evidence,” as Susan Sontag said. Or, in Paul Strand’s words: “Your photography is a record of your living.” You met a celebrity, cooked a great meal, or saw something extraordinary, and the photograph is what remains: the receipt of experience. Now that every smartphone comes complete with a digital camera as good as any point-and-shoot most of us had a few years ago, there is little reason not to photograph something. Into the camera roll it goes, so that later you can perform the ritual triage: filter or no filter? Tumblr, Instagram, or Snapchat?

The ubiquity of digital photography, along with image-heavy (or image-only) social networks such as Instagram, Pinterest, Tumblr, Imgur, Snapchat, and Facebook, has changed what it means to take and collect photos. No longer do we shoot, develop, and then curate them in frames or albums in the privacy of our homes. If we organise them in albums at all, it is on Facebook or Flickr – that is, on someone else’s platform – and we leave them there to be commented upon and circulated through the network. Photos become less about memorialising a moment than about communicating the reality of that moment to others. They are also a way to handle the newfound anxiety of living in the present, knowing that our friends and colleagues may be doing something more interesting at that very moment, and that we will see those experiences documented later on social media. Do we come to feel an anticipatory regret, sensing that future social-media postings will make our own activities appear inadequate by comparison? Perhaps we try to stave off that regret, that fear of missing out, by launching a preemptive attack of photographic documentation. Here we are, having fun! It looks good, right? Please validate it, and I’ll validate yours, and we’ll take turns saying how much we missed each other.

Photography has always been “acquisitive,” as Sontag called it, a way of appropriating the subject of the photograph. Online you can find a perfectly lit, professionally shot photo of nearly anything you want, but that does not work for most of us. We must do it ourselves.

Think about the pictures of a horde of tourists assembled in front of the Mona Lisa, their cameras clicking away. It is the most photographed work of art in human history. You can see it in full light, low light, close-up, far away, x-rayed; you can find parodies of parodies of parodies; and yet, seeing it in person and walking away does not suffice. The experience must be captured, the painting itself possessed, a poor facsimile of it acquired so that you can call it your own – a photograph which, in the end, says, I was here. I went to Paris and saw the Mona Lisa. The photo shows that you could afford the trip, that you are cultured, and offers an entrée to your story about the other tourists you had to elbow your way through, the security guard who tried to flirt with you, the incredible pastry you had afterwards, the realisation that the painting really is not much to look at and that you have always preferred Rembrandt. The grainy, slightly askew photo signifies all these things. Most important, it is yours. You took it. It got 12 likes.

This is also the unspoken thought process behind every reblog or retweet, every time you pin something that has already been pinned hundreds of times. You need it for yourself. Placing it on your blog or in your Twitter stream acts as a form of identification – a signal of your aesthetics, a reflection of your background, an avatar of your desires. It must be held, however provisionally and insubstantially, in your hand, and so by reposting it, you claim some kind of possession of it.

Beyoncé, Jay Z and the Mona Lisa. Photograph: via iam.beyonce.com

Something similar can be said about people you see at concerts, recording or photographing a band. Unless you have a fancy digital SLR camera and are positioned close to the stage, the photographs will probably be terrible. Recording the show is even more of a fool’s errand, because it will only show you how poor your iPhone’s microphone is and how this experience, so precious at the time, cannot be captured by the technology in your pocket. No, the video will be shaky, as you struggle to hold your phone above the heads of people in front of you, and the audio will sound like someone played the track at full volume inside a steel trash can. But of course, many of us do this, or we hold our phones aloft so friends on the other end of the line can hear – what exactly? Again, we find that it doesn’t quite matter. We will probably never watch that video later, nor will we make that photo our desktop wallpaper or print it out and frame it. The crummy photos, the crackly recording, the indecipherable blast of music a caller hears: these are not personal remembrances or artistic artefacts. They are souvenirs, lifestyle totems meant to communicate status – to be your status update. They do not describe the band being captured; they describe us.

People often ask, “Why don’t concertgoers just live in the moment and enjoy the show? Don’t they know that the photos won’t turn out well and that focusing on their iPhone, staring at its small screen rather than the thing itself, takes them out of the experience?” The answer is yes, but that’s also beside the point. Taking photos of the band may show a kind of disregard for the music, an inability to enjoy it simply as it is, but it also reflects how the very act of photographing has become part of just about any event or evening out.

In the same way that Sontag says that “travel becomes a strategy for accumulating photographs” (eg standing in a certain spot in front of the Leaning Tower of Pisa, where you can position your arms so it looks like you are holding up the building), life itself becomes a way of accumulating and sharing photographs. Taking photographs gives you something to do; it means that you no longer have to be idle. Living in the moment means trying to capture and possess it. We turn ourselves into tourists of our own lives and communities, our Instagram accounts our self-authored guidebooks, reflecting our good taste. At the same time, the use of filters and simple image-editing software means that any scene can be made into an appealing photograph or at least one that has the appearance of being artistically engaging. The process of digitally weathering and distressing photographs is supposed to add a false vintage veneer, a shortcut to nostalgia, and it has the added benefit of making it look like the photo went through some process. You worked for it, at least in a fashion.

This kind of cultural practice is no more clearly on display than during a night out with twentysomethings. The evening becomes partitioned into opportunities for photo taking: getting dressed, friends arriving, a taxi ride, arriving at the bar, running into more friends, encountering funny graffiti in the bathroom, drunk street food, the stranger vomiting on the street, the taxi home, maybe a shot of the clock before bed. A story is told here, sure, but more precisely, life is documented, its reality confirmed by being spliced into shareable data. Now everyone knows how much fun you had and offers their approval, and you can return to it to see what you forgot in that boozy haze.

All of this also offers evidence of your consumption, of the great life you are living and the products contained therein. Photography’s acquisitive aspect – the part of it that turns life into one long campaign of window shopping – finds its fullest expression on Pinterest, Instagram, and other image-heavy social networks. There we become like the hero of Saul Bellow’s Henderson the Rain King, a man who hears a voice within himself saying: “I want, I want, I want.” Like Henderson, we do not know what we want exactly, but we have some sense that it is out there, in the endless feed of shimmering imagery, and that if we click through it long enough, maybe we will find satisfaction. In this ruminative browsing, where time becomes something distended, passing without notice, our idle fantasies take flight.


The Big Tribute Music Festival in Wales. Photograph: Alamy

The documentary lifestyle of social media raises concerns about how we commoditise ourselves and how we put ourselves up for public display and judgment. That does not mean that fun cannot be had or that this kind of documentation cannot coexist with an authentic life. It is just that the question of what is authentic shifts, sometimes uncomfortably – and not only through what Facebook calls “frictionless sharing”, the disclosure of everything, always, completely. Rather, it is that our documentation and social broadcasts take on outsized importance: a side act that threatens to become the main event.

The social media theorist Nathan Jurgenson describes users as developing “a ‘Facebook Eye’: our brains always looking for moments where the ephemeral blur of lived experience might best be translated into a Facebook post; one that will draw the most comments and ‘likes’.” We might feel this phenomenon in different ways, depending on which networks we use and which activities constitute our days. I feel it acutely when reading articles on my smartphone or my computer – this sense that I am not just reading for my own enjoyment or edification but also so that I can pull out some pithy sentence (allowing enough space for the 23 or so characters needed for a link) to share on Twitter. I begin to feel an odd kind of guilt, knowing that my attention is not being brought to bear on what I am doing. It feels dishonest, like I am not reading for the right reasons, and self-loathing builds within me – the self-loathing of the amateur comedian who deletes his tweets if they fail to quickly attract enough likes. I can offer myself justification by saying that I enjoy sharing or that I am trying to pass along information to others, but that strikes me as insincere. The truth is more depressing: this impulse to share reflects, I think, the essential narcissism of the Facebook Eye (or the Instagram Lens, or whichever filter you prefer). Our experiences become not about our own fulfilment, the fulfilment of those we are with, or even about sharing; they become about ego, demonstrating status, seeming cool or smart or well-informed.

Perhaps if you are a young journalist, looking to increase your esteem in the eyes of peers and a couple of thousand followers in the digital ether, the goal of reading is to be the first to share something newsworthy, sometimes before you have even finished reading the article. Like a village gossip, you want to build social capital by becoming a locus for news and information. But with this role comes an inevitable hollowing out of interiority, of the quietness of your thoughts – reading itself becomes directed outward, turned from private contemplation into a strategic act meant to satisfy some nebulous public.

Our reading habits change, and so do the stories we tell and the way we share them. We make our updates more machine-readable – adding tags; choosing brands and emotions supplied to us through Facebook’s interface; shortening jokes to fit into Twitter’s 140-character limit. We wait until after 12pm to post on Tumblr, because we have read that usage rises during the lunch hour, or we schedule tweets to go up during periods when we will be disconnected.

On social media, it can seem as if time and data are always slipping through our fingers. Particularly when we follow or befriend many people on a network, our updates seem illusory, as if they are never inscribed into anything but instead shouted into the void. (Did anyone really hear me? Why is no one replying?) In this way, social media can resemble traditional, pre-literate societies, where communication is purely oral and everything – culture, news, gossip, history – is communicated through speech. When we retweet someone, we are just speaking their words again – ensuring that they are passed on and do not get lost in the flurry of communication.

Media theorists refer to these eruptions of oral culture within literate culture as examples of “secondary orality”. Social media’s culture of sharing and storytelling, its lack of a long-term memory, and the use of news and information to build social capital are examples of this phenomenon. While records of our activities exist to varying extents, secondary orality shows us how social media exists largely in a kind of eternal present, upon which the past rarely intrudes. Twitter is a meaningful example. It is evanescent: posts are preserved, but in practice, they are lost in one’s rapidly self-refreshing timeline – read it now or not at all. Twitter is also reminiscent of oral storytelling, in which one person is speaking to a larger assembled group and receiving feedback in return, which helps to shape the story. One of the digital twists here is that many storyteller-like figures are speaking simultaneously, jockeying for attention and for some form of recognition. The point is not that social media is atavistically traditional but that it returns elements of oral societies to us. Our fancy new digital media is in fact not entirely new, but a hybrid of elements we have seen in past forms of communication. The outbursts of tribalism we sometimes see online – a group of anonymous trolls launching misogynist attacks on a female journalist; the ecstatic social media groupies of Justin Bieber; the way one’s Twitter timeline can, for a short while, become centred around parsing one major event, as if gathered in a village square – are evidence of a very old-fashioned, even preliterate communitarianism, reified for the digital world.

Think about the anxious comedian or your oversharing Facebook friend. Their frequent updates come in part from the (perhaps unacknowledged) feeling that on social media nothing is permanent. Of course, we know that these networks, and the various sites and companies that piggyback on them, preserve everything. But in practice, everything is fleeting. You must speak to be heard. And when everyone else is speaking, you had better do it often and do it well, or risk being drowned in the din, consigned to that house at the edge of the village, where few people visit and, when they do, expect to hear little of any consequence. •

Jacob Silverman is the author of Terms of Service: Social Media and the Price of Constant Connection, from which this essay was adapted.

Follow the Long Read on Twitter: @gdnlongread
