
(Illustration by Michelle Thompson/The Globe and Mail)

One of the most heartbreaking, and most overlooked, moments in the case of Rehtaeh Parsons – the 17-year-old Nova Scotian who killed herself in April after allegedly being raped and bullied – came in a blog post written by her father three days after her death.

"I had to write something about this," one line read. "I don't want her life to be defined by a Google search about suicide or death or rape. I want it to be about the giving heart she had."

The sentiment is so moving because it is so fruitless. The Parsons tragedy is hypermodern – someone photographed her during the alleged rapes; the photo was disseminated around her school; her classmates sent her cruel, crude messages – and Rehtaeh will, of course, be defined by her Google search. As we all increasingly are.

At the heart of the Internet is a tension between ephemera and permanence. Every tweet, Facebook post and Instagram photo is a vehicle for instant gratification, but that information sticks around, squirrelled away forever – forgotten, until it isn't.

Typically, this is cast as an issue of privacy: Does a job applicant, for example, deserve to lose an opportunity because Googling her name pulls up some long-ago indiscretion? But it's more than that. Rehtaeh Parsons's father was worried his daughter would be memorialized by forces outside human control, by the inscrutable, impersonal logic of algorithms.

The difference between how humans remember and how the Internet remembers is deep and fundamental. Humans forget, or remember selectively; the Internet remembers everything.

"For almost all of human history, collecting information and storing information was time-consuming and costly, and therefore we stored as little as possible," says Viktor Mayer-Schönberger, a professor at Oxford University and the author of Delete: The Virtue of Forgetting in the New Digital Age. "Even the stuff we stored we rarely made use of, because retrieval was so expensive."

But digital technology massively decreased the cost of data storage, and made accessing that information far easier. Now, we're steeped not just in knowledge but in memory: of our checkered pasts, our personal failures, the ruined lives of our loved ones.

"Human forgetting actually performs a very important function for us individually as well as for society," Prof. Mayer-Schönberger says. "It lets us act and think in the present rather than be tethered to an ever-more-comprehensive past. The beauty of the human mind and human forgetting is that, as we forget, we're able to generalize, to abstract, to see the forest rather than the individual tree. And if we cannot forget, then all we will have are the individual trees to go by."

In Rehtaeh Parsons's case, all we have are those trees: the awful circumstances of her death, the official bungling of the investigation. A more human kind of memory would recall her as a whole person, someone with agency and interiority.

We live in an era of endless archiving. For $279, you can pre-order a "lifelogging camera" called Memoto, which attaches to your clothes and takes two geotagged photos every minute, around the clock: "This means that you can revisit any moment of your past," the copy reads.

Google Glass, the tech giant's much-hyped wearable computer, will also come with a camera. More nobly, the United Nations' Memory of the World project aims to preserve the world's "documentary heritage" – from the archives of the Dutch East India Company to the woodblocks of Vietnam's Nguyen Dynasty.

Even Facebook's Timeline redesign is a memorial project, creating as it does a single continuous stream of your entire existence on the social network.

The advantages of the Internet's vast archive are obvious: Never before has our knowledge been so far-reaching or esoteric. Political projects such as WikiLeaks hold governments to account; online memorials to deceased loved ones create easily accessible places of mourning.

Indeed, there's an emotional side to all this. A Tumblr called Sad YouTube collects poignant comments left on music videos. In one entry, someone with the username "napolean moran" recalls how the song Have You Seen Her? by the Chi-Lites reminds him of an old girlfriend. "I made a mistake and lost contact with her, a war came by and eventually had to leave my country [El Salvador] on self-imposed exile," he writes. "Ever since I think of her and wish I had the chance to at least say that I was so sorry. I will never forget her."

Mark Slutsky, the Montreal-based filmmaker behind the site, says YouTube plays an "unintentional role of archiving a haphazard oral history" of modern life. "People really are telling their stories – a moment that they remember that resonates with them, which might be lost or never shared if not for YouTube," he says. "It's serving a really interesting function of coaxing memories out of people."

But too much digital memory can also do us a disservice. In the European Union, policy-makers are debating the "right to be forgotten" – an idea that sounds woolly but could soon become enshrined in law. In true EU fashion, the proposed changes are knotty and complex. But the idea is to grant users greater control over any personal information held by a company or government agency – that is, to establish a clear legal right to obtain personal data, stop it from being processed, or delete it entirely. The legislation would also harmonize data-privacy rules across the EU's 27 member states.

If the "right to be forgotten" is an attractive name, it's also a slightly misleading one. "It's actually more a right to delete than a right to be forgotten," says Jim Killock, the executive director of the Open Rights Group, a consumer advocacy organization. "The idea is not really about the forgetfulness of companies. The idea is that a company, when asked to remove your data, should delete it in full."

So, for example, you should have the right not just to delete your Facebook account, but to ensure that all your personal information is permanently scoured from the site.

Indeed, one of the more disconcerting elements of online memory is our lack of control over our digital trails. Nowhere is this clearer than in the question of what to do with our e-mail and social media when we die. A mini-industry has cropped up to deal with this problem.

For example, Google recently announced a tool called Inactive Account Manager, which lets you arrange for your Google data – Gmail messages, YouTube videos and so on – to be passed on to a friend or loved one after your death, or deleted entirely, so that your digital slate, in death, may be wiped clean.

The changing face of online memory is also apparent, in a slightly more frivolous form, in Snapchat, a smartphone app that embraces impermanence. Like countless other apps, Snapchat allows you to take a photo or video, and send it to friends. The difference is that, after 10 seconds or less, the photo is deleted forever. Not even the company holds on to the data. Its mascot, fittingly, is a ghost.

Although Snapchat's primary function might seem pornographic – imagine the consequence-free possibilities – its appeal has proved much broader than that: 150 million self-destructing photos now pass through the app every day. The company recently raised $13.5-million, and Facebook has released a copycat service called Poke. Snapchat has struck a chord, perhaps, because we all long to be forgotten.

But just as not everything should be remembered, surely not everything should be deleted. So how do we strike a balance? Prof. Mayer-Schönberger has one pragmatic suggestion: assigning optional expiry dates to data. For example, every Facebook post might exist for some predetermined amount of time before it vanishes.

"You could still share a lot of information," he notes. "You could at the same time, though, control how long you want to share something for, and that is up to you. You basically condition the digital tools to be forgetful."

To further illustrate the value of forgetting, Prof. Mayer-Schönberger points to Funes the Memorious, a short story by the great Argentine writer Jorge Luis Borges. The title character, a boy who suffered a horse-riding accident, is incapable of forgetting and becomes lost in specificity: the creation of a new numeric system, the classification of his every childhood memory.

"To think is to forget differences, generalize, make abstractions," Borges writes. "In the teeming world of Funes, there were only details, almost immediate in their presence."

And what about Rehtaeh Parsons? Although most of the media attention surrounding her death focused on "cyberbullying," the word itself hardly does justice to the misogynist torture she faced. Now, in death, she faces a different kind of malice: posthumous victim-blaming.

After her death, anonymous online trolls set up a fake "The Real Rehtaeh Parsons" Facebook account, and, last week, National Post columnist Christie Blatchford even suggested that the girl had lied about being raped.

Online, Rehtaeh faces a sort of permanent libel – uglier, in some ways, than the kind she faced in life, because it is public and can be summoned by typing a few words into a search bar. And it exists in perpetuity.

Perhaps that's what her father meant when he wrote that he did not want his daughter "defined by a Google search." Not just that he didn't want the horrific circumstances of her death to serve as a tombstone, but that granting primacy to those circumstances gives power to her tormentors: the boys who raped her, the kids who bullied her, the police who dismissed her case, the trolls who still savage her memory.

Turning away from endless online memory does not mean that we should forget someone like Rehtaeh Parsons. But it could mean considering how our digital lives might be reshaped to better reflect what is best about human memory: its selectivity, its fallibility, its sensibility.

Otherwise, we could end up like poor Funes, afloat on a sea of endless detail, the broader view obscured by our own eternal knowledge.
