The Villa Gillet was commissioned by the Council for Artistic Creation to curate challenging encounters between French and American thinkers. This is how Walls and Bridges, a threefold series of events in New York City, came to be. Fueled by collaboration with wonderful partners, it fosters creative exchanges held in venues as diverse as the New York Public Library, the Brooklyn Flea Market, the Aperture Foundation, the New School and the Heyman Center for the Humanities.
Four years ago, Stacy Snyder, then a 25-year-old teacher in training at Conestoga Valley High School in Lancaster, Pa., posted a photo on her MySpace page that showed her at a party wearing a pirate hat and drinking from a plastic cup, with the caption "Drunken Pirate." After discovering the page, her supervisor at the high school told her the photo was "unprofessional," and the dean of Millersville University School of Education, where Snyder was enrolled, said she was promoting drinking in virtual view of her under-age students. As a result, days before Snyder's scheduled graduation, the university denied her a teaching degree. Snyder sued, arguing that the university had violated her First Amendment rights by penalizing her for her (perfectly legal) after-hours behavior. But in 2008, a federal district judge rejected the claim, saying that because Snyder was a public employee whose photo didn't relate to matters of public concern, her "Drunken Pirate" post was not protected speech.
When historians of the future look back on the perils of the early digital age, Stacy Snyder may well be an icon. The problem she faced is only one example of a challenge that, in big and small ways, is confronting millions of people around the globe: how best to live our lives in a world where the Internet records everything and forgets nothing - where every online photo, status update, Twitter post and blog entry by and about us can be stored forever. With Web sites like LOL Facebook Moments, which collects and shares embarrassing personal revelations from Facebook users, ill-advised photos and online chatter are coming back to haunt people months or years after the fact. Examples are proliferating daily: there was the 16-year-old British girl who was fired from her office job for complaining on Facebook, "I'm so totally bored!!"; there was the 66-year-old Canadian psychotherapist who tried to enter the United States but was turned away at the border and barred permanently from visiting the country - after a border guard's Internet search found that the therapist had written an article in a philosophy journal describing his experiments 30 years ago with L.S.D.
According to a recent survey by Microsoft, 75 percent of U.S. recruiters and human-resource professionals report that their companies require them to do online research about candidates, and many use a range of sites when scrutinizing applicants - including search engines, social-networking sites, photo- and video-sharing sites, personal Web sites and blogs, Twitter and online-gaming sites. Seventy percent of U.S. recruiters report that they have rejected candidates because of information found online, like photos and discussion-board conversations and membership in controversial groups.
We've known for years that the Web allows for unprecedented voyeurism, exhibitionism and inadvertent indiscretion, but we are only beginning to understand the costs of an age in which so much of what we say, and of what others say about us, goes into our permanent - and public - digital files. The fact that the Internet never seems to forget is threatening, at an almost existential level, our ability to control our identities; to preserve the option of reinventing ourselves and starting anew; to overcome our checkered pasts.
In a recent book, "Delete: The Virtue of Forgetting in the Digital Age," the cyberscholar Viktor Mayer-Schönberger cites Stacy Snyder's case as a reminder of the importance of "societal forgetting." By "erasing external memories," he says in the book, "our society accepts that human beings evolve over time, that we have the capacity to learn from past experiences and adjust our behavior." In traditional societies, where missteps are observed but not necessarily recorded, the limits of human memory ensure that people's sins are eventually forgotten. By contrast, Mayer-Schönberger notes, a society in which everything is recorded will "forever tether us to all our past actions, making it impossible, in practice, to escape them." He concludes that "without some form of forgetting, forgiving becomes a difficult undertaking."
It's often said that we live in a permissive era, one with infinite second chances. But the truth is that for a great many people, the permanent memory bank of the Web increasingly means there are no second chances - no opportunities to escape a scarlet letter in your digital past. Now the worst thing you've done is often the first thing everyone knows about you.
All around the world, political leaders, scholars and citizens are searching for responses to the challenge of preserving control of our identities in a digital world that never forgets. Are the most promising solutions going to be technological? Legislative? Judicial? Ethical? A result of shifting social norms and cultural expectations? Or some mix of the above? Alex Türk, the French data-protection commissioner, has called for a "constitutional right to oblivion" that would allow citizens to maintain a greater degree of anonymity online and in public places. In Argentina, the writers Alejandro Tortolini and Enrique Quagliano have started a campaign to "reinvent forgetting on the Internet," exploring a range of political and technological ways of making data disappear. In February, the European Union helped finance a campaign called "Think B4 U post!" that urges young people to consider the "potential consequences" of publishing photos of themselves or their friends without "thinking carefully" and asking permission. And in the United States, a group of technologists, legal scholars and cyberthinkers are exploring ways of recreating the possibility of digital forgetting. These approaches share the common goal of reconstructing a form of control over our identities: the ability to reinvent ourselves, to escape our pasts and to improve the selves that we present to the world.
The most ambitious legal proposals focus on the creation of a "right to oblivion," a legally enforceable right of individuals to remove personal data they have posted. The German Federal Data Privacy Commissioner, Peter Schaar, has proposed that employers should not be able to base employment decisions on data marked as private on Facebook: they couldn't require job applicants, for example, to open their Facebook profiles during job interviews. But although it might be feasible to restrict the data that employers can consider, requiring the deletion of data is more complicated. How can regulatory bodies require the deletion of data that has already been shared? Should there be a cause of action against Facebook users who copy and forward private information in violation of privacy settings? And since most data is stored in many places, creating a cause of action against every holder of that data would require an ambitious rethinking of our ideas of property rights on the Web and of who owns data.
The regulatory proposals become even more complicated - and threatening to free speech - when they involve liability against search engines. The European Commission has proposed requiring search engines to ignore tagged results, and the German interior minister would create a right of the victim to put his version of events at the top of any search results. And last year, an Argentinian judge held Google and Yahoo liable for causing "moral harm" and violating the privacy of Virginia Da Cunha, a pop star, actress, and lead singer of a band called the Virgin Pancakes. The judge ordered Google and Yahoo to pay 50,000 pesos each in damages simply because their search results had included pictures of Da Cunha that were linked to erotic content. The ruling was overturned on appeal in August, but there are at least 130 similar cases pending in Argentina to force search engines to remove or block offensive content, according to The New York Times. In the United States, search engines are protected by the Communications Decency Act, which immunizes Internet service providers from being held liable for content posted by third parties. But, as liability against search engines expands abroad, it will seriously curtail free speech: Yahoo says that the only way to comply with injunctions is to block all sites that refer to a particular plaintiff.
That's one reason that the most promising solutions to the problem of embarrassing but true information online may be not legal but technological ones. Instead of suing after the damage is done (or hiring a firm to clean up our messes), we need to explore ways of pre-emptively making the offending words or pictures disappear.
Jorge Luis Borges, in his short story "Funes, the Memorious," describes a young man who, as a result of a riding accident, has lost his ability to forget. Funes has a tremendous memory, but he is so lost in the details of everything he knows that he is unable to convert the information into knowledge and unable, as a result, to grow in wisdom. Viktor Mayer-Schönberger, in "Delete," uses the Borges story as an emblem for the personal and social costs of being so shackled by our digital past that we are unable to evolve and learn from our mistakes. After reviewing the various possible legal solutions to this problem, Mayer-Schönberger says he is more convinced by a technological fix: namely, mimicking human forgetting with built-in expiration dates for data. He imagines a world in which digital-storage devices could be programmed to delete photos or blog posts or other data that have reached their expiration dates, and he suggests that users could be prompted to select an expiration date before saving any data.
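Mayer-Schönberger's expiration-date idea is simple enough to sketch in code. The toy store below is a hypothetical illustration, not taken from any real system: each saved item carries a user-chosen lifetime, and a purge step deletes whatever has outlived it. The clock is injectable so the "forgetting" can be demonstrated without waiting.

```python
import time

class ExpiringStore:
    """Toy key-value store in which every item carries an expiration date,
    mimicking Mayer-Schönberger's proposal of data that deletes itself."""

    def __init__(self, clock=time.time):
        self._clock = clock   # injectable for demonstration/testing
        self._items = {}      # key -> (value, expires_at)

    def save(self, key, value, ttl_seconds):
        # The user picks a lifetime at the moment the data is saved.
        self._items[key] = (value, self._clock() + ttl_seconds)

    def get(self, key):
        value, expires_at = self._items[key]
        if self._clock() >= expires_at:
            del self._items[key]   # expired data behaves as if forgotten
            raise KeyError(key)
        return value

    def purge(self):
        """Delete everything past its expiration date."""
        now = self._clock()
        expired = [k for k, (_, exp) in self._items.items() if now >= exp]
        for key in expired:
            del self._items[key]
```

In this sketch the deletion is merely a dictionary operation; a real implementation would have to guarantee that no copy survives on disk or on other machines, which is precisely the hard part the essay goes on to discuss.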
This is not an entirely fanciful vision. Google not long ago decided to render all search queries anonymous after nine months (by deleting part of each Internet protocol address), and the upstart search engine Cuil has announced that it won't keep any personally identifiable information at all, a privacy feature that distinguishes it from Google. And there are already small-scale privacy apps that offer disappearing data. An app called TigerText allows text-message senders to set a time limit from one minute to 30 days after which the text disappears from the company's servers on which it is stored and therefore from the senders' and recipients' phones. (The founder of TigerText, Jeffrey Evans, has said he chose the name before the scandal involving Tiger Woods's supposed texts to a mistress.)
Expiration dates could be implemented more broadly in various ways. Researchers at the University of Washington, for example, are developing a technology called Vanish that makes electronic data "self-destruct" after a specified period of time. Instead of relying on Google, Facebook or Hotmail to delete the data that is stored "in the cloud" - in other words, on their distributed servers - Vanish encrypts the data and then "shatters" the encryption key. To read the data, your computer has to put the pieces of the key back together, but they "erode" or "rust" as time passes, and after a certain point the document can no longer be read. Tadayoshi Kohno, a designer of Vanish, told me that the system could provide expiration dates not only for e-mail but also for any data stored in the cloud, including photos or text or anything posted on Facebook, Google or blogs. The technology doesn't promise perfect control - you can't stop someone from copying your photos or Facebook chats during the period in which they are not encrypted. But as Vanish improves, it could bring us much closer to a world where our data doesn't linger forever.
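The encrypt-then-shatter mechanism can be illustrated in miniature. The sketch below is not the real Vanish system (which uses threshold secret sharing and stores key pieces in a public peer-to-peer network); it is a simplified stand-in assuming an all-or-nothing XOR split: data is encrypted under a random one-time key, the key is divided into shares, and once any single share "erodes," the ciphertext becomes unreadable noise.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_and_split(plaintext: bytes, n_shares: int):
    """Encrypt under a random one-time key, then split the key into
    n XOR shares. All n shares are needed to rebuild the key."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = xor_bytes(plaintext, key)
    shares = [secrets.token_bytes(len(key)) for _ in range(n_shares - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)
    shares.append(last)  # XOR of all n shares reconstructs the key
    return ciphertext, shares

def decrypt(ciphertext: bytes, shares) -> bytes:
    """Rebuild the key from the surviving shares and decrypt. If any
    share has eroded, the recovered key is wrong and the output is
    unreadable noise - the data has effectively "self-destructed"."""
    key = bytes(len(ciphertext))
    for s in shares:
        key = xor_bytes(key, s)
    return xor_bytes(ciphertext, key)
```

With every share present, decryption recovers the original text; delete even one share and no amount of effort recovers it. Vanish achieves the erosion by letting the network naturally discard the key pieces over time.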
Kohno told me that Facebook, if it wanted to, could implement expiration dates on its own platform, making our data disappear after, say, three days or three months unless a user specified that he wanted it to linger forever. It might be a more welcome option for Facebook to encourage the development of Vanish-style apps that would allow individual users who are concerned about privacy to make their own data disappear without imposing the default on all Facebook users.
So far, however, Mark Zuckerberg, Facebook's C.E.O., has been moving in the opposite direction - toward transparency rather than privacy. In defending Facebook's recent decision to make profile information about friends and relationship status public by default rather than private, Zuckerberg said in January to the founder of TechCrunch that Facebook had an obligation to reflect "current social norms" that favored exposure over privacy. "People have really gotten comfortable not only sharing more information and different kinds but more openly and with more people, and that social norm is just something that has evolved over time," he said.
Still, Zuckerberg is on to something when he recognizes that the future of our online identities and reputations will ultimately be shaped not just by laws and technologies but also by changing social norms. And norms are already developing to recreate off-the-record spaces in public, with no photos, Twitter posts or blogging allowed. Milk and Honey, an exclusive bar on Manhattan's Lower East Side, requires potential members to sign an agreement promising not to blog about the bar's goings on or to post photos on social-networking sites, and other bars and nightclubs are adopting similar policies. I've been at dinners recently where someone has requested, in all seriousness, "Please don't tweet this" - a custom that is likely to spread.
In addition to exposing less for the Web to forget, it might be helpful for us to explore new ways of living in a world that is slow to forgive. It's sobering, now that we live in a world misleadingly called a "global village," to think about privacy in actual, small villages long ago. In the villages described in the Babylonian Talmud, for example, any kind of gossip or tale-bearing about other people - oral or written, true or false, friendly or mean - was considered a terrible sin because small communities have long memories and every word spoken about other people was thought to ascend to the heavenly cloud. (The digital cloud has made this metaphor literal.) But the Talmudic villages were, in fact, far more humane and forgiving than our brutal global village, where much of the content on the Internet would meet the Talmudic definition of gossip: although the Talmudic sages believed that God reads our thoughts and records them in the book of life, they also believed that God erases the book for those who atone for their sins by asking forgiveness of those they have wronged. In the Talmud, people have an obligation not to remind others of their past misdeeds, on the assumption they may have atoned and grown spiritually from their mistakes. "If a man was a repentant [sinner]," the Talmud says, "one must not say to him, Remember your former deeds."
Unlike God, however, the digital cloud rarely wipes our slates clean, and the keepers of the cloud today are sometimes less forgiving than their all-powerful divine predecessor. In an interview with Charlie Rose on PBS, Eric Schmidt, the C.E.O. of Google, said that "the next generation is infinitely more social online" - and less private "as evidenced by their Facebook pictures," which "will be around when they're running for president years from now." Schmidt added: "As long as the answer is that I chose to make a mess of myself with this picture, then it's fine. The issue is when somebody else does it." If people chose to expose themselves for 15 minutes of fame, Schmidt says, "that's their choice, and they have to live with it."
Schmidt added that the "notion of control is fundamental to the evolution of these privacy-based solutions," pointing to Google Latitude, which allows people to broadcast their locations in real time.
This idea of privacy as a form of control is echoed by many privacy scholars, but it seems too harsh to say that if people like Stacy Snyder don't use their privacy settings responsibly, they have to live forever with the consequences. Privacy protects us from being unfairly judged out of context on the basis of snippets of private information that have been exposed against our will; but we can be just as unfairly judged out of context on the basis of snippets of public information that we have unwisely chosen to reveal to the wrong audience.
Moreover, the narrow focus on privacy as a form of control misses what really worries people on the Internet today. What people seem to want is not simply control over their privacy settings; they want control over their online reputations. But the idea that any of us can control our reputations is, of course, an unrealistic fantasy. The truth is we can't possibly control what others say or know or think about us in a world of Facebook and Google, nor can we realistically demand that others give us the deference and respect to which we think we're entitled. On the Internet, it turns out, we're not entitled to demand any particular respect at all, and if others don't have the empathy necessary to forgive our missteps, or the attention spans necessary to judge us in context, there's nothing we can do about it.
But if we can't control what others think or say or view about us, we can control our own reaction to photos, videos, blogs and Twitter posts that we feel unfairly represent us.
Perhaps society will become more forgiving of drunken Facebook pictures over time. And some may welcome the end of the segmented self, on the grounds that it will discourage bad behavior and hypocrisy: it's harder to have clandestine affairs when you're broadcasting your every move on Facebook, Twitter and Foursquare. But a humane society values privacy, because it allows people to cultivate different aspects of their personalities in different contexts; and at the moment, the enforced merging of identities that used to be separate is leaving many casualties in its wake. Stacy Snyder couldn't reconcile her "aspiring-teacher self" with her "having-a-few-drinks self": even the impression, correct or not, that she had a drink in a pirate hat at an off-campus party was enough to derail her teaching career.
That doesn't mean, however, that it had to derail her life. After taking down her MySpace profile, Snyder is understandably trying to maintain her privacy: her lawyer told me in a recent interview that she is now working in human resources; she did not respond to a request for comment. But her success as a human being who can change and evolve, learning from her mistakes and growing in wisdom, has nothing to do with the digital file she can never entirely escape. Our character, ultimately, can't be judged by strangers on the basis of our Facebook or Google profiles; it can be judged only by those who know us and have time to evaluate our strengths and weaknesses, face to face and in context, with insight and understanding. In the meantime, as all of us stumble over the challenges of living in a world without forgetting, we need to learn new forms of empathy, new ways of defining ourselves without reference to what others say about us and new ways of forgiving one another for the digital trails that will follow us forever.
To cite this resource:
Jeffrey Rosen. 09/2011. "The End of Forgetting".
La Clé des Langues (Lyon: ENS LYON/DGESCO). ISSN 2107-7029. Updated September 23, 2011.
Accessed May 3, 2016.
Url : http://cle.ens-lyon.fr/anglais/the-end-of-forgetting-131823.kjsp