Google. The Right to be Forgotten. Cases in Business and Society

Case Study Instructions:

State the problems facing the company/government 

Identify and link the symptoms and root causes of the problems 

Differentiate short term from long term problems 

Conclude with the decision facing the company/government 

(and the analysis must incorporate the case text below)
In 2009, Mario Costeja Gonzalez, a self-employed attorney living in a small town outside Madrid, Spain, casually “googled” himself and was startled by what came up on his computer screen. Prominently displayed in the search results was a brief legal notice that had appeared more than a decade earlier in a local newspaper, La Vanguardia, which listed property seized and being auctioned by a government agency for nonpayment of debts. Among the properties was a home jointly owned by Costeja and his wife.
Costeja immediately realized that this information could damage his reputation as an attorney. Equally troubling, the information was no longer factual. He had paid his debt nearly a decade earlier. Abanlex, Costeja’s small law firm, depended on the Internet to gain much of its new business, which was often generated by a Google search. Potential clients might choose not to hire him, based on the old auction notice, he reflected. His mind then turned to the possible effects of this kind of information on other people’s livelihoods. “There are people who cannot get a job because of content that is irrelevant,” he thought.1 “I support freedom of expression and I do not defend censorship. [However, I decided] to fight for the right to request the deletion of data that violates the honor, dignity and reputation of individuals.”2
The next week, Costeja wrote to La Vanguardia and requested that it remove the article about his debt notice, because it had been fully resolved a number of years earlier and reference to it now was therefore entirely irrelevant.3 In doing so, he was making use of his rights under Spain’s strong data protection policies, which recognized the protection and integrity of personal data as a constitutional right under Section 18 of the nation’s Data Protection Act.4 In response, the newspaper informed him that it had recently uploaded to the Internet all its past archives, dating back to 1881, to allow them to be searched by the public. It also noted that the auction notice had originally been publicly posted in order to secure as many bidders as possible. The newspaper refused Costeja’s request, stating that the information was obtained from public records and had thus been published lawfully.5
To be sure, the real problem for Costeja was not that the notice had appeared in La Vanguardia’s digital library, but that it had shown up in the results of the most widely
By Cynthia E. Clark, Bentley University. Copyright © 2015 by the author. Used by permission.
1 “Google Privacy Campaigner Praises Search Engine for Bowing to EU,” Financial Times, May 30, 2014.
2 “The Man Who Sued Google to be Forgotten,” Newsweek, May 30, 2014.
3 European Parliament, Judgment of the Court, May 13, 2014, at http://curia.europa.eu/juris/document/document.jsf?docid=152065&doclang=EN.
4 “The Unforgettable Story of the Seizure to the Defaulter Mario Costeja González that Happened in 1998,” Derechoaleer, May 30, 2014, at http://derechoaleer.org/en/blog/2014/05/the-unforgettable-story-of-the-seizure-to-the-defaulter-mario-costeja-gonzalez-that-happened-in-1998.html.
5 “Will Europe Censor this Article?” The Atlantic, May 13, 2014, www.theatlantic.com/international/archive/2014/05/europes-troubling-new-right-to-be-forgotten/370796/.
used search engine in the world, Google, where potential clients might use it to judge his character.6 Following this reasoning, Costeja then wrote to Google Spain, the firm’s Spanish affiliate, only to be told that the parent company, Google Inc., was the entity responsible for the development of search results.7 Costeja was taken aback by this development. “The resources Google has at their disposal aren’t like those of any other citizens,” he reflected.8 Costeja felt he would be at a disadvantage in a lawsuit against an industry giant like Google.
In March 2010, after his unsuccessful attempts with the newspaper and Google Spain, Costeja turned to Spain’s Data Protection Agency (SDPA), the government agency responsible for enforcing the Data Protection Act. “Google in Spain asked me to address myself to its headquarters in the U.S., but I found it too far and difficult to launch a complaint in the U.S., so I went to the agency in Spain to ask for their assistance. They said I was right, and the case went to court,” he explained.9 In a legal filing, Costeja requested, first, that the agency issue an administrative order requiring La Vanguardia either to remove or alter the pages in question (so that his personal data no longer appeared) or to use certain tools made available by search engines in order to shield the data from view. Second, he requested that the agency require that Google Spain or Google Inc. remove or conceal his personal data so that it no longer appeared in the search results and in the links to La Vanguardia. Costeja stated that his debt had been fully resolved.10
With these steps, a small-town Spanish lawyer had drawn one of the world’s richest and best-known companies, Google, into a debate over the right to be forgotten.

Google, Inc.
Google Inc. was a technology company that built products and provided services to organize information. Founded in 1998 and headquartered in Mountain View, CA, Google’s mission was to organize the world’s information and make it universally accessible and useful. It employed more than 55,000 people and had revenues of $45 billion. The company also had 70 offices in more than 40 countries.
The company’s main product, Google Search, provided information online in response to a user’s search. Google’s other well-known products provided additional services. For example, Google Now provided information to users when they needed it, and its Product Listing Ads offered product image, price, and merchant information. The company also provided AdWords, an auction-based advertising program, and AdSense, which enabled websites that were part of the Google network to deliver ads. Google Display was a display advertising network; DoubleClick Ad Exchange was a marketplace for trading display ad space; and YouTube offered video, interactive, and other ad formats.
Search Technology
In its core business, Google conducted searches in three stages: crawling and indexing, applying algorithms, and fighting spam.
6 “The Unforgettable Story of the Seizure to the Defaulter Mario Costeja González that Happened in 1998,” Derechoaleer, May 30, 2014, http://derechoaleer.org/en/blog/2014/05/the-unforgettable-story-of-the-seizure-to-the-defaulter-mario-costeja-gonzalez-that-happened-in-1998.html.
7 B. Van Alsenoy, A. Kuczerawy, and J. Ausloos, “Search Engines after Google Spain: internet@liberty or privacy@peril?” ICRI Working Paper Series, September 6, 2013, at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2321494.
8 “Spain’s Everyday Internet Warrior Who Cut Free from Google’s Tentacles,” The Guardian, May 13, 2014, http://www.theguardian.com/technology/2014/may/13/spain-everyman-google-mario-costeja-gonzalez.
9 “The Man Who Sued Google to Be Forgotten,” op. cit.
10 Court of Justice, Judgment in Case C-131/12 Google Spain SL, Google Inc. v. Agencia Española de Protección de Datos, Mario Costeja González.

Crawlers, programs that browsed the web to create an index of data, looked at web pages and followed links on those pages. They then moved from link to link and brought data about those web pages back to Google’s servers. Google would then use this information to create an index to know exactly how to retrieve information for its users.
Algorithms were the computer processes and formulas that took users’ questions and turned them into answers. At the most basic level, Google’s algorithms looked up the user’s search terms in the index to find the most appropriate pages. For a typical query, thousands, if not millions, of web pages might have helpful information. Google’s algorithms relied on more than 200 unique signals or “clues” that made it possible to guess what an individual was really looking for. These signals included the terms on websites, the freshness of content, the region, and the page rank of the web page.11
Lastly, the company fought spam through a combination of computer algorithms and manual review. Spam sites attempted to game their way to the top of search results by repeating keywords, buying links that passed Google’s PageRank process, or putting invisible text on the screen. Google scouted out and removed spam because it could make legitimate websites harder to find. While much of this process was automated, Google did maintain teams whose job was to review sites manually.12
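The three-stage pipeline described above can be sketched in miniature. The Python fragment below is purely illustrative: the page contents, URLs, and one-signal scoring rule are invented for demonstration and bear no relation to Google's actual crawler or its 200-plus ranking signals.

```python
from collections import defaultdict

# Stage 1: "crawling" -- simulated here by an in-memory set of pages.
# (A real crawler follows links from page to page and copies content back.)
pages = {
    "a.example/auction": "property auction notice debt seizure madrid",
    "b.example/law":     "attorney law firm madrid legal services",
    "c.example/archive": "newspaper archive digital library history",
}

# Stage 2: indexing -- an inverted index maps each term to the pages
# containing it, like the index at the back of a textbook.
index = defaultdict(set)
for url, text in pages.items():
    for term in text.split():
        index[term].add(url)

# Stage 3: ranking -- one crude signal (how many query terms a page
# matches), where a real engine combines hundreds of signals.
def search(query):
    scores = defaultdict(int)
    for term in query.lower().split():
        for url in index.get(term, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)
```

Running `search("madrid auction")` would return the auction page first, since it matches both terms; this is the sense in which a prebuilt index lets the engine "know exactly how to retrieve information" without scouring the web at query time.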
Policy on Information Removal
Google’s policy on the general removal of information was the following:
Upon request, we’ll remove personal information from search results if we believe it could make you susceptible to specific harm, such as identity theft or financial fraud. This includes sensitive government ID numbers like U.S. Social Security numbers, bank account numbers, credit card numbers and images of signatures. We generally don’t process removals of national ID numbers from official government websites because in those cases we consider the information to be public. We sometimes refuse requests if we believe someone is attempting to abuse these policies to remove other information from our results.13
Apart from this general policy, Google Inc. also removed content or features from its search results for legal reasons. For example, in the United States, the company would remove content with valid notification from the copyright holder under the Digital Millennium Copyright Act (DMCA), which was administered by the U.S. Copyright Office. The DMCA provided recourse for owners of copyrighted materials who believed that their rights under copyright law had been infringed upon on the Internet.14 Under the notice and takedown procedure of the law, a copyright owner could notify the service provider, such as Google, requesting that a website or portion of a website be removed or blocked. If, upon receiving proper notification, the service provider promptly did so, it would be exempt from monetary liability.
Google regularly received such requests from copyright holders and those that represented them, such as the Walt Disney Company and the Recording Industry Association of America. Google produced and made public a list of the domain portions of URLs that had been the subject of a request for removal, and noted which ones had been removed.
11 Information on PageRank is available online at http://infolab.stanford.edu/~backrub/google.html.
12 Information about Google search is available online at www.google.com/insidesearch/howsearchworks/index.html.
13 http://www.google.com/insidesearch/howsearchworks/policies.html.
14 Information about the Digital Millennium Copyright Act (“DMCA”) notice procedure is available at www.fosterinstitute.com/legal-forms/dmca-notice.
As of July 2015, it had removed more than 600,000 URLs out of more than 2.4 million requests.15
Likewise, content on local versions of Google was also removed when required by national laws. For example, content that glorified the Nazi party was illegal in Germany, and content that insulted religion was illegal in India.16 The respective governments, via a court order or a routine request as described above, typically made these requests. Google reviewed these requests to determine if any content should be removed because it violated a specific country’s law.
When Google removed content from search results for legal reasons, it first displayed a notification that the content had been removed and then reported the removal to www.chillingeffects.org, a website established by the Electronic Frontier Foundation and several law schools. This website, which later changed its name to lumendatabase.org, collected and analyzed legal complaints and requests for removal of a broad set of online materials. It was designed to help Internet users know their rights and understand the law. Researchers could use the data to study the prevalence of legal threats and the source of content removals. This database also allowed the public to search for specific takedown notifications.17
Google removed content quickly. Its average processing time across all copyright infringement removal requests submitted via its website was approximately 6 hours. Different factors influenced the processing time, including the method of delivery, language, and completeness of the information submitted.
The Right to Be Forgotten
The right to be forgotten could be understood as people’s right to request that information be removed from the Internet or other repositories because it violated their privacy or was no longer relevant. This right assumed greater prominence in the digital era, when people began finding it increasingly difficult to escape information that had accumulated over many years, resulting in expressions such as “the net never forgets,” “everything is in the cloud,” “reputation bankruptcy,” and “online reputation.”18 According to Jeffrey Rosen, professor of law at George Washington University, the intellectual roots of the right to be forgotten could be found in French law, “which recognizes le droit à l’oubli—or the ‘right of oblivion’—a right that allows a convicted criminal who has served his time and been rehabilitated to object to the publication of the facts of his conviction and incarceration.”19
Although the right to be forgotten was rooted in expunging criminal records, the rise of the Internet had given the concept a new, more complex meaning. Search engines enabled users to access information on just about any topic with considerable ease. The ease with which information could be shared, stored, and retrieved through online search raised issues of both privacy and freedom of expression. On the one hand, when opening a bank account, joining a social networking website, or booking a flight online, a consumer would voluntarily disclose vital personal information such as name, address, and credit card numbers. Consumers were often unsure of what happened to their data and were concerned that it might fall into the wrong hands—that is, that their privacy would be violated.
15 Information about the removal process is available online at www.google.com/transparencyreport/removals/copyright/.
16 http://www.google.com/insidesearch/howsearchworks/policies.html.
17 The Berkman Center for Internet & Society, at https://cyber.law.harvard.edu/research/chillingeffects.
18 “The Unforgettable Story of the Seizure to the Defaulter Mario Costeja González that Happened in 1998,” op. cit.
19 “Will Europe Censor this Article?” op. cit.

On the other hand, by facilitating the retrieval of information, search engines enhanced individuals’ freedom to receive and impart information. Any interference with search engine activities could therefore pose a threat to the effective enjoyment of these rights.20 As Van Alsenoy, a researcher at the Interdisciplinary Center for Law and Information Communication Technology, argued, “In a world where search engines are used as the main tool to find relevant content online, any governmental interference in the provisioning of these services presents a substantial risk that requires close scrutiny.”21
Europe
Since the 1990s, both the European Union and its member states (such as Spain) had enacted laws that addressed the right to privacy and, by extension, the right to be forgotten. A fundamental right of individuals to protect their data was introduced in the EU’s original data protection law, passed in 1995. Specifically, the European Data Protection Directive 95/46 defined the appropriate scope of national laws relating to personal data and the processing of those data. According to Article 3(1), Directive 95/46 applied “to the processing of personal data wholly or partly by automatic means, and to the processing otherwise than by automatic means of personal data which form part of a filing system or are intended to form part of a filing system.”22 Article 2(b) of the EU Data Protection Directive 95/46 defined the processing of personal data as
any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.
Individual countries within the European Union also enacted their own laws, which were sometimes stronger than those of the EU. For example, in Spain, the protection of data was a constitutional right. The Spanish Constitution recognized the right to personal privacy, secrecy of communications, and the protection of personal data. These rights were protected through the Data Protection Act (the “Act”), passed in 1999, which incorporated the 1995 European Directive on data protection, and was enforced by the Spanish Data Protection Agency (SDPA). Created in 1993, this agency was relatively inactive until the passing of the Act, which gave it more powers and a mandate to enforce privacy rules in a wide range of situations.23
The Spanish agency exercised its powers broadly. For example, in 2013, it fined telecom firm Telefonica SA €20,000 for twice listing an individual’s phone number in local phone books without the individual’s prior consent. In 2008, the agency fined a marketing company €600 for using “recommend this to a friend” icons on websites, saying that senders of recommendation e-mails had to first request the recipient’s permission. The agency had also successfully required anyone using security cameras to clearly mark their presence with a recognizable icon. Supporters of this move had highlighted the importance of transparency in protecting one’s privacy.24
20 Van Alsenoy et al., 2013.
21 Van Alsenoy et al., 2013.
22 Van Alsenoy et al., 2013.
23 “Data Protection in Spain,” June 24, 2010, at www.i-policy.org/2010/06/data-protection-in-spain.html.
24 “Spanish Agency Behind the Google Ruling Lauded by Some, Hated by Others,” The Wall Street Journal, June 26, 2014.
Over time, however, differences in the way that each EU country interpreted privacy rights led to an uneven level of protection for personal data, depending on where an individual lived or bought goods and services. This led the European high court to take a second look, in 2013, at the original law.25 A European Commission memo at that time noted that the right “is about empowering individuals, not about erasing past events or restricting freedom of the press.”26 The changes were intended to give citizens more control over their personal data, making it easier to access and improve the quality of information they received about what happened to their data once they decided to share it. An unanswered question, however, was the latitude given to national courts and regulators across Europe to set the parameters by which these requests could be made.27
The United States
U.S. courts had taken a very different approach to privacy and to the right to be forgotten. A few U.S. laws recognized the right to be forgotten; the Fair Credit Reporting Act of 1970, for example, gave individuals the right to delete certain negative information about their credit—such as late payments, tax liens, or judgments—seven years from the date of the delinquency. But, for the most part, fundamental differences in legal philosophy made this right less likely to become widely supported in the United States. In an article published in the Atlantic in May 2014, Matt Ford suggested that in the U.S. context, one person’s right to be forgotten logically imposed a responsibility to forget upon someone else, a notion that was alien to American law. The First Amendment to the Constitution barred the government from interfering with free speech. Law professor Rosen argued that the First Amendment would make a right to be forgotten virtually impossible, not only to create but to enforce. For example, the U.S. Supreme Court ruled in 1989 that penalizing a newspaper for publishing truthful, lawfully obtained information from the public record was unconstitutional.28
The Lawsuit and Court Decision
The main focus of Costeja’s complaint before the Spanish Data Protection Agency (SDPA) was his request that La Vanguardia remove the debt notice from its archives. In doing so, he was claiming his constitutional right to protect the integrity of his personal data. Costeja’s request had two parts: that (1) La Vanguardia be required either to remove or alter the pages in question or to use certain tools made available by search engines in order to protect the data and (2) that Google Spain or Google Inc. be required to remove or conceal the personal data relating to him so that the data no longer appeared in search results.
In July 2010, two months after Costeja’s original request, the SDPA ordered Google Spain and Google Inc. to take “all reasonable steps to remove the disputed personal data from its index and preclude further access,” upholding that part of the complaint.29
25 “What is the ‘Right to Be Forgotten?’” The Wall Street Journal, May 13, 2014.
26 European Commission, “LIBE Committee Vote Backs New EU Data Protection Rules,” October 22, 2013, at http://europa.eu/rapid/press-release_MEMO-13-923_en.htm.
27 “What is the ‘Right to Be Forgotten?’” op. cit.
28 “Will Europe Censor This Article?” op. cit.
29 Audiencia Nacional, Sala de lo Contencioso, Google Spain SL y Google Inc., S.L. c. Agencia de Protección de Datos, paragraph 1.2, at www.poderjudicial.es.

However, the SDPA rejected Costeja’s complaint as it related to La Vanguardia, because it considered that the newspaper’s publication of the information in question was legally justified.30
A year later, Google filed an appeal against the decision by the SDPA before the Audiencia Nacional in Madrid, Spain’s highest national court. In March 2012, this court referred the case to the European Court of Justice, the EU’s high court, for a preliminary ruling.31
In their briefs, Google Spain and Google Inc.’s argument hinged on the meaning of “personal data” and “crawling.” Crawling, as noted above, was the use of software programs to find multiple websites that responded to requests for information online.32 These programs were configured to look for information on the Internet, according to a set of criteria that told them where to go and when.33 Once the relevant web pages had been copied and collected, their content was analyzed and indexed.34 Google compared its search engine index to an index at the back of a textbook, in that it included information about words and their locations.35
Specifically, Google argued before the European Court of Justice that because it crawled and indexed websites “indiscriminately” (that is, without a deliberate intent to process personal data as such), no processing of personal data within the meaning of Article 2(b) of the EU Data Protection Directive 95/46 actually took place. This absence of intent, the company argued, clearly distinguished Google’s activities as a search engine provider from the processing of personal data as interpreted by the Court.
Google’s other main argument was that the publisher of the information should be the sole controller of data, not the search engine. After all, its attorneys argued, Google’s intervention was purely accessory in nature; it was merely making information published by others more readily accessible. If a publisher, for whatever reason, decided to remove certain information from its website, this information would (eventually) be removed from Google’s index and would no longer appear in its search results. As a result, Google’s counsel argued, the role of a search engine should be thought of as an “intermediary.”
In May 2014, the European Court of Justice ruled against Google. The court found the Internet search provider was responsible for the processing of personal data that appeared on web pages published by third parties. It further required Google to remove links returned in search results based on an individual’s name when those results were deemed to be “inadequate, irrelevant or no longer relevant, or excessive.” At the heart of the court’s logic was the process that Google used to produce its search results. The official ruling explained the court’s rationale:
The Court points out in this context that processing of personal data carried out by such an operator enables any Internet user, when he makes a search on the basis of an individual’s name, to obtain, through the list of results, a structured overview of the information relating to that individual on the internet. The Court observes, furthermore, that this information potentially concerns a vast number of aspects of his private life and that, without the search engine, the information could not have been interconnected or could have been only with great difficulty.36
30 “Spanish Agency behind the Google Ruling Lauded by Some, Hated by Others,” The Wall Street Journal, June 23, 2014, at http://online.wsj.com/articles/spanish-agency-behind-the-google-ruling-lauded-by-some-hated-by-others-1403795717.
31 Van Alsenoy et al., 2013.
32 See http://answers.google.com/answers/threadview/id/33696.html.
33 Matt Cutts (Google Quality Group Engineer), How Search Works, s30–s44, available at http://www.youtube.com/watch?v=BNHR6IQJGZs.
34 Van Alsenoy et al., 2013.
35 More information about crawling is available online at https://www.google.com/search/about/.
In essence, the Court ruled that an activity, “whether or not by automatic means” could be considered to be the “processing of personal data” within the meaning of Article 2(b), even if no intention to process such data existed.37 The court’s ruling applied to any search engine operators that had a branch or a subsidiary in any of the 28 member states of the EU.38
Costeja’s lawyer, Joaquín Muñoz, was pleased with the ruling. “When you search for something in Google, they don’t scour the entire Internet for you and then give you a result. They’ve stored links, organized them, and they show them based on a criteria they’ve decided upon.”39 As for Costeja, he expressed satisfaction with the result of his four-year legal crusade. Speaking of the court’s decision, he said, “I think this is the correct move. You have to provide a path for communication between the user and the search engine. Now that communication can take place.”40
Google’s Application of the Ruling
For its part, Google—although disappointed with the ruling—set about complying with it. Soon after the court decision, it removed Costeja’s disputed information from its search results. But, the company also took more general action.
The Court’s decision recognized Google as a data controller, or the operator of the search engine and the party responsible for its data. As such, the court said, Google was required to police its links and put into place a mechanism to address individual concerns. Accordingly, shortly after the ruling was announced, Google set up an online form for users (from the European Union only) to request the right to be forgotten. The company website stated that each request would be evaluated individually and that Google would attempt to “balance the privacy rights of the individual with the public’s interest to know and the right to distribute information.”41 Once an individual had filled out the form, he or she received a confirmation. Each request was assessed on a case-by-case basis. Occasionally, Google would ask for more information from the individual. Once Google had made its decision, it notified the individual by e-mail, providing a brief explanation if the decision was against removal. If so, the individual could request that a local data protection authority review Google’s decision.
In evaluating a request, Google looked at whether the results included outdated or inaccurate information about the individual. It also weighed whether or not the information was of public interest. For example, Google generally retained the information if it related to financial scams, professional malpractice, criminal convictions, or a government official’s public conduct.42
36 Court of Justice. Judgment in Case C-131/12 Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González.
37 European Parliament, Judgment of the Court, May 13, 2014, at http://curia.europa.eu/juris/document/document.jsf?docid=152065&doclang=EN.
38 European Commission, “Fact sheet on the Right to be Forgotten,” at http://ec.europa.eu/justice/data-protection/files/factsheets/factsheet_data_protection_en.pdf.
39 “Spain’s Everyday Internet Warrior Who Cut Free from Google’s Tentacles,” op. cit.
40 “Google Privacy Campaigner Praises Search Engine for Bowing to EU,” op. cit.
41 “Search Removal Request under Data Protection Law in Europe,” at https://support.google.com/legal/contact/lr_eudpa?product=websearch.
42 Frequently Asked Questions, at http://www.google.com/transparencyreport/removals/europeprivacy/faq/?hl=en#how_does_googles_removals.

At the same time, Google invited eight independent experts to form an advisory council expressly to “advise it on performing the balancing act between an individual’s right to privacy and the public’s interest in access to information.”43 The committee included three professors (two of law and one of ethics), a newspaper editorial director, a former government official, and three privacy and freedom of speech experts (including one from the United Nations). Google’s CEO and chief legal officer served as conveners. The committee’s job was to provide recommendations to Google on how to best implement the EU court’s ruling.
The majority recommendation of the advisory council, published on February 6, 2015, was that the right to be forgotten ruling should apply only within the 28 countries in the European Union.44 As a practical matter, this meant that Google was only required to apply removals to European domains, such as Google.fr or Google.co.uk, but not Google.com, even when accessed in Europe. Although over 95 percent of all queries originating in Europe used European domains, users could still access information that had been removed via the Google.com site.
The report also explained that once the information was removed, it was still available at the source site (e.g., the newspaper article about Costeja in La Vanguardia). Removal meant merely that its accessibility to the general public was reduced because searches for that information would not return a link to the source site. A person could still find the information, since only the link to the information had been removed, not the information itself.
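The distinction the report draws, removing only the name-based search link while leaving the source page published, can be made concrete with a small sketch. The dictionary structures, query strings, and URLs below are hypothetical illustrations, not Google's actual data structures.

```python
# Hypothetical sketch: "delisting" removes the result link shown for a
# name query, while the source page stays published and remains
# reachable through other queries. All names and URLs are invented.

index = {
    "mario costeja":  ["lavanguardia.example/auction-1998"],
    "auction madrid": ["lavanguardia.example/auction-1998"],
}
source_pages = {
    "lavanguardia.example/auction-1998": "Legal notice: properties to be auctioned ...",
}

def delist(name_query):
    # Drop only the links returned for searches on the person's name.
    index.pop(name_query, None)

delist("mario costeja")

assert "mario costeja" not in index                         # name search now empty
assert "auction madrid" in index                            # other queries still work
assert "lavanguardia.example/auction-1998" in source_pages  # source page untouched
```

The assertions capture the report's point exactly: only the link is gone, so a determined person can still find the information at its source or via queries that do not use the data subject's name.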
The advisory council also recommended a set of criteria Google should use in assessing requests by individuals to “delist” their information (that is, to remove certain links in search results based on queries for that individual’s name). How should the operator of the search engine best balance the privacy and data protection rights of the subject with the interest of the general public in having access to the information? The authors of the report felt that whether the data subject experienced harm from such accessibility to the information was relevant to this balancing test. Following this reasoning, they identified four primary criteria for evaluating delisting requests:
∙ First, what was the data subject’s role in public life? Did the individual have a clear role in public life (e.g., as a CEO, politician, or sports star)? If so, this would weigh against delisting.
∙ Second, what type of information was involved? Information that would normally be considered private (such as financial information, details of a person’s sex life, or identification numbers) would weigh toward delisting. Information that would normally be considered to be in the public interest (such as data relevant to political discourse, citizen engagement, or governance) would normally weigh against delisting.
∙ Third, what was the source of the information? Here, the report suggested that journalistic writing or government publications would normally not be delisted.
∙ Finally, the report considered the effect of time, given that as circumstances change, the relevance of information might fade. Thus, the passage of time might favor delisting.
The advisory council also considered procedures and recommended that Google adopt an easily accessible and easy-to-understand form for data subjects to use in submitting their requests.
43 The Advisory Council to Google on the Right to be Forgotten, February 6, 2015, at https://drive.google.com/file/d/0B1UgZshetMd4cEI3SjlvV0hNbDA/view?pli=1.
44 “Limit ‘Right to Be Forgotten’ to Europe, Panel Tells Google,” The New York Times, February 6, 2015.
Case 2 Google and the Right to Be Forgotten 459
The recommendations of the advisory council were not unanimous. Jimmy Wales, the cofounder of Wikipedia and one of the eight group members, appended a dissenting comment to the report. “I completely oppose the legal situation in which a commercial company is forced to become the judge of our most fundamental rights of expression and privacy, without allowing any appropriate procedure for appeal by publishers whose work is being suppressed,” Mr. Wales wrote. “The recommendations to Google contained in this report are deeply flawed due to the law itself being deeply flawed.”45

Discussion Questions
1. In what ways has technology made it more difficult for individuals to protect their privacy?
2. Do you believe an individual should have the right to be forgotten, that is, to remove information about themselves from the Internet? If so, should this right be limited, and if so, how?
3. How does public policy with respect to individual privacy differ in the United States and Europe, and what explains these differences?
4. Do you think Google should be responsible for modifying its search results in response to individual requests? If so, what criteria should it use in doing so? Are there limits to the resources the company should be expected to expend to comply with such requests?
5. If you were a Google executive, how would you balance the privacy rights of the indi- vidual with the public’s interest to know and the right to distribute information?


The Right to be Forgotten
The Right to be Forgotten
The debate over the right to be forgotten was ignited by a small-town Spanish lawyer, Mario Costeja Gonzalez, when he filed a lawsuit against Google concerning personal information that featured prominently in a Google search of his name (Derechoaleer, 2014). The search revealed information about financial problems he had resolved more than a decade earlier but that could still drive prospective clients away. In the lawsuit, Costeja requested that Google remove or conceal his personal data so that it no longer appeared in search results. Google thus faced two major problems: Costeja’s lawsuit itself and its implications for the company regarding the right to be forgotten (The Wall Street Journal, 2014).
The Spanish daily La Vanguardia had digitized its archives. Among the articles was an announcement it ran in 1998 publicizing the auction of a house to pay debts owed by Costeja. In 2009, Costeja “googled” himself and was alarmed by the results: the legal notice that had appeared in La Vanguardia featured prominently in the search (Derechoaleer, 2014). There were two reasons for his concern. First, the information could damage his reputation as an attorney, and prospective clients might choose not to hire him because of it. Second, the information was no longer factual; he had cleared his debt nearly a decade earlier. Reflecting on his situation, Costeja realized the possible negative effects this kind of information could have on other people’s lives and set out “to fight for the right to request the deletion of data that violates the honor, dignity, and reputation of individuals” (The Wall Street Journal, 2014).
Google was facing both long-term and short-term problems. The short term pr...