Over the past decade, the EU has made significant technological and legal strides toward the widespread adoption of electronic identification cards. An electronic ID card, or e-ID, serves as a form of secure identification for online transactions – in other words, it provides sufficient verification of an individual’s identity to allow that person to electronically sign and submit sensitive documents such as tax returns and voting ballots over the Internet. Many people see e-IDs as the future of secure identification since they offer the potential to greatly facilitate cardholders’ personal and business transactions, and the EU Commission has recognized this potential by drafting regulations meant to eliminate transactional barriers currently hindering the cards’ cross-border reach. However, the increasingly widespread use of e-ID systems also gives rise to significant data security concerns.
Countries including Spain, Italy, Germany, and Belgium have already adopted e-ID systems, and the precise mechanics of the systems differ from country to country. In the Estonian system, for example, each e-ID carries a chip with encrypted files that provide proof of identity when accessed by a card reader (which a cardholder may purchase and connect to his or her computer). Once the card is inserted into the reader, the user enters separate PINs to access the appropriate database and electronically sign e-documents.
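The two-PIN flow described above can be sketched in a few lines of code. The following is a toy illustration only – the class, the PINs, and the HMAC-based “signature” are hypothetical stand-ins for the card’s real on-chip private-key operations, which this post does not detail:

```python
import hashlib
import hmac

class MockEIDCard:
    """Toy model of an e-ID card with two PINs: one gates
    authentication (logging in to an e-service), the other gates
    electronic signing. Real cards keep a private key on the chip
    and compute signatures there; an HMAC stands in for that here."""

    def __init__(self, auth_pin: str, sign_pin: str):
        self._auth_pin = auth_pin
        self._sign_pin = sign_pin
        self._secret = b"card-held-secret"  # stand-in for the private key

    def authenticate(self, pin: str) -> bool:
        # First PIN: prove the cardholder's identity to a service
        return hmac.compare_digest(pin, self._auth_pin)

    def sign(self, pin: str, document: bytes) -> str:
        # Second PIN: authorize a signature over a specific document
        if not hmac.compare_digest(pin, self._sign_pin):
            raise PermissionError("wrong signing PIN")
        return hmac.new(self._secret, document, hashlib.sha256).hexdigest()

card = MockEIDCard(auth_pin="1234", sign_pin="98765")
assert card.authenticate("1234")  # log in to an e-service
signature = card.sign("98765", b"tax return 2013")
```

Separating the authentication PIN from the signing PIN means that a stolen login credential alone cannot produce a legally binding signature – each sensitive operation requires its own deliberate act by the cardholder.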
In fact, as recently detailed in The Economist, the small Baltic country of Estonia has one of Europe’s most highly developed e-ID systems and exemplifies the underlying potential of this technology. Around 1.1 million of the country’s 1.3 million residents have electronic ID cards, which they can use to take advantage of the country’s fairly advanced array of e-government offerings. Estonians can use their e-IDs to go online and securely file their taxes, vote in elections, log into their bank accounts, access governmental databases to check their medical records, and even set up businesses, among many other tasks. Estonia has even established an e-prescription system that permits doctors to order a refill by forwarding an online renewal notice to a national database, thereby allowing a patient to pick up a prescription from any pharmacy in the country simply by presenting his or her e-ID. The Estonian government also has announced a plan to start issuing cards to non-Estonians, so that citizens of other countries can easily set up businesses in Estonia or otherwise take advantage of the country’s many e-services. Estonia’s e-ID system thus illustrates how these cards can enhance convenience and save time that might otherwise be spent waiting in line to file documents in government offices, and it represents a significant step in the country’s effort to brand itself as “e-Estonia.”
Naturally, the use of these cards to access such large quantities of personal data implicates important data security issues. Estonia assures its cardholders that their transactions are secure because each card’s files are protected by 2048-bit public key encryption, and because users need to enter multiple PINs to access and use certain online services. To date, Estonia’s e-ID system has not suffered a major data breach. Nevertheless, the security of the system has been called into question by researchers who claim that Estonia’s e-voting process is vulnerable to manipulation by skilled hackers.
So what other factors may hinder the deployment of this technology, beyond the large upfront costs of developing an e-ID system and distributing the cards? As mentioned above, an e-ID system requires the adoption of extensive data security measures to ensure the confidentiality of personal data. Furthermore, systems like Estonia’s are so efficient in part because they draw on personal data – including health information – held within government databases. Citizens of countries with largely privatized medical systems, such as the United States, may be much more wary of government efforts to consolidate this type of personal information, even for the sake of efficiency. Other countries share similar concerns about governmental collection of personal information. When the U.K. government announced plans to issue ID cards linked to a national identity register, for example, opposition proved so fierce that the government abandoned the project. Denmark and Ireland also do not issue ID cards to their citizens.
Regardless of this opposition, the European Commission believes that e-IDs will facilitate business within the EU and is dedicated to removing many of the legal barriers hindering the implementation of this technology. As early as 1999, the Commission issued Directive 1999/93/EC, which provided a framework for the legal recognition of electronic signatures. And in 2012, the Commission issued its draft regulation on electronic identification and trust services for electronic transactions. The regulation set forth a mutual recognition scheme mandating that all member states recognize and accept electronic IDs issued in other member states for the purposes of accessing online services. The regulation would, for example, allow an Italian student attending a German university to pay her school fees online via the university’s German website by using her Italian e-ID.
In sum, e-IDs have the potential to simplify the lives of cardholders – but only if those issuing the cards are willing to take the appropriate security precautions and work to achieve mutual recognition of other countries’ IDs.
The CNIL’s report starts with what was the central issue in data protection throughout 2013: the U.S. PRISM program and, more generally, mass surveillance programs targeting European citizens run by foreign entities. The CNIL created a working group on the related subject of long-arm foreign statutes that allow foreign administrations to obtain personal data on French and European citizens. Such statutes have various purposes (combating money laundering, corruption, the financing of terrorism, etc.) and lead to the creation of blacklists. In addition, the CNIL addresses these subjects with the other Data Protection Agencies within the Article 29 Working Party.
Another important topic was the proposed creation in France of a centralized national register listing all consumer credit lines opened by an individual, intended to allow credit companies to verify an individual’s level of debt. Indeed, consumer credit lines are fairly easily granted in France, and some consumers accumulate credit lines beyond their payment capacity and ultimately default. The CNIL issued a negative opinion on the register, arguing that it breached the proportionality principle of the French law on data protection: since only a small minority of credit users default, the CNIL considered the collection and processing of data from all credit users to be disproportionate. The register was nevertheless approved by Parliament, but was promptly struck down by the French constitutional court in 2014, which, like the CNIL, considered that the register breached the right to privacy.
With regard to the CNIL’s auditing and sanctions in 2013, the CNIL’s priorities remained training, promoting awareness of data protection, and issuing guidance for companies. Imposing financial penalties remains the exception. Statistics on the CNIL’s auditing and sanctions activities in 2013 demonstrate this quite clearly:
5640 complaints: Complaints to the CNIL were stable in 2013. The CNIL attributes this stability to its new guidance available on its website. This guidance deals with common issues such as video surveillance and direct marketing, and helps companies to comply, thus stabilizing the number of complaints to the CNIL.
414 audits: 75% of the CNIL’s audits in 2013 were of private companies, and 25% were of public administration. Many audits occurred after a complaint was filed with the CNIL (33% of the audits), but audits were also conducted at the initiative of the CNIL (27%) or following a previous sanction to make sure that the companies were now compliant (16%). Finally, 24% of the audits were devoted to sectors chosen by the CNIL: in 2013, companies dealing with open data as well as surveys were audited, and the social services administration was also audited.
14 decisions with sanctions: This includes 7 warnings and only 7 financial penalties.
For 2014, the CNIL has identified four major topics: open data, health data, “digital death,” and prisoners’ privacy. On open data, the CNIL will audit the current legal framework and propose improvements. The CNIL itself wishes to open its (anonymized) data to the public. With regard to health data, the CNIL will investigate the privacy impact of apps and other “quantified self” tools that allow individuals to monitor their health and physical activity. The CNIL will also address “digital death,” in particular how to deal with the data of a deceased person. Finally, the CNIL will conduct audits of the penitentiary administration to verify whether prisoners’ right to privacy is respected.
In France, before implementing a whistleblowing process, a company must inform and consult with its employees’ representatives, inform its employees and notify the French Data Protection Agency (CNIL).
There are two possible ways to notify the CNIL of a whistleblowing system:
- request a formal authorization from the CNIL (this is quite burdensome and difficult to obtain), or
- opt for the standard whistleblowing authorization (AU-004).
The standard whistleblowing authorization (AU-004) was enacted by the French Data Protection Agency in 2005 in order to facilitate notifying the CNIL of whistleblowing systems. As long as the company undertakes to comply with the principles and scope of the standard authorization, it is automatically authorized to implement the whistleblowing system. As enacted in 2005, the types of wrongdoings that could be reported through a whistleblowing system under the standard authorization were quite broad. Companies were authorized to adopt whistleblowing systems for purposes of regulatory internal control requirements, to comply with French law requirements and the United States Sarbanes-Oxley Act, and to protect vital interests of the company or the physical or psychological integrity of its employees.
However, in 2010, the CNIL had to modify the scope of the wrongdoings which could be reported when using a standard whistleblowing authorization pursuant to a decision of the French Supreme Court dated December 8, 2009 (see our post of December 15th, 2010: http://privacylaw.proskauer.com/2010/12/articles/data-privacy-laws/french-data-protection-agency-restricts-the-scope-of-the-whistleblowing-procedures-multinational-companies-need-to-make-sure-they-are-compliant/). In order to comply with the French Supreme Court decision, the CNIL narrowed whistleblowing reporting under the standard authorization to the following types of wrongdoings:
- Wrongdoings covered by section 301(4) of the U.S. Sarbanes-Oxley Act; and
- Wrongdoings covered by the Japanese SOX of June 6, 2006.
The scope of the standard authorization was therefore very limited, requiring companies needing a broader scope of whistleblowing reporting to obtain a formal authorization from the CNIL and therefore to face the risk of a refusal.
From 2011 to 2013, given the scope limits of the standard authorization, the CNIL has had to process a high volume of filings for formal authorizations to implement whistleblowing systems.
Given this increased volume of requests from companies, on January 30, 2014, the CNIL decided to again modify the scope of application of the standard whistleblowing authorization (AU-004), this time to widen it.
As a consequence, companies implementing whistleblowing systems in France within the following categories can benefit from the new standard authorization:
- Discrimination and bullying at work;
- Health and safety at work; and
- Environmental protection.
In its updated standard whistleblowing authorization, the CNIL also stated its preference against anonymous whistleblowing. Anonymous whistleblowing is allowed only if:
- The facts are serious and the factual elements are sufficiently detailed; and
- The handling of the alert is subject to particular precautions, such as prior review before it is sent through the whistleblowing process.
Special thanks to Tiffany Quach, 2014 summer associate, for her assistance in preparing this post.
On June 25, 2014, the Supreme Court unanimously ruled that police must first obtain a warrant before searching the cell phones of arrested individuals, except in “exigent circumstances.” Chief Justice John Roberts authored the opinion, which held that an individual’s Fourth Amendment right to privacy outweighs the interest of law enforcement in conducting searches of cell phones without a warrant. The decision resolved a split among state and federal courts on the search incident to arrest doctrine (which permits police to search an arrested individual without a warrant) as it applies to cell phones.
The case of Riley v. California as heard before the Supreme Court combined two cases, one involving a smartphone and the other involving a flip phone. In the first case, Riley v. California, the police arrested David Leon Riley, searched his smartphone, and found photographs and videos potentially connecting him to gang activity and an earlier shooting. In the second case, United States v. Wurie, Brima Wurie was arrested for allegedly dealing drugs, and incoming calls on his flip phone helped lead the police to a house used to store drugs and guns.
Roberts wrote that neither of the two justifications for warrantless searches – protecting police officers and preventing the destruction of evidence – applies in the context of cell phones. According to the Court, the justification of protecting police officers falls flat since data on a cell phone cannot be used as a weapon. Roberts was also not persuaded by concerns that criminals could destroy evidence through remote wiping. He pointed out that police have alternatives to a warrantless search in order to prevent the destruction of evidence, including: turning the phone off, removing its battery, or placing the phone in a “Faraday bag,” an aluminum foil bag that blocks radio waves.
The Chief Justice focused on the differences between modern cell phones and other physical items found on arrested individuals to support his argument that modern cell phones “implicate privacy concerns far beyond those implicated by the search of a cigarette pack, a wallet, or a purse.” He cited modern cell phones’ huge storage capacity and how they function as “minicomputers that…could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers.” Roberts also noted that data viewed on a phone is frequently not stored on the device itself, but on remote servers, and that officers searching a phone generally do not know the location of data they are viewing.
However, Roberts maintained that exigent circumstances could still justify warrantless searches of cell phones on a case-by-case basis. Such circumstances include: preventing imminent destruction of evidence in individual cases, pursuing a fleeing suspect, and providing assistance to people who are seriously injured or are threatened with imminent injury.
Roberts’ opinion is in line with the Court’s stance in the 2012 case United States v. Jones, which held that installing a GPS device on a vehicle and using the device to track the vehicle constitutes a search under the Fourth Amendment.
Justice Samuel Alito concurred in the judgment and agreed with Roberts that the old rule should not be applied mechanically to modern cell phones. However, he made two points that diverged from Roberts’ opinion. First, he disagreed with the idea that the old rule on searches incident to arrest was primarily based on the two justifications of protecting police and preventing destruction of evidence. Second, he argued that if Congress or state legislatures pass future legislation on searching cell phones found on arrested individuals, the Court should defer to their judgment.
The Riley opinion recognizes the unique role that cell phones play in modern life (“such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude that they were an important feature of the human anatomy”) and that they “hold for many Americans ‘the privacies of life.’”
On July 2, 2014, Singapore’s new Personal Data Protection Act (the “PDPA” or the “Act”) will go into force, requiring companies with a physical presence in Singapore to comply with many new data protection obligations. Fortunately, in advance of the Act’s effective date, Singapore’s Personal Data Protection Commission recently promulgated the Personal Data Protection Regulations 2014 (the “Regulations”) to clarify companies’ obligations under the Act.
Under the PDPA, an individual may request access to, and correction of, the personal data that an organization subject to the Act holds about that individual. The Regulations clarify that the request must be made in writing and must include sufficient identifying information for the organization to process it. The Regulations also specify that the request for access or correction should be made to the company’s Data Protection Officer (which companies are now required to appoint under the Act). Under the Regulations, an organization must respond to a request for access to personal data “as soon as practicable,” but if it anticipates that doing so will take longer than 30 days, it must so inform the individual within that 30-day period.
The Regulations confirm that individuals are entitled to expansive access rights under the Act: a company must provide them with access to all personal data requested, as well as “use and disclosure information in documentary form.” If that is not possible, however, the organization can provide the applicant with a “reasonable opportunity to examine the personal data and use and disclosure information.”
Perhaps in an effort to reduce the burden and expense of complying with access requests, the Regulations provide that an organization may charge a “reasonable fee” to respond to an individual’s request for access to the personal data it holds about that individual, provided the organization has previously communicated an estimate of the fee to the applicant.
The Regulations also contain a number of details regarding the transfer of personal data outside Singapore. Specifically, the Regulations clarify that before transferring personal data to another jurisdiction, the transferring organization in Singapore must ensure that the recipient is “legally bound by enforceable obligations… to provide to the transferred personal data a standard of protection that is at least comparable to the protection under the Act.”
“Enforceable obligations” under the PDPA are similar to those under European Union law, and include the existence of a comparable data protection law, a written contract that provides for sufficient protections, and “binding corporate rules.”
The Regulations (together with recently issued Advisory Guidelines On Key Concepts In The Personal Data Protection Act (revised on 16 May 2014)) now provide much needed guidance in helping companies comply with their new data protection obligations under the Act.
After a decision denying class certification last week, claims by Hulu users that their personal information was improperly disclosed to Facebook are limited to the individual named plaintiffs (at least for now, as the decision was without prejudice).
The plaintiffs alleged Hulu violated the federal Video Privacy Protection Act by configuring its website to include a Facebook “like” button. This functionality used cookies that disclosed users’ information to Facebook. But the U.S. District Court for the Northern District of California credited expert evidence presented by Hulu that three things could stop the cookies from transmitting information: 1) if the Facebook “keep me logged in” feature was not activated; 2) if the user manually cleared cookies after his or her Facebook and Hulu sessions; or 3) if the user used cookie-blocking or ad-blocking software.
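The expert evidence the court credited can be restated as simple boolean logic. This sketch (the function and parameter names are my own, not from the opinion) shows why any one of the three user-side conditions defeats transmission – which is exactly what makes class membership turn on each individual’s computer habits:

```python
def cookie_would_transmit(keep_me_logged_in: bool,
                          cleared_cookies: bool,
                          used_blocking_software: bool) -> bool:
    """Per Hulu's expert evidence as credited by the court: the cookies
    sent user information to Facebook only if the user stayed logged in
    to Facebook AND did not clear cookies AND ran no blocking software."""
    return keep_me_logged_in and not cleared_cookies and not used_blocking_software

# Any single defensive habit stops the disclosure:
assert cookie_would_transmit(True, False, False) is True    # all conditions met
assert cookie_would_transmit(False, False, False) is False  # not logged in
assert cookie_would_transmit(True, True, False) is False    # cleared cookies
assert cookie_would_transmit(True, False, True) is False    # ran a blocker
```

Because no objective record shows which of these habits any given Hulu user followed, identifying who actually had information disclosed requires exactly the kind of individualized, memory-based inquiry the court found insufficient to ascertain a class.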
In its decision, the court ruled that these methods of stopping disclosure of information rendered the proposed class insufficiently ascertainable. To maintain a class action, a class must be sufficiently ascertainable by reference to objective criteria. Plaintiffs argued that the class membership could be ascertained by submission of affidavits from each class member, but the court reasoned that individual subjective memories of computer usage may not be reliable, particularly given the availability of $2,500 statutory damages per person under the Video Privacy Protection Act. Thus, the court found plaintiffs had not defined an ascertainable class.
Additionally, because an individual inquiry into whether each member of the putative class used any one of the methods to block disclosure of information would be required, the court found individual issues predominated and would preclude certification of the class as defined. Hulu also argued the total potential damages were out of proportion to any actual damages suffered and thus violated due process, but the court noted that while the argument might have some merit, the issue was not ripe given the denial of class certification.
While the case, In re Hulu Privacy Litigation, can still continue on behalf of the named plaintiffs, the court left open the possibility of proceeding with subclasses or a narrower class definition. Consequently, additional class certification practice is likely in this closely watched case.
Last month, a federal district court in the Northern District of California issued an order that may affect the policies of any company that records telephone conversations with consumers.
The trouble began when plaintiff John Lofton began receiving calls from Collecto, Verizon’s third-party collections agency, on his cell phone. The calls were made in error – Lofton did not owe Verizon any money because he wasn’t even a Verizon customer – but Lofton decided to take action when he discovered that Collecto had been recording its conversations with him without prior notice. Lofton brought a class action against Verizon under California’s Invasion of Privacy Act, theorizing that Verizon was vicariously responsible for Collecto’s actions because Collecto was Verizon’s third-party vendor and because Verizon’s call-monitoring disclosure policy did not require the disclosure of recordings in certain situations. Verizon filed a motion to dismiss, arguing that the recordings did not invade Lofton’s privacy and therefore did not run afoul of the statute.
The court denied the motion to dismiss, holding that the statutory language of § 632.7 of the Invasion of Privacy Act banned the recording of all calls made to cell phones – not just confidential or private calls made to cell phones – without prior notice. The statute’s treatment of cell phones thus diverges from its treatment of landlines, as recordings of calls made to landlines only have to be disclosed via prior notice if the call is “confidential.”
Though the case is ongoing, this ruling indicates that Lofton v. Verizon Wireless (VAW) LLC ultimately may have a significant impact on how companies interact with consumers over the phone. First, the prevalence of cell phones means that companies should assume that § 632.7 applies to a large percentage of their calls with consumers – not only because it is highly likely that these consumers use cell phones instead of landlines, but because it may be difficult for a company to tell whether these consumers are in California and subject to § 632.7. Second, this recent order indicates that companies may be held responsible for their third-party vendors’ lack of disclosure, meaning that companies should change their policies to require their third-party vendors to refrain from recording phone conversations without prior notice, and also monitor the vendors for compliance with this requirement. In sum, when it comes to providing prior notice of recordings to consumers, companies shouldn’t phone it in – they should ensure they and any third-party vendors err on the side of disclosure to avoid legal hangups down the line.
More decisions are to be expected (see the AEPD’s press release: http://www.agpd.es/portalwebAGPD/revista_prensa/revista_prensa/2013/notas_prensa/common/diciembre/131219_PR_AEPD_PRI_POL_GOOGLE.pdf).
This was the first issue that the CNIL had to decide. The territorial scope of French law derives from the rules set out in EC Directive n°95/46. Hence, French law is applicable either because 1) the data controller carries out its activity within an establishment in France, or 2) the data controller is established neither in France nor elsewhere in the EU, but uses “means of processing” of personal data located in France to collect data.
Google claimed that the French law did not apply because Google Inc. in California is solely responsible for data collection and processing, and that Google France is not involved in any activity related to the data processing performed by Google Inc.
The CNIL rejects this argument, reasoning that Google France is involved in the sale of targeted advertising, whose value is based on the data collected from Internet users. Hence, Google France is involved in the activity of personal data processing, even though it does not itself perform the technical processing of personal data. The CNIL’s argument is similar to the one developed by the Advocate General in the case currently opposing Google and the Spanish DPA before the European Court of Justice (“ECJ”) (http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=198456/). The ruling of the ECJ on this issue is eagerly awaited.
In addition, the CNIL ruled that Google Inc. placed cookies on the computers of French users, and that such cookies were “means of processing” of personal data located in France because they are used to collect data from the users’ computers. Therefore, even if Google Inc. were to be considered as the sole data controller, French law would nevertheless apply because of the location of the cookies in France.
Are all data collected by Google “personal data” within the meaning of French and EU Law?
One of the main issues is the distinction Google draws between “authenticated users,” who have registered an ID to use services such as Gmail; “unauthenticated users,” who use services that do not require identification, such as YouTube; and “passive users,” who visit third-party websites where Google has placed Analytics cookies for targeted advertising.
According to Google, it holds “personal data” only on “authenticated users,” not on “unauthenticated users” or “passive users.” The CNIL rejects this argument because the definition of personal data under French law includes information that indirectly identifies a person. The CNIL considers that, even if the user’s name is not collected, the collection of an IP address combined with precise and detailed information on the computer’s browsing history amounts to indirectly identifying a person, because it yields precise information about a person’s interests, daily life, lifestyle choices, etc.
Therefore, the CNIL considers all data collected by Google to be personal data.
The CNIL, following the findings of the Article 29 Working Party, found four breaches of French law on data protection.
Secondly, Google should have informed users and obtained their consent before placing advertising cookies on their terminals. Consent for cookies need not be opt-in, but before the cookies are placed on the terminal the user must be properly informed of their purposes and of how to refuse them. The CNIL found that, with regard to unauthenticated users, Google placed cookies before providing any such information, in breach of French data protection law. In addition, the information provided to users is insufficient. Only two Google services (Search and YouTube) display a banner with information on cookies. Moreover, little information is given regarding the purposes of the cookies: stating that cookies are meant “to ensure proper performance of the services” is not deemed sufficient to obtain “informed consent” from the user. With regard to “passive users” who visit third-party websites where Google has placed its Analytics cookies, the CNIL considers that, since Google uses the data collected for its own activity (producing statistics and improving its service), it acts as a data controller and is responsible for obtaining consent.
Thirdly, Google has not defined how long it retains the data it collects and has not implemented any automatic process for deleting data. For example, no information is available as to how long data is kept once an authenticated user has canceled his or her account.
Based on a December 3rd decision by the Second Circuit Court of Appeals, class actions under the Telephone Consumer Protection Act (TCPA) can now be brought in New York federal court. This decision marks a reversal of Second Circuit precedent, and will likely increase the number of TCPA class actions being filed in New York. Companies should review their telemarketing practices and procedures in light of the potential statutory penalties under the TCPA.
The Better Business Bureau (“BBB”) and the Direct Marketing Association (“DMA”) are in charge of enforcing the ad industry’s Self-Regulatory Principles for Online Behavioral Advertising (“OBA Principles”), which regulate the online behavioral advertising activities of both advertisers and publishers (that is, websites on which behaviorally-targeted ads are displayed or from which user data is collected and used to target ads elsewhere). Among other things, the OBA Principles provide consumers with transparency about the collection and use of their Internet usage data for behavioral advertising purposes. Specifically, the “Transparency Principle” requires links to informational disclosures on both: (i) online behaviorally-targeted advertisements themselves, and (ii) webpages that display behaviorally-targeted ads or that collect data for use by non-affiliated third parties for behavioral advertising purposes. The “Consumer Control Principle” requires that consumers be given a means to opt out of behavioral advertising.
Through its “Online Interest-Based Advertising Accountability Program”, the BBB recently enforced the OBA Principles in a series of actions—some with implications for publishers and some with implications for advertisers.
Last month, the BBB admonished publishers to heed these self-regulatory requirements when it issued its first Compliance Warning. The BBB encouraged the use of the AdChoices Icon, endorsed by the Digital Advertising Alliance (“DAA”), on any page from which a website operator allows a third party to collect user data for behavioral advertising or transfers such data to a third party for such purposes. This approach gives consumers “just-in-time” notice and enables them to decide whether to participate in behavioral advertising.
Recent inquiries by the BBB into the behavioral advertising activities of BMW of North America, LLC and Scottrade, Inc. illustrate the requirements of the OBA Principles on Web site publishers. In these inquiries, the BBB found that both companies had failed to provide enhanced notice and opt-out links on webpages where they allowed third-party data collection for behavioral advertising purposes. Both companies quickly achieved compliance in response.
The BBB warned that publishers that do not comply with the first-party transparency requirements for third-party data collection, but are otherwise in compliance with the OBA Principles, will face enforcement action by the BBB beginning on January 1, 2014.
The Transparency Principle requires third parties (e.g., advertisers and ad service providers) that engage in behavioral advertising on non-affiliated websites to provide web users with enhanced notice about behavioral advertising through a similar hyperlinked disclosure, preferably containing the AdChoices Icon, in or near the behaviorally-targeted advertisement itself. The disclosure should contain information about the types of data collected for behavioral advertising purposes, the use of such data, and (under the Consumer Control Principle) the choices available to users with respect to behavioral advertising. This requirement is designed to notify consumers that the ad they are viewing is based on interests inferred from a previous website visit.
Recently, the BBB found three companies in violation of the OBA Principles and, on November 20th, released decisions resolving its inquiries into the violations. In one inquiry, the BBB visited a company’s website and, after leaving that site to browse other websites, found ads from the original company on the non-affiliated sites. However, the ads did not include enhanced notice, which, the BBB said, violated the third-party transparency requirements of the OBA Principles. According to the BBB, the party responsible for this omission was 3Q Digital, an ad agency. The agency had used the self-service features of a demand-side platform to serve targeted ads. The BBB found that when 3Q Digital used the self-service platform it stepped “into the shoes of companies explicitly covered by the OBA Principles” and assumed the compliance responsibilities of covered companies.
In response, 3Q Digital’s client included the AdChoices Icon in all its interest-based ads. 3Q Digital, in turn, took similar steps with all its other online campaigns.
The BBB’s recent enforcement activities emphasize the need for companies (whether they advertise online, display third-party ads on their online properties, or simply contribute user data for use by others for behavioral advertising) to be vigilant about the specific requirements of the OBA Principles that apply to them based on how they participate in behavioral advertising. One motivation for the OBA Principles is to show regulators that legislation in this area is not necessary. To ward off legislation and avoid enforcement action by the BBB and the DMA, all parties involved should be mindful of the OBA Principles and make them a part of their compliance programs.
An article published by Law360 last week quoted Jeremy Mittman, co-Chair of Proskauer’s International Privacy Group and a member of the firm’s International Labor Group, on the data protection reform legislation recently passed by the European Parliament and the difficulties multinational companies face in complying with both EU and U.S. privacy laws.
Jeremy was also asked to comment on the EU-U.S. Safe Harbor Program in an article published by Politico on November 7. The article mentions Jeremy’s experience drafting Safe Harbor certifications and EU model contracts.
The territorial scope of the current EU Directive n° 95/46 remains in dispute before both national courts and the European Court of Justice (ECJ). The issue may soon become moot with the adoption of the future data protection regulation, which may modify and expand the territorial scope of EU data privacy law, especially following the recent vote of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs. The following is meant to clarify the current state of affairs regarding the territorial (and extraterritorial) scope of the future EU law following that vote.
As the internet has made it easy for companies to provide services from a distance, the question of which laws apply to personal data has become more complex. This was not fully anticipated when the current EU Directive on personal data protection was adopted in 1995. Modifications to the rules on territorial scope set by Article 4 of the current EU Directive have been a highly debated issue in the EU.
An ongoing case before the ECJ highlights this complexity, and the legal uncertainty, surrounding the territorial scope of the current EU Directive. In this case, a Spanish citizen lodged a complaint against Google Spain and Google Inc. before the Spanish Data Protection Agency (“AEPD”) because Google refused to take down data that appeared when his name was entered in the search engine. As a defense, Google argued that Spanish law was not applicable because the processing of personal data relating to its search engine does not take place in Spain, as Google Spain acts merely as a commercial representative: the technical data processing takes place in California. According to Article 4.1 (a) of the EU Directive, national law is applicable if “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State.” The ECJ will therefore have to determine whether Google Spain, “in the context of its activities,” may be considered as processing data, even though, as a commercial subsidiary, it does not technically process personal data.
The Advocate General has given a positive answer to that question in a non-binding Opinion delivered last summer. In the Opinion, he argues that since the business model of search engines relies on targeted advertising, the local establishment in charge of marketing such targeted advertising to the inhabitants of a particular country must be considered as processing personal data “in the context of its activities,” even though the technical operations are not performed there. The ECJ is expected to render its decision at the end of this year.
In the near future, the applicable law in such a situation may more easily be determined based on the draft Regulation proposed by the European Parliament.
- First, the European Parliament has proposed that Article 3.1 of the draft Regulation be amended to clarify that “this Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union whether the processing takes place in the Union or not.” (emphasis added). If the draft Regulation is adopted as such, EU law would therefore unequivocally apply to the activities of the EU-established subsidiaries of foreign companies, regardless of the actual place of data processing.
- Second, the European Parliament proposes to amend Article 3.2 of the draft Regulation, which concerns the extraterritorial application of EU law (i.e., the situation where the data controller does not have any presence in the EU). The draft Regulation provides that EU law would nonetheless apply if the processing of data is related to the “offering of goods or services” to data subjects in the European Union. In accordance with the Article 29 Working Party, which stated in its Opinion 01/2012 that the offering of goods or services should include free services, the European Parliament has proposed amending Article 3.2 to provide that EU law would apply to any processing activity related to the offering of goods or services to data subjects in the EU, “irrespective of whether a payment of the data subject is required.”
The draft amended Regulation will now be negotiated with the European Council (the governments of the EU Member States). The European Parliament is pushing for a vote on the regulation in spring 2014. However, such a timetable is far from assured, given the general “slow track” of the proposed legislation coupled with recent pronouncements by the leaders of several EU countries suggesting a timetable closer to 2015.
The Working Document indicates that the necessary information includes:
- identification of all of the types of cookies used;
- the purpose(s) of the cookies;
- if relevant, an indication of possible cookies from third parties;
- if relevant, third party access to data collected by the cookies;
- the data retention period (i.e. the cookie expiry date); and
- typical values and other technical information.
Users must also be informed about the ways that they can accept all, some or no cookies and how to change their cookie settings in the future.
Timing: Consent must be obtained before data processing begins, i.e. on the entry page. The Working Party recommends that websites implement a consent solution in which no cookies are set to a user’s device (other than those that fall under an exception and thus do not require the user’s consent) until that user has provided consent.
Active Behavior: The Working Party indicates that valid consent must be given through a “positive action or other active behavior,” provided that the user has been fully informed that cookies will be set as a result of that action. Unfortunately, passive use of a website containing a link to additional cookie information is not likely to be sufficient. Examples provided by the Working Party include (i) clicking on a button or link, (ii) ticking a box in or close to the space where information is presented, or (iii) any other active behavior from which a website can unambiguously conclude that the user intends specific and informed consent. The Working Party also confirmed its previously issued view that browser settings may be able to deliver valid and effective consent in certain limited circumstances. Where the website operator is confident that the user has been fully informed and has actively configured his or her browser or other application to accept cookies, such a configuration would signify active behavior.
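In implementation terms, the timing and active-behavior guidance amounts to deferring every non-exempt cookie until the user takes an affirmative action. The following is a minimal sketch of that pattern in plain JavaScript; the function names and the in-memory `cookieJar` stand-in for `document.cookie` are illustrative assumptions, not part of any standard API or of the Working Party's guidance.

```javascript
// Illustrative consent gate: no non-exempt cookie is written until the
// user actively consents. `cookieJar` stands in for document.cookie.
const cookieJar = {};
const pendingCookies = []; // non-exempt cookies deferred until consent
let consentGiven = false;

function setCookie(name, value, { exempt = false } = {}) {
  if (exempt || consentGiven) {
    // Exempt cookies (e.g. strictly necessary ones) may be set immediately.
    cookieJar[name] = value;
  } else {
    // Everything else waits in the queue until the user consents.
    pendingCookies.push([name, value]);
  }
}

// Wire this only to an active behavior (e.g. a click on "Accept"),
// never to passive browsing of the site.
function recordConsent() {
  consentGiven = true;
  for (const [name, value] of pendingCookies) {
    cookieJar[name] = value;
  }
  pendingCookies.length = 0;
}
```

Under this pattern, a strictly necessary session cookie could still be set on the entry page, while analytics or advertising cookies would sit in the queue until `recordConsent()` fires from a click handler.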
Real Choice: The Working Document provides that websites should present users with a real and meaningful choice regarding cookies on the entry page. This choice should allow users to decline all or some cookies and to change their cookie settings in the future. The Working Document also clarifies that websites should not make general access to the website conditional on the acceptance of all cookies, although it notes that access to “specific content” could in some circumstances be made conditional.
Although the Working Document is a welcome source of guidance providing further clarification on this thorny issue, it is clear that compliance with the European Union’s rules governing cookie consent will continue to pose challenges for companies seeking to conform their websites accordingly.
On October 21, a key European parliamentary committee, the Committee on Civil Liberties, Justice and Home Affairs (the “Committee”), approved an amended version of the draft EU Data Protection Regulation, paving the way for further negotiations with EU governmental bodies. The goal, according to a press release by the Committee, is to reach compromise on the draft agreement and hold a vote prior to the May 2014 EU Parliamentary elections. The proposed legislation (which passed in a 51-1 vote) contains a number of key concepts, including:
Right to Erasure:
Stronger than the previously worded “Right to be Forgotten”, the proposed legislation contains a “Right to Erasure”, whereby a data subject would have the right to ask any entity holding his or her personal data to erase that data upon request. Moreover, if the personal data has been “replicated” with other entities, the data controller to whom the request has been made must forward the request to the other entities to which it has transferred the data subject’s personal data.
Increased penalties:
The Committee voted to increase the penalties that could be levied on companies that violate the rules. Whereas the previous proposal capped penalties at 1 million euros or 2% of a company’s annual worldwide turnover, the Committee ratcheted up the proposed penalties to 100 million euros or up to 5% of annual worldwide turnover, whichever is greater—a significant increase that illustrates the potentially expensive consequences of violating the data protection legislation.
Data transfers to non-EU countries:
Specifically referencing the June 2013 Snowden disclosure of mass surveillance by the U.S. government’s PRISM program, the Committee proposed that if a company in the EU was requested to disclose personal data to a government located outside the EU, the entity would need to seek specific authorization from the data protection authority located in the EU country, before transferring any such personal data outside of the EU. The new provision reflects the acute concern of the EU over the Snowden revelations of this summer.
Profiling:
The package adopted by the Committee includes a provision limiting the practice of profiling, i.e., “a practice used to analyze or predict a person’s performance at work, economic situation, location, health or behavior.” Under the proposal, individual consent (such as that provided by a contract) would be needed in order to profile, and any individual would have the right to object to such profiling.
Although the Committee hopes to reach agreement with the other EU legislative bodies (such as the national governments that compose the European Council) by May 2014, it is clear that there is still a long road ahead before the new legislation is finalized and enacted. The contours of the proposed Regulation may change after further rounds of negotiations. However, the recent proposals by the Committee help to illuminate the direction that the Regulation is heading.
On September 27, 2013, California Governor Jerry Brown signed into law an amendment to California’s breach notification law (Cal. Civ. Code § 1798.82). Effective January 1, 2014, under the amended law, the definition of “Personal Information” will be expanded to include “a user name or email address, in combination with a password or security question and answer that would permit access to an online account.” Additionally, new notification options have been added to address a breach of this type of information.
As amended, if there is a breach involving only this type of information and not the other types of information covered under the pre-amendment definition of “Personal Information,” the entity in question may provide notice to the affected person in electronic or other form. This notice must direct that person to change his or her password and security question or answer, as applicable, or to take other appropriate steps to protect the online account in question and all other accounts for which that person uses the same credentials.
Under the amended law, if the credentials breached are for a person’s email account furnished by the entity that suffered the breach, the entity in question may not provide notice to the compromised email account, but may use one of the other notification methods allowed by the law, or may comply by providing clear and conspicuous notice to that person when he is connected to the compromised online account from an IP address or online location from which the entity knows that person customarily accesses the online account in question.
It should be noted that the foregoing notification methods are options – an entity that suffers a breach covered by California’s data security laws may still provide notice under the law’s original notification provisions.
Law Targets Sites and Mobile Apps Directed to Minors, Offers “Online Eraser”
Likely to Have Nationwide Effect
On July 1st of this year, new amendments to the Children’s Online Privacy Protection Act Rule (COPPA Rule) came into effect, with perhaps the most pronounced changes being the expansion of COPPA to cover geolocation information and persistent identifiers used in behavioral advertising. Critics called the amendments jumbled and labeled them a compliance headache, while privacy advocates were buoyed but thought the changes did not go far enough to protect the online privacy of children. Still others contended that federal law contains a gap that fails to offer privacy protections for teenage users.
Once again, the California state government has stepped up to fill what it perceives to be a void in federal online privacy protection, this time to address certain restrictions on the use of information collected from minors and to give minors an online “eraser” of sorts. In late September, Gov. Brown signed S.B.568, which expanded the privacy rights for California minors in the digital world.
“Minors”, by the way, are defined under the law as residents of California under age 18 – this definition in itself is an expansion of the protections afforded to children under COPPA, which addresses the collection and use of information from children under 13. That is not the only way the new law expands on COPPA. The federal COPPA Rule is primarily concerned with mandating notice and parental consent mechanisms before qualifying sites or mobile apps can engage in certain data collection and data tracking activities with respect to children under 13. The California statute’s marketing restrictions for minors contain no parental consent procedures – rather, they restrict covered web services directed to minors from marketing certain specified categories of products and services that are illegal for individuals under 18 years of age.
As a practical matter, compliance with this law will require certain changes in the way website publishers collect and process user information. For example, it is much easier for online operators to determine whether their websites are directed to children under 13 than whether they are “directed to minors” under 18. Going forward, sites and apps will have to reevaluate their intended audience, as well as establish procedures for when a minor user self-reports his or her age, giving the site actual knowledge that a minor is using its service.
S.B.568 has two principal parts: minor marketing restrictions and the data “eraser.”
Marketing Restrictions: The new California law prohibits an operator of a website, online service or mobile app directed to minors, or one with actual knowledge that a minor is using its online site or mobile app, from marketing or advertising specified types of products or services to minors. The law also prohibits an operator from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile the personal information of a minor for the purpose of marketing or advertising specified types of products or services. Moreover, the law makes this prohibition applicable to an advertising service that is notified by a covered operator that the site, service, or application is directed to a minor. The statute lists 19 categories of prohibited content covered by the law’s marketing restrictions, including firearms, alcohol, tobacco, drug paraphernalia, vandalism tools and fireworks. Notably, the law does not require an operator to collect or retain the ages of users, and provides operators with a safe harbor for “reasonable actions in good faith” designed to avoid violations of the marketing restrictions.
Online Eraser: The second part of S.B. 568 requires operators of websites and applications that are directed to minors, or that know that a minor is using the site or application, to allow minors who are registered users to remove (or to request and obtain removal of) their own posted content. The operators must also provide notice and clear instructions to minors explaining their rights regarding removal of their own content. Notably, S.B. 568 does not require operators to completely delete the content from their servers; it only requires that the content no longer be visible to other users of the service and the public. There are certain exceptions to this “online eraser” right, such as circumstances where any other provision of federal or state law requires the operator or a third party to maintain the content, the content is stored on or posted to the operator’s site or application by a third party, the operator anonymizes the content, the minor fails to follow the instructions regarding removal, or the minor has received “compensation or other consideration for providing the content.”
Both prongs of the law raise many questions:
- How does a site or application owner determine whether it is covered by S.B.568? Under the statute, a website, online service, or mobile app “directed to minors” means an “Internet Web site, online service, online application, or mobile application, or a portion thereof, that is created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.”
- What will qualify for “reasonable actions in good faith” under the safe harbor? What are the legal ramifications of an independent online ad network serving unlawful ads to minors without the knowledge of an otherwise compliant site operator?
- How does a site implement the “eraser” function? Will it offer user tools to eliminate UGC, or will the site control the removal process via an online request form? Will a removal request necessarily cause the removal of other users’ content (e.g., social media postings of other users that comment on a removed comment or submitted photo)?
- The online eraser right seemingly applies only to minors. How should a site or app handle requests from adults wishing to remove content they posted when they were minors? Should sites simply offer the tool to all users to avoid compliance issues?
- What qualifies as “compensation or other consideration for providing the content” under the exceptions to the online eraser right? Would this include free products, coupon codes, or the right to receive exclusive ‘limited time’ offers?
- What changes are required in the site’s privacy policies?
The law will come into effect on January 1, 2015. Any company with a website that can be accessed by California residents should assess the impact of these new requirements in the coming year. Considering that most, if not all, major websites and apps necessarily have or will have California-based users, this state law may become a de facto national standard, particularly since technical controls to screen or segregate California users may be unworkable.
[Incidentally, California also recently enacted a new law addressing online tracking, so it appears that the California legislature continues its focus on web privacy].
On September 27, California Governor Jerry Brown signed a new privacy law that has significant repercussions for nearly every business in the United States that operates a commercial website or online service and collects “personally identifiable information” (which means, under the law, “individually identifiable information about an individual consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual.”) The new law goes into effect on January 1, 2014.
The analysis noted the rapid rise in online tracking of users’ web-surfing behavior, as well as the California Attorney General’s observation that although “all the major browser companies have offered Do Not Track browser headers” that, if selected, can “signal to websites an individual’s choice not to be tracked, [t]here is, however, no legal requirement for sites to honor the headers.” Because websites have been free to disregard such Do Not Track selections, consumers would not know whether their selections are honored unless a website provides them with notice. The new law will mandate providing users with that notice.
In addition to the above “do not track” notice obligations, the law also requires website and online service operators “to disclose whether other parties” collect PII regarding a consumer’s “online activities over time and across different Web sites when a consumer uses the operator’s Web site or service.”
In light of the new obligations, it is imperative that any organization that collects PII concerning California residents (whether or not that organization is based in California) assess its current Web site privacy policies to ensure that they are compliant with California’s new laws requiring additional disclosures.
On October 16, 2013, the Federal Communications Commission’s (“FCC”) new rule implementing the Telephone Consumer Protection Act (“TCPA”) will go into effect.
These are rules with teeth, as the TCPA allows recovery of anywhere between $500 and $1,500 for each improper communication and does not require a showing of actual injury. This makes the TCPA a particularly attractive vehicle for class actions. Accordingly, we highlight some of the more salient changes in the new rule below.
Currently, except in an emergency, FCC regulations require businesses to obtain “prior express consent” before making any type of call or sending any text message using autodialers or prerecorded voices to cellphones, pagers, emergency lines, or hospital room lines. Further, the regulations also require for-profit businesses to obtain prior express consent before making commercial advertisement or telemarketing calls or messages using a prerecorded voice to any residential line absent an emergency and absent an established business relationship with the person called.
Under the new rule, however, the established business relationship exemption disappears. For-profit businesses will have to acquire “prior express written consent” before making advertisement or telemarketing calls or sending text messages using autodialers or prerecorded voices to a cellphone, pager, emergency line, or hospital room line. Absent an emergency, prior express written consent will also be required for commercial advertisement or telemarketing calls or messages using a prerecorded voice made to a residential line, whether or not there is a prior business relationship with the recipient.
The written consent must be “clear and conspicuous.” Among other things, it must specify the phone number for which consent is given, state that consent is not a condition of purchase, and state that the consent encompasses future autodialed or prerecorded telemarketing messages. The written consent can be electronic – for example, through e-mail, website forms, text messages, or telephone keypad functions.
The new rule also requires prerecorded telemarketing calls to include an interactive, automated opt-out mechanism that is announced at the beginning of the message and is available at any time during the call. If a call could be answered by an answering machine or voicemail, the message must include a toll-free number the consumer can call to opt out. The existing three-percent limit on abandoned calls is also revised to apply to calls within a single calling campaign rather than all calls made across all calling campaigns. Finally, the new rule exempts HIPAA-covered entities from the requirements on prerecorded calls to residential lines.
Many businesses may already have the necessary procedures in place to comply with the new rule, as many of the new requirements, including the written consent requirement, are designed to harmonize FCC regulations with those of the Federal Trade Commission. Still, though the new rule does not go into effect until October 16, 2013, the clock is ticking.
Click here to read more in an article by Margaret Dale and David Munkittrick, members of Proskauer’s Privacy and Data Security Group.
In February of 2013, President Obama signed an executive order with the purpose of creating a cybersecurity framework (a set of voluntary standards and procedures) to encourage private companies that operate critical infrastructure to take steps to reduce their cyber risk (see our blog here). Critical infrastructure systems such as the electric grid, drinking water, and trains are considered vulnerable to cyber attack, and the results of such an attack could be debilitating. The Departments of Commerce, Homeland Security, and Treasury were tasked with preparing recommendations to incentivize private companies to comply with heightened cybersecurity standards. On August 6, 2013, the White House posted its preliminary list of incentives encouraging the adoption of cybersecurity best practices.
The draft framework of incentives is not due until October of this year, when it will be published for public comment. A final version is expected in February 2014. The August 6th post serves as an interim step that gives the private sector an opportunity to consider the recommendations and provide feedback.
In the post, Michael Daniel, Special Assistant to the President and the Cybersecurity Coordinator, lists eight ideas, summarized below.
- Cybersecurity Insurance – engage the insurance industry with the goal of creating a competitive cyber insurance market.
- Grants – make participation in the cybersecurity programs a condition or criteria for a federal critical infrastructure grant.
- Process Preference — make participation a consideration in the government’s determination of whether to expedite existing government service delivery.
- Liability Limitation — agencies are looking into whether reducing liability for participants in certain areas (such as tort liability, limited indemnity, higher burdens of proof) would encourage critical infrastructure companies to implement the framework.
- Streamline Regulations — streamline existing cybersecurity regulations and develop ways to make compliance easier, such as by eliminating overlaps in existing laws and reducing audit burdens.
- Public Recognition — agencies are exploring whether giving companies the option of public recognition for participation in the programs would work as an incentive.
- Rate Recovery for Price Regulated Industries — speaking to federal, state, and local regulators regarding whether utility rates could be set to allow for recovery for investments related to adopting the cybersecurity framework.
- Cybersecurity Research — research and development to determine where commercial solutions are possible but do not yet exist. The government can then focus on research and development to meet the most pressing cybersecurity issues.
The August 6th report offers an “initial examination” of ways to incentivize the adoption of cybersecurity measures by private companies in the critical infrastructure sector. Discussions with the industry will help determine which direction the government ultimately takes with its cybersecurity framework.