An article published by Law360 last week quoted Jeremy Mittman, co-Chair of Proskauer’s International Privacy Group and a member of the firm’s International Labor Group, on the data protection reform legislation recently passed by the European Parliament and the difficulties multinational companies face in complying with both EU and U.S. privacy laws.
Jeremy was again solicited to comment on the EU-U.S. Safe Harbor Program in an article published by Politico on November 7. The article mentions Jeremy’s experience drafting Safe Harbor certifications and EU model contracts.
The territorial scope of the current EU Directive 95/46 is still under dispute both before national courts and the European Court of Justice (ECJ). The issue may soon become moot with the adoption of the future data protection regulation, which may modify and expand the territorial scope of EU data privacy law, especially in light of the recent vote of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs. The following is meant to clarify the current state of affairs regarding the territorial (and extraterritorial) scope of the future EU law following that vote.
As the internet has allowed companies to easily provide services from a distance, the issue as to what laws are applicable to personal data has become more complex. This was not fully anticipated when the current EU Directive on personal data protection was adopted in 1995. Modifications to the rules regarding territorial scope set by Article 4 of the current EU Directive have been a highly debated issue in the EU.
An ongoing case before the ECJ highlights this complexity, and the legal uncertainty, surrounding the territorial scope of the current EU Directive. In this case, a Spanish citizen lodged a complaint against Google Spain and Google Inc. before the Spanish Data Protection Agency (“AEPD”) because Google refused to take down data that appeared when his name was entered in the search engine. As a defense, Google argued that Spanish law was not applicable because the processing of personal data relating to its search engine does not take place in Spain, as Google Spain acts merely as a commercial representative: the technical data processing takes place in California. According to Article 4.1 (a) of the EU Directive, national law is applicable if “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State.” The ECJ will therefore have to determine whether Google Spain, “in the context of its activities,” may be considered as processing data, even though, as a commercial subsidiary, it does not technically process personal data.
The Advocate General has given a positive answer to that question in a non-binding Opinion delivered last summer. In the Opinion, he argues that since the business model of search engines relies on targeted advertising, the local establishment in charge of marketing such targeted advertising to the inhabitants of a particular country must be considered as processing personal data “in the context of its activities,” even though the technical operations are not performed there. The ECJ is expected to render its decision at the end of this year.
In the near future, the applicable law in such a situation may more easily be determined based on the draft Regulation proposed by the European Parliament.
- First, the European Parliament has proposed that Article 3.1 of the draft Regulation be amended to clarify that “this Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union whether the processing takes place in the Union or not.” (emphasis added). If the draft Regulation is adopted as such, EU law would unequivocally apply to the activities of EU-established subsidiaries of foreign companies, regardless of the actual place of data processing.
- Second, the European Parliament proposes to amend Article 3.2 of the draft Regulation, which concerns the extraterritorial application of EU law (i.e., the situation where the data controller does not have any presence in the EU). The draft Regulation provides that EU law would nonetheless apply if the processing of data is related to the “offering of goods or services” to data subjects in the European Union. In line with the Article 29 Working Party, which stated in its Opinion 01/2012 that the offering of goods or services should include free services, the European Parliament has proposed amending Article 3.2 to provide that EU law would apply to any processing activity related to the offering of goods or services to data subjects in the EU, “irrespective of whether a payment of the data subject is required.”
The draft amended regulation will now be negotiated with the European Council (the governments of the EU Member States). The European Parliament is pushing for a vote on the regulation in spring 2014. However, such a timetable is far from assured, given the general “slow track” of the proposed legislation coupled with recent pronouncements by the leaders of several EU countries suggesting a timetable closer to 2015.
The Working Document indicates that the necessary information includes:
- identification of all of the types of cookies used;
- the purpose(s) of the cookies;
- if relevant, an indication of possible cookies from third parties;
- if relevant, third party access to data collected by the cookies;
- the data retention period (i.e. the cookie expiry date); and
- typical values and other technical information.
Users must also be informed about the ways that they can accept all, some or no cookies and how to change their cookie settings in the future.
Timing: Consent must be obtained before data processing begins, i.e. on the entry page. The Working Party recommends that websites implement a consent solution in which no cookies are set on a user’s device (other than those that fall under an exception and thus do not require the user’s consent) until that user has provided consent.
Active Behavior: The Working Party indicates that valid consent must be given through a “positive action or other active behavior”, provided that the user has been fully informed that cookies will be set as a result of that action. Unfortunately, the passive use of a website containing a link to additional cookie information is not likely to be sufficient. Examples provided by the Working Party include (i) clicking on a button or link, (ii) ticking a box in or close to the space where information is presented or (iii) any other active behavior from which a website can unambiguously conclude that the user intends specific and informed consent. The Working Party also confirmed its previously issued view that browser settings may be able to deliver valid and effective consent in certain limited circumstances: where the website operator is confident that the user has been fully informed and has actively configured the browser or other application to accept cookies, such a configuration would signify active behavior.
Real Choice: The Working Document provides that websites should present users with a real and meaningful choice regarding cookies on the entry page. This choice should allow users to decline all or some cookies and to change their cookie settings in the future. The Working Document also clarifies that websites should not make general access to the website conditional on the acceptance of all cookies, although it notes that access to “specific content” could in some circumstances be conditional.
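As a rough illustration of the consent flow the Working Party describes, a site might gate every non-exempt cookie behind an affirmative consent signal. The sketch below is a minimal, hypothetical Python helper; the cookie names and exemption list are invented for illustration and are not drawn from the Working Document:

```python
# Cookies that fall under a consent exemption (e.g. strictly necessary
# session cookies) may be set before the user consents; everything else
# must wait. These names are hypothetical examples.
EXEMPT_COOKIES = {"session_id", "csrf_token"}

def cookies_to_set(requested, consent_given):
    """Return only the cookies that may be placed on the user's device,
    given whether the user has actively consented (e.g. by clicking an
    accept button -- an "active behavior" in the Working Party's terms)."""
    if consent_given:
        return dict(requested)
    return {name: value for name, value in requested.items()
            if name in EXEMPT_COOKIES}
```

Before consent, only the exempt cookies survive the filter; once the user performs an active, informed action, the full set may be placed.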
Although the Working Document is a welcome source of guidance providing further clarification on this thorny issue, it is clear that compliance with the European Union’s rules governing cookie consent will continue to provide challenges to companies seeking to conform their websites accordingly.
On October 21, a key European parliamentary committee, the Committee on Civil Liberties, Justice and Home Affairs (the “Committee”), approved an amended version of the draft EU Data Protection Regulation, paving the way for further negotiations with EU governmental bodies. The goal, according to a press release by the Committee, is to reach a compromise on the draft agreement and a vote prior to the May 2014 EU Parliamentary elections. The proposed legislation (which passed in a 51-1 vote) contains a number of key concepts, including:
Right to Erasure:
Stronger than the previously worded “Right to be Forgotten”, the proposed legislation contains a “Right to Erasure”, whereby a data subject would have the right to ask any entity holding his or her personal data to erase that data upon request. Moreover, if the personal data has been “replicated” with other entities, the data controller to whom the request is made must forward the request to the other entities to which it has transferred the data subject’s personal data.
Increased Penalties:

The Committee voted to increase the penalties that could be levied on companies that violate the rules. Whereas the original proposal capped penalties at 1 million euros or 2% of a company’s annual worldwide turnover, the Committee ratcheted up the proposed penalties to 100 million euros or up to 5% of annual worldwide revenue, whichever is greater, a significant increase that illustrates the potentially expensive consequences of violating the data protection legislation.
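In concrete terms, the proposed ceiling is a simple maximum. The snippet below merely illustrates the arithmetic in the Committee's text; the revenue figures in the usage note are hypothetical:

```python
def max_fine_eur(annual_worldwide_revenue_eur):
    """Proposed penalty ceiling under the Committee's amendments:
    100 million euros or 5% of annual worldwide revenue,
    whichever is greater."""
    return max(100_000_000, 0.05 * annual_worldwide_revenue_eur)
```

For a company with 10 billion euros in annual worldwide revenue the ceiling would be 500 million euros; for revenues below 2 billion euros, the flat 100-million-euro figure governs.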
Data transfers to non-EU countries:
Specifically referencing the June 2013 Snowden disclosure of mass surveillance by the U.S. government’s PRISM program, the Committee proposed that if a company in the EU were requested to disclose personal data to a government located outside the EU, the entity would need to seek specific authorization from the data protection authority in its EU country before transferring any such personal data outside of the EU. The new provision reflects the acute concern of the EU over the Snowden revelations of this summer.
Profiling:

The package adopted by the Committee includes a provision limiting the practice of profiling, i.e. “a practice used to analyze or predict a person’s performance at work, economic situation, location, health or behavior.” Under the proposal, an individual’s consent (such as that provided in a contract) would be needed in order to profile, and any individual would have the right to object to such profiling.
Although the Committee hopes to reach agreement with the other EU legislative bodies (such as the national governments that compose the European Council) by May 2014, it is clear that there is still a long road ahead before the new legislation is finalized and enacted. The contours of the proposed Regulation may change after further rounds of negotiations. However, the recent proposals by the Committee help to illuminate the direction that the Regulation is heading.
On September 27, 2013, California Governor Jerry Brown signed into law an amendment to California’s breach notification law (Cal. Civ. Code § 1798.82). Effective January 1, 2014, under the amended law, the definition of “Personal Information” will be expanded to include “a user name or email address, in combination with a password or security question and answer that would permit access to an online account.” Additionally, new notification options have been added to address a breach of this type of information.
As amended, if there is a breach involving only this type of information and not the other types of information covered under the pre-amendment definition of “Personal Information,” the entity in question may provide notice to the affected person in electronic or other form. This notice must direct that person to change his or her password and security question or answer, as applicable, or to take other appropriate steps to protect the online account in question and all other accounts for which that person uses the same credentials.
Under the amended law, if the credentials breached are for a person’s email account furnished by the entity that suffered the breach, the entity may not provide notice to that compromised email address; instead, it may use one of the other notification methods allowed by the law, or may comply by providing clear and conspicuous notice to that person when he or she is connected to the compromised online account from an IP address or online location from which the entity knows that person customarily accesses the account.
It should be noted that the foregoing notification methods are options – an entity that suffers a breach covered by California’s data security laws may still provide notice under the law’s original notification provisions.
Law Targets Sites and Mobile Apps Directed to Minors, Offers “Online Eraser”
Likely to Have Nationwide Effect
On July 1st of this year, new amendments to the Children’s Online Privacy Protection Act Rule (COPPA Rule) came into effect, with perhaps the most pronounced changes being the expansion of COPPA to apply to geolocation information and persistent identifiers used in behavioral advertising. Critics called the amendments jumbled and labeled them a compliance headache, while privacy advocates were buoyed but thought the changes did not go far enough to protect the online privacy of children. Still others contended that federal law contains a gap that fails to offer privacy protections for teenage users.
Once again, the California state government has stepped up to fill what it perceives to be a void in federal online privacy protection, this time to address certain restrictions on the use of information collected from minors and to give minors an online “eraser” of sorts. In late September, Gov. Brown signed S.B.568, which expanded the privacy rights for California minors in the digital world.
“Minors”, by the way, are defined under the law as residents of California under age 18 – this definition is itself an expansion of the protections afforded to children under COPPA, which addresses the collection and use of information from children under 13. That is not the only expansion of COPPA presented by this new law. The federal COPPA Rule is primarily concerned with mandating notice and parental consent mechanisms before qualifying sites or mobile apps can engage in certain data collection and data tracking activities with respect to children under 13. The California statute’s marketing restrictions for minors contain no parental consent procedures; rather, the statute imposes restrictions on covered web services directed to minors with respect to certain specified categories of products and activities that are illegal for individuals under 18 years of age.
As a practical matter, compliance with this law will require certain changes in the way website publishers collect and process user information. For example, it is much easier for online operators to determine whether their websites are directed to children under 13 than whether they are “directed to minors” under 18. Going forward, sites and apps will have to reevaluate their intended audience, as well as establish procedures for when a minor user self-reports his or her age, giving the site actual knowledge that a minor is using its service.
S.B.568 has two principal parts: minor marketing restrictions and the data “eraser.”
Marketing Restrictions: The new California law prohibits an operator of a website, online service or mobile app directed to minors, or one with actual knowledge that a minor is using its online site or mobile app, from marketing or advertising specified types of products or services to minors. The law also prohibits an operator from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile the personal information of a minor for the purpose of marketing or advertising specified types of products or services. Moreover, the law makes this prohibition applicable to an advertising service that is notified by a covered operator that the site, service, or application is directed to a minor. The statute lists 19 categories of prohibited content covered by the law’s marketing restrictions, including firearms, alcohol, tobacco, drug paraphernalia, vandalism tools and fireworks. Notably, the law does not require an operator to collect or retain the ages of users, and it provides operators with a safe harbor for “reasonable actions in good faith” designed to avoid violations of the marketing restrictions.
Online Eraser: The second part of S.B. 568 requires operators of websites and applications that are directed to minors, or that know that a minor is using the site or application, to allow minors who are registered users to remove (or to request and obtain removal of) their own posted content. The operators must also provide notice and clear instructions to minors explaining their rights regarding removal of their own content. Notably, S.B. 568 does not require operators to completely delete the content from their servers; it only requires that the content no longer be visible to other users of the service and the public. There are certain exceptions to this “online eraser” right, such as circumstances where any other provision of federal or state law requires the operator or third party to maintain the content, the content is stored on or posted to the operator’s site or application by a third party, the operator anonymizes the content, the minor fails to follow the instructions regarding removal, or the minor has received “compensation or other consideration for providing the content.”
Both prongs of the law raise many questions:
- How does a site or application owner determine whether it is covered by S.B.568? Under the statute, a website, online service, or mobile app “directed to minors” means an “Internet Web site, online service, online application, or mobile application, or a portion thereof, that is created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.”
- What will qualify for “reasonable actions in good faith” under the safe harbor? What are the legal ramifications of an independent online ad network serving unlawful ads to minors without the knowledge of an otherwise compliant site operator?
- How does a site implement the “eraser” function? Will the site provide user tools to eliminate UGC, or will it control the removal process via an online request form? Will a removal request necessarily cause the removal of other users’ content (e.g. social media postings of other users that comment on a removed comment or submitted photo)?
- The online eraser right seemingly applies only to minors. How should a site or app handle requests from adults wishing to remove content they posted when they were minors? Should sites simply offer the tool to all users to avoid compliance issues?
- What qualifies as “compensation or other consideration for providing the content” under the exceptions to the online eraser right? Would this include free products, coupon codes, or the right to receive exclusive ‘limited time’ offers?
- What changes are required in the site’s privacy policies?
The law will come into effect on January 1, 2015. Any company with a website that can be accessed by California residents should assess the impact of these new requirements in the coming year. Considering that most, if not all, major websites and apps necessarily have or will have California-based users, this state law may become a de facto national standard, particularly since technical controls to screen or segregate California users may be unworkable.
[Incidentally, California also recently enacted a new law addressing online tracking, so it appears that the California legislature continues its focus on web privacy].
On September 27, California Governor Jerry Brown signed a new privacy law that has significant repercussions for nearly every business in the United States that operates a commercial website or online service and collects “personally identifiable information” (which means, under the law, “individually identifiable information about an individual consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual.”) The new law goes into effect on January 1, 2014.
The analysis noted the rapid rise in online tracking of users’ web-surfing behavior, as well as the California Attorney General’s observation that although “all the major browser companies have offered Do Not Track browser headers” that, if selected, can “signal to websites an individual’s choice not to be tracked, [t]here is, however, no legal requirement for sites to honor the headers.” Because websites have been free to disregard such Do Not Track selections, consumers have had no way of knowing whether their selection is honored unless the website tells them. The new law will mandate providing users with the requisite notice.
In addition to the above “do not track” notice obligations, the law also requires website and online service operators “to disclose whether other parties” collect PII regarding a consumer’s “online activities over time and across different Web sites when a consumer uses the operator’s Web site or service.”
In light of the new obligations, it is imperative that any organization that collects PII concerning California residents (whether or not that organization is based in California) assess its current Web site privacy policies to ensure that they are compliant with California’s new laws requiring additional disclosures.
On October 16, 2013, the Federal Communications Commission’s (“FCC”) new rule implementing the Telephone Consumer Protection Act (“TCPA”) will go into effect.
These are rules with teeth, as the TCPA allows recovery of anywhere between $500 and $1,500 for each improper communication and does not require a showing of actual injury. This makes the TCPA a particularly attractive vehicle for class actions. Accordingly, we highlight some of the more salient changes in the new rule below.
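The statutory damages math explains the class-action appeal. As a back-of-the-envelope sketch (the campaign size in the usage note is a hypothetical figure, not from any case):

```python
def tcpa_exposure_range(num_communications):
    """Statutory damages under the TCPA run from $500 to $1,500 per
    improper call or text, with no showing of actual injury required.
    Returns the (low, high) aggregate exposure in dollars."""
    return (num_communications * 500, num_communications * 1500)
```

A campaign of 100,000 non-compliant texts would imply aggregate exposure between $50 million and $150 million, before any defense costs.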
Currently, except in an emergency, FCC regulations require businesses to obtain “prior express consent” before making any type of call or sending any text message using autodialers or prerecorded voices to cellphones, pagers, emergency lines, or hospital room lines. The regulations also require for-profit businesses to obtain prior express consent before making commercial advertisement or telemarketing calls or messages using a prerecorded voice to any residential line, absent an emergency and absent an established business relationship with the person called.
Under the new rule, however, the established business relationship exemption disappears. For-profit businesses will have to acquire “prior express written consent” before making advertisement or telemarketing calls or sending text messages using autodialers or prerecorded voices to a cellphone, pager, emergency line, or hospital room line. Absent an emergency, prior express written consent will also be required for commercial advertisement or telemarketing calls or messages using a prerecorded voice made to a residential line, whether or not there is a prior business relationship with the recipient.
The written consent must be “clear and conspicuous.” Among other things, it must specify the phone number for which consent is given, state that consent is not a condition of purchase, and state that the consent encompasses future autodialed or prerecorded telemarketing messages. The written consent can be electronic – for example, through e-mail, website forms, text messages, or telephone keypad functions.
The new rule also requires prerecorded telemarketing calls to include an interactive, automated opt-out mechanism that is announced at the beginning of the message and is available at any time during the call. If a call could be answered by an answering machine or voicemail, the message must include a toll-free number the consumer can call to opt out. The existing three-percent limit on abandoned calls is also revised to apply to calls within a single calling campaign rather than all calls made across all calling campaigns. Finally, the new rule exempts HIPAA-covered entities from the requirements on prerecorded calls to residential lines.
Many businesses may already have the necessary procedures in place to comply with the new rule, as many of the new requirements, including the written consent requirement, are designed to harmonize FCC regulations with those of the Federal Trade Commission. Still, though the new rule does not go into effect until October 16, 2013, the clock is ticking.
Click here to read more in an article by Margaret Dale and David Munkittrick, members of Proskauer’s Privacy and Data Security Group.
In February 2013, President Obama signed an executive order aimed at creating a cybersecurity framework (a set of voluntary standards and procedures) to encourage private companies that operate critical infrastructure to take steps to reduce their cyber risk (see our blog here). Critical infrastructure systems such as the electric grid, drinking water, and trains are considered vulnerable to cyber attack, and the results of such an attack could be debilitating. The Departments of Commerce, Homeland Security, and Treasury were tasked with preparing recommendations to incentivize private companies to comply with heightened cybersecurity standards. On August 6, 2013, the White House posted its preliminary list of incentives encouraging the adoption of cybersecurity best practices.
The draft framework of incentives is not due until October of this year, when it will be published for public comment. A final version is expected in February 2014. The August 6th post serves as an interim step that gives the private sector an opportunity to consider the recommendations and provide feedback.
In the post, Michael Daniel, Special Assistant to the President and the Cybersecurity Coordinator, lists eight ideas, summarized below.
- Cybersecurity Insurance – engage the insurance industry with the goal of creating a competitive cyber insurance market.
- Grants – make participation in the cybersecurity programs a condition or criterion for a federal critical infrastructure grant.
- Process Preference – make participation a consideration in the government’s determination of whether to expedite existing government service delivery.
- Liability Limitation – agencies are looking into whether reducing liability for participants in certain areas (such as tort liability, limited indemnity, higher burdens of proof) would encourage critical infrastructure companies to implement the framework.
- Streamline Regulations – streamline existing cybersecurity regulations and develop ways to make compliance easier, such as by eliminating overlaps in existing laws and reducing audit burdens.
- Public Recognition – agencies are exploring whether giving companies the option of public recognition for participation in the programs would work as an incentive.
- Rate Recovery for Price-Regulated Industries – speaking with federal, state, and local regulators about whether utility rates could be set to allow recovery of investments related to adopting the cybersecurity framework.
- Cybersecurity Research – research and development to determine where commercial solutions are possible but do not yet exist, so that the government can focus its research and development on the most pressing cybersecurity issues.
The August 6th report offers an “initial examination” of ways to incentivize the adoption of cybersecurity measures by private companies in the critical infrastructure sector. Discussions with the industry will help determine which direction the government ultimately takes with its cybersecurity framework.
We have heard the well-publicized stories of stolen laptops and resulting violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), and we generally recognize the inherent security risks and potential for breach of unsecured electronic protected health information posed by computer hard drives. We remember to “wipe” the personal data off of our phones or computers before they are disposed of, donated, or recycled.
A recent HIPAA settlement offers a costly reminder that other types of office equipment we use regularly have similar hard drives capable of storing confidential personal information.
On August 14, 2013, HHS announced a $1,215,780 settlement with the not-for-profit managed care plan Affinity Health Plan, Inc., stemming from an investigation of potential violations of the HIPAA Privacy and Security Rules relating to an April 15, 2010 breach report filed by Affinity with the HHS Office for Civil Rights (OCR). Affinity’s breach report and OCR’s subsequent investigation revealed that Affinity had impermissibly disclosed the protected health information of up to 344,579 individuals when it returned multiple photocopiers to leasing agents without erasing the photocopier hard drives. Affinity learned of the breach when a representative from CBS Evening News informed the New York health plan that, as part of an investigatory report, CBS had purchased a photocopier previously leased by Affinity and had found confidential medical information on the photocopier’s hard drive. OCR’s investigation indicated that Affinity had failed to assess the potential security risks and implement policies for the disposal of protected health information stored on the photocopier hard drives.
In addition to the financial settlement, the Resolution Agreement includes a corrective action plan (CAP) requiring Affinity to use its “best efforts to retrieve all photocopier hard drives that were contained in photocopiers previously leased by [Affinity] that remain in the possession of [the leasing agent].” The CAP also requires Affinity to conduct a comprehensive risk analysis and implement safeguards to protect electronic protected health information on all of its electronic equipment and systems.
For more than ten years, digital copiers have been capable of storing images of documents. This settlement should serve as a warning to entities and individuals who handle electronic personal health information: any and all equipment capable of storing trace amounts of digital information should be accounted for in risk assessments conducted under the HIPAA Security Rule. All HIPAA Privacy and Security Policies and Procedures Manuals should be updated to include guidelines for safeguarding protected health information retained on digital copiers, scanners, fax machines and other devices whose primary function may not be data storage.
By Ryan Blaney and Kelly Carroll
We’re all familiar with the ads that pop up on the side of our browsers, personalized to highlight things we might be interested in based on our web browsing activity. Marketers and advertisers regularly track consumers’ online activities, interests and preferences and use the information they collect to create targeted ads, meant to appeal to individual consumers based on their behavioral profiles. Some consumers have no objections to this type of targeted advertising, but others do not want their online activities monitored. In response to privacy concerns raised by pervasive online tracking, the U.S. Federal Trade Commission endorsed the implementation of a Do Not Track (“DNT”) mechanism and the World Wide Web Consortium (“W3C”) has been working to develop a DNT technology standard that would allow users to control the tracking of their online activities.
Although little consensus has been reached on a DNT standard, various browsers, including Internet Explorer, Firefox, and Safari, now offer a DNT option that, in theory, permits users to elect not to have information about their web browsing activities monitored and collected. For many browsers, the DNT option exists in the form of a DNT header, an HTTP header which, whenever a user’s browser sends a request over the Internet, includes a signal indicating that the particular user does not want to be tracked. The effectiveness of DNT headers is in flux, however, because they rely on the cooperation of the companies receiving the DNT signals to honor the requests. DNT headers merely express a user’s preference; they are not backed by regulatory or legislative authority and nothing requires recipients to honor users’ requests. Indeed, success requires browsers, website publishers, developers and other companies to work together and, although most browsers now have DNT options in place, other companies have been slow to honor DNT headers.
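For readers curious about the mechanics, honoring a DNT header is technically simple: the browser sends `DNT: 1` with each HTTP request, and a cooperating server checks for it before logging tracking data. The sketch below is purely illustrative and assumes nothing about any particular company's implementation; the function name `should_track` is our own.

```python
# Illustrative sketch of how a server might honor the DNT header
# described above. A browser with Do Not Track enabled sends the
# header "DNT: 1" with each request; any other value (or its
# absence) means no preference has been expressed.

def should_track(request_headers):
    """Return False when the request carries a DNT: 1 header."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    headers = {k.lower(): v for k, v in request_headers.items()}
    return headers.get("dnt") != "1"

# A user who has turned on Do Not Track:
print(should_track({"DNT": "1"}))   # prints False: tracking suppressed
# A user who has expressed no preference:
print(should_track({}))             # prints True
```

As the post notes, nothing in the protocol enforces this check; the header only works if the recipient chooses to consult it.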
Last year, Twitter made headlines by announcing its decision to honor its users’ DNT headers. When a Twitter user has his browser’s DNT header turned on, Twitter stops collecting the information that would otherwise allow it to tailor suggestions and ads to that user based on his online activities. More recently, Pinterest announced that it will follow suit, becoming the second social media site to refrain from collecting data about its users’ activities across the web if they have DNT headers in place.
As for companies that choose not to honor DNT headers, in a recent settlement of note, PulsePoint, a digital advertising company, paid $1,000,000 to settle charges brought against it by the attorney general of New Jersey and the New Jersey Division of Consumer Affairs for circumventing users’ privacy settings. According to the consent order, PulsePoint bypassed the privacy settings of Safari users whose browsers were set to block third-party ad cookies and covertly placed such cookies on users’ browsers, resulting in as many as 215 million targeted ads. According to PulsePoint, the practice was initiated by a predecessor company and PulsePoint ended the practice immediately upon learning about it.
Regardless of whether a company chooses to honor DNT headers, it is important to ensure that the company’s actions are consistent with whatever privacy policies it has in place.
In a recent decision (deliberation CNIL May 30, 2013 n°2013-139), the French Data Protection Agency (CNIL) sanctioned a company for implementing a CCTV system without informing employees, and because the CCTV enabled the constant monitoring of one employee, making the recording disproportionate to the goal pursued. The CNIL also sanctioned the company because it failed to implement an adequate level of security for the data housed on its systems.
The agents of the CNIL noticed during an on-site inspection that the passwords used within the company to log into its systems, and therefore to access personal data stored within those systems, were simple to crack. Indeed, most of them were only 5 characters long, and some were simply an employee’s first or last name and had not been changed since 2011.
The CNIL therefore required that the company implement a data security policy.
After another on-site inspection, the agents of the CNIL noticed that, despite its commitments, the company had not implemented such a policy.
The CNIL concluded that the company did not provide for an adequate level of protection of data given that the passwords were short, simple and not modified.
According to Article 34 of the French Data Protection Act of January 6th, 1978, the data controller shall take all useful precautions, with regard to the nature of the data and the risks of the processing, to preserve the security of the data and, in particular, prevent their alteration and damage, or access by unauthorized third parties.
In a previous post, we highlighted the recommendations enacted by the CNIL to help companies to strengthen the security of their data processing.
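To make the weaknesses the CNIL's agents found concrete (passwords that were too short, too simple, or derived from an employee's own name), such problems can be screened for programmatically. The sketch below is purely illustrative: the length threshold and complexity rule are our own assumptions, not requirements drawn from the CNIL's recommendations.

```python
import re

def meets_minimum_policy(password, user_names=()):
    """Screen for the weaknesses cited by the CNIL: passwords that are
    too short, too simple, or built from the user's own name.
    The exact thresholds here are illustrative assumptions."""
    # The 5-character passwords found on-site were deemed too short.
    if len(password) < 8:
        return False
    # Passwords equal to (or containing) a first or last name were cited.
    lowered = password.lower()
    if any(name.lower() in lowered for name in user_names if name):
        return False
    # A minimal complexity floor: at least one letter and one digit.
    if not (re.search(r"[A-Za-z]", password) and re.search(r"\d", password)):
        return False
    return True

# A 5-character password of the kind the CNIL found fails the check.
assert meets_minimum_policy("abc12", user_names=("Dupont",)) is False
# A password built from the employee's own surname also fails.
assert meets_minimum_policy("dupont2024", user_names=("Dupont",)) is False
```

A company might run a check of this kind at password creation and again periodically, forcing a reset where it fails, alongside the broader security policy the CNIL required.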
In light of the vulnerabilities noticed during the on-site inspections and the failure of the company to properly address them, the company was required by the CNIL to pay a €10,000 fine.
Companies located in France must therefore pay particular attention to their data security policies to make sure that they comply with French data protection law requirements.
Texas recently amended its data breach notification law, Tex. Bus. & Com. Code Ann. § 521.053, to clarify that if a data subject is a resident of a state other than Texas that has its own breach notification law, a company that does business in Texas can notify that data subject either pursuant to Texas law or pursuant to the law of the state of residence. In other words, according to Texas, Texas companies do not have to become familiar with the breach notification laws of other states. (Query whether those other states would agree.)
As we blogged about here, Texas previously amended its breach notification law in September 2012 to specifically require notification of data breaches to residents of states that had not enacted their own law requiring such notification. The amended law was confusing as to which law should apply when the state of residence did have a breach notification law.
The reporting obligations will still apply to persons that “conduct business” in Texas, although the law does not provide guidance on what it means to “conduct business” in Texas. As a result, in the event of a data breach, a company that does business in Texas could be required to timely notify individuals nationwide or face a fine of up to $250,000 for failing to do so.
In France, the guiding principle is that emails received or sent by an employee through the employer’s company email account are considered “professional”, which means that the employer can access and read them. However, French employers must be cautious before accessing their employees’ professional emails because they are not permitted to access emails that have been identified by the employee as being “personal” or “private”. Recently, the French Supreme Court, in a decision of June 19th, 2013 (n°12-12138: http://www.legifrance.gouv.fr/affichJuriJudi.do?oldAction=rechJuriJudi&idTexte=JURITEXT000027596663&fastReqId=1099388011&fastPos=1) addressed this issue in detail.
As background, the distinction between “professional” and “personal” or “private” emails received or sent in the course of business using the company’s information technology or systems stems from the fact that French case law recognizes that employees have a right to privacy, even at the workplace during working hours. As a consequence, employees cannot be banned from using the company’s IT for personal use. The French Data Protection Agency (CNIL) specifies that employees may use the IT resources put at their disposal by their employers for personal purposes, provided that such personal use is “reasonable”.
In most cases, it is quite easy for employers to determine whether an email is professional or personal, and therefore whether it can be read or not, because employees generally indicate in the subject line when an email is personal. Sometimes, however, it is much more difficult.
In the aforesaid decision, an employee brought an action against his employer alleging that the employer wrongfully terminated his employment based on personal email files stored on his office computer that the employer had accessed unlawfully, in breach of the employee’s privacy. Specifically, the employee had stored emails received and sent from his personal email account on the hard drive of an office computer put at his disposal by his employer. The employee did not identify these saved files as his personal emails.
The employer, suspecting that the employee was working for a competitor, terminated him. To prove that the termination was well-founded, the employer submitted that an examination of the office computer’s hard drive, conducted by an IT expert, revealed evidence of the employee’s fault, particularly in the emails that were saved on the computer.
The Court of Appeals ruled that the termination was unfair because the emails had been collected unlawfully, in breach of the employee’s privacy, but the French Supreme Court overruled the Court of Appeals, finding that a file created by an employee on IT equipment put at his disposal by the employer to carry out his duties is presumed to be professional, unless the employee identifies it as being personal.
The French Supreme Court concluded in its decision that emails and files stored on the hard drive of an employee’s work computer are not automatically considered personal just because they originally came from the employee’s personal mailbox. Those files and emails can therefore be used to support a disciplinary termination.
In France, employees should be cautious about what they do with their employer’s IT (including when they put their personal files on employer devices), but employers should also be cautious when determining whether or not an employee’s files or emails are professional or personal.
On June 27, 2013, the NY Court of Appeals held that the state can use GPS tracking to monitor its employees during working hours without a warrant. Click here to read Proskauer’s Employment Law Counseling & Training Group’s discussion of the recent case.
On June 20, 2013, the California Court of Appeal affirmed the dismissal of a putative class action which alleged that Chevron violated California’s Song-Beverly Credit Card Act (“Song-Beverly”) by requiring California customers to enter ZIP codes in pay-at-the-pump gas station transactions in locations with a high risk of fraud. Flores v. Chevron U.S.A. Inc., No. B240477, 2013 WL 3084913 (Cal. Ct. App. June 20, 2013), available here. With certain exceptions, Song-Beverly prohibits requests for personal identification information from customers paying by credit card in California and provides for up to $1,000 in civil penalties per violation.
The plaintiffs filed the suit shortly after the California Supreme Court held that ZIP codes constitute “personal identification information” within the meaning of Song-Beverly. Pineda v. Williams-Sonoma Stores, Inc., 51 Cal.4th 524, 527 (2011), available here. In response to the Pineda decision, the California Legislature passed several amendments to Song-Beverly, including section 1747.08(c)(3)(B) which permits requesting or requiring personal identification information in credit card fuel purchases at automated machines “solely for prevention of fraud, theft, or identity theft.” Section 1747.08(b) defines “personal identification information,” as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.”
The parties disputed whether the amendments applied retroactively to the plaintiffs’ complaint. The Court of Appeal did not address this issue, instead resolving the appeal based on a statutory exception that preceded Pineda. Under section 1747.08(c)(4), requesting or requiring personal identification information does not violate Song-Beverly “if personal identification information is required for a special purpose incidental but related to the individual credit card transaction, including, but not limited to, information relating to shipping, delivery, servicing, or installation of the purchased merchandise, or for special orders.” Chevron argued that its use of personal identification information was required for the special purpose of preventing fraud, citing the 80 percent reduction in fraudulent transactions since requiring ZIP codes. Moreover, Chevron asserted, and the plaintiffs did not dispute, that it used the ZIP codes exclusively to prevent fraud and that the data is purged after 90 days once Chevron reconciles the transactions.
In siding with Chevron, the Court of Appeal rejected the plaintiffs’ argument that the special purpose exception applies only if personal identification information “is required for the completion of the transaction.” The Court of Appeal also rejected the plaintiffs’ assertion that preventing fraud is insufficiently similar to the statute’s listed “special purposes,” pointing out that the statute states that the relevant special purposes are not limited to those listed.
This post was prepared by Brent Canter, a Law Clerk in Proskauer’s Los Angeles office.
We pack tons of personal and sensitive information in our DNA. While the human genome has been mapped for a decade, legal issues of genetic privacy are just beginning to arise. Earlier this month, the U.S. Supreme Court decided what Justice Alito described as “perhaps the most important criminal procedure case that this court has heard in decades.” The case addressed whether police could constitutionally take a DNA sample from a person arrested for a serious crime, and in a 5-4 decision, the Court ruled that DNA collection serves the legitimate government interest in identifying arrestees. In the majority opinion, however, Justice Kennedy noted that, “If in the future police analyze samples to determine, for instance, an arrestee’s predisposition for a particular disease or other hereditary factors not relevant to identity, that case would present additional privacy concerns not present here.”
This use of genetic information – to determine a person’s medical predispositions – was at the heart of two recent cases brought by the Equal Employment Opportunity Commission (EEOC). The cases alleged violations of the Genetic Information Nondiscrimination Act (GINA), which applies to any employer with at least 15 employees, and makes it illegal to discriminate against employees or job applicants based on genetic information. Under GINA, “Genetic Information” includes family medical history, as a person’s family medical history can illuminate many genetic proclivities.
The first of the two cases brought by the EEOC was the first time the EEOC had alleged genetic discrimination. The suit accused Fabricut, Inc., an Oklahoma-based company, of refusing to hire a job applicant because it thought she had carpal tunnel syndrome (CTS). The EEOC alleged that Fabricut’s contract medical examiner inquired into the medical history of the applicant’s family. Fabricut then told the applicant that she needed to be evaluated for CTS by her personal physician. She complied, and her physician concluded she did not have CTS, but Fabricut nevertheless rescinded its job offer after its contract medical examiner concluded that she did have CTS. This, the EEOC alleged, violated GINA and the Americans with Disabilities Act (ADA), which prohibits discrimination against qualified individuals with disabilities or individuals who are incorrectly regarded as having disabilities. Fabricut settled the suit.
Soon after the EEOC settled with Fabricut, the EEOC filed a class action suit against a different employer on similar grounds. The class action alleges that Founders Pavilion, Inc., a New York-based company violated GINA, the ADA, and Title VII of the Civil Rights Act by asking applicants for genetic information, including family medical history, during the hiring process.
While the Supreme Court has opened the doors to collecting DNA samples from certain arrestees, it is clear that the EEOC is focusing on genetic privacy. The EEOC’s Strategic Enforcement Plan includes “addressing emerging and developing issues,” and genetic privacy is certainly one of those issues.
In January 2011, David Cheng (Plaintiff) filed a lawsuit against his former co-worker and fellow radiologist, Laura Romo (Defendant), alleging a violation of the Stored Communications Act (SCA) and Massachusetts privacy law. After the U.S District Court of Massachusetts denied Defendant’s motion for summary judgment on both counts, the case went to trial and the verdict came down at the end of April. The jury found that Defendant violated both the SCA and Massachusetts privacy law, and awarded Plaintiff damages totaling $325,000. This case is significant in that courts have struggled to interpret the language of the SCA yet the jury very clearly decided in favor of Plaintiff.
By way of background, Plaintiff, Defendant and Defendant’s husband worked at Advanced Radiology (“Advanced”), a medical practice that provides medical imaging services. Advanced did not provide its employees with email accounts so, in 2000, when Plaintiff joined Advanced, he set up a Yahoo! email account that he used for both personal and professional purposes. In July 2000, Plaintiff gave Defendant the password to his Yahoo! email account so that she could review images in connection with their duties at Advanced. Plaintiff did not in any way limit Defendant’s access to his email account. At her deposition, Defendant testified that she had accessed Plaintiff’s email about ten times per year to review consultants’ reports, but that she did not remember accessing Plaintiff’s email account from 2002-07.
By 2008, Defendant and her husband’s relationship with Advanced had deteriorated and they each filed a lawsuit against Advanced. Prior to filing, Defendant accessed Plaintiff’s Yahoo! email account and provided at least ten emails to her husband, which were then produced in his lawsuit against Advanced. Some of these emails contained content personal to Plaintiff, including information related to a relationship Plaintiff had with a non-manager level employee at Advanced.
The District Court denied Defendant’s motion for summary judgment both with respect to her alleged violation of the SCA and her alleged violation of Massachusetts privacy laws.
The SCA prohibits intentionally accessing without authorization a facility through which an electronic communication service is provided or intentionally exceeding an authorization to access that facility. 18 U.S.C. § 2701(a). With respect to Plaintiff’s claim that Defendant violated the SCA, the District Court noted that “courts have struggled with what it means for a person to ‘access without authorization’ and ‘exceed an authorization to access’ a facility.” But the District Court declined to resolve the issue on summary judgment, stating that “there is a disputed issue of material fact for the factfinder to resolve as to whether [Defendant] was authorized to access [Plaintiff’s] account.”
M.G.L. chap. 214, § 1B states that a “person shall have a right against unreasonable, substantial or serious interference with his privacy.” With respect to Plaintiff’s claim that Defendant violated Massachusetts privacy law, the District Court denied Defendant’s motion for summary judgment, noting that “there [were] disputed issues of material fact as to whether [Plaintiff] had a reasonable expectation of privacy in his email messages and whether [Defendant’s] actions in reading these messages and in providing them to her husband were an unreasonable, substantial or serious interference with [Plaintiff’s] privacy.”
The jury resolved both allegations in Plaintiff’s favor, awarding $325,000 in damages – including $50,000 to compensate Plaintiff for his actual damages; $150,000 for the dissemination of emails from his account; and $125,000 in punitive damages. But the case is not over yet, as the District Court must decide two pending motions, including Plaintiff’s motion for attorney’s fees and costs and Defendant’s motion for judgment as a matter of law. Both decisions will be instructive in terms of the impact this case will have on privacy litigation in Massachusetts.
Colorado on May 12, 2013 and Washington on May 21, 2013 joined the likes of California, Maryland, Utah and New Mexico by prohibiting employers from requesting that prospective and current employees disclose their username and password to their personal social media accounts. Our Labor & Employment group discussed the Colorado law here and the Washington law here.