Date: Wednesday, 08 Oct 2014 02:33

Stacy Barrow

On September 30, 2014, California took further steps to protect the personal information of its residents by amending several sections of its breach notification and information security laws (Cal. Civ. Code §§ 1798.81.5, 1798.82 and 1798.85).  The amended law, which is effective January 1, 2015, updates existing law in three significant ways:

  1. Under current law, businesses that own or license personal information about a California resident must implement reasonable security procedures and practices appropriate to the nature of the information.  This requirement is expanded to also include entities that merely “maintain” such personal information. 
  2. Under current law, businesses that own or license personal information may be required to issue a security breach notification to affected individuals in the event of a breach where an individual’s social security number or driver’s license number may have been exposed.  The amended law provides that if the entity providing the notification was the source of the breach, an offer to provide identity theft prevention or mitigation services, if any, must be made at no cost to the affected person for at least 12 months, along with all information necessary to take advantage of the offer.  The breach notification requirement does not apply to entities that merely “maintain” personal information.  Given the words “if any,” and the ambiguity as to whether those words refer to the availability of credit monitoring services in the marketplace or to whether the business has chosen to offer it, it is not clear from the law whether this constitutes an absolute requirement to offer credit monitoring services to affected individuals.  That said, we note that the bill’s co-author, Assemblyman Roger Dickinson, stated his view in a recent interview with Law360 that the offer to provide credit monitoring services is mandatory when a driver’s license number or social security number was breached.
  3. Under current law, a business may not publicly disclose an individual’s social security number or engage in other acts that might compromise the number’s security.  The amended law clarifies that, except as permitted by law, a person or entity may not sell, advertise for sale, or offer to sell an individual’s social security number.

For purposes of #1 above, the amended law defines the term “maintain” to include personal information that a business maintains but does not own or license.  This appears to include entities that host or otherwise retain data for others, such as “cloud” storage companies and businesses that collect information but do not own or license it.  These entities will need to implement and maintain reasonable security procedures and practices to the extent that the data they collect contains personal information.  That said, the law provides that such security procedures and practices are scalable; they should be “appropriate to the nature of the information, to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.”

Author: "Stacy Barrow" Tags: "California, Data Breaches, Identity Thef..."
Date: Friday, 03 Oct 2014 00:01

Carly Ziegler

Traditionally, a person’s most valuable assets to be distributed upon death consisted of tangible items such as real property, cash, jewelry and personal effects of sentimental value like photographs and letters.  However, the advent of the digital age has brought a shift from file cabinets, mailmen and photo albums to cloud storage, e-mail accounts and online photo streams.  Today, virtually everyone has at least some assets that are not physical, but are stored as data and accessed via the Internet.  “Digital assets” may include, for example, text messages, instant messaging accounts, e-mails, documents, audio or video images and sounds, social media content, health insurance records, source code, software, databases, online bank accounts, blogs, and the user names and passwords necessary to access online accounts, among other things.  More specifically, consider a person’s PayPal or Venmo accounts, which might contain large sums of money, or Google, Yahoo, Facebook or Instagram accounts, which might contain letters, pictures, videos and other items of intrinsic value.  The steady growth of most individuals’ online presence has given rise to a novel legal issue – authority over administering the digital assets and accounts of an account holder upon death or disability. 

In July 2014, the Uniform Law Commission – a group of lawyers appointed by each state to help create standardized laws – released the Uniform Fiduciary Access to Digital Assets Act (the “UFADAA”), a model statute aimed at ensuring that account holders can retain control of their digital property and plan for its ultimate disposition after their death.  Under the UFADAA, a fiduciary managing an individual’s tangible assets may also manage that person’s digital assets with the same right of access as the account holder himself, so long as the account holder did not otherwise prohibit such access by will, trust, or other legal document.  The UFADAA provides that a fiduciary can gain access to, but not control of, a person’s digital accounts for the purpose of carrying out his fiduciary duties and remains subject to other laws, such as federal copyright and privacy laws.  For example, an executor may access a decedent’s e-mail account to take inventory of estate assets, but may not publish the decedent’s confidential communications or make copies of or distribute copyrighted content from the account. 

Under the UFADAA, if the company that controls or stores a person’s digital assets or accounts (the “operator”) has a terms of service agreement, privacy policy, end user license agreement or other analogous agreement with the account holder that limits a fiduciary’s access to the account holder’s digital assets or accounts without requiring affirmative action by the account holder, the relevant provisions will be void as against public policy.  Although the UFADAA allows an account holder to knowingly and intentionally opt out of fiduciary access in such an agreement, boilerplate provisions cannot be used to override the model statute.  To gain access to an account holder’s digital assets under the UFADAA, the fiduciary must send a request to the operator along with a certified copy of the document granting fiduciary authority (e.g., letter of appointment, court order, certification of trust).  Operators receiving ostensibly valid requests for access are immune from liability for good faith compliance. 

In August, Delaware became the first state to enact a law modeled after the UFADAA, passing its Fiduciary Access to Digital Assets and Digital Accounts Act (the “Delaware Act”), which takes effect January 1, 2015.  The Delaware Act provides that a fiduciary with authority over an account holder’s digital assets or accounts will have the same access as the account holder and defines “digital assets” and “digital accounts” broadly.

While some see the UFADAA and the Delaware Act as a welcome solution to the estate administration issue raised by the rise of digital property and electronic communications, industry groups have been quick to criticize this new legislation as encroaching on the privacy rights of the deceased or incapacitated, noting that online companies already have privacy tools in place to address the issue.  In a recent blog post, Yahoo’s Senior Legal Director for Public Policy criticized the UFADAA for the “faulty presumption that the decedent would have wanted the trustee to have access to his or her communications” and for “set[ting] the privacy default at zero.”  According to Yahoo, its terms of service were crafted with the privacy of its users in mind, as well as the privacy of those who are party to sensitive e-mail communications with them.  Yahoo’s terms of service provide, in relevant part:

No Right of Survivorship and Non-Transferability. You agree that your Yahoo account is non-transferable and any rights to your Yahoo ID or contents within your account terminate upon your death. Upon receipt of a copy of a death certificate, your account may be terminated and all contents therein permanently deleted.

Yahoo is not the only company that has publicly opposed the UFADAA and the Delaware Act – Facebook has stated that it agrees with the concerns raised by Yahoo, and Google co-signed an industry letter to Delaware’s governor, before the Delaware Act was enacted, urging him to veto the proposed law.

There is no doubt that the Internet age has radically transformed the nature of property and communication and, regardless of where applicable state law comes out on the UFADAA, it is important to consider one’s wishes with respect to access to digital accounts and assets in planning for death or incapacity.

Author: "Carly Ziegler" Tags: "Cloud Computing, Electronic Communicatio..."
Date: Wednesday, 24 Sep 2014 15:20

Cecile Martin

The European Court of Justice, in a decision rendered on May 13, 2014, held that search engines are considered data controllers under the Directive of October 24, 1995 on data protection, and as such they must provide data subjects with a “right to be forgotten.”

In that ruling, the European Court outlined that an individual is entitled to request that search engines remove links and URLs from the lists of results displayed following a search on the basis of that person’s name.  This means that to enforce the right to be forgotten, an individual does not have to request the deletion of content by a website editor, but can make the request of the search engines instead, thereby making the content more difficult to locate on the Internet.

However, the European Court also made it clear that the right to be forgotten is not an absolute right, and must be evaluated on a case-by-case basis, taking into consideration the extent to which it interferes with the economic interests of search engines.  In the ongoing discussions regarding the European regulation that will replace the European Directive of 1995, the right to be forgotten is reinforced, raising concerns regarding countervailing rights such as the right of free speech.  As a consequence, the European Court stressed that the right to be forgotten or de-listed depends mainly on the nature of the information, how sensitive it is, and the public interest in accessing the information.

Further to that decision, and in order to be compliant, search engines have published specific forms to be filled out by claimants who want to request the deletion of their information from search results.  However, several complainants have received negative responses from search engines and have submitted complaints to data protection agencies throughout Europe claiming that search engines did not comply with the European Court of Justice’s decision.

Following these complaints, the European data protection agencies, after meeting with search engine representatives, decided to implement a common toolbox to handle complaints.  That common toolbox will allow for a coordinated approach to the handling of claims alleging that search engines have not adequately responded to “right to be forgotten” requests.

This coordination will be implemented by a network of dedicated contact persons in charge of developing common case-handling criteria to manage complaints.

The network will provide the data protection authorities with:

  • a common record of decisions taken on complaints, and
  • a dashboard to help identify similar cases and difficult cases.

This cooperation between the European Data Protection agencies is intended to bring about the best possible coordination of the European Member States in their response to these complaints, and to ultimately effectuate and enforce the right to be forgotten.  Furthermore, this coordination is intended to enable complainants to receive a uniform response to their complaints from data protection authorities throughout Europe.

Author: "Cecile Martin" Tags: "Data Privacy Laws, European Union, data ..."
Date: Wednesday, 17 Sep 2014 18:14

Maritza Jean-Louis

A substantial rise in schools’ use of online educational technology products has caused educators to become increasingly reliant on these products to develop their curricula, deliver materials to students in real time, and monitor students’ progress and learning habits through the collection of data by third-party cloud computing service providers.  Unfortunately, with these advances come the data security concerns that go hand-in-hand with cloud computing—such as data breaches, hacking, spyware, and the potential misappropriation or misuse of sensitive personal information.  With the Family Educational Rights and Privacy Act (FERPA)—federal legislation enacted to safeguard the privacy of student data—in place for four decades, the education sector is ripe for new standards and guidance on how to protect students’ personal information in the era of cloud computing.  California has tackled this issue head on, with the passage of two education data privacy bills by its legislature on August 30, 2014.  Senate Bill 1177 and Assembly Bill 1442 (together, the Student Online Personal Information Protection Act (SOPIPA)) create privacy standards for K-12 school districts that rely on third parties to collect and analyze students’ data, and require that student data managed by outside companies remain the property of those school districts and within school district control.

California has long been an innovator in the realm of privacy law, having enacted the nation’s first data breach notification law in 2003, and more recently, a 2013 law granting children and young adults the right to delete posted content from online services, mobile apps or other digital services for which they are registered users.  The passage of SOPIPA is a significant milestone in education data privacy law reform.  Acting as a measure to fill in FERPA’s gaps, the bills place restrictions on companies that operate online sites or applications, or provide web-based services, to K-12 students.

Schools have always kept records of students to track their individual progress, as well as to create databases aggregating information such as test scores, attendance records and demographic data in order to meet benchmarks and develop curricula.  Whereas teachers and school administrators previously aggregated student records themselves, it is now the norm for educators to outsource this task, as management of such databases can be more efficiently and systematically performed by privately-owned and operated education service providers, websites, and app makers.  According to estimates provided by the Software and Information Industry Association, a U.S.-based software and information trade association, the market for education software for pre-K through 12th grade students was approximately $8 billion in the 2011-12 school year, up $500 million from only two years prior.  One of the reasons for this increase is the fact that school districts often lack the technical expertise to create and manage these databases.  The development of cloud-based computing and technology products that operate online has resulted in an increased number of third-party operators that collect and possess sensitive student data, including grades, disciplinary history and demographic information.

The challenge with this practice is that third-party operators are not subject to the provisions of FERPA.  Included among FERPA’s requirements is a mandate for schools that receive federal funding to: (i) allow parents access to their children’s files in order to request corrections; and (ii) obtain parents’ consent before sharing such information.  Enacted in 1974, FERPA is ill-equipped to adequately safeguard against 21st century education data security concerns.  Under the law, “an educational agency or institution may disclose personally identifiable information from an education record only on the condition that the party to whom the information is disclosed will not disclose the information to any other party without the prior consent of the parent or eligible student.” 34 CFR § 99.33(a)(1). Plainly stated, FERPA applies only to the schools themselves, not to third-party cloud computing or online service providers.  If a school provides student data to such a service provider, the regulation allows the school to disclose that data to the provider without parental or student consent because a “contractor, consultant, volunteer or other party” to whom a school or school district has outsourced institutional functions “may be considered a school official” and thus, is shielded from liability, even if the Department of Education (DOE) alleges a FERPA violation against the school or school district. 34 CFR § 99.31(a)(1)(i)(B)(1)-(3).  Even more troubling is that only the DOE can sue a school for FERPA violations; parents and students have no cause of action under the law.

Why is this an issue? Currently, information provided to a school about a child’s medical history, behavioral issues or academic performance—potentially damaging information—could be leaked, exposed by hackers, or more likely, sold to advertisers by the private companies hired by the schools themselves. In a December 2013 report entitled Privacy and Cloud Computing in Public Schools, Fordham Law School’s Center on Law and Information Privacy surveyed twenty school districts across the country and uncovered the following:

  • 95% of the school districts surveyed rely on cloud computing for multiple functions, including monitoring student performance, providing support for classroom activities, data hosting and student guidance.
  • Only 25% of the school districts inform parents of their use of cloud services.
  • 20% of school districts do not have policies governing the use of their online services.
  • Only 25% of the contracts between the school districts and cloud service providers give schools the right to audit and inspect the service provider’s practices with respect to the student data collected.
  • Fewer than 7% of the contracts between the school districts and cloud service providers restrict the sale or marketing of student information by vendors.
  • Only one contract required the cloud service provider to notify the school district in the event of a data security breach.

California’s law, if signed by Governor Jerry Brown, would fill that gap. Under SOPIPA, any operator of a company to whom student data is provided will be prohibited from using, sharing, disclosing, or compiling personal information about a K-12 student for any purpose other than the K-12 school purpose.  When an operator is no longer using the information for a legitimate educational purpose, the student requests deletion, or the student ceases to be a student at the school or school district using the operator’s services, the student’s personal information must be deleted.  Finally, SOPIPA creates a private right of action for parents or students alleging that an online service provider has violated the statute.  Under the law, a SOPIPA violation would be an unlawful business practice, allowing individuals as well as government entities to seek judicial remedies.

While it is expected that Governor Brown will sign SOPIPA into law soon, the fate of education data privacy across the rest of the U.S. remains to be determined.  Although data privacy is a perennial hot-button issue, the dialogue surrounding protection of personal data against its appropriation or misuse tends to focus not on education data privacy, but on the interactions between businesses and consumers, and the ways in which technological advances have made both groups vulnerable to data breaches, hackers, malware and the like.  The tide may be turning, however.  In February 2014, the DOE released guidance on FERPA compliance aimed at providing educators and parents alike with information on how to protect student privacy while using online educational services.  While well intentioned, the DOE’s guidance only went so far as to acknowledge that advances in technology have changed the way student data is used and “raises new questions” about privacy protection.  In addition, as of April 2014, 83 bills concerning education data security were being considered in 32 states, according to the Data Quality Campaign, a nonpartisan educational advocacy organization.  Perhaps most promising is the Protecting Student Privacy Act (PSPA), bipartisan legislation introduced in late July by Senators Edward J. Markey (D-MA) and Orrin Hatch (R-UT).  The PSPA would amend FERPA to condition federal funding on school districts putting in place data security safeguards for sensitive student data held by the third parties to whom educational functions are outsourced, and on providing parents with greater access to that data.  In short, the PSPA would subject the data that schools share with outside parties to rules similar to the constraints FERPA imposes on schools themselves, echoing the California legislation.  More than anything, these developments signal that educators and legislators must work together to strike a balance among student privacy, technological innovation and student data needs.

Author: "Maritza Jean-Louis" Tags: "California, Children's Online Privacy Pr..."
Date: Thursday, 04 Sep 2014 10:46

Laura Shovlowsky

In a recent article published by Law360, Proskauer litigation associate Courtney Bowman outlines how companies can make inroads in the e-commerce market in the Middle East and North Africa (MENA).  Although often overlooked, the region’s relative wealth and level of internet penetration make its more stable areas attractive markets for those companies willing to undertake the steps necessary to understand the region’s cultural nuances and customer preferences.  Two of the most significant barriers to e-commerce growth in MENA are the widespread reluctance of customers to shop online due to fears about the security of online transactions and the low rate of credit card use in the region.  As the article notes, however, e-commerce companies willing to offer solutions tailored to address these concerns, such as cash cards and m-payment systems, may be poised to establish a potentially lucrative presence in this part of the world.

Author: "Laura Shovlowsky"
Date: Wednesday, 03 Sep 2014 07:26

Laura Shovlowsky

Corporate Counsel published an article authored by Nolan Goldberg, Senior Counsel, Intellectual Property and Technology, concerning the recent decision compelling Microsoft to produce e-mails located on foreign servers. The article, entitled “Is the Flap Over Microsoft Emails in Ireland Overblown?”, provides a counterpoint to critics who believe that Judge Preska’s Order will have broad implications for the U.S. technology industry.

Author: "Laura Shovlowsky"
Date: Tuesday, 19 Aug 2014 17:49

Margaret A. Dale

Capital One Financial Corp. (“Capital One”) and three collection agencies have agreed to pay one of the largest settlement amounts in history — $75.5 million — to end a consolidated class action lawsuit alleging that the companies used an automated dialer to call customers’ cellphones without consent in violation of the twenty-two-year-old Telephone Consumer Protection Act (“TCPA”). Judge Holderman of the Northern District of Illinois preliminarily approved the settlement in late July. 

TCPA Allegations and the Proposed Settlement

In 2012, separate cases against each defendant were consolidated by the U.S. Judicial Panel on Multidistrict Litigation because the collection firms were collecting debts on behalf of Capital One. The allegations were largely the same:  that the companies used autodialers and/or pre-recorded messages in calls to cell phones without the consumers’ express consent.

Without admitting any wrongdoing, according to the settlement agreement, Capital One will pay $73 million into the settlement fund, with the other three companies contributing approximately $2.5 million. According to estimates provided by class counsel in its July 14th court submission, the proposed agreement would provide between $20 and $40 to each class member who files a claim. The class is estimated to include about 21 million people and is defined to include all people in the United States who received a call to a cellphone from Capital One’s automatic telephone dialing system in an attempt to collect on a credit card debt from January 2008 to June 2014, as well as those who received such calls from participating vendors from February 2009 to June 2014.

The TCPA provides redress for those who receive unsolicited telephone calls, texts or faxes, and includes statutory penalties of $500 per violation and $1,500 for willful violations. As such, commentators have noted what class counsel acknowledged to the court; i.e., the settlement fund “does not constitute the full measure of statutory damages potentially available to the class,” but counsel argued that this fact “should not weigh against preliminary approval.”

If the settlement is approved, up to 30 percent of the settlement amount (about $22.5 million) will be awarded to the consumers’ attorneys, and each of the five lead plaintiffs will receive no more than $5,000.
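
For context, a quick back-of-the-envelope calculation shows how the reported figures fit together. The sketch below (in Python) is ours, not from the court filings: the claims-rate inference is an assumption, and notice and administration costs are ignored.

    # Back-of-the-envelope check of the reported settlement figures
    # (illustrative only; ignores notice and administration costs).
    fund = 75_500_000                  # total settlement fund
    fee_cap = 0.30 * fund              # up to 30% to class counsel
    print(f"Maximum attorneys' fees: ${fee_cap:,.0f}")  # ~$22.65M, reported as ~$22.5M

    net = fund - fee_cap               # remainder available to the class
    class_size = 21_000_000
    print(f"Per person if all 21M claimed: ${net / class_size:.2f}")  # ~$2.52

    # A $20-$40 payout per claimant therefore implies that only a fraction
    # of the class actually files a claim:
    for payout in (20, 40):
        print(f"Implied claims rate at ${payout}: {net / class_size / payout:.1%}")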

In addition to any monetary award, the settlement identifies the “core relief” as changes to Capital One’s business practices, and notes that Capital One already has “developed and implemented significant enhancements to its calling systems designed to prevent the calling of a cellular telephone with an autodialer unless the recipient of the call has provided express consent.”

In seeking preliminary approval of the proposed settlement, plaintiffs’ attorneys noted that the parties remained sharply divided on many issues, including “three critical ones.” The disagreements include the defenses that Capital One had raised, including: (i) whether the Capital One customer agreement provided the prior express consent to make automated calls to class members’ cell phones; (ii) whether the statute allows “prior express consent to be obtained after the transaction that resulted in the debt owed”; and (iii) whether the action could fairly be maintained as a class action given the predominance of individual issues.

It bears mentioning that the Capital One litigation was filed prior to the adoption of the October 2013 amendments to the regulations implementing the TCPA. Under the new rules now in effect, for-profit businesses must acquire “prior express written consent” before making any type of call or sending any text message to cellphones using autodialers or prerecorded voices.

The Capital One settlement is subject to formal approval, including notification to the class. The final hearing on the settlement is set for December 9, 2014.

Author: "Margaret A. Dale"
Date: Friday, 15 Aug 2014 18:20

Rohit Dave

In April, Microsoft moved to quash a search warrant obtained by law enforcement agents in the United States (U.S.) that required the technology company to produce the contents of one of its customers’ emails stored on a server located in Dublin, Ireland. The magistrate court denied Microsoft’s challenge, and Microsoft appealed. On July 31st, the software giant presented its case in the Southern District of New York, where it was dealt another loss.

U.S. District Judge Loretta Preska, after two hours of oral argument, affirmed the magistrate court’s decision and ordered Microsoft to hand over the user data stored in Ireland in accordance with the original warrant. Microsoft argued that the warrant exceeded U.S. jurisdictional reach. However, the court explained that the decision turned on section 442(1)(a) of the Restatement (Third) of Foreign Relations Law. The provision says that a court can permit a U.S. agency “to order a person subject to its jurisdiction to produce documents, objects or other information relevant to an action or investigation, even if the information or the person in possession of the information is outside the United States.” Because Microsoft is located in the U.S., the information it controlled abroad could be subject to domestic jurisdiction.

Microsoft had the support of large U.S. technology companies, including Apple, AT&T and Verizon. The larger issue for these companies lies in the U.S. government’s power to seize data and content held in the cloud and stored in locations around the world. When a conflict arises between the data sharing laws of the country where the servers are located and U.S. law, it can put these companies in the difficult position of choosing to follow one country’s laws over the other’s.

Microsoft further argued that the ramifications for international policy are substantial. The company contended that compelling production of foreign-stored information was an intrusion upon Irish sovereignty, and that the decision could be interpreted by foreign countries as a green light to make similar intrusions into data stored in the U.S. However, Judge Preska dismissed these concerns as diplomatic issues that were incidental and not of the court’s immediate concern.

The order has been stayed pending appeal.

Author: "Rohit Dave"
Date: Thursday, 14 Aug 2014 16:51

Jessica Goldenberg

On August 7, 2014, the PCI Security Standards Council issued new guidance to supplement PCI DSS version 3.0 and help organizations reduce the risks associated with entrusting third-party service providers (“TPSPs”) with consumer payment information.  More and more merchants use TPSPs to store, process and transmit cardholder data or to manage components of their cardholder data environments.  A number of studies have shown that breaches are increasingly tied to security vulnerabilities introduced by third parties.  To combat such risk, a PCI special interest group made up of merchants, banks and TPSPs, together representing more than 160 organizations, created practical guidelines for how merchants and their business partners can work together to comply with the existing PCI standard and protect against breach.

Below are some high-level recommendations found in the “Information Supplement: Third-Party Security Assurance”:

TPSP Due Diligence: Conduct due diligence and risk assessment when engaging TPSPs to determine whether the skills and experience of the TPSP are appropriately suited to the task. Ask:

  • What technology and system components are used by the TPSP for the services?
  • Does the TPSP use other third parties?
  • What other core processes or services are housed in TPSP facilities?
  • How many facilities does the TPSP have where cardholder data will be located?

In addition:

  • Consult with your “acquiring bank,” “merchant bank,” or “acquiring financial institution” (each, an “Acquirer”) to ensure the TPSP’s services are approved.
  • Review the participating payment card brands’ service-provider listings and websites, as well as the TPSP’s PCI DSS validation documents.
  • Perform a risk assessment of the TPSP based on an industry-accepted methodology.

Engaging the TPSP: Implement a process for engaging TPSPs.

  • Set forth the expectations of all parties involved and review expectations at least annually so as to keep a consistent and mutually agreed-upon mode of operation.
  • Assess the scope of the TPSP’s responsibility and consider including contractual provisions in documents with TPSPs that require evidence sharing.
  • Establish a communication schedule so that changes are communicated to the appropriate people in a timely manner.
  • Track how the TPSP’s services and products match up with the PCI DSS requirements.

Written Agreements, Policies and Procedures: Once a TPSP is chosen, the entity and the TPSP should memorialize the agreement in writing.

  • If a TPSP claims its services are PCI DSS Compliant, consider documenting such compliance, the date of compliance assessment and any components that were excluded from the assessment.
  • An entity should keep in mind all regional requirements that apply, such as state-specific requirements, as well as legislative considerations such as definitions of protected information and breach-notification thresholds.
  • Review agreements with Acquirers to ensure TPSPs are meeting additional requirements.
  • Review compliance programs for each payment card brand to make sure the TPSP is in compliance.
  • Keep industry specific regulations in mind.
  • Make the TPSP aware of the company’s incident response plan, its requirements and the allocation of responsibility in the case of a suspected data breach.
  • Consider which requirements and responsibilities will continue to apply to the TPSP even after the engagement has formally ended (e.g., if a TPSP continues to store an entity’s cardholder data as part of a backup system).

Monitor Third-Party Service Provider Compliance Status:  Develop a robust compliance monitoring program and document it.

  • Make sure all resources involved in monitoring understand the scope of the cardholder data environment and establish a deliverable for the TPSP.
  • Set forth a procedure for maintaining the TPSP list, which includes information such as the name and primary points of contact at the TPSP, specific services provided, last date of review, etc. (a minimal sketch of such a record follows this list).
  • Consider including the following in your TPSP monitoring procedure: a list of evidence and supporting documentation that will be collected from the TPSP, a detailed description of the PCI DSS compliance status, a report template, details describing how status review results are to be shared and approved, and policies for retention of monitoring program data.
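
To make the monitoring list concrete, the following is a minimal sketch in Python of the kind of TPSP record described above. The field names and the annual-review cadence are our illustrative assumptions, not taken from the PCI supplement.

    # Minimal sketch of a record in the TPSP list described above. Field names
    # and the annual-review cadence are illustrative assumptions.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class TPSPRecord:
        name: str                        # TPSP name
        contacts: list[str]              # primary points of contact
        services: list[str]              # specific services provided
        last_review: date                # last date of compliance status review
        pci_dss_compliant: bool = False  # status per latest validation documents
        evidence: list[str] = field(default_factory=list)  # supporting documentation collected

    def overdue_for_review(records: list[TPSPRecord], today: date) -> list[TPSPRecord]:
        """Flag providers whose last status review is more than a year old."""
        return [r for r in records if (today - r.last_review).days > 365]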

By properly implementing a third-party assurance program, a company can help ensure that cardholder data is kept in a safe and compliant manner.

Author: "Jessica Goldenberg"
Date: Wednesday, 13 Aug 2014 23:02

Erin Staab

On July 23, 2014, the Massachusetts Attorney General announced a consent judgment with an out-of-state hospital, Women & Infants Hospital of Rhode Island (“WIH” or the “Hospital”), resolving a lawsuit against WIH for violations of federal and state information security and privacy laws involving the loss of over 12,000 Massachusetts residents’ sensitive patient health records.  The regulations and laws at issue were Mass. G.L. c. 93A, Mass. G.L. c. 93H and its implementing regulations codified at 201 C.M.R. 17.00 et seq., as well as federal regulations under the Health Insurance Portability and Accountability Act (“HIPAA”).

Massachusetts’ data security regulations, 201 C.M.R. 17.00 et seq., are among the most comprehensive in the country.  When the regulations first went into effect in March of 2010, many wondered whether the Massachusetts Attorney General would pursue actions against out-of-state enterprises given the regulations’ unique reach to all “persons” or entities inside or outside of Massachusetts that own or license the personal information of Massachusetts residents.  Since 2010, however, the Massachusetts Attorney General has predominantly focused efforts on data breaches of Massachusetts-based businesses—launching enforcement proceedings against Massachusetts hospitals, a major Boston restaurant group, and a medical billing practice and associated medical providers.

In 2011, WIH misplaced nineteen backup tapes from two prenatal centers—one in Providence, Rhode Island and one in New Bedford, Massachusetts.  The tapes contained personal information and protected health care information, including patients’ names, dates of birth, Social Security numbers, dates of medical examinations, physicians’ names and ultrasound images, for 12,127 Massachusetts residents and approximately 1,200 Rhode Island residents.  The Massachusetts Attorney General’s Office cited to “deficient employee training and internal policies” which prevented the breach from being discovered and reported in a timely manner.  The Hospital did not discover that the tapes were missing until the spring of 2012 and failed to report the breach to consumers and the Massachusetts Attorney General’s Office until the fall of 2012.

The consent agreement requires the Hospital to pay $150,000 to the Commonwealth of Massachusetts and to take steps to ensure compliance with state and federal security laws, including hiring an outside firm to perform audits and maintaining an up-to-date inventory of all locations, custodians, and descriptions of unencrypted electronic media and patient charts containing personal information.  Unlike Massachusetts, however, the Rhode Island Attorney General did not bring a civil suit against WIH, stating that under the Rhode Island identity theft protection law, the Attorney General was satisfied by the actions taken by the hospital to notify Rhode Island residents potentially impacted by the data breach and to offer them one year of credit monitoring.  This may be a sign of Massachusetts’ more aggressive approach to privacy and data security enforcement.

The case is significant because it represents one of the first Massachusetts enforcement actions against an out-of-state entity under both Massachusetts regulation 201 C.M.R. 17.00 and the new provisions of the Health Information Technology for Economic and Clinical Health (“HITECH”) Act.  The HITECH Act provides state attorneys general with the authority to enforce out-of-state violations of HIPAA, including disclosure of Protected Health Information (“PHI”), on behalf of state residents.  Thus, this case also represents the continued efforts of state attorneys general to use their relatively new enforcement power under HITECH to enforce HIPAA.

If this consent judgment is representative of future privacy enforcement proceedings launched by the Massachusetts Attorney General, then businesses outside the Commonwealth that hold Massachusetts residents’ personal information may be well-advised to broadly re-examine their data security procedures, including preventative measures, to avoid running afoul of Massachusetts’ strict data security regulations.  Furthermore, any business entity that handles PHI protected by HIPAA and the HITECH Act may want to undergo a similar internal data security review given the increasing frequency of enforcement proceedings by attorneys general nationwide.

Author: "Erin Staab" Tags: "Data Breaches, Data Privacy Laws, HIPAA,..."
Date: Friday, 08 Aug 2014 15:36

Shawn Ledingham

As we’ve previously reported, cyber risks are an increasingly common threat facing businesses of all kinds.  In a recent speech given at the New York Stock Exchange, SEC Commissioner Luis A. Aguilar emphasized that cybersecurity has grown to be a “top concern” of businesses and regulators alike and admonished companies, and more specifically their directors, to “take seriously their obligation to make sure that companies are appropriately addressing those risks.”

Commissioner Aguilar, in the speech delivered as part of the Cyber Risks and the Boardroom Conference hosted by the New York Stock Exchange’s Governance Services department on June 10, 2014, emphasized the responsibility of corporate directors to consider and address the risk of cyber-attacks.  The commissioner focused heavily on the obligation of companies to implement cybersecurity measures to prevent attacks.  He lauded companies for establishing board committees dedicated to risk management, noting that since 2008, the number of corporations with board-level risk committees responsible for security and privacy risks had increased from 8% to 48%.  Commissioner Aguilar nevertheless lamented what he referred to as the “gap” between the magnitude of cyber-risk exposure faced by companies today and the steps companies are currently taking to address those risks.  The commissioner referred companies to a federal framework for improving cybersecurity published earlier this year by the National Institute of Standards and Technology, which he noted may become a “baseline of best practices” to be used for legal, regulatory, or insurance purposes in assessing a company’s approach to cybersecurity.

Cyber-attack prevention is only half the battle, however.  Commissioner Aguilar cautioned that, despite their efforts to prevent a cyber-attack, companies must prepare “for the inevitable cyber-attack and the resulting fallout.”  An important part of any company’s cyber-risk management strategy is ensuring the company has adequate insurance coverage to respond to the costs of such an attack, including litigation and business disruption costs.

The insurance industry has responded to the increasing threat of cyber-attacks, such as data breaches, by issuing specific cyber insurance policies, while attempting to exclude coverage of these risks from their standard commercial general liability (CGL) policies.  Commissioner Aguilar observed that the U.S. Department of Commerce has suggested that companies include cyber insurance as part of their cyber-risk management plan, but that many companies still choose to forego this coverage.  While businesses without cyber insurance may have coverage under existing policies, insurers have relentlessly fought to cabin their responsibility for claims arising out of cyber-attacks.  Additionally, Commissioner Aguilar’s speech emphasizes that cyber-risk management is a board-level obligation, which may subject directors and officers of companies to the threat of litigation after a cyber-attack, underscoring the importance of adequate D&O coverage.

The Commissioner’s speech offers yet another reminder that companies should seek professional advice in determining whether they are adequately covered for losses and D&O liability arising out of a cyber-attack, both in prospectively evaluating insurance needs and in reacting to a cyber-attack when the risk materializes.

Read Commissioner Aguilar’s full speech here.

Author: "Shawn Ledingham" Tags: "Cyber Security, cyber insurance, cyber r..."
Date: Thursday, 24 Jul 2014 22:50

Courtney Bowman

Over the past decade, the EU has made significant technological and legal strides toward the widespread adoption of electronic identification cards.  An electronic ID card, or e-ID, serves as a form of secure identification for online transactions – in other words, it provides sufficient verification of an individual’s identity to allow that person to electronically sign and submit sensitive documents such as tax returns and voting ballots over the Internet.  Many people see e-IDs as the future of secure identification since they offer the potential to greatly facilitate cardholders’ personal and business transactions, and the EU Commission has recognized this potential by drafting regulations meant to eliminate transactional barriers currently hindering the cards’ cross-border reach.  However, the increasingly widespread use of e-ID systems also gives rise to significant data security concerns.

Countries including Spain, Italy, Germany, and Belgium already have adopted e-ID systems, and the precise mechanics of the systems differ from country to country.  In the Estonian system, for example, each e-ID carries a chip with encrypted files that provide proof of identity when accessed by a card reader (which a cardholder may purchase and connect to his or her computer).  Once the card is inserted into the card reader, the user inputs different PINs to access the appropriate database and electronically sign e-documents.

In fact, as recently detailed in The Economist, the small Baltic country of Estonia has one of Europe’s most highly-developed e-ID systems and exemplifies the underlying potential of this technology.  Around 1.1 million of the country’s 1.3 million residents have electronic ID cards, which they can use to take advantage of the country’s fairly advanced array of e-government offerings.  Estonians can use their e-IDs to go online and securely file their taxes, vote in elections, log into their bank accounts, access governmental databases to check their medical records, and even set up businesses, among many other tasks.  Estonia has even established an e-prescription system that permits doctors to order a refill by forwarding an online renewal notice to a national database, thereby allowing a patient to pick up a prescription from any pharmacy in the country simply by presenting his or her e-ID.  The Estonian government also has announced a plan to start issuing cards to non-Estonians, so that citizens of other countries can easily set up businesses in Estonia or otherwise take advantage of that country’s many e-services.  Estonia’s e-ID system thus illustrates how these cards can enhance convenience and save time that may otherwise be spent waiting in line to file documents in government offices, and it represents a significant step in that country’s efforts to brand itself as “e-Estonia.”

Naturally, the use of these cards to access such large quantities of personal data implicates important data security issues.  Estonia assures its cardholders that their transactions are secure because each card’s files are protected by 2048-bit public key encryption, and because users need to enter multiple PINs to access and use certain online services.  To date, Estonia’s e-ID system has not suffered a major data breach.  Nevertheless, the security of the system has been called into question by researchers who claim that Estonia’s e-voting process is vulnerable to manipulation by skilled hackers.
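
For readers curious what this looks like in code, below is a minimal sketch in Python of an RSA-2048 sign-and-verify flow of the kind described above, using the open-source cryptography library. It is illustrative only: on a real e-ID the private key is generated on, and never leaves, the card’s chip, and the chip signs only after the correct PIN is entered.

    # Minimal sketch of the sign/verify flow behind an e-ID-style electronic
    # signature, using RSA-2048 as described above (illustrative only).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # On a real card this key lives on the chip; here it is generated locally.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    document = b"2014 income tax return, filed electronically"
    signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

    # The relying service verifies the signature against the cardholder's
    # public key (in practice, a certificate); verify() raises on mismatch.
    private_key.public_key().verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("signature verified")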

So what other factors may hinder the deployment of this technology, beyond the large upfront costs of developing an e-ID system and distributing e-ID cards?  As mentioned above, an e-ID system requires the adoption of extensive data security measures to ensure the confidentiality of personal data.  Furthermore, systems like those established by Estonia are so efficient in part because they draw on personal data – including health information – held within government databases.  Citizens of countries with largely privatized medical systems, like the United States, may be much more wary of government efforts to consolidate this type of personal information, even for the sake of efficiency.  Other countries share similar concerns about governmental collection of personal information.  When the U.K. government announced plans to issue ID cards linked to a national identity register, for example, opposition proved so fierce that the government abandoned the project.  Denmark and Ireland also do not issue ID cards to their citizens.

Regardless of this opposition, the European Commission believes that e-IDs will facilitate business within the EU and is dedicated to removing many of the legal barriers hindering the implementation of this technology.  As early as 1999, the Commission issued Directive 1999/93/EC, which provided a framework for the legal recognition of electronic signatures.  And in 2012, the Commission issued its draft regulation on electronic identification and trust services for electronic transactions.  The regulation set forth a mutual recognition scheme mandating that all member states recognize and accept electronic IDs issued in other member states for the purposes of accessing online services.  The regulation would, for example, allow an Italian student attending a German university to pay her school fees online via the university’s German website by using her Italian e-ID.

In sum, e-IDs have the potential to simplify the lives of cardholders – but only if those issuing the cards are willing to take the appropriate security precautions and work to achieve mutual recognition of other countries’ IDs.

Author: "Courtney Bowman" Tags: "European Union, International, Legislati..."
Date: Thursday, 10 Jul 2014 03:51

Marianne Le Moullec

According to the French Data Protection Authority’s (“CNIL”) recently issued activity report, the CNIL was especially busy in 2013. The main topics addressed by the CNIL in 2013 were the creation of a national consumer credit database, the right to be forgotten, the right to refuse cookies, the proposed EU Regulation, and, of course, the revelations concerning the U.S. Prism program and the surveillance of European citizens’ personal data by foreign entities. The report also presents the main issues that the CNIL will tackle in 2014. Such issues include privacy in relation to open data, as well as in relation to new health monitoring or quantified-self apps. The CNIL will also deal with “digital death” and, more specifically, how to handle the social network profiles of deceased persons.

The CNIL’s report starts with what was the central issue in data protection throughout 2013, the U.S. Prism program and more generally any mass surveillance programs of European citizens by foreign entities. The CNIL created a working group on the related subject of long-arm foreign statutes which allow foreign administrations to obtain personal data from French and European citizens. Such statutes have various purposes (combating money laundering, corruption, the financing of terrorism, etc.) and lead to the creation of black lists. In addition, the CNIL addresses those subjects with the other Data Protection Agencies within the Article 29 Working Party.

Another important topic was the proposed creation in France of a centralized national register where all consumer credit lines opened by an individual would have been listed, in order to allow credit companies to verify an individual’s level of debt.  Indeed, consumer credit lines are fairly easily granted in France, and some consumers accumulate credit lines beyond their payment capacities and ultimately default in payment. The CNIL issued a negative opinion on this register, arguing that it breached the proportionality principle of the French law on data protection. Indeed, since only a small minority of people default, it considered that the collection and processing of data from all credit users was disproportionate. The register was nevertheless approved by the Parliament, but was immediately struck down by the French constitutional court in 2014, which, like the CNIL, considered that the register breached the right to privacy.

The CNIL also issued a recommendation in 2013 on how to obtain valid consent for cookies and any type of online tracking device. The CNIL had initially interpreted consent for cookies (resulting from the e-privacy directive) as meaning explicit “opt-in” consent. But the CNIL finally backtracked and issued its 2013 recommendation allowing for opt-out consent, provided that website users are duly informed. In practice, the CNIL recommends the use of a banner on the website, stating that the site uses cookies and listing the purposes of the cookies. The user may click on the banner to refuse some or all cookies. But the banner provides that if the user continues to surf the website, he/she is deemed to have accepted the cookies (which is a form of opt-out consent). Some cookies, including those necessary for the functioning of the website or for security, do not require consent.
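
To illustrate the mechanics, here is a minimal sketch in Python (using Flask) of the banner logic the recommendation describes. It is our own simplification, not the CNIL’s specification; the cookie names and single-page flow are illustrative assumptions.

    # Minimal sketch of the recommended banner logic: non-essential cookies are
    # set only after the visitor has seen the banner and kept browsing (deemed,
    # opt-out consent), and never if the visitor clicked "Refuse".
    from flask import Flask, make_response, render_template_string, request

    app = Flask(__name__)

    PAGE = """
    {% if show_banner %}
    <div class="cookie-banner">This site uses cookies for audience measurement
    and advertising. <a href="/refuse-cookies">Refuse</a></div>
    {% endif %}
    <p>Page content...</p>
    """

    @app.route("/")
    def index():
        seen = request.cookies.get("banner_seen")
        refused = request.cookies.get("cookies_refused")
        resp = make_response(render_template_string(PAGE, show_banner=not seen))
        if seen and not refused:
            # Visitor kept browsing after seeing the banner: deemed consent,
            # so non-essential cookies may now be set.
            resp.set_cookie("analytics_id", "abc123")
        resp.set_cookie("banner_seen", "1")  # functional cookie, no consent needed
        return resp

    @app.route("/refuse-cookies")
    def refuse():
        resp = make_response("Cookies refused for this browser.")
        resp.set_cookie("cookies_refused", "1")
        return resp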

With regard to auditing and sanctions in 2013, the CNIL’s priorities remained training, promoting awareness of data protection, and issuing guidance for companies. Imposing financial penalties remains an exception. Statistics on the CNIL’s auditing and sanctions activities in 2013 demonstrate this quite clearly:

5,640 complaints: Complaints to the CNIL were stable in 2013. The CNIL attributes this stability to its new guidance, available on its website, which deals with common issues such as video surveillance and direct marketing and helps companies to comply.

414 audits: 75% of the CNIL’s audits in 2013 were of private companies, and 25% were of public administrations. Many audits occurred after a complaint was filed with the CNIL (33% of the audits), but audits were also conducted at the initiative of the CNIL (27%) or following a previous sanction, to make sure that the companies were now compliant (16%). Finally, 24% of the audits were devoted to sectors chosen by the CNIL: in 2013, companies dealing with open data as well as surveys were audited, and the social services administration was also audited.

14 decisions with sanctions: These comprised 7 warnings and only 7 financial penalties.

For 2014, the CNIL has identified four major topics: open data, health data, “digital death,” and prisoners’ privacy. On open data, the CNIL will audit the current legal framework and will propose improvements; the CNIL itself wishes to open its own data (rendered anonymous) to the public. With regard to health data, the CNIL will investigate the privacy impact of apps and other tools (“quantified self”) that allow individuals to monitor their health and physical activity. On “digital death,” the CNIL will address in particular how to deal with the data of a deceased person. Finally, the CNIL will conduct audits of the penitentiary administration in order to verify whether prisoners’ right to privacy is respected.

Author: "Marianne Le Moullec" Tags: "International, CNIL, French Data Protect..."
Date: Tuesday, 01 Jul 2014 13:47

Cecile Martin

In France, before implementing a whistleblowing process, a company must inform and consult with its employees’ representatives, inform its employees and notify the French Data Protection Agency (CNIL).

There are two possible ways to notify the CNIL of a whistleblowing system:

  1. request a formal authorization from the CNIL (this is quite burdensome and difficult to obtain), or
  2. opt for the standard whistleblowing authorization (AU-004).

The standard whistleblowing authorization (AU-004) was enacted by the French Data Protection Agency in 2005 in order to facilitate notifying the CNIL of whistleblowing systems. As long as the company undertakes to comply with the principles and scope of the standard authorization, it is automatically authorized to implement the whistleblowing system. As enacted in 2005, the types of wrongdoings that could be reported through a whistleblowing system under the standard authorization were quite broad. Companies were authorized to adopt whistleblowing systems for purposes of regulatory internal control requirements, to comply with French law requirements and the United States Sarbanes-Oxley Act, and to protect vital interests of the company or the physical or psychological integrity of its employees.

However, in 2010, the CNIL had to modify the scope of the wrongdoings that could be reported under a standard whistleblowing authorization, pursuant to a decision of the French Supreme Court dated December 8, 2009 (see our post of December 15, 2010: http://privacylaw.proskauer.com/2010/12/articles/data-privacy-laws/french-data-protection-agency-restricts-the-scope-of-the-whistleblowing-procedures-multinational-companies-need-to-make-sure-they-are-compliant/). In order to comply with the French Supreme Court decision, the CNIL narrowed whistleblowing reporting under the standard authorization to the following types of wrongdoings:

  • Accounting;
  • Finance;
  • Banking;
  • Anti-corruption;
  • Competition;
  • Companies concerned with the U.S. Sarbanes-Oxley Act, section 301(4); and
  • Japanese SOX of June 6, 2006.

The scope of the standard authorization was therefore very limited, requiring companies needing a broader scope of whistleblowing reporting to obtain a formal authorization from the CNIL and therefore to face the risk of a refusal.

From 2011 to 2013, given the scope limits of the standard authorization, the CNIL has had to process a high volume of filings for formal authorizations to implement whistleblowing systems.

Given the increased volume of requests from companies, on January 30, 2014, the CNIL decided to modify the scope of the standard whistleblowing authorization (AU-004) once again, this time to widen it.

As a consequence, companies implementing whistleblowing systems in France covering the following categories can benefit from the new standard authorization:

  • Finance;
  • Accounting;
  • Banking;
  • Anti-corruption;
  • Competition;
  • Discrimination and bullying at work;
  • Health and safety at work; and
  • Environment protection.

In its updated standard whistleblowing authorization, the CNIL also stated its preference against anonymous whistleblowing. Anonymous whistleblowing is allowed only if:

  • The facts are serious and the factual elements are sufficiently detailed; and
  • The treatment of the alert is subject to particular precautions, such as a prior review before it is circulated through the whistleblowing process.

Author: "Cecile Martin"
Date: Monday, 30 Jun 2014 17:40

Kristen J. Mathews

On June 25, 2014, the Supreme Court unanimously ruled that police must first obtain a warrant before searching the cell phones of arrested individuals, except in “exigent circumstances.” Chief Justice John Roberts authored the opinion, which held that an individual’s Fourth Amendment right to privacy outweighs the interest of law enforcement in conducting searches of cell phones without a warrant. The decision resolved a split among state and federal courts on the search incident to arrest doctrine (which permits police to search an arrested individual without a warrant) as it applies to cell phones.

As heard before the Supreme Court, Riley v. California combined two cases, one involving a smartphone and the other a flip phone. In the first, Riley v. California, the police arrested David Leon Riley, searched his smartphone, and found photographs and videos potentially connecting him to gang activity and an earlier shooting. In the second, United States v. Wurie, Brima Wurie was arrested for allegedly dealing drugs, and incoming calls on his flip phone helped lead the police to a house used to store drugs and guns.

Roberts wrote that neither of the two justifications for warrantless searches – protecting police officers and preventing the destruction of evidence – applies in the context of cell phones. According to the Court, the justification of protecting police officers falls flat because data on a cell phone cannot be used as a weapon. Roberts was also not persuaded by concerns that criminals could destroy evidence through remote wiping. He pointed out that police have alternatives to a warrantless search to prevent the destruction of evidence, including turning the phone off, removing its battery, or placing the phone in a “Faraday bag,” an aluminum foil bag that blocks radio waves.

The Chief Justice focused on the differences between modern cell phones and other physical items found on arrested individuals to support his argument that modern cell phones “implicate privacy concerns far beyond those implicated by the search of a cigarette pack, a wallet, or a purse.” He cited modern cell phones’ huge storage capacity and how they function as “minicomputers that…could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers.” Roberts also noted that data viewed on a phone is frequently not stored on the device itself, but on remote servers, and that officers searching a phone generally do not know the location of data they are viewing.

However, Roberts maintained that exigent circumstances could still justify warrantless searches of cell phones on a case-by-case basis. Such circumstances include preventing the imminent destruction of evidence in individual cases, pursuing a fleeing suspect, and providing assistance to people who are seriously injured or threatened with imminent injury.

Roberts’ opinion is in line with the Court’s stance in the 2012 case United States v. Jones, which held that installing a GPS device on a vehicle and using the device to track the vehicle constitutes a search under the Fourth Amendment.

Justice Samuel Alito concurred in the judgment and agreed with Roberts that the old rule should not be applied mechanically to modern cell phones. However, he made two points that diverged from Roberts’ opinion. First, he disagreed with the idea that the old rule on searches incident to arrest was primarily based on the two justifications of protecting police and preventing destruction of evidence. Second, he argued that if Congress or state legislatures pass future legislation on searching cell phones found on arrested individuals, the Court should defer to their judgment.

The Riley opinion recognizes the unique role that cell phones play in modern life (“such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude that they were an important feature of the human anatomy”) and that they “hold for many Americans ‘the privacies of life.’”

Special thanks to Tiffany Quach, 2014 summer associate, for her assistance in preparing this post.

Author: "Kristen J. Mathews"
Date: Wednesday, 25 Jun 2014 05:23

Jeremy M. Mittman

On July 2, 2014, Singapore’s new Personal Data Protection Act (the “PDPA” or the “Act”) will go into force, requiring companies that have a physical presence in Singapore to comply with many new data protection obligations under the PDPA. Fortunately, in advance of the Act’s effective date, Singapore’s Personal Data Protection Commission recently promulgated the Personal Data Protection Regulations (2014) (the “Regulations”) to clarify companies’ obligations under the Act.

Under the PDPA, an individual may request access to, and correction of, the personal data that an organization subject to the Act holds about him or her. The Regulations clarify that the request must be made in writing and must include sufficient identifying information for the organization to process it. The Regulations also specify that requests for access or correction should be made to the company’s Data Protection Officer (which companies are now required to appoint under the Act). Under the Regulations, an organization must respond to a request for access to personal data “as soon as practicable,” but if it anticipates that doing so will take longer than 30 days, the organization must so inform the individual within that 30-day period.

The Regulations confirm that individuals are entitled to expansive access rights under the Act: a company must provide them with access to all personal data requested, as well as “use and disclosure information in documentary form”. If that is not possible, however, the organization can provide the applicant with a “reasonable opportunity to examine the personal data and use and disclosure information.”

Perhaps in an effort to reduce the burden and expense of complying with access requests, the Regulations provide that an organization may charge an individual a “reasonable fee” to respond to a request for access to the personal data it holds about that individual, provided it has previously communicated an estimate of the fee to the applicant.

The Regulations also contain a number of details regarding the transfer of personal data outside Singapore.  Specifically, the Regulations clarify that before transferring personal data to another jurisdiction, the transferring organization in Singapore must ensure that the recipient is “legally bound by enforceable obligations… to provide to the transferred personal data a standard of protection that is at least comparable to the protection under the Act.”

“Enforceable obligations” under the PDPA are similar to those under European Union law, and include the existence of a comparable data protection law, a written contract that provides sufficient protections, and “binding corporate rules.”

The Regulations (together with the recently issued Advisory Guidelines On Key Concepts In The Personal Data Protection Act, revised on 16 May 2014) now provide much-needed guidance to help companies comply with their new data protection obligations under the Act.

Author: "Jeremy M. Mittman"
Date: Tuesday, 24 Jun 2014 13:17

David Munkittrick

After a decision denying class certification last week, claims by Hulu users that their personal information was improperly disclosed to Facebook are limited to the individual named plaintiffs (at least for now, as the decision was without prejudice).

The plaintiffs alleged Hulu violated the federal Video Privacy Protection Act by configuring its website to include a Facebook “like” button. This functionality used cookies that disclosed users’ information to Facebook. But the U.S. District Court for the Northern District of California credited expert evidence presented by Hulu that three things could stop the cookies from transmitting information: 1) if the Facebook “keep me logged in” feature was not activated; 2) if the user manually cleared cookies after his or her Facebook and Hulu sessions; or 3) if the user used cookie-blocking or ad-blocking software.

In its decision, the court ruled that these methods of stopping disclosure of information rendered the proposed class insufficiently ascertainable.  To maintain a class action, a class must be sufficiently ascertainable by reference to objective criteria.  Plaintiffs argued that the class membership could be ascertained by submission of affidavits from each class member, but the court reasoned that individual subjective memories of computer usage may not be reliable, particularly given the availability of $2,500 statutory damages per person under the Video Privacy Protection Act.  Thus, the court found plaintiffs had not defined an ascertainable class.

Additionally, because determining whether each member of the putative class used any one of these methods to block disclosure would require an individual inquiry, the court found that individual issues predominated and would preclude certification of the class as defined. Hulu also argued that the total potential damages were out of proportion to any actual damages suffered and thus violated due process, but the court noted that while the argument might have some merit, the issue was not ripe given the denial of class certification.

While the case, In re Hulu Privacy Litigation, can still continue on behalf of the named plaintiffs, the court left open the possibility of proceeding with subclasses or a narrower class definition.  Consequently, additional class certification practice is likely in this closely watched case.

Author: "David Munkittrick" Tags: "Data Privacy Laws, Online Privacy, Priva..."
Date: Monday, 28 Apr 2014 18:12

Courtney Bowman

Last month, a federal district court in the Northern District of California issued an order that may affect the policies of any company that records telephone conversations with consumers.

The trouble started when plaintiff John Lofton began receiving calls from Collecto, Verizon’s third-party collections agency, on his cell phone. The calls were made in error – Lofton did not owe Verizon any money because he was not even a Verizon customer – but Lofton decided to take action when he discovered that Collecto had been recording its conversations with him without prior notice. Lofton brought a class action against Verizon under California’s Invasion of Privacy Act, theorizing that Verizon was vicariously responsible for Collecto’s actions because Collecto was Verizon’s third-party vendor and because Verizon’s call-monitoring disclosure policy did not require the disclosure of recordings in certain situations. Verizon filed a motion to dismiss, arguing that the recordings did not invade Lofton’s privacy and therefore did not run afoul of the statute.

The court denied the motion to dismiss, holding that the statutory language of § 632.7 of the Invasion of Privacy Act bans the recording of all calls made to cell phones – not just confidential or private calls – without prior notice. The statute’s treatment of cell phones thus diverges from its treatment of landlines: recordings of calls made to landlines must be preceded by notice only if the call is “confidential.”

Though the case is ongoing, this ruling indicates that Lofton v. Verizon Wireless (VAW) LLC ultimately may have a significant impact on how companies interact with consumers over the phone. First, the prevalence of cell phones means that companies should assume § 632.7 applies to a large percentage of their calls with consumers – not only because it is highly likely that these consumers use cell phones instead of landlines, but because it may be difficult for a company to tell whether these consumers are in California and subject to § 632.7. Second, the order indicates that companies may be held responsible for their third-party vendors’ lack of disclosure, meaning that companies should require their third-party vendors to refrain from recording phone conversations without prior notice, and should monitor the vendors for compliance with this requirement. In sum, when it comes to providing prior notice of recordings to consumers, companies shouldn’t phone it in: they and any third-party vendors should err on the side of disclosure to avoid legal hang-ups down the line.

Author: "Courtney Bowman" Tags: "California, Invasion of Privacy, Mobile ..."
Date: Friday, 31 Jan 2014 15:43

Marianne Le Moullec

After two years of investigation and proceedings regarding Google’s privacy policy, European Data Protection Authorities (DPAs) are now reaching their final decisions against Google. On January 3, 2014, the French DPA (“CNIL”) issued a decision ruling that Google’s privacy policy did not comply with French data protection law and imposed a fine of €150,000 (http://www.cnil.fr/english/news-and-events/news/article/the-cnils-sanctions-committee-issues-a-150-000-EUR-monetary-penalty-to-google-inc/). Google has appealed the CNIL’s decision.

This is the second decision by a European DPA fining Google over the lack of compliance of its privacy policy: on December 19, 2013, the Spanish DPA (“AEPD”) ruled that Google had committed three serious violations of the Spanish Data Protection law and ordered Google to pay a fine of €300,000 for each of the three violations (http://www.agpd.es/portalwebAGPD/revista_prensa/revista_prensa/2013/notas_prensa/common/diciembre/131219_PR_AEPD_PRI_POL_GOOGLE.pdf). More decisions are expected.

A European-wide investigation against Google’s privacy policy

The decisions rendered by the French and Spanish DPAs are the result of a joint investigation by European DPAs, launched in early 2012 when Google announced that it was about to replace the individual privacy policies of each of its products and services with a single privacy policy. The Article 29 Working Party, an independent advisory body whose members are the DPAs of the 28 European Member States, immediately expressed privacy concerns and decided to launch an investigation on behalf of all European DPAs. The Article 29 Working Party rendered its findings in October 2012, stating that Google’s privacy policy did not comply with the European Directive on Data Protection for several reasons: 1) it did not inform users of the types of data collected or the purposes of the collection; 2) it combined, without authorization, data collected by Google’s various services; and 3) it did not specify data retention periods. The Article 29 Working Party issued recommendations that Google refused to implement, which led six European DPAs (Germany, France, Italy, Spain, the Netherlands, and the UK) to decide, in April 2013, to simultaneously launch legal actions against Google (http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/20130227_pr_google_privacy_policy_en.pdf).

So far, Spain and France are the only two DPAs that have issued fines against Google. The Dutch DPA has found Google’s privacy policy in breach of Dutch privacy law but has not yet issued sanctions (http://www.dutchdpa.nl/Pages/pb_20131128-google-privacypolicy.aspx). The investigations are still ongoing in the other Member States (Germany, the UK and Italy).

Does French Law apply to Google’s privacy policy?

This was the first issue the CNIL had to decide. The territorial scope of French law derives from the rules set out in EC Directive n°95/46. Hence, French law applies either because 1) the data controller carries out its activity through an establishment in France, or 2) the data controller is established neither in France nor in the EU, but uses “means of processing” personal data located in France to collect data.

Google claimed that French law did not apply because Google Inc. in California is solely responsible for data collection and processing, and Google France is not involved in any activity related to the data processing performed by Google Inc.

The CNIL rejected this argument, reasoning that Google France is involved in the sale of targeted advertising, the value of which is based on the collection of Internet users’ data. Hence, Google France is involved in the activity of personal data processing, even though it does not perform the technical processing itself. The CNIL’s reasoning is similar to the argument developed by the Advocate General in the case currently opposing Google and the Spanish DPA before the European Court of Justice (“ECJ”) (http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=198456/). The ECJ’s ruling on this issue is eagerly awaited.

In addition, the CNIL ruled that Google Inc. placed cookies on the computers of French users, and that such cookies were “means of processing” personal data located in France because they are used to collect data from users’ computers. Therefore, even if Google Inc. were considered the sole data controller, French law would nevertheless apply because of the location of the cookies in France.

Are all data collected by Google “personal data” within the meaning of French and EU Law?

One of the main issues is the distinction drawn by Google among “authenticated users”, who have registered an ID to use services such as Gmail; “unauthenticated users”, who use services that do not require identification, such as YouTube; and “passive users”, who visit third-party websites where Google has placed Analytics cookies for targeted advertising.

According to Google, it holds “personal data” only on “authenticated users”, not on “unauthenticated users” or “passive users”. The CNIL rejected this argument because the definition of personal data under French law includes information that indirectly identifies a person. The CNIL considers that, even if the user’s name is not collected, collecting an IP address combined with precise and detailed information on the computer’s browsing history amounts to indirectly identifying a person, because it reveals precise information about the person’s interests, daily life and life choices.

The CNIL therefore considers all data collected by Google to be personal data.

Why does Google’s privacy policy breach French Data Protection Law?

The CNIL, following the findings of the Article 29 Working Party, found four breaches of French law on data protection.

First, Google’s privacy policy fails to properly inform users of the collection of their personal data and its purposes. Google unified into a single document the privacy policies applicable to more than 60 services and products. With regard to the “purposes” of the data collection, Google’s policy states only general purposes, such as the proper performance of the services, without further explanation or detail. This is considered too vague to inform users, especially given the variety of services offered by Google and the various types of data collected.

Second, Google should have informed users and obtained their consent before placing advertising cookies on their terminals. Obtaining consent for cookies does not require opt-in consent from the user, but before the cookies are placed on the terminal the user must be properly informed of their purposes and of how to refuse them. The CNIL found that, with regard to unauthenticated users, Google placed cookies before providing any information, in breach of French Data Protection law. In addition, the information provided to users is insufficient. Only two Google services (Search and YouTube) display a banner with information on cookies, and little information is given regarding the purposes of the cookies: stating that cookies are meant “to ensure proper performance of the services” is not sufficient to obtain “informed consent” from the user. With regard to “passive users” who visit third-party websites where Google has placed its “Analytics” cookies, the CNIL considers that, since Google uses the collected data for its own activity (producing statistics and improving its services), it acts as a data controller and is responsible for obtaining consent.

Third, Google has not defined how long it retains the data it collects and has not implemented any automatic process for deleting data. For example, no information is available as to how long data is kept once an authenticated user has canceled his or her account.

Finally, combining data collected from one Google service with data collected from other Google services requires informed, explicit and specific consent from the user. The CNIL ruled that Google breached this obligation because it did not provide detailed information on the types of data combination it performs and did not seek explicit and specific consent from the user. Consent to the general privacy policy or the terms and conditions of use is not considered sufficient.

Author: "Marianne Le Moullec" Tags: "Data Privacy Laws, European Union, Onlin..."
Date: Thursday, 19 Dec 2013 02:46

Kristen J. Mathews

Based on a December 3rd decision by the Second Circuit Court of Appeals, class actions under the Telephone Consumer Protection Act (TCPA) can now be brought in New York federal court. This decision marks a reversal of Second Circuit precedent, and will likely increase the number of TCPA class actions being filed in New York. Companies should review their telemarketing practices and procedures in light of the potential statutory penalties under the TCPA.

Author: "Kristen J. Mathews" Tags: "TCPA, telemarketing, telephone consumer ..."