

Date: Thursday, 10 Jul 2014 03:51

Marianne Le Moullec

According to the French Data Protection Authority’s (“CNIL”) recently issued activity report for 2013 (http://www.cnil.fr/fileadmin/documents/La_CNIL/publications/CNIL_34e_Rapport_annuel_2013.pdf), the CNIL was especially busy in 2013. The main topics it addressed were the creation of a national consumer credit database, the right to be forgotten, the right to refuse cookies, the proposed EU Regulation, and, of course, the revelations concerning the U.S. Prism program and the surveillance of European citizens’ personal data by foreign entities. The report also presents the main issues that the CNIL will tackle in 2014, including privacy in relation to open data and to new health monitoring or “quantified self” apps. The CNIL will also address “digital death” and, more specifically, how to handle the social network profiles of deceased persons.

The CNIL’s report starts with the central issue in data protection throughout 2013: the U.S. Prism program and, more generally, mass surveillance programs targeting European citizens run by foreign entities. The CNIL created a working group on the related subject of long-arm foreign statutes that allow foreign administrations to obtain the personal data of French and European citizens. Such statutes have various purposes (combating money laundering, corruption, the financing of terrorism, etc.) and lead to the creation of black lists. In addition, the CNIL addresses these subjects with the other Data Protection Authorities within the Article 29 Working Party.

Another important topic was the proposed creation in France of a centralized national register listing all consumer credit lines opened by an individual, intended to allow credit companies to verify an individual’s level of debt.  Consumer credit lines are fairly easily granted in France, and some consumers accumulate credit beyond their payment capacity and ultimately default. The CNIL issued a negative opinion on this register, arguing that it breached the proportionality principle of the French data protection law: since only a small minority of borrowers default, it considered the collection and processing of data on all credit users disproportionate. The register was nevertheless approved by the Parliament, but was immediately struck down by the French Constitutional Court in 2014, which, like the CNIL, considered that the register breached the right to privacy.

The CNIL also issued a recommendation in 2013 on how to obtain valid consent for cookies and other online tracking devices. The CNIL had initially interpreted consent for cookies (resulting from the e-Privacy Directive) as requiring explicit “opt-in” consent, but it ultimately backtracked and issued its 2013 recommendation allowing for opt-out consent, provided that website users are duly informed. In practice, the CNIL recommends the use of a banner on the website, stating that the site uses cookies and listing their purposes. The user may click on the banner to refuse some or all cookies, but if the user continues to browse the website, he or she is deemed to have accepted the cookies (a form of opt-out consent). Some cookies, including those necessary for the functioning of the website or for security, do not require consent.
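The consent mechanics the CNIL describes can be pictured as a small piece of client-side logic. The sketch below is purely illustrative, not taken from any CNIL text: the category names and the `maySetCookie` function are hypothetical, and it simply assumes three situations: exempt cookies (functional/security) are always allowed, other cookies are allowed only once the informed user has continued browsing without refusing them, and refused categories are never set.

```typescript
// Hypothetical model of the CNIL's 2013 opt-out banner recommendation.
// Exempt cookies (strictly necessary / security) never require consent;
// all other cookies wait for implied consent via continued browsing.

type CookieCategory = "functional" | "security" | "advertising" | "analytics";

interface ConsentState {
  bannerShown: boolean;         // user has been informed (banner displayed)
  continuedBrowsing: boolean;   // user kept surfing after the banner => opt-out consent
  refused: Set<CookieCategory>; // categories refused via the banner
}

const EXEMPT: ReadonlySet<CookieCategory> = new Set<CookieCategory>([
  "functional",
  "security",
]);

function maySetCookie(category: CookieCategory, state: ConsentState): boolean {
  // Necessary/security cookies may always be set.
  if (EXEMPT.has(category)) return true;
  // Other cookies require prior information plus implied consent,
  // and must not have been refused by the user.
  return (
    state.bannerShown &&
    state.continuedBrowsing &&
    !state.refused.has(category)
  );
}
```

On this model, placing an advertising cookie before the banner has been displayed fails the check, which is exactly the practice the CNIL later faulted Google for in its January 2014 decision.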

With regard to the CNIL’s auditing and sanctions activity in 2013, the CNIL’s priorities remained training, promoting awareness of data protection, and issuing guidance for companies. Imposing financial penalties remains the exception. The statistics of the CNIL’s 2013 auditing and sanctions activity demonstrate this quite clearly:

5,640 complaints: The number of complaints to the CNIL was stable in 2013. The CNIL attributes this stability to the new guidance available on its website, which addresses common issues such as video surveillance and direct marketing and helps companies comply.

414 audits: 75% of the CNIL’s audits in 2013 concerned private companies, and 25% concerned public administrations. Many audits took place after a complaint was filed with the CNIL (33%), but audits were also conducted at the CNIL’s own initiative (27%) or as a follow-up to a previous sanction, to verify that the companies had become compliant (16%). Finally, 24% of the audits were devoted to sectors chosen by the CNIL: in 2013, companies dealing with open data as well as survey companies were audited, as was the social services administration.

14 decisions with sanctions: This includes 7 warnings and only 7 financial penalties.

For 2014, the CNIL has identified four major topics: open data, health data, “digital death”, and prisoners’ privacy. On open data, the CNIL will review the current legal framework and propose improvements; the CNIL itself wishes to open its own (anonymized) data to the public. With regard to health data, the CNIL will investigate the privacy impact of apps and other “quantified self” tools that allow individuals to monitor their health and physical activity. The CNIL will also address “digital death”, in particular how to deal with the data of a deceased person. Finally, the CNIL will conduct audits of the penitentiary administration in order to verify whether prisoners’ right to privacy is respected.

Author: "Marianne Le Moullec" Tags: "International, CNIL, French Data Protect..."
Date: Tuesday, 01 Jul 2014 13:47

Cecile Martin

In France, before implementing a whistleblowing process, a company must inform and consult with its employees’ representatives, inform its employees and notify the French Data Protection Agency (CNIL).

There are two possible ways to notify the CNIL of a whistleblowing system:

  1. request a formal authorization from the CNIL (this is quite burdensome and difficult to obtain), or
  2. opt for the standard whistleblowing authorization (AU-004).

The standard whistleblowing authorization (AU-004) was enacted by the French Data Protection Agency in 2005 in order to facilitate notifying the CNIL of whistleblowing systems. As long as the company undertakes to comply with the principles and scope of the standard authorization, it is automatically authorized to implement the whistleblowing system. As enacted in 2005, the types of wrongdoings that could be reported through a whistleblowing system under the standard authorization were quite broad. Companies were authorized to adopt whistleblowing systems for purposes of regulatory internal control requirements, to comply with French law requirements and the United States Sarbanes-Oxley Act, and to protect vital interests of the company or the physical or psychological integrity of its employees.

However, in 2010, the CNIL had to modify the scope of the wrongdoings which could be reported when using a standard whistleblowing authorization pursuant to a decision of the French Supreme Court dated December 8, 2009 (see our post of December 15th, 2010: http://privacylaw.proskauer.com/2010/12/articles/data-privacy-laws/french-data-protection-agency-restricts-the-scope-of-the-whistleblowing-procedures-multinational-companies-need-to-make-sure-they-are-compliant/). In order to comply with the French Supreme Court decision, the CNIL narrowed whistleblowing reporting under the standard authorization to the following types of wrongdoings:

  • Accounting;
  • Finance;
  • Banking;
  • Anti-corruption;
  • Competition;
  • Companies concerned with the U.S. Sarbanes-Oxley Act, section 301(4); and
  • Japanese SOX of June 6, 2006.

The scope of the standard authorization was therefore very limited, and companies needing broader whistleblowing reporting had to seek a formal authorization from the CNIL, with the attendant risk of refusal.

From 2011 to 2013, given the scope limits of the standard authorization, the CNIL has had to process a high volume of filings for formal authorizations to implement whistleblowing systems.

Given the increased volume of requests from companies, on January 30, 2014, the CNIL decided to once again modify the scope of the standard whistleblowing authorization (AU-004), this time to widen it.

As a consequence, companies implementing whistleblowing systems in France within the following categories can benefit from the new standard authorization:

  • Finance;
  • Accounting;
  • Banking;
  • Anti-corruption;
  • Competition;
  • Discrimination and bullying at work;
  • Health and safety at work; and
  • Environment protection.

In its updated standard whistleblowing authorization, the CNIL also stated its preference against anonymous whistleblowing. Anonymous whistleblowing is allowed only if:

  • The facts are serious and the factual elements are sufficiently detailed; and
  • The treatment of the alert is subject to particular precautions, such as prior review before it is circulated through the whistleblowing process.

Author: "Cecile Martin" Tags: "Data Privacy Laws, CNIL, France, whistle..."
Date: Monday, 30 Jun 2014 17:40

Kristen J. Mathews

Special thanks to Tiffany Quach, 2014 summer associate, for her assistance in preparing this post.

On June 25, 2014, the Supreme Court unanimously ruled that police must first obtain a warrant before searching the cell phones of arrested individuals, except in “exigent circumstances.” Chief Justice John Roberts authored the opinion, which held that an individual’s Fourth Amendment right to privacy outweighs the interest of law enforcement in conducting searches of cell phones without a warrant. The decision resolved a split among state and federal courts on the search incident to arrest doctrine (which permits police to search an arrested individual without a warrant) as it applies to cell phones.

The case of Riley v. California as heard before the Supreme Court combined two cases, one involving a smartphone and the other involving a flip phone. In the first case, Riley v. California, the police arrested David Leon Riley, searched his smartphone, and found photographs and videos potentially connecting him to gang activity and an earlier shooting. In the second case, United States v. Wurie, Brima Wurie was arrested for allegedly dealing drugs, and incoming calls on his flip phone helped lead the police to a house used to store drugs and guns.

Roberts wrote that neither of the two justifications for warrantless searches – protecting police officers and preventing the destruction of evidence – applies in the context of cell phones. According to the Court, the justification of protecting police officers falls flat since data on a cell phone cannot be used as a weapon. Roberts was also not persuaded by concerns that criminals could destroy evidence through remote wiping. He pointed out that police have alternatives to a warrantless search in order to prevent the destruction of evidence, including: turning the phone off, removing its battery, or placing the phone in a “Faraday bag,” an aluminum foil bag that blocks radio waves.

The Chief Justice focused on the differences between modern cell phones and other physical items found on arrested individuals to support his argument that modern cell phones “implicate privacy concerns far beyond those implicated by the search of a cigarette pack, a wallet, or a purse.” He cited modern cell phones’ huge storage capacity and how they function as “minicomputers that…could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers.” Roberts also noted that data viewed on a phone is frequently not stored on the device itself, but on remote servers, and that officers searching a phone generally do not know the location of data they are viewing.

However, Roberts maintained that exigent circumstances could still justify warrantless searches of cell phones on a case-by-case basis. Such circumstances include: preventing imminent destruction of evidence in individual cases, pursuing a fleeing suspect, and providing assistance to people who are seriously injured or are threatened with imminent injury.

Roberts’ opinion is in line with the Court’s stance in the 2012 case United States v. Jones, which held that installing a GPS device on a vehicle and using it to track the vehicle constitutes a search under the Fourth Amendment.

Justice Samuel Alito concurred in the judgment and agreed with Roberts that the old rule should not be applied mechanically to modern cell phones. However, he made two points that diverged from Roberts’ opinion. First, he disagreed with the idea that the old rule on searches incident to arrest was primarily based on the two justifications of protecting police and preventing destruction of evidence. Second, he argued that if Congress or state legislatures pass future legislation on searching cell phones found on arrested individuals, the Court should defer to their judgment.

The Riley opinion recognizes the unique role that cell phones play in modern life (“such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude that they were an important feature of the human anatomy”) and that they “hold for many Americans ‘the privacies of life.’”

Author: "Kristen J. Mathews" Tags: "Fourth Amendment, Riley v. California, U..."
Date: Wednesday, 25 Jun 2014 05:23

Jeremy M. Mittman

On July 2, 2014, Singapore’s new Personal Data Protection Act (the “PDPA” or the “Act”) will come into force, requiring companies that have a physical presence in Singapore to comply with many new data protection obligations. Fortunately, in advance of the Act’s effective date, the Singapore Personal Data Protection Commission recently promulgated the Personal Data Protection Regulations 2014 (the “Regulations”) to clarify companies’ obligations under the Act.

Under the PDPA, an individual may request from an organization that is subject to the Act access to, and correction of, the personal data that the organization holds about that individual.  The Regulations clarify that the request must be made in writing and must include sufficient identifying information in order for the organization to process the request.  The Regulations also specify that the request for access or correction should be made to the company’s Data Protection Officer (which companies are now required to appoint under the Act).  Under the Regulations, an organization must respond to the request for access to personal data “as soon as practicable” but if it is anticipated that it will take longer than 30 days to do so, the organization must so inform the individual within that 30 day period.  

The Regulations confirm that individuals are entitled to expansive access rights under the Act: a company must provide them with access to all personal data requested, as well as “use and disclosure information in documentary form”. If that is not possible, however, the organization can provide the applicant with a “reasonable opportunity to examine the personal data and use and disclosure information.”

Perhaps in an effort to reduce the burden and expense to organizations complying with an access request by an individual, the Regulations provide that an organization may charge an individual a “reasonable fee” to respond to an individual’s request for access to the personal data the company holds related to the individual, provided it has previously communicated an estimate of the fee to the applicant. 

The Regulations also contain a number of details regarding the transfer of personal data outside Singapore.  Specifically, the Regulations clarify that before transferring personal data to another jurisdiction, the transferring organization in Singapore must ensure that the recipient is “legally bound by enforceable obligations… to provide to the transferred personal data a standard of protection that is at least comparable to the protection under the Act.”

“Enforceable obligations” under the PDPA are similar to those under European Union law, and include the existence of a comparable data protection law, a written contract that provides for sufficient protections, and “binding corporate rules.”

The Regulations (together with recently issued Advisory Guidelines On Key Concepts In The Personal Data Protection Act (revised on 16 May 2014)) now provide much needed guidance in helping companies comply with their new data protection obligations under the Act.

Author: "Jeremy M. Mittman" Tags: "International"
Date: Tuesday, 24 Jun 2014 13:17

David Munkittrick

After a decision denying class certification last week, claims by Hulu users that their personal information was improperly disclosed to Facebook are limited to the individual named plaintiffs (at least for now, as the decision was without prejudice).

The plaintiffs alleged Hulu violated the federal Video Privacy Protection Act by configuring its website to include a Facebook “like” button.  This functionality used cookies that disclosed users’ information to Facebook.  But the U.S. District Court for the Northern District of California credited expert evidence presented by Hulu that any of three things could stop the cookies from transmitting information: 1) the Facebook “keep me logged in” feature was not activated; 2) the user manually cleared cookies after his or her Facebook and Hulu sessions; or 3) the user used cookie-blocking or ad-blocking software.
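As a rough illustration of why class membership turned on individual browser behavior, the court’s three blocking conditions can be modeled as a single predicate. The function and field names below are hypothetical, introduced only for this sketch of the evidence the court credited:

```typescript
// Hypothetical model of the three conditions the court credited:
// the cookies transmitted a user's information to Facebook only if
// none of the blocking conditions held for that user.

interface BrowserUsage {
  keptLoggedInToFacebook: boolean;     // Facebook "keep me logged in" active
  clearedCookiesAfterSessions: boolean; // user cleared cookies after Facebook/Hulu sessions
  usedBlockingSoftware: boolean;        // cookie-blocking or ad-blocking software
}

function cookiesTransmitted(u: BrowserUsage): boolean {
  return (
    u.keptLoggedInToFacebook &&
    !u.clearedCookiesAfterSessions &&
    !u.usedBlockingSoftware
  );
}
```

Each field is a subjective fact about an individual user’s habits, which is precisely why the court doubted that affidavits could reliably ascertain class membership.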

In its decision, the court ruled that these methods of stopping disclosure of information rendered the proposed class insufficiently ascertainable.  To maintain a class action, a class must be sufficiently ascertainable by reference to objective criteria.  Plaintiffs argued that the class membership could be ascertained by submission of affidavits from each class member, but the court reasoned that individual subjective memories of computer usage may not be reliable, particularly given the availability of $2,500 statutory damages per person under the Video Privacy Protection Act.  Thus, the court found plaintiffs had not defined an ascertainable class.

Additionally, because an individual inquiry into whether each member of the putative class used any one of the methods to block disclosure of information would be required, the court found individual issues predominated and would preclude certification of the class as defined.  Hulu also argued the total potential damages were out of proportion to any actual damages suffered and thus violated due process, but the court noted that while the argument might have some merit, the issue was not ripe given the denial of class certification.

While the case, In re Hulu Privacy Litigation, can still continue on behalf of the named plaintiffs, the court left open the possibility of proceeding with subclasses or a narrower class definition.  Consequently, additional class certification practice is likely in this closely watched case.

Author: "David Munkittrick" Tags: "Data Privacy Laws, Online Privacy, Priva..."
Date: Monday, 28 Apr 2014 18:12

Courtney Bowman

Last month, a federal district court in the Northern District of California issued an order that may affect the policies of any company that records telephone conversations with consumers.

The trouble began when plaintiff John Lofton began receiving calls from Collecto, Verizon’s third-party collections agency, on his cell phone.  The calls were made in error – Lofton did not owe Verizon any money because he wasn’t even a Verizon customer – but Lofton decided to take action when he discovered that Collecto had been recording its conversations with him without prior notice.  Lofton brought a class action against Verizon under California’s Invasion of Privacy Act, theorizing that Verizon was vicariously responsible for Collecto’s actions because Collecto was Verizon’s third-party vendor and because Verizon’s call-monitoring disclosure policy did not require the disclosure of recordings in certain situations. Verizon filed a motion to dismiss, arguing that the recordings did not invade Lofton’s privacy and therefore did not run afoul of the statute. 

The court denied the motion to dismiss, holding that the statutory language of § 632.7 of the Invasion of Privacy Act banned the recording of all calls made to cell phones – not just confidential or private calls made to cell phones – without prior notice.  The statute’s treatment of cell phones thus diverges from its treatment of landlines, as recordings of calls made to landlines only have to be disclosed via prior notice if the call is “confidential.” 

Though the case is ongoing, this ruling indicates that Lofton v. Verizon Wireless (VAW) LLC may ultimately have a significant impact on how companies interact with consumers over the phone. First, the prevalence of cell phones means that companies should assume that § 632.7 applies to a large percentage of their calls with consumers – not only because these consumers are highly likely to use cell phones instead of landlines, but because it may be difficult for a company to tell whether a given consumer is in California and subject to § 632.7. Second, this recent order indicates that companies may be held responsible for their third-party vendors’ lack of disclosure, meaning that companies should require their third-party vendors to refrain from recording phone conversations without prior notice, and should also monitor those vendors for compliance.  In sum, when it comes to providing prior notice of recordings to consumers, companies shouldn’t phone it in – they should ensure that they and any third-party vendors err on the side of disclosure to avoid legal hangups down the line.

Author: "Courtney Bowman" Tags: "California, Invasion of Privacy, Mobile ..."
Date: Friday, 31 Jan 2014 15:43

Marianne Le Moullec

After two years of investigation and proceedings regarding Google’s privacy policy, European Data Protection Authorities (DPAs) are now reaching their final decisions against Google. The French DPA (“CNIL”) issued a decision on January 3, 2014, ruling that Google’s privacy policy did not comply with French data protection law and imposing a fine of €150,000 (http://www.cnil.fr/english/news-and-events/news/article/the-cnils-sanctions-committee-issues-a-150-000-EUR-monetary-penalty-to-google-inc/). Google has appealed the CNIL’s decision.

This is the second decision by a European DPA fining Google over the lack of compliance of its privacy policy: on December 19, 2013, the Spanish DPA (“AEPD”) ruled that Google had committed three serious violations of the Spanish data protection law and ordered Google to pay a fine of €300,000 for each of the three violations (http://www.agpd.es/portalwebAGPD/revista_prensa/revista_prensa/2013/notas_prensa/common/diciembre/131219_PR_AEPD_PRI_POL_GOOGLE.pdf). More decisions are to be expected.

A European-wide investigation against Google’s privacy policy

The decisions rendered by the French and Spanish DPAs are the result of a joint investigation by European DPAs, launched in early 2012 when Google announced that it was about to replace the individual privacy policies of each of its products and services with a single privacy policy. The Article 29 Working Party, an independent advisory body whose members are the DPAs of the 28 European Member States, immediately expressed privacy concerns and decided to launch an investigation on behalf of all European DPAs. Following this investigation, the Article 29 Working Party rendered its findings in October 2012: it stated that Google’s privacy policy did not comply with the European Directive on Data Protection for several reasons: 1) it did not inform users of the type of data collected and of its purposes; 2) it combined, without authorization, data collected by various Google services; and 3) it did not specify the data retention periods. The Article 29 Working Party issued recommendations that Google refused to implement. This led six European DPAs (Germany, France, Italy, Spain, the Netherlands, and the UK) to simultaneously launch legal actions against Google in April 2013 (http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/20130227_pr_google_privacy_policy_en.pdf).

So far, Spain and France are the only two DPAs that have fined Google. The Dutch DPA has issued a decision finding Google’s privacy policy in breach of the Dutch privacy law, but has not yet issued sanctions (http://www.dutchdpa.nl/Pages/pb_20131128-google-privacypolicy.aspx). The investigations are still ongoing in the other Member States (Germany, the UK and Italy).

Does French Law apply to Google’s privacy policy?

This was the first issue that the CNIL had to decide. The territorial scope of French law derives from the rules set out by EC Directive 95/46. Hence, French law is applicable either because 1) the data controller carries out its activity within an establishment in France, or 2) the data controller is established neither in France nor in the EU, but uses “means of processing” of personal data located in France to collect data.

Google claimed that French law did not apply because Google Inc. in California is solely responsible for data collection and processing, and Google France is not involved in any activity related to the data processing performed by Google Inc.

The CNIL rejected this argument, finding that Google France is involved in the sale of targeted advertising, whose value is based on the collection of Internet users’ data. Hence, Google France is involved in the activity of personal data processing, even though it does not perform the technical processing of the personal data. The CNIL’s reasoning is similar to the argument developed by the Advocate General in the case currently opposing Google and the Spanish DPA before the European Court of Justice (“ECJ”) (http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=198456/). The ECJ’s ruling on this issue is eagerly awaited.

In addition, the CNIL ruled that Google Inc. placed cookies on the computers of French users, and that such cookies were “means of processing” of personal data located in France because they are used to collect data from the users’ computers. Therefore, even if Google Inc. were to be considered as the sole data controller, French law would nevertheless apply because of the location of the cookies in France.

Are all data collected by Google “personal data” within the meaning of French and EU Law?

One of the main issues is the distinction put forward by Google between “authenticated users”, who have registered an ID to use services such as Gmail; “unauthenticated users”, who use services that do not require identification, such as YouTube; and “passive users”, who visit third-party websites where Google has placed Analytics cookies for targeted advertising.

According to Google, it holds “personal data” only on “authenticated users”, not on “unauthenticated users” or “passive users”. The CNIL rejected this argument because the definition of personal data under French law includes information that indirectly identifies a person. The CNIL considers that, even if the user’s name is not collected, the collection of an IP address combined with precise and detailed information on the computer’s browsing history amounts to indirectly identifying a person, because it gives precise information about a person’s interests, daily life, lifestyle choices, etc.

Therefore, all data collected by Google is considered by the CNIL to be personal data.

Why does Google’s privacy policy breach French Data Protection Law?

The CNIL, following the findings of the Article 29 Working Party, found four breaches of French law on data protection.

First, Google’s privacy policy fails to properly inform users of the collection of their personal data and of its purposes. Google unified into a single document the privacy policies applicable to more than 60 services and products. As for the “purposes” of the data collection, Google’s policy states only general purposes, such as the proper performance of the services, without further explanation or detail. This was considered too vague to inform users, especially given the variety of services offered by Google and the various types of data collected.

Secondly, Google should have informed users and obtained their consent before placing advertising cookies on their terminals. Obtaining consent for cookies does not require opt-in consent from the user, but before the cookies are placed on the terminal, the user must be properly informed of their purposes and of how to refuse them. The CNIL found that, with regard to unauthenticated users, Google placed cookies before providing any information, in breach of French data protection law. In addition, the information provided to users is insufficient. Only two Google services (Search and YouTube) display a banner with information on cookies. Moreover, little information is given regarding the purposes of the cookies: stating that cookies are meant “to ensure proper performance of the services” is not deemed sufficient to obtain “informed consent” from the user. With regard to “passive users” who visit third-party websites where Google has placed its “Analytics” cookies, the CNIL considers that, since Google uses the collected data for its own activity (producing statistics and improving its services), it acts as a data controller and is responsible for obtaining consent.

Thirdly, Google has not defined how long it retains the data collected and has not implemented any automatic process for deleting data. For example, no information is available on how long data is kept once an authenticated user has closed his or her account.

Finally, the combination of data collected from one Google service with data collected from other Google services requires informed, explicit and specific consent from the user. The CNIL ruled that Google breached this obligation because it did not provide detailed information on the type of data combination it performs and did not seek explicit and specific consent from the user. Consent to the general privacy policy or the terms and conditions of use is not considered as sufficient.

Author: "Marianne Le Moullec" Tags: "Data Privacy Laws, European Union, Onlin..."
Date: Thursday, 19 Dec 2013 02:46

Kristen J. Mathews

Based on a December 3rd decision by the Second Circuit Court of Appeals, class actions under the Telephone Consumer Protection Act (TCPA) can now be brought in New York federal court. This decision marks a reversal of Second Circuit precedent, and will likely increase the number of TCPA class actions being filed in New York. Companies should review their telemarketing practices and procedures in light of the potential statutory penalties under the TCPA.

Author: "Kristen J. Mathews" Tags: "TCPA, telemarketing, telephone consumer ..."
Date: Friday, 13 Dec 2013 17:24

Rohit Dave

The Better Business Bureau (“BBB”) and the Direct Marketing Association (“DMA”) are in charge of enforcing the ad industry’s Self Regulatory Principles for Online Behavioral Advertising (“OBA Principles”), which regulate the online behavioral advertising activities of both advertisers and publishers (that is, web sites on which behaviorally-targeted ads are displayed or from which user data is collected and used to target ads elsewhere). Among other things, the OBA Principles provide consumers with transparency about the collection and use of their Internet usage data for behavioral advertising purposes. Specifically, the “Transparency Principle” requires links to informational disclosures on both: (i) online behaviorally-targeted advertisements themselves, and (ii) webpages that display behaviorally-targeted ads or that collect data for use by non-affiliated third parties for behavioral advertising purposes. The “Consumer Control Principle” requires that consumers be given a means to opt out of behavioral advertising.

Through its “Online Interest-Based Advertising Accountability Program”, the BBB recently enforced the OBA Principles in a series of actions—some with implications for publishers and some with implications for advertisers.

For Publishers

Web publishers, as the owner or operator of their websites, are “first parties” under the OBA Principles. Meanwhile, “third parties” include ad networks, advertisers, and demand side platforms. The Transparency Principle requires publishers to provide enhanced notice to consumers of the collection of usage data by third parties on their websites for behavioral advertising purposes. To satisfy the enhanced notice requirement, publishers must provide “a clear, meaningful, and prominent link” to an informational disclosure on any page where usage data is collected by or transferred to third parties for behavioral advertising purposes. The disclosure can either point to an industry-developed Web page, such as the Digital Advertising Alliance’s (“DAA”) Consumer Choice Page, or it can individually list and link to all the third parties engaged in online behavioral advertising in connection with the website. This enhanced notice link is to be separate from, but just as prominent as, the website’s privacy policy link.

Last month, the BBB admonished publishers to heed these self-regulatory requirements when it issued its first Compliance Warning. The BBB encouraged the use of the DAA endorsed AdChoices Icon on any page from which a website operator allows a third party to collect user data for behavioral advertising or transfers such data to a third party for such purposes. This policy gives consumers “just-in-time” notice and enables them to decide whether to participate in behavioral advertising.

Recent inquiries by the BBB into the behavioral advertising activities of BMW of North America, LLC and Scottrade, Inc. illustrate the requirements of the OBA Principles on Web site publishers. In these inquiries, the BBB found that both companies had failed to provide enhanced notice and opt-out links on webpages where they allowed third-party data collection for behavioral advertising purposes. Both companies quickly achieved compliance in response.

The BBB warned that publishers that do not comply with the first party transparency requirements for third party data collection, but are otherwise in compliance with the OBA Principles, will face enforcement action by the BBB beginning on January 1, 2014.

For Advertisers

The Transparency Principle requires third parties (e.g., advertisers and ad service providers) that engage in behavioral advertising on non-affiliated websites to provide enhanced notice to web users about behavioral advertising through a similar hyperlinked disclosure preferably containing the AdChoices Icon in or near the behaviorally-targeted advertisement itself. The disclosure should contain information about the types of data collected for behavioral advertising purposes, the use of such data, and (under the Consumer Control Principle) the choices available to users with respect to behavioral advertising. This requirement is designed to notify consumers that the ad they are viewing is based on interests inferred from a previous website visit.

Recently, the BBB found three companies in violation of the OBA Principles and on November 20th released decisions resolving its inquiries into the violations. The BBB visited a company’s website and, after leaving it to browse other, non-affiliated websites, found ads from the original company on those sites. The ads, however, did not include enhanced notice, which, the BBB said, violated the third party transparency requirements of the OBA Principles. According to the BBB, the party responsible for this omission was 3Q Digital, an ad agency that used the self-service features of a demand side platform to serve targeted ads. The BBB found that when 3Q Digital used the self-service platform it stepped “into the shoes of companies explicitly covered by the OBA Principles” and assumed the compliance responsibilities of covered companies.

In response, 3Q Digital’s client included the AdChoices Icon in all its interest-based ads. 3Q Digital, in turn, took similar steps with all its other online campaigns.


The BBB’s recent enforcement activities emphasize the need for companies (whether they advertise online, display third party ads on their online properties, or simply contribute user data for use by others for behavioral advertising) to be vigilant of the specific requirements of the OBA Principles that are applicable to them based on how they participate in behavioral advertising. One motivation for the existence of the OBA Principles is to show regulators that legislation in this area is not necessary. To ward off legislation and avoid enforcement action by the BBB and the DMA, all parties involved should be mindful of the OBA Principles, and make them a part of their compliance programs.

Author: "Rohit Dave" Tags: "Behavioral Marketing, behavioral adverti..."
Date: Thursday, 21 Nov 2013 00:16

Laura Shovlowsky

An article published by Law360 last week quoted Jeremy Mittman, co-Chair of Proskauer’s International Privacy Group and a member of the firm’s International Labor Group, on the data protection reform legislation recently passed by the European Parliament and the difficulties multinational companies face in complying with both EU and U.S. privacy laws.

Jeremy was also asked to comment on the EU-U.S. Safe Harbor Program in an article published by Politico on November 7. The article mentions Jeremy’s experience drafting Safe Harbor certifications and EU model contracts.

Author: "Laura Shovlowsky" Tags: "Articles, data protection reform, Europe..."
Date: Wednesday, 13 Nov 2013 16:16

Marianne Le Moullec

The determination of the territorial scope of the current EU Directive 95/46 is still under dispute both before national courts and the European Court of Justice (ECJ). This issue may soon become moot with the adoption of the future data protection regulation, which may modify and expand the territorial scope of EU data privacy law, especially following the recent vote of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs. The following summarizes the current state of affairs regarding the territorial (and extraterritorial) scope of the future EU law following that vote.

As the internet has allowed companies to easily provide services from a distance, the issue as to what laws are applicable to personal data has become more complex. This was not fully anticipated when the current EU Directive on personal data protection was adopted in 1995. Modifications to the rules regarding territorial scope set by Article 4 of the current EU Directive have been a highly debated issue in the EU.

An ongoing case before the ECJ highlights this complexity, and the legal uncertainty, surrounding the territorial scope of the current EU Directive. In this case, a Spanish citizen lodged a complaint against Google Spain and Google Inc. before the Spanish Data Protection Agency (“AEPD”) because Google refused to take down data that appeared when his name was entered in the search engine. As a defense, Google argued that Spanish law was not applicable because the processing of personal data relating to its search engine does not take place in Spain, as Google Spain acts merely as a commercial representative: the technical data processing takes place in California. According to Article 4.1 (a) of the EU Directive, national law is applicable if “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State.” The ECJ will therefore have to determine whether Google Spain, “in the context of its activities,” may be considered as processing data, even though, as a commercial subsidiary, it does not technically process personal data.

The Advocate General has given a positive answer to that question in a non-binding Opinion delivered last summer. In the Opinion, he argues that since the business model of search engines relies on targeted advertising, the local establishment in charge of marketing such targeted advertising to the inhabitants of a particular country must be considered as processing personal data “in the context of its activities,” even though the technical operations are not performed there. The ECJ is expected to render its decision at the end of this year.

In the near future, the applicable law in such a situation may more easily be determined based on the draft Regulation proposed by the European Parliament.

  • First, the European Parliament has proposed that Article 3.1 of the draft Regulation be amended to clarify that “this Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union whether the processing takes place in the Union or not.” (emphasis added). If the draft Regulation is adopted as such, EU law would therefore unequivocally apply to the activities of foreign companies’ subsidiaries established in the EU, regardless of the actual place of data processing.
  • Second, the European Parliament proposes to amend Article 3.2 of the draft Regulation, which concerns the extraterritorial application of EU law (i.e., the situation where the data controller does not have any presence in the EU). The draft Regulation provides that EU law would nonetheless apply if the processing of data is related to the “offering of goods or services” to data subjects in the European Union. Consistent with the Article 29 Working Party’s Opinion 01/2012, which stated that the offering of goods or services should include free services, the European Parliament has proposed amending Article 3.2 to provide that EU law would apply to any processing activity related to the offering of goods or services to data subjects in the EU, “irrespective of whether a payment of the data subject is required.”

The draft amended regulation will now be negotiated with the European Council (the governments of the EU Member States). The European Parliament is pushing for a vote on the regulation in spring 2014. However, such a timetable is far from assured, given the general “slow track” of the proposed legislation coupled with recent pronouncements by the leaders of several EU countries suggesting a timetable closer to 2015.

Author: "Marianne Le Moullec" Tags: "Data Privacy Laws, European Union, Legis..."
Date: Wednesday, 06 Nov 2013 23:06

Charley Lozada

This past month, the European Union’s Article 29 Data Protection Working Party (the “Working Party”) issued the Working Document 02/2013 providing new guidance on obtaining consent for cookies (“Working Document”). The Working Document sets forth various mechanisms which can be utilized by websites to obtain consent for the use of cookies in compliance with all EU Member State legal requirements.

The amended e-Privacy Directive 2002/58/EC, adopted in 2009 and implemented in all EU Member States, requires website operators to obtain users’ consent for the use of cookies or similar tracking technologies. The Working Document elaborates on the Working Party’s prior guidance, in particular its Opinion of July 13, 2011 on the concept of consent. Specifically, if a website operator wants to ensure that a consent mechanism satisfies every EU Member State’s requirements, the mechanism should include each of the following main elements: (1) specific information, (2) prior consent, (3) an indication of wishes expressed by the user’s active behavior and (4) an ability to choose freely.

Specific Information: The Working Party states that websites should implement a mechanism that provides “a clear, comprehensive and visible notice on the use of cookies, at the time and place where consent is sought.” Users must be able to access all necessary information about the different types or purposes of cookies being used by the website.

The Working Paper indicates that necessary information includes:

  • identification of all of the types of cookies used;
  • the purpose(s) of the cookies;
  • if relevant, an indication of possible cookies from third parties;
  • if relevant, third party access to data collected by the cookies;
  • the data retention period (i.e. the cookie expiry date); and
  • typical values and other technical information.

Users must also be informed about the ways that they can accept all, some or no cookies and how to change their cookie settings in the future.

Timing: Consent must be obtained before data processing begins, i.e. on the entry page. The Working Party recommends that websites implement a consent solution in which no cookies are set on a user’s device (other than those that fall under an exception and thus do not require the user’s consent) until that user has provided consent.

Active Behavior: The Working Party indicates that valid consent must be given through a “positive action or other active behavior”, provided that the user has been fully informed that cookies will be set due to this action. Unfortunately, the passive use of a website containing a link to additional cookie information is not likely to be sufficient. Examples provided by the Working Party include (i) clicking on a button or link, (ii) ticking a box in or close to the space where information is presented or (iii) any other active behavior from which a website can unambiguously conclude that the user intends specific and informed consent. The Working Party also confirmed its previously issued view that browser settings may be able to deliver valid and effective consent in certain limited circumstances: where the website operator is confident that the user has been fully informed and has actively configured a browser or other application to accept cookies, such a configuration would signify active behavior.

Real Choice: The Working Document provides that websites should present users with a real and meaningful choice regarding cookies on the entry page. This choice should allow users to decline all or some cookies and to change cookie settings in the future. The Working Document also clarifies that websites should not make general access to the website conditional on the acceptance of all cookies, although it notes that access to “specific content” could in some circumstances be made conditional.
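For illustration, the “prior consent” and “active behavior” elements can be combined in a simple consent-first gate: non-exempt cookies are queued and only set once the user takes a positive action such as clicking a button. The sketch below is a hypothetical design, not language from the Working Document; the category names and API are our own assumptions.

```typescript
// Cookie categories a site might declare; "essential" cookies fall under the
// e-Privacy Directive's consent exemptions and may be set immediately.
type CookieCategory = "essential" | "analytics" | "advertising";

type CookieSetter = () => void;

class ConsentManager {
  private consented = new Set<CookieCategory>(["essential"]);
  private pending = new Map<CookieCategory, CookieSetter[]>();

  // Queue a cookie-setting action; it runs only once the user has actively
  // consented to its category (prior consent, not consent-after-the-fact).
  request(category: CookieCategory, setter: CookieSetter): void {
    if (this.consented.has(category)) {
      setter();
      return;
    }
    const queue = this.pending.get(category) ?? [];
    queue.push(setter);
    this.pending.set(category, queue);
  }

  // Called from a button click or similar active behavior -- never from mere
  // continued browsing, which the Working Party rejects as a form of consent.
  grant(categories: CookieCategory[]): void {
    for (const category of categories) {
      this.consented.add(category);
      for (const setter of this.pending.get(category) ?? []) setter();
      this.pending.delete(category);
    }
  }

  hasConsent(category: CookieCategory): boolean {
    return this.consented.has(category);
  }
}
```

Because `grant()` accepts any subset of categories, the same mechanism supports the “real choice” element: the user may accept all, some, or no non-essential cookies, and a corresponding revocation path would let the user change cookie settings later.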

Although the Working Document is a welcome source of guidance providing further clarification on this thorny issue, it is clear that compliance with the European Union’s rules governing cookie consent will continue to provide challenges to companies seeking to conform their websites accordingly.

Author: "Charley Lozada" Tags: "European Union, International, Uncategor..."
Date: Thursday, 24 Oct 2013 18:47

Jeremy M. Mittman

On October 21, a key European parliamentary committee (the Committee on Civil Liberties, Justice and Home Affairs (the “Committee”)) approved an amended version of the draft EU Data Protection Regulation, paving the way for further negotiations with EU governmental bodies. The goal, according to a press release by the Committee, is to reach a compromise on the draft and hold a vote prior to the May 2014 EU Parliamentary elections. The proposed legislation (which passed in a 51-1 vote) contains a number of key concepts, including:

Right to Erasure:

Stronger than the previously worded “Right to be Forgotten”, the proposed “Right to Erasure” would give a data subject the right to ask any entity holding personal data on that data subject to erase the personal data upon request. Moreover, if the personal data has been “replicated” with other entities, the data controller to whom the request has been made must forward the request to the other entities to which it has transferred the data subject’s personal data.

Increased Sanctions:  

The Committee voted to increase the penalties that could be levied on companies that violate the rules. Whereas the original proposal capped penalties at 1 million euros or 2% of a company’s annual worldwide turnover, the Committee ratcheted up the proposed penalties to 100 million euros or up to 5% of annual worldwide turnover, whichever is greater: a significant increase that illustrates the potentially expensive consequences of violating the data protection legislation.

Data transfers to non-EU countries:   

Specifically referencing the June 2013 Snowden disclosure of mass surveillance by the U.S. government’s PRISM program, the Committee proposed that if a company in the EU were requested to disclose personal data to a government located outside the EU, the entity would need to seek specific authorization from the data protection authority in the relevant EU country before transferring any such personal data outside of the EU. The new provision reflects the EU’s acute concern over the Snowden revelations of this summer.


Profiling:

The package adopted by the Committee also includes a provision limiting the practice of profiling, i.e., “a practice used to analyze or predict a person’s performance at work, economic situation, location, health or behavior.” Under the proposal, individual consent (such as that provided by a contract) would be needed in order to profile, and any individual would have the right to object to such profiling.

Although the Committee hopes to reach agreement with the other EU legislative bodies (such as the national governments that compose the European Council) by May 2014, it is clear that there is still a long road ahead before the new legislation is finalized and enacted.  The contours of the proposed Regulation may change after further rounds of negotiations.  However, the recent proposals by the Committee help to illuminate the direction that the Regulation is heading.

Author: "Jeremy M. Mittman" Tags: "European Union, Committee on Civil Liber..."
Date: Wednesday, 02 Oct 2013 22:50

Kevin Khurana

On September 27, 2013, California Governor Jerry Brown signed into law an amendment to California’s breach notification law (Cal. Civ. Code § 1798.82).  Effective January 1, 2014, under the amended law, the definition of “Personal Information” will be expanded to include “a user name or email address, in combination with a password or security question and answer that would permit access to an online account.”  Additionally, new notification options have been added to address a breach of this type of information.

As amended, if there is a breach involving only this type of information and not the other types of information covered under the pre-amendment definition of “Personal Information,” the entity in question may provide notice to the affected person in electronic or other form.  This notice must direct that person to change his or her password and security question or answer, as applicable, or to take other appropriate steps to protect the online account in question and all other accounts for which that person uses the same credentials. 

Under the amended law, if the credentials breached are for a person’s email account furnished by the entity that suffered the breach, the entity in question may not provide notice to the compromised email account, but may use one of the other notification methods allowed by the law, or may comply by providing clear and conspicuous notice to that person when he is connected to the compromised online account from an IP address or online location from which the entity knows that person customarily accesses the online account in question.   

It should be noted that the foregoing notification methods are options – an entity that experiences a breach covered by California’s data security laws may still provide notice under the law’s original notification provisions.
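As a rough illustration, the notice options under the amended law can be read as a decision rule keyed to what was breached. The sketch below is our own hypothetical encoding (the field and function names are not from the statute) and is not a substitute for the text of Cal. Civ. Code § 1798.82.

```typescript
type BreachedData = {
  // true when only a username/email plus password or security Q&A was exposed
  onlineCredentialsOnly: boolean;
  // true when the breached credentials are for an email account
  // furnished by the entity that suffered the breach
  credentialsAreForEmailFurnishedByEntity: boolean;
};

type NoticeMethod =
  | "standard-written-or-electronic"       // the law's original notification provisions
  | "electronic-direct-to-change-password" // direct the user to change password/security Q&A
  | "notice-on-login-from-customary-ip";   // conspicuous notice at login from a customary IP/location

// Returns the notice methods the amended law appears to permit for a breach.
function permittedNoticeMethods(breach: BreachedData): NoticeMethod[] {
  // The original methods always remain available as an option.
  const methods: NoticeMethod[] = ["standard-written-or-electronic"];
  if (!breach.onlineCredentialsOnly) return methods;

  if (breach.credentialsAreForEmailFurnishedByEntity) {
    // The entity may not notify via the compromised email account itself;
    // it may instead give notice at login from a customary IP or location.
    methods.push("notice-on-login-from-customary-ip");
  } else {
    methods.push("electronic-direct-to-change-password");
  }
  return methods;
}
```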

Author: "Kevin Khurana" Tags: "California, Data Privacy Laws, breach no..."
Date: Wednesday, 02 Oct 2013 19:08

Charley Lozada

Law Targets Sites and Mobile Apps Directed to Minors, Offers “Online Eraser”     

Likely to Have Nationwide Effect

On July 1st of this year, new amendments to the Children’s Online Privacy Protection Act Rule (COPPA Rule) came into effect, with perhaps the most pronounced changes being the expansion of COPPA to apply to geolocation information and persistent identifiers used in behavioral advertising. Critics called the amendments jumbled and labeled them a compliance headache, while privacy advocates were buoyed but thought the changes did not go far enough to protect the online privacy of children. Still others contended that federal law contains a gap that fails to offer privacy protections for teenage users.

Once again, the California state government has stepped up to fill what it perceives to be a void in federal online privacy protection, this time to address certain restrictions on the use of information collected from minors and to give minors an online “eraser” of sorts. In late September, Gov. Brown signed S.B.568, which expanded the privacy rights for California minors in the digital world.

“Minors”, by the way, are defined under the law as residents of California under age 18 – this definition in itself is an expansion of the protections afforded to children under COPPA, which addresses the collection and use of information from children under 13. That is not the only expansion of COPPA presented by this new law. The federal COPPA Rule is primarily concerned with mandating notice and parental consent mechanisms before qualifying sites or mobile apps can engage in certain data collection and data tracking activities with respect to children under 13. The California statute’s marketing restrictions for minors contain no parental consent procedures – rather, the statute imposes restrictions on covered web services directed to minors that relate to certain specified categories of products and activities that are illegal for individuals under 18 years of age.

As a practical matter, compliance with this law will require certain changes in the way website publishers collect and process user information.  For example, it is much easier for online operators to determine whether their websites are directed to children under 13 as opposed to “directed to minors” under 18.  Going forward, sites and apps will have to reevaluate their intended audience, as well as establish procedures for when a minor user self-reports his or her age, triggering the site having actual knowledge of a minor using its service.

S.B.568 has two principal parts:  minor marketing restrictions and the data “eraser.”

Marketing Restrictions: The new California law prohibits an operator of a website, online service or mobile app directed to minors or one with actual knowledge that a minor is using its online site or mobile app from marketing or advertising specified types of products or services to minors. The law also prohibits an operator from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile the personal information of a minor for the purpose of marketing or advertising specified types of products or services. Moreover, the law makes this prohibition applicable to an advertising service that is notified by a covered operator that the site, service, or application is directed to a minor. The statute lists 19 categories of prohibited content covered by the law’s marketing restrictions, including firearms, alcohol, tobacco, drug paraphernalia, vandalism tools and fireworks. Notably, the law does not require an operator to collect or retain the ages of users, and provides operators with a safe harbor for “reasonable actions in good faith” designed to avoid violations of the marketing restrictions.

Online Eraser: The second part of S.B. 568 requires operators of websites and applications that are directed to minors, or that know that a minor is using its site or application, to allow minors that are registered users, to remove (or to request and obtain removal of) their own posted content. The operators must also provide notice and clear instructions to minors explaining their rights regarding removal of their own content. Notably, SB 568 does not require operators to completely delete the content from its servers; it only requires that the content be no longer visible to other users of the service and the public.  There are certain exceptions to this “online eraser” right, such as circumstances where any other provision of federal or state law requires the operator or third party to maintain the content, the content is stored on or posted to the operator’s site or application by a third party, the operator anonymizes the content, the minor fails to follow the instructions regarding removal, or the minor has received “compensation or other consideration for providing the content.”
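Because S.B. 568 requires only that removed content no longer be visible to other users and the public, not that it be deleted from the operator’s servers, one natural implementation is a visibility flag rather than a hard delete. The following is a hypothetical sketch of that approach (the names and structure are ours, and it omits the statute’s exceptions):

```typescript
interface Post {
  id: string;
  authorId: string;
  body: string;
  hidden: boolean; // soft-delete flag: hidden posts remain on the server
}

class ContentStore {
  private posts = new Map<string, Post>();

  add(post: Post): void {
    this.posts.set(post.id, post);
  }

  // A minor's removal request: mark the post invisible rather than deleting it.
  // Only the registered user who posted the content may request its removal.
  requestRemoval(postId: string, requesterId: string): boolean {
    const post = this.posts.get(postId);
    if (!post || post.authorId !== requesterId) return false;
    post.hidden = true;
    return true;
  }

  // What other users and the public see: hidden posts are excluded.
  visiblePosts(): Post[] {
    return Array.from(this.posts.values()).filter((p) => !p.hidden);
  }

  // The content itself is still retained server-side (e.g. where another
  // provision of federal or state law requires the operator to maintain it).
  retainedCount(): number {
    return this.posts.size;
  }
}
```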

Both prongs of the law raise many questions:

  • How does a site or application owner determine whether it is covered by S.B.568? Under the statute, a website, online service, or mobile app “directed to minors” means an “Internet Web site, online service, online application, or mobile application, or a portion thereof, that is created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.”
  • What will qualify for “reasonable actions in good faith” under the safe harbor?  What are the legal ramifications of an independent online ad network serving unlawful ads to minors without the knowledge of an otherwise compliant site operator?
  • How does a site implement the “eraser” function? With user tools to eliminate UGC, or will the site control the removal process via an online request form?  Will a removal request necessarily cause the removal of other users’ content (e.g. social media postings of other users that comment on a removed comment or submitted photo)?
  • The online eraser right seemingly applies only to minors.  How should a site or app handle requests from adults wishing to remove content they posted when they were minors?  Should sites simply offer the tool to all users to avoid compliance issues?
  • What qualifies as “compensation or other consideration for providing the content” under the exceptions to the online eraser right? Would this include free products, coupon codes, or the right to receive exclusive ‘limited time’ offers?
  • What changes are required in the site’s privacy policies?

The law will come into effect on January 1, 2015. Any company with a website that can be accessed by California residents should assess the impact of these new requirements in the coming year. Considering that most, if not all, major websites and apps necessarily have or will have California-based users, this state law may become a de facto national standard, particularly since technical controls to screen or segregate California users may be unworkable.

[Incidentally, California also recently enacted a new law addressing online tracking, so it appears that the California legislature continues its focus on web privacy].

Author: "Charley Lozada" Tags: "California, Legislation, Online Privacy"
Date: Tuesday, 01 Oct 2013 01:39

Jeremy M. Mittman

On September 27, California Governor Jerry Brown signed a new privacy law that has significant repercussions for nearly every business in the United States that operates a commercial website or online service and collects “personally identifiable information” (which means, under the law, “individually identifiable information about an individual consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual.”)  The new law goes into effect on January 1, 2014.

Under California’s existing Online Privacy Protection Act, a Web site or online service that collects PII about California residents already has the obligation to post a privacy policy, identify its effective date and describe how users are notified about changes to the policy, as well as identify the categories of PII that are collected and with whom such PII is shared.

Now, the new law—which passed both houses of the California Legislature unanimously—requires that all such Web sites disclose how they “respond to Web browser ‘do not track’ signals or other mechanisms that provide consumers the ability to exercise choice regarding the collection of PII about an individual consumer’s online activities over time and across third-party Web sites or online services”, if such information is collected. The new law prescribes that operators can comply with this disclosure requirement by “providing a clear and conspicuous hyperlink” in the privacy policy that links to a description “of any protocol the operator follows that offers the consumer” the choice to opt out of internet tracking.

The legislative analysis of the law reveals that its purpose is to “increase consumer awareness of the practice of online tracking by websites and online services, such as mobile apps [and] will allow consumers to learn from a website’s privacy policy whether or not that website honors a Do Not Track signal [which] will allow the consumer to make an informed decision about their use of the website or service.”

The analysis noted the rapid rise in online tracking of users’ web-surfing behavior, as well as the California Attorney General’s observation that although “all the major browser companies have offered Do Not Track browser headers” that, if selected, can “signal to websites an individual’s choice not to be tracked, [t]here is, however, no legal requirement for sites to honor the headers.” Thus, because Web sites have been free to disregard such Do Not Track selections, consumers would not know whether or not their selection is honored unless the Web site provides them with such notice. The new law will mandate providing users with the requisite notice.
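As technical background: supporting browsers express the Do Not Track preference by sending a `DNT` request header, where “1” signals an opt-out of tracking, “0” signals consent, and an absent header means the user has expressed no preference. A site that wants its privacy-policy disclosure to match its actual behavior might centralize the interpretation in one place; the sketch below is illustrative only (the new law requires disclosure of the site’s response, not that the signal be honored):

```typescript
type TrackingPreference = "opt-out" | "consent" | "unspecified";

// Interprets the value of the DNT request header: "1" means the user opts
// out of tracking, "0" means the user consents, and an absent header means
// the user has expressed no preference.
function interpretDntHeader(value: string | undefined): TrackingPreference {
  switch (value) {
    case "1":
      return "opt-out";
    case "0":
      return "consent";
    default:
      return "unspecified";
  }
}

// Example policy for a site that chooses to honor the signal: load cross-site
// tracking only when the user has not opted out. Whatever policy a site picks,
// the disclosure in its privacy policy should describe this behavior.
function mayTrack(preference: TrackingPreference): boolean {
  return preference !== "opt-out";
}
```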

In addition to the above “do not track” notice obligations, the law also requires website and online service operators “to disclose whether other parties” collect PII regarding a consumer’s “online activities over time and across different Web sites when a consumer uses the operator’s Web site or service.”

In light of the new obligations, it is imperative that any organization that collects PII concerning California residents (whether or not that organization is based in California) assess its current Web site privacy policies to ensure that they are compliant with California’s new laws requiring additional disclosures.


Author: "Jeremy M. Mittman" Tags: "Legislation, california privacy, do not ..."
Date: Wednesday, 18 Sep 2013 13:49

David Munkittrick

On October 16, 2013, the Federal Communications Commission’s (“FCC”) new rule implementing the Telephone Consumer Protection Act (“TCPA”) will go into effect. 

These are rules with teeth, as the TCPA allows recovery of anywhere between $500 and $1,500 for each improper communication and does not require a showing of actual injury.  This makes the TCPA a particularly attractive vehicle for class actions.  Accordingly, we highlight some of the more salient changes in the new rule below.

Currently, except in an emergency, FCC regulations require businesses to obtain “prior express consent” before making any type of call or sending any text message using autodialers or prerecorded voices to cellphones, pagers, emergency lines, or hospital room lines.  The regulations also require for-profit businesses to obtain prior express consent before making commercial advertisement or telemarketing calls or messages using a prerecorded voice to any residential line, absent an emergency or an established business relationship with the person called. 

Under the new rule, however, the established business relationship exemption disappears.  For-profit businesses will have to acquire “prior express written consent” before making advertisement or telemarketing calls or sending text messages using autodialers or prerecorded voices to a cellphone, pager, emergency line, or hospital room line.  Absent an emergency, prior express written consent will also be required for commercial advertisement or telemarketing calls or messages using a prerecorded voice made to a residential line, whether or not there is a prior business relationship with the recipient.

The written consent must be “clear and conspicuous.”  Specifically, the written consent must identify the phone number for which consent is given, state that consent is not a condition of purchase, and state that the consent encompasses future autodialed or prerecorded telemarketing messages.  The written consent can be electronic – for example, through e-mail, website forms, text messages, or telephone keypad functions. 

The new rule also requires prerecorded telemarketing calls to include an interactive, automated opt-out mechanism that is announced at the beginning of the message and is available at any time during the call.  If a call could be answered by an answering machine or voicemail, the message must include a toll-free number the consumer can call to opt out.  The existing three-percent limit on abandoned calls is also revised to apply to calls within a single calling campaign rather than all calls made across all calling campaigns.  Finally, the new rule exempts HIPAA-covered entities from the requirements on prerecorded calls to residential lines.

Many businesses may already have the necessary procedures in place to comply with the new rule, as many of the new requirements, including the written consent requirement, are designed to harmonize FCC regulations with those of the Federal Trade Commission.  Still, though the new rule does not go into effect until October 16, 2013, the clock is ticking.

Author: "David Munkittrick" Tags: "Direct Marketing, TCPA"
Date: Thursday, 29 Aug 2013 23:13

David Munkittrick

In a world full of electronic information (not to mention hackers and identity thieves), data breaches—the loss, theft, or unauthorized access to data—are a reality for companies that collect and store personal information. Breaches can occur in myriad ways: a hacker gains access to a database or an unencrypted laptop is stolen, to name but two prevalent examples. Almost as regular as night follows day, class action lawsuits follow data breaches—and as the volume of data breaches increases, so too does the volume of litigation.  Yet federal standing and pleading requirements have thus far posed significant hurdles for plaintiffs.

Click here to read more in an article by Margaret Dale and David Munkittrick, members of Proskauer’s Privacy and Data Security Group.

Author: "David Munkittrick" Tags: "Articles, Data Breaches"
Date: Thursday, 22 Aug 2013 03:37

Jessica Goldenberg

In February of 2013, President Obama signed an executive order with the purpose of creating a cybersecurity framework (a set of voluntary standards and procedures) to encourage private companies that operate critical infrastructure to take steps to reduce their cyber risk (see our blog here). Critical infrastructure systems such as the electric grid, drinking water, and trains are considered vulnerable to cyber attack, and the results of such an attack could be debilitating. The Departments of Commerce, Homeland Security, and Treasury were tasked with preparing recommendations to incentivize private companies to comply with heightened cybersecurity standards. On August 6, 2013, the White House posted its preliminary list of incentives encouraging the adoption of cybersecurity best practices.

The draft framework of incentives is not due until October of this year, when it will be published for public comment. A final version is expected in February 2014. The August 6th post serves as an interim step, giving the private sector an opportunity to consider the recommendations and provide feedback.

In the post, Michael Daniel, Special Assistant to the President and the Cybersecurity Coordinator, lists eight ideas, summarized below.

  • Cybersecurity Insurance – engage the insurance industry with the goal of creating a competitive cyber insurance market.
  • Grants – make participation in the cybersecurity programs a condition or criterion for a federal critical infrastructure grant.
  • Process Preference – make participation a consideration in the government’s determination of whether to expedite existing government service delivery.
  • Liability Limitation – agencies are looking into whether reducing participants’ liability exposure in certain areas (for example, through limits on tort liability, limited indemnity, or higher burdens of proof) would encourage critical infrastructure companies to implement the framework.
  • Streamline Regulations – streamline existing cybersecurity regulations and develop ways to make compliance easier, such as by eliminating overlaps in existing laws and reducing audit burdens.
  • Public Recognition – agencies are exploring whether giving companies the option of public recognition for participation in the programs would work as an incentive.
  • Rate Recovery for Price-Regulated Industries – engage federal, state, and local regulators on whether utility rates could be set to allow recovery of investments related to adopting the cybersecurity framework.
  • Cybersecurity Research – identify areas where commercial solutions are possible but do not yet exist. The government can then focus research and development on the most pressing cybersecurity issues.

The August 6th report offers an “initial examination” of ways to incentivize the adoption of cybersecurity measures by private companies in the critical infrastructure sector. Discussions with the industry will help determine which direction the government ultimately takes with its cybersecurity framework.

Author: "Jessica Goldenberg" Tags: "Data Breaches, Data Privacy Laws, Nation..."
Date: Tuesday, 20 Aug 2013 18:16

Ryan Blaney

We have heard the well-publicized stories of stolen laptops and resulting violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), and we generally recognize the inherent security risks and potential for breach of unsecured electronic protected health information posed by computer hard drives. We remember to “wipe” the personal data off of our phones or computers before they are disposed of, donated, or recycled.

A recent HIPAA settlement offers a costly reminder that other types of office equipment we use regularly have similar hard drives capable of storing confidential personal information.

On August 14, 2013, HHS announced a $1,215,780 settlement with the not-for-profit managed care plan Affinity Health Plan, Inc., stemming from an investigation of potential violations of the HIPAA Privacy and Security Rules relating to an April 15, 2010 breach report filed by Affinity with the HHS Office for Civil Rights (OCR). Affinity’s breach report and OCR’s subsequent investigation revealed that Affinity had impermissibly disclosed the protected health information of up to 344,579 individuals when it returned multiple photocopiers to leasing agents without erasing the photocopier hard drives. Affinity learned of the breach when a representative from CBS Evening News informed the New York health plan that, as part of an investigatory report, CBS had purchased a photocopier previously leased by Affinity and had found confidential medical information on the photocopier’s hard drive. OCR’s investigation indicated that Affinity had failed to assess the potential security risks and implement policies for the disposal of protected health information stored on the photocopier hard drives.

In addition to the financial settlement, the Resolution Agreement includes a corrective action plan (CAP) requiring Affinity to use its “best efforts to retrieve all photocopier hard drives that were contained in photocopiers previously leased by [Affinity] that remain in the possession of [the leasing agent].” The CAP also requires Affinity to conduct a comprehensive risk analysis and implement safeguards to protect electronic protected health information on all of its electronic equipment and systems.

For more than ten years, digital copiers have been capable of storing images of documents. This settlement should serve as a warning to entities and individuals who handle electronic personal health information: any and all equipment capable of storing trace amounts of digital information should be accounted for in risk assessments conducted under the HIPAA Security Rule.  All HIPAA Privacy and Security Policies and Procedures Manuals should be updated to include guidelines for safeguarding protected health information retained on digital copiers, scanners, fax machines and other devices whose primary function may not be data storage.
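The principle behind “wiping” applies at the software level as well as the hardware level. The sketch below is a hypothetical illustration of overwriting a file’s contents before deletion; real media sanitization for devices such as copier hard drives operates on whole drives and typically requires vendor tools or physical destruction, so this is an illustration of the concept, not a compliance procedure.

```python
# Hypothetical sketch: overwrite a file's bytes with random data
# before deleting it, so the content is not trivially recoverable.
# Real drive sanitization (e.g., for leased copier hard drives)
# operates on whole devices and is outside the scope of this sketch.
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents several times, then remove it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())       # force the write to disk
    os.remove(path)
```

Note that on journaling or copy-on-write filesystems and on SSDs, overwriting in place does not guarantee the old blocks are destroyed, which is one reason whole-device sanitization is the standard for disposal.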

By Ryan Blaney and Kelly Carroll

Author: "Ryan Blaney" Tags: "Data Breaches, HIPAA, Identity Theft, Me..."