PRIVACY & INFORMATION LAW UPDATE
Federal Trade Commission and Department of Commerce to Issue Final Reports
Privacy Law: Proposed Changes to EU Data Privacy Framework and Recent
Decision on Industry Self-Regulation Foreshadow Compliance Challenges
PROPOSED REVISIONS TO THE EU DATA PRIVACY FRAMEWORK.
The new framework would replace the Article 29 Working Party (the influential advisory body created by the Directive) with a European Data Protection Board that has an expanded role. The new regulation would also better harmonize the data protection laws of the member states: unlike a directive, which must be adopted in some form by each member state before it can be enforced, a regulation is immediately enforceable in all member states once adopted. This change would mean less fragmentation of EU privacy law, and therefore greater legal certainty for US businesses, although the proposed framework includes a number of changes that could significantly affect how US companies collect, retain, and use EU citizens' data. We summarize some key changes below:
SCOPE. If approved by the European Parliament and the Council of Ministers, the proposed framework would apply to any company anywhere in the world whose activities are directed to EU citizens, whether or not the company has any offices located in the EU.
US-EU SAFE HARBOR. The proposed framework would put the current US-EU safe harbor in jeopardy, creating a much more burdensome process for businesses that want to legally transfer data from EU member countries to the US. Under Articles 37-43, data could be transferred between EU and non-EU countries in two situations: when the European Commission has issued an adequacy decision regarding the level of data protection in the non-EU country, or, when no adequacy decision has been made, when companies use Binding Corporate Rules or standard or approved contractual data protection clauses. There is no provision preserving the safe harbor beyond the possibility of the EU making an adequacy decision with regard to US privacy protections. Adequacy decisions may be general or sectoral (applying only to a particular type of data or industry), and will require an assessment of the adequacy of the country's laws, including enforceable data protection and privacy rights and judicial redress, the presence of an independent data protection supervisory authority in the country, and the country's international treaties and commitments.
NEW RIGHTS AND OBLIGATIONS.
The proposed rules also create new rights for individual data subjects and impose new obligations on data controllers, including:
Right to be Forgotten. Article 15 of the proposed framework gives individuals the right to have their data erased by the data controller (the company or public entity that holds their information) in certain, broadly defined circumstances. For example, data subjects can demand that their data be erased when it is no longer necessary for the purpose for which it was collected, or simply by withdrawing consent.
Right to Data Portability. Under Article 16 of the proposed framework, individuals would have the right to obtain, in non-proprietary formats, the information collected about them. They would also have the right to transfer their data from one system to another, in a commonly used format, without hindrance from the company originally holding the data. This provision could force companies like Facebook and Google to allow their customers to transfer data between the two companies' products, such as transferring friend lists from Facebook to Google+ or transferring emails from Gmail to Facebook messages.
Responsibility for Data Protection. The responsibility for protecting data would rest with both the controller (who collects the data) and the processors (vendors who process the data on behalf of the controller). These requirements would include documenting all processing operations, implementing security measures, and providing data breach notification to the EU supervisory authority and affected data subjects within 24 hours of the breach.
Data Protection Officers. Companies with over 250 employees, as well as all public entities, would be required to have a data protection officer.
U.S. PATRIOT ACT. The proposed framework would also complicate data transfers outside of the EU for non-EU governments that want to collect data on EU citizens. In particular, the regulation would impact the ability of the US government to collect information from U.S. companies about EU citizens under the PATRIOT Act. Under Article 42 (with the caveat that international treaties between non-EU countries and member countries take precedence), any company that receives a judicial, administrative, or executive decision or request from a non-EU country to turn over data it has collected about an EU citizen must first obtain the authorization of the relevant EU country's data protection authority. The EU supervisory authority would base its decision on whether the request is necessary, legally required, and otherwise within the regulation's bounds. The company would also be required to inform the data subject of the request and the authority's decision.
PENALTIES. The new framework would impose significant penalties for noncompliance. Under Article 79, companies could be fined anywhere from 1% to 5% of their annual global revenues, depending on which rules are violated. For example, failure to provide required notice of a data breach would expose a company to the heaviest fine of 5% of annual global revenues.
ONLINE BEHAVIORAL ADVERTISING
The Article 29 Working Party also recently rejected self-regulatory proposals by the behavioral advertising industry.2 The opinion concluded that the self-regulatory code proposed by the European Advertising Standards Alliance and the Interactive Advertising Bureau Europe (EASA/IAB), which count Google, Microsoft, and Yahoo among their members, does not comply with the current e-Privacy Directive. The Working Party likewise rejected the industry's website, www.youronlinechoices.eu, as giving the false impression that it is possible to choose not to be tracked while surfing the Web.
Two aspects of the self-regulatory program were of particular concern. The first involved an icon approach, by which the EASA/IAB would include an icon on all ads linking to www.youronlinechoices.eu, where users could opt out of ads from particular companies and ad networks. The Working Party concluded that this approach did not, on its own, sufficiently inform users about cookies. It cited a number of reasons, including that the icon is not widely recognized, that the icon and website do not clearly differentiate between advertising and behavioral advertising, do not allow for prior consent, and do not provide easily understandable information about advertising networks and their reasons for processing information.
The opinion also clarified a number of issues regarding cookies and consent. For example, any unique identifiers, even if anonymized, are considered personal data and therefore subject the collecting and processing entities to the data protection rules. The opinion also offered alternatives to the industry's endorsement of pop-ups for obtaining consent, including a static banner at the top of the page asking for consent to set cookies, a splash screen on entering the website, and default settings that prevent the collection or disclosure of personal data.
The proposed privacy framework will mark the beginning of increased compliance challenges for U.S. businesses that handle EU citizen data. The Working Party's rejection of a self-regulatory proposal for behavioral advertising can be seen as further evidence that European policymakers believe that only stringent rules will protect personal privacy. Many of the proposed privacy framework's key components reflect the belief that current privacy law does not go far enough to protect privacy, particularly given the growth of technology, services, and applications since the Directive was implemented. Accordingly, U.S. businesses should monitor developments as the proposed framework is considered by the European Parliament and be prepared to adapt their business and legal strategies accordingly.
House Passes Legislation Amending Video Privacy Protection Act
The VPPA was passed by Congress in response to a newspaper report revealing the movie rental history of Robert Bork, who was at the time a nominee to the Supreme Court. The law applies to "video tape service providers," defined as anyone engaged in the business of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials, and protects personally identifiable information (PII), defined as information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider. Currently, video tape service providers are prohibited from disclosing consumer PII to anyone without the "informed, written consent" of the consumer, given at the time the disclosure is sought.
The amended law modifies the "informed, written consent" requirement. The new version explicitly states that consent to disclose PII may be given electronically and through the Internet, but requires that such consent be in a form separate and distinct from other policies and registration forms. The bill also alters the time at which consent may be given. Although consent at the time disclosure is sought is still permissible, a video tape service provider may also obtain blanket consent in advance, for a set period of time or until consent is withdrawn.
If enacted, these amendments would mean that a video rental company, including a streaming video provider such as Netflix, could share its customers' video rental histories with the public on an ongoing basis. Specifically, Netflix and other streaming video providers such as Hulu, Amazon, and YouTube would be permitted to obtain their customers' permission to share on Facebook, Twitter, and other social media what movies a customer has rented or watched.
Department of Education Issues Updated FERPA Rule
Under FERPA, directory information is personally identifiable information (PII) about students that schools may disclose without written consent from a parent or, in the case of a student who is 18 or older, from the student. Each school individually defines what constitutes directory information, though it typically includes name, address, and phone number for use in a student directory or a yearbook. Parents and adult students must be given the opportunity to opt out of such disclosures.
Instead of the all-or-nothing approach under the old rules, where all directory information was publicly available, the new rules authorize schools to designate particular purposes or parties that may receive directory information. Therefore, a school that lists students names and addresses as directory information has the option of, for example, specifying that directory information may only be disclosed to employees of the school for educational purposes.
The new rules do not require granular controls: schools need not allow parents to opt out of specific categories of directory information while permitting others, though they must at least provide the option of opting out of directory information disclosures entirely. For example, schools are required to allow parents and adult students to opt out of all directory information disclosures; they may, but are not required to, offer the option of opting out of particular types of disclosures, such as withholding consent for a child's address while allowing the child's name to appear in a student directory.
The new rules also clarify how directory information can be used for student ID cards or badges. Under the new rules, students do not have the right to opt out of the disclosure of directory information that is displayed or otherwise contained on a student ID. Schools may require students to wear, display, or disclose a student ID that exhibits PII, even if that information is otherwise designated as directory information and the student has opted out. However, schools may only require disclosure of information on a student ID if that information cannot be used to gain access to education records without additional authorization safeguards.
AUDIT & EVALUATION AND STUDIES EXCEPTIONS
Perhaps the most significant amendments to the FERPA rules are those concerning the "audit and evaluation" and "studies" exceptions. In recent years, states have been attempting to implement statewide longitudinal data systems to study the efficacy of their schools. These exceptions are designed to allow schools to disclose student PII to third parties for the purposes of studying, auditing, and evaluating their education programs and their compliance with FERPA and various other state and federal laws. Specifically, the rules allow schools to provide "authorized representatives" with student information for the purpose of evaluating "education programs." The amendments define these two previously undefined terms in ways that expand access to education records but also provide additional safeguards in light of that expanded disclosure.
The new definition of "education program" increases access by clarifying the variety of programs that may be evaluated under the rule. Education programs are defined as any program principally engaged in the provision of education, including, but not limited to, early childhood education, elementary and secondary education, postsecondary education, special education, job training, career and technical education, and adult education, as well as any program administered by an educational agency or institution.
The new rules also define who may conduct these evaluations, expanding the pool of those who may receive education records from schools. Authorized representatives include individuals or entities designated by FERPA-permitted entities to carry out an audit or evaluation of Federal- or State-supported education programs, or to enforce or ensure compliance with Federal legal requirements related to those programs. For example, a state may require its high schools to contract with a research organization to evaluate the effects of standardized testing on college readiness. The research organization contracted by a school is an authorized representative of that school and may receive student PII from the school.
The new rules do impose certain safeguards on the data shared with these authorized representatives. The rules state that entities covered by FERPA must have a written agreement with their authorized representatives, and that the entity from which the PII originated is responsible for using reasonable methods to ensure its authorized representative complies with FERPA (emphasis added). The written agreement must describe (1) how the audit, evaluation, or study falls under the relevant exception to the disclosure rules, (2) what PII from education records will be disclosed and how it will be used, and (3) how and when the authorized representative will be required to destroy the data it received. The written agreement must also establish policies and procedures for protecting PII from further disclosure and unauthorized use.
The new rules also clarify when an audit, evaluation, or study invoking this exception is authorized. DOE states that FERPA itself does not confer the authority to conduct an audit, evaluation, or enforcement or compliance activity. So, while the rule permits disclosure of PII for such activities, the disclosure is permitted only if state, local, or other federal law authorizes the audit, evaluation, or study.
The new rules authorize the DOE Secretary of Education to enforce FERPA against any entity that receives funds under any program administered by the Secretary, including funds provided by grant, cooperative agreement, contract, subgrant, or subcontract. In comments accompanying publication of the new rule, DOE noted that it was previously unclear whether the agency's Family Policy Compliance Office (FPCO) could enforce FERPA against entities that do not have students in attendance, such as State Education Agencies (SEAs) and student loan lenders. The new rules clarify that all such entities are responsible for complying with FERPA and protecting the PII in any education records they hold.2
Entities receiving department funding are responsible for FERPA violations by their authorized representatives. As noted above, authorized representatives are third parties who might not receive funding from DOE but who contract with a school or other entity subject to FERPA to provide audit, evaluation, enforcement, or compliance services, and who may be given student PII to do so. Authorized representatives themselves are not liable under FERPA because they do not receive DOE funding. If an authorized representative does receive DOE funding (for example, certain research institutions), FPCO may directly enforce FERPA against it. Authorized representatives could also face liability for data breaches, breach of contract, or other privacy violations under state and federal tort, contract, and privacy laws.
Accordingly, if the FPCO finds an improper re-disclosure of PII by an authorized representative in the context of the audit and evaluation or studies exceptions, the entity from which the PII originated is prohibited from giving the party responsible for the re-disclosure (the authorized representative) access to PII from education records for at least five years. The same bar applies if either party is found to have improperly disclosed PII in the context of an audit, evaluation, or study.
Cloud Computing. DOE is still evaluating how cloud computing fits within FERPA's framework for protecting student privacy, in light of the fact that data retention is generally moving from local servers to cloud-based storage. The agency recommended that States adopt security plans that protect student data wherever it is stored.
Social Security Numbers. Commenters on the proposed rule raised concerns about the use of student Social Security Numbers (SSNs) as unique identifiers to link student records. DOE has construed FERPA as prohibiting schools from designating student SSNs as directory information. The agency acknowledges, however, that a unique identifier is needed for linking records, particularly when States wish to conduct longitudinal studies. DOE's current position is that while FERPA does not expressly prohibit States from using student SSNs, best practices dictate that States should limit their use of SSNs to instances in which there is no other feasible alternative.3
The new rules reflect an attempt to balance student privacy against the need to use student data to support education reform. The old rules were ambiguous about the extent to which schools could share student data for this purpose. The new rules attempt to eliminate much of this uncertainty while fine-tuning enforcement mechanisms to address unauthorized disclosure and misuse of student data. Local school districts and their third-party consultants, as well as nonprofits that receive federal DOE funding, should undertake a comprehensive review of their FERPA compliance policies and practices and update them as warranted in light of the new rules.
Family Educational Rights and Privacy, 76 Fed. Reg. 75,604-58 (Dec. 2, 2011) (to be codified at 34 C.F.R. pt. 99).
Can't Fire Me, I Know Too Much: Adoption of Facial Recognition
Technology Triggers Investigations in the U.S. and Abroad
Although facial recognition technology has many current and future benefits, including authenticating identity and assisting law enforcement investigations, the potential for unauthorized access and misuse has prompted policymakers to closely examine the technology.
The FTC proceeding comes at a time when facial recognition privacy concerns are being raised globally. For example, last year the European Union (EU) Article 29 Working Party launched an investigation into Facebook's Tag Suggestions feature to ascertain whether it violates EU data protection laws. Tag Suggestions recognizes friends' faces based on photographs posted on Facebook and prompts users to confirm the identities of friends in new photos uploaded to Facebook profiles. The head of the Hamburg (Germany) Data Protection Authority asked Facebook to disable the tagging feature on grounds that it may violate EU privacy laws, and the UK Information Commissioner's Office (ICO) released a statement expressing concern about the privacy issues raised by facial recognition technology.
The FTC proceeding was initiated after the agency's earlier workshop on the topic. Panelists included privacy advocates, legal scholars, and representatives from companies including SceneTap, Facebook, and Google. The panelists discussed possible uses of the technology and how best to protect consumers, whether through privacy-by-design, regulation, or robust security. They also raised questions about what expectations people have regarding the collection of facial recognition information and how much control individuals should have over such data.
In light of the issues raised at the workshop, the FTC is seeking public comment on a range of questions.
Facial recognition technology has myriad current and future uses beyond photo tagging. SceneTap uses the technology to evaluate demographics (gender ratio, average age, and crowd size) at clubs and bars, combined with social reviews, to help its users decide where to go at night. iPhones and Android phones now have apps that allow owners to unlock their phones by snapping a photo of their face with the phone's camera. The technology is already being used by public schools for security, and by retailers to gauge customer reactions to brands and product placement for targeted ads. Future uses could include authentication for ATM use or vehicle entry.
Over the next few years, lawmakers can be expected to craft measures that will regulate the use of facial recognition technology, particularly with regard to children. Although it is too early to predict what approach will be taken in the United States, it is possible that the framework will be incorporated into existing privacy laws or self-regulatory schemes, particularly those dealing with biometrics, or through the passage of specific facial recognition legislation. Thus far, state legislatures have taken the lead on regulating the use of biometric technologies.
Organizations that employ facial recognition technology, or are considering doing so, should monitor proceedings in the U.S. and abroad and be prepared for robust compliance obligations.
Court Rules that California's Song-Beverly Act Does Not Apply to Online Transactions
These decisions clarify an area of great uncertainty for online retailers following last year's decision in Pineda v. Williams-Sonoma. In that case the California Supreme Court ruled that zip code information is PII under Song-Beverly, and that retailers that request and record zip code information during a commercial transaction do so in violation of the Act. As a result, hundreds of lawsuits were filed against brick-and-mortar and online retailers. (Governor Brown subsequently signed legislation that created an exemption for retail gas stations seeking customer zip code information during self-service gas transactions.) Pineda created potentially significant Song-Beverly exposure for online retailers. Absent appellate review, that risk appears to have been eliminated by the decisions in Salmonson and Mehrens, at least in California.
Online retailers can nonetheless expect that courts in other jurisdictions will be asked to apply similar laws to online sites and services. For example, just this month a federal court in Massachusetts concluded that zip code information is personal information under a Massachusetts law similar to California's. The Court ultimately found that the Plaintiff failed to show cognizable harm. The case is significant, however, because it demonstrates that the plaintiffs' bar remains intent on testing the reach of existing laws while targeting online businesses. Accordingly, the outcomes of any similar actions should be closely examined for their potential impact on the ability of online retailers to collect zip code and other customer data.
United Kingdom ICO Commissioner Gives Industry One Last Chance Before Initiating
Enforcement of New Privacy Rules
Companies should take advantage of the grace period to familiarize themselves with their tracking technology, including the nature, purpose, and extent of their cookie use and the opportunities provided to users to give or withhold consent.
Cookie Lawsuit Against Amazon Dismissed
The Court rejected all of the claims on grounds that the Plaintiffs failed to allege sufficient damages. The CFAA requires aggregated damages of at least $5,000 to be cognizable. The Court found that the Plaintiffs did not plead any facts that would allow the Court to reasonably find that this threshold amount was met. For example, the Plaintiffs alleged two categories of loss: the devaluing of their personal information once it was collected by Amazon, and the diminished performance of their computers as a result of the cookies. The Court found, however, that the Plaintiffs pleaded no facts that the claimed losses were anything but speculative. The CPA claim, which requires a specific showing of injury, failed for the same reason.
Del Vecchio v. Amazon, Inc., No. 11-cv-00366 (W.D. Wash. filed
March 2, 2011), available at http://docs.justia.com/cases/federal/district-courts/washington/wawdce/2:2011cv00366/174037/58/.
Facebook and Zynga Prevail in Privacy Litigation
In reaching this conclusion the Court noted that the plaintiffs asserted two inconsistent theories: (1) that Facebook is a remote computing service (RCS) whose purpose is storing and processing the information provided by the plaintiffs, and (2) that the intended recipients of the plaintiffs' information were advertisers. If the intended recipients were the advertisers, the Court reasoned, Facebook would not be an RCS; instead, Facebook would be an Electronic Communications Service, which cannot be held liable under the SCA for transmitting information to its intended recipient. If the information was sent to Facebook for processing and storage, then Facebook was the intended recipient and the information cannot have been intended for advertisers as alleged. In finding that the Facebook plaintiffs failed to state a CCDAFA claim, the Court noted that the CCDAFA prohibits the introduction of a "computer contaminant" into any computer to usurp the normal operation of a computer; the plaintiffs, however, alleged that the Referrer Headers at issue were a standard web browser function. The breach of contract and fraud claims also failed because the plaintiffs did not show any actual and appreciable damages. The Court found likewise in Zynga with regard to the breach of contract claim, admonishing that there is "no controlling authority and little clearly persuasive authority" that the sharing of personal information constitutes a loss in value.3
In re Facebook Privacy Litigation, No. 10-cv-02389 (N.D. Cal. filed May 28, 2010).
Copyright © 2010 St. Ledger-Roty & Olson, LLP.