St. Ledger-Roty & Olson LLP

PRIVACY & INFORMATION LAW UPDATE
January 2012
A bimonthly update of trends and developments in privacy law & policy

Karen L. Neuman, Editor

  • You are receiving this publication because of your interest in privacy and data security. It is for informational purposes, including advertising, only and is not a substitute for legal advice.
  • Not interested? Unsubscribe. Please forward to someone who might be. View previous issues.
  • If someone sent you this publication, subscribe to receive future issues.

In this Issue:
FEATURE ARTICLE:
Ringing in the New Year: Fine-tuning Your Privacy Focus in 2012
Federal Trade Commission and Department of Commerce to Issue Final Reports
European Privacy Law: Proposed Changes to EU Data Privacy Framework and Recent Decision on Industry Self-Regulation Foreshadow Compliance Challenges
House Passes Legislation Amending Video Privacy Protection Act
Department of Education Issues Updated FERPA Rule
“They Can’t Fire Me, I Know Too Much”: Adoption of Facial Recognition Technology Triggers Investigations in the U.S. and Abroad

UPDATES:
Federal Court Rules that California’s Song-Beverly Act Does Not Apply to Online Transactions
United Kingdom ICO Commissioner Gives Industry One Last Chance Before Initiating Enforcement of New Privacy Rules
Flash Cookie Lawsuit Against Amazon Dismissed
Facebook and Zynga Prevail in Privacy Litigation

 

Feature Article:
Ringing in the New Year: Fine-tuning Your Privacy Focus in 2012
By Karen Neuman

Privacy and data security promise to be top priorities for both new enterprises and established entities in 2012, particularly with the expansion and evolution of cloud- and GPS-based technologies. Now is the time to anticipate and plan for the following:

  • Contractual Allocation of Privacy Responsibility, Risk and Liability in the Cloud. Many cloud storage service contracts favor the cloud service provider but lack important protections for the businesses that use these services. In light of last year’s well-publicized data breaches involving the financial services, health, gaming, and higher-learning sectors, and even a large cloud service provider, organizations should negotiate for something other than a “take it or leave it” arrangement with their providers. At a minimum, organizations should contract for the following critical protections:
    Read more...

 

Federal Trade Commission and Department of Commerce to Issue Final Reports
By Karen Neuman

The FTC and DOC are expected to release final versions of previously issued draft privacy reports as early as the end of this month or the beginning of February. The FTC’s proposed framework for protecting consumer privacy stopped short of calling for comprehensive privacy legislation. Instead, it endorsed the adoption of do-not-track mechanisms and privacy by design.
Read more...

 

European Privacy Law: Proposed Changes to EU Data Privacy Framework and Recent Decision on Industry Self-Regulation Foreshadow Compliance Challenges
By Karen Neuman and Ari Z. Moskowitz

Over the last few months, the European Union (EU) has signaled that it intends to implement a strengthened data privacy protection framework for member states. A draft version of the “General Data Protection Regulation” was leaked1 in December and could be adopted as early as the end of this month. If adopted, it will replace the current European Data Protection Directive (Directive), which has been on the books since 1995. Meanwhile, in December an industry self-regulatory proposal involving online behavioral advertising was resoundingly rejected. These developments are discussed below.
Read more...

 

House Passes Legislation Amending Video Privacy Protection Act
By Ari Z. Moskowitz

On December 6, 2011 the U.S. House of Representatives passed a bill to amend the Video Privacy Protection Act of 1988 (VPPA). The bill has been sent to the Senate, where it was referred to the Judiciary Committee. The amendments would allow a video rental provider to disclose the video rental histories of its customers to anyone on an ongoing basis, with the blanket consent of the customer. Netflix, in particular, had been publicly lobbying for this bill since it announced the integration of its video rental and streaming service with Facebook on September 22, 2011. Since that date, Netflix users in every country except the United States have been able to automatically share with their friends over Facebook what videos they have watched on Netflix. Netflix stated that the VPPA prevented it from opening up this feature in the U.S.
Read more...

 

Department of Education Issues Updated FERPA Rule
By Karen Neuman and Ari Z. Moskowitz

On January 3, 2012, new rules amending the Family Educational Rights and Privacy Act (FERPA) rules issued by the Department of Education (DOE) in December went into effect.1 They apply to local school districts, institutions of higher learning, and nonprofits and other entities receiving DOE funds. The new rules were the result of a DOE initiative to address the increasingly data-driven nature of public education at both the K-12 and higher education levels. The new rules provide for greater access to student data while creating more granular privacy controls for educational agencies and institutions over certain information, and expand DOE enforcement authority for non-compliance. Specifically, the rules make changes to three fundamental components of FERPA’s privacy protection framework: (1) directory information, (2) enforcement, and (3) the audit & evaluation exception and studies exception.
Read more...

 

“They Can’t Fire Me, I Know Too Much”: Adoption of Facial Recognition Technology Triggers Investigations in the U.S. and Abroad
By Karen Neuman and Ari Z. Moskowitz

At the end of December 2011, the Federal Trade Commission (FTC) announced that it is seeking public comment on facial recognition technology and its implications for privacy and security. Comments are due January 31, 2012. Although facial recognition technology has many current and future benefits, including authenticating identity and assisting law enforcement investigations, the potential for unauthorized access and misuse has prompted policymakers to closely examine the technology.
Read more...

UPDATES
 

Federal Court Rules that California’s Song-Beverly Act Does Not Apply to Online Transactions

On January 9, 2012 the U.S. District Court for the Central District of California dismissed two actions brought against online retailers under California’s Song-Beverly Act, which limits the collection of personally identifiable information (PII) by retailers. In Salmonson v. Microsoft Corp., No. 2:11-cv-05449-JHN-JC (C.D. Cal. Jan. 6, 2012), the Court ruled that Song-Beverly does not apply to requests for zip code information in online credit card transactions. In Mehrens v. Redbox Automated Retail, No. 2:11-cv-02936-JHN-Ex (C.D. Cal. Jan. 6, 2012), the Court ruled that Song-Beverly does not apply to requests for zip code information involving self-service DVD rentals. The Court reasoned in part that the risk of fraud in online and self-service transactions precludes construing the statute to encompass these transactions.
Read more...

 

United Kingdom ICO Commissioner Gives Industry One Last Chance Before Initiating Enforcement of New Privacy Rules

On December 13, 2011 Commissioner Christopher Graham of the United Kingdom’s Information Commissioner’s Office (ICO) warned on the ICO blog that industry was not doing enough to prepare for enforcement of the Privacy and Electronic Communications Regulations, which implement a European directive and took effect in the UK in May 2011. The ICO gave companies a 12-month grace period, ending May 26, 2012, to comply with the rules before it begins enforcement actions. The rules require that companies in the UK obtain consent before depositing cookies on users’ computers.
Read more...

 

Flash Cookie Lawsuit Against Amazon Dismissed

On December 1, 2011 a federal district court dismissed Del Vecchio v. Amazon1 with leave to amend for failure to “establish any plausible harm.”2 This action, like lawsuits previously discussed, concerned Amazon’s use of flash cookies. The Plaintiffs alleged that Amazon violated the Computer Fraud and Abuse Act (CFAA) and Washington’s Consumer Protection Act (WCPA) when Amazon (1) took advantage of a weakness in Microsoft’s Internet Explorer that downloaded cookies contrary to the browser’s cookie filtering settings, (2) used Adobe Flash Local Stored Objects (Flash Cookies), which are not deleted with standard cookies, to track users, and (3) shared the information gathered by these cookies with third parties in violation of Amazon’s privacy policy.
Read more...

 

Facebook and Zynga Prevail in Privacy Litigation

On November 22, 2011 a federal district court dismissed with prejudice the related actions In re Facebook Privacy Litigation1 and In re Zynga Privacy Litigation2. As previously discussed in the broader context of legal and policy developments involving mobile apps (SLRNO Privacy & Information Law Update February 2011), the Complaints alleged that the companies shared information with app developers and third parties through referrer headers, in breach of contract and through fraud, as well as in violation of the Stored Communications Act (SCA) and California’s Comprehensive Computer Data Access and Fraud Act (CCDAFA). The SCA issues raised in the two actions were substantially similar, and both cases were heard by the same judge; the order dismissing In re Zynga cites the Facebook dismissal and notes that the Zynga plaintiffs’ SCA claims failed for the same reasons as the Facebook plaintiffs’: namely, that the plaintiffs failed to state a claim under the SCA because a defendant cannot be liable for divulging communications to the intended recipient of those communications.
Read more...

 


Feature Article:
Ringing in the New Year: Fine-tuning Your Privacy Focus in 2012
By Karen Neuman

Privacy and data security promise to be top priorities for both new enterprises and established entities in 2012, particularly with the expansion and evolution of cloud- and GPS-based technologies. Now is the time to anticipate and plan for the following:

  • Contractual Allocation of Privacy Responsibility, Risk and Liability in the Cloud. Many cloud storage service contracts favor the cloud service provider but lack important protections for the businesses that use these services. In light of last year’s well-publicized data breaches involving the financial services, health, gaming, and higher-learning sectors, and even a large cloud service provider, organizations should negotiate for something other than a “take it or leave it” arrangement with their providers. At a minimum, organizations should contract for the following critical protections: 1) quality-of-service commitment provisions with meaningful recourse against the service provider for catastrophic data loss, outages, and security lapses; and 2) system operation audit rights to ascertain such factors as storage capacity and the physical location of data (different jurisdictions offer varying levels of protection for certain types of data; likewise, different requirements may be imposed not only on the cloud service provider but also on the businesses that use cloud services to store customer data).
  • Increased Focus on Data Security Training. The accelerated adoption of tablets and smartphones, and the corresponding explosion of the mobile app sector for a wide variety of uses, will provide enhanced opportunities for unauthorized access to personal information. Some data protection regulatory schemes already require that websites and online services implement data security protocols that include employee training. Organizations in the information “supply” and “distribution” chain should expect greater scrutiny from regulators and the plaintiffs’ bar about how their employees and independent contractors interact with personal information, particularly sensitive data. Organizations that increase their focus on data security training can minimize legal and business risk by taking some practical steps, including: 1) ascertaining which jobs require access to data, 2) categorizing data and corresponding levels of responsibility over it, and 3) implementing clear data security policies and procedures that place employees and independent contractors on notice of the consequences of noncompliance.
  • Children’s Privacy. Computing devices, including mobile devices, are being adopted by younger and younger children. Parents are also enabling their children to access sites and services, including social media and location services, that were originally intended for adults. As a result, the FTC initiated proceedings to update the Children’s Online Privacy Protection Act (COPPA) rule and published a proposed rule in September 2011. As the FTC endeavors to adapt the COPPA rule to changes in the adoption and use of technology by children, the final version may be modified to capture intervening developments. Businesses should be prepared for more stringent requirements in order to remain COPPA compliant, and organizations that have been beyond COPPA’s reach may be surprised to find that they are subject to the new rule. In addition, the plaintiffs’ bar has been emboldened by recent class actions involving children’s privacy and consumer protection and can be expected to continue to target practices that are inconsistent with stated privacy practices or otherwise circumvent parental notice and consent. Finally, proposed changes to European privacy law will include more stringent requirements for the collection of personal information from children, a change that will have a significant impact on a wide swath of U.S.-based businesses operating in EU member countries. Unlike the U.S. COPPA rule, which defines children as individuals under 13, the proposed EU regulation defines children as those under 18. The proposed changes would require that any communication directed at a child be in “clear and plain language” that a child can understand and that any consent by a child to data collection or disclosure be valid only when authorized by a parent. The proposed rules also note that the many privacy rights granted to EU citizens, such as the right to be forgotten, apply particularly to data collected from children.
  • Location Services. Last year’s revelations that iOS and Android devices captured and stored users’ location data ignited a firestorm, and drew the attention of Congress and class action lawyers to privacy risks presented by GPS-enabled location services and technologies. These services and technologies include social “check-in” applications, “geo-fencing” search and advertising applications (technology that enables marketers or merchants to know when a mobile device crosses a digital boundary, such as a neighborhood or section of a mall), and precision services for agriculture, auto manufacturing, transportation, shipping and banking. A Federal Communications Commission (FCC) working group has been considering location privacy issues in tandem with its push for nationwide mobile broadband deployment and could recommend that the FCC adopt rules to protect the privacy of mobile device users. The FTC has considered the issue in the broader context of its proposed framework for protecting consumer privacy (see above). Several location privacy measures are pending in Congress, including a measure introduced by Senator Al Franken (D-MN) that would make it more difficult for device manufacturers, developers and carriers to share users’ data with third parties, including, for example, by selling a user’s location to marketers. Businesses that offer location services should, at a minimum, monitor these developments and adhere to industry best practices.
  • Banking Privacy. The Dodd-Frank Wall Street Reform and Consumer Protection Act created the Consumer Financial Protection Bureau (CFPB) and authorized it to promulgate consumer privacy rules. The CFPB faces an uncertain future in light of partisan opposition in Congress to its creation and powers, and the President’s recess appointment of a director threatens to further distract the agency from its mission. Nevertheless, Dodd-Frank as enacted remains on the books. The law grants consumers rights to access their financial records, and banks and other financial institutions will be required to maintain integrated databases to fulfill their corresponding obligations. As a result, they must undertake comprehensive data security reviews and implement measures to ensure that customer data is adequately protected, updated and accurate.

    The financial services industry is also expected to accelerate its adoption of behavioral advertising. Accordingly, banks and other financial services providers may find themselves subject to privacy and consumer protection obligations that previously were not factored into legal and compliance strategies. At a minimum, financial services providers should undertake a comprehensive review of privacy policy disclosures and align them with behavioral advertising practices. Likewise, a significant expansion of mobile banking, including mobile transactions and payment processing, is expected this year. Accordingly, the financial services industry will have to implement certain safeguards, including conducting comprehensive risk assessments, crafting data collection and security policies tailored to risks inherent in mobile transactions, and conducting due diligence on third party service providers. In addition, financial services institutions will have to ensure that third-party service provider agreements include appropriate confidentiality and data security provisions, clarify data ownership, and contain strong prohibitions against using customer data for purposes other than those for which it was collected.
  • Health Privacy. The expansion of Electronic Health Records (EHR) and Personal Health Records (PHR) products and services is expected to lead to greater integration of these sectors of the health care industry. This development could add a significant layer of complexity to the compliance strategies of both HIPAA- and non-HIPAA-covered entities. Non-HIPAA-covered entities can also expect more aggressive enforcement of the FTC’s Health Data Breach Notification rule in light of several high profile health data breaches last year. In addition, state legislatures are acting to protect health privacy. For example, last summer Texas Governor Rick Perry signed into law a health privacy measure that expands the HIPAA definition of “covered entity”. Accordingly, the law could have a significant impact on a wide swath of health and wellness businesses that are currently not subject to HIPAA. On the business side, health care sector businesses and service providers will want to conduct adequate privacy due diligence on third party vendors and ensure that vendor agreements contain appropriate confidentiality and data security provisions, particularly for sensitive data, clarify data ownership, and prohibit secondary uses of customer data.

Back to Top


Federal Trade Commission and Department of Commerce to Issue Final Reports
By Karen Neuman

The FTC and DOC are expected to release final versions of previously issued draft privacy reports as early as the end of this month or the beginning of February. The FTC’s proposed framework for protecting consumer privacy stopped short of calling for comprehensive privacy legislation. Instead, it endorsed the adoption of do-not-track mechanisms and privacy by design. Although critical of industry self-regulation, the agency also seemed to give businesses “one last chance” to formulate effective self-regulatory programs. The final report is expected to preserve core elements of the proposed framework, including industry self-regulation, privacy by design, and an opt-out of targeted advertising based on tracking. The report should also provide guidance on how to minimize the potential for unwanted FTC attention as the agency continues to use its broad powers through enforcement actions to make privacy policy in the absence of expanded rulemaking authority. The final DOC report is expected to lay the foundation for the agency’s planned initiative to work with business to create industry “self-regulatory codes.” In any event, expect a continuing focus by both agencies (not to mention the plaintiffs’ bar) on consumer notice and choice.

Back to Top


European Privacy Law: Proposed Changes to EU Data Privacy Framework and Recent Decision on Industry Self-Regulation Foreshadow Compliance Challenges
By Karen Neuman and Ari Z. Moskowitz

Over the last few months, the European Union (EU) has signaled that it intends to implement a strengthened data privacy protection framework for member states. A draft version of the “General Data Protection Regulation” was leaked1 in December and could be adopted as early as the end of this month. If adopted, it will replace the current European Data Protection Directive (Directive), which has been on the books since 1995. Meanwhile, in December an industry self-regulatory proposal involving online behavioral advertising was resoundingly rejected. These developments are discussed below.

PROPOSED REVISIONS TO THE EU DATA PRIVACY FRAMEWORK.

The new framework would immediately replace the Article 29 Working Party (the influential advisory body created by the Directive) with a European Data Protection Board that has an expanded role. The new regulation would better harmonize the data protection laws of all member states because it would be immediately enforceable by those states once adopted by the European Commission (EC), unlike a directive, which must be adopted in some form by each member state before it can be enforced. This change would result in less fragmentation of EU privacy law, and therefore greater legal certainty for US businesses, although the proposed framework includes a number of changes that could significantly impact how US companies collect, retain and use EU citizens’ data. We summarize below some key changes:

SCOPE. If approved by the European Parliament and Council of Ministers, the proposed framework will apply to any company anywhere in the world whose activities are directed to citizens of the EU, whether or not that company has any offices located in the EU.

US-EU SAFE HARBOR. The proposed framework would put the current US-EU safe harbor in jeopardy, creating a much more burdensome process for businesses that want to legally transfer data from EU member countries to the US. Under Articles 37-43, data could be transferred between EU and non-EU countries in two situations: when the European Commission has issued an adequacy decision regarding the level of data protection in the non-EU country, or, when an adequacy decision has not been made, when companies use Binding Corporate Rules or standard or approved contractual data protection clauses. There is no provision for the safe harbor beyond the EU making an adequacy decision with regard to US privacy protections. Adequacy decisions may be general or sectoral (applying only to a particular type of data or industry), and will require an assessment of the adequacy of the country’s laws, including enforceable data protection and privacy rights and judicial redress, the presence of an independent data protection supervisory authority in the country, and international treaties and commitments by the country.

NEW RIGHTS AND OBLIGATIONS.

The proposed rules also create new rights for individual “data subjects” and impose new obligations on “data controllers”, including:

Right to be Forgotten. Article 15 of the proposed framework gives individuals the right to have their data erased by the data controller (the company or public entity that holds their information) in certain, broadly defined circumstances. Thus, data subjects can demand that their data be erased when it is no longer necessary for the purpose for which it was collected, or they can compel erasure by withdrawing consent.

Right to Data Portability. Under Article 16 of the proposed framework, individuals will be given the right to obtain the information collected about them in non-proprietary formats. They also have the right to transfer their data from one system to another, in a “commonly used” format, without “hindrance” from the company originally holding the data. This provision could force companies like Facebook and Google to allow their customers to transfer data between the two companies’ products, such as transferring friend lists from Facebook to Google+ and transferring emails from Gmail to Facebook messages.

Responsibility for Data Protection. The responsibility for protecting data will rest with both the controller (which collects the data) and the processors (the vendors that process the data on the controller’s behalf). These requirements will include documenting all processing operations, implementing security measures, and providing data breach notification to the EU supervisory authority and affected data subjects within 24 hours of the breach.

Data Protection Officers. Companies with over 250 employees and all public entities would be required to have a data protection officer.

U.S. PATRIOT ACT. The proposed framework would also complicate data transfer outside of the EU for non-EU governments that want to collect data on EU citizens. In particular, the regulation would impact the ability of the US government to collect information from U.S. companies about EU citizens under the PATRIOT Act. Under Article 42 (with the caveat that international treaties between non-EU countries and member countries take precedence), any company that receives a judicial, administrative, or executive decision or request from a non-EU country to turn over data it has collected about an EU citizen must first obtain the authorization of the relevant EU country’s data protection authority. The EU supervisory authority will make its decision based on whether the request is necessary and legally required and otherwise within the regulations. The company would also be required to inform the data subject of the request and the authority’s decision.

PENALTIES. The new framework would impose significant penalties for noncompliance. Under Article 79, companies could be penalized anywhere from 1% to 5% of their annual global revenues, depending on which rules are violated. For example, failure to provide required notice of a data breach would expose a company to the heaviest fine of 5% of annual global revenues.

ONLINE BEHAVIORAL ADVERTISING

The Article 29 Working Party also recently rejected self-regulatory proposals by the behavioral advertising industry.2 The opinion concluded that the self-regulatory code proposed by the European Advertising Standards Alliance and the Interactive Advertising Bureau Europe (EASA/IAB), which count Google, Microsoft, and Yahoo among their members, does not comply with the current e-Privacy Directive. The Working Party likewise rejected the industry’s website www.youronlinechoices.eu as giving the false impression “that it is possible to choose not to be tracked while surfing the Web.”

Two aspects of the self-regulatory program were of particular concern. The first involved an “icon approach,” by which the EASA/IAB would include an icon on all ads linking to www.youronlinechoices.eu, where users could opt out of ads from particular companies and ad networks. The Working Party concluded that this approach did not, on its own, sufficiently inform users about cookies. It cited a number of reasons, including that the icon is not widely recognized and that the icon and website did not clearly differentiate between advertising and behavioral advertising, did not allow for prior consent, and did not provide easily understandable information about advertising networks and their reasons for processing information.

Second, the EASA/IAB code generally provides for a method of opting out of behavioral advertising, but not opting in. Specifically, the EASA/IAB code provided for the use of cookies to identify which users have opted out of advertisements. The Working Party determined that this approach was not consistent with the e-Privacy Directive because personal information is processed before the opt-out option is presented. The Working Party recommended that the industry instead require the use of opt-in cookies, where only users who have downloaded the cookie to opt in to behavioral advertising are shown personalized ads.
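To make the distinction concrete, the following is a minimal TypeScript sketch of the opt-in model the Working Party recommends. The cookie name and the ad-rendering functions are hypothetical illustrations, not taken from the EASA/IAB code or any ad network’s actual API; the point is simply that under opt-in, no behavioral profiling occurs until an affirmative consent cookie exists, whereas the rejected model processed personal information by default.

    // Minimal sketch of an opt-in regime: no behavioral profiling occurs
    // until the user has affirmatively consented. The cookie name and the
    // ad-rendering functions below are hypothetical.
    function readCookie(name: string): string | null {
      const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
      return match ? decodeURIComponent(match[1]) : null;
    }

    function hasOptedIntoBehavioralAds(): boolean {
      // Opt-in cookie exists only after the user affirmatively consented.
      return readCookie("oba_opt_in") === "true";
    }

    function serveAd(): void {
      if (hasOptedIntoBehavioralAds()) {
        renderPersonalizedAd(); // tracking permitted: user opted in
      } else {
        renderContextualAd();   // default: no personal data processed
      }
    }

    // Placeholders for the rendering calls an ad network would supply.
    declare function renderPersonalizedAd(): void;
    declare function renderContextualAd(): void;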

The opinion also clarified a number of issues regarding cookies and consent. For example, any unique identifiers, even if anonymized, are considered personal data and therefore subject the collecting and processing entities to the data protection rules. The opinion also offered alternatives to the industry’s endorsement of pop-ups to obtain consent. These include a static banner at the top of the page asking consent to set cookies, a splash screen on entering the website, and default settings that prevent the collection or disclosure of personal data.

CONCLUSION.

The proposed privacy framework will mark the beginning of increased compliance challenges for U.S. businesses that handle EU citizen data. The Working Party’s rejection of a self-regulatory proposal for behavioral advertising can be seen as further evidence that European policymakers believe that only stringent rules will protect personal privacy. Many of the proposed privacy framework’s key components reflect the belief that current privacy law does not go far enough to protect privacy, particularly given the growth of technology, services and applications since the Directive was implemented. Accordingly, U.S. businesses should monitor developments as the proposed framework is considered by the European Parliament and be prepared to adapt their business and legal strategies accordingly.


1 http://www.statewatch.org/news/2011/dec/eu-com-draft-dp-reg-inter-service-consultation.pdf
2 http://www.statewatch.org/news/2011/dec/eu-art-29-wp-dp-behavioural-ads-no-188.pdf

Back to Top


House Passes Legislation Amending Video Privacy Protection Act
By Ari Z. Moskowitz

On December 6, 2011 the U.S. House of Representatives passed a bill to amend the Video Privacy Protection Act of 1988 (VPPA). The bill has been sent to the Senate, where it was referred to the Judiciary Committee. The amendments would allow a video rental provider to disclose the video rental histories of its customers to anyone on an ongoing basis, with the blanket consent of the customer. Netflix, in particular, had been publicly lobbying for this bill since it announced the integration of its video rental and streaming service with Facebook on September 22, 2011. Since that date, Netflix users in every country except the United States have been able to automatically share with their friends over Facebook what videos they have watched on Netflix. Netflix stated that the VPPA prevented it from opening up this feature in the U.S.

The VPPA was passed by Congress in response to a newspaper report revealing the movie rental history of Robert Bork, who was at the time a nominee to the Supreme Court. The law, which applies to “video tape service providers,” defines those providers as anyone engaged in the business of “rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials,” and protects personally identifiable information (PII). PII is defined as “information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.” Currently, video service providers are prohibited from disclosing consumer PII to anyone without the “informed, written consent of the consumer given at the time the disclosure is sought.”

The amended law replaces the informed, written consent requirement. The new version explicitly states that consent to disclose PII to anyone may be given electronically and through the Internet, but requires that such consent must be in a separate and distinct form from other policies and registration forms. The bill also alters the time at which consent may be given. Although consent at the time disclosure is sought is still permissible, a video tape service provider may also obtain blanket consent in advance for a set period of time or until consent is withdrawn.

If enacted, these amendments will mean that a video rental company, including a streaming video provider such as Netflix, will be able to share its customers’ video rental histories on an ongoing basis with the public, subject to those customers’ blanket consent. Specifically, Netflix and other streaming video providers such as Hulu, Amazon, and YouTube would be permitted to obtain their customers’ permission to share on Facebook, Twitter, and other social media what movies a customer has rented or watched.

Back to Top


Department of Education Issues Updated FERPA Rule
By Karen Neuman and Ari Z. Moskowitz

On January 3, 2012, new rules amending the Family Educational Rights and Privacy Act (FERPA) rules issued by the Department of Education (DOE) in December went into effect.1 They apply to local school districts, institutions of higher learning, and nonprofits and other entities receiving DOE funds. The new rules were the result of a DOE initiative to address the increasingly data-driven nature of public education at both the K-12 and higher education levels. The new rules provide for greater access to student data while creating more granular privacy controls for educational agencies and institutions over certain information, and expand DOE enforcement authority for non-compliance. Specifically, the rules make changes to three fundamental components of FERPA’s privacy protection framework: (1) directory information, (2) enforcement, and (3) the audit & evaluation exception and studies exception.

DIRECTORY INFORMATION

Under FERPA, “directory information” is personally identifiable information (PII) about students that schools may disclose without written consent from a parent or, in the case of a student who is 18 or older, from the student. Each school individually defines what directory information consists of, though it typically includes name, address, and phone number for use in a student directory or a yearbook. Parents and adult students must be given the opportunity to opt out of such disclosures.

Instead of the all-or-nothing approach under the old rules, where all directory information was publicly available, the new rules authorize schools to designate particular purposes or parties that may receive directory information. Therefore, a school that lists students’ names and addresses as directory information has the option of, for example, specifying that directory information may only be disclosed to employees of the school for educational purposes.

The new rules do not require controls so granular that schools must allow parents to opt out of specific categories of directory information but not others, though schools must at least provide the option of opting out of directory information disclosures entirely. For example, schools are required to allow parents and adult students to opt out of all directory information disclosures; however, they may, but are not required to, give the option of opting out of particular types of directory information disclosures, such as withholding consent for a child’s address but allowing the child’s name to appear in a student directory.

The new rules also clarify how directory information can be used for student ID cards or badges. Under the new rules, students do not have the right to opt out of the disclosure of directory information that is displayed or otherwise contained on a student ID. Schools may require students to wear, display, or disclose a student ID that exhibits PII, even if that information is otherwise designated as directory information and the student has opted out. However, schools may only require disclosure of information on a student ID if that information cannot be used to gain access to education records without additional authorization safeguards.

AUDIT & EVALUATION AND STUDIES EXCEPTIONS

Perhaps the most significant amendments to the FERPA rules are those concerning the “audit and evaluation” and the “studies” exceptions. In recent years, states have been attempting to implement statewide longitudinal data systems to study the efficacy of their schools. These exceptions are designed to allow schools to disclose student PII to third parties for the purposes of studying, auditing, and evaluating their education programs and their compliance with FERPA and various other state and federal laws. Specifically, the rules allow schools to provide “authorized representatives” with student information for the purpose of evaluating “education programs.” The amendments define these two previously undefined terms to expand access to education records while also providing additional safeguards in light of that disclosure.

The new definition of “education program” increases access by clarifying the variety of programs which may be evaluated under the rule. Education programs are defined as “any program principally engaged in the provision of education, including, but not limited to, early childhood education, elementary and secondary education, postsecondary education, special education, job training, career and technical education, and adult education” and “any program administered by an educational agency or institution.”

The new rules also define who may conduct these evaluations, expanding the pool of those who may receive education records from schools. Authorized representatives “include individuals or entities designated by FERPA-permitted entities to carry out an audit or evaluation of Federal- or State-supported education programs, or for the enforcement of or compliance with Federal legal requirements related to these programs.” For example, a state may require its high schools to contract with a research organization that will evaluate the effects of standardized testing on college readiness. The research organization contracted by a school is an “authorized representative” of that school and may receive student PII from the school.

The new rules do impose certain safeguards on the data that is shared with these authorized representatives. The rules state that entities covered by FERPA must have a written agreement with their authorized representatives and that “the entity from which the PII originated is responsible for using reasonable methods to ensure… its authorized representative complies with FERPA” (emphasis added). The written agreement must describe (1) how the audit, evaluation, or study falls under the relevant exception to the disclosure rules, (2) what PII from education records will be disclosed and how it will be used, and (3) how and when the authorized representative will be required to destroy the data it received. The written agreement must also establish policies and procedures for protecting PII from further disclosure and unauthorized use.

The new rules also clarify when an audit, evaluation, or study that may invoke this exception is authorized. DOE states that “FERPA itself does not confer the authority to conduct an audit, evaluation, or enforcement or compliance activity.” So, while the rule permits disclosure of PII for such activities, disclosure is permitted only if state, local, or some other federal law authorizes the audit, evaluation, or study.

ENFORCEMENT

The new rules authorize the Secretary of Education to enforce FERPA against “any entity that receives funds under any program administered by the Secretary, including funds provided by grant, cooperative agreement, contract, subgrant, or subcontract.” In comments accompanying publication of the new rule, DOE noted that it was previously unclear whether the agency’s Family Policy Compliance Office (FPCO) could enforce FERPA against entities that do not have students in attendance, such as State Education Agencies (SEAs) and student loan lenders. The new rules clarify that all such entities are responsible for complying with FERPA and protecting the PII in any education records they hold.2

Entities receiving department funding are responsible for FERPA violations by their authorized representatives. As noted above, authorized representatives are third parties that might not receive funding from DOE but contract with a school or other entity subject to FERPA to provide audit, evaluation, enforcement, or compliance services, and may be given student PII to do so. Authorized representatives that receive no DOE funding are not themselves liable under FERPA. If an authorized representative does receive DOE funding (for example, certain research institutions), FPCO may directly enforce FERPA against that authorized representative. Authorized representatives could also face potential liability for data breaches, breach of contract, or other privacy violations under tort, contract, or other state and federal privacy laws.

Accordingly, if the FPCO finds an improper re-disclosure of PII by an authorized representative in the context of the audit & evaluation or studies exceptions, the entity from which the PII originated is prohibited from giving the party responsible for the re-disclosure (the authorized representative) access to PII from education records for at least five years. This applies if either of the parties is found to have improperly disclosed PII in the context of an audit, evaluation, or study.

OTHER MATTERS

Cloud Computing. DOE is still evaluating how cloud computing fits within FERPA’s framework for protecting student privacy, in light of the fact that data retention is generally moving from local servers to “cloud”-based storage. The agency recommended that states adopt security plans that protect student data wherever it is stored.

Social Security Numbers. Commenters on the proposed rule raised concerns about the use of student Social Security Numbers (SSNs) as unique identifiers to link student records. DOE has construed FERPA as prohibiting schools from designating student SSNs as directory information. The agency acknowledges, however, that a unique identifier is needed for linking records, particularly when states wish to conduct longitudinal studies. DOE’s current position is that “while FERPA does not expressly prohibit States from using student SSNs, best practices dictate that States should limit their use of SSNs to instances in which there is no other feasible alternative.”3

CONCLUSION

The new rules reflect an attempt to balance student privacy against the need to use student data to support education reform. The old rules were ambiguous about the extent to which schools could share student data for this purpose. The new rules attempt to eliminate much of this uncertainty while fine-tuning enforcement mechanisms to address unauthorized disclosure and misuse of student data. Local school districts and their third-party consultants, as well as nonprofits that receive federal DOE funding, should undertake a comprehensive review of their FERPA compliance policies and practices and update them as warranted in light of the new rules.


1 Family Educational Rights and Privacy, 76 Fed. Reg. 75,604-58 (Dec. 2, 2011) (to be codified at 34 C.F.R. pt. 99)
2 The Department declined, however, to define all of these entities as ‘‘educational agencies and institutions” because of the potential for confusion with other parts of the regulations. Family Educational Rights and Privacy, 76 Fed. Reg. 75,631 (Dec. 2, 2011) (to be codified at 34 C.F.R. pt. 99)
3 Family Educational Rights and Privacy, 76 Fed. Reg. 75,611 (Dec. 2, 2011) (to be codified at 34 C.F.R. pt. 99)

Back to Top


“They Can’t Fire Me, I Know Too Much”: Adoption of Facial Recognition Technology Triggers Investigations in the U.S. and Abroad
By Karen Neuman and Ari Z. Moskowitz

At the end of December 2011, the Federal Trade Commission (FTC) announced that it is seeking public comment on facial recognition technology and its implications for privacy and security. Comments are due January 31, 2012.

Although facial recognition technology has many current and future benefits, including authenticating identity and assisting law enforcement investigations, the potential for unauthorized access and misuse has prompted policymakers to closely examine the technology.

The FTC proceeding comes at a time when facial recognition privacy concerns are being raised globally. For example, last year the European Union (EU) Article 29 Working Party launched an investigation into Facebook’s “Tag Suggestions” feature to ascertain whether it violates EU data protection laws. Tag Suggestions recognizes friends’ faces based on photographs posted on Facebook and prompts users to confirm the identities of friends in new photos uploaded to Facebook profiles. The head of the Hamburg (Germany) Data Protection Authority asked Facebook to disable the tagging feature on the grounds that it may violate EU privacy laws, and the UK Information Commissioner’s Office (ICO) released a statement expressing concern about the privacy issues raised by facial recognition technology.

The FTC proceeding was initiated after the agency’s earlier workshop on the topic. Panelists included privacy advocates, legal scholars, and representatives from companies including SceneTap, Facebook and Google. The panelists discussed possible uses of the technology and how best to protect consumers, whether through privacy by design, regulation, or robust security. They also raised questions about what expectations people have about the collection of facial recognition information and how much control individuals should have over such data.

In light of the issues raised at the workshop, the FTC seeks comment on the following questions:

  • What are the current and future commercial uses of these technologies?
  • How can consumers benefit from the use of these technologies?
  • What are the privacy and security concerns surrounding the adoption of these technologies, and how do they vary depending on how the technologies are implemented?
  • Are there special considerations that should be given for the use of these technologies on or by populations that may be particularly vulnerable, such as children?
  • What are best practices for providing consumers with notice and choice regarding the use of these technologies?
  • Are there situations where notice and choice are not necessary? By contrast, are there contexts or places where these technologies should not be deployed, even with notice and choice?
  • Is notice and choice the best framework for dealing with the privacy concerns surrounding these technologies, or would other solutions be a better fit? If so, what are they?
  • What are best practices for developing and deploying these technologies in a way that protects consumer privacy?

Facial recognition technology has a myriad of current and future uses beyond photo tagging. SceneTap uses the technology to evaluate demographics (gender ratio, average age, and crowd size) at clubs and bars, combined with social reviews, to help its users to decide where to go at night. iPhones and Android phones now have apps that allow owners to unlock their phones by snapping a photo of their face with the phone’s camera. The technology is already being used by public schools for security, and by retailers to gauge customer reactions to brands and product placement for targeted ads. Future uses could include authentication for ATM use or vehicle entry.

Over the next few years, lawmakers can be expected to craft measures that will regulate the use of facial recognition technology, particularly with regard to children. Although it is too early to predict what approach will be taken in the United States, it is possible that the framework will be incorporated into existing privacy laws or self-regulatory schemes, particularly those dealing with biometrics, or through the passage of specific facial recognition legislation. Thus far, state legislatures have taken the lead on regulating the use of biometric technologies.

Organizations that employ facial recognition technology or are considering doing so should monitor proceedings in the U.S. and abroad and be prepared for robust compliance obligations.

Back to Top


 
UPDATES

Federal Court Rules that California’s Song-Beverly Act Does Not Apply to Online Transactions

On January 9, 2012 the U.S. District Court for the Central District of California dismissed two actions brought against online retailers under California’s Song-Beverly Act, which limits the collection of personally identifiable information (PII) by retailers. In Salmonson v. Microsoft Corp., No. 2:11-cv-05449-JHN-JC (C.D. Cal. Jan. 6, 2012), the Court ruled that Song-Beverly does not apply to requests for zip code information in online credit card transactions. In Mehrens v. Redbox Automated Retail, No. 2:11-cv-02936-JHN-Ex (C.D. Cal. Jan. 6, 2012), the Court ruled that Song-Beverly does not apply to requests for zip code information involving self-service DVD rentals. The Court reasoned in part that the risk of fraud in online and self-service transactions precludes construing the statute to encompass these transactions.

These decisions clarify an area of great uncertainty for online retailers following last year’s decision in Pineda v. Williams-Sonoma. In that case the California Supreme Court ruled that zip code information is PII under Song-Beverly, and that retailers that request and record zip code information during a commercial transaction violate the Act. As a result, hundreds of lawsuits were filed against brick-and-mortar and online retailers. (Governor Brown subsequently signed legislation that created an exemption for retail gas stations seeking customer zip code information during self-service gas transactions.) Pineda created potentially significant Song-Beverly exposure for online retailers. Absent appellate review, that risk appears to have been eliminated by the decisions in Salmonson and Mehrens, at least in California.

Online retailers can nonetheless expect that courts in other jurisdictions will be asked to apply similar laws to online sites and services. For example, just this month a federal court in Massachusetts concluded that zip code information is personal information under a Massachusetts law similar to California’s. The Court ultimately found that the Plaintiff failed to show cognizable harm. The case is significant, however, because it demonstrates that the plaintiffs’ bar remains intent on testing the reach of existing laws while targeting online businesses. Accordingly, the outcomes of any similar actions should be closely examined for their potential impact on the ability of online retailers to collect zip code and other customer data.

Back to Top


United Kingdom ICO Commissioner Gives Industry One Last Chance Before Initiating Enforcement of New Privacy Rules

On December 13, 2011 Commissioner Christopher Graham of the United Kingdom’s Information Commissioner’s Office (ICO) warned on the ICO blog that industry was not doing enough to prepare for enforcement of the Privacy and Electronic Communications Regulations, which implement a European directive and took effect in the UK in May 2011. The ICO gave companies a 12-month grace period, ending May 26, 2012, to comply with the rules before it begins enforcement actions. The rules require that companies in the UK obtain consent before depositing cookies on users’ computers.

Prompted by the halfway point between enactment of the rules and the start of enforcement, Commissioner Graham pressed industry to “try harder” to comply. He cited complaints from companies, including that “consent is impossible online,” “people never read cookie information anyway,” and “consent needs pop-ups and everyone hates pop-ups.” Commissioner Graham rejected these complaints and released updated guidance to help companies comply. The updated guidance specifies that enforcement will focus on those cookies with a clear privacy impact. Companies that use cookies for analytics or advertising will need to comply with the rules, while those that use cookies to keep data safe and for online shopping carts are likely exempt. The guidance also provides more details on the definition of consent, stating that “consent must involve some form of communication where an individual knowingly indicates their acceptance.”
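In practical terms, the updated guidance points toward a cookie audit: inventory each cookie, classify it by purpose, and gate the privacy-impacting categories behind consent. The following minimal TypeScript sketch illustrates that approach; the cookie names and category assignments are hypothetical examples, not taken from the ICO guidance.

    // Hypothetical cookie inventory, classified by purpose. Per the
    // guidance described above, cookies with a clear privacy impact
    // (analytics, advertising) need prior consent, while shopping-cart
    // and security cookies are likely exempt as strictly necessary.
    type CookiePurpose = "strictly-necessary" | "analytics" | "advertising";

    interface CookieSpec {
      name: string;
      purpose: CookiePurpose;
    }

    const inventory: CookieSpec[] = [
      { name: "session_id", purpose: "strictly-necessary" }, // login session
      { name: "cart_items", purpose: "strictly-necessary" }, // shopping cart
      { name: "site_metrics", purpose: "analytics" },        // usage analytics
      { name: "ad_profile", purpose: "advertising" },        // targeted ads
    ];

    function maySetCookie(spec: CookieSpec, userConsented: boolean): boolean {
      return spec.purpose === "strictly-necessary" || userConsented;
    }

    function setCookiesForUser(userConsented: boolean): void {
      for (const spec of inventory) {
        if (maySetCookie(spec, userConsented)) {
          document.cookie = spec.name + "=1; path=/"; // illustrative value
        }
      }
    }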

Companies should take advantage of the grace period to familiarize themselves with their tracking technologies, including the nature, purpose and extent of their cookie use and the opportunities provided to users to give or withhold consent.

Back to Top


Flash Cookie Lawsuit Against Amazon Dismissed

On December 1, 2011 a federal district court dismissed Del Vecchio v. Amazon1 with leave to amend for failure to “establish any plausible harm.”2 This action, like lawsuits previously discussed, concerned Amazon’s use of flash cookies. The Plaintiffs alleged that Amazon violated the Computer Fraud and Abuse Act (CFAA) and Washington’s Consumer Protection Act (WCPA) when Amazon (1) took advantage of a weakness in Microsoft’s Internet Explorer that downloaded cookies contrary to the browser’s cookie filtering settings, (2) used Adobe Flash Local Stored Objects (Flash Cookies), which are not deleted with standard cookies, to track users, and (3) shared the information gathered by these cookies with third parties in violation of Amazon’s privacy policy.

The Court rejected all of the claims on grounds that the Plaintiffs failed to allege sufficient damages. The CFAA requires aggregated damages of at least $5,000 to be cognizable. The Court found that the Plaintiffs did not plead any facts that would allow the Court to reasonably find that this threshold amount was met. For example, the Plaintiffs alleged two categories of loss: the devaluing of their personal information once it was collected by Amazon, and the diminished performance of their computers as a result of the cookies. The Court found, however, that the Plaintiffs pleaded no facts showing that the claimed losses were anything but speculative. The WCPA claim, which requires a “specific showing of injury,” failed for the same reason.

Even though the case was not decided on the merits, the Court nevertheless addressed some of Plaintiffs’ substantive claims, teeing up an amended complaint, if filed. Regarding the CFAA claim that Amazon’s use of cookies and flash cookies was unauthorized or exceeded authorized use, the Court observed that Amazon’s privacy policy put users on notice that Amazon places cookies on users’ computers and that users who do not accept the cookies will be effectively prevented from using the site. The Court seemed to suggest that the “use of [Amazon’s] site to make purchases would appear to serve both as an acknowledgment that cookies were being [deposited] and an implied acceptance of that fact.”3


1 Del Vecchio v. Amazon, Inc., No. 11-cv-00366 (W.D. Wash. filed March 2, 2011), available at http://docs.justia.com/cases/federal/district-courts/washington/wawdce/2:2011cv00366/174037/58/.
2 Id.
3 Del Vecchio v. Amazon, Inc., No. 11-cv-00366, at *9 (W.D. Wash. filed March 2, 2011).

Back to Top


Facebook and Zynga Prevail in Privacy Litigation

On November 22, 2011 a federal district court dismissed with prejudice the related actions In re Facebook Privacy Litigation1 and In re Zynga Privacy Litigation2. As previously discussed in the broader context of legal and policy developments involving mobile apps (SLRNO Privacy & Information Law Update February 2011), the Complaints alleged that the companies shared information with app developers and third parties through referrer headers, in breach of contract and through fraud, as well as in violation of the Stored Communications Act (SCA) and California’s Comprehensive Computer Data Access and Fraud Act (CCDAFA). The SCA issues raised in the two actions were substantially similar, and both cases were heard by the same judge; the order dismissing In re Zynga cites the Facebook dismissal and notes that the Zynga plaintiffs’ SCA claims failed for the same reasons as the Facebook plaintiffs’: namely, that the plaintiffs failed to state a claim under the SCA because a defendant cannot be liable for divulging communications to the intended recipient of those communications.

In reaching this conclusion the Court noted that the plaintiffs asserted two inconsistent theories: (1) that Facebook is a remote computing service (RCS) whose purpose is storing and processing the information provided by the plaintiffs, and (2) that the intended recipient of the plaintiffs’ information was advertisers. If the intended recipient was the advertisers, the Court reasoned, Facebook would not be an RCS; instead, Facebook would be an Electronic Communications Service, which cannot be held liable under the SCA for transmitting information to its intended recipient. If the information was sent to Facebook for processing and storage, then Facebook would be the intended recipient and the information cannot have been intended for advertisers as alleged. In finding that the Facebook Plaintiffs failed to state a CCDAFA claim, the Court noted that the CCDAFA prohibits the introduction of a “computer contaminant” into any computer to “usurp the normal operation” of a computer. The Plaintiffs, however, alleged that the “Referrer Headers” at issue were a “standard web browser function.” The breach of contract and fraud claims also failed because the Plaintiffs failed to show any actual and appreciable damages. The Court found likewise in Zynga with regard to the breach of contract claim, admonishing that there is no controlling authority, and little clearly persuasive authority, that the sharing of personal information constitutes a loss in value.3
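As background on the mechanism at issue: when a browser requests a third-party resource such as an ad, it typically sends the address of the current page in the Referer header, so if that address embeds a user identifier, the identifier reaches the third party. The following minimal TypeScript (Node.js) sketch illustrates how an ad server could read an identifier out of that header; the URL format is a hypothetical illustration, not Facebook’s or Zynga’s actual page structure.

    // Minimal sketch of how a third-party ad server could recover a user
    // ID from the Referer header. The URL format is hypothetical.
    import * as http from "http";

    http.createServer((req, res) => {
      // Browsers send the address of the linking page as the Referer
      // header, e.g. "https://social.example/profile.php?id=12345".
      const referer = req.headers.referer;
      if (referer) {
        try {
          const id = new URL(referer).searchParams.get("id");
          if (id) {
            // The ad server can now tie this request to a specific user,
            // even though the user never sent it anything directly.
            console.log(`ad request traceable to user ${id}`);
          }
        } catch {
          // Ignore malformed Referer values.
        }
      }
      res.writeHead(200, { "Content-Type": "image/gif" });
      res.end(); // the ad creative would be returned here
    }).listen(8080);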


1 In re Facebook Privacy Litigation, No. 10-cv-02389 (N.D. Cal filed May 28, 2010) available at http://docs.justia.com/cases/federal/district-courts/california/candce/5:2010cv02389/228117/106/.
2 In re Zynga Privacy Litigation, No. 10-cv-4680 (N.D. Cal filed October 18, 2010) available at http://www.scribd.com/doc/73806587/In-Re-Zynga-Privacy-Litigation-C-10-04680-JW-N-D-Cal-Nov-22-2011.
3 In re Facebook Privacy Litigation, No. 10-cv-02389, at *9 (N.D. Cal filed May 28, 2010).

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036