St. Ledger-Roty & Olson LLP

PRIVACY & INFORMATION LAW UPDATE
June 2012
A bimonthly update of trends and developments in privacy law & policy

Karen Neuman, Editor

  • You are receiving this publication because of your interest in privacy and data security. It is for informational (including advertising) purposes only and is not a substitute for legal advice.
  • Not interested? Unsubscribe or forward to someone who might be.
  • Did someone send you this Update? Subscribe to receive your own and view past issues.

In this Issue:
FEATURE ARTICLE:
The Education Market - Plenty of Opportunity & Plenty of Privacy Risk
FTC Holds Workshop on Mobile Disclosures
Legislative and Judicial Developments Limit Employer Access to Employee and Job Applicant Social Media & Online Accounts
Online Behavioral Advertising Accountability Program Faults Ad Companies' Data & Tracking Practices
FCC Issues Public Notice Seeking Comment on Mobile Service Provider Privacy Practices
What does it mean to have an Open Web? Its Inventor Speaks Out.
A Rolling Stone Gathers No Moss & Neither Should Your Privacy Policy: The FTC-Myspace Settlement
ICANN Data Breach Exposes Customer Data as a Result of System Breakdown
Federal Automobile “Black Box” Legislation Expands Vehicle Manufacturers’ Focus Beyond Safety to Privacy

UPDATES:
Netflix Settles Privacy Class Action Lawsuit over Data Retention Practices
UK Cookie Compliance Grace Period Expires as ICO Permits Implied Consent
The PCI Data Security Standards Council (PCI SSC) Issues Mobile Payment Acceptance Fact Sheet
Federal Court Grants Class Certification in Song-Beverly Zip Code Action

FEATURE ARTICLE:
The Education Market - Plenty of Opportunity & Plenty of Privacy Risk
By Karen Neuman

An emerging market for education software and digital content has attracted entrepreneurs and investors looking for new sectors in which to innovate and profit. Examples include online learning management systems, question and answer services for students and teachers, assistive applications for special needs students, web-based plagiarism detection tools, behavior management software, and digital “badges” that can be used to represent academic accomplishment. These products and services are being integrated into traditional “brick and mortar” learning as well as online alternatives. Solutions that are enjoyed in the consumer context, including single sign-on and personalization, are being implemented in education to promote adoption of these technologies and facilitate their use.
Read more...

 

FTC Holds Workshop on Mobile Disclosures
By Karen Neuman

On May 30, 2012 the Federal Trade Commission (FTC) held a public workshop to discuss advertising and privacy disclosures over mobile platforms and devices. The workshop is part of the FTC’s initiative to update its Dot.com Disclosures Guidelines to reflect changes in technology, advertising and behavior since the Guidelines were ...
Read more...

 

Legislative and Judicial Developments Limit Employer Access to Employee and Job Applicant Social Media & Online Accounts
By Karen Neuman
     Ari Moskowitz

The wealth of information people post on social media sites and other web services is a treasure trove for employers who want to conduct background checks on job applicants or monitor their employees. Details of individuals' personal lives published on these platforms and services can be retained far longer than those individuals may have anticipated, contravening their expectations of privacy in that information. While employers have access to more data than ever before about job applicants and employees, using this information carries potentially significant legal risk.
Read more...

 

Online Behavioral Advertising Accountability Program Faults Ad Companies' Data & Tracking Practices
By Karen Neuman

On May 30, 2012, the Council of Better Business Bureaus' Online Interest-Based Advertising Accountability Program (OBA Accountability Program) announced that inquiries into the data collection and use practices of seven online advertising companies resulted in decisions that the companies violated the Program's Self-Regulatory Principles for Online Behavioral Advertising (OBA Principles). The decisions clarify that:
Read more...

 

FCC Issues Public Notice Seeking Comment on Mobile Service Provider Privacy Practices
By Ari Moskowitz

On May 25, 2012 the Federal Communications Commission (FCC) released a Public Notice (PN) soliciting comments on “the privacy and data-security practices of mobile wireless service providers with respect to customer information stored on their users’ mobile communications devices, and the application of existing privacy and security requirements to that information.”1 This inquiry comes on the heels of the “Carrier IQ” flap, in which a number of wireless communications service providers used software embedded in mobile devices to capture certain user data.
Read more...

 

What does it mean to have an Open Web? Its Inventor Speaks Out.
By Ari Moskowitz

Over the last several weeks, technologists and policy makers have been speaking out about the future direction of the Internet and control over information. The discussion has included calls for both private industry and regulators to consider the effects of their products and regulations, respectively, on the open, universal nature of the Internet and the web. Past discussions focused on open, nondiscriminatory access to the web by gatekeepers; more recently, though, leaders have begun to consider the effect of privacy, and of calls for privacy regulation, on the future of the Open Web.
Read more...

 

A Rolling Stone Gathers No Moss & Neither Should Your Privacy Policy: The FTC-Myspace Settlement
By Karen Neuman

On May 8, 2012, Myspace and the FTC entered into a proposed consent decree to settle charges that Myspace misrepresented in its privacy policy how it collects and uses customer data. The settlement is yet another in a growing list of actions taken by the FTC to hold operators of websites, online services and digital service providers accountable for privacy policy promises. The proposed settlement order bars ...
Read more...

 

ICANN Data Breach Exposes Customer Data as a Result of System Breakdown
By Ari Moskowitz

In mid-April, the Internet Corporation for Assigned Names and Numbers (ICANN), the nonprofit organization responsible for assigning and maintaining Internet addresses, announced a data breach that exposed the user and file names of certain users1 applying for new Top Level Domains. Top level domains are the ends of web addresses that have thus far been restricted to “.com”, “.org” and 19 other terms (excluding country-specific domains). The glitch occurred in the Top Level Domain Application System, which ICANN was using to process applications for new top level “dot.brandanything” domains.
Read more...

 

Federal Automobile “Black Box” Legislation Expands Vehicle Manufacturers’ Focus Beyond Safety to Privacy
By Ari Moskowitz

Congress is currently considering legislation that would mandate the inclusion of Event Data Recorders (EDRs) in cars, adding potential privacy, legal, and business risks to the safety issues that the industry has been focused on. EDRs, like “black boxes” in airplanes, record and store data about a vehicle, including its speed, safety belt use, airbag deployment, and a variety of other safety metrics.1 The data are used to determine the causes of a crash and uncover and fix flaws in vehicles to prevent future crashes.
Read more...

 

UPDATES

Netflix Settles Privacy Class Action Lawsuit over Data Retention Practices

On June 1, 2012, Netflix filed a Settlement Agreement in a class action alleging that the company unlawfully retained personally identifiable information about its former customers in violation of the Video Privacy Protection Act (VPPA)1. The VPPA prohibits video rental services from sharing customer rental history and viewing data without prior written consent, and limits how long that information can be retained.
Read more...

 

UK Cookie Compliance Grace Period Expires as ICO Permits Implied Consent

On May 25, 2012, the United Kingdom Information Commissioner’s Office (ICO) announced that the one-year enforcement grace period for compliance with its amended Privacy and Electronic Communications Regulations (implementing amendments to the EU’s 2002 Privacy and Electronic Communications Directive) would expire on May 27, 2012. As we previously reported, the grace period was granted to give UK website operators (as well as U.S.-based operators serving UK consumers) time to comply with the rules, which require informed consent before operators can deposit cookies on a user’s computer or mobile device. Hours before the expiration of the grace period, the ICO announced that ....
Read more...

 

The PCI Data Security Standards Council (PCI SSC) Issues Mobile Payment Acceptance Fact Sheet

On May 16, 2012 the PCI SSC Mobile Working Group issued guidance for secure acceptance of payments by merchants using mobile devices such as smart phones and tablets at the point of sale. The guidance is intended to be a first step to facilitate fully secure mobile payments. The At a Glance Mobile Payment Acceptance Security guidance (Guidance) is intended to help merchants securely accept payment consistent with ...
Read more...

 

Federal Court Grants Class Certification in Song-Beverly Zip Code Action

On May 4, 2012, the U.S. District Court for the Southern District of California granted class certification in Yeoman v. IKEA U.S.A. West, Inc. The Plaintiffs alleged that IKEA requested and electronically stored customer ZIP code information during retail credit card transactions in violation of California’s Song-Beverly Credit Card Act of 1971 (Act)1. The Act prohibits retailers from collecting personal information during a ...
Read more...


FEATURE ARTICLE:
The Education Market - Plenty of Opportunity & Plenty of Privacy Risk

By Karen Neuman

Introduction.

An emerging market for education software and digital content has attracted entrepreneurs and investors looking for new sectors in which to innovate and profit. Examples include online learning management systems, question and answer services for students and teachers, assistive applications for special needs students, web-based plagiarism detection tools, behavior management software, and digital “badges” that can be used to represent academic accomplishment. These products and services are being integrated into traditional “brick and mortar” learning as well as online alternatives. Solutions that are enjoyed in the consumer context, including single sign-on and personalization, are being implemented in education to promote adoption of these technologies and facilitate their use.

Entry into this burgeoning market can pose legal and reputational risk for companies unfamiliar with privacy issues and corresponding legal obligations that are unique to education. These risks can be minimized by adopting an approach early in development that reflects an understanding of how a product or service captures, uses, stores and discloses student data; interacts with other organizations in the ecosystem; and triggers legal obligations.

The Education Technology Ecosystem & Privacy Law.

Students interact with education technology in an ecosystem where their data is disclosed, accessed, stored in and shared over computing (including mobile) devices, cloud, email and internet services, social media platforms and, increasingly, mobile apps. Growing interoperability that is intended to facilitate the sharing of user data in other contexts may run counter to individual privacy expectations and specific laws in an education setting.

For example, a suite of hosted e-mail and collaboration tools for schools can trigger questions about whether the service provider is collecting, using and sharing personal information from children in violation of the Children’s Online Privacy Protection Act1 (COPPA). COPPA prohibits commercial (and in limited instances nonprofit) operators of websites, online services or mobile apps from collecting personal information from children who are under 13 without first providing notice to parents and obtaining their verifiable consent. Under the implementing COPPA Rule2, adopted and enforced by the Federal Trade Commission (FTC), personal information includes email or physical address, full name, and in some instances date of birth and gender. The sharing of this information with third parties and the purpose for doing so (for example, to serve ads) must be disclosed in a required privacy policy.

A hosted virtual environment in which a game is used for student assessment could, in addition to raising COPPA questions, trigger Family Educational Rights and Privacy Act3 (FERPA) obligations for the school, and possibly the game’s host or operator. FERPA is the federal law that protects the privacy of student “education records” and governs how information in those records can be disclosed. An education record is any record that contains personally identifiable information that is directly related to a particular student and maintained by the school or a third party acting on the school’s behalf. It can include computer data and be stored in a database or on a server.

FERPA prohibits schools that receive Department of Education (DOE) funds (and in limited circumstances specified third parties) from disclosing personal information in a student’s “education record” without the student’s written consent or, if the student is under 18, parental consent. The implementing rules4 are enforced by DOE. Although there is no private right of action for noncompliance, an enforcement action can be triggered when a complaint is filed by a student or parent. The DOE may impose sanctions for noncompliance, including the suspension or revocation of federal funds – a remedy that could make schools wary of adopting products or services that are seen as creating FERPA risk.

Classroom use of assistive technology can also raise FERPA compliance obligations, as well as similar privacy obligations under the Individuals with Disabilities Education Act5 (IDEA). Examples of assistive devices, and the entities with potential access to data captured in them, include word prediction or voice-to-text tools whose data can be accessed by the device manufacturer, the hosted database provider, or a third party studying language-based learning disabilities. Unlike FERPA, the IDEA provides a private right of action6 to enforce the confidentiality of personally identifiable information, and this right could be a deterrent to adoption of certain products by schools.

FTC Privacy Framework.

Companies that make digital learning products or services that collect, use, share and store user information must ensure that their privacy and data security practices adhere to the FTC framework for protecting consumer privacy. The framework recommends, but does not require, that companies implement “Do-Not-Track” (DNT) tools. It does, however, ask Congress to enact DNT legislation that would mandate implementation of DNT tools.

Other key recommendations are widely seen as providing a roadmap for avoiding an FTC enforcement action for misleading or deceptive privacy practices. These recommendations urge companies to implement Fair Information Principles (FIPs) for collecting and protecting user data. These principles include providing users with meaningful notice about data collection, retention, and use; offering easily understood and implemented opt-out choice; implementing data minimization (collecting only the data necessary for a particular purpose); and maintaining adequate data integrity and security. The framework builds on recent FTC enforcement actions against a variety of online and digital companies for misrepresenting data collection and use practices, such as using consumer data for secondary purposes that are materially different from the purpose for which it was originally collected, or failing to adhere to user opt-out choices about those practices.

Conclusion.

The market for education products and services presents new opportunities for entrepreneurs and their investors. The increasingly interoperable ecosystem in which education software and digital content is being adopted requires an understanding by providers, students and schools about how student data is collected, used and shared -- often by many parties. This data flow can raise potentially complex questions about ownership, control and liability. Once these questions are fully understood, relationships can be structured to allocate risk and liability.

By adequately addressing these questions at the outset, companies will be better situated to avoid turning a promising new venture into a disappointing one, or converting an investment opportunity that initially seemed appealing into a costly one.


1 15 U.S.C. § 6501.
2 16 C.F.R. Part 312.
3 20 U.S.C. § 1232g.
4 34 CFR Part 99.
5 20 U.S.C. § 1400.
6 20 U.S.C. § 1439.


Back to Top


FTC Holds Workshop on Mobile Disclosures
By Karen Neuman

On May 30, 2012 the Federal Trade Commission (FTC) held a public workshop to discuss advertising and privacy disclosures over mobile platforms and devices. The workshop is part of the FTC’s initiative to update its Dot.com Disclosures Guidelines to reflect changes in technology, advertising and behavior since the Guidelines were issued in 2000. The updated Guidelines are expected to be released this fall. In addition to considering the workshop discussions, the FTC will take into account public comments submitted by July 11, 2012.

The panels were divided into advertising and privacy, although there was some overlap, as panelists agreed that small screen size and consumer behaviors, including multitasking, pose significant challenges to designing and conveying required disclosures to consumers. The privacy panel focused on how industry can provide consumers with legally compliant notice and choice on small mobile device screens. The efficacy of privacy “icons,” including the importance of design and visibility, was debated, as were layered or otherwise streamlined privacy policies and opt-out mechanisms. Many of these approaches are still in development by industry associations.

Examples of contextualized disclosure include presenting disclosures before the completion of a transaction or before a site or app transmits a user’s personal information. There was general consensus that more invasive privacy practices require more “robust” disclosure. There was also consensus, however, that even when information about a site or app’s privacy practices is presented to consumers, they don’t understand what information is being collected from them and what is being done with it, because privacy disclosures tend to be overly legalistic or conveyed in language that the typical consumer is unfamiliar with.

Although the dot.com proceeding is focusing on how to convey consumer protection disclosures over mobile devices, communicating this information in language that can be easily understood and acted on may not be such a new challenge. Advertisers have long been criticized for failing to adequately convey important consumer protection information through traditional media (for example television “infomercials”).

Businesses are now engaging consumers through mobile devices, often as part of an integrated campaign that includes traditional media. The dot.com proceeding is unfolding as policymakers at all levels of government struggle to balance consumer demand for data-driven products and services with ensuring that consumers have ready access to actionable information about those products and services.

Back to Top

Legislative and Judicial Developments Limit Employer Access to Employee and Job Applicant Social Media & Online Accounts
By Karen Neuman
     Ari Moskowitz


The wealth of information people post on social media sites and other web services is a treasure trove for employers who want to conduct background checks on job applicants or monitor their employees. Details of individuals' personal lives published on these platforms and services can be retained far longer than those individuals may have anticipated, contravening their expectations of privacy in that information. While employers have access to more data than ever before about job applicants and employees, using this information carries potentially significant legal risk.

In January 2012 the American Civil Liberties Union (ACLU) sent a letter to the Maryland Department of Corrections (DOC) on behalf of corrections officer Robert Collins asking it to rescind its policy requiring that employees and job applicants provide passwords to their social media accounts. The Maryland legislature and Governor took note, and in May 2012, legislation1 was enacted prohibiting employers from requesting or requiring that job applicants or employees provide user names or passwords to any “personal account or service [accessed] through an electronic communications device”. Although the law was enacted in response to the MD DOC’s request for a job applicant’s social media account and its policy requiring employees to turn over the same information, it applies broadly to requests for passwords to any personal web-based accounts or mobile apps.

The law also prohibits employers from disciplining employees who refuse to turn over personal web-based account information. Employers are permitted, however, to seek such account information to conduct an investigation if they have reason to believe that an employee is using a personal account for business purposes, or is downloading proprietary information without authorization to a personal account.

Similar laws have been introduced or proposed in the U.S. Congress2 and a variety of states, including Ohio,3 Minnesota,4 Illinois,5 New Jersey,6 and California.7 The California bill recently passed through committee and is now before the full legislature. It differs from the Maryland law in that it specifically applies to “social media accounts”, broadly defining “social media.” The Minnesota and Illinois proposals are even narrower, applying only to “social networking sites,” defined identically in both bills as sites that allow the creation of a public profile and a list of other users with whom the user is connected, and allow users to navigate those lists. Both bills exempt e-mail accounts from the definition. More states are expected to introduce similar laws in the coming months.

The issue has not escaped the attention of the U.S. Congress. The Social Networking Online Protection Act, introduced in the U.S. House of Representatives in April, is limited to personal email accounts and “social networking websites,” defined as services primarily intended for the management of “user-generated personal content” by users with distinct user names or passwords. The bill would prohibit employers from requiring or requesting user names or passwords to employees’ or applicants’ personal email or social networking accounts and from retaliating against those who refuse to provide this information. It specifies a civil fine for violations and authorizes the Secretary of Labor to enforce the law. The bill is more expansive than any of the state laws, though, as it applies beyond the employment context: it would also prohibit both higher education institutions and local schools from requiring or requesting access to students’ or applicants’ personal email or social networking accounts.

In addition to statutory protections, common law might offer relief in some jurisdictions for aggrieved job applicants or employees if they can demonstrate an expectation of privacy in their Facebook postings and that those postings were improperly accessed by the employer. For example, on May 30, 2012, the U.S. District Court for the District of New Jersey denied an employer’s motion to dismiss a claim for invasion of privacy where the employer, a hospital, viewed and copied the Facebook postings of a nurse by coercing a co-worker to disclose the postings to a supervisor. The Court noted that privacy in social networking is an emerging but underdeveloped area of the law. Nevertheless, the Court concluded that the Plaintiff stated a claim for invasion of privacy because she had taken steps to protect her Facebook account from public view by inviting coworkers, but not management, to be Facebook “friends.”

Facebook is an inherently public communications medium. Individuals who post information on it and similar social media sites should recognize that their postings can be seen by anyone, including potential employers (regardless of their privacy settings). However, employers who require personal social media passwords as a condition of employment, or who engage in coercive behavior to access employee Facebook accounts when those employees have attempted to restrict access, face potentially significant legal risk.


1 House Bill 964, http://mlis.state.md.us/2012rs/bills/hb/hb0964t.pdf
2 Social Networking Online Protection Act, H.R.5050, http://thomas.loc.gov/cgi-bin/query/z?c112:H.R.5050:
3 Ohio, Senate Bill 351, http://legiscan.com/gaits/view/428731
4 H.F. No. 2963 (2011-2012), https://www.revisor.mn.gov/revisor/pages/search_status/status_detail.php?b=House&f=HF2963&ssn=0&y=2011
5 HB3782 (2011-2012), http://www.ilga.gov/legislation/BillStatus.asp?DocTypeID=HB&DocNum=3782&GAID=11&SessionID=84&LegID=61758
6 Gina Bittner, Gloucester County Times, Assemblyman John Burzichelli bill would deny boss access to employee Facebook (March 25, 2012), available at http://www.nj.com/gloucester-county/index.ssf/2012/03/assemblyman_john_burzichelli_b.html
7 AB 1844 (Campos), http://www.leginfo.ca.gov/cgi-bin/postquery?bill_number=ab_1844&sess=CUR&house=B&author=campos

Back to Top


Online Behavioral Advertising Accountability Program Faults Ad Companies' Data & Tracking Practices
By Karen Neuman

On May 30, 2012, the Council of Better Business Bureaus' Online Interest-Based Advertising Accountability Program (OBA Accountability Program) announced that inquiries into the data collection and use practices of seven online advertising companies resulted in decisions that the companies violated the Program’s Self-Regulatory Principles for Online Behavioral Advertising (OBA Principles). The decisions clarify that:

  • The Transparency and Consumer Control Principles cover all technologies (not just browser “cookies”) in the interest-based advertising space, including device fingerprinting;
  • Consumer disclosures should explain the company’s OBA practices and clearly notify consumers that the company adheres to the OBA Principles; and
  • Consumer opt-out mechanisms must have a five-year duration, and once a consumer has opted out, this choice must be honored at all times, including during exchanges of information such as cookie synching.

The specific practices at issue involved: 1) ad company privacy policies that were seen as failing to “fully comply with all transparency and consumer control requirements” to notify users that their visits to websites were being tracked for advertising purposes; and 2) impaired mechanisms for opting out of tracking.

For example, one of the companies failed to notify consumers that they were being tracked across devices online, and when users used tools on the Digital Advertising Alliance website to opt out of receiving ads, the ads were not blocked. Another company offered consumers an opt-out cookie through its website that failed to work through users’ browsers because it lacked a domain attribute. Yet another company, despite being a member of the OBA Accountability Program, was unaware of the OBA Principles. As a result, its opt-out cookie did not adhere to the five-year standard and instead expired after one year, and it failed to provide adequate notice and choice when it served an interest-based ad.

The OBA Accountability Program’s decisions come at a time of intense scrutiny by lawmakers, regulators and privacy class action lawyers of the privacy practices of a diverse array of businesses in the advertising supply and distribution chain. The resulting recommendations appear to be consistent with recent privacy policymaking by the FTC, including its recommendations for protecting consumer privacy, and remedial measures imposed by the FTC in several privacy enforcement actions that we reported here and here.

Members of the OBA Accountability Program should familiarize themselves with the program’s OBA Principles and undertake a review of their data collection, use and disclosure practices for compliance. They should also review their privacy policies to ensure compliance with the OBA Principles and that representations in those policies reflect actual practices.

Back to Top


FCC Issues Public Notice Seeking Comment on Mobile Service Provider Privacy Practices
By Ari Moskowitz

On May 25, 2012 the Federal Communications Commission (FCC) released a Public Notice (PN) soliciting comments on “the privacy and data-security practices of mobile wireless service providers with respect to customer information stored on their users’ mobile communications devices, and the application of existing privacy and security requirements to that information.”1 This inquiry comes on the heels of the “Carrier IQ” flap, in which a number of wireless communications service providers used software embedded in mobile devices to capture certain user data.

In support of this request for comments, the FCC cites section 222 of the Communications Act, which imposes on telecommunications carriers a duty to protect, and to limit the sharing and use of, “customer proprietary network information (CPNI)” and other proprietary information relating to their customers.2 The FCC now joins a chorus of lawmakers, class action lawyers and regulators looking to examine mobile privacy. The FCC first examined the privacy and security of customer information stored on mobile devices in 2007, including how that information is managed when those devices are refurbished and resold. In the current PN the FCC quotes comments filed in that earlier proceeding to highlight the need for new data. For example, in its filing AT&T explained that customers have the final say over what personal data is stored on their devices and “[c]arriers do not typically have access to such information.” The FCC contrasts this with a letter AT&T sent to Senator Al Franken in response to questions about AT&T’s use of Carrier IQ to collect information about its customers. In that letter, AT&T stated that it gathers customer data to “enhance its network reporting capabilities.”3

The FCC now seeks comments on privacy and data-security questions, including:

  • Are consumers given meaningful notice and choice with respect to service providers’ collection of usage-related information on their devices?
  • Do current practices raise concerns with respect to consumer privacy and data security?
  • Have these practices created actual data-security vulnerabilities?
  • Should privacy and data security be greater considerations in the design of software for mobile devices, and, if so, should the Commission take any steps to encourage such privacy by design?
  • What role can disclosure of service providers’ practices to wireless consumers play?
  • To what extent should consumers bear responsibility for the privacy and security of data in their custody or control?
  • What should be the obligations when service providers use a third party to collect, store, host, or analyze such data?

The FCC also asks about what factors are relevant in determining wireless providers’ obligations under the law, and proposes several factors:

  • Whether the device is sold by the service provider;
  • The degree of control that the service provider exercises over the design, integration, installation, or use of the software that collects and stores information;
  • The service provider’s role in selecting, integrating, and updating the device’s operating system, preinstalled software, and security capabilities;
  • The manner in which the collected information is used;
  • The role of third parties in collecting and storing data.

All businesses operating in the mobile space, not only carriers, could be affected by a potential rulemaking, policy statement or other FCC action on this issue. An FCC Public Notice typically, though not always, initiates a formal rulemaking proceeding that can culminate in the agency issuing new regulations. If the FCC were to restrict the ability of carriers to collect, use, or disclose customer data, it could affect the ability of handset manufacturers, app developers, analytics providers, and others to access and analyze this data. Moreover, the PN indicates that the FCC may be joining the FTC, Department of Commerce, and members of Congress in scrutinizing mobile privacy. The outcome of any of these efforts could lead to potentially conflicting or duplicative requirements. Comments are due 30 days after publication in the Federal Register and reply comments are due 45 days after publication.


1 Federal Communications Commission, Public Notice, COMMENTS SOUGHT ON PRIVACY AND SECURITY OF INFORMATION STORED ON MOBILE COMMUNICATIONS DEVICES, CC Docket No. 96-115 (May 25, 2012).
2 47 U.S.C. § 222(a), (c).
3 Supra n.1 at 3.

Back to Top


What does it mean to have an Open Web? Its Inventor Speaks Out.
By Ari Moskowitz

Over the last several weeks, technologists and policy makers have been speaking out about the future direction of the Internet and control over information. The discussion has included calls for both private industry and regulators to consider the effects of their products and regulations, respectively, on the open, universal nature of the Internet and the web. In past discussions the focus has been on open, nondiscriminatory access to the web unimpeded by gatekeepers; more recently, though, leaders have begun to consider the impact of privacy, and of calls for privacy regulation, on the future of the open Web.

Tim Berners-Lee, inventor of the World Wide Web, initiated the latest discussion with an essay in Scientific American. In it, he argues that seven principles underlying the web are being undermined – on the one side by governments violating people’s “network rights” and on the other side by social networking sites and apps closing off content from the broader web. Berners-Lee writes that two of the Web’s fundamental principles, Universality and Open Standards, are being undermined by social networks that put data on the web but do not assign the data a URL, which effectively locks that data into that network by making it impossible for anyone else to link to it. For example, though one can share a hyperlink to one’s LinkedIn profile, one cannot link to a particular employment experience. Similarly, one cannot export data from Facebook to fill in a LinkedIn profile. He levies a similar criticism against apps, which run over the Internet but not the web, and so are likewise “walled gardens.” He compares these to the America Online of the 1990s, which provided a subset of the web to its users.

Berners-Lee also argues that the web must be kept separate from the Internet, characterizing the web as an application that runs over the Internet. He asks us to think of the web as a refrigerator that runs on the electric grid (the Internet), and contends that we can ask the government to regulate the Internet, without regulating the web. The web is designed to be an open platform, but the Internet, he argues, is subject to gatekeepers that can restrict access to both it and the web. He argues that governments should put net neutrality into law to protect the principle of Electronic Human Rights. But he also argues that governments must avoid interfering with individual rights by refraining from Internet surveillance and censorship, and must provide due process before terminating an Internet connection or taking down a web site.

Berners-Lee concludes the article with an example of how the strengths of all these principles could be leveraged through “linked data,” a development that is arguably already well under way as the web and its applications become more social and unified. Berners-Lee promotes the idea that as more data is added to the web and machines are better able to understand different types of data, the web will become ever more useful. He suggests that linking big data could lead to curing diseases, new and better businesses, and more effective government. He acknowledges that the trend to “linked data” could lead to privacy issues, and urges that “legal, cultural and technical options that will preserve privacy without stifling beneficial data-sharing capabilities” must be examined. He proposes that governments, developers, and citizens must work together to preserve these principles and thereby promote a platform for innovation and progress.

Neelie Kroes, the vice-president of the European Commission, echoed Berners-Lee in a speech at the World Wide Web (WWW2012) conference on April 19. She declared her support for openness in the Internet ecosystem, and the elimination of “digital handcuffs.” Kroes argued that openness refers to the underlying architecture of the Internet, freedom of speech, the posting of public documents to the web, and open standards. She highlighted the importance of choice and of not stifling alternative models for business or self-expression. In discussing the importance of alternative business models, she advocated that Europe update its copyright rules to account for “online realities.”

In her calls for regulation, Kroes agreed with Berners-Lee on the importance of legislating net neutrality but qualified her support by arguing that net neutrality does not mean banning all special Internet offers. Rather, it means being transparent about the transaction, its costs, and its benefits, while guaranteeing the availability of a choice to have full access to the Internet.

Kroes disagreed with Berners-Lee on two points. On the detriments of walled gardens, Kroes is more confident that, should consumers be given a choice, they will choose full Internet access over a closed environment. She also insists that privacy laws apply as forcefully on the Internet as they do elsewhere, asserting that “when you go online, you aren't stripped of your fundamental right to privacy.”

This discussion, which is only the most recent in a decades-long debate over control of the Internet and information, highlights the urgency with which both policymakers and technologists view these issues. Businesses are moving ever more quickly to collect and use data in new and innovative ways to the benefit of consumers. Public policy, always playing catch-up, appears to be falling ever farther behind.

Back to Top


A Rolling Stone Gathers No Moss & Neither Should Your Privacy Policy: The FTC-Myspace Settlement
By Karen Neuman

On May 8, 2012, Myspace and the FTC entered into a proposed consent decree to settle charges that the company misrepresented in its privacy policy how it collects and uses customer data. The settlement is yet another in a growing list of actions taken by the FTC to hold operators of websites, online services and digital service providers accountable for privacy policy promises. The proposed settlement order bars Myspace from misrepresenting the extent to which it protects the privacy of users' personal information. As in the case of last year’s Google Buzz, Twitter and Chitika settlements, the decree requires that Myspace establish a comprehensive privacy program designed to protect consumers' information, and obtain biennial assessments of its privacy program by independent, third-party auditors for 20 years.

The FTC alleged that the social network company’s privacy policy promised it would not share users’ personally identifiable information, or use such information in a way that was inconsistent with the purpose for which it was submitted, without first giving notice to users and receiving their permission to do so. The privacy policy also promised that the information used to customize ads would not individually identify users to third parties and that Myspace would not share users’ non-anonymized browsing activity. According to the Complaint, Myspace nevertheless provided advertisers with the Friend ID of users who were viewing particular pages on the site. Advertisers could use the Friend ID to locate a user's Myspace profile to obtain personal information publicly available on the profile and, in most instances, the user's full name. Advertisers also could combine the user's real name and other personal information with additional information to link broader web-browsing activity to a specific individual.

Myspace also certified that it complied with the U.S.-EU Safe Harbor Framework, which provides a method for U.S. companies to transfer personal data lawfully from the European Union to the United States. As part of its self-certification, Myspace claimed that it complied with the Safe Harbor Principles, including the requirements that consumers be given notice of how their information will be used and the choice to opt out. The FTC alleged that these statements were false. The consent decree bars Myspace from misrepresenting the extent to which it complies with any compliance program, including the Safe Harbor.

Lessons Learned.

The FTC continues to focus on perceived disconnects between privacy policy promises and data collection, use, sharing and retention practices. In order to minimize the risk of unwanted attention and possible enforcement action, the following practices should be obvious by now:

  • Conduct regular reviews of your data practices and privacy policies. Many companies devote considerable time and resources to creating a privacy policy and believe they’ve met their privacy obligations. However, privacy policies should be seen as a snapshot in time. Adding new technology that captures or uses consumer data in ways not previously anticipated, or other intervening operational or legal developments, requires that you regularly review your privacy policy to make sure it is current and won’t be seen as misleading.
  • Designate one person to be in charge of privacy for your company. If that person is not an experienced privacy lawyer he or she should consult with one to make sure your company complies with all applicable laws, especially if your company handles sensitive information (e.g., health, financial, employment or children’s data).
  • Perform due diligence on agreements with all third-party vendors that have access to and handle your users’ personal information to ensure, at a minimum, that those practices are consistent with your privacy policy representations. Due diligence should be performed for new agreements, including when the agreement results from an acquisition of the entity with which you originally contracted.
  • Review your default settings to make sure that your users aren’t disclosing personal information to third parties without having provided knowing consent.

Bottom line: Don’t say it if you don’t do it.



Back to Top


ICANN Data Breach Exposes Customer Data as a Result of System Breakdown
By Ari Moskowitz

In mid-April, the Internet Corporation for Assigned Names and Numbers (ICANN), the nonprofit organization responsible for assigning and maintaining Internet addresses, announced a data breach that exposed the user and file names of certain users1 applying for new Top Level Domains. Top level domains are the ends of web addresses that have thus far been restricted to “.com”, “.org” and 19 other terms (excluding country-specific domains). The glitch occurred in the Top Level Domain Application System, which ICANN was using to process applications for new top level “dot.brandanything” domains. In January 2012 ICANN announced that organizations would be able to apply for these top level domains until April 12, 2012. (Google recently announced that it applied for the “.google”, “.docs” and “.youtube” domains.) The deadline was extended to May 30, 2012 as a result of the glitch.

ICANN blamed the incident on a technical glitch associated with how the system was processing attachments, even though the system was tested prior to launch. ICANN explained that the glitch was not the result of a hack or cyber-attack. Nevertheless, it shut down the affected system from April 12 until May 21. As of early May, ICANN had yet to notify the affected users and specify how many were actually affected, though it stated that it would do so once its investigation was completed.

Some businesses that may not have initially wanted to apply for a new domain name may, however, have considered applying anyway as a defensive strategy to prevent other entities from registering their name and potentially damaging their brand. Other businesses may have applied for a name in anticipation of a new product launch. For example, in the same blog post in which Google announced its application for its brand domains, it indicated that it applied for others that the company thinks have “interesting and creative potential.” Because it remains unclear what data may have been compromised, organizations should try to obtain information about whether their trade or proprietary information was compromised.

This incident demonstrates that even sophisticated engineering and technical solutions to protecting data are not substitutes for robust privacy practices and data security plans and procedures. These practices and procedures must be constantly evaluated and modified to address any intervening changes in technology, system capacity or other known or suspected vulnerabilities. All companies that collect, handle and retain data must be prepared to minimize damage, resolve the issue, and, in most states, notify affected customers. Not only are these good privacy practices; increasingly, they are also required by law.


1 ICANN, TAS Interruption - Frequently Asked Questions, http://newgtlds.icann.org/en/applicants/tas/interruption-faqs

Back to Top


Federal Automobile “Black Box” Legislation Expands Vehicle Manufacturers’ Focus Beyond Safety to Privacy
By Ari Moskowitz

Congress is currently considering legislation that would mandate the inclusion of Event Data Recorders (EDRs) in cars, adding potential privacy, legal, and business risks to the safety issues that the industry has been focused on. EDRs, like “black boxes” in airplanes, record and store data about a vehicle, including its speed, safety belt use, airbag deployment, and a variety of other safety metrics.1 The data are used to determine the causes of a crash and uncover and fix flaws in vehicles to prevent future crashes.

EDRs have been installed in most U.S. vehicles since the 1990s. Though not currently required by law, 85% of cars already have EDRs installed by the vehicle manufacturers. Many of these devices are seen as being of limited value in defending against lawsuits because they typically record only a few seconds of data preceding a crash. Since 2004, EDRs have been the subject of limited federal and state regulation. That year California passed a law requiring that vehicle manufacturers disclose the presence of an event data recorder in the car’s owner’s manual. The National Highway Traffic Safety Administration (NHTSA) adopted a rule in August 2006 that made this requirement apply nationally, but it also set some minimum standards about what information must be collected by EDRs.

The two most viable Congressional bills for mandatory EDR installation are the Moving Ahead for Progress in the 21st Century Act (MAP-21), S.1813, approved by the Senate in April, and H.R. 14, one of several highway funding measures pending in the House. H.R. 14 includes a section on EDRs that is identical to the EDR provision in S.1813 and has been pending since March. One of the House funding measures is expected to be voted on and reconciled with the Senate version before the summer recess.

Privacy advocates have raised concerns about EDRs, including data ownership, security and the potential for misuse of EDR data. They fear, for example, that if information captured by an EDR is shared with insurance companies, it could lead to higher premiums or insurance denial. If the information is shared with traffic enforcement authorities, it could theoretically lead to ticketing drivers for speeding even when drivers were not detected by law enforcement. Privacy advocates also fear that EDR technology will inevitably lead to increasingly invasive tracking and monitoring of driver behavior and activities unrelated to safety. Both the Senate and House measures contain provisions that attempt to address these concerns, including assigning ownership over EDR data to the vehicle owner or lessee, and prohibiting access to the data except in specified limited circumstances.2 However, the bill does not address such important details as how the data must be protected once it is retrieved under one of the exceptions.

The EDR section in MAP-213 mandates that all “new passenger motor vehicles” sold in the United States be equipped with EDRs beginning with model year 2015. The recorders will be required “to capture and store data related to motor vehicle safety covering a reasonable time period before, during, and after a motor vehicle crash or airbag deployment, including a rollover.”

The bill’s privacy and data security provisions mandate certain “Limitations on Information Retrieval.” In particular, all of the data stored in an EDR will be treated as the property of the person that owns or is leasing that car. Also, data stored on an EDR may not be retrieved by anyone other than the vehicle owner or lessee except in four circumstances: (1) a court order, (2) when the owner consents, (3) when retrieved in the course of an inspection or investigation authorized by federal law, on the condition that the owner’s personally identifiable information and the vehicle identification number are not connected to the retrieved data, and (4) when the information is retrieved to facilitate or determine the necessity of an “emergency medical response in response to a motor vehicle crash.”

Many of the specific requirements regarding data security standards and implementation would be left to rulemakings and reports by the Department of Transportation (DOT). These include requiring that the DOT initiate a rulemaking to specify a format in which EDR data must be made accessible with commercially available equipment, and establish specific data security requirements to prevent unauthorized access to the data.

MAP-21 appears to be a mixed bag for car manufacturers. Most already voluntarily install EDRs in their vehicles. But the bill appears to significantly limit the ability of manufacturers to access EDR data without the consent of either the vehicle owner or the courts, potentially impeding the identification and resolution of safety flaws. Also, law enforcement access to the data could theoretically make consumers reluctant to purchase cars loaded with this technology (though should it be enacted, there will not be a choice after 2015).

If MAP-21 is enacted, DOT will be required to study and report on EDR safety and privacy issues, as well as the costs and benefits of the devices. Vehicle manufacturers should closely monitor the progress of this legislation and if it becomes law, the DOT proceedings that will follow. Vehicle manufacturers should also be prepared to develop carefully tailored privacy and data security strategies as they join the increasing number of industries competing for consumer business on the basis of good privacy practices. Doing so should also help minimize the potential for unwanted attention by the privacy class action bar.


1 National Conference of State Legislatures, available at http://www.ncsl.org/issues-research/telecom/event-data-recorder-quotblack-box-quot-le135.aspx
2 S.1813; H.R.14
3 Sec. 31406

Back to Top


UPDATES

Netflix Settles Privacy Class Action Lawsuit over Data Retention Practices

On June 1, 2012, Netflix filed a Settlement Agreement in a class action alleging that the company unlawfully retained personally identifiable information about its former customers in violation of the Video Privacy Protection Act (VPPA)1. The VPPA prohibits video rental services from sharing customer rental history and viewing data without prior written consent, and limits how long that information can be retained.

The Complaint, filed in 2011, alleged that Netflix failed to destroy the Plaintiffs’ rental and payment history after the Plaintiffs cancelled their Netflix subscriptions. The VPPA requires deletion of such personally identifiable information “as soon as practicable,” and in any event within one year from the date that it no longer serves the purpose for which it was collected. The Plaintiffs continued receiving marketing messages from Netflix that seemed to be generated on the basis of their rental history, and former customers who rejoined Netflix had their previous video “queues” reactivated, suggesting that Netflix had retained their viewing histories and preferences. The Settlement Agreement indicates that Netflix will remove information about former customers’ rental history one year after those accounts are canceled, and will do the same for current customers. In addition, Netflix will pay a total of $9 million, most of which will be distributed to various nonprofit groups; a smaller portion will be awarded as attorneys’ fees.

The video rental industry is subject to explicit statutory requirements for handling customer data. The law’s provisions can be enforced by a private right of action, including one seeking monetary damages. (We previously reported, however, that the U.S. Court of Appeals for the 7th Circuit ruled that injunctive relief, and not monetary damages, is the only method for enforcing the VPPA’s data destruction provisions).

Back to Top


UK Cookie Compliance Grace Period Expires as ICO Permits Implied Consent

On May 25, 2012, the United Kingdom Information Commissioner’s Office (ICO) announced that the one-year grace period from enforcement for compliance with its amended Privacy and Electronic Communications Regulations (implementing amendments to the EU’s 2002 Privacy and Electronic Communications Directive) would expire May 27, 2012. As we previously reported, the grace period was granted to give UK website operators (as well as U.S.-based operators serving UK consumers) time to comply with the rules, which require informed consent before operators can deposit cookies on a user’s computer or mobile device. Hours before the expiration of the grace period, the ICO announced that implied consent will be legally permissible as long as it is “informed.” Initially, full compliance will not be required; however, companies must implement measures intended to lead to full compliance, including performing a “cookie” audit, redesigning and placing privacy notices to make them more visible, and creating an actionable compliance plan.

The ICO posted information on its website, including a video, explaining how companies that wish to rely on implied consent may do so under the rules, including the following:

  • Implied consent is a valid form of consent and can be used in the context of compliance with the revised rules on cookies.
  • If you are relying on implied consent you need to be satisfied that your users understand that their actions will result in cookies being set. Without this understanding you do not have their informed consent.
  • You should not rely on the fact that users might have read a privacy policy that is perhaps hard to find or difficult to understand.
  • In some circumstances, for example where you are collecting sensitive personal data such as health information, you might feel that explicit consent is more appropriate.

Members of the public will be able to report concerns about cookies on the ICO website, which the ICO will use to assist with enforcement.

Back to Top


The PCI Data Security Standards Council (PCI SSC) Issues Mobile Payment Acceptance Fact Sheet

On May 16, 2012, the PCI SSC Mobile Working Group issued guidance for secure acceptance of payments by merchants using mobile devices such as smart phones and tablets at the point of sale. The guidance is intended to be a first step to facilitate fully secure mobile payments. The At a Glance Mobile Payment Acceptance Security guidance (Guidance) is intended to help merchants securely accept payment consistent with the PCI data security standard (PCI DSS). (This standard requires merchants to protect cardholder data during collection, transmittal and storage.) The Guidance recommends that merchants secure account data at the point of sale by using a validated point-to-point solution that encrypts customer payment data and adequately secures it before it enters the mobile payment “acceptance device.” The Guidance indicates that implementing the recommended practices will reduce the scope of merchants’ PCI compliance obligations.

Interestingly, a freeze on approvals of card acceptance software for mobile card acceptance imposed last year by the PCI SSC remains in effect.

As we previously reported, last year the PCI Data SSC issued payment card industry guidelines for virtualized environments.

Back to Top


Federal Court Grants Class Certification in Song-Beverly Zip Code Action

On May 4, 2012, the U.S. District Court for the Southern District of California granted class certification in Yeoman v. IKEA U.S.A. West, Inc. The Plaintiffs alleged that IKEA requested and electronically stored customer zip code information during retail credit card transactions in violation of California’s Song-Beverly Credit Card Act of 1971 (Act)1. The Act prohibits retailers from collecting personal information during a credit card transaction:

[N]o person or … corporation that accepts credit cards for the transaction of business shall: … (2) Request, or require as a condition to accepting the credit card as payment … for goods or services [] the cardholder to provide personal identification information, which the person … or corporation accepting the credit card writes, causes to be written, or otherwise records …2

“Personal identification information” is defined as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.”3

As we previously reported, in 2011 the California Supreme Court ruled that zip codes are personal information under the Act. Following this decision, IKEA stopped collecting zip code information; it subsequently offered customers a loyalty program that they could enroll in by providing information that included a physical mailing address. The company argued that the class definition was overbroad because it included these individuals, who voluntarily provided personal information; accordingly, IKEA sought to exclude them from the class. The District Court rejected IKEA’s arguments, reasoning that allowing retailers to collect zip code information during a credit card transaction from persons who voluntarily provide that or other personal information for store promotions would undermine the Act’s primary goal of preventing store clerks from collecting consumers’ personal information. This decision demonstrates that collecting personal information in connection with credit card transactions in California, even information voluntarily provided prior to the particular transaction, poses significant risk for retailers.


1 California Civil Code § 1747.08.
2 Id., § 1747.08(a)(2).
3 Id., § 1747.08(b).

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036