|||||||||||||||
PRIVACY & INFORMATION LAW UPDATE
|||||||||||||||
FEATURE ARTICLE

Introduction. An emerging market for education software and digital content has attracted entrepreneurs and investors looking for new sectors in which to innovate and profit. Examples include online learning management systems, question-and-answer services for students and teachers, assistive applications for special-needs students, web-based plagiarism detection tools, behavior management software, and digital badges that represent academic accomplishment. These products and services are being integrated into traditional brick-and-mortar learning as well as online alternatives. Conveniences familiar from the consumer context, including single sign-on and personalization, are being implemented in education to promote adoption of these technologies and facilitate their use. Entry into this burgeoning market can pose legal and reputational risk for companies unfamiliar with privacy issues and corresponding legal obligations that are unique to education. These risks can be minimized by adopting an approach early in development that reflects an understanding of how a product or service captures, uses, stores, and discloses student data; interacts with other organizations in the ecosystem; and triggers legal obligations.

The Education Technology Ecosystem & Privacy Law. Students interact with education technology in an ecosystem where their data is disclosed, accessed, stored, and shared over computing (including mobile) devices, cloud, email and internet services, social media platforms and, increasingly, mobile apps. Growing interoperability that is intended to facilitate the sharing of user data in other contexts may run counter to individual privacy expectations and specific laws in an education setting. For example, a suite of hosted email and collaboration tools for schools can trigger questions about whether the service provider is collecting, using, and sharing personal information from children in violation of the Children's Online Privacy Protection Act1 (COPPA). COPPA prohibits commercial (and in limited instances nonprofit) operators of websites, online services, or mobile apps from collecting personal information from children under 13 without first providing notice to parents and obtaining their verifiable consent. Under the implementing COPPA rule, adopted and enforced by the Federal Trade Commission (FTC), personal information includes email or physical address, full name, and in some instances date of birth and gender. The sharing of this information with third parties, and the purpose for doing so (for example, to serve ads), must be disclosed in a required privacy policy. A hosted virtual environment in which a game is used for student assessment could, in addition to raising COPPA questions, trigger Family Educational Rights and Privacy Act (FERPA) obligations for the school, and possibly for the game's host or operator. FERPA is the federal law that protects the privacy of student education records and governs how information in those records can be disclosed. An education record is any record that contains personally identifiable information directly related to a particular student and maintained by the school or by a third party acting on the school's behalf. It can include computer data and can be stored in a database or on a server.
FERPA prohibits schools that receive Department of Education (DOE) funds (and, in limited circumstances, specified third parties) from disclosing personal information in a student's education record without the student's written consent or, if the student is under 18, parental consent. The implementing rules are enforced by DOE. Although there is no private right of action for noncompliance, an enforcement action can be triggered when a complaint is filed by a student or parent. The DOE may impose sanctions for noncompliance, including the suspension or revocation of federal funds, a remedy that could make schools wary of adopting products or services that are seen as creating FERPA risk. Classroom use of assistive technology can also raise FERPA compliance obligations, as well as similar privacy obligations under the Individuals with Disabilities Education Act (IDEA). Examples of assistive devices, and the entities with potential access to data captured by them, include word-prediction or voice-to-text tools whose data can be accessed by the device manufacturer, the hosted database provider, or a third party studying language-based learning disabilities. Unlike FERPA, the IDEA provides a private right of action to enforce the confidentiality of personally identifiable information, and this right could be a deterrent to schools' adoption of certain products.

FTC Privacy Framework. Companies that make digital learning products or services that collect, use, share, and store user information should ensure that their privacy and data security practices adhere to the FTC framework for protecting consumer privacy. The framework recommends, but does not require, that companies implement Do-Not-Track (DNT) tools; it also asks Congress to enact DNT legislation that would mandate their implementation. Other key recommendations are widely seen as providing a roadmap for avoiding an FTC enforcement action for misleading or deceptive privacy practices. These recommendations urge companies to implement Fair Information Practice Principles for collecting and protecting user data, including providing users with meaningful notice about data collection, retention, and use; offering easily understood and implemented opt-out choices; practicing data minimization (collecting only the data necessary for a particular purpose); and maintaining adequate data integrity and security. This framework builds on recent FTC enforcement actions against a variety of online and digital companies for misrepresenting data collection and use practices, such as using consumer data for secondary purposes materially different from the purpose for which it was originally collected, or failing to adhere to users' opt-out choices about those practices.

Conclusion. The market for education products and services presents new opportunities for entrepreneurs and their investors. The increasingly interoperable ecosystem in which education software and digital content is being adopted requires an understanding by providers, students, and schools of how student data is collected, used, and shared, often by many parties. This data flow can raise potentially complex questions about ownership, control, and liability. Once these questions are fully understood, relationships can be structured to allocate risk and liability.
By adequately addressing these questions at the outset, companies will be better situated to avoid turning a promising new venture into a disappointing one, or an investment opportunity that initially seemed appealing into a costly one.
1 15 U.S.C. § 6501.
|||||||||||||||
FTC Holds Workshop on Mobile Disclosures

The FTC recently convened a workshop on how required consumer disclosures can be conveyed on mobile devices, part of its proceeding to update its dot.com disclosure guidance. The panels were divided into advertising and privacy, although there was some overlap, as panelists agreed that small screen size and consumer behaviors, including multitasking, pose significant challenges to designing and conveying required disclosures to consumers. The privacy panel focused on how industry can provide consumers with legally compliant notice and choice on small mobile device screens. The efficacy of privacy icons, including the importance of design and visibility, was debated, as were layered or otherwise streamlined privacy policies and opt-out mechanisms. Many of these approaches are still in development by industry associations. Panelists also discussed presenting disclosures in context: examples of contextualization include presenting disclosures before the completion of a transaction or before a site or app transmits a user's personal information. There was general consensus that more invasive privacy practices require more robust disclosure. There was also consensus, however, that even when information about a site's or app's privacy practices is presented to consumers, they don't understand what information is being collected from them and what is being done with it, because privacy disclosures tend to be overly legalistic or conveyed in language with which the typical consumer is unfamiliar. Although the dot.com proceeding is focusing on how to convey consumer protection disclosures over mobile devices, communicating this information in language that can be easily understood and acted on is not such a new challenge: advertisers have long been criticized for failing to adequately convey important consumer protection information through traditional media (for example, television infomercials). Businesses are now engaging consumers through mobile devices, often as part of an integrated campaign that includes traditional media. The dot.com proceeding is unfolding as policymakers at all levels of government struggle to balance consumer demand for data-driven products and services with ensuring that consumers have ready access to actionable information about those products and services.
|||||||||||||||
Legislative and Judicial Developments Limit Employer Access to Employee and Job Applicant Social Media & Online Accounts

In January 2012 the American Civil Liberties Union (ACLU) sent a letter to the Maryland Department of Corrections (DOC) on behalf of corrections officer Robert Collins asking it to rescind its policy requiring that employees and job applicants provide passwords to their social media accounts. The Maryland legislature and Governor took note, and in May 2012 legislation1 was enacted prohibiting employers from requesting or requiring that job applicants or employees provide user names or passwords to any personal account or service [accessed] through an electronic communications device. Although the law was enacted in response to the MD DOC's request for a job applicant's social media account and its policy requiring employees to turn over the same information, it applies broadly to requests for passwords to any personal web-based accounts or mobile apps. The law also prohibits employers from disciplining employees who refuse to turn over personal web-based account information. Employers are permitted, however, to seek such account information to conduct an investigation if they have reason to believe that an employee is using a personal account for business purposes, or is downloading proprietary information without authorization to a personal account. Similar laws have been introduced or proposed in the U.S. Congress and a variety of states, including Ohio, Minnesota, Illinois, New Jersey, and California. The California bill recently passed through committee and is now before the full legislature. It differs from the Maryland law in that it specifically applies to social media accounts, broadly defining social media. The Minnesota and Illinois proposals are narrower still, applying only to social networking sites, defined in both bills as a site that allows the creation of a public profile and a list of other users with whom the user is connected, and allows users to navigate those lists. These bills exempt e-mail accounts from the definition. More states are expected to introduce similar laws in the coming months.

The issue has not escaped the attention of the U.S. Congress. The Social Networking Online Protection Act, introduced in the U.S. House of Representatives in April, is limited to personal email accounts and social networking websites, which are defined as a service primarily intended for the management of user-generated personal content by users with distinct user names or passwords. The bill would prohibit employers from requiring or requesting user names or passwords to employees' or applicants' personal email or social networking accounts and from retaliating against those who refuse to provide this information. It specifies a civil fine for violations and authorizes the Secretary of Labor to enforce the law. This bill is more expansive than any of the state laws, though, as it applies beyond the employment context: it would also prohibit both higher education institutions and local schools from requiring or requesting access to students' or applicants' personal email or social networking accounts.

In addition to statutory protections, the common law might offer relief in some jurisdictions for aggrieved job applicants or employees who can demonstrate an expectation of privacy in their Facebook postings and show that those postings were improperly accessed by the employer. For example, on May 30, 2012, the U.S. District Court for the District of New Jersey denied an employer's motion to dismiss a claim for invasion of privacy where the employer, a hospital, viewed and copied the Facebook postings of a nurse by coercing a co-worker to disclose the postings to a supervisor. The Court noted that privacy in social networking is an emerging but underdeveloped area of the law. Nevertheless, the Court concluded that the Plaintiff stated a claim for invasion of privacy because she had taken steps to protect her Facebook account from public view by inviting coworkers, but not management, to be Facebook friends.

Facebook is, by its very nature, a public communications medium. Individuals who post information on it and similar social media sites should recognize that their postings can be seen by anyone, including potential employers, regardless of their privacy settings. However, employers who require personal social media passwords as a condition of employment, or who engage in coercive behavior to access employee Facebook accounts when those employees have attempted to restrict access, face potentially significant legal risk.
1 House Bill 964, http://mlis.state.md.us/2012rs/bills/hb/hb0964t.pdf
|||||||||||||||
Online Behavioral Advertising Accountability Program Faults Ad Companies' Data & Tracking Practices

On May 30, 2012, the Council of Better Business Bureaus' Online Interest-Based Advertising Accountability Program (OBA Accountability Program) announced that inquiries into the data collection and use practices of seven online advertising companies had resulted in decisions that the companies violated the Program's Self-Regulatory Principles for Online Behavioral Advertising (OBA Principles). The decisions clarify how the OBA Principles' transparency and consumer-control requirements apply in practice.
The specific practices at issue involved: 1) ad company privacy policies that were seen as failing to fully comply with the transparency and consumer-control requirements to notify users that their visits to websites were being tracked for advertising purposes; and 2) impaired mechanisms for opting out of tracking. For example, one of the companies failed to notify consumers that they were being tracked across devices online, and when users employed tools on the Digital Advertising Alliance website to opt out, their choices were not honored. Another company offered consumers an opt-out cookie through its website that failed to work in users' browsers because it lacked a domain attribute (the sketch following this item illustrates why that attribute matters). Yet another company, although a member of the OBA Accountability Program, was unaware of the OBA Principles; as a result, its opt-out cookie did not adhere to the five-year standard and instead expired after one year, and the company failed to provide adequate notice and choice when it served an interest-based ad. The OBA Accountability Program's decisions come at a time of intense scrutiny of the privacy practices of a diverse array of businesses in the advertising supply and distribution chain by lawmakers, regulators, and privacy class action lawyers. The resulting recommendations appear to be consistent with recent privacy policymaking by the FTC, including its recommendations for protecting consumer privacy and the remedial measures imposed by the FTC in several privacy enforcement actions that we have previously reported. Members of the OBA Accountability Program should familiarize themselves with the OBA Principles and review their data collection, use, and disclosure practices for compliance. They should also review their privacy policies to ensure both that the policies comply with the OBA Principles and that representations in those policies reflect actual practices.
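For readers curious why a missing domain attribute defeats an opt-out cookie: browsers treat a cookie set without a Domain attribute as host-only, so it is returned only to the exact host that set it and never reaches the ad network's other subdomains. The following Python sketch is illustrative only; the ad network domain example-adnetwork.com is a hypothetical assumption, not any company's actual opt-out code.

    # A minimal sketch of opt-out cookies with and without a Domain attribute.
    from http.cookies import SimpleCookie

    FIVE_YEARS = str(5 * 365 * 24 * 60 * 60)  # the OBA five-year standard, in seconds

    # Host-only cookie: with no Domain attribute, browsers return it only
    # to the exact host that set it, so the opt-out never reaches the
    # network's other ad-serving subdomains.
    broken = SimpleCookie()
    broken["optout"] = "1"
    broken["optout"]["path"] = "/"
    broken["optout"]["max-age"] = FIVE_YEARS

    # Domain-scoped cookie: the Domain attribute makes browsers send the
    # cookie to every host under example-adnetwork.com (hypothetical),
    # so all of the network's ad servers can honor the user's choice.
    working = SimpleCookie()
    working["optout"] = "1"
    working["optout"]["path"] = "/"
    working["optout"]["domain"] = ".example-adnetwork.com"
    working["optout"]["max-age"] = FIVE_YEARS

    print(broken.output())   # Set-Cookie: optout=1; Max-Age=157680000; Path=/
    print(working.output())  # ...; Domain=.example-adnetwork.com; Max-Age=157680000; Path=/

The domain-scoped version is what a multi-subdomain ad network needs; the host-only version is the kind of impaired opt-out mechanism the Accountability Program faulted.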
|||||||||||||||
FCC Issues Public Notice Seeking Comment on Mobile Service Provider Privacy Practices

On May 25, 2012 the Federal Communications Commission (FCC) released a Public Notice (PN) soliciting comments on the privacy and data-security practices of mobile wireless service providers with respect to customer information stored on their users' mobile communications devices, and on the application of existing privacy and security requirements to that information.1 The inquiry comes on the heels of the Carrier IQ flap, in which a number of wireless service providers used software embedded in mobile devices to capture certain user data. In support of its request for comments, the FCC cites its authority under section 222 of the Communications Act, which imposes on telecommunications carriers a duty to protect, and to limit the sharing and use of, customer proprietary network information (CPNI) and other proprietary information relating to their customers. The FCC now joins a chorus of lawmakers, class action lawyers, and regulators looking to examine mobile privacy. In 2007 the FCC first examined the privacy and security of customer information stored on mobile devices, and how that information is managed when those devices are refurbished and resold. In the current PN the FCC quotes comments filed in that earlier proceeding to highlight the need for new data. For example, in its filing AT&T explained that customers have the final say over what personal data is stored on their devices and that [c]arriers do not typically have access to such information. The FCC contrasts this with a letter AT&T sent to Senator Al Franken in response to questions about AT&T's use of Carrier IQ to collect information about its customers. In that letter, AT&T stated that it gathers customer data to enhance its network reporting capabilities. The FCC now seeks comment on a range of privacy and data-security questions.
The FCC also asks what factors are relevant in determining wireless providers' obligations under the law, and proposes several for comment.
All businesses operating in the mobile space, not only carriers, could be affected by a potential rulemaking, policy statement, or other FCC action on this issue. A Public Notice can be a precursor to a formal rulemaking proceeding that culminates in the agency issuing new regulations. If the FCC were to restrict the ability of carriers to collect, use, or disclose customer data, it could affect the ability of handset manufacturers, app developers, analytics providers, and others to access and analyze that data. Moreover, the PN indicates that the FCC may be joining the FTC, the Department of Commerce, and members of Congress in scrutinizing mobile privacy, and the outcome of these parallel efforts could be conflicting or duplicative requirements. Comments are due 30 days after publication in the Federal Register, and reply comments are due 45 days after publication.
1 Federal Communications Commission, Public Notice, Comments Sought on Privacy and Security of Information Stored on Mobile Communications Devices, CC Docket No. 96-115 (May 25, 2012).
|||||||||||||||
What Does it Mean to Have an Open Web? Its Inventor Speaks Out.

Tim Berners-Lee, inventor of the World Wide Web, initiated the latest discussion with an essay in Scientific American. In it, he argues that seven principles underlying the web are being undermined on one side by governments violating people's network rights and on the other by social networking sites and apps closing off content from the broader web. Berners-Lee writes that two of the web's fundamental principles, Universality and Open Standards, are being undermined by social networks that put data on the web but do not assign the data a URL, which effectively locks that data into the network by making it impossible for anyone else to link to it. For example, though one can share a hyperlink to one's LinkedIn profile, one cannot link to a particular employment experience. Similarly, one cannot export data from Facebook to fill in a LinkedIn profile. He levies a similar criticism against apps, which run over the Internet but not the web, and so are likewise walled gardens. He compares these to the America Online of the 1990s, which provided a subset of the web to its users. Berners-Lee also argues that the web must be kept separate from the Internet, characterizing the web as an application that runs over the Internet. He asks us to think of the web as a refrigerator that runs on the electric grid (the Internet), and contends that we can ask the government to regulate the Internet without regulating the web. The web is designed to be an open platform, but the Internet, he argues, is subject to gatekeepers that can restrict access to both it and the web. He argues that governments should put net neutrality into law to protect the principle of Electronic Human Rights. But he also argues that governments must avoid interfering with individual rights by refraining from Internet surveillance and censorship, and must provide due process before terminating an Internet connection or taking down a web site. Berners-Lee concludes the article with an example of how the strengths of all these principles could be leveraged through linked data, a development that is arguably already well under way as the web and its applications become more social and unified. He promotes the idea that as more data is added to the web and machines become better able to understand different types of data, the web will become ever more useful. He suggests that linking big data could lead to cures for diseases, new and better businesses, and more effective government. He acknowledges that the trend toward linked data could raise privacy issues, and urges that legal, cultural, and technical options that will preserve privacy without stifling beneficial data-sharing capabilities be examined. He proposes that governments, developers, and citizens work together to preserve these principles and thereby promote a platform for innovation and progress.

Neelie Kroes, vice-president of the European Commission, echoed Berners-Lee in a speech at the World Wide Web (WWW2012) conference on April 19. She declared her support for openness in the Internet ecosystem and the elimination of digital handcuffs. Kroes argued that openness refers to the underlying architecture of the Internet, freedom of speech, the posting of public documents to the web, and open standards. She highlighted the importance of choice and of not stifling alternative models for business or self-expression.
In discussing the importance of alternative business models, she advocated that Europe update its copyright rules to account for online realities. In her calls for regulation, Kroes agreed with Berners-Lee on the importance of legislating net neutrality, but qualified her support by arguing that net neutrality does not mean banning all special Internet offers. Rather, it means being transparent about the transaction, its costs, and its benefits, while guaranteeing the availability of a choice of full access to the Internet. Kroes disagreed with Berners-Lee on two points. On the detriments of walled gardens, Kroes is more confident that, given a choice, consumers will choose full Internet access over a closed environment. She also insists that privacy laws apply as forcefully on the Internet as they do elsewhere, asserting that when you go online, you aren't stripped of your fundamental right to privacy. This discussion, only the most recent in a decades-long debate over control of the Internet and information, highlights the urgency with which both policymakers and technologists view these issues. Businesses are moving ever more quickly to collect and use data in new and innovative ways to the benefit of consumers. Public policy, always playing catch-up, appears to be falling ever farther behind.
|||||||||||||||
A Rolling Stone Gathers No Moss & Neither Should Your Privacy Policy: The FTC-Myspace Settlement

The FTC recently announced a settlement with Myspace over allegedly deceptive privacy practices. The FTC alleged that the social network company's privacy policy promised that it would not share users' personally identifiable information, or use such information in a way that was inconsistent with the purpose for which it was submitted, without first giving notice to users and receiving their permission to do so. The privacy policy also promised that the information used to customize ads would not individually identify users to third parties and that non-anonymized browsing activity would not be shared. According to the Complaint, Myspace nevertheless provided advertisers with the Friend ID of users who were viewing particular pages on the site. Advertisers could use the Friend ID to locate a user's Myspace profile and obtain the personal information publicly available on it, including, in most instances, the user's full name. Advertisers also could combine the user's real name and other personal information with additional information to link broader web-browsing activity to a specific individual. Myspace also certified that it complied with the U.S.-EU Safe Harbor Framework, which provides a method for U.S. companies to transfer personal data lawfully from the European Union to the United States. As part of its self-certification, Myspace claimed that it complied with the Safe Harbor Principles, including the requirements that consumers be given notice of how their information will be used and the choice to opt out. The FTC alleged that these statements were false. The consent decree bars Myspace from misrepresenting the extent to which it complies with any compliance program, including the Safe Harbor.

Lessons Learned. The FTC continues to focus on perceived disconnects between privacy policy promises and actual data collection, use, sharing, and retention practices. To minimize the risk of unwanted attention and possible enforcement action, companies should ensure that every representation in their privacy policies reflects what they actually do.
Bottom line: Don't say it if you don't do it.
|||||||||||||||
ICANN Data Breach Exposes Customer Data as a Result of System Breakdown

ICANN recently revealed that a defect in its new generic top-level domain (gTLD) application system allowed some applicants to view file names and user names belonging to other applicants. ICANN blamed the incident on a technical glitch associated with how the system was processing attachments, even though the system had been tested prior to launch.1 ICANN explained that the glitch was not the result of a hack or cyber-attack. Nevertheless, it shut down the affected system from April 12 until May 21. As of early May ICANN had yet to notify the affected users or specify how many were actually affected, though it stated that it would do so once its investigation was completed. Some businesses that may not have initially wanted to apply for a new domain name may, however, have considered applying anyway as a defensive strategy to prevent other entities from registering their name and damaging or otherwise harming their brand. Other businesses may have applied for a name in anticipation of a new product launch. For example, in the same blog post in which Google announced its application for its brand domains, it indicated that it had applied for others that the company thinks have interesting and creative potential. Because it remains unclear what data may have been compromised, organizations should try to obtain information about whether their trade or proprietary information was exposed. This incident demonstrates that even sophisticated engineering and technical solutions for protecting data are no substitute for robust privacy practices and data security plans and procedures. Those practices and procedures must be constantly evaluated and modified to address intervening changes in technology, system capacity, or other known or suspected vulnerabilities. All companies that collect, handle, and retain data must be prepared to minimize damage, resolve the issue, and, in most states, notify affected customers. Not only are these good privacy practices; increasingly, they are also required by law.

1 ICANN, TAS Interruption - Frequently Asked Questions, http://newgtlds.icann.org/en/applicants/tas/interruption-faqs
|||||||||||||||
Federal Automobile Black Box Legislation Expands Vehicle Manufacturers' Focus Beyond Safety to Privacy

Event data recorders (EDRs) have been installed in most U.S. vehicles since the 1990s. Though not currently required by law, 85% of cars already have EDRs installed by the vehicle manufacturers.1 Many of these devices are seen as being of limited value in defending against lawsuits because they typically record only a few seconds of data preceding a crash. Since 2004, EDRs have been the subject of limited federal and state regulation. That year California passed a law requiring that vehicle manufacturers disclose the presence of an EDR in the car's owner's manual. The National Highway Traffic Safety Administration (NHTSA) adopted a rule in August 2006 that applied this requirement nationally and also set minimum standards for the information EDRs must collect. The two most viable Congressional bills for mandatory EDR installation are the Moving Ahead for Progress in the 21st Century Act (MAP-21), S. 1813, approved by the Senate in April, and H.R. 14, one of several highway funding measures pending in the House. H.R. 14 includes a section on EDRs that is identical to the EDR provision in S. 1813 and has been pending since March. One of the House funding measures is expected to be voted on and reconciled with the Senate version before the summer recess.

Privacy advocates have raised concerns about EDRs, including data ownership, security, and the potential for misuse of EDR data. They fear, for example, that if information captured by an EDR is shared with insurance companies, it could lead to higher premiums or denial of coverage. If the information is shared with traffic enforcement authorities, it could theoretically lead to ticketing drivers for speeding even when they were not detected by law enforcement. Privacy advocates also fear that EDR technology will inevitably lead to increasingly invasive tracking and monitoring of driver behavior and activities unrelated to safety. Both the Senate and House measures contain provisions that attempt to address these concerns, including assigning ownership of EDR data to the vehicle owner or lessee, and prohibiting access to the data except in specified limited circumstances. However, the bills do not address such important details as how the data must be protected once it is retrieved under one of the exceptions.

The EDR section in MAP-21 mandates that all new passenger motor vehicles sold in the United States include EDRs beginning with model year 2015. The recorders will be required to capture and store data related to motor vehicle safety covering a reasonable time period before, during, and after a motor vehicle crash or airbag deployment, including a rollover. The bill's privacy and data security provisions mandate certain Limitations on Information Retrieval. In particular, all of the data stored in an EDR will be treated as the property of the person that owns or leases the car. Also, data stored on an EDR may not be retrieved by anyone other than the vehicle owner or lessee except in four circumstances: (1) under a court order; (2) when the owner consents; (3) when retrieved in the course of an inspection or investigation authorized by federal law, on the condition that the owner's personally identifiable information and the vehicle identification number are not connected to the retrieved data; and (4) when the information is retrieved to facilitate, or determine the necessity of, an emergency medical response following a motor vehicle crash.
Many of the specific requirements regarding data security standards and implementation would be left to rulemakings and reports by the Department of Transportation (DOT). These include requiring that DOT initiate a rulemaking to specify a format in which EDR data must be made accessible with commercially available equipment, and to establish data security requirements to prevent unauthorized access to the data. MAP-21 appears to be a mixed bag for car manufacturers. Most already voluntarily install EDRs in their vehicles, but the bill appears to significantly limit the ability of manufacturers to access EDR data without the consent of either the vehicle owner or the courts, potentially impeding the identification and resolution of safety flaws. Also, law enforcement access to the data could theoretically make consumers reluctant to purchase cars equipped with this technology (though if the mandate is enacted, they will not have a choice after model year 2015). If MAP-21 is enacted, DOT will be required to study and report on EDR safety and privacy issues, as well as the costs and benefits of the devices. Vehicle manufacturers should closely monitor the progress of this legislation and, if it becomes law, the DOT proceedings that will follow. They should also be prepared to develop carefully tailored privacy and data security strategies as they join the increasing number of industries competing for consumer business on the basis of good privacy practices. Doing so should also help minimize the potential for unwanted attention from the privacy class action bar.
1 National Conference of State Legislatures, available at http://www.ncsl.org/issues-research/telecom/event-data-recorder-quotblack-box-quot-le135.aspx
|||||||||||||||
UPDATES
|||||||||||||||
Netflix Settles Privacy Class Action Lawsuit over Data Retention Practices

The Complaint, filed in 2011, alleged that Netflix failed to destroy the Plaintiffs' rental and payment histories after the Plaintiffs cancelled their Netflix subscriptions. The Video Privacy Protection Act (VPPA) requires deletion of such personally identifiable information as soon as practicable, and in any event no later than one year from the date it no longer serves the purpose for which it was collected (a simple sketch of this retention rule appears after this item). The Plaintiffs continued receiving marketing messages from Netflix that appeared to be generated on the basis of their rental histories, and former customers who rejoined Netflix had their previous video queues reactivated, suggesting that Netflix had retained their viewing histories and preferences. Under the Settlement Agreement, Netflix will remove information about former customers' rental histories one year after their accounts are canceled, and will do the same for current customers. In addition, Netflix will pay a total of $9 million, most of which will be distributed to various nonprofit groups; a smaller portion will be awarded as attorneys' fees. The video rental industry is subject to explicit statutory requirements for handling customer data, and the law's provisions can be enforced by a private right of action, including one seeking monetary damages. (We previously reported, however, that the U.S. Court of Appeals for the 7th Circuit ruled that injunctive relief, not monetary damages, is the only remedy for violations of the VPPA's data destruction provisions.)
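For companies that implement retention rules like this in code, the following Python sketch shows the one-year-after-cancellation rule described above. It is a minimal, hypothetical illustration; the account records, field names, and purge logic are assumptions for the example, not Netflix's actual system.

    # A minimal sketch of a retention sweep: identify canceled accounts
    # whose rental histories are due for deletion.
    from datetime import datetime, timedelta, timezone

    RETENTION_PERIOD = timedelta(days=365)  # one year, per the settlement

    def records_to_purge(accounts, now=None):
        """Return IDs of canceled accounts whose histories should be purged.
        `accounts` is an iterable of (account_id, canceled_at) pairs,
        where canceled_at is None for active accounts."""
        now = now or datetime.now(timezone.utc)
        return [
            account_id
            for account_id, canceled_at in accounts
            if canceled_at is not None and now - canceled_at >= RETENTION_PERIOD
        ]

    # Example: an account canceled 14 months ago is due for purging;
    # a recently canceled account and an active account are not.
    accounts = [
        ("a1", datetime.now(timezone.utc) - timedelta(days=425)),
        ("a2", datetime.now(timezone.utc) - timedelta(days=30)),
        ("a3", None),
    ]
    print(records_to_purge(accounts))  # ['a1']

Running such a sweep on a schedule, rather than relying on ad hoc deletion, is one way to make a privacy policy's retention promises verifiable.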
|||||||||||||||
UK Cookie Compliance Grace Period Expires as ICO Permits Implied Consent

The grace period for complying with the UK rules requiring consent for the use of cookies expired in late May, and the Information Commissioner's Office (ICO) has indicated that implied consent can satisfy the rules. The ICO posted information on its website, including a video, explaining how companies that wish to rely on implied consent may do so.
Members of the public can report concerns about cookies on the ICO website, and the ICO will use those reports to assist with enforcement.
|||||||||||||||
The PCI Data Security Standards Council (PCI SSC) Issues Mobile Payment Acceptance Fact Sheet

The PCI SSC has issued a fact sheet providing merchants with guidance on securely accepting payment cards using mobile devices. Interestingly, a freeze on approvals of card acceptance software for mobile card acceptance, imposed last year by the PCI SSC, remains in effect. As we previously reported, the PCI SSC last year issued payment card industry guidelines for virtualized environments.
|||||||||||||||
Federal Court Grants Class Certification in Song-Beverly Zip Code Action

A federal district court in California has granted class certification in an action alleging that IKEA violated the Song-Beverly Credit Card Act1 by requesting and recording customers' zip codes during credit card transactions. The Act provides that no person or corporation that accepts credit cards for the transaction of business shall request or require, as a condition of accepting a credit card, that the cardholder provide personal identification information, or record such information. As we previously reported, in 2011 the California Supreme Court ruled that zip codes are personal information under the Act. Following this decision, IKEA stopped collecting zip code information; it subsequently offered customers a loyalty program that they could enroll in by providing information that included a physical mailing address. The company argued that the class definition was overbroad because it included these individuals, who voluntarily provided personal information; accordingly, IKEA sought to exclude them from the class. The District Court rejected IKEA's arguments, reasoning that allowing retailers to collect zip code information during a credit card transaction from persons who voluntarily provide that or other personal information for store promotions would undermine the Act's primary goal of preventing store clerks from collecting consumers' personal information. This decision demonstrates that collecting personal information during credit card transactions in California, even if it was voluntarily provided prior to the particular transaction, poses significant risk for retailers.
1 California Civil Code § 1747.08.
|||||||||||||||
Copyright © 2010 St. Ledger-Roty & Olson, LLP.