PRIVACY & INFORMATION LAW UPDATE
Since our first feature article on mobile privacy, the mobile ecosystem has evolved into one in which many providers offer a wide variety of data-driven products and services that can collect, use and disclose customer data. Consumers are engaged and marketed to by sports teams and venue operators, dating services, the gaming industry, fleet management service providers, educational institutions and emerging providers of educational content, retailers and merchants, the hospitality industry, providers of mobile payment systems and traditional financial services providers, political campaigns and even credit reporting services, to name just a few. Privacy issues arise in the mobile context because of the reliance on location-based technologies that can track a user in both physical and digital environments and disclose this data to multiple parties. The new Chair of the FTC recently identified mobile privacy as one of her top priorities, particularly where children, connected cars and connected homes are involved.
But the privacy legal and regulatory framework contains many gaps and inconsistencies that create uncertainty and risk. Among the resulting pitfalls is the tendency to react to specific issues only when confronted with them; the understandable instinct is to focus on achieving business objectives, and limited resources are deployed for product development or are otherwise stretched thin in corporate legal departments. A better "default" approach involves taking proactive measures to identify and manage risk over the long term. Many General Counsel and Privacy Officers are already thinking about this, with varying degrees of cross-functional cooperation.
Of course, when developing risk management measures it is necessary to understand the underlying technology in the context of the applicable laws, regulations, guidelines and industry best practices that together form the privacy legal and regulatory framework. In the U.S., these laws, rules, guidelines and best practices are generally based on fair information principles, the foundation of which is providing transparency through disclosures that give timely, actionable information to users. In the mobile context, state and federal regulators are increasingly requiring that these disclosures take into account screen size and other practical limitations of mobile devices. One approach would be to use layered privacy notices that are available within an app and offered prior to a download or product purchase. The FTC recently advised in its revised "dot.com" disclosure guides that where screen size restricts the ability to make clear, timely material disclosures on a particular mobile device, companies should refrain from offering products or services, or otherwise marketing, over those devices.
In addition, there are heightened requirements for certain classes of sensitive information, including employment, financial, location and children's data. Moreover, the goal of protecting critical infrastructure from cyber threats, as reflected in President Obama's cybersecurity Executive Order, can be expected to draw attention to how mobile devices are used to access smart grid, financial services, telephone and other critical infrastructure products and services.
It is equally important to understand how commercial relationships that rely on mobile technology can pose privacy risks for business, and then to devise and implement measures that address those risks. These measures include negotiating with third-party service providers for adequate privacy, confidentiality and data security terms in commercial agreements. At a minimum, representation and warranty terms should require compliance with applicable law, recommendations and current industry standards and best practices. In addition, enhanced indemnification and limitation of liability terms should be included.
Technology is rapidly evolving and, along with it, an ever-growing array of mobile products and services. U.S. and international policymakers are trying to modernize and close gaps in applicable privacy law. Accordingly, we are devoting this issue of the STLRO Privacy & Information Law Update to recent legal and policy developments involving mobile privacy.
Court Limits School Searches of Student Cell Phones
On March 28, 2013 the U.S. Court of Appeals for the Sixth Circuit ruled that a school's search of a student's cell phone was an unreasonable search in violation of the Fourth Amendment. The ruling provides useful guidance to school systems that are trying to address the ubiquitous presence and use of mobile devices by students in school settings. School systems should carefully review their policies for searching students' mobile devices to minimize the potential of running afoul of the Fourth Amendment or state privacy laws.
In G.C. v. Owensboro Public Schools,1 a teacher observed a student, G.C., sending a text message in class in violation of the school's cell phone policy. G.C. attended the school as an out-of-district student subject to the recommendation of the Principal and the approval of the Superintendent. He had a history of misconduct and drug use and was viewed as potentially suicidal. The teacher confiscated G.C.'s cell phone and gave it to a principal, who read several of G.C.'s text messages to see if there was evidence that he was considering harming himself or others. The Principal recommended that G.C.'s out-of-district attendance privilege be revoked, and the Superintendent agreed. G.C. subsequently filed an action in federal court alleging an unreasonable search of his cell phone in violation of the Fourth Amendment. He also alleged other claims, including a violation of due process in connection with the revocation of his out-of-district student status.
In addressing the Fourth Amendment claim, the Court: 1) relied on New Jersey v. T.L.O.,2 a 1985 Supreme Court case that involved a Fourth Amendment challenge to a school's search of a student's purse; and 2) examined two district court decisions, one of which upheld3 a school's search of a student's cell phone and the other of which invalidated4 such a search as a fishing expedition. The Court then concluded that:
A search is justified at its inception if there is reasonable suspicion that a search will uncover evidence of further wrongdoing or of injury to the student or another. Not all infractions involving cellphones will present such indications. Moreover, even assuming that a search of the phone were justified, the scope of the search must be tailored to the nature of the infraction and must be related to the objectives of the search.
Applying this two-part test, the Court concluded that using a cell phone on school grounds does not automatically trigger "an essentially unlimited right enabling a school official to search any content stored on the phone that is not related either substantively or temporally to the infraction."5
In reaching this conclusion the Court rejected the school system's argument that its concerns about G.C.'s mental health, along with his disciplinary record and history of misconduct, justified the search. G.C. was observed texting in class in violation of policy, for which his phone was confiscated. The Court emphasized that "general background knowledge of drug abuse or depressive tendencies," without more, does not enable a school official to search a student's cell phone when a search would otherwise be unwarranted.6
1 G.C. v. Owensboro Pub. Sch., No. 4:09-cv-102 (6th Cir. Mar. 28, 2013).
On March 12, 2013 coordinated State Attorney General (AG) press releases announced that Google had entered into a settlement with the 38 states that were investigating its Street View mapping project, during which Wi-Fi-equipped cars collected snippets of emails and other information transmitted over unsecured wireless networks. The terms of the settlement require that Google pay $7 million to the participating states; the amounts will be determined by the states' Executive Committee of the Multistate Working Group. In addition to the payment, Google entered into an Assurance of Voluntary Compliance (AVC) with the AGs in which it agreed to a number of remedial and proactive measures, including deleting the Wi-Fi data at issue and obtaining consent before collecting such data in the future.
The data was collected between 2008 and 2010, before many wireless routers had encryption settings in place as a default. Google claimed it mistakenly collected the data due to a piece of experimental computer code included in the cars' software. The company further claimed the data was not used in connection with any Google services.
Pursuant to the settlement and the AVC, Google will destroy the data it collected in the United States. Google also agreed to launch an employee privacy training program that it must continue for 10 years, and to create certain privacy educational materials and make them available over Google and other platforms. Google is still negotiating with various EU authorities on how to handle the data it collected there. Google's Street View mapping project is also under investigation in Canada, Australia and Hong Kong.
Privacy advocates had raised questions about whether Google's conduct may have violated other federal and state laws, including the federal Communications Act and the Pen/Trap Act. The FCC fined Google $25,000 for impeding its investigation into this matter but indicated that it would not take any further action against Google.1
This settlement demonstrates that state AGs are adding dimension to federal privacy protection initiatives by using their authority to require businesses that offer data-driven products and services to incorporate privacy protections during design and development.
1 Google, Inc., Notice of Apparent Liability for Forfeiture, 27 FCC Rcd 4012 (Enf. Bur. Apr. 12, 2012), available at http://transition.fcc.gov/Daily_Releases/Daily_Business/2012/db0416/DA-12-592A1.pdf.
A number of privacy proposals have recently been introduced in state legislatures and the U.S. Congress to modernize Fourth Amendment law, bringing it more in line with how reasonable expectations of privacy have evolved with changing technology and behavior. These proposals reflect a broader debate currently taking place in Washington1, state legislatures2 and the courts3 about how to ensure adequate privacy protections for location and electronic communications while allowing law enforcement access to this data, a growing amount of which resides in the cloud, for investigative purposes.
Two bills introduced in the Texas House of Representatives would address this issue by creating a unified "probable cause" standard before authorities would be able to access location and electronic communications data.
Texas House Bill (TX HB) 1608, introduced on February 21, 2013, would require that law enforcement get a warrant based on probable cause to obtain location information, including GPS-based data.4 Exceptions include theft of a mobile device or a life-threatening situation, such as a kidnapping.5 In addition to GPS information, the bill would protect pen register, trap and trace, and electronic serial number data. (Pen registers record the phone numbers dialed by an individual, while trap and trace devices record the numbers of incoming calls to an individual.) The measure would also impose certain reporting requirements on mobile carriers and Texas law enforcement agencies, including an annual transparency report from carriers to the Texas Department of Criminal Justice (TDCJ). The reports would be required to disclose:
The reports would also be required to include the amount that each agency was billed by the carrier or electronic communications service provider6 for each request.7
Additionally, prosecutors who obtain search warrants for location data would be required to report annually to the TDCJ about the information that was relied on for each warrant, including:
Another measure, Texas House Bill (TX HB) 3164, introduced on March 7, 2013, would close a loophole in state law governing warrant requirements for access to certain electronic communications. It would require law enforcement to obtain a warrant based on probable cause before accessing stored wire and electronic communications. This proposal is similar to recent efforts in the U.S. Congress to reform the Electronic Communications Privacy Act (ECPA), discussed below.9
TX HB 3164 would protect emails and other electronic communications such as instant messaging or Internet Relay Chat (IRC) messages, no matter when they were sent, or even if they were never sent at all. Emails would be protected irrespective of their location in a user's Inbox, sent or draft folders; for example, emails left in a user's draft folder (but not sent) would be protected. The bill offers the same level of protection to other forms of electronic communications, including instant message and IRC communications. Authorities would still be able to seek information from email or cloud service providers, such as routing data, "to" and "from" data, or other information that could be used to demonstrate probable cause under the federal Stored Communications Act (SCA) and the Texas code.10
Under the SCA a warrant is already required for emails that are less than 180 days old, and this bill would extend that protection to older emails that may be stored indefinitely. Many courts have interpreted the federal SCA as a statutory floor,11 allowing states to provide additional privacy protection to their citizens.
Meanwhile, California Senate Bill 467, introduced March 20 by State Senator Mark Leno, would prohibit government entities from obtaining a stored wire or electronic communication without a warrant.12 Exceptions include when the message's sender or recipient consents to its disclosure to law enforcement, or in the event of an emergency that could result in death or serious harm.13 Senator Leno also sponsored a bill in the last legislative session that would have required law enforcement agents to get a warrant to obtain the location information of an electronic device.14 That bill was approved by the legislature before being vetoed by Governor Jerry Brown.
Past efforts to pass similar legislation in the U.S. Congress have failed. Both Houses are considering renewed proposals to update the law to address privacy concerns and new technology, particularly mobile technologies and cloud-based computing. The most recent proposal, the Electronic Communications Privacy Act Amendments Act ("Act"), was introduced in the Senate on March 19, 2013 by Patrick Leahy (D-VT) and Mike Lee (R-UT). It would require authorities to obtain a warrant for access to certain messages from email providers15 by eliminating the current "180-day rule." Current law permits warrantless access to unopened emails that have been stored online for over 180 days,16 and to opened emails regardless of whether the messages are over or under 180 days old. The Act would require a warrant for any email, opened or unopened, over or under 180 days old, subject to existing exceptions, such as life-or-death emergencies or user consent. In this respect the framework proposed in this bill is similar to those proposed in the Texas and California email privacy bills.
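The shift from the current 180-day rule to the proposed uniform warrant standard can be sketched as a pair of simple predicates. This is an illustrative sketch only, not legal advice: the function names are hypothetical, and the many statutory exceptions (emergencies, consent) are deliberately not modeled.

```python
def warrant_required_current(age_days, opened):
    """Under the current '180-day rule,' a warrant is required only for
    unopened emails stored online for 180 days or less; older or already
    opened messages may be obtained without one."""
    return (not opened) and age_days <= 180


def warrant_required_proposed(age_days, opened):
    """The ECPA Amendments Act would require a warrant for any email,
    opened or unopened, regardless of age (existing exceptions such as
    life-or-death emergencies or user consent are not modeled here)."""
    return True


# A two-year-old opened email: warrantless today, warrant-protected under the Act.
print(warrant_required_current(730, opened=True))   # False
print(warrant_required_proposed(730, opened=True))  # True
```

The sketch makes the gap concrete: under current law, protection turns on two facts about the message (age and opened status) that have no bearing on a user's expectation of privacy, which is precisely the distinction the Act would eliminate.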
The House is considering a bill that addresses geolocation data in addition to electronic communications. The Online Communications and Geolocation Protection Act, H.R. 983, was introduced on March 6, 2013 by Rep. Zoe Lofgren (D-CA), and co-sponsored by Ted Poe (R-TX) and Suzan DelBene (D-WA). Under this measure it would be unlawful for government entities to intercept or disclose an individual's geolocation data unless one or more of the bill's exceptions apply.17 These exceptions include where authorities have a warrant, and where an individual's geolocation data was collected while conducting surveillance under the Foreign Intelligence Surveillance Act.18 Other exceptions are similar to those in the state bills, such as where a person provides consent or in the case of a life-threatening emergency.19
The tech industry and privacy advocates are pushing for a warrant requirement, and representatives of the Justice Department have publicly acknowledged that there is no principled basis for maintaining the distinction between emails that are more than 180 days old and those that are more recent.
The Online Communications and Geolocation Protection Act and the ECPA Amendments Act are the most recent efforts by the U.S. Congress to update privacy protections in light of how technology is used to communicate and in a way that promotes further innovation. In this respect, their goals are similar to those reflected in the state proposals.
Irrespective of the extent to which any of the above-discussed bills differ on specific details, both federal and state proposals reflect attempts to address distinctions in Fourth Amendment law that are being rendered increasingly meaningless by the way that technology is developing and being adopted to communicate -- particularly cloud and mobile computing. By modernizing the legal framework for protecting consumer privacy in the criminal context, these measures could help instill trust in the level of privacy protection afforded communications in existing and emerging technologies, which is essential for their further growth and adoption.
1 Location Privacy Protection Act, S. 1223, 112th Cong. (2011); Online Communications and Geolocation Protection Act, H.R. 983, 113th Cong. (2013), available at http://beta.congress.gov/bill/113th-congress/house-bill/983/text. See also the Electronic Communications Privacy Act (ECPA) Amendments Act, available at http://www.leahy.senate.gov/.
New FTC Chair to Continue Aggressive Privacy Enforcement Agenda with Added Priorities
In remarks at the March 2013 Global Privacy Summit of the International Association of Privacy Professionals, Edith Ramirez, the newly appointed Chair of the Federal Trade Commission, announced that she will continue to pursue her predecessor's aggressive privacy enforcement agenda. Children's and mobile privacy will remain top priorities for the agency, along with a new initiative involving connected environments and devices such as vehicles, appliances and televisions. Ramirez expects this initiative to launch with a series of workshops this year addressing privacy and the Internet of Things.
Interestingly, Ramirez indicated that the FTC could use its unfairness (as distinct from deceptive practices) authority under Section 5 of the Federal Trade Commission Act in the context of privacy enforcement where actions that cause business harm also cause harm to consumers. She cited the recent rent-to-own spying settlement as an example of how the agency recently used its unfairness authority in the privacy context and suggested that it might do so again in the future if warranted.
Ramirez also emphasized that the U.S. Government will continue to work with the Asia-Pacific Economic Cooperation (APEC) forum and European regulators to move toward greater interoperability of global privacy legal and regulatory frameworks.
FTC Issues Staff Report on Mobile Payments
On March 8, 2013 the FTC issued its staff report on Mobile Payments, Paper, Plastic or Mobile? An FTC Workshop on Mobile Payments (Report). The Report, released nearly one year after an FTC-convened public workshop that included representatives of the payment card industry, retailers, mobile payment providers and consumer groups, provides useful guidance to stakeholders in the mobile transactions sector. Specifically, it sets out recommendations in three critical areas that were identified during the workshop as posing particular concerns for business and consumers. These concerns could impede more accelerated adoption of the relatively nascent mobile payment industry. Those areas involve Privacy, Data Security, and Dispute Resolution.
The Report notes that uncertainties about the application of existing consumer protection and privacy laws to mobile transactions -- including whether the definition of "money" applies to emerging virtual currencies -- may have decelerated the adoption of mobile payment systems. In the long term, the Report will likely lay a foundation for greater regulatory and legal certainty for businesses in the ecosystem. In the near term, it provides useful guidance about what the FTC considers deceptive or unfair practices that cause consumer harm in violation of the Federal Trade Commission Act.
(1) Privacy. The Report finds that there are significant consumer privacy concerns involving mobile payments. Many mobile payment systems include new actors (device manufacturers, developers, coupon and loyalty program administrators, and carriers) that are able to collect and consolidate detailed consumer data, beyond financial and purchase history information, in ways that are not feasible with traditional payment systems. In contrast, banks, merchants and payment card networks in traditional payment systems generally lack the ability to collect information about specific consumer purchases or other such data. To address privacy concerns, the FTC recommends that mobile payment system providers implement privacy by design during product development, provide greater transparency about data collection and use practices, and offer simplified, actionable choices to consumers.
(2) Data Security. Concerns about the security of financial data could similarly impede the ubiquitous adoption of mobile payment services and systems. The Report notes that although enhanced security technology exists -- such as end-to-end encryption and dynamic data authentication -- it is unclear whether and to what extent companies in the mobile payments ecosystem are employing these technologies. Accordingly, the FTC recommends that mobile transaction providers increase data security as sensitive financial information moves through the payment channel, and encourages the adoption of strong security measures by all companies in the mobile payments chain.
(3) Dispute Resolution/Billing Practices. A. Dispute Resolution. The Report identifies a gap in current law governing how fraudulent or unauthorized transactions are resolved. Payment card disputes are governed by statutory protections, which cap liability in the case of credit cards at $50.00 per unauthorized charge. On the other hand, there are no similar federal protections for fraudulent or unauthorized mobile transactions paid for by pre-funded or stored value cards such as loyalty or gift cards, general purpose reloadable cards (GPRs), pre-paid debit cards, or even mobile carrier bills (although, as the FTC noted, the Consumer Financial Protection Bureau is currently examining the extension of protections to GPRs). Instead, each mobile payment method provides varying degrees of protection, complicating the landscape for consumers.
In order to limit confusion, the FTC recommends that businesses develop clear policies for addressing fraudulent unauthorized charges, and clearly convey these policies to consumers. The agency also called upon policymakers to consider the benefits of providing consistent protections across the mobile payment ecosystem, and weigh the benefits of providing these protections against the costs of implementation.
B. Mobile Carrier Billing. Carrier billing practices raise a unique challenge with regard to third parties placing fraudulent charges onto consumers' mobile carrier bills, a practice known as cramming. The FTC views cramming as a significant problem that appears to be on the rise, and it should be a cause of concern for all mobile payments stakeholders because it threatens to undermine carrier billing as a legitimate and trusted payment option.
Accordingly, the FTC recommended that: (1) to prevent cramming, consumers should be able to block all third-party charges on their mobile accounts, including being able to block third-party charges on individual accounts operated by minors in the household; (2) mobile carriers should clearly and prominently inform their customers that third-party charges may be placed on customer accounts and explain how to block such charges, both at the time accounts are established and when they are renewed; and (3) mobile carriers should establish a clear and consistent process for customers to dispute suspicious charges on their accounts and obtain reimbursement.
The Report also touches on how other countries are addressing mobile payment issues, opining that U.S. authorities could learn from their global counterparts, particularly since the adoption of mobile payments is considerably further along elsewhere in the world. The FTC notes that the European Commission and the Organization for Economic Cooperation and Development are each in the process of preparing policy guidance to governments on many of the same issues addressed in the Report.
The Report is the latest in a series of initiatives and actions involving mobile privacy, including those reported here. It suggests that the FTC is zeroing in on the mobile transactions sector while the industry is in what the FTC may perceive as its still-early stages. Companies in the mobile transactions supply chain should expect increased enforcement activity that will shape the regulatory treatment of mobile payment systems, products and services. Therefore, providers (other than those regulated by the Federal Reserve, or mobile telecommunications carriers to the extent they are engaged in common carrier activities) should: (1) anticipate increased FTC monitoring; (2) consider adopting privacy by design when developing products and services; and (3) evaluate their data collection, use, disclosure and retention practices against the FTC's recommendations.
On February 27, 2013 the Data Protection Working Party (the Working Party) of the European Commission (the Commission) adopted an opinion seeking to clarify the legal rules regarding data collection and processing through applications on smart devices, with a particular focus on wireless devices. In doing so, the Working Party is aligning the EU privacy protection framework with current uses of and behaviors involving technology, including mobile technologies.
The opinion focuses on the consent requirement for data collection; the need for data limitation, minimization, and security; and the obligation to inform users of their rights regarding data collection. It also emphasizes the importance of transparent and secure data practices in application development to address the potential privacy risks arising from an uninformed lone app developer or an uncoordinated engineering department in a large company, either of which can create an app that reaches a global audience in a short period of time.
The opinion asserts that a key risk to consumers in this area is a lack of transparency, which undermines the EU privacy framework's emphasis on informed consent. According to the opinion, users' privacy is placed at great risk because apps frequently have access to information already contained on a mobile device, including location information, personal data stored by the consumer, and data from other device sensors. Consumers may be unaware of the extent to which apps collect and process personal information and therefore lack the ability to give informed consent to these data practices. The opinion also seeks to address the trend toward data maximization and elasticity of purposes, where apps collect information from the user that is unrelated or unnecessary to the app's functioning. Also of concern are apps' collection and use of children's data and the transmission of personal data to third parties.
Under the EU's Data Protection Directive (Directive),1 subject to very limited exceptions, an entity must get the affirmative consent of an individual, after having provided clear and comprehensive information, before storing personal information or gaining access to personal information already stored. This requirement applies to all individuals living in the European Economic Area, regardless of the location of the service provider. The Directive's requirements are not transferable, for example, by contracting with a third party or by offloading via unilateral declarations in terms of service. This framework requires an entirely different approach for apps launching in the EU than for apps released in the U.S., where blanket privacy policies have been the norm.
The legal requirements discussed by the opinion apply to the collection and processing of personal data, i.e., data that relates to an individual. The opinion gives numerous examples of types of data that are considered personal for purposes of the rule, including: location, contacts, unique device and customer identifiers, the identity of the data subject, the identity of the phone, credit card and payment data, phone call logs, SMS or instant messages, browsing history, email, information society service authentication credentials (especially for services with social features), pictures and videos, and biometrics. This is not an exhaustive list, however, as any data that can be used to identify an individual user or device owner can be considered personal data.
Furthermore, the consent, notice, and data minimization requirements discussed in the opinion may apply to any number of parties involved in the development and operation of an app for a smart device. If any party determines the purposes and means of the processing of personal data, that entity is classified as a controller under EU law and must comply with the entirety of the Data Protection Directive. This could include third party and primary developers, app stores, advertising networks, OS developers, device manufacturers, and other third party data processors that process data for their own purposes.
Ultimately, the opinion focuses on the requirement that apps obtain user consent before collecting or accessing already stored personal data. The app must give the consumer clear and comprehensive information about the entity collecting the data, the type of data being collected, the purposes of the collection, how the data will be used both by the primary party and in conjunction with third parties, and how the user may exercise his or her right to withdraw consent. The request for consent cannot come in the form of a single screen with terms of service and a "Yes I accept" button. Nor will consent be inferred when a user clicks "install" after having been presented with a data collection policy. The consumer must be given the opportunity to freely grant or refuse consent for each specific type of data and for each specific purpose for which the data is being collected.
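The granularity this requires can be sketched as a simple data model: consent is recorded per (data type, purpose) pair rather than as a single blanket acceptance. This is an illustrative sketch only; the `ConsentLedger` class and its method names are hypothetical and do not correspond to any EU-mandated API.

```python
class ConsentLedger:
    """Toy model of per-data-type, per-purpose consent, as the Working
    Party opinion describes: each combination needs its own grant."""

    def __init__(self):
        self._grants = set()  # set of (data_type, purpose) pairs

    def grant(self, data_type, purpose):
        """Record an affirmative grant for one data type and one purpose."""
        self._grants.add((data_type, purpose))

    def withdraw(self, data_type, purpose):
        """Users must be able to withdraw consent they have given."""
        self._grants.discard((data_type, purpose))

    def may_process(self, data_type, purpose):
        """A blanket 'I accept' is not enough: processing is permitted
        only for combinations the user specifically granted."""
        return (data_type, purpose) in self._grants


ledger = ConsentLedger()
ledger.grant("location", "map_display")

# Consent to location data for map display does not extend to ad targeting.
print(ledger.may_process("location", "map_display"))   # True
print(ledger.may_process("location", "ad_targeting"))  # False
```

The design point is that the unit of consent is the pair, not the app: adding a new purpose for already collected data requires going back to the user, which mirrors the opinion's rejection of "elasticity of purposes."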
This opinion highlights the level of specificity required under EU data protection law. In order to comply, app developers and operators must conduct a detailed inspection of each type of data collection, access, and processing practice to ensure that consumers in the EU are given sufficient notice about the scope of data practices and an opportunity to grant or decline consent for each practice. As the EU moves forward in revising its data protection framework, the various member state data protection authorities can be expected to step up enforcement activity against data practices that fail to obtain consent or exceed the scope of consent given. These actions will most likely focus on the risks posed to user privacy by the proliferation of mobile apps noted in the opinion. Multinational corporations and other large companies, as well as well-intentioned developers at smaller entities, will be carefully scrutinized for data collection or processing practices in mobile apps that could violate EU law, which in turn could lead to monetary fines and reputational harm.
1 Data Protection Directive, 2002/58/EC, as revised by 2009/136/EC.
FTC Settles with HTC Over Mobile Device Security Vulnerabilities
The settlement, the first of its kind involving a mobile device manufacturer, signals the FTC's unambiguous expectation that businesses operating in the mobile ecosystem -- including device manufacturers -- adopt the privacy by design framework for protecting user privacy recommended by the agency in its 2012 consumer privacy report.
This action stemmed from security vulnerabilities that the FTC alleged were created by HTC's customization of pre-installed apps and components on its Android-based mobile devices, which consumers did not have the option to uninstall or remove. According to the Complaint, the apps and components were modified to differentiate HTC's products from those of competitors that also manufactured Android-based mobile devices. HTC's modifications enabled third-party apps to access user data by circumventing Google's permission-based security model, allowing malicious programs to be added that could perform a variety of intrusive functions. These functions could enable bad actors, as well as benign applications, to access a user's online activity, text messages, location data, email address and phone number, and even activate the device's microphone. The FTC concluded that these security flaws should have been detected and adequately addressed by HTC during its design of the devices. Specifically, the FTC alleged that HTC failed to:
The FTC also alleged that HTC's user manual misrepresented the security of the devices when HTC promised that no third-party app would be able to access a user's location or other information without the user's prior consent, as demonstrated by certain affirmative actions taken by the user.
All of the foregoing allegedly put HTC's users at risk of physical, financial and other injury, and amounted to unfair or deceptive acts under Section 5 of the Federal Trade Commission Act.1
The FTC seemed particularly focused on HTC's failure to include permission-check code, which it alleged would have protected user data from unauthorized access and guarded against the risk that a user's device or its functionality would be commandeered by a third party. This type of code checks the permissions a user has granted to third-party applications when an app attempts to access user data stored on the device or activate a particular functionality. Once permission is granted through the user's consent, the operating system typically allows access to the data or activates the functionality for which permission was given. HTC allegedly undermined Android's version of this permission-based security model by introducing numerous permission re-delegation vulnerabilities through its custom, pre-installed applications.
Permission re-delegation occurs when one application that has permission to access sensitive information or sensitive device functionality provides another application that has not been given the same level of permission with access to that information or functionality. According to the FTC, HTC's permission re-delegation could have enabled third-party apps to access users' personal information stored in HTC's pre-installed native apps, or perform other functions on the device, without first obtaining consent to do so.
By way of example, the FTC described how, under the Android operating system's permission-based security framework, an app must receive a user's permission to access the device's microphone, since the ability to record audio is considered sensitive functionality. However, HTC pre-installed on its devices a custom voice recorder app that, if exploited, could provide any third-party application with access to the device's microphone, even if that application had not requested permission to do so.
According to the FTC, this scenario could have been prevented had HTC included simple, well-documented permission-check code in its voice recorder app to verify that the third-party application had requested the necessary permission. Instead, HTC's failure to include permission-check code in its customized voice recorder app created a risk that any third-party app could exploit that vulnerability and commandeer the device's microphone functionality on its own behalf.
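The permission re-delegation flaw, and the missing check that would have closed it, can be sketched in miniature. The following Python model is illustrative only -- the names and the simplified permission model are ours, not Android's actual API. A privileged pre-installed app holds the audio permission and acts on behalf of callers; the fix is to verify that the caller itself holds the permission before the privileged app exercises it:

```python
# Simplified model of Android-style permissions: each app is granted a
# set of permissions at install time.
GRANTED = {
    "voice_recorder": {"RECORD_AUDIO"},   # privileged pre-installed app
    "flashlight_app": set(),              # third-party app, no audio permission
}

def record_audio(app: str) -> str:
    """The OS permits recording only if the calling app holds RECORD_AUDIO."""
    if "RECORD_AUDIO" not in GRANTED[app]:
        raise PermissionError(f"{app} lacks RECORD_AUDIO")
    return "audio data"

def vulnerable_record_for(caller: str) -> str:
    """Permission re-delegation: the privileged recorder records on behalf
    of any caller, without checking the caller's own permissions."""
    return record_audio("voice_recorder")

def fixed_record_for(caller: str) -> str:
    """The missing permission check: verify the *caller* holds the
    permission before exercising it on the caller's behalf."""
    if "RECORD_AUDIO" not in GRANTED[caller]:
        raise PermissionError(f"{caller} lacks RECORD_AUDIO")
    return record_audio("voice_recorder")

# The unpermissioned third-party app succeeds through the vulnerable path...
print(vulnerable_record_for("flashlight_app"))  # "audio data" leaks

# ...but the permission check blocks it.
try:
    fixed_record_for("flashlight_app")
except PermissionError as e:
    print("blocked:", e)
```

The vulnerable path shows why re-delegation defeats the permission model: the OS sees only the privileged app's credentials, so the check must be performed by the privileged app itself before acting for another.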
This settlement illustrates the FTC's expectations about how companies operating in the mobile ecosystem should implement privacy by design throughout all phases of design and development. Companies can expect to be held accountable for privacy and security vulnerabilities that are either undetected or inadequately addressed by engineers during a product's design or development. Accordingly, companies that offer mobile products and services should, at a minimum: (1) implement adequate employee training for detecting potential privacy and security flaws; (2) perform adequate data security due diligence on third-party applications; and (3) review user manuals, FAQs, privacy and other policies that address user privacy and data protection, and make appropriate modifications.
1 In re HTC America Inc., File No. 122 3049, Complaint at 8, citing 15 U.S.C. § 45(a) (2006).
COPPA-Like Protections Proposed in California
On February 12, 2013, California Democratic Assemblywoman Nora Campos introduced AB 319, legislation that would require operators of commercial websites or online services that are directed at minors, or that have actual knowledge that they are collecting information from minors, to:
Unlike the federal Children's Online Privacy Protection Act (COPPA),1 AB 319 defines "minor" as someone who is under the age of 18. Under COPPA, a minor is a person who is under 13. Arguably, this inconsistency could raise a preemption challenge if AB 319 becomes law because COPPA preempts inconsistent state laws.2
In 2009, Maine enacted a law that would have exceeded COPPA's protections by similarly prohibiting the collection of personal information from minors under 18, among other restrictions. It was repealed in 2010.
Companies that market to children and teens should closely monitor the progress of AB 319. If enacted, it could limit the ability to engage or market to teens online, even though Congress, when it enacted COPPA, rejected treating teens and young children alike for purposes of protecting their online privacy. The FTC likewise considered but rejected raising the age to which the COPPA rule applies during its recent proceedings to update the rule.
1 15 U.S.C. § 6501 et seq.
COPPA Amendments Take Effect July 1
The amended COPPA Rule takes effect July 1. Now is the time to review your data practices and make any necessary modifications to address risk. In order to do so, you must be familiar with the amendments adopted by the FTC in December and their interplay with the existing COPPA regime. This review should be part of, but not a substitute for, your ongoing evaluation of your COPPA compliance.
As we reported, the FTC continues to aggressively enforce COPPA against operators of a wide swath of child-directed and general- or mixed-audience websites, apps and online services. The new FTC Chair indicated last month that children's online privacy will continue to be a top enforcement priority. There has also been an uptick in COPPA investigative and enforcement proceedings by state Attorneys General.
Many of the amendments require very fact-specific assessments of such ambiguous issues as when a site is "directed to children" under 13, when a party has "actual knowledge" that it is collecting information from underage children, or whether a persistent identifier is used solely to "support internal operations" as defined under the rule.
The following changes could have a significant impact on your COPPA risk management strategy:
INTERNATIONAL PRIVACY DISPATCHES
On April 2, 2013, the Article 29 Working Party (Working Party) adopted an important opinion that applies to data controllers processing data in the EU. The opinion analyzes and interprets a core foundation of data protection, the principle of purpose limitation, which protects data subjects by limiting the collection and processing of their data. This principle is set out in Article 6(1)(b) of Directive 95/46/EC, the EU Data Protection Directive (Directive). The opinion offers practical guidance on the application of purpose limitation under the current legal framework and offers policy recommendations for the future. The guidance includes illustrative examples and acceptable privacy notices.
The opinion reflects the Working Party's concern that the precise meaning of purpose limitation, and of its exceptions, is currently the subject of debate that calls for clarification of the principle's scope and function. To that end, the Working Party examines the two essential components of purpose limitation: (1) purpose specification and (2) compatible use.
In broad terms, personal data must be collected for "specified, explicit, and legitimate" purposes. Data collected for a specific purpose may not be "further processed in a way incompatible with those purposes."
The purpose for collecting personal data must be:
Annex 3 of the opinion provides examples to illustrate practical application of purpose specification in various instances, including: a local shop selling to local people in a small town and collecting limited customer information; a social networking website operating across Europe that by its nature targets a broad user group across different cultures; a data controller that provides different services (in which case oversimplification should be avoided in favor of sufficient granularity to ensure that all users are put on notice of the different uses of their data); gaming websites aimed at teenagers; and a government website that provides advice to the mentally ill. The examples are intended to demonstrate that the overall context -- in particular the reasonable expectations of the data subjects and the extent to which the parties concerned have a common understanding of the purposes of the processing -- will determine, to a large extent, the level of detail necessary.
Processing personal data for purposes that are incompatible with the purpose identified prior to collection is illegal. In such circumstances, data subjects must be provided with new privacy notices specifying any further purposes, and the opportunity to give or withhold appropriate consent. At the same time, the Working Party notes that processing for a purpose other than the one for which the data was collected is not necessarily incompatible. Instead, compatibility must be assessed on a case-by-case basis, taking into account all relevant circumstances, including the following factors:
Annex 4 of the opinion includes examples that apply these factors to a wide variety of contexts, including predictive analysis of purchasing history and behavioral advertising, location tracking over mobile devices, childrens data, photo sharing, vehicle ownership information, air passenger name records, smart meter data and health data.
Big & Open Data.
The opinion also addresses application of the elements of the purpose limitation principle to big and open data.
"Big data" refers to "the exponential growth in availability and automated use of information [in] digital datasets held by corporations, governments and other large organizations," which are then extensively analyzed using computer algorithms. Risks associated with big data include:
"Open data" refers to publicly available information that can be accessed by private entities. According to the Working Party, open data projects often involve: (i) making entire databases available, (ii) in standardized electronic format, (iii) to any applicant without a screening process, (iv) free of charge, and (v) for any commercial or non-commercial purposes under an open license.
While recognizing that the characteristics of open and big data are a key driver of innovation, the Working Party cautions that their ubiquity and accessibility present risks that require adequate safeguards. The current framework may not provide sufficient guidance about reuse. To address this concern, the opinion recommends different safeguards for different scenarios, including where the data will be used to identify trends and make correlations, and where it is available on the Internet in digitally searchable and machine-readable formats that can identify individuals. For the former, the opinion recommends functionally separating the processing of the data and guaranteeing its confidentiality and security; for the latter, obtaining informed, opt-in consent.
Historical, Statistical or Scientific Processing.
The opinion proposes certain changes to the proposed Data Protection Regulation intended to provide greater legal certainty regarding the application of the data processing principles, and related safeguards, to initial and further processing for historical, statistical or scientific purposes. The opinion hints that further recommendations in this area will be forthcoming.
On March 11, 2013, the Supreme Judicial Court of Massachusetts ruled in Tyler v. Michaels Stores, Inc.1 that an individual's zip code may well be personal information under Massachusetts law. The issue was certified in the form of three questions to the Court by a federal district court judge presiding over an underlying class action. The Massachusetts court's decision potentially paves the way for an uptick in class action litigation against merchants who collect zip code information during credit card transactions. A similar ruling by the California Supreme Court in Pineda v. Williams-Sonoma Stores, Inc.,2 construing California's Song-Beverly Credit Card Act,3 resulted in the filing of hundreds of class actions against merchants who collected zip code information for credit card transactions at the point of purchase.
The Plaintiff, Tyler, was asked to provide her zip code during a number of credit card purchases at a Michaels store. She did so under the mistaken impression that her zip code was required to complete the transactions. Michaels used Tyler's name and zip code to search other commercially available databases to find her address and telephone number, and Tyler began receiving unsolicited marketing material from the company.
Tyler subsequently filed a federal class action4 in which she alleged that Michaels maintained a policy of writing customers' names, credit card numbers, and zip codes on electronic credit card transaction forms for each credit card purchase. Tyler's credit card provider did not require this information to be requested or maintained. She further alleged that Michaels' data collection practices, including its request for her zip code, violated a provision of Massachusetts law5 that prohibits retailers from seeking and recording personal information on the transaction form. That provision provides, in relevant part:
No person, firm, partnership, corporation or other business entity that accepts a credit card for a business transaction shall write, cause to be written or require that a credit card holder write personal identification information, not required by the credit card issuer, on the credit card transaction form.6
Although the federal district court found that Tyler sufficiently alleged facts to support a per se violation of the law, it granted Michaels' dismissal motion on the ground that Tyler failed to allege an injury in the nature of consumer fraud. The court found that unlike the intent of Song-Beverly, which is to prevent merchants from collecting and using personal information for marketing, the intent of the Massachusetts law is to prevent consumer fraud. However, the court rejected Michaels' argument that its entry of customer transaction data in computer terminals did not create a "credit card transaction form" within the meaning of the statute, finding instead that §105(a) can apply to electronic forms created at the point of purchase.
The Massachusetts High Court construed §105(a) more broadly, ruling that, based on the text, title, caption and legislative history of §105(a), the legislature's intent was not limited to preventing consumer fraud. Moreover, it found that cognizable harm can be caused when a merchant collects personal identification information in violation of §105(a) and "uses [it] for its own business purposes, [including] sending the customer unwanted marketing materials."7
The Court's decision could breathe new life into the underlying class action. And like the California Supreme Court's decision in Pineda, the Massachusetts Supreme Judicial Court's ruling in this case could open the door to a wave of similar class actions.
In light of the Court's decision, retailers with locations in Massachusetts should review their point-of-sale data collection practices, including for online transactions, and update those practices as warranted. Moreover, in addition to California, a number of other states have similar laws that prohibit or limit the collection of customer data during credit card transactions. Any review of point-of-sale data collection practices should also take those laws into account.
1 SJC-11145 (Mass. March 11, 2013).
On February 26, 2013, the National Institute of Standards and Technology ("NIST") published a Request for Information ("RFI") in the Federal Register seeking comment on the proposed framework for addressing cyber risk ("Framework") called for in President Obama's Executive Order ("Order") on Cybersecurity. Comments were due April 9.
As reported here, Section 7 of the Order directs NIST to take the lead in creating the Framework.
According to the RFI, the Framework will include a set of standards, methodologies, processes and industry best practices that "align policy, business, and technological approaches" for addressing cyber threats. Comments are sought that "help identify, refine, and guide the many interrelated considerations, challenges, and efforts needed to develop the Framework." In particular, NIST is seeking information about current risk management practices; the use of frameworks, standards, guidelines and best practices; and specific industry practices for addressing cyber threats.
In developing the Framework, NIST will consult with the Secretary of the Department of Homeland Security, the National Security Agency, sector-specific agencies and other interested agencies, including the Office of Management and Budget, owners and operators of critical infrastructure, and other stakeholders, including state, local, territorial and tribal governments.
The publication of the RFI kicks off the NIST-led process for developing the Framework. The intent is to engage the broad range of critical infrastructure sectors (both physical and "virtual"), as well as other sectors that face increasingly sophisticated cyber threats, in order to promote the Framework's adoption. The risk to these sectors is seen as posing a potentially debilitating impact on the nation's security, including national economic security and national public health and safety.
Other agencies were similarly tasked with preparing reports recommending incentives for encouraging private sector adoption of the Framework. One of them, the Department of Commerce, issued a Notice of Inquiry (NOI) on March 28 to assist it in preparing its report. The NOI can be accessed here.
Copyright © 2012 St. Ledger-Roty & Olson, LLP.