PRIVACY & INFORMATION LAW UPDATE
President Obama Issues Cybersecurity Executive Order

On February 13, 2013, President Obama released a summary of an Executive Order (Order) intended to protect critical infrastructure from escalating cyber threats against both government and industry. The Order, which returns cybersecurity to the spotlight, will affect companies that operate critical infrastructure, may affect sector-specific regulated entities, and could potentially reach noncritical infrastructure and certain unregulated entities. The Order has two key objectives: (1) to improve information sharing between the federal government and industry, and (2) to implement a cybersecurity framework (Framework) to be developed through an effort led by the National Institute of Standards and Technology (NIST). The Framework will consist of standards, policies, and procedures to guard against threats; industry adoption of the Framework will be voluntary.

Information Sharing. Section 4 requires the Department of Homeland Security, the Director of National Intelligence, and the Attorney General to create a process for sharing unclassified reports about cyber threats that identify a specific targeted entity, and for sharing classified reports with critical infrastructure entities and their contractors that hold security clearances, while calling for processes to expedite security clearances.

Framework. Section 7 directs NIST to establish a Framework that will include a set of standards, methodologies, procedures, and processes, incorporating voluntary consensus standards and industry best practices that align policy, business, and technological approaches to address cyber risks. The Order mandates that the Framework be technologically neutral and incorporate information security measures and controls that help owners and operators of critical infrastructure identify, assess, and manage cyber risk. The Framework will be subject to an open public review and comment process, with a preliminary version to be published within 240 days and a final version to be issued within one year of the Order. Section 8 calls for a coordinated effort among sector-specific agencies to establish a voluntary program, including incentives, to support the Framework's adoption by critical infrastructure and other interested entities. The Order also calls for Homeland Security, counterterrorism, and other officials to make recommendations to the President on the feasibility, security benefits, and relative merits of incorporating security standards into the federal procurement process, including what steps might be necessary to align existing cybersecurity requirements with new ones.

FIPPs-Based Approach to Privacy and Civil Liberties Protections. The Order also addresses the privacy and civil liberties issues that contributed to the failure of Congress to pass comprehensive cybersecurity legislation during the last session. Section 5 mandates a coordinated agency effort to ensure that privacy and civil liberties protections are incorporated into the activities and processes mandated by the Order. The protections must be based on the Fair Information Practice Principles and other applicable privacy frameworks.

Critical infrastructure and regulated entities -- as well as other interested entities -- should understand the Order's potential impact and evaluate whether adoption of the Framework will prove beneficial to them.
CA Supreme Court Rules Song-Beverly Does Not Apply to Online Transactions Involving Digital Products

On February 4, 2013, the California Supreme Court narrowly ruled in Apple, Inc. v. Krescent1 that the state's Song-Beverly Credit Card Act of 1971 (Song-Beverly) does not apply to digital transactions involving electronically downloadable products. The 4-3 majority concluded that Apple did not violate Song-Beverly when it collected the plaintiff's phone number and address to complete credit card purchases of downloads from the iTunes store. The ruling has significant implications for e-commerce as online and mobile transactions continue to proliferate, fueling the steady march from brick-and-mortar to digital transactions.

The Court previously ruled in Pineda v. Williams-Sonoma Stores, Inc. that zip code information is personal information under Song-Beverly, and that brick-and-mortar retailers are prohibited from seeking or recording such information in order to complete a credit card transaction. As we reported here, that ruling gave rise to hundreds of class actions as well as a statutory exemption for operators of gas stations.

In Krescent the Court overturned a lower court ruling that Song-Beverly barred Apple from collecting personal information during online credit card transactions involving downloadable digital products. The Court examined Song-Beverly's legislative history and concluded that the legislature had two objectives when it enacted the statute: to protect consumer privacy and to minimize the risk of fraud for merchants and consumers. In reaching this conclusion, the majority reasoned that: 1) when Song-Beverly was enacted, the online environment was so different that the legislature was unlikely to have contemplated the kinds of transactions at issue before the Court; 2) online purchases of downloadable products differ from purchases from brick-and-mortar merchants because online retailers cannot visually inspect a credit card or the cardholder's signature, or confirm a customer's identity with photo identification; and 3) the California Online Privacy Protection Act of 2003, which was enacted after Song-Beverly, establishes a separate framework for protecting consumers' online privacy -- in part because the legislature believed that existing law lacked such protections. Accordingly, the majority concluded that "the key antifraud mechanism in [Song-Beverly's] statutory scheme has no practical application to online transactions involving electronically downloadable products."

The Court emphasized that its ruling is limited to credit card transactions involving digital products. Nevertheless, the Court's reasoning could be applied more broadly to future Song-Beverly actions involving online purchases of physical products and services. In any event, the rapid migration of brick-and-mortar transactions to online and mobile environments suggests that Krescent's practical significance is that it foreshadows the potential irrelevance of Song-Beverly to e- and mobile commerce.

1 Ct. App. 2/8 B238097 (CA February 4, 2013).
FTC Announces Settlement of Deceptive Practices, COPPA Charges with Social Networking App

According to the FTC, Path, Inc. operates a social networking service that allows users to keep journals about moments in their lives and to share those journals with a network of up to 150 friends. Through the Path app, users can upload, store, and share photos, written thoughts, the user's location, and the names of songs the user is listening to.

The FTC alleged that Path's application included an "add friends" feature to help users add new social network connections. The feature offered users three options: "Find friends from your contacts"; "Find friends from Facebook"; or "Invite friends to join Path by email or SMS." However, Path automatically collected and stored personal information from the user's mobile device address book even if the user had not selected the "Find friends from your contacts" option. For each contact in a user's mobile device address book, Path automatically collected and stored any available first and last names, addresses, phone numbers, email addresses, Facebook and Twitter usernames, and dates of birth.

The FTC charged that Path's privacy policy deceived consumers by claiming that the app automatically collected only certain user information such as IP address, operating system, browser type, address of referring site, and site activity information. In fact, version 2.0 of the Path iOS app automatically collected and stored personal information from the user's mobile device address book when the user first launched that version of the app and each time the user signed back into the account.

In addition to the deceptive practices claim, the agency charged that Path, which collects birth date information during user registration, violated the COPPA Rule by collecting personal information from approximately 3,000 children under the age of 13 without first obtaining parental consent. Through its apps for both iOS and Android, as well as its website, Path enabled children to create personal journals and upload, store, and share photos, written thoughts, their precise location, and the names of songs to which the child was listening. Version 2.0 of the app also collected personal information from a child's address book, including full names, addresses, phone numbers, email addresses, dates of birth, and other information, where available.

The COPPA Rule requires that operators of online sites or services directed to children, or operators that have actual knowledge of child users on their sites or services, notify parents and obtain their verifiable consent before they collect, use, or disclose personal information from children under 13. Operators covered by the Rule must also post a COPPA-compliant privacy policy. The FTC alleged that Path violated the COPPA Rule by failing to:
The settlement highlights the FTC's continued focus on mobile app and children's privacy. Website operators (as well as brick-and-mortar businesses) that offer mobile apps, including apps that children can access and interact with, should implement robust procedures that guard against the kind of design flaw presented here and that align web and mobile privacy policies. More broadly, the settlement should be seen as part of an escalating focus on mobile privacy, particularly mobile app privacy, by federal and state authorities. The FTC released its mobile app privacy report on the same day that it announced the Path settlement. The report follows the agency's previous report on children's mobile privacy, as we reported here. In addition, California has stepped up enforcement against mobile apps and the platforms on which they are accessed, and the Maryland Attorney General recently announced the creation of a privacy enforcement unit that will also focus on mobile privacy.
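To make the takeaway concrete, the sketch below shows one way an app can ensure that address book data is read and transmitted only after the user actually selects the contacts option, rather than automatically at launch or sign-in. It is a hypothetical TypeScript illustration only: fetchDeviceContacts and uploadContacts are assumed placeholders for platform and backend calls, and nothing here describes Path's actual code.

```typescript
// Hypothetical sketch: read and upload device contacts only after the user
// explicitly chooses the "Find friends from your contacts" option, never
// automatically at first launch or sign-in.

interface Contact {
  name: string;
  email?: string;
  phone?: string;
}

// Placeholder for a platform-specific contacts API (assumption).
async function fetchDeviceContacts(): Promise<Contact[]> {
  return []; // a real app would call the OS address book API here
}

// Placeholder for the app's backend upload call (assumption).
async function uploadContacts(contacts: Contact[]): Promise<void> {
  console.log(`uploading ${contacts.length} contacts`);
}

type AddFriendsChoice = 'contacts' | 'facebook' | 'invite' | 'none';

async function handleAddFriends(choice: AddFriendsChoice): Promise<void> {
  if (choice !== 'contacts') {
    return; // no address book access unless the user picked this option
  }
  const contacts = await fetchDeviceContacts();
  // Send only the fields the feature actually needs (data minimization).
  const minimized = contacts.map(({ name, email }) => ({ name, email }));
  await uploadContacts(minimized);
}

// Example: the handler runs only on the user's explicit menu selection.
handleAddFriends('contacts').catch(console.error);
```

The design point is simply that collection is triggered by the user's choice, and that only the fields needed for the feature are transmitted.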
FTC Report Suggests Ways to Improve Mobile Privacy Disclosures

On February 1, 2013, the FTC issued a staff report, Mobile Privacy Disclosures: Building Trust Through Transparency (Report).1 The Report sets out recommended improvements in how stakeholders in the mobile app ecosystem notify consumers about their privacy and data collection practices. The recommendations are not regulations; however, they represent best practices and should be seen as potentially forming the basis for future enforcement actions against those who fail to adhere to them. Like California Attorney General Kamala Harris's mobile privacy initiative, the Report seeks to promote strong privacy practices for developers, platform and operating system providers, device manufacturers, and mobile ad networks. Noting that the limited screen space of many mobile devices makes the format of traditional privacy notices ineffective, the Report calls for an industry-driven solution to this and other disclosure challenges.

The Report's recommendations are stakeholder-specific. For example, it recommends that platform providers give consumers just-in-time notice and obtain affirmative express consent before accessing sensitive content, such as geolocation, contacts, photos, or audio and video recordings (a minimal sketch of just-in-time notice appears after these recommendations). Providers should also consider:
App developers should:
Ad networks and other third-party data users should consider:
The Report does not call for new legislation; it emphasizes that industry associations have an important role in implementing the FTC's recommendations, including fostering transparency and uniformity and promoting industry-wide collaboration.
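To illustrate the just-in-time notice recommendation, the following sketch shows an in-context explanation and affirmative consent step placed immediately before a web or hybrid app calls the standard browser Geolocation API. It is a minimal, hypothetical TypeScript example; the dialog text and the window.confirm stand-in are assumptions, not anything prescribed by the Report.

```typescript
// Hypothetical sketch: give an in-context explanation and obtain affirmative
// express consent immediately before requesting precise location, instead of
// relying solely on a general privacy policy.

async function showConsentDialog(message: string): Promise<boolean> {
  return window.confirm(message); // stand-in for a proper consent dialog
}

async function getLocationWithJustInTimeNotice(): Promise<GeolocationPosition | null> {
  const consented = await showConsentDialog(
    'This feature uses your precise location to show nearby content. ' +
      'Your location will not be stored or shared with ad networks. Allow?'
  );
  if (!consented) {
    return null; // user declined: do not touch the location API at all
  }
  // Only after affirmative consent is the browser/platform API invoked,
  // which may also trigger the operating system's own permission prompt.
  return new Promise((resolve) => {
    navigator.geolocation.getCurrentPosition(
      (position) => resolve(position),
      () => resolve(null) // permission denied or location unavailable
    );
  });
}

// Example usage: request location only when the user taps a "near me" feature.
getLocationWithJustInTimeNotice().then((pos) => {
  if (pos) {
    console.log(`lat ${pos.coords.latitude}, lon ${pos.coords.longitude}`);
  }
});
```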
1 FTC Staff Report, Mobile Privacy Disclosures: Building Trust Through Transparency, http://www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf (Feb. 1, 2013).

*Seth is a guest author and a recent graduate of the Indiana University Maurer School of Law. A recent FCC intern, Seth is awaiting admission to the D.C. bar. He has written previously about the potential impact of the Performance Rights Act on student radio stations.
Maryland Attorney General Creates Internet Privacy Unit

On January 28, 2013, Maryland Attorney General Doug Gansler announced the creation of an Internet Privacy Unit in the Office of the Maryland Attorney General. The unit will pay particular attention to enforcing the federal Children's Online Privacy Protection Act (COPPA),1 which empowers states, in addition to the Federal Trade Commission, to enforce the COPPA Rule. The unit will also monitor and examine privacy policies for deceptive representations and for general adherence to consumer protection law.

This development is consistent with Gansler's public remarks that digital privacy, particularly mobile and children's online privacy, is a top priority for his office. In addition, he has made privacy in the digital age a key initiative for the National Association of Attorneys General (NAAG). Gansler's announcement follows the creation of a similar unit by California Attorney General Kamala Harris last year. Since then, Harris, through that unit, has initiated a number of actions to enforce California's online privacy law.

Businesses that collect, use, and retain personal information about Maryland residents should be familiar with applicable law, including COPPA and Maryland's Personal Information Protection Act. In addition, these businesses should review their data collection and security practices and procedures, including privacy notices, and take appropriate measures to minimize the risk of an investigation or enforcement action.

1 15 U.S.C. § 6501 et seq.
Operator of Children's Website Agrees to Recommended Privacy Improvements

On January 17, 2013, the Children's Advertising Review Unit (CARU) announced that SPIL Games, BV, the operator of a website directed to children and teenage girls, agreed to modify certain privacy practices in response to recommendations by CARU. CARU is the investigative arm of the advertising industry's self-regulatory program administered by the Council of Better Business Bureaus and one of several FTC-approved safe harbor programs. CARU monitors online and mobile services to ensure compliance with its Guidelines.

According to CARU's press release, children under 13 were able to create profiles and personalized avatars, view profiles created by others, play and rate games, and make friends. They were also able to register for the site using social media tools that prohibit use by children under 13, including Facebook and Twitter. According to CARU, the site allowed children to disclose personally identifiable information on member profile pages without first notifying parents and obtaining verifiable parental consent.

In response to CARU's investigation, SPIL installed a session cookie to prevent retries during age screening, disabled all user-generated content on wall posts and comments, and changed the user names of children under 13 to predefined, white-listed, or randomly generated user names. SPIL also agreed not to accept new registrations from children under 13. These children will still be able to play games on the site but will be unable to generate content. SPIL also disabled the feature that allowed log-in through social media and removed links to Twitter.

This matter serves as an important reminder that any operational changes made to child-directed websites or online services (or to other sites or services with actual knowledge that they collect information from children under 13) should be made only after evaluating how those changes will affect the operator's overall COPPA compliance strategy. This is especially so in light of recent amendments to the COPPA Rule that modify key definitions and corresponding obligations, and impose new requirements on certain third parties. In addition, operators that use age screening should have adequate measures in place to prevent retries by children under 13.
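The session cookie remediation CARU describes can be sketched roughly as follows: a neutral age screen records a failure in a cookie-backed session so that resubmitting an older birth date does not bypass the gate. This is a hypothetical TypeScript (Express with cookie-session) illustration; the route names, cookie settings, and age logic are assumptions and do not describe SPIL's actual implementation.

```typescript
// Hypothetical sketch of a neutral age screen backed by a session cookie so a
// visitor who fails the screen cannot simply resubmit an older birth date.
import express from 'express';
import cookieSession from 'cookie-session';

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(
  cookieSession({
    name: 'agegate',
    keys: ['replace-with-a-real-secret'],
    maxAge: 24 * 60 * 60 * 1000, // remember a failed screen for the day
  })
);

app.post('/age-screen', (req, res) => {
  // If this browser session already failed the age screen, block the retry.
  if (req.session!.underageFailed) {
    return res.redirect('/come-back-later');
  }

  const birthYear = Number(req.body.birthYear);
  if (!Number.isFinite(birthYear)) {
    return res.redirect('/age-screen-form'); // malformed input: show the form again
  }

  const age = new Date().getFullYear() - birthYear;
  if (age < 13) {
    // Record the failure in the cookie-backed session so changing the birth
    // date and resubmitting does not bypass the gate.
    req.session!.underageFailed = true;
    return res.redirect('/come-back-later');
  }

  req.session!.agePassed = true;
  return res.redirect('/register');
});

app.listen(3000);
```

A cookie-based gate only deters casual retries in the same browser; it is one measure alongside the broader COPPA compliance steps discussed above, not a substitute for them.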
FTC Announces FCRA Settlement with Mobile App Developer

The complaint alleged that Filiquarian advertised that people who bought its app could perform a quick criminal background check for convictions in a number of states. Filiquarian said that its apps could access hundreds of thousands of criminal records and could be used to conduct searches on potential employees. According to the FTC, Filiquarian, Linsk, and Choice Level were therefore acting as consumer reporting agencies under the FCRA and were required to comply with its obligations, including implementing certain safeguards with respect to personal data.

The FTC charged three key FCRA violations: 1) failure to maintain reasonable procedures to verify who their users are and that the information would be used for a permissible purpose; 2) failure to have procedures to ensure that the information they provided in consumer reports was accurate; and 3) failure to provide required notices to users and to those who furnished Filiquarian with information that was included in consumer reports.

The apps included disclaimers, which also appeared on the parties' websites, stating that they were not FCRA compliant; that their products were not to be considered screening products for employment, insurance, or credit purposes; and that anyone who used the reports for such purposes assumed sole responsibility for FCRA compliance. The FTC noted, however, that the disclaimers contradicted express representations in Filiquarian's ads urging people to use the reports to screen potential employees. The FTC admonished that these types of disclaimers are not enough to absolve a company of FCRA liability.

The consent decree bars the defendants from: 1) furnishing a consumer report to anyone they do not have reason to believe has a permissible purpose to use the report; 2) failing to take reasonable steps to ensure the maximum possible accuracy of the information conveyed in their reports; and 3) failing to provide users of their reports with information about their obligations under the FCRA.
TCPA Claim against Caribbean Cruise Line Survives Motion to Dismiss

On January 7, 2013, a federal court judge denied a motion to dismiss a claim brought under the Telephone Consumer Protection Act (TCPA) against Caribbean Cruise Line and Economic Strategy Group. The class action, Birchmeier v. Caribbean Cruise Line and Economic Strategy Group, was filed in the U.S. District Court for the Northern District of Illinois.

The plaintiffs alleged that the defendants used an autodialer or prerecorded voice to make unsolicited marketing calls to their cell phones in violation of the TCPA. They further alleged that the defendants acted under the guise of conducting political surveys, but that this was a sham intended to get their foot in the door to sell ocean cruises to the recipients of the calls.

The Court rejected each of the three arguments asserted by the defendants. First, the Court rejected a procedural argument that the plaintiffs failed to distinguish the role of each defendant, noting that the whole point is that the defendants acted in concert and it is difficult to tell where one defendant stops and the other one starts. Second, the Court rejected the defendants' contention that TCPA liability attaches only to the party that placed the call, deriding the underlying logic as absurd: Congress would not have intended the TCPA to allow a well-heeled marketer to shield itself from liability by hiring a third party to place unlawful calls. Third, the Court rejected as a nonstarter the defendants' argument that the complaint is non-actionable under the TCPA because the calls were political surveys, which are exempt under the statute. The Court noted that while the TCPA exempts political calls made using artificial or prerecorded voices, the exemption does not apply to calls made by autodialers, and the complaint also alleged calls made by autodialers.

In addition to denying the motion to dismiss the TCPA claims, the Court denied the defendants' motion to strike the class action allegations, on the ground that a decision on whether the plaintiffs can establish class certification would be premature.

Businesses that use third-party service providers for marketing campaigns should ensure that the providers comply with applicable consumer protection laws, including those governing mobile message campaigns.
INTERNATIONAL DISPATCHES
EC Proposes Draft Cybersecurity Directive

On February 7, 2013, the European Commission (EC) and the High Representative of the Union for Foreign Affairs and Security Policy issued a proposed draft directive (Directive) to ensure a common level of network and information security across the Internet and private information systems in the European Union (EU). The proposal is intended to unify the differing, voluntary cybersecurity approaches of the EU's 27 member states by requiring them to create a cooperative mechanism for improving security and incident reporting. Operators of critical infrastructure -- including energy, transportation, financial services providers, and key providers of information services such as e-commerce sites, social network platforms, search engines, cloud service providers, and even app stores -- would be required to adopt appropriate security measures and report serious incidents to national authorities. It is unclear what the interplay will be between the proposed security and incident reporting requirements and those proposed in the General Data Protection Regulation that we reported about here.

The proposed Directive is the latest in a series of actions intended to promote the overarching goals of establishing the EU as a secure and trustworthy digital environment and realizing the attendant economic benefits of doing so. Key components call for:

The proposed Directive must be approved by the European Parliament and the European Council before becoming law.
Singapore DPA Seeks Public Comment on Proposed Data Protection Regulations

On February 5, 2013, the Personal Data Protection Commission of Singapore (PDPC) issued its first public consultation setting out, and seeking comment on, proposed data protection regulations under Singapore's Personal Data Protection Act (PDPA). The PDPA was adopted by the Singapore Parliament in October 2012 and took effect in January 2013. The Act applies to all organizations in Singapore except those in the public sector. The PDPA created the PDPC and empowered it to administer and enforce the PDPA, including by imposing various remedial measures for noncompliance.

The Consultation focuses on three areas: 1) the form, manner, and procedures for access to or correction of personal data, including the process for responses; 2) the requirements for transferring personal data out of Singapore; and 3) the classes of persons who may act for minors or other individuals who lack capacity to act, and the manner in which, and extent to which, these individuals may exercise any rights and powers under the PDPA. Areas involving certain administrative procedures will likely be addressed in mid-2014.

Rights of Access & Correction. The Consultation sets out the rights and responsibilities of individuals and organizations, respectively, for seeking access to personal data and the process and time frame for responding to such requests. It recommends that organizations respond to written access requests within 30 days; those that are unable to do so should, within 30 days, inform the requesting individual of the soonest time the request can reasonably be granted. Organizations are authorized to charge a reasonable fee for responding to access requests. The PDPC specifically seeks comments on the proposed manner in which an individual may make access or correction requests, and on how organizations are to respond to such requests.

Cross-Border Transfer of Personal Data. The Consultation proposes generally to permit the transfer of personal data out of Singapore only if done in a manner consistent with obligations under the PDPA and pursuant to a legally binding instrument that contains appropriate safeguards in the form of contractual clauses or binding corporate rules. The instrument must implement obligations involving data purpose, use, disclosure, accuracy, protection, retention, and employee data handling policies. The PDPC specifically seeks comments on other means of ensuring the protection of personal data transferred out of Singapore and on the proposed requirements for the contractual clauses and binding corporate rules that would apply.

Individuals Who May Act for Others. The Consultation proposes regulations addressing the classes of persons who may act for individuals who are unable to grant consent or exercise their rights under the PDPA, including minors and deceased persons. It proposes that minors who are 14-18 years old be able to act on their own behalf if they understand the nature of the right or power conferred by the PDPA and the consequences of exercising it. Certain of a deceased person's closest relatives would be able to act on his or her behalf where the deceased does not have a personal representative.

The Consultation specifically seeks comments on a number of areas involving who may act for others, including the extent to which minors should be able to exercise rights and powers conferred under the PDPA, and the minimum age below which they should not be permitted to do so; and proposed priority lists of relatives who may act on behalf of a deceased person. Public comment is due by 5:00 PM, March 19, 2013.
Canadian & Dutch Authorities Investigate WhatsApp for Privacy Violations; Issue Reports

On January 28, 2013, the Office of the Privacy Commissioner of Canada and the Dutch Data Protection Authority (Dutch DPA) issued a joint press release announcing collaborative findings that California-based WhatsApp violated Dutch and Canadian privacy laws. The findings were released in separate reports that can be found here and here. The investigation comes at a time when the data collection, use, and disclosure practices of mobile apps are being closely scrutinized by U.S. and international enforcement authorities.

The Dutch DPA indicated that it will continue to monitor WhatsApp and impose penalties if warranted. The DPA asserts that it is empowered to enforce Dutch law against WhatsApp because the service targets and is used by Dutch users, and because the company processes data in the Netherlands via the mobile devices of Dutch users.

The investigation focused on WhatsApp's mobile messaging platform, which allows users to send and receive instant messages over the Internet across various mobile platforms. While WhatsApp was found to be in contravention of Canadian and Dutch privacy laws, the company has taken steps to implement some of the recommendations in the reports. The investigation revealed that WhatsApp violated certain internationally accepted privacy principles, mainly in relation to the retention, safeguarding, and disclosure of personal data. For example:
UPDATES
California AG Issues Mobile App Privacy Guidelines

On January 10, 2013, California Attorney General Kamala Harris issued Privacy on the Go: Recommendations for the Mobile Ecosystem. This development is the latest in a series of actions undertaken by Harris as part of a broader initiative to protect mobile privacy. The guidelines are not regulations; however, they reflect Harris's interpretation of the California Online Privacy Protection Act (Act) and should be seen as forming the basis for future enforcement actions. As we reported here, Harris filed suit against Delta Airlines for failing to post a privacy policy in connection with its Fly Delta app in violation of the Act. Delta was one of 100 companies that received notices of noncompliance earlier in the year; each was given 30 days to post a compliant privacy policy.

The guidelines are intended to promote strong privacy practices for developers, platform providers, mobile carriers, and mobile ad networks, with the goal of providing comprehensive transparency about data collection, use, disclosure, and retention practices. To address the constraints of mobile device screen size, the guidelines recommend the use of special notices, such as icons or pop-ups, that explain in context how personally identifiable information is collected, used, and disclosed to third parties. Opt-in consent should be obtained before data is collected. The guidelines also recommend employing enhanced measures -- including just-in-time notice -- that alert users to, and give them control over, data collection practices that are unrelated to an app's basic functionality or that involve collecting sensitive information; minimizing surprise about data uses that may be unexpected by consumers and enabling consumers to opt out of those uses; encrypting personal information for secure transmission; and attributing responsibility for ads that are displayed outside of an app. Platforms should make privacy policies accessible before a user downloads an app.
NEWS & ANNOUNCEMENTS
Karen Neuman to Discuss Location-Based Privacy at IAPP Global Summit

Karen will discuss privacy and location-based services during a breakout session at this year's IAPP Global Privacy Summit, March 6-8 in Washington, D.C. The session, Location, Location, Location: Risks and Rewards of LBS, will be held Friday, March 8, from 8:15 to 9:45 am. Karen and her co-panelists will focus on recent legal developments and trends involving the use of location-based technologies to serve advertising and other content over mobile devices. They will also offer an interactive exercise to share practical suggestions for minimizing risk.
Copyright © 2012 St. Ledger-Roty & Olson, LLP.