St. Ledger-Roty & Olson LLP

PRIVACY & INFORMATION LAW UPDATE
February 2013
A bimonthly update of trends and developments in privacy law & policy

Karen Neuman, Editor

  • You are receiving this publication because of your interest in privacy and data security. It is provided for informational purposes (including advertising) and is not a substitute for legal advice.
  • Not interested? Unsubscribe or forward to someone who might be.
  • Did someone send you this Update? Subscribe to receive your own or view past issues.

In this Issue:
President Obama Issues Cybersecurity Executive Order
CA Supreme Court Rules Song-Beverly Does Not Apply to Online Transactions Involving Digital Products
FTC Announces Settlement of Deceptive Practices, COPPA Charges with Social Networking App
FTC Report Suggests Ways to Improve Mobile Privacy Disclosures
Maryland Attorney General Creates Internet Privacy Unit
Operator of Children’s Website Agrees to Recommended Privacy Improvements
FTC Announces FCRA Settlement with Mobile App Developer
TCPA Claim against Caribbean Cruise Line Survives Motion to Dismiss

INTERNATIONAL DISPATCHES:
EC Proposes Draft Cybersecurity Directive
Singapore DPA Seeks Public Comment on Proposed Data Protection Regulations
Canadian & Dutch Authorities Investigate WhatsApp for Privacy Violations; Issue Report

UPDATES:
California AG Issues Mobile App Privacy Guidelines

NEWS & ANNOUNCEMENTS:
Karen Neuman to Discuss Location Based Privacy at IAPP Global Summit

 

President Obama Issues Cybersecurity Executive Order
By Karen Neuman

On February 13, 2013, President Obama released a summary of an Executive Order (Order) intended to protect critical infrastructure from escalating cyber threats against both government and industry. The Order, which returns cybersecurity to the spotlight, will affect companies that operate critical infrastructure, may affect sector-specific regulated entities, and could potentially reach noncritical infrastructure and certain unregulated entities.
Read more...

 

CA Supreme Court Rules Song-Beverly Does Not Apply to Online Transactions Involving Digital Products
By Karen Neuman

On February 4, 2013, the California Supreme Court narrowly ruled in Apple, Inc. v. Krescent[1] that the state’s Song-Beverly Credit Card Act of 1971[2] (Song-Beverly) does not apply to digital transactions involving electronically downloadable products. The 4-3 Majority concluded that Apple did not violate Song-Beverly when it collected the plaintiffs’ phone number and address to complete credit card purchases of downloads from the iTunes store. This ruling has significant implications for e-commerce as online and mobile transactions continue to proliferate, fueling the steady march from brick and mortar to digital transactions.
Read more...

 

FTC Announces Settlement of Deceptive Practices, COPPA Charges with Social Networking App
By Karen Neuman

On February 1, 2013, the FTC announced that the operator of the Path social networking app agreed to settle charges that the company deceived users by collecting personal information from their mobile device address books without their knowledge or consent, and collected personal information from children under 13 in violation of the COPPA rule. Path agreed to pay an $800,000 fine, delete children’s personal information obtained in violation of COPPA, and establish a comprehensive privacy program that will be subject to periodic FTC review for 20 years.
Read more...

 

FTC Report Suggests Ways to Improve Mobile Privacy Disclosures
By Karen Neuman
     Seth Williams*


On February 1, 2013, the FTC issued a staff report, Mobile Privacy Disclosures: Building Trust through Transparency (Report).[1] The Report sets out recommended improvements to how stakeholders in the mobile app ecosystem notify consumers about their privacy and data collection practices. The recommendations are not regulations; however, they represent best practices and could form the basis for future enforcement actions against companies that fail to adhere to them.
Read more...

 

Maryland Attorney General Creates Internet Privacy Unit
By Karen Neuman

On January 28, 2013, Maryland Attorney General Doug Gansler announced the creation of an Internet Privacy Unit in the office of the Maryland Attorney General. The unit will pay particular attention to enforcing the federal Children’s Online Privacy Protection Act (COPPA),[1] which empowers states, in addition to the Federal Trade Commission, to enforce the COPPA rule. The unit will also monitor and examine privacy policies for deceptive representations and for general adherence to consumer protection law.
Read more...

 

Operator of Children’s Website Agrees to Recommended Privacy Improvements
By Karen Neuman

On January 17, 2013 the Children’s Advertising Review Unit (CARU) announced that SPIL Games, BV, the operator of a website directed to children and teenage girls, agreed to modify certain privacy practices in response to recommendations by CARU. CARU is the investigative arm of the advertising industry’s self-regulatory program administered by the Council of Better Business Bureaus and one of several FTC-approved safe harbor programs. CARU monitors online and mobile services to ensure compliance with its Guidelines.
Read more...

 

FTC Announces FCRA Settlement with Mobile App Developer
By Karen Neuman

The FTC announced in a January 13, 2013 blog post that it reached a settlement with Filiquarian Publishing, LLC (a mobile app developer), the company’s owner, Joshua Linsk, and Choice Level LLC, for violating the Fair Credit Reporting Act (FCRA). This is the first time an FCRA action has been brought against a mobile app. This action serves as an important “heads up” to owners and developers of apps that offer regulated products or services (e.g., background screening, children’s content, health and financial services), that they must be familiar with applicable privacy requirements.
Read more...

 

TCPA Claim against Caribbean Cruise Line Survives Motion to Dismiss
By Karen Neuman

On January 7, 2013, a federal court judge denied a motion to dismiss a claim brought under the Telephone Consumer Protection Act (TCPA) against Caribbean Cruise Line and Economic Strategy Group. The class action, Birchmeier v Caribbean Cruise Line and Economic Strategy Group, was filed in U.S. District Court for the Northern District of Illinois. Businesses that use third party service providers for marketing campaigns should ensure that the providers comply with applicable consumer protection laws, including those governing mobile message campaigns.
Read more...

 

INTERNATIONAL DISPATCHES

EC Proposes Draft Cybersecurity Directive

On February 7, 2013, the European Commission (EC) and the High Representative of the Union for Foreign Affairs and Security Policy issued a proposed draft directive (Directive) to ensure common network and information security across the Internet and private information systems in the European Union (EU). The proposal is intended to unify the differing, voluntary cybersecurity approaches of the EU’s 27 member states by requiring them to create a cooperative mechanism for improving security and incident reporting. Operators of critical infrastructure, including energy, transportation and financial services providers, as well as “key providers” of information services such as e-commerce sites, social network platforms, search engines, cloud service providers and even app stores, would be required to adopt appropriate security measures and report “serious” incidents to national authorities. It is unclear what the interplay will be between the proposed security and incident reporting requirements and those proposed in the General Data Protection Regulation that we reported about here.
Read more...

 

Singapore DPA Seeks Public Comment on Proposed Data Protection Regulations

On February 5, 2013, the Personal Data Protection Commission of Singapore (PDPC) issued its first public consultation setting out, and seeking comment on, proposed data protection regulations under Singapore’s Personal Data Protection Act (PDPA). The PDPA was adopted by the Singapore Parliament in October 2012 and became effective in January 2013. The Act applies to all organizations in Singapore, except those in the public sector. The PDPA created the PDPC and empowered it to administer and enforce the PDPA, including imposing various remedial measures for noncompliance.
Read more...

 

Canadian & Dutch Authorities Investigate WhatsApp for Privacy Violations; Issue Report

On January 28, 2013, the Office of the Privacy Commissioner of Canada and the Dutch Data Protection Authority (Dutch DPA) issued a joint press release announcing collaborative findings that California-based WhatsApp violated Dutch and Canadian privacy laws. The findings were released in separate reports that can be found here and here. This investigation comes at a time when the data collection, use and disclosure practices of mobile apps are being closely scrutinized by U.S. and international enforcement authorities. The Dutch DPA indicated it will continue to monitor WhatsApp and impose penalties if warranted. The DPA asserts that it is empowered to enforce Dutch law against WhatsApp because the app targets, and is used by, Dutch users, and because the company processes data in the Netherlands on the mobile devices of Dutch users.
Read more...

 

UPDATES

California AG Issues Mobile App Privacy Guidelines

On January 10, 2013, California Attorney General Kamala Harris issued Privacy on the Go: Recommendations for the Mobile Ecosystem. This development is the latest in a series of actions undertaken by Harris as part of a broader initiative to protect mobile privacy. The guidelines are not regulations; however, they reflect Harris' interpretation of the California Online Privacy Protection Act (Act) and should be seen as forming the basis for future enforcement actions. As we reported here, Harris filed suit against Delta Airlines for failing to post a privacy policy in connection with its “Fly Delta” app in violation of the Act. Delta was one of 100 companies that received notices of noncompliance earlier in the year that granted them 30 days to post compliant privacy policies.
Read more...

 

NEWS & ANNOUNCEMENTS

Karen Neuman to Discuss Location Based Privacy at IAPP Global Summit

Karen will discuss privacy and location-based services during a breakout session at this year’s IAPP Global Privacy Summit, March 6-8 in Washington, D.C. The session, “Location, Location, Location: Risks and Rewards of LBS,” will be held Friday, March 8, from 8:15 am to 9:45 am. Karen and her co-panelists will focus on recent legal developments and trends involving the use of location-based technologies to serve advertising and other content over mobile devices. They will also offer an interactive exercise to share some practical suggestions for minimizing risk.
Read more...


President Obama Issues Cybersecurity Executive Order
By Karen Neuman

On February 13, 2013, President Obama released a summary of an Executive Order (Order) intended to protect critical infrastructure from escalating cyber threats against both government and industry. The Order, which returns cybersecurity to the spotlight, will affect companies that operate critical infrastructure, may affect sector-specific regulated entities, and could potentially reach noncritical infrastructure and certain unregulated entities.

The Order has two key objectives: (1) to improve information sharing between the federal government and industry and (2) to implement a cybersecurity framework (Framework) that will be developed by a NIST-led effort. The Framework will consist of standards, policies and procedures to guard against threats. Industry adoption of the Framework will be voluntary.

Information Sharing. Section 4 requires the Department of Homeland Security, the Director of National Intelligence and the Attorney General to create a process for sharing unclassified reports about cyber threats that identify a specific targeted entity, and for sharing classified reports with critical infrastructure entities and their contractors that hold security clearances; it also calls for processes to expedite security clearances.

Framework. Section 7 directs NIST to establish a framework that will include a set of standards, methodologies, procedures and processes, including incorporating voluntary “consensus” standards and industry best practices that “align policy, business, and technological approaches to address cyber risks.” The Order mandates that the Framework be technologically neutral and incorporate information security measures and controls to help owners and operators of critical infrastructure identify, assess, and manage cyber risk. The Framework will be subject to “an open public review and comment process,” with a preliminary version to be published within 240 days and a final version to be issued within one year of the Order.

Section 8 calls for a coordinated effort among sector-specific agencies to establish a voluntary program, including incentives, to support the Framework’s adoption by critical infrastructure and other interested entities.

The Order also calls for Homeland Security, counterterrorism and other officials to make recommendations to the President on the feasibility, security benefits, and relative merits of incorporating security standards into the federal procurement process, including what steps might be necessary to align existing cybersecurity requirements with new ones.

FIPS-Based Approach to Privacy and Civil Liberties Protections. The Order also addresses the privacy and civil liberties issues that contributed to the failure of Congress to pass comprehensive cybersecurity legislation during the last session. To that end, section 5 mandates a coordinated agency effort to ensure that privacy and civil liberties protections are incorporated in the activities and processes mandated by the Order. The protections must be based on Fair Information Practice Principles and other applicable privacy frameworks.

Critical infrastructure and regulated entities -- as well as other interested entities -- should understand the Order’s potential impact, and evaluate whether adoption of the Framework will prove beneficial to them.

Forward Article Back to Top


CA Supreme Court Rules Song-Beverly Does Not Apply to Online Transactions Involving Digital Products
By Karen Neuman

On February 4, 2013, the California Supreme Court narrowly ruled in Apple, Inc. v. Krescent[1] that the state’s Song-Beverly Credit Card Act of 1971[2] (Song-Beverly) does not apply to digital transactions involving electronically downloadable products. The 4-3 Majority concluded that Apple did not violate Song-Beverly when it collected the plaintiffs’ phone number and address to complete credit card purchases of downloads from the iTunes store. This ruling has significant implications for e-commerce as online and mobile transactions continue to proliferate, fueling the steady march from brick and mortar to digital transactions.

The Court previously ruled in Pineda v. Williams Sonoma Stores, Inc.[3] that zip code information is personal information under Song-Beverly, and that “brick and mortar” retailers are prohibited from seeking or recording such information in order to complete a credit card transaction. As we reported here, that ruling gave rise to hundreds of class actions as well as a statutory exemption for operators of gas stations. In Krescent the Court overturned a lower court ruling that Song-Beverly barred Apple from collecting personal information during online credit card transactions involving downloadable digital products. The Court examined Song-Beverly’s legislative history and concluded that the legislature had two objectives when it enacted the statute: to protect consumer privacy and minimize the risk of fraud for merchants and consumers.

In reaching this conclusion, the Majority reasoned that: 1) when Song-Beverly was enacted, the online environment was so different that the legislature was unlikely to have contemplated the kinds of transactions at issue before the Court; 2) online purchases of downloadable products differ from purchases from brick and mortar merchants because online retailers are unable to visually inspect a credit card or the cardholder’s signature, or to confirm a customer’s identity with photo identification; and 3) the California Online Privacy Protection Act of 2003,[4] which was enacted after Song-Beverly, establishes a separate framework for protecting consumers’ online privacy -- in part because the legislature believed that existing law lacked such protections. Accordingly, the Majority concluded that the “key antifraud mechanism in [Song-Beverly’s] statutory scheme has no practical application to online transactions involving electronically downloadable products.”

The Court emphasized that its ruling is limited to credit card transactions involving digital products. Nevertheless, the Court’s reasoning could be applied more broadly to future Song-Beverly actions involving online purchases of physical products and services. In any event, the rapid migration of brick and mortar transactions to online and mobile environments suggests that Krescent’s practical significance is that it foreshadows the potential irrelevance of Song-Beverly to e- and mobile commerce.


[1] Ct. App. 2/8 B238097 (Cal. February 4, 2013).
[2] Cal. Civ. Code § 1747.08.
[3] 51 Cal. 4th 524 (2011).
[4] Cal. Bus. & Prof. Code § 22575.

Forward Article Back to Top


FTC Announces Settlement of Deceptive Practices, COPPA Charges with Social Networking App
By Karen Neuman


On February 1, 2013, the FTC announced that the operator of the Path social networking app agreed to settle charges that the company deceived users by collecting personal information from their mobile device address books without their knowledge or consent, and collected personal information from children under 13 in violation of the COPPA rule. Path agreed to pay an $800,000 fine, delete children’s personal information obtained in violation of COPPA, and establish a comprehensive privacy program that will be subject to periodic FTC review for 20 years.

According to the FTC, Path, Inc. operates a social networking service that allows users to keep journals about “moments” in their life and to share that journal with a network of up to 150 friends. Through the Path app, users can upload, store, and share photos, written “thoughts,” the user’s location, and the names of songs that the user is listening to. The FTC alleged that Path’s application included an “add friends” feature to help users add new social network connections. The feature offered users three options: “Find friends from your contacts;” “Find friends from Facebook;” or “Invite friends to join Path by email or SMS.” However, Path automatically collected and stored personal information from the users’ mobile device address books even if the user had not selected the “Find friends from your contacts” option. For each contact in a user’s mobile device address book, Path automatically collected and stored any available first and last names, addresses, phone numbers, email addresses, Facebook and Twitter usernames, and dates of birth.

The FTC charged that Path’s privacy policy deceived consumers by claiming that it automatically collected only certain user information such as IP address, operating system, browser type, address of referring site, and site activity information. However, version 2.0 of the Path iOS app automatically collected and stored personal information from the user’s mobile device address book when the user first launched that version of the app and each time the user signed back into the account.

In addition to the deceptive practices claim, the agency charged that Path, which collects birth date information during user registration, violated the COPPA rule by collecting personal information from approximately 3,000 children under the age of 13 without first getting parental consent. Through its apps for both iOS and Android, as well as its website, Path enabled children to create personal journals and upload, store and share photos, written “thoughts,” their precise location, and the names of songs to which the child was listening. Version 2.0 of the app also collected personal information from a child’s address book, including full names, addresses, phone numbers, email addresses, dates of birth and other information, where available.

The COPPA Rule requires that operators of online sites or services directed to children, or operators that have actual knowledge of child users on their sites or services, notify parents and obtain their verifiable consent before they collect, use, or disclose personal information from children under 13. Operators covered by the Rule must also post a COPPA-compliant privacy policy.

The FTC alleged that Path violated the COPPA Rule by failing to:

  • spell out its collection, use and disclosure policy for children’s personal information;
  • provide parents with direct notice of its collection, use and disclosure policy for children’s personal information; and
  • obtain verifiable parental consent before collecting children’s personal information.

The settlement highlights the FTC’s continued focus on mobile app and children’s privacy. Website operators (as well as “brick and mortar” businesses) that offer mobile apps, including those that can be accessed and interacted with by children, should implement robust procedures that include guarding against the kind of design flaw that was presented here as well as aligning web and mobile privacy policies.

The settlement should be seen more broadly as part of an escalating focus by federal and state authorities on mobile privacy, particularly mobile app privacy. The FTC released its mobile app privacy report on the same day that it announced the Path settlement. The report follows the agency’s previous report on children’s mobile privacy, as we reported here. In addition, California has stepped up enforcement against mobile apps and the platforms on which they are accessed, and the Maryland Attorney General recently announced creation of a privacy enforcement unit that will also focus on mobile privacy.

Forward Article Back to Top


FTC Report Suggests Ways to Improve Mobile Privacy Disclosures
By Karen Neuman
     Seth Williams*

On February 1, 2013, the FTC issued a staff report, Mobile Privacy Disclosures: Building Trust through Transparency (Report).[1] The Report sets out recommended improvements to how stakeholders in the mobile app ecosystem notify consumers about their privacy and data collection practices. The recommendations are not regulations; however, they represent best practices and could form the basis for future enforcement actions against companies that fail to adhere to them.

Like California Attorney General Kamala Harris’s mobile privacy initiative, the Report seeks to promote strong privacy practices among developers, platform and operating system providers, device manufacturers and mobile ad networks. Noting that the limited screen space of many mobile devices makes the format of traditional privacy notices ineffective, the Report calls for an industry-driven solution to this and other disclosure challenges.

The Report’s recommendations are stakeholder-specific. For example, it recommends that platform providers provide consumers with “just-in-time” notice and obtain affirmative express consent before accessing sensitive content, such as geolocation, contacts, photos, or audio and video recordings. Providers should also consider:

  • Creating a privacy “dashboard” to allow consumers to review the types of content apps can access.
    • The Report cites the privacy tabs or privacy setting menus offered by iOS 6 and Android devices.
  • Developing icons to depict how user data is transmitted.
  • Promoting developer best practices. Further:
    • Platforms might require developers to make adequate privacy disclosures, reasonably enforce the disclosures, and educate developers;
    • Platforms might provide consumers with disclosures about the extent to which they review apps prior to making them available for download in app stores, and conduct compliance checks after the apps have been placed in the app stores.
  • Offering smartphone owners a “do-not-track” mechanism.

App developers should:

  • Create a privacy policy and make it easily accessible through app stores;
  • Provide “just-in-time” disclosures and obtain affirmative express consent before collecting and sharing sensitive information. Although this recommendation overlaps with the recommendation that platform providers also offer “just-in-time” notice, the Report emphasizes the usefulness of these disclosures as long as they reflect the app’s unique attributes and developers do not merely repeat the platform disclosures (a minimal sketch of this kind of consent check appears after this list). For example:
    • An app might provide geolocation or other sensitive information to a third party even though the platform may not. The app should “provide a just-in-time disclosure and obtain affirmative consent from the user” that reflects this data sharing.
  • Improve coordination and communication with ad networks and other third parties that provide services to developers such as analytics, so the developers provide accurate disclosures to consumers.
    • For example, app developers often integrate third-party code to facilitate advertising or analytics within an app with little understanding of what information the third party is collecting and how it is being used.
  • Consider participating in self-regulatory programs, trade associations, and industry organizations, which can provide guidance on how to make uniform, short-form privacy disclosures.
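
To make the “just-in-time” recommendation above more concrete, the following is a minimal, hypothetical Python sketch of a consent check that gates sharing of sensitive data (here, precise location) with a third-party ad network behind an affirmative, per-purpose opt-in obtained at the moment of collection. The class, function and purpose names are invented for illustration and do not come from the Report.

    from dataclasses import dataclass, field

    @dataclass
    class ConsentStore:
        # Per-purpose consent choices; a real app would persist these.
        granted: dict = field(default_factory=dict)

        def has_consent(self, purpose: str) -> bool:
            return self.granted.get(purpose, False)

        def record(self, purpose: str, granted: bool) -> None:
            self.granted[purpose] = granted

    def share_location_with_ad_network(consents: ConsentStore, location) -> bool:
        purpose = "share_location_with_third_party_ads"
        if not consents.has_consent(purpose):
            # Just-in-time disclosure: ask at the moment of collection,
            # not only in a general privacy policy.
            answer = input("Share your precise location with our advertising partners? [y/N] ")
            consents.record(purpose, answer.strip().lower() == "y")
        if consents.has_consent(purpose):
            print(f"(sending {location} to ad network)")  # stand-in for real transmission
            return True
        return False

The point of the sketch is simply that the disclosure and the consent decision are tied to the specific data flow, which is what the Report means by disclosures that reflect an app’s unique attributes rather than repeating the platform’s generic notice.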

Ad networks and other third-party data “users” should consider:

  • Working with platforms to ensure that any Do Not Track mechanism created for mobile systems is effective.
  • Working with developers to improve communication about how apps collect data.

The Report does not call for new legislation; it emphasizes that there is an important role for industry associations in implementing the FTC’s recommendations, including fostering transparency and uniformity, and promoting industry-wide collaboration.


[1] FTC STAFF REPORT, MOBILE PRIVACY DISCLOSURES: BUILDING TRUST THROUGH TRANSPARENCY, http://www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf (Feb. 1, 2013).

*Seth is a guest author and a recent graduate of the Indiana University Maurer School of Law. He recently interned at the FCC, and his admission to the D.C. bar is pending. He has written previously about the potential impact of the Performance Rights Act on student radio stations.

Forward Article Back to Top


Maryland Attorney General Creates Internet Privacy Unit
By Karen Neuman

On January 28, 2013, Maryland Attorney General Doug Gansler announced the creation of an Internet Privacy Unit in the office of the Maryland Attorney General. The unit will pay particular attention to enforcing the federal Children’s Online Privacy Protection Act (COPPA),[1] which empowers states, in addition to the Federal Trade Commission, to enforce the COPPA rule. The unit will also monitor and examine privacy policies for deceptive representations and for general adherence to consumer protection law.

This development is consistent with Gansler’s public remarks that digital privacy, particularly mobile and children’s online privacy, is a top priority for his office. In addition, he has made “privacy in the digital age” a key initiative for the National Association of Attorneys General (NAAG). Gansler’s announcement follows the creation of a similar unit by California Attorney General Kamala Harris last year. Since then, Harris has, through that unit, initiated a number of actions to enforce California’s online privacy law.

Businesses that collect, use and retain personal information about Maryland residents should be familiar with applicable law, including COPPA and Maryland’s Personal Information Protection Act.2 In addition, these businesses should review their data collection and security practices and procedures, including privacy notices, and take appropriate measures to minimize the risk of an investigation or enforcement action.


[1] 15 U.S.C. § 6501 et seq.
[2] Md. Code Ann., Com. Law § 14-3504.

Forward Article Back to Top


Operator of Children’s Website Agrees to Recommended Privacy Improvements
By Karen Neuman

On January 17, 2013 the Children’s Advertising Review Unit (CARU) announced that SPIL Games, BV, the operator of a website directed to children and teenage girls, agreed to modify certain privacy practices in response to recommendations by CARU. CARU is the investigative arm of the advertising industry’s self-regulatory program administered by the Council of Better Business Bureaus and one of several FTC-approved safe harbor programs. CARU monitors online and mobile services to ensure compliance with its Guidelines.

According to CARU’s press release, children who were under 13 were able to create profiles and personalized avatars, view profiles created by others, play and rate games and make friends. They were also able to register for the site using social media tools that prohibit use by children who are under 13 – including Facebook and Twitter. According to CARU, the site allowed children to disclose personally identifiable information on member profile pages without first notifying parents and obtaining verifiable parental consent.

In response to CARU’s investigation, SPIL installed a session cookie to prevent “retries” during age screening, disabled all user-generated content on wall posts and comments, and changed the user names of children who are under 13 to predefined, “white-listed” or randomly generated user names. SPIL also agreed not to accept new registrations from children under 13. These children will still be able to play games on the site but will be unable to generate content. SPIL also disabled the feature that allowed log-in through social media and removed links to Twitter.

This matter serves as an important reminder that any operational changes made to child-directed websites or online services (or to other sites or services with actual knowledge that they collect information from children who are under 13) should be made only after evaluating how those changes will affect the operator’s overall COPPA compliance strategy. This is especially so in light of recent amendments to the COPPA rule that modify key definitions and corresponding obligations, and impose new requirements on certain third parties. In addition, operators that use age screening should have adequate measures in place to prevent “retries” by children who are under 13.
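
By way of illustration only, the following is a minimal Python sketch (using the Flask web framework) of the kind of session-cookie age gate described above. It is not SPIL’s actual implementation; the route, field and cookie names are hypothetical. Recording a failed age screen in the signed session cookie keeps a child from simply navigating back and entering a different birth year in the same browser session.

    from datetime import date
    from flask import Flask, request, session

    app = Flask(__name__)
    app.secret_key = "replace-with-a-real-secret"  # used to sign the session cookie

    @app.route("/age-check", methods=["POST"])
    def age_check():
        # If a prior attempt in this session failed, refuse to re-screen.
        if session.get("age_gate_failed"):
            return "Registration is not available.", 403
        try:
            birth_year = int(request.form["birth_year"])
        except (KeyError, ValueError):
            return "Please enter your year of birth.", 400
        # Simplified year-based check, for illustration only.
        if date.today().year - birth_year < 13:
            # Remember the failure in the session cookie so the child cannot
            # simply press "back" and retry with a different year.
            session["age_gate_failed"] = True
            return "Registration is not available.", 403
        return "Continue to registration.", 200

A session cookie only blocks retries within the same browser session, so operators may pair this measure with other neutral age-screening safeguards.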

Forward Article Back to Top


FTC Announces FCRA Settlement with Mobile App Developer
By Karen Neuman


The FTC announced in a January 13, 2013 blog post that it reached a settlement with Filiquarian Publishing, LLC (a mobile app developer), the company’s owner, Joshua Linsk, and Choice Level LLC, for violating the Fair Credit Reporting Act (FCRA). This is the first time an FCRA action has been brought against a mobile app. This action serves as an important “heads up” to owners and developers of apps that offer regulated products or services (e.g., background screening, children’s content, health and financial services), that they must be familiar with applicable privacy requirements.

The Complaint alleged that Filiquarian advertised that people who bought its app could perform a “quick criminal background check for convictions” in a number of states. Filiquarian said that its apps could access hundreds of thousands of criminal records, and be used to conduct searches on potential employees. Accordingly, Filiquarian, Linsk, and Choice Level were found to be acting as a “consumer reporting agency” under the FCRA and were required to comply with certain obligations, including implementing certain safeguards with respect to personal data. The FTC charged three key FCRA violations: 1) failure to maintain reasonable procedures to verify who their users are and that the information would be used for a permissible purpose; 2) failure to have procedures to ensure that the information they provided in consumer reports was accurate; and 3) failure to provide required notices to users and to those who furnished Filiquarian with information that was included in consumer reports.

The apps included disclaimers, which also appeared on the parties’ websites, stating that they were not FCRA compliant; that their products were not to be considered screening products for employment, insurance, and credit screening; and that anyone who used the reports for such purposes assumed sole responsibility for FCRA compliance. The FTC noted, however, that the disclaimers contradicted express representations in Filiquarian’s ads urging people to use the reports to screen potential employees. The FTC admonished that these types of disclaimers are not enough to absolve a company from FCRA liability.

The consent decree bars the Defendants from: 1) furnishing a consumer report to anyone they do not have reason to believe has a “permissible purpose” to use the report; 2) failing to take reasonable steps to ensure the maximum possible accuracy of the information conveyed in its reports; and 3) failing to provide users of its reports with information about their obligations under the FCRA.

Forward Article Back to Top


TCPA Claim against Caribbean Cruise Line Survives Motion to Dismiss
By Karen Neuman

On January 7, 2013, a federal court judge denied a motion to dismiss a claim brought under the Telephone Consumer Protection Act (TCPA) against Caribbean Cruise Line and Economic Strategy Group. The class action, Birchmeier v Caribbean Cruise Line and Economic Strategy Group, was filed in U.S. District Court for the Northern District of Illinois. Businesses that use third party service providers for marketing campaigns should ensure that the providers comply with applicable consumer protection laws, including those governing mobile message campaigns.

The Plaintiffs alleged that the defendants used an autodialer or prerecorded voice to make unsolicited marketing calls to their cell phones in violation of the TCPA. They further alleged that the defendants “acted under the guise of conducting political surveys but that this was a sham intended to get their foot in the door to sell ocean cruises to the calls’ recipients.”

The Court rejected each of the three arguments asserted by the defendants. First, the Court rejected a procedural argument that the Plaintiffs failed to distinguish the role of each defendant, noting that the “whole point” is that the Defendants acted “in concert” and it is difficult to tell “where one defendant stops and the other one starts.” Second, the Court rejected the defendants’ contention that TCPA liability attaches only to the party that placed the call, deriding the underlying logic as “absurd”: Congress would not have intended the TCPA to allow a “well heeled” marketer to shield itself from liability by hiring a third party to make unlawful calls. Third, the Court rejected as a “nonstarter” the Defendants’ argument that the Complaint is “non-actionable” under the TCPA because the calls were political surveys, which are exempted under the statute. The Court noted that while the TCPA exempts political calls made using artificial or prerecorded voices, the exemption does not apply to calls made by autodialers, and the Complaint also alleged calls made by autodialers.

In addition to denying the motion to dismiss the TCPA claims, the Court denied the Defendants’ motion to strike the class action allegations on grounds that a decision on whether the Plaintiffs can establish class certification would be premature.

Forward Article Back to Top


INTERNATIONAL DISPATCHES

EC Proposes Draft Cybersecurity Directive

On February 7, 2013, the European Commission (EC) and the High Representative of the Union for Foreign Affairs and Security Policy issued a proposed draft directive (Directive) to ensure common network and information security across the Internet and private information systems in the European Union (EU). The proposal is intended to unify the differing, voluntary cybersecurity approaches of the EU’s 27 member states by requiring them to create a cooperative mechanism for improving security and incident reporting. Operators of critical infrastructure, including energy, transportation and financial services providers, as well as “key providers” of information services such as e-commerce sites, social network platforms, search engines, cloud service providers and even app stores, would be required to adopt appropriate security measures and report “serious” incidents to national authorities. It is unclear what the interplay will be between the proposed security and incident reporting requirements and those proposed in the General Data Protection Regulation that we reported about here.

The proposed Directive is the latest in a series of actions intended to promote the overarching goals of establishing the EU as a secure and trustworthy digital environment and realizing the attendant economic benefits of doing so. Key components call for:

  • The adoption by EU member states of a Network Information Security (NIS) strategy that includes strategic objectives and “concrete” policy and regulatory measures to achieve and maintain a high level of network security.
    • The designation of national NIS authorities and a governance strategy to prevent and address NIS risks and incidents.
  • The creation of a cooperative, “early warning” network among national NIS authorities, the EC and other agencies to share information about security risks and cooperate in addressing those risks.
  • Operators of critical infrastructure, information service providers and public administrations to: 1) implement appropriate technical and organizational measures to manage risk to their networks and minimize any impact on the core services they provide, and 2) report incidents that have a significant impact on such core services to national NIS authorities.

The proposed Directive must be approved by the European Parliament and European Council before becoming law.

Forward Article Back to Top


Singapore DPA Seeks Public Comment on Proposed Data Protection Regulations

On February 5, 2013, the Personal Data Protection Commission of Singapore (PDPC) issued its first public consultation setting out, and seeking comment on, proposed data protection regulations under Singapore’s Personal Data Protection Act (PDPA). The PDPA was adopted by the Singapore Parliament in October 2012 and became effective in January 2013. The Act applies to all organizations in Singapore, except those in the public sector. The PDPA created the PDPC and empowered it to administer and enforce the PDPA, including imposing various remedial measures for noncompliance.

The Public Consultation focuses on three areas: 1) the form, manner and procedures for access to or correction of personal data, including the process for responses; 2) the requirements for transferring personal data out of Singapore; and 3) the classes of persons who may act for minors or other individuals who lack capacity to act, and the manner in and extent to which any rights and powers may be exercised by these individuals under the PDPA. Areas involving certain administrative procedures will likely be addressed in mid-2014.

Rights of Access & Correction. The Consultation sets out rights and responsibilities for individuals and organizations, respectively, for seeking access to personal data and the process and time frame for responding to such requests. It recommends that organizations respond to written access requests within 30 days; those that are unable to do so should, within 30 days, inform the requesting individual of the “reasonably soonest time” by which the request can be granted. Organizations are authorized to charge a reasonable fee for responding to access requests. The Consultation specifically seeks comments on the proposed manner in which an individual may make access or correction requests, and on how organizations are to respond to such requests.

Cross-Border Transfer of Personal Data. The Consultation proposes to permit the transfer of personal data out of Singapore generally only if done in a manner consistent with obligations under the PDPA and pursuant to a legally binding instrument that contains appropriate safeguards in the form of contractual clauses or binding corporate rules. The instrument must implement obligations involving data purpose, use, disclosure, accuracy, protection, retention and employee data handling policies. The Consultation specifically seeks comments on other means of ensuring the protection of personal data transferred out of Singapore and on the proposed requirements for the contractual clauses and binding corporate rules that would apply.

Individuals Who May Act for Others. The Consultation proposes regulations governing the classes of persons who may act for individuals who are unable to exercise rights or grant consent under the PDPA, including minors and deceased persons. The Consultation proposes that minors who are 14-18 years old be able to act on their own behalf if they understand the nature of the right or power conferred by the PDPA and the consequences of exercising it. Certain of a deceased person’s closest relatives may act on his or her behalf where the deceased has no personal representative. The Consultation specifically seeks comments on a number of areas involving who may act for others, including the extent to which minors should be able to exercise rights and powers conferred under the PDPA and the minimum age below which they should not be permitted to do so, as well as the proposed priority lists of relatives who may act on behalf of a deceased person.

Public comment is due by 5:00 PM, March 19, 2013.

Forward Article Back to Top


Canadian & Dutch Authorities Investigate WhatsApp for Privacy Violations; Issue Report

On January 28, 2013, the Office of the Privacy Commissioner of Canada and the Dutch Data Protection Authority (Dutch DPA) issued a joint press release announcing collaborative findings that California-based WhatsApp violated Dutch and Canadian privacy laws. The findings were released in separate reports that can be found here and here. This investigation comes at a time when the data collection, use and disclosure practices of mobile apps are being closely scrutinized by U.S. and international enforcement authorities. The Dutch DPA indicated it will continue to monitor WhatsApp and impose penalties if warranted. The DPA asserts that it is empowered to enforce Dutch law against WhatsApp because the app targets, and is used by, Dutch users, and because the company processes data in the Netherlands on the mobile devices of Dutch users.

The investigation focused on WhatsApp’s mobile messaging platform, which allows users to send and receive instant messages over the Internet across various mobile platforms. While WhatsApp was found to be in contravention of Canadian and Dutch privacy laws, the organization has taken steps to implement some recommendations in the reports.

The investigation revealed that WhatsApp violated certain internationally accepted privacy principles, mainly in relation to the retention, safeguarding, and disclosure of personal data. For example:

  • In order to facilitate contact between application users, WhatsApp relies on a user’s address book to populate subscribers’ WhatsApp contact lists. Once users consent to the use of their address book, all phone numbers from the mobile device are transmitted to WhatsApp to assist in the identification of other WhatsApp users. Rather than deleting the mobile numbers of non-users, WhatsApp retains those numbers (in hashed form). The report found that this practice contravenes Canadian and Dutch privacy law, which holds that information may only be retained for so long as it is required for the fulfillment of an identified purpose. Only iPhone users running iOS 6 on their devices have the option of adding contacts manually rather than having the mobile numbers in their address books uploaded to company servers automatically. According to the report, both WhatsApp users and non-users should be able to decide what contact information they want to share with the app.
  • Messages sent using WhatsApp’s messenger service were unencrypted, leaving them vulnerable to eavesdropping or interception, especially when sent through unprotected Wi-Fi networks. In September 2012, in partial response to the investigation, WhatsApp introduced encryption to its mobile messaging service.
  • WhatsApp was found to be generating passwords for message exchanges using device information that can be relatively easily exposed. This created the risk that a third party could send and receive messages in the name of users without their knowledge. According to the report, WhatsApp has strengthened its authentication process in the latest version of its app, using a more secure, randomly generated key for device-to-application message exchanges instead of deriving passwords from MAC (Media Access Control) or IMEI (International Mobile Station Equipment Identity) numbers, which uniquely identify each device on a network (see the sketch following this list).
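
As a purely hypothetical illustration of the authentication point (this is not WhatsApp’s actual code, and the hash-of-IMEI derivation is an assumption used only to show why identifier-derived credentials are weak), the following Python sketch contrasts a credential computed from a device identifier with a randomly generated key.

    import hashlib
    import secrets

    def password_from_device_id(imei: str) -> str:
        # Weak: the derivation is deterministic, so anyone who learns the
        # device identifier (IMEI or MAC address) can recompute the credential.
        return hashlib.md5(imei.encode()).hexdigest()

    def random_account_key() -> str:
        # Stronger: 256 bits of randomness, independent of any device attribute,
        # so it cannot be reproduced from exposed device information.
        return secrets.token_hex(32)

The reports describe WhatsApp moving from the first kind of approach to the second, i.e., to a more secure, randomly generated key.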

Forward Article Back to Top


UPDATES

California AG Issues Mobile App Privacy Guidelines

On January 10, 2013, California Attorney General Kamala Harris issued Privacy on the Go: Recommendations for the Mobile Ecosystem. This development is the latest in a series of actions undertaken by Harris as part of a broader initiative to protect mobile privacy. The guidelines are not regulations; however, they reflect Harris' interpretation of the California Online Privacy Protection Act (Act) and should be seen as forming the basis for future enforcement actions. As we reported here, Harris filed suit against Delta Airlines for failing to post a privacy policy in connection with its “Fly Delta” app in violation of the Act. Delta was one of 100 companies that received notices of noncompliance earlier in the year that granted them 30 days to post compliant privacy policies.

The Report sets out guidelines that are intended to promote strong privacy practices for developers, platform providers, mobile carriers and mobile ad networks, with the goal of providing “comprehensive” transparency about data collection, use, disclosure and retention practices. In order to address mobile device screen size, the Report recommends the use of special notifications, such as icons or pop-ups, that explain in context how personally identifiable information is collected, used and disclosed to third parties. Opt-in consent should be obtained before data is collected. The Report also recommends employing enhanced measures -- including “just in time” notice -- that alert users about, and give them control over, data collection practices that are unrelated to an app’s basic functionality or that involve collecting sensitive information; minimizing surprise about data uses that may be unexpected by consumers and enabling consumers to opt out of those uses; encrypting personal information for secure transmission; and attributing responsibility for ads that are displayed outside of an app. Platforms should make privacy policies accessible before a user downloads an app.

Forward Article Back to Top


NEWS & ANNOUNCEMENTS

Karen Neuman to Discuss Location Based Privacy at IAPP Global Summit

Karen will discuss privacy and location-based services during a breakout session at this year’s IAPP Global Privacy Summit, March 6-8 in Washington, D.C. The session, “Location, Location, Location: Risks and Rewards of LBS,” will be held Friday, March 8, from 8:15 am to 9:45 am. Karen and her co-panelists will focus on recent legal developments and trends involving the use of location-based technologies to serve advertising and other content over mobile devices. They will also offer an interactive exercise to share some practical suggestions for minimizing risk.

Forward Article Back to Top


Copyright © 2012 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036