PRIVACY & INFORMATION LAW UPDATE
Feature Article

Hotels and resorts (particularly those that market kids' club or similar activities); apparel and retail stores (particularly those with brand-specific websites or mobile apps); online social gaming and virtual worlds (including those on social media or mobile platforms); music and video sites or apps; social check-in and other geolocation-based services; social media platforms (including brand fan pages); entities that offer online contests or sweepstakes; and charitable giving sites are all destinations likely to attract today's device-tethered, technology-savvy children. Children visiting these destinations are likely to disclose personal information that triggers the COPPA Rule.

THE COPPA RULE.
The COPPA Rule, promulgated and enforced by the FTC, is intended to protect children from unauthorized contact by marketers or other adults. It currently prohibits operators of commercial websites and online services directed to children -- as well as general audience sites that attract children -- from collecting personal information from kids under the age of 13 without first seeking and obtaining verifiable parental consent. Parents must be given access to their children's personal data and the opportunity to have it deleted from the operator's system. The Rule's application to general audience sites is triggered when those sites knowingly collect personal information from children. A site is deemed to knowingly collect personal information if a child discloses that he or she is under 13 or the site otherwise learns that the child is under 13. Personal information includes a physical address or personal e-mail address, phone number, social security number or IM identifier that contains a child's e-mail address. It also can include gender or date of birth information if such information can be combined with other data to identify and contact a specific child.

Perhaps the most perplexing compliance challenges involve the Rule's prior verifiable parental consent and age verification requirements. Many of the Rule's parental consent mechanisms are either impractical given the real-time nature of online interactions or activities (e.g., notifying parents of intent to collect personal information and asking that they fax consent or call an 800 number staffed by trained professionals), or reflect outdated assumptions about how to guard against a child masquerading as a parent (e.g., permitting a completed credit card transaction to serve as verifiable parental consent). Many of the Rule's age screening mechanisms (for example, drop-down menus with dates of birth) are ineffective against kids who can outsmart this approach or who keep trying until they pass. Blocking repeated attempts or implementing age screening technologies can be cost-ineffective or otherwise impractical, particularly for small operators. (A minimal sketch of a neutral age screen that blocks repeated attempts appears at the end of this article.)

ENFORCEMENT.
The FTC has brought a number of high-profile enforcement actions against general audience site operators alleging noncompliance with the COPPA Rule. These actions have resulted in significant fines, remedial compliance obligations, and FTC oversight. In many instances the cost of complying with remedial measures can exceed the amount of civil penalties. For example, in 2009 the FTC settled charges with Iconix Brand Group (Iconix) for knowingly collecting, using or disclosing personal information from children under 13 without obtaining prior parental consent. Iconix operated a number of general audience apparel websites, including some that sold brands that appealed to children and teens. Visitors to the company's brand-specific websites were required to provide personal information, including full name, e-mail address, zip code, and in some cases mailing address, gender, date of birth and phone number, in order to receive brand updates, enter sweepstakes and contests, participate in interactive brand-awareness campaigns, post photos and share stories.
Under the settlement, Iconix was fined $250,000 and required to implement a number of remedial measures, including deleting all personal information obtained and stored in violation of COPPA, distributing the settlement order and the FTC's COPPA compliance materials to company employees, and complying with certain reporting and record-keeping obligations.

In 2008 the FTC settled similar charges against Sony BMG Music (Sony). At the time Sony operated over 1,000 artist and label websites, 196 of which collected personal information from children under 13. Like Iconix, Sony required visitors to disclose personal information and date of birth when registering to use the sites. Sony agreed to pay $1 million to settle charges that it violated COPPA by collecting, maintaining and disclosing personal information from thousands of under-13 children without their parents' consent. Sony was also required to delete all personal information collected in violation of the Rule and comply with certain employee training and record-keeping measures.

UPDATING THE COPPA RULE.
In March 2010 the FTC initiated a comprehensive review of the COPPA Rule to address privacy risks posed by children's use of mobile technology to access the Internet, including interactive gaming, social and other media. Proposed changes are expected to be announced this summer. Accordingly, the Rule's reach may soon expand, making it more difficult for businesses to engage young people and even adults online. Potential changes include mandating the adoption of age screening and verification technologies that adequately reflect changes in behavior and technology; expanding the statutory definition of Internet to address the adoption of mobile computing technologies by children; expanding the definition of personal information to include passive technologies such as mobile unique device identifiers and persistent identifiers such as IP addresses; and possibly raising the protected age above 13 to cover teenagers.

CONGRESSIONAL INITIATIVES TO PROTECT CHILDREN'S ONLINE PRIVACY.
Congress has also been paying close attention to privacy risks (as well as broader consumer protection risks) facing minors, holding hearings and considering measures intended to address those risks. On June 15, 2011, Senators Franken (D-MN) and Blumenthal (D-CT) introduced a location privacy bill containing a provision that would criminalize the knowing and intentional aggregation and sale of location data of children 10 years old and younger. In April 2011 the Chair of the Senate Commerce Committee, Senator Rockefeller (D-WV), indicated that Congress might consider amending COPPA to reflect children's use of mobile technology and social media. On May 13, 2011, Representatives Markey (D-Mass.) and Barton (R-Texas) released a discussion draft of the Do Not Track Kids Act of 2011. The bill would give parents the choice of withholding consent for tracking and targeting ads to children under 13. In February 2011 several Democratic lawmakers, including Representative Markey, sent a letter to the FTC calling for agency review of in-app purchases by children on Apple and Google/Android devices after a Washington Post article highlighted the ability of children to guess parental passwords and use them to incur substantial, unauthorized charges.
Meanwhile, technology continues to evolve at an ever-accelerating pace, posing challenges to policymakers who are trying to craft a privacy regulatory framework for children that can adapt to technology without erecting unintended obstacles to innovation.

CONCLUSION.
As industry awaits anticipated changes to COPPA, operators of general audience websites and mobile apps may already need to add COPPA to their compliance checklists. Operators should be familiar with COPPA's current requirements while anticipating likely changes, and should review their data collection, retention, sharing and use practices with an experienced professional to identify and address potential compliance issues.
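For illustration only, the sketch below (in Python) shows one way a general audience site might implement the kind of neutral age screen discussed above: ask for a full date of birth without hinting at a cutoff, and block repeated attempts from the same session so a child cannot simply retry until passing. The function names and the in-memory session store are hypothetical assumptions, not drawn from the COPPA Rule or FTC guidance, and an age screen alone does not establish compliance.

# Hypothetical sketch of a neutral age screen with retry blocking.
# Names and the session store are illustrative assumptions, not FTC guidance.
from datetime import date, timedelta

COPPA_AGE_THRESHOLD = 13
_screened_sessions = {}  # session_id -> True once a session has answered the screen

def years_old(birth_date, today=None):
    """Age in whole years as of 'today'."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def age_screen(session_id, birth_date):
    """Return 'allow', 'parental_consent_required', or 'blocked'.

    The screen is neutral: the form asks only for a date of birth and never
    reveals the cutoff. A session that has already answered is blocked from
    answering again, so a child cannot keep retrying until they pass.
    """
    if _screened_sessions.get(session_id):
        return "blocked"
    _screened_sessions[session_id] = True
    if years_old(birth_date) < COPPA_AGE_THRESHOLD:
        # Under 13: collect no personal information until verifiable
        # parental consent has been obtained.
        return "parental_consent_required"
    return "allow"

# Example usage
adult_dob = date(1990, 1, 1)
child_dob = date.today() - timedelta(days=8 * 365)  # roughly 8 years old
print(age_screen("session-a", adult_dob))   # -> allow
print(age_screen("session-b", child_dob))   # -> parental_consent_required
print(age_screen("session-b", adult_dob))   # -> blocked (repeat attempt)

A production system would also need to avoid encouraging retries (for example, by not displaying why access was denied) and to route under-13 users into one of the Rule's verifiable parental consent mechanisms before any personal information is collected.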
FTC Releases 10-Year Regulatory Review Calendar & Announces Changes to Regulatory Review Process

According to a statement on the FTC's website, the purpose of the review is to address changes in technology and the marketplace by placing each rule on a 10-year review calendar, during which public comment is sought on the following questions: 1) what is the economic impact of the rule; 2) is there a continuing need for the rule; 3) are there possible conflicts between the rule and state, local, or other federal laws or regulations; and 4) has the rule been affected by any technological, economic, or other industry changes. The calendar identifies each rule, the year it is slated for review and the rule's review status.

Privacy and consumer data items on the calendar that are currently under review include: 1) the Use of Prenotification Negative Option Plans Rule; 2) the Mail or Telephone Order Merchandise Rule; 3) the Children's Online Privacy Protection Act Rule; and 4) the Telephone Order Merchandise Rule. Privacy and consumer data rules slated for review starting this year and continuing through 2020 include: 1) the Standards for Safeguarding Customer Information Rule; 2) the Disposal of Consumer Report Information and Records Rule; 3) the Red Flags Rule; 4) the Privacy of Consumer Financial Information Rule; and 5) the Health Breach Notification Rule (for non-HIPAA covered entities).
The FTC also announced that it is examining whether changes are necessary to its regulatory review process. The agency is seeking public comment on various aspects of that process, including how often it should review its rules and guides, and what changes can be put in place to make the process more responsive to the needs of consumers and businesses.
Your business could be impacted by either of these proceedings. Reference to the FTC regulatory review calendar will give you a good indication of agency action relevant to your business, including comment periods for proposed rules or guides. Participation in these proceedings is an important opportunity to influence the regulatory outcome by educating regulators about how contemplated action could affect your industry. The regulatory review calendar is also a useful resource for alerting you to rules or guides you may have been unaware of but to which your business might be subject. Changes to the FTC's regulatory review process could also impact your business, and you may want to consider submitting comments that alert the agency to the possible impact of its regulatory review procedures on your business.
FTC Announces $1.8 Million Settlement for FCRA Violations

The complaint alleged that Teletrack created a marketing database of consumer information that the company collected through its credit reporting business. The information included lists of consumers who had applied for non-traditional credit products -- for example, payday and non-prime automobile loans. Teletrack sold this information to marketers and other third parties who in turn wanted to use it to target distressed consumers in need of alternative sources of credit. The FTC alleged that these marketing lists were credit reports subject to the FCRA because they contained information about consumers' creditworthiness. The FCRA makes it illegal to sell credit reports without a specific permissible purpose. Since marketing is not a permissible purpose under the Act, the FTC charged Teletrack with violating the Act.

In addition to the civil penalty, the settlement order requires Teletrack to give credit reports only to entities that Teletrack has reason to believe have a permissible FCRA purpose for obtaining them, or as otherwise allowed under the FCRA. The order also imposes certain reporting and record-keeping requirements on Teletrack to ensure its compliance with the terms of the settlement.

This case is another example of the proactive approach the FTC is taking to protect consumer privacy in a variety of contexts. The FTC has brought a number of high-profile enforcement actions that send a clear signal that the agency believes it has the tools it needs to protect privacy, even as Congress considers expanding the FTC's rulemaking and enforcement authority. This case can also be seen as a potential precursor to heightened FTC enforcement activity involving financial services products as the agency awaits the anticipated spate of related rulemakings by the newly created Consumer Financial Protection Bureau, whose statute vests enforcement authority in the FTC. Finally, this case should not be seen as limited solely to data brokers that sell consumer financial or credit information for commercial purposes; data brokers that sell this information for other uses -- notably employee screening and background checks -- are also subject to the FCRA. Accordingly, companies that sell consumer data for commercial and other purposes should be familiar with the restrictions that the FCRA places on the sale of consumer data to third parties.
Supreme Court Invalidates Vermont's Physician-Privacy Prescription Drug Marketing Law

As we noted previously, the case, Sorrell v. IMS Health, Inc., could have far-reaching implications for the manner and extent to which government may restrict the commercial use of non-public personal information. This case could also mark the beginning of a trend by businesses that acquire and use personal data to challenge government privacy regulation on First Amendment grounds.

Vermont's Prescription Confidentiality Law was enacted in 2007 to address the legislature's concerns with the marketing practice known as detailing, which allows marketers to use physician prescription information to determine which drugs are likely to appeal to doctors and how best to present a particular sales message. Detailing involves the purchase of physician prescription data by data mining companies from pharmacies (which are required by law to maintain records about both the prescribing physician and the patient). The data is then combined with information from other databases and sold to drug companies for use in brand-name prescription drug marketing campaigns directed at individual physicians. The data is also used to monitor and evaluate marketing campaigns by drug companies and individual detailers. While the data did not include patient names, it did include information that identified specific physicians. Privacy advocates contended that the information could easily be combined with other available data to identify individual patients and disclose their prescription drug histories.

Vermont's law attempted to address this practice by prohibiting any health insurer, self-insured employer, electronic transmission intermediary or pharmacy from selling or otherwise using prescriber-identifiable information for marketing or promoting a prescription drug without the doctor's consent. The law further prohibited pharmaceutical manufacturers and marketers from using prescriber-identifiable information for marketing or promoting a prescription drug without prior physician consent.

In reaching its decision the Court evaluated the law's constitutionality on First Amendment (as opposed to privacy) grounds. The Court first concluded that the law warranted heightened scrutiny because it imposed content- and speaker-based burdens on protected expression. Applying this standard, the Court noted that the law prohibited pharmaceutical companies from using prescription data for marketing purposes but permitted its acquisition and use for other types of speech by other speakers. Given the widespread availability and many permissible uses of the data, the Court concluded that the State's asserted interest in protecting physician confidentiality was undermined by the State's failure to narrowly tailor the statute, thereby preventing Vermont from justifying the burdens that the law imposed on protected expression. Rather than protecting privacy, as asserted by Vermont, the Court concluded that the law was intended to suppress a specific type of speech by specific speakers that the legislature looked upon with disfavor. Interestingly, the Court emphasized that had the legislature imposed a more comprehensive privacy regime, for example by restricting all disclosure of the data except in a few well-justified circumstances, it would have viewed the law through quite a different lens.

Although this case arose in the context of medical privacy, its outcome could provide useful guidance to legislators about how to craft laws intended to address broader privacy concerns that can withstand First Amendment scrutiny.
By the same token, the decision offers businesses that rely on personal data for marketing or targeted advertising a potentially new basis for challenging privacy legislation intended to curtail those practices.

See IMS Health Inc. v. Ayotte, 550 F.3d 42 (1st Cir. 2008), cert. denied, 129 S. Ct. 2864 (2009), and IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010).
Location Privacy Protection Act of 2011 Introduced in Senate

The bill attempts to address privacy concerns by closing purported loopholes in existing federal law (specifically the Electronic Communications Privacy Act and certain provisions of the Cable Act and Communications Act) by requiring covered companies and businesses to obtain express consent from users of smartphones, iPads and similar devices before collecting and sharing information about those users' location with third parties. The measure would also create criminal penalties for apps that knowingly disclose geolocation information while knowing and intending that domestic violence or stalking will occur as a result of the disclosure, and would criminalize the knowing and intentional aggregation and sale of location data of children 10 years old and younger. The bill also calls for certain law enforcement measures involving the study of, and training on, dating and domestic violence crimes involving location-based technology.

This bill is one of a number of privacy-related bills pending in Congress. Prior efforts to enact comprehensive privacy legislation have been unsuccessful. Deficit reduction, employment and defense matters continue to be a principal focus in Congress, and it is unclear whether discrete privacy measures like this one will gain traction and eventually become law. Nevertheless, a great deal of attention has been drawn to mobile privacy as a result of high-profile data breaches, lawsuits related to location information, and calls for action by privacy advocacy groups. Accordingly, mobile platforms, device manufacturers and app developers should monitor developments involving this bill and opportunities to shape the regulatory environment should it become law.
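As an illustration of the kind of express, opt-in consent the bill contemplates, the sketch below (Python) shows a server-side check that refuses to record or share a user's location unless an explicit consent record exists, with collection and third-party sharing treated as separate opt-ins. The data model and function names are hypothetical assumptions for illustration; the bill itself does not prescribe an implementation.

# Hypothetical sketch: collect or share location data only with express consent.
# The ConsentStore model and all names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ConsentStore:
    """Tracks which users have expressly opted in to location collection and sharing."""
    collection_opt_ins: set = field(default_factory=set)
    sharing_opt_ins: set = field(default_factory=set)

    def record_express_consent(self, user_id, allow_collection, allow_sharing):
        if allow_collection:
            self.collection_opt_ins.add(user_id)
        if allow_sharing:
            self.sharing_opt_ins.add(user_id)

class LocationService:
    def __init__(self, consent):
        self.consent = consent
        self._locations = {}  # user_id -> (lat, lon)

    def store_location(self, user_id, lat, lon):
        # No express consent on file: refuse to store the reading at all.
        if user_id not in self.consent.collection_opt_ins:
            raise PermissionError("express consent required before collecting location")
        self._locations[user_id] = (lat, lon)

    def share_with_third_party(self, user_id):
        # Sharing with third parties requires a separate, explicit opt-in.
        if user_id not in self.consent.sharing_opt_ins:
            raise PermissionError("express consent required before sharing location")
        return self._locations.get(user_id)

# Example usage
consent = ConsentStore()
service = LocationService(consent)
consent.record_express_consent("u1", allow_collection=True, allow_sharing=False)
service.store_location("u1", 38.9072, -77.0369)   # allowed: collection opt-in exists
try:
    service.share_with_third_party("u1")           # blocked: no sharing opt-in
except PermissionError as e:
    print(e)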
Payment Card Industry Security Council Issues Guidelines for Virtualized Environments

If your business retains payment card data in a cloud-based or virtualized environment, these guidelines will apply, and noncompliance could affect your ability to process payment card transactions. That said, the Council acknowledged that compliance may be difficult to achieve for some businesses.

Merchants or other businesses that engage cloud service providers to store credit card data should conduct an internal PCI DSS compliance evaluation as well as perform due diligence on cloud service providers to ensure PCI DSS compliance before engaging them. Merchant due diligence should include an assessment of the virtual environment to ensure that the service provider implements procedures recommended by the Council, including network and access controls, segmented authentication, encryption and logging, the goal of which is to quarantine each of the service provider's customer environments from the others. To facilitate a merchant's due diligence about a virtualized environment, the Guidelines impose an obligation on service providers to demonstrate rigorous evidence of adequate controls, including the results of the service provider's own PCI DSS evaluation. These disclosures should also assist merchants in negotiating for contractual language to further ensure compliance. In addition to conducting due diligence, merchants or other businesses that use cloud-based services should:
One interesting question is the interplay, if any, between the virtualization guidelines and the Payment Application Data Security Standard (PA-DSS). This standard applies to mobile transactions and provides standards for developing applications that store, process or transmit cardholder data. Ten days after issuing the virtualization guidelines, the Council issued clarification about what types of mobile application payments will be subject to the PA-DSS as well as eligibility for PA-DSS validation. However, not all mobile application payment transactions are eligible for PA-DSS validation. According to a statement on the Council's website, the Council focused on identifying risks associated with validating mobile payments under the PA-DSS standard. A major risk factor involved the ability of an application's environment to support PCI DSS compliance. With the advent of in-app transactions, digital currency and mobile wallets, app developers should be familiar with both standards and understand any compliance obligations that could be imposed by each.

The Council's virtualization guidelines can be seen as a useful tool for merchants and service providers seeking PCI DSS compliance. Well-publicized breaches of cloud-based environments demonstrate the timely nature of these guidelines. The extent of their application to emerging technologies remains to be seen.
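For illustration only, the sketch below (Python, using the third-party cryptography package) shows one control of the kind the guidelines describe: encrypting cardholder data before it is written to a shared, virtualized store. It is a simplified, assumption-laden example, not a PCI DSS control set, and key management in particular would require far more rigor in practice.

# Hypothetical sketch: encrypt cardholder data before it reaches shared storage.
# Simplified for illustration; real PCI DSS compliance also requires key management,
# access controls, logging, segmentation and a formal assessment.
from cryptography.fernet import Fernet  # third-party package: cryptography

class CardholderDataVault:
    def __init__(self, key):
        self._fernet = Fernet(key)
        self._store = {}  # token -> ciphertext; stands in for a cloud/virtualized store

    def put(self, token, primary_account_number):
        # Only ciphertext ever touches the shared environment.
        self._store[token] = self._fernet.encrypt(primary_account_number.encode())

    def get(self, token):
        return self._fernet.decrypt(self._store[token]).decode()

# Example usage (in practice the key would live in a separate key-management system)
key = Fernet.generate_key()
vault = CardholderDataVault(key)
vault.put("order-1001", "4111111111111111")
print(vault.get("order-1001"))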
Operators of Online Virtual Worlds Agree to $3 Million Settlement in FTC COPPA Enforcement Action

This settlement signals that FTC COPPA enforcement is not being put on hold pending the outcome of proceedings initiated in 2010 to consider proposed changes to the COPPA Rule. (The changes attempt to address the impact on children's privacy of the widespread adoption of mobile communications technologies over which children are able to access websites, including virtual worlds, and download mobile applications.)

The FTC charged that Playdom, Inc. and one of its executives operated online virtual worlds -- including 2 Moons, 9 Dragons and My Diva Doll -- where users could access social games and engage in numerous activities. The violations occurred between 2006 and 2010, including after Playdom's acquisition of the sites' developer studio in May 2010. At least one of the sites was directed to children; the others were general audience websites that attracted hundreds of thousands of children. The complaint alleged that the defendants violated the COPPA Rule because they: 1) collected children's ages and email addresses at registration, and then enabled children to publicly post personal information -- including their full names, email addresses, instant messenger IDs and location information -- on personal profile pages or online community forums; and 2) failed to provide proper parental notice and obtain prior verifiable parental consent. The FTC further alleged that the defendants violated the Federal Trade Commission Act because Playdom's privacy policy misrepresented that the company prohibited children under 13 from posting personal information online.

Operators of both children's and general audience websites should be familiar with their information collection, sharing, retention and data security practices, and ensure that those practices are accurately reflected in the sites' policies. Policies should be reviewed by an experienced professional and revised to reflect changes brought about by the adoption of new products, services, technologies or platforms. In addition, this case appears to hold entities that acquire or otherwise take ownership of a commercial website liable for COPPA violations that occurred prior to a merger or acquisition. Accordingly, COPPA Rule compliance should be incorporated into the due diligence analysis undertaken in connection with any change in ownership of a commercial website.
CA PUC Smart Grid Privacy Decision

In addition to proposing a framework for protecting privacy and addressing certain jurisdictional issues, the decision appears to set the stage for real-time pricing by requiring covered utilities to provide customers with access to their smart meter data, as well as cost, usage, pricing and bill forecast information, and notification when a rate tier is exceeded. The CPUC's decision comes at a time when smart grid technologies are being integrated with legacy utility systems to enable consumers to monitor and control energy consumption and to help utilities forecast use and manage load. The proposed framework takes into account a smart grid ecosystem consisting of numerous relationships and data uses. These relationships and data uses are reflected in a number of key terms that are defined as follows:
The proposed framework also incorporates widely recognized Fair Information Practice (FIP) principles of Notice/Transparency/Consent, Purpose Specification, Individual Access & Control, Data Minimization, Use & Disclosure Limitation, and Data Integrity/Security. Those principles and corresponding rules are generally summarized as follows:

Use and Disclosure Limitations. Covered utilities may collect and use covered information for primary purposes without customer consent. Subject to certain exceptions, all third parties must have prior customer consent even for primary purposes. Those exceptions include:
Use of covered information for secondary purposes requires prior express written customer consent, which, if granted, expires after two years.

Transparency. Covered entities would be required to provide customers with clear notice of the collection, use, retention and disclosure of all categories of covered information. Upon request, utilities would be required to inform the CPUC with whom they are sharing covered data.

Access and Control. Utility customers would be able to access their covered information at least at the same level of detail that the covered entity provides to third parties, and would be able to amend any inaccuracies.

Data Minimization. Covered entities would be permitted to collect only as much covered information as is reasonably necessary, retain it only as long as reasonably necessary, and disclose to third parties only as much as is reasonably necessary.

Data Integrity/Security. Covered entities would be required to ensure the reasonable accuracy of covered information that they collect, use and disclose. They would also be required to create and implement reasonable administrative, technical and physical safeguards to protect covered information.

The rules also establish a breach notification protocol and impose certain privacy and data security accountability and audit reporting requirements on covered entities. A subsequent phase of the proceeding will address extension of the rules to gas companies, community choice aggregators, and non-investor-owned utilities. The PUC was accepting comments on the proposed decision until May 26, 2011.
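As a small illustration of the two-year consent expiration described above, the sketch below (Python) checks whether a customer's written consent for a secondary use is still in effect. The consent-record structure is a hypothetical assumption; the proposed decision does not specify how consent should be stored or checked.

# Hypothetical sketch: two-year expiration of customer consent for secondary uses.
# The consent-record structure and names are illustrative assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

CONSENT_VALIDITY = timedelta(days=2 * 365)  # "expires after two years" (approximate)

@dataclass
class SecondaryUseConsent:
    customer_id: str
    granted_on: date          # date the express written consent was given
    revoked: bool = False

def consent_is_effective(consent, today=None):
    """True only if consent was granted, not revoked, and is under two years old."""
    today = today or date.today()
    return (not consent.revoked) and (today - consent.granted_on) <= CONSENT_VALIDITY

# Example usage
c = SecondaryUseConsent("cust-42", granted_on=date(2011, 7, 1))
print(consent_is_effective(c, today=date(2012, 6, 30)))  # True: within two years
print(consent_is_effective(c, today=date(2013, 8, 1)))   # False: consent has expired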
Copyright © 2010 St. Ledger-Roty & Olson, LLP.