Proud sponsor of 1410 Q Street: DC's Innovation Hot Spot!

PRIVACY & INFORMATION LAW UPDATE
Living in the Future: Baking Privacy into Product Development & Investor Due Diligence

Many founders recognize the importance of privacy concerns, but believe that their good intentions and ethical values alone will protect their users' privacy and inoculate them against a lawsuit or enforcement action. Founders tend to focus on needs that are perceived as more pressing, and perhaps employ interim measures to address privacy, such as adopting privacy policies from other websites. But there is no such thing as a one-size-fits-all approach to privacy. Many investors, for their part, tend to focus on evaluating traditional factors for an investment -- including the nature of the venture and market, management's expertise, and the time frame for profitability -- without recognizing that privacy is integral to the analysis. Unfortunately, failure to account for privacy considerations at the outset can shutter a promising new venture or convert an investment opportunity that initially seemed appealing into a costly one.

Addressing privacy at the product development phase can not only help minimize costs, including legal costs and potential damage to a brand, but can also significantly assist a new venture in attracting funding (and ultimately users) by differentiating itself from competitors. In January 2011, the Wall Street Journal profiled a search engine that began marketing itself on the basis of its privacy practices, including a promise not to store any personal information or send search data to other sites. (Whether the venture adheres to those promises or changes its practices is a separate matter.) By the same token, incorporating a thorough understanding of privacy laws and trends at the due diligence phase can aid investors in differentiating among the risks of potential investments. The Washington Post recently noted that privacy is now a line item in business plans, and quoted an investor with a large venture capital fund emphasizing that privacy policies have become integral to his decisions about new technology investments.

One thing is certain: an approach that relies solely on a theoretical commitment to privacy ideals (or an approach that assumes privacy is someone else's department) should be considered high risk. Recent FTC enforcement actions against established technology companies demonstrate that even enterprises with a deep understanding of privacy concerns are at risk for a broad range of consequences -- including substantial legal costs, damage to brand, and extensive, long-term government oversight of operations -- for failing to address privacy at the product development stage or to align data collection, retention and use practices with privacy policies.
For example, in March 2011 the FTC settled an action against Google that arose from last year's launch of Google's social network, Buzz. The Complaint1 alleged that Gmail users were automatically enrolled in Buzz without an adequate opportunity to give informed consent, despite privacy policy promises that users could control access to their information. Even though they were initially given the choice of viewing Buzz before going to their Gmail inboxes, Gmail users were not informed that doing so would automatically make their contacts visible to other Buzz users; certain Buzz features were activated anyway for users who chose this option. Moreover, Gmail users were connected to Buzz users on the basis of the number of emails exchanged between them -- information that was made public by default. Under the settlement, Google is required to adopt a "privacy by design" approach to address privacy risk at the research and development stage for new products and services, and to manage risk for existing offerings. Significantly, the settlement's terms remain in effect for 20 years.

Online businesses (including mobile applications) also face potential civil lawsuits if they fail to recognize and sufficiently address privacy risks. For example, in late 2010, a class action2 was filed in the Northern District of California against Apple and a number of app developers. The Complaint alleges that apps on Apple's iPhone and iPad collected personal information from users and transmitted that information to third-party advertising networks. The personal data claimed to have been collected included the device's unique identifier (the UDID), as well as age, gender, and location. The plaintiffs allege that these practices violate Apple's published policy that prohibits apps from transmitting personal data without the user's consent.
Although rare, the threat of criminal proceedings surfaced earlier this year when reports indicated that a number of entities, including Pandora, were served with federal Grand Jury subpoenas in connection with an investigation into the information sharing practices of mobile apps that run on the Apple and Android platforms. (Pandora is not currently the target of the investigation.) Defending against enforcement or other legal proceedings may be considered a cost of doing business for established entities, but for an aspiring start-up, such costs could be prohibitive.

These developments leave no doubt that both innovators and investors can no longer afford to kick the can down the road when it comes to privacy. Instead, it is essential that each fully understand the current privacy legal and regulatory environment and incorporate that understanding early in the development or investment process. The earlier in the product development process that a new venture addresses privacy, the more likely it will be to attract funding at early or expansion stages. On the other hand, ventures that are forced to address a privacy issue that was initially deferred or overlooked may be less attractive to investors seeking late-round funding opportunities with less uncertainty. Likewise, investors who incorporate privacy into their due diligence processes will be more likely to avoid investments fraught with uncertainty and risk.

1 In the Matter of Google, Inc., No. 102 3136 (March 30, 2011).
Privacy Legislation Unveiled in Congress

Senate Bill. On April 12, 2011, Senators Kerry (D-MA) and McCain (R-AZ) introduced the Commercial Privacy Bill of Rights Act of 2011. The bill moves the U.S. closer to the European notice-and-consent approach to privacy protection but falls short of treating privacy as a human right. Key provisions would:
House Bill. The Consumer Privacy Protection Act of 2011, introduced on April 13, 2011 by Representative Cliff Stearns (R-FL), is similar to the Kerry-McCain measure but goes further by recognizing that the combination of both off- and online data also poses privacy risks. Like the Kerry-McCain bill, it applies to "covered entities." The bill's definitions are similar to those in the Kerry-McCain bill but broaden the definition of PII to include birth date and electronic address (including IP address) if combined with enumerated data to identify an individual. Neither measure provides for a private right of action, giving the FTC and state attorneys general the sole right to seek redress for violations of the law, thus mirroring the framework embodied in other statutes, such as the Children's Online Privacy Protection Act2. Key provisions would:
Both measures are silent on "do not track" (including the technology-based approach that is currently being promoted by the FTC). Such a mechanism could, however, be implemented by the FTC under rulemaking authority provided for in the Kerry-McCain bill. At a press conference introducing the bill, Senator Kerry indicated that a do-not-track mechanism seemed inconsistent with efforts to get industry and consumer support for the measure and, in any event, is unnecessary. At the same time, public remarks by FTC Chairman Jon Leibowitz and other senior agency staff strongly suggest that, with appropriate authority, it is a matter of when, not if, a do-not-track mechanism will be adopted.

It remains to be seen whether these bills will be made a priority given other pressing matters pending in Congress. Nevertheless, businesses should recognize the strong likelihood that, if either measure is enacted, they will be confronted with new regulations that could add significant compliance costs to operations. Accordingly, action on these bills should be closely monitored, and opportunities to affect the legislative outcome and regulatory environment should be identified.

1 Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers, Preliminary FTC Staff Report, December 2, 2010.
Federal Court Allows Data Breach Class Action against Social Media Application Provider to Proceed Without Damages Allegations

The action, filed in U.S. District Court for the Northern District of California, arose from a December 2009 data breach that exposed unencrypted user data, including social media login credentials, of 32 million RockYou users. The breach was caused by a security problem with RockYou's SQL database known as an SQL injection flaw. The flaw enabled hackers to introduce malicious code into RockYou's network to access its users' webmail accounts and social media login credentials.

RockYou's products include applications that enable users to share photos, post text on a friend's social media page, or engage in online social gaming. Users sign up to use these applications on rockyou.com. During the sign-up process, users are asked to provide a valid e-mail address and password that RockYou stores in its database. Users may also be asked to provide a user name or password to access a particular social network. The named plaintiff signed up for a RockYou photo-sharing application and provided his e-mail address and password.

The Complaint alleges that RockYou promised through its online privacy policy that it "uses commercially reasonable, physical, managerial and technical safeguards to preserve the integrity and security of your personal information." According to the plaintiff, despite this promise and widespread industry knowledge of SQL injection flaws, RockYou failed to implement commercially reasonable methods, such as hashing, salting or other common data protection measures, to prevent the data breach that led to the lawsuit. Instead, RockYou allegedly stored its users' personal information in clear or plain text format. Following a warning from RockYou's database security company about the SQL injection flaw, RockYou brought its site down until a security patch was in place. (The Complaint alleges that RockYou waited a day to act.)
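The two technical failings alleged here -- the SQL injection flaw and the storage of credentials in plain text rather than as salted hashes -- can be illustrated in a few lines. The following Python sketch is purely illustrative (the table schema and function names are hypothetical, not RockYou's actual code): the parameterized queries bind user input rather than concatenating it into SQL, which closes the injection hole, and PBKDF2 with a per-user random salt is one common example of the salted-hashing measures the Complaint describes.

```python
import hashlib
import hmac
import os
import sqlite3

ITERATIONS = 100_000  # PBKDF2 work factor; makes brute-forcing stolen hashes costly

def store_user(conn, email, password):
    """Store a user with a salted password hash -- never in plain text."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Parameterized query: input is bound, never concatenated into the SQL string.
    conn.execute(
        "INSERT INTO users (email, salt, digest) VALUES (?, ?, ?)",
        (email, salt, digest),
    )

def verify_user(conn, email, password):
    """Recompute the salted hash and compare in constant time."""
    row = conn.execute(
        "SELECT salt, digest FROM users WHERE email = ?", (email,)
    ).fetchone()
    if row is None:
        return False
    salt, digest = row
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

Even if an attacker exfiltrated this table, the stored digests could not be replayed as passwords against users' webmail or social media accounts, which is precisely the harm alleged in the Complaint.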
RockYou subsequently acknowledged that the database had not been maintained in a manner consistent with standard industry security protocols. Although the court dismissed five causes of action, it allowed the lawsuit to proceed based on the plaintiff's claims of breach of contract, breach of implied contract, negligence and negligence per se. In allowing those claims to go forward, the court noted the paucity of controlling authority about the legal sufficiency of the plaintiff's damages theory. Specifically, the court observed that the context in which the plaintiff's theory arises -- i.e., the unauthorized disclosure of personal information via the Internet -- is itself relatively new, and therefore more likely to raise issues of law not yet settled in the courts. "For that reason . . . the court finds plaintiff's allegations of harm sufficient at this stage to allege a generalized injury in fact." The court's observation is similar to those made recently by other courts that have applied existing laws to new or emerging communications technologies, and is another indication that judges continue to seek to better understand these technologies and the behaviors and uses associated with them.

The court's ruling gives the plaintiff a green light to proceed against RockYou under a breach of contract theory for failing to adhere to its privacy policy and its promises to take adequate steps to secure user data. Importantly, however, the court indicated that it will dismiss the claims for lack of standing if it becomes apparent through discovery that there is no legal basis upon which the plaintiff can demonstrate tangible harm arising from the unauthorized disclosure of personal information. The ruling in this case was issued in the context of a motion to dismiss and therefore is not a decision on the merits of the plaintiff's claims. Nonetheless, it is potentially significant for online businesses that retain user data, even if the databases are hosted by third parties.
In the event of a breach, businesses can expect to incur substantial legal fees to defend actions at least through the discovery phase of a case. At a minimum, businesses should be familiar with the latest industry standards for protecting user data, including the latest forms of encryption or other data protection methods. In addition, prompt action should be taken if a business is notified of a security problem by its data hosting vendor.

1 Claridge v. RockYou, Inc., No. 4:09-cv-06032-PJH (N.D. Cal. April 11, 2011).
U.S. Securities and Exchange Commission Imposes First Fine Under Privacy Rule

The SEC alleged that GunnAllen's President authorized the National Sales Manager to transfer customers' information from more than 1,600 accounts to his new employer without giving the customers reasonable notice to opt out. Instead, customers were informed after the fact, in violation of the rule. Names, addresses, account numbers and the values of customers' assets were downloaded to a thumb drive and transferred to the Sales Manager's new employer following his resignation from GunnAllen. The transfer was performed as GunnAllen was winding down its affairs.

The SEC also alleged that the firm's chief compliance officer failed to implement adequate privacy and data security measures to protect customer information despite a number of earlier serious security breaches that occurred over a four-year period. Those earlier breaches included the theft of three laptops belonging to the firm's registered representatives and access to the firm's e-mail accounts by a terminated employee using stolen login credentials.

The SEC found that GunnAllen had maintained written procedures to protect customer information. However, the agency found that the procedures were inadequate: they failed to instruct supervisors and company representatives about how to respond to a breach or comply with the Safeguards Rule. Moreover, they were not revised after the incident involving the terminated employee to address similar scenarios in the future. Financial services firms should familiarize themselves with Regulation S-P and the orders in this matter.
Firms should also undertake a thorough review of their data security policies and procedures to 1) ensure that adequate measures are in place to prevent a departing employee or officer from taking customer information to a new employer in violation of the rule's notice and opt-out requirements; and 2) assess whether their current procedures for responding to actual or possible breaches are in line with the rule and this decision, and then revise them as warranted. The three separate Cease and Desist Orders can be found here, here and here.
U.S. Department of Education Proposes Changes to Family Educational Rights and Privacy Act Rule

If finalized, the immediate result will be a move by public schools to strengthen privacy and data collection practices and policies, and to align them more closely with approaches for protecting personal privacy in other sectors. Third parties that contract with public schools and universities to provide educational, research or hosting services that require access to and use of student information could also be affected. These entities should undertake a review of their data collection, sharing and security practices and policies now, in anticipation of new obligations under the revised rule. The proposed changes include:
The Department of Education is accepting public comment on the NPRM until May 23, 2011. The rule is expected to be finalized later this year.

1 20 U.S.C. § 1232g; 34 CFR Part 99.
FTC & Google Reach Settlement Involving Buzz Social Network

The FTC had charged Google with using deceptive tactics that violated its own privacy policy promises when it launched Buzz last year. It also alleged that Google made certain misrepresentations that led users to believe that the handling of their personal information from the EU was in compliance with the U.S.-EU Safe Harbor framework -- specifically, that Google failed to give consumers required notice and choice before using information for purposes other than that for which it was collected.

The settlement is notable for its scope, and for its use as a vehicle through which the FTC seems to be implementing key aspects of the framework for protecting consumer privacy that was previewed in its 2010 privacy report2. The settlement not only bars Google from misrepresenting the privacy or confidentiality of its users' information; it also requires Google to get affirmative, opt-in consent for new or additional uses and sharing of personal information that were not disclosed when the information was first collected. Moreover, the settlement requires Google to implement a comprehensive privacy program that adopts a "privacy by design" approach to address privacy risk at the research and development stage for new products and services, and manages risk for existing offerings. The privacy program will be subject to regular, independently conducted privacy audits that are subject to FTC review. Google is further barred from misrepresenting its compliance with the U.S.-EU Safe Harbor program.

The Complaint3 alleged that Gmail users were automatically enrolled in Buzz (which was launched as an adjunct to Gmail) without an adequate opportunity to give informed consent, despite privacy policy promises that users could control access to their information. Even though they were initially given the choice of viewing Buzz before going to their Gmail inboxes, Gmail users were not informed that doing so would automatically make their contacts visible to other Buzz users; certain Buzz features were activated anyway for users who chose this option. Moreover, Gmail users were connected to Buzz users on the basis of the number of emails exchanged between them -- information that was made public by default. Google was alleged to have engaged in a deceptive act or practice because: 1) Gmail users' information was used for purposes other than those for which it was collected (i.e., social networking instead of e-mail) without obtaining prior consent as promised in the privacy policy, and 2) Google failed to disclose the full extent of the methods by which Buzz connections were made. Following the ensuing uproar, Google moved to offer improved privacy tools. Nevertheless, the FTC initiated the investigation that led to the Settlement.

This is the first time the FTC has required the target of this type of action to create and implement a comprehensive privacy program of this scope and nature. The requirement is not limited to Buzz, but applies to all of Google's products and services; oversight will be a substantial job that could require significant resources. The settlement also demonstrates the reach of the FTC's enforcement power in the general absence of rulemaking authority, lending support for arguments in other arenas that privacy legislation is not warranted because the agency has shown that it will use its enforcement authority to protect privacy. The settlement is not final. The FTC is accepting public comment through May 2, 2011.

1 Agreement Containing Consent Order, In the Matter of Chitika, Inc., File No. 1023087 (March 15, 2011); Stipulated Final Order for Permanent Injunction and Other Equitable Relief, In the Matter of Twitter, Inc., No. 092 3093 (March 11, 2011).
Federal Court Rules CAN-SPAM Applies to Certain Social Media Advertising Communications

Facebook alleged that MaxBounty: 1) engaged in deceptive and misleading marketing practices by inducing Facebook users to send commercial electronic messages to their Facebook friends, and 2) misled its affiliates by implying that various advertising campaigns, including the creation of fake Facebook pages, were approved by Facebook. Instead, MaxBounty advises and assists its affiliates to create the pages, which are, in fact, advertisements. These pages claim that if a Facebook user completes a 3-step registration process, he or she will receive an iPad, iPhone, or other similarly high-end product. The registration process requires a Facebook user to 1) become a "fan" of the page on Facebook, 2) invite all of their friends to become a fan -- users were urged to copy and paste messages that automated the invitations sent to their friends' Facebook walls, newsfeeds, and inboxes -- and 3) give personal information to the advertised company. Users who completed the process did not receive the promised gift but were instead directed to a third-party website (via an auto-redirect from a MaxBounty-registered domain) to complete additional steps. These steps included signing up for sponsor offers such as paid magazine subscriptions, movies and music, and disclosing personal information. MaxBounty was paid by the third-party advertiser based on the number of users that completed the first 3 steps.

The Court rejected MaxBounty's argument that the communications were not e-mail messages within the meaning of the Act. The Court cited two cases3 in the Central District of California involving another social network to construe the term "electronic mail messages." Those courts broadly interpreted the term to effectuate congressional intent to mitigate the number of commercial communications that overburden the Internet's infrastructure.
Thus, the courts concluded that phishing messages delivered to MySpace.com inboxes (as opposed to traditional e-mail domain boxes) were e-mail messages within the meaning of the Act. The Court further cited the MySpace courts' reasoning that Congress was aware of various forms of electronic communications when it drafted the Act and that the statute's plain language includes "alternative forms [of e-mail]" while recognizing that the most commonly used form of an electronic address was the traditional e-mail with a local part and domain part.

The Court next considered whether the Facebook pages were sent to a "unique electronic mail address," or a "destination [] to which an electronic mail message can be sent []." The Court found it significant that under MaxBounty's scheme, a user is instructed to effect transmission of Facebook pages to all of his or her Facebook friends. The pages are transmitted to unique destinations, including the wall, newsfeed or home page of the user's friends, the Facebook inbox of the user's friends, and users' external e-mail addresses. Finally, the Court reasoned that Facebook's routing of these communications implicates the volume and traffic infrastructure issues that CAN-SPAM seeks to address. Accordingly, the Court concluded that a determination that the communications are electronic mail messages within the meaning of the Act is consistent with the Act's stated purpose, and that Facebook's CAN-SPAM Act claim is sufficiently pled for purposes of surviving a motion to dismiss.

This action seems to be an effort by Facebook to protect its brand by limiting the ability of commercial entities to engage in deceptive practices in a manner that could confuse users about who is responsible for sending unsolicited commercial communications through Facebook.
In light of this ruling, a message promoting a product sent by a Facebook user to the user's friends' newsfeeds or walls (for example, when a user "likes" a product or the product's fan page) could theoretically be construed as a commercial electronic message subject to CAN-SPAM's requirements. However, such an outcome seems unlikely given the purpose of Facebook's CAN-SPAM Act claim (apart from the fact that "liking" a product is a Facebook-sanctioned activity). Moreover, a court would be unlikely to sustain a CAN-SPAM action like this one against the user because of the Act's provisions regarding 1) deceptive practices and 2) standing.

The Federal Trade Commission's guidance on CAN-SPAM indicates that transmission of mass commercial e-mail messages is not necessarily a violation of the Act, so long as certain measures are taken to avoid conveying deceptive or misleading information.4 These include avoiding misleading consumers as to the origin of the message, clearly identifying the message as an ad, and honoring opt-out requests promptly. In the MaxBounty case, Facebook alleged that MaxBounty not only sent a high volume of messages through Facebook's network, but that the messages and advertisements were misleading. Facebook Pages has explicit terms of use that prohibit deceptive activities by advertisers and that track many of CAN-SPAM's requirements.5

It is unlikely that a Facebook user could assert a viable CAN-SPAM claim against Facebook (even if Facebook Pages lacked such explicit terms of use) because of standing. Under CAN-SPAM, a private right of action is only available to "Internet Access Services" that have been adversely affected by a violation of the Act. Individual users cannot bring a CAN-SPAM action (although they can register a complaint with the Federal Trade Commission).
The Ninth Circuit has addressed the scope of an "Internet Access Service" and ruled that it encompasses more than just ISPs, including, for example, Facebook and MySpace, but excludes small website operators with only a nominal role in providing Internet-related services.6 To have standing to bring a CAN-SPAM action, the complainant must also have suffered a harm contemplated under the Act, which the Court described in terms of "bandwidth, hardware, Internet connectivity, network integrity, [and] overhead costs." So, a Facebook user would not have standing to file suit under the Act, and, so long as Facebook sanctions the advertiser's activity, would be unlikely to be able to file a CAN-SPAM action against an advertiser.

1 Ari Moskowitz is a third-year law student at George Washington University and a Law Clerk at St. Ledger-Roty & Olson LLP. He previously interned at the NTIA's Internet Policy Task Force, where he worked on the Department of Commerce Privacy Green Paper that was released in December 2010.
Oklahoma House of Representatives Passes Smart Grid Data Privacy Measure

The measure follows the release of the U.S. Department of Energy's 2010 Smart Grid Data Access and Privacy Report,2 in which the agency urged states to carefully consider and address privacy issues, including the conditions under which consumers can authorize third-party access to CIEUD. The Report also called on states to enact laws or regulations to safeguard CIEUD and establish mechanisms whereby consumers can access their data or consent to its disclosure to third parties.

The Oklahoma House of Representatives appears to have adopted the privacy regulatory approach that has been implemented for other sectors (e.g., health care, financial services and telecommunications) that collect highly granular, sensitive consumer data that can be used for targeted marketing, profiling, surveillance or other unwanted purposes. Under this regulatory approach, certain customer data may be used for specified business purposes, and certain disclosures may be made to affiliated third-party service providers for limited uses, without requiring prior customer consent. Thus, under the Oklahoma Act, utilities would be permitted to use CIEUD without obtaining prior consent for such business purposes as providing services, billing, infrastructure support, developing, enhancing, marketing or providing energy-related products or services, and promoting public policy objectives, including energy efficiency. A utility would also be able to disclose CIEUD to affiliates and third parties that assist in providing services and implementing business objectives without obtaining prior customer consent. The affiliate or third party would be required to agree in writing to use the data only for authorized purposes and to safeguard the data's confidentiality. The law is silent on the conditions for disclosure of CIEUD for other uses, including research and development. Examples of R&D uses include addressing technical shortcomings involving the Smart Grid system's reliability, security architecture, or operational and scalability needs.

Interestingly, "customer" is defined under the Act to include not only individuals, but corporate and other legal entities, including residential, commercial or industrial entities that receive service from an electric utility. In addition, the Oklahoma House of Representatives seems to have resolved the current debate over customer usage data ownership in favor of the utility, while imposing a duty to provide customers with reasonable access to usage data. Usage data includes information relating to 1) the amount of electricity consumed at a residence or customer premises; and 2) the characteristics of that consumption. Characteristics of consumption can include the date and time of consumption and data about appliances that use electricity.

The Act governs access to and disclosure of usage data in identifiable and aggregate form. Identifiable data is information that can identify a customer, including information that is uniquely associated with a customer, such as name, social security or taxpayer ID number, street address, telephone number, electric utility account number, meter number or financial account information. Identifiable data is not limited to individuals but includes business or legal entities that are electric utility customers. Aggregate data is usage data from which all identifying information has been removed to minimize the potential that the data can be associated with an individual customer.

If enacted, Smart Grid vendors, service providers and other organizations with operations in Oklahoma will want to be familiar with the law and understand its interplay with applicable federal laws or guidelines, and other generalized consumer protection laws, to ascertain the extent to which information can be shared, including for purposes not specified in the Act.

1 H.B. 1079.
FTC Settlement Signals Close Scrutiny of Privacy Policy Representations

This case is one of several recent actions1 that indicate the FTC is making good on its promise to escalate enforcement in the area of consumer privacy. The settlements are being announced as website operators and online businesses implement measures to bring their data collection, retention and use practices in line with the framework embodied in the FTC Staff's December 2010 Privacy Report. These measures include context- and product-specific privacy protections, do-not-track or opt-out tools, privacy preference dashboards and privacy icons. It is evident that the FTC remains troubled by tracking technologies (including those that may not capture what is currently considered to be personally identifiable information). Companies that implement enhanced privacy protections still risk attracting the FTC's attention if they don't get it right -- both in terms of functionality and transparency.

Consumers visiting Chitika's website were given the choice of opting out of being tracked by clicking an opt-out button on Chitika's site. Prior to March 1, 2010, consumers who clicked the button were served an opt-out cookie, after which they received a message stating "you are currently opted-out." Between May 2008 and March 1, 2010, however, the opt-out cookie automatically expired after 10 days, without notice. When consumers subsequently visited websites in the Chitika network, Chitika set new tracking cookies or received tracking cookies that were set before the consumer chose to opt out. After being contacted by the FTC, Chitika changed the expiration date of opt-out cookies from 10 days to 10 years, effective March 1, 2010. The FTC nevertheless instituted the action that led to the settlement.
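The operational defect described above -- an opt-out cookie that silently lapsed after 10 days, after which tracking resumed -- comes down to a single cookie attribute. The following is a minimal, hypothetical sketch using Python's standard library (not Chitika's actual implementation; the cookie name is invented) showing how the Max-Age attribute determines whether an opt-out persists for 10 days or 10 years:

```python
from http.cookies import SimpleCookie

TEN_YEARS = 10 * 365 * 24 * 60 * 60  # seconds; the post-settlement lifetime
TEN_DAYS = 10 * 24 * 60 * 60         # seconds; the lifetime at issue in the complaint

def build_opt_out_header(max_age=TEN_YEARS):
    """Build a Set-Cookie header value for a persistent opt-out cookie."""
    cookie = SimpleCookie()
    cookie["opt_out"] = "true"
    # Max-Age controls persistence: once it elapses, the browser discards
    # the cookie and the user silently reverts to being tracked.
    cookie["opt_out"]["max-age"] = max_age
    cookie["opt_out"]["path"] = "/"
    return cookie["opt_out"].OutputString()
```

From a compliance standpoint, the point of the sketch is that the difference between the conduct the FTC challenged and the remedy Chitika adopted is a one-line configuration choice, which is why the FTC scrutinizes whether an opt-out tool actually functions as the privacy policy promises.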
The Settlement Order imposes numerous obligations, including: 1) requiring Chitika to make certain disclosures about its information collection, sharing and use practices, and the operational effect of selecting the opt-out tool, both before the FTC initiated its investigation and going forward; 2) requiring targeted ads to include a hyperlink to a clear opt-out mechanism for at least 5 years; 3) barring Chitika from using or selling information that can be associated with a Chitika user's computer or device obtained by Chitika prior to March 1, 2010, and requiring that any such information be deleted from its files; and, notably, 4) requiring the Order, and its remedial requirements, to remain in effect for 20 years.

The FTC seems to be paying close attention to privacy policy promises. Earlier this month it finalized a settlement with Twitter for security breaches that occurred in 2009. Hackers figured out Twitter employee passwords, accessed user accounts and passwords, and then sent Tweets from several accounts, including President Obama's. The FTC alleged that Twitter's privacy policy misrepresented the extent to which Twitter "employ[s] administrative, physical and electronic measures" to protect customer data from unauthorized access. Under the Settlement, Twitter is barred for 20 years from making misleading statements about the extent to which it protects nonpublic data, including measures Twitter takes to prevent unauthorized access to customer accounts. Twitter is also required to create and maintain a comprehensive information security program that will be independently assessed every other year for 10 years.

The Chitika and Twitter settlement terms illustrate how the FTC intends to implement the framework outlined in its Staff Privacy Report. Ad networks and businesses with an online presence (including mobile) should be familiar with the tracking technologies they use and how that use is disclosed in the privacy policy.
A careful review of privacy and data security policies should be undertaken to ensure that promises reflect actual practices and are not misleading. The privacy practices of third-party service providers should also be examined as part of proper due diligence of those providers.

1 FTC v. Echometrix, Inc., No. 10-cv-05516 (E.D.N.Y. 2010), Stipulated Final Order for Permanent Injunction and Other Equitable Relief; In the Matter of Twitter, Inc., No. 092 3093 (March 11, 2011).
Karen Neuman to Participate in FCBA-DC Bar Panel on Privacy & Mobile Apps
Copyright © 2010 St. Ledger-Roty & Olson, LLP.