St. Ledger-Roty & Olson LLP
Proud sponsor of 1410 Q Street: DC’s Innovation Hot Spot!

PRIVACY & INFORMATION LAW UPDATE
April 2011
A quarterly update of trends and developments in Privacy law & policy

Prepared by Karen L. Neuman

  • You are receiving the STLRO Privacy & Information Law Update because of your interest in privacy, information management & data security. It is not intended to be and should not be considered legal advice.
  • Not interested? unsubscribe. Know someone who might be? Please forward.
  • If someone sent you this Update you may subscribe to receive future issues.
  • To view previous issues click here.

In this Issue:
FEATURE ARTICLE: Living in the Future: Baking Privacy into Product Development & Investor Due Diligence
Privacy Legislation Unveiled in Congress
Federal Court Allows Data Breach Class Action Against Social Media Application Provider to Proceed Without Damages Allegations
U.S. Securities and Exchange Commission Imposes First Fine Under Privacy Rule
U.S. Department of Education Proposes Changes to Family Educational Rights and Privacy Act Rule
FTC & Google Reach Settlement Involving Buzz Social Network
Federal Court Rules Can-Spam Applies to Certain Social Media Advertising Communications
Oklahoma House of Representatives Passes Smart Grid Data Privacy Measure
FTC Settlement Signals Close Scrutiny of Privacy Policy Representations
Karen Neuman to Participate in FCBA-DC Bar Panel on Privacy & Mobile Apps

Feature Article:
Living in the Future: Baking Privacy into Product Development & Investor Due Diligence


Despite high profile enforcement actions, escalating privacy-based class action lawsuits, and steady calls for comprehensive privacy regulation, start-up Internet businesses and their financial backers often view privacy as an important concern that is being addressed by others.

Many founders recognize the importance of privacy concerns, but believe that their good intentions and ethical values alone will protect their users’ privacy and inoculate them against a lawsuit or enforcement action. Founders tend to focus on needs that are perceived as more pressing, and perhaps employ interim measures to address privacy such as adopting privacy policies from other websites. But there is no such thing as a one-size-fits-all approach to privacy. On the other hand, many investors tend to focus on evaluating traditional factors for an investment including the nature of the venture and market, management’s expertise and time frame for profitability -- without recognizing that privacy is integral to the analysis.
Read more...


Privacy Legislation Unveiled in Congress
Earlier this month two measures were introduced in the House and Senate in response to mounting calls for comprehensive privacy legislation to limit the on- and off-line collection, use and sharing of consumer data by business. Both bills apparently try to balance businesses’ interests in collecting and using customer data against consumers’ interests in knowing what information is collected about them, controlling how it is used or being able to opt out of those practices. If either measure (or anything like it) becomes law, the most immediate consequence would be FTC rulemaking proceedings not only to implement the law but to give teeth to the privacy regulatory framework embodied in the FTC’s 2010 Staff Report on Privacy1.
Read more...


Federal Court Allows Data Breach Class Action against Social Media Application Provider to Proceed Without Damages Allegations
On April 11, 2011 a federal district court issued a decision1 declining to dismiss portions of a class action lawsuit against RockYou, Inc., a publisher and developer of services and applications used on social media sites including Facebook and MySpace, despite the plaintiff’s failure to allege “damages harm”.
Read more...


U.S. Securities and Exchange Commission Imposes First Fine Under Privacy Rule
On April 7, 2011, the U.S. Securities and Exchange Commission (SEC) imposed fines on three former executives of GunnAllen Financial, Inc., a defunct brokerage firm, for failing to comply with Regulation S-P under the Securities Exchange Act of 1934, also known as the SEC “Safeguard” Rule. This is the first time that the SEC has imposed fines on individuals charged solely with violations of Regulation S-P. The rule requires financial services firms to protect confidential customer information from unauthorized release to unaffiliated third parties. The Cease and Desist Orders require GunnAllen’s President and National Sales Manager to pay $20,000 each. The Chief Compliance Officer was ordered to pay $15,000.
Read more...


U.S. Department of Education Proposes Changes to Family Educational Rights and Privacy Act Rule
On Thursday, April 7, 2011 the Department of Education announced a number of proposed changes to the Family Educational Rights and Privacy Act (FERPA)1, the federal privacy law that protects personal information in student education records. The changes are intended to address states’ concerns about managing and safeguarding student performance and other data that is collected for recordkeeping, compliance and research purposes. The proposed changes appear to strike a balance between protecting student privacy and enabling limited use of student information to support data-driven initiatives to improve public education.
Read more...


FTC & Google Reach Settlement Involving Buzz Social Network
On March 30, 2011, the FTC and Google announced a proposed settlement in an enforcement action against Google arising from Google’s privacy policy promises made in connection with the launch of Google’s Buzz social network. The settlement is the third in a recent suite of privacy policy enforcement actions1 with remedial measures that will remain in effect for 20 years.
Read more...


Federal Court Rules Can-Spam Applies to Certain Social Media Advertising Communications
By Karen L. Neuman
Ari Moskowitz1
On March 28, 2011, the U.S. District Court for the Northern District of California ruled2 that certain communications sent by Facebook users to their Facebook friends’ “inboxes,” walls, or “news feeds” are electronic mail messages subject to the Controlling the Assault of Non-Solicited Pornography & Marketing Act of 2003 (CAN-SPAM Act). The Act regulates the transmission of commercial e-mail messages and applies to persons and entities (including nonprofits) that send them. The Court broadly interpreted “electronic mail message” to include “fake” Facebook fan pages posted on Facebook by MaxBounty, a marketing and advertising company that uses a network of publishers to drive traffic to its customers’ websites. The ruling was issued when the Court denied MaxBounty’s motion to dismiss the CAN-SPAM claim in a seven-count Complaint asserting various statutory and common law causes of action.
Read more...


Oklahoma House of Representatives Passes Smart Grid Data Privacy Measure
On March 18, 2011, the Oklahoma House of Representatives passed the Electric Utility Data Protection Act1. The Act’s purpose is to establish standards for governing access to and use of certain customer identifiable energy usage data (CIEUD) by utilities, their customers and third parties. The Act is an example of the growing number of state initiatives to address privacy concerns brought about by the Smart Grid’s integration of new products, services, vendors and technologies with legacy entities’ infrastructure and services. The state Senate will consider the measure.
Read more...


FTC Settlement Signals Close Scrutiny of Privacy Policy Representations
On March 15, 2011, the Federal Trade Commission (FTC) announced a settlement with Chitika, Inc. for false and misleading practices involving the company’s use of tracking technology to serve targeted ads to consumers based on their web activities. The action was triggered by representations in Chitika’s privacy policy that failed to disclose the extent to which an opt-out feature remained in effect.
Read more...


Karen Neuman to Participate in FCBA-DC Bar Panel on Privacy & Mobile Apps
On May 3, 2011, Karen Neuman will participate in a Federal Communications Bar Association Privacy & Data Security Committee/ DC Bar Computer and Telecommunications Law Section program on consumer privacy and mobile apps. Her remarks will focus on some of the legal complexities involving questions of data ownership and control, privacy obligations and liability, including in the context of the FTC’s COPPA Rule, as mobile devices fuel the expansion of the “app economy”, its products and services.



Living in the Future: Baking Privacy into Product Development & Investor Due Diligence

Despite high profile enforcement actions, escalating privacy-based class action lawsuits, and steady calls for comprehensive privacy regulation, start-up Internet businesses and their financial backers often view privacy as an important concern that is being addressed by others.

Many founders recognize the importance of privacy concerns, but believe that their good intentions and ethical values alone will protect their users’ privacy and inoculate them against a lawsuit or enforcement action. Founders tend to focus on needs that are perceived as more pressing, and perhaps employ interim measures to address privacy such as adopting privacy policies from other websites. But there is no such thing as a one-size-fits-all approach to privacy. On the other hand, many investors tend to focus on evaluating traditional factors for an investment including the nature of the venture and market, management’s expertise and time frame for profitability -- without recognizing that privacy is integral to the analysis.

Unfortunately, failure to account for privacy considerations at the outset can potentially shutter a promising new venture or convert an investment opportunity that initially seemed appealing into a costly one.

Addressing privacy at the product development phase can not only help minimize costs, including legal costs and potential damage to a brand, but can also significantly assist a new venture in attracting funding (and ultimately users) by differentiating itself from competitors. In January 2011, the Wall Street Journal profiled a search engine that began marketing itself on the basis of its privacy practices, including a promise not to store any personal information or send search data to other sites. (Whether or not the venture adheres to those promises or changes its practices is a separate matter.)

By the same token, incorporating a thorough understanding of privacy laws and trends at the due diligence phase can aid investors in differentiating among the risks of potential investments. The Washington Post recently noted that “privacy is now a line item” in business plans, and quoted an investor with a large venture capital fund emphasizing that privacy policies have become integral to his decisions about new technology investments.

One thing is certain: an approach that relies solely on a theoretical commitment to privacy ideals (or an approach that assumes privacy is someone else’s “department”) should be considered high risk.

Recent FTC enforcement actions against established technology companies demonstrate that even enterprises with a deep understanding of privacy concerns are at risk for a broad range of consequences -- including substantial legal costs, damage to brand, and extensive, long term government oversight of operations -- for failing to address privacy at the product development stage or to align data collection, retention and use practices with privacy policies.

For example, in March 2011 the FTC settled an action against Google that arose from last year’s launch of Google’s social network “Buzz”.

The Complaint1 alleged that Gmail users were automatically enrolled in Buzz without an adequate opportunity to give informed consent, despite privacy policy promises that users could control access to their information. Even though they were initially given the choice of viewing Buzz before going to their Gmail In-boxes, Gmail users were not informed that doing so would automatically make their contacts visible to other Buzz users; certain Buzz features were activated anyway for users who chose this option. Moreover, Gmail users were connected to Buzz users on the basis of the number of emails exchanged between them – information that was made public by default.

Under the settlement Google is required to adopt a “privacy by design” approach to address privacy risk at the research and development stage for new products and services, and manage risk for existing offerings. Significantly, the settlement’s terms remain in effect for 20 years.

Online businesses (including mobile applications) also face potential civil lawsuits if they fail to recognize and sufficiently address privacy risks. For example, in late 2010, a class action2 was filed in the Northern District of California against Apple and a number of app developers. The Complaint alleges that apps on Apple’s iPhone and iPad collected personal information from users and transmitted that information to third party advertising networks. The personal data claimed to have been collected included the device’s unique identifier (the “UDID”), as well as age, gender, and location. The plaintiffs allege that these practices violate Apple’s published policy that prohibits apps from transmitting personal data without the user’s consent.

Although rare, the threat of criminal proceedings surfaced earlier this year when reports indicated that a number of entities, including Pandora, were served with federal Grand Jury subpoenas in connection with an investigation about information sharing practices of mobile apps that run on the Apple and Android platforms. (Pandora is not currently the target of the investigation.) Defending against enforcement or other legal proceedings may be considered a cost of doing business for established entities, but for an aspiring start-up, such costs could be prohibitive.

These developments leave no doubt that both innovators and investors can no longer afford to kick the can down the road when it comes to privacy. Instead, it is essential that each fully understand the current privacy legal and regulatory environment and incorporate that understanding early in the development or investment process.

The earlier in the product development process that a new venture addresses privacy, the more likely it will be to attract funding at early or expansion stages. On the other hand, ventures that are forced to address a privacy issue that was initially deferred or overlooked may be less attractive to investors seeking late-round funding opportunities with less uncertainty. Likewise, investors who incorporate privacy into their due diligence processes will be more likely to avoid investments fraught with uncertainty and risk.


1 In the matter of Google, Inc. No. 102 3136 (March 30, 2011)
2 Lalo v. Apple, Inc., No. 10-cv-05878 (N.D. Cal. filed Dec. 23, 2010)

Back to Top


Privacy Legislation Unveiled in Congress
Earlier this month two measures were introduced in the House and Senate in response to mounting calls for comprehensive privacy legislation to limit the on- and off-line collection, use and sharing of consumer data by business. Both bills apparently try to balance businesses’ interests in collecting and using customer data against consumers’ interests in knowing what information is collected about them, controlling how it is used or being able to opt out of those practices. If either measure (or anything like it) becomes law, the most immediate consequence would be FTC rulemaking proceedings not only to implement the law but to give teeth to the privacy regulatory framework embodied in the FTC’s 2010 Staff Report on Privacy1.

Senate Bill. On April 12, 2011, Senators Kerry (D-MA) and McCain (R-AZ) introduced the Commercial Privacy Bill of Rights Act of 2011. This bill moves the U.S. closer to the European “notice and consent” approach to privacy protection but falls short of treating privacy as a “human right”.

Key provisions would:

  • Give the FTC rulemaking authority to promulgate regulations that address privacy concerns with the evolution of technology and corresponding expansion of new methods to collect, store and retain personal data.
  • Define “covered information” as personally identifiable information that, if “lost, compromised or disclosed without authorization,” carries a significant risk of economic or physical harm.
  • Define “personally identifiable information” (PII) as first name or initial and last name, address of physical place of residence, e-mail address, telephone number, Social Security number or other government-issued ID number, credit card number, any unique identifier that, if used alone, can identify a specific individual, and biometric data, including fingerprints and retinal scans. PII can include information that can be linked to an individual if combined with the above-enumerated data, including date or place of birth and geolocation information.
  • Define “covered entities” as anyone that collects, uses, transfers or stores “covered information” on 5,000 or more individuals during any consecutive 12-month period.
  • Give consumers the right to opt-out of online behavioral advertising.
  • Give consumers the right to access and correct their personal information retained by covered entities.
  • Require covered entities to obtain opt-in consent prior to collecting sensitive personal information, such as medical, sexual preference or religious affiliation data.
  • Authorize the FTC to create a Safe Harbor program and sanction industry self-regulatory programs for protecting consumer privacy from online behavioral advertising.

House Bill. The Consumer Privacy Protection Act of 2011, introduced on April 13, 2011 by Representative Cliff Stearns (R-FL), is similar to the Kerry-McCain measure but goes farther by recognizing that the combination of both off- and online data also poses privacy risks. Like the Kerry-McCain bill, it applies to “covered entities”. The bill’s definitions are similar to those in the Kerry-McCain bill but broaden the definition of PII to include birth date and electronic address (including IP address) if combined with enumerated data to identify an individual. Neither measure provides for a private right of action; the FTC and state attorneys general have the sole right to seek redress for violations of the law, mirroring the framework embodied in other statutes, such as the Children’s Online Privacy Protection Act2.

Key provisions would:

  • Define covered entities as those that collect, sell, disclose or use personally identifiable information of more than 5,000 consumers during any consecutive 12-month period. Certain entities are excluded from this definition, including government agencies and professional services providers subject to confidentiality obligations.
  • Define personally identifiable information as “individually” identifiable information, including name, address, Social Security Number (as well as date of birth and electronic address, including IP address if they can be combined with any of the above) relating to an individual who can be identified from that information. Aggregate, anonymized or publicly available data is excluded from this definition.
  • Require covered entities to provide notice to consumers prior to the collection of PII for purposes unrelated to certain transactions involving covered entities and their customers.
  • Require covered entities to provide notice of a material change to the entity’s information collection and use practices.
  • Require that consumers be given the opportunity to withhold consent for the sale or disclosure of PII for other than contemplated transactional purposes, free of charge and for up to five years.
  • Require covered entities to implement a data security policy to safeguard consumer PII.
  • Require the FTC to approve self-regulatory programs for five years.
  • Authorize civil penalties of up to $500,000 for violations of the Act.
  • Preempt state statutory schemes that address commercial collection and use of PII.

Both measures are silent on “do not track” (including the technology-based approach that is currently being promoted by the FTC). Such a mechanism could, however, be implemented by the FTC under rulemaking authority provided for in the Kerry-McCain bill. At a press conference introducing the bill, Senator Kerry indicated that a do-not-track mechanism seemed inconsistent with efforts to get industry and consumer support for the measure and in any event is unnecessary. At the same time, public remarks by FTC Chairman Jon Leibowitz and other senior agency staff strongly suggest that, with appropriate authority, it is a matter of “when,” not “if,” a do-not-track mechanism will be adopted.

It remains to be seen whether these bills will be made a priority given other pressing matters pending in Congress. Nevertheless, if either is enacted, businesses should recognize the strong likelihood that they will be confronted with new regulations that could add significant compliance costs to their operations. Accordingly, action on these bills should be closely monitored and opportunities to affect the legislative outcome and regulatory environment should be identified.


1 Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers – Preliminary FTC Staff Report, December 2, 2010.
2 15 U.S.C. §§6501-6506.

Back to Top


Federal Court Allows Data Breach Class Action against Social Media Application Provider to Proceed Without Damages Allegations
On April 11, 2011 a federal district court issued a decision1 declining to dismiss portions of a class action lawsuit against RockYou, Inc., a publisher and developer of services and applications used on social media sites including Facebook and MySpace, despite the plaintiff’s failure to allege “damages harm”.

The action, filed in U.S. District Court for the Northern District of California, arose from a December 2009 data breach that exposed unencrypted user data, including social media login credentials, of 32 million RockYou users. The breach was caused by a security vulnerability in RockYou’s SQL database known as a “SQL injection” flaw. The flaw enabled hackers to introduce malicious code into RockYou’s network and access its users’ webmail accounts and social media login credentials.
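For readers unfamiliar with the vulnerability at issue, the following is a hypothetical, minimal illustration of how an SQL injection flaw works; the schema, table names and queries are illustrative assumptions, not RockYou’s actual system.

```python
import sqlite3

# Toy database standing in for a site's user store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice@example.com', 'secret')")

def find_user_unsafe(email):
    # Vulnerable: user input is spliced directly into the SQL string,
    # so a crafted value can rewrite the query itself.
    query = "SELECT email FROM users WHERE email = '%s'" % email
    return conn.execute(query).fetchall()

def find_user_safe(email):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT email FROM users WHERE email = ?", (email,)
    ).fetchall()

payload = "' OR '1'='1"           # classic injection payload
print(find_user_unsafe(payload))  # matches every row despite a bogus email
print(find_user_safe(payload))    # matches nothing
```

The injected payload turns the unsafe query into `WHERE email = '' OR '1'='1'`, which is true for every row; parameterized queries are the standard defense.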

RockYou’s products include applications that enable users to share photos, post text on a friend’s social media page, or engage in online social gaming. Users sign up to use these applications on rockyou.com. During the sign-up process users are asked to provide a valid e- mail address and password that RockYou stores in its database. Users may also be asked to provide a user name or password to access a particular social network.

The named plaintiff signed up for a RockYou photo-sharing application and provided his e-mail address and password. The Complaint alleges that RockYou promised through its online privacy policy that it uses “commercially reasonable, physical, managerial and technical safeguards to preserve the integrity and security of your personal information.” According to the plaintiff, despite this promise and widespread industry knowledge of SQL injection flaws, RockYou failed to implement commercially reasonable methods, such as hashing, salting or other common data protection measures, to prevent the data breach that led to the lawsuit. Instead, RockYou allegedly stored its users’ personal information in clear or plain text format.
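The hashing and salting referred to in the Complaint can be sketched as follows; the algorithm and iteration count below are illustrative assumptions, not a statement of what the industry standard required at the time of the breach.

```python
import hashlib
import hmac
import os

def hash_password(password: str):
    # A unique random salt per user defeats precomputed lookup tables.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest  # store the salt and digest, never the plaintext

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("guess", salt, digest))    # False
```

A database protected this way still allows logins to be verified, but an attacker who steals it obtains only salted digests rather than usable credentials.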

Following a warning from RockYou’s database security company about the SQL injection flaw, RockYou brought its site down until a security patch was in place. (The Complaint alleges that RockYou waited a day to act.) RockYou subsequently acknowledged that the database had not been maintained in a manner consistent with standard industry security protocols.

Although the court dismissed five causes of action, it allowed the lawsuit to proceed based on plaintiff’s claims of breach of contract, breach of implied contract, negligence and negligence per se.

In allowing those claims to go forward, the court noted the “paucity of controlling authority” about the legal sufficiency of the plaintiff’s damages theory. Specifically, the court observed that the “context in which plaintiff’s theory arises – i.e., the unauthorized disclosure of personal information via the Internet – is itself relatively new, and therefore more likely to raise issues of law not yet settled in the courts. For that reason . . . the court finds plaintiff’s allegations of harm sufficient at this stage to allege a generalized injury in fact.” The court’s observation is similar to those made recently by other courts that have applied existing laws to new or emerging communications technologies, and is another indication that judges continue to seek to better understand these technologies and the behaviors and uses associated with them.

The court’s ruling gives the plaintiff a green light to proceed against RockYou under a breach of contract theory for failing to adhere to its privacy policy and its promises to take adequate steps to secure user data. Importantly, however, the court indicated that it will dismiss the claims for lack of standing if it becomes apparent through discovery that there is no legal basis upon which the plaintiff can demonstrate tangible harm arising from the unauthorized disclosure of personal information.

The ruling in this case was issued in the context of a motion to dismiss and therefore is not a decision on the merits of plaintiff’s claims. Nonetheless, it is potentially significant for online businesses that retain user data, even if the databases are hosted by third parties. In the event of a breach, businesses can expect to incur substantial legal fees to defend actions at least through the discovery phase of a case. At a minimum, businesses should be familiar with the latest industry standards for protecting user data, including the latest forms of encryption or other data protection methods. In addition, prompt action should be taken if a business is notified of a security problem by its data hosting vendor.


1 Claridge v. RockYou, Inc., No. 4:09-cv-06032-PJH (N.D. Cal. Apr. 11, 2011).

Back to Top


U.S. Securities and Exchange Commission Imposes First Fine Under Privacy Rule
On April 7, 2011, the U.S. Securities and Exchange Commission (SEC) imposed fines on three former executives of GunnAllen Financial, Inc., a defunct brokerage firm, for failing to comply with Regulation S-P under the Securities Exchange Act of 1934, also known as the SEC “Safeguard” Rule. This is the first time that the SEC has imposed fines on individuals charged solely with violations of Regulation S-P. The rule requires financial services firms to protect confidential customer information from unauthorized release to unaffiliated third parties. The Cease and Desist Orders require GunnAllen’s President and National Sales Manager to pay $20,000 each. The Chief Compliance Officer was ordered to pay $15,000.

The SEC alleged that GunnAllen’s President authorized the National Sales Manager to transfer customers’ information from more than 1600 accounts to his new employer without giving the customers reasonable notice to opt out. Instead, customers were informed after the fact in violation of the rule. Names, addresses, account numbers and the values of customers’ assets were downloaded to a thumb drive and transferred to the Sales Manager’s new employer following his resignation from GunnAllen. The transfer was performed as GunnAllen was winding down its affairs.

The SEC also alleged that the firm’s chief compliance officer failed to implement adequate privacy and data security measures to protect customer information despite a number of earlier serious security breaches that occurred over a four-year period. Those breaches included the theft of three laptops belonging to the firm’s “registered representatives” and access to its e-mail accounts, using stolen login credentials, by an employee who had been terminated from the firm. The SEC found that GunnAllen had maintained written procedures to protect customer information. However, the agency found that the procedures were inadequate: they failed to instruct supervisors and company representatives about how to respond to a breach or comply with the Safeguard Rule. Moreover, they were not revised after the incident involving the terminated employee to address similar scenarios in the future.

Financial services firms should familiarize themselves with Regulation S-P and the orders in this matter. Firms should also undertake a thorough review of their data security policies and procedures to 1) ensure that adequate measures are in place to prevent a departing employee or officer from taking customer information with them to a new employer in violation of the rule’s notice and opt-out requirements; and 2) assess whether their current procedures for responding to actual or possible breaches are in line with the rule and this decision, and then revise them as warranted.

The three separate Cease and Desist Orders can be found here, here and here.

Back to Top


U.S. Department of Education Proposes Changes to Family Educational Rights and Privacy Act Rule
On Thursday, April 7, 2011 the Department of Education announced a number of proposed changes to the Family Educational Rights and Privacy Act (FERPA)1, the federal privacy law that protects personal information in student education records. The changes are intended to address states’ concerns about managing and safeguarding student performance and other data that is collected for recordkeeping, compliance and research purposes. The proposed changes appear to strike a balance between protecting student privacy and enabling limited use of student information to support data-driven initiatives to improve public education.

If finalized, the immediate result will be a move by public schools to strengthen privacy and data collection practices and policies, and align them more closely with approaches for protecting personal privacy in other sectors. Third parties that contract with public schools and universities to provide educational, research or hosting services that require access to and use of student information could also be affected. These entities should undertake a review of their data collection, sharing and security practices and policies now in anticipation of new obligations under the revised rule.

The proposed changes include:

  • Strengthening FERPA’s enforcement provisions to ensure that every entity working with personally identifiable information from student education records (including those who access the data under an exception) uses it solely for authorized purposes. Failure to do so could result in being barred from student data sharing arrangements for five years or in having funding withheld.
  • Authorizing schools to implement functionally specific directory information policies that limit access to student records to specific uses, preventing marketers or identity thieves from accessing the data.
  • Authorizing states to enter into multi-district research agreements to measure the success of programs, such as early childhood math or reading programs that effectively prepare children for kindergarten.
  • Authorizing high school administrators to share information on student achievement to track how their graduates perform academically in college.

The Department of Education is accepting public comment on the NPRM until May 23, 2011. The rule is expected to be finalized later this year.


1 20 U.S.C. § 1232g; 34 CFR Part 99.

Back to Top


FTC & Google Reach Settlement Involving Buzz Social Network
On March 30, 2011, the FTC and Google announced a proposed settlement in an enforcement action against Google arising from Google’s privacy policy promises made in connection with the launch of Google’s Buzz social network. The settlement is the third in a recent series of privacy policy enforcement actions1 with remedial measures that will remain in effect for 20 years.

The FTC had charged Google with using “deceptive tactics that violated its own privacy policy promises” when it launched Buzz last year. It also alleged that Google made certain misrepresentations that led users to believe that the handling of their personal information from the EU was in compliance with the U.S.-EU Safe Harbor framework – specifically that Google failed to give consumers required notice and choice before using information for purposes other than that for which it was collected.

The settlement is notable for its scope, and for its use as a vehicle through which the FTC seems to be implementing key aspects of the framework for protecting consumer privacy that were previewed in its 2010 privacy report2. The settlement not only bars Google from “misrepresenting the privacy or confidentiality” of its users’ information; it also requires Google to get affirmative, opt-in consent for new or additional uses and sharing of personal information that were not disclosed when the information was first collected. Moreover, the settlement requires Google to implement a comprehensive privacy program that adopts a “privacy by design” approach to address privacy risk at the research and development stage for new products and services, and manages risk for existing offerings. The privacy program will be subject to regular, independently conducted privacy audits that are subject to FTC review. Google is further barred from misrepresenting its compliance with the U.S.-EU Safe Harbor program.

The Complaint3 alleged that Gmail users were automatically enrolled in Buzz (which was launched as an adjunct to Gmail) without an adequate opportunity to give informed consent, despite privacy policy promises that users could control access to their information. Even though they were initially given the choice of viewing Buzz before going to their Gmail In-boxes, Gmail users were not informed that doing so would automatically make their contacts visible to other Buzz users; certain Buzz features were activated anyway for users who chose this option. Moreover, Gmail users were connected to Buzz users on the basis of the number of emails exchanged between them – information that was made public by default.

Google was alleged to have engaged in a deceptive act or practice because: 1) Gmail users’ information was used for purposes other than those for which it was collected (i.e., social networking instead of e-mail) without obtaining prior consent as promised in the privacy policy and 2) Google failed to disclose the full extent of the methods by which Buzz connections were made.

Following the ensuing uproar, Google moved to offer improved privacy tools. Nevertheless, the FTC initiated the investigation that led to the Settlement.

This is the first time the FTC has required the target of this type of action to create and implement a comprehensive privacy program of this scope and nature. The requirement is not limited to Buzz, but applies to all of Google’s products and services; oversight will be a substantial job that could require significant resources. The settlement also demonstrates the reach of the FTC’s enforcement power in the general absence of rulemaking authority, lending support for arguments in other arenas that privacy legislation is not warranted because the agency has shown that it will use its enforcement authority to protect privacy.

The settlement is not final. The FTC is accepting public comment through May 2, 2011.

1 Agreement Containing Consent Order, In the Matter of Chitika, Inc., File No. 1023087 (March 15, 2011); Stipulated Final Order for Permanent Injunction and Other Equitable Relief, In the Matter of Twitter, Inc., No. 092 3093 (March 11, 2011).
2 Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers (Dec. 2010).
3 In the Matter of Google, Inc., No. 102 3136 (March 30, 2011).

Back to Top


Federal Court Rules Can-Spam Applies to Certain Social Media Advertising Communications
By Karen L. Neuman
Ari Moskowitz1
On March 28, 2011, the U.S. District Court for the Northern District of California ruled2 that certain communications sent by Facebook users to their Facebook friends’ Facebook “inboxes,” “walls,” or “newsfeeds” are electronic mail messages subject to the Controlling the Assault of Non-Solicited Pornography & Marketing Act of 2003 (CAN-SPAM Act). The Act regulates the transmission of commercial e-mail messages and applies to persons and entities (including nonprofits) that send them. The Court broadly interpreted “electronic mail message” to include “fake” Facebook fan pages posted on Facebook by MaxBounty, a marketing and advertising company that uses a network of publishers to drive traffic to its customers’ websites. The ruling was issued when the Court denied MaxBounty’s motion to dismiss the CAN-SPAM claim in a 7-count Complaint asserting various statutory and common law causes of action.

Facebook alleged that MaxBounty: 1) engaged in deceptive and misleading marketing practices by inducing Facebook users to send commercial electronic messages to their Facebook friends, and 2) misled its affiliates by implying that various advertising campaigns, including the creation of fake Facebook pages, are approved by Facebook. Instead, MaxBounty advises and assists its affiliates to create the pages that are, in fact, advertisements. These pages claim that if a Facebook user completes a 3-step registration process, he or she will receive an iPad, iPhone, or other similarly high-end product.

The registration process requires a Facebook user to 1) become a “fan” of the page on Facebook, 2) invite all of their friends to become fans (users were urged to copy and paste messages that automated the invitations sent to their friends’ Facebook walls, newsfeeds, and inboxes), and 3) give personal information to the advertised company. Users who completed the process did not receive the promised gift but were instead directed to a third-party website (via an auto-redirect from a MaxBounty-registered domain) to complete additional steps. These steps included signing up for “sponsor offers” such as paid magazine subscriptions, movies and music, and disclosing personal information. MaxBounty was paid by the third-party advertiser based on the number of users that completed the first 3 steps.

The Court rejected MaxBounty’s argument that the communications were not e-mail messages within the meaning of the Act. The Court cited two cases3 in the Central District of California involving another social network to construe the term “electronic mail messages”. These courts broadly interpreted the term to effectuate congressional intent to mitigate the number of commercial communications that “overburden” the Internet’s infrastructure. Thus, the courts concluded that phishing messages delivered to MySpace.com inboxes (as opposed to traditional e-mail domain boxes) were e-mail messages within the meaning of the Act. The Court further cited the MySpace courts’ reasoning that Congress was aware of various forms of electronic communications when it drafted the Act and that the statute’s plain language includes alternative forms [of e-mail] while recognizing that the most commonly used form of an electronic address was the traditional e-mail with a local part and domain part.

The Court next considered whether the Facebook pages were sent to a unique electronic mail address, or “a destination [] to which an electronic mail message can be sent [].” The Court found it significant that under MaxBounty’s scheme, a user is instructed to “effect” transmission of Facebook pages to all of his or her Facebook friends. The pages are transmitted to unique destinations including the wall, newsfeed or home page of the user’s friends, the Facebook “inbox” of the user’s friends, and to users’ external e-mail addresses. Finally, the court reasoned that Facebook’s routing of these communications implicates the volume and traffic infrastructure issues that CAN-SPAM seeks to address. Accordingly, the Court concluded that a determination that the communications are electronic mail messages within the meaning of the Act is consistent with the Act’s stated purpose, and that Facebook’s CAN-SPAM Act claim is “sufficiently pled” for purposes of surviving a motion to dismiss.

This action seems to be an effort by Facebook to protect its brand by limiting the ability of commercial entities to engage in deceptive practices in a manner that could confuse users about who is responsible for sending unsolicited commercial communications through Facebook. In light of this ruling, a message promoting a product sent by a Facebook user to the user’s friends’ newsfeeds or walls (for example when a user “likes” a product or the product’s fan page) could theoretically be construed as a commercial electronic message subject to CAN-SPAM’s requirements. However, such an outcome seems unlikely given the purpose of Facebook’s CAN-SPAM Act claim (apart from the fact that “liking” a product is a Facebook sanctioned activity). Moreover, a court would be unlikely to sustain a CAN-SPAM action like this one against the user because of the Act’s provisions regarding 1) deceptive practices and 2) standing.

The Federal Trade Commission’s guidance on CAN-SPAM indicates that transmission of mass commercial e-mail messages is not necessarily a violation of the Act, so long as certain measures are taken to avoid conveying deceptive or misleading information.4 This includes avoiding misleading consumers as to the origin of the message, clearly identifying the message as an ad, and honoring opt-out requests promptly.
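
The measures the FTC guidance describes can be illustrated with a short sketch. The sender name, postal address, and unsubscribe URL below are hypothetical placeholders, and this is only an illustration of the message elements the guidance summarized above calls for, not legal advice or a complete compliance checklist:

```python
from email.message import EmailMessage

def build_commercial_email(recipient: str) -> EmailMessage:
    msg = EmailMessage()
    # Header information should not mislead recipients about the
    # origin of the message.
    msg["From"] = "Example Promotions <promo@example.com>"
    msg["To"] = recipient
    # The message should be clearly identifiable as an advertisement.
    msg["Subject"] = "Advertisement: Spring Sale at Example.com"
    msg.set_content(
        "This is an advertisement from Example Promotions.\n\n"
        "Save 20% on all items this week.\n\n"
        # A working opt-out mechanism, honored promptly, is among
        # the measures the guidance describes.
        "Example Promotions, 123 Main St., Anytown, USA\n"
        "To stop receiving these emails, visit: "
        "https://example.com/unsubscribe\n"
    )
    return msg
```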

In the MaxBounty case, Facebook alleged that MaxBounty not only sent a high volume of messages through Facebook’s network, but that the messages and advertisements were misleading. Facebook Pages has explicit terms of use that prohibit deceptive activities by advertisers that track many of CAN-SPAM’s requirements.5

It is unlikely that a Facebook user could assert a viable CAN-SPAM claim against Facebook (even if Facebook Pages lacked such explicit terms of use) because of standing. Under CAN-SPAM, a private right of action is only available to Internet Access Services that have been adversely affected by a violation of the Act. Individual users cannot bring a CAN-SPAM action (although they can register a complaint with the Federal Trade Commission). The Ninth Circuit has addressed the scope of an “Internet Access Service” and ruled that it encompasses more than just ISPs, including for example, Facebook and MySpace, but excludes small website operators with only a “nominal role in providing Internet-related services.”6 To have standing to bring a CAN-SPAM action, the Complainant must also have suffered a harm contemplated under the Act, which the Court described as “bandwidth, hardware, Internet connectivity, network integrity, [and] overhead costs.” So, a Facebook user would not have standing to file suit under the Act, and so long as Facebook sanctions the advertiser’s activity, would be unlikely to be able to file a CAN-SPAM action against an advertiser.


1 Ari Moskowitz is a third-year law student at George Washington University and a Law Clerk at St. Ledger-Roty & Olson LLP. He previously interned at the NTIA’s Internet Policy Task Force, where he worked on the Department of Commerce Privacy “Green Paper” that was released in December, 2010.
2 Facebook v. MaxBounty, Inc., Case No. CV-10-4712-JF (N.D. California March 28, 2011).
3 MySpace v. Wallace, 498 F. Supp. 2d 1293, 1300 (C.D. Cal. 2007); MySpace v. TheGlobe.com, 2007 WL 1686966, at *4 (C.D. Cal. Feb. 27, 2007).
4 See Federal Trade Commission, The CAN-SPAM Act: A Compliance Guide for Business, available at http://business.ftc.gov/documents/bus61-can-spam-act-compliance-guide-business
5 See, Facebook Advertising Guidelines, http://www.facebook.com/ad_guidelines.php; Facebook Pages Terms, http://www.facebook.com/terms_pages.php
6 Gordon v. Virtumundo, Inc., 575 F.3d 1040 (9th Cir. 2009).

Back to Top


Oklahoma House of Representatives Passes Smart Grid Data Privacy Measure
On March 18, 2011, the Oklahoma House of Representatives passed the Electric Utility Data Protection Act1. The Act’s purpose is to establish standards for governing access to and use of certain customer identifiable energy usage data (CIEUD) by utilities, their customers and third parties. The Act is an example of the growing number of state initiatives to address privacy concerns brought about by the Smart Grid’s integration of new products, services, vendors and technologies with legacy entities’ infrastructure and services. The state Senate will consider the measure.

The measure follows the release of the U.S. Department of Energy’s 2010 Smart Grid Data Access and Privacy Report,2 in which the Agency urged states to carefully consider and address privacy issues, including the conditions under which consumers can authorize third party access to CIEUD. The Report also called on states to enact laws or regulations to safeguard CIEUD and establish mechanisms whereby consumers can access their data or consent to its disclosure to third parties.

The Oklahoma House of Representatives appears to have adopted the privacy regulatory approach that has been implemented for other sectors (e.g., health care, financial services and telecommunications) that collect highly granular, sensitive consumer data that can be used for targeted marketing, profiling, surveillance or other unwanted purposes. Under this regulatory approach certain customer data may be used for specified business purposes, and certain disclosures may be made to affiliated third party service providers for limited uses without requiring prior customer consent.

Thus, under the Oklahoma Act utilities would be permitted to use CIEUD without obtaining prior consent for such business purposes as providing services, billing, infrastructure support, developing, enhancing, marketing or providing energy related products or services and promoting public policy objectives, including energy efficiency. A utility would also be able to disclose CIEUD to affiliates and third parties that assist in providing services and implementing business objectives without obtaining prior customer consent. The affiliate or third party would be required to agree in writing to use the data only for authorized purposes and to safeguard the data’s confidentiality.

The law is silent on the conditions for disclosure of CIEUD for other uses, including research and development. Examples of R&D uses include addressing technical shortcomings involving the Smart Grid system’s reliability, security architecture, or operational and scalability needs.

Interestingly, “customer” is defined under the Act to include not only individuals, but corporate and other legal entities, including “residential, commercial or industrial” entities that receive service from an electric utility. In addition, the Oklahoma House of Representatives seems to have resolved the current debate over customer usage data ownership in favor of the utility, while imposing a duty to provide customers with “reasonable” access to usage data.

Usage data includes information relating to 1) the amount of electricity consumed at a residence or customer premises; and 2) the characteristics of consumption. Characteristics of consumption can include the date and time of consumption and data about appliances that use electricity. The Act governs access to and disclosure of usage data in “identifiable” and “aggregate” form.

“Identifiable” data is information that can identify a customer, including information that is uniquely associated with a customer, such as name, social security or taxpayer ID number, street address, telephone number, electric utility account number, meter number or financial account information. Identifiable data is not limited to individuals but includes business or legal entities that are electric utility customers. “Aggregate” data is usage data from which all identifying information has been removed to minimize the potential that the data can be associated with an individual customer.
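
The identifiable/aggregate distinction described above can be sketched in a few lines. The field names below are illustrative assumptions, not terms drawn from the Act or any utility’s actual data schema:

```python
# Fields uniquely associated with a customer (illustrative only).
IDENTIFYING_FIELDS = {
    "name", "ssn", "street_address", "phone",
    "account_number", "meter_number",
}

def de_identify(record: dict) -> dict:
    """Drop fields that could associate usage data with a customer."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

def aggregate_usage(records: list) -> dict:
    """Report only totals across customers, so the result cannot
    readily be associated with any individual customer."""
    cleaned = [de_identify(r) for r in records]
    total_kwh = sum(r["kwh"] for r in cleaned)
    return {
        "customer_count": len(cleaned),
        "total_kwh": total_kwh,
        "avg_kwh": total_kwh / len(cleaned) if cleaned else 0.0,
    }

records = [
    {"name": "A. Customer", "meter_number": "M-1", "kwh": 900},
    {"name": "B. Customer", "meter_number": "M-2", "kwh": 1100},
]
summary = aggregate_usage(records)
```

Note that simply stripping identifiers is a minimum step; in practice, small aggregates can still be re-identifiable, which is why the Act speaks of minimizing the potential for re-association rather than eliminating it.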

If enacted, Smart Grid vendors, service providers and other organizations with operations in Oklahoma will want to be familiar with the law and understand its interplay with applicable federal laws or guidelines, and other generalized consumer protection laws, to ascertain the extent to which information can be shared, including for purposes not specified in the Act.


1 H.B. 1079.
2 Data Access and Privacy Issues Related to Smart Grid Technologies, U.S. Department of Energy, October 5, 2010.

Back to Top


FTC Settlement Signals Close Scrutiny of Privacy Policy Representations
On March 15, 2011, the Federal Trade Commission (FTC) announced a settlement with Chitika, Inc. for false and misleading practices involving the company’s use of tracking technology to serve targeted ads to consumers based on their web activities. The action was triggered by representations in Chitika’s privacy policy that failed to disclose the extent to which an opt-out feature remained in effect.

This case is one of several recent actions1 that indicate the FTC is making good on its promise to escalate enforcement in the area of consumer privacy. The settlements are being announced as website operators and online businesses implement measures to bring their data collection, retention and use practices in line with the framework embodied in the FTC Staff’s December 2010 Privacy Report. These measures include context and product specific privacy protections, “do not track” or “opt-out” tools, privacy preference dashboards and privacy “icons.”

It is evident that the FTC remains troubled by tracking technologies (including those that may not capture what is currently considered to be personally identifiable information). Companies that implement enhanced privacy protections still risk attracting the FTC’s attention if they don’t “get it right” -- both in terms of functionality and transparency.

Consumers visiting Chitika’s website were given the choice of opting out of being tracked by clicking an “opt-out” button on Chitika’s site. Prior to March 1, 2010, consumers who clicked the button were served an “opt-out” cookie, after which they received a message stating “you are currently opted-out.” Between May 2008 and March 1, 2010, the opt-out cookie automatically expired after 10 days without notice. When consumers subsequently visited websites in the Chitika network, Chitika set new tracking cookies or received tracking cookies that were set before the consumer chose to opt-out. After being contacted by the FTC, Chitika changed the expiration date of opt-out cookies from 10 days to 10 years effective March 1, 2010. The FTC nevertheless instituted the action that led to the settlement.
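
The mechanics of the fix described above come down to the lifetime of the opt-out cookie. The sketch below is a hypothetical illustration (the cookie name and attributes are assumptions, not Chitika’s actual implementation) showing how an opt-out preference stored in a short-lived cookie silently lapses, which is the behavior the FTC found misleading:

```python
from http.cookies import SimpleCookie

TEN_DAYS = 10 * 24 * 60 * 60          # original, short-lived expiry
TEN_YEARS = 10 * 365 * 24 * 60 * 60   # post-March 1, 2010 expiry

def build_opt_out_cookie(max_age_seconds: int) -> str:
    """Return a Set-Cookie header value recording the opt-out choice.

    Once the cookie's Max-Age elapses, the browser discards it and
    the server has no record of the user's choice, so tracking
    resumes without further notice to the user.
    """
    cookie = SimpleCookie()
    cookie["opt_out"] = "true"
    cookie["opt_out"]["max-age"] = max_age_seconds
    cookie["opt_out"]["path"] = "/"
    return cookie["opt_out"].OutputString()

short_lived = build_opt_out_cookie(TEN_DAYS)
long_lived = build_opt_out_cookie(TEN_YEARS)
```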

The Settlement Order imposes numerous obligations, including: 1) requiring Chitika to make certain disclosures about its information collection, sharing and use practices, and the operational effect of selecting the opt-out tool, both before the FTC initiated its investigation and going forward; 2) requiring targeted ads to include a hyperlink to a clear opt-out mechanism for at least 5 years; 3) barring Chitika from using or selling information “that can be associated with a…Chitika user’s computer or device” obtained by Chitika prior to March 1, 2010 and requiring that any such information be deleted from its files; and notably, 4) keeping the Order, and its remedial requirements, in effect for 20 years.

The FTC seems to be paying close attention to privacy policy promises. Earlier this month it finalized a settlement with Twitter for security breaches that occurred in 2009. “Hackers” figured out Twitter employee passwords, accessed user accounts and passwords and then sent Tweets from several accounts, including President Obama’s. The FTC alleged that Twitter’s privacy policy misrepresented the extent to which Twitter “employ[s] administrative, physical and electronic measures” to protect customer data from unauthorized access.

Under the Settlement, Twitter is barred for 20 years from making “misleading” statements about the extent to which it protects nonpublic data, including measures Twitter takes to prevent unauthorized access to customer accounts. Twitter is also required to create and maintain a “comprehensive information security program” that will be independently assessed every other year for 10 years.

The Chitika and Twitter settlement terms illustrate how the FTC intends to implement the framework outlined in its Staff Privacy Report. Ad networks and businesses with an online presence (including mobile) should be familiar with the tracking technologies they use and how that use is disclosed in the privacy policy. A careful review of privacy and data security policies should be undertaken to ensure that promises reflect actual practices and are not misleading. The privacy practices of third party service providers should also be examined as part of proper due diligence.


1 FTC v. Echometrix, Inc., No. 10-cv-05516 (E.D.N.Y. 2010), Stipulated Final Order for Permanent Injunction and Other Equitable Relief; In the Matter of Twitter, Inc., No. 092 3093 (March 11, 2011).

Back to Top


Karen Neuman to Participate in FCBA-DC Bar Panel on Privacy & Mobile Apps
On May 3, 2011, Karen Neuman will participate in a Federal Communications Bar Association Privacy & Data Security Committee/DC Bar Computer and Telecommunications Law Section program on consumer privacy and mobile apps. Her remarks will focus on some of the legal complexities involving questions of data ownership and control, privacy obligations and liability, including in the context of the FTC’s COPPA Rule, as mobile devices fuel the expansion of the “app economy” and its products and services.

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036