St. Ledger-Roty & Olson LLP
Proud sponsor of 1410 Q Street: DC’s Innovation Hot Spot!

PRIVACY & INFORMATION LAW UPDATE
February 2011
A quarterly update of trends and developments in privacy law & policy

Prepared by Karen L. Neuman

  • You are receiving this Update because of your interest in privacy, information management & security issues.
  • Not interested? Click here to unsubscribe
  • Know someone who might be interested in this Update? Please forward.
  • If someone forwarded this Update to you, subscribe to receive future Updates.
  • To view previous issues of this Update click here.
  • This Update is provided for informational purposes only (and may constitute attorney advertising); it is not intended to be, nor should it be considered, legal advice.

In this Issue:
FEATURED ARTICLE: It’s the (App) Economy, Stupid: Privacy, IP & Taxes
SPRING FORWARD/LOOK BACK: Time to Evaluate Data Security Practices for Compliance with the Massachusetts Data Security Act Rules
Al Franken to Chair Senate Subcommittee on Privacy, Technology & the Law
California Supreme Court Rules Zip Code is Personal Information: Implications beyond Brick & Mortar Credit Card Transactions
Enactment of Red Flags Clarification Act Ends Uncertainty about Rule’s Application
Lack of Appropriate Data Security Measures Results in FINRA Fines for Broker-Dealer Firms
FTC-Echometrix Consent Agreement Presages FTC Privacy Report Findings & Proposed Framework Re Transparency & Privacy Policies
6th Circuit Grants Fourth Amendment Protections to E-mails Stored by Third-Party Service Providers: Implications for Social Media & the Cloud
Physician Prescription Case Could Have Broad Implications for the Commercial Use of Consumer Data
Karen Neuman to participate in February 28, 2011 NATOA Webinar on Local Franchise Authority Use of Social Media & Mobile Apps for Distributing Government Content

Featured Article:
It’s the (App) Economy, Stupid: Privacy, IP & Taxes
(Second in a series about the impact of legal and policy developments on the App Economy)

By Karen L. Neuman
Ari Moskowitz*
Amid the growing presence of “apps” in daily life, the business practices of developers, publishers, and platforms are attracting the attention of policymakers, regulators and the plaintiffs’ bar. Privacy and data ownership have been and will continue to be a central focus of attention. Both new entrants and established players are being targeted for lawsuits with increased frequency, as parties look to the courts to clarify legal rights and obligations. In addition to privacy issues, courts are being asked to resolve a variety of other disputes, such as intellectual property claims. For the foreseeable future, the proliferation of lawsuits and varying judicial decisions will likely increase the level of uncertainty for developers, platforms and content providers concerning their legal obligations, risks and costs.

This uncertainty may be desirable for companies with an established market presence. Nevertheless, as the industry continues to grow and mature, businesses will need to take into account the increased risk and cost of litigation and potential liability. The following summary of recently filed cases provides some indication of the types of legal challenges that businesses in the “app economy” can expect to face in the coming years.
Read more...


SPRING FORWARD/LOOK BACK:
Time to Evaluate Data Security Practices for Compliance with the Massachusetts Data Security Act Rules

This March many of us will be reminded to set our clocks forward one hour for daylight saving time. This year the annual rite will also mark the one-year anniversary of the compliance deadline for the regulations that implement the Massachusetts Data Security Act. Now is a useful time for businesses to review their data collection practices and security programs for compliance with the rules.
Read more...


Al Franken to Chair Senate Subcommittee on Privacy, Technology & the Law
On February 14, 2011, Senate Judiciary Committee Chair Patrick Leahy (D-Vt.) announced the creation of a new Senate subcommittee on Privacy, Technology and the Law, to be chaired by Sen. Al Franken (D-Minn.). Other Democratic members will include Chuck Schumer of New York, Sheldon Whitehouse of Rhode Island, and Richard Blumenthal of Connecticut. Republican members will include Orrin Hatch of Utah and Lindsey Graham of South Carolina, with Tom Coburn (R-Okla.) serving as the panel’s ranking member.
Read more...


California Supreme Court Rules Zip Code is Personal Information: Implications beyond Brick & Mortar Credit Card Transactions
On February 10, 2011, the California Supreme Court ruled in Pineda v. Williams-Sonoma that zip codes are “personal identification information” under the state’s Song-Beverly Credit Card Act of 1971. The Act prohibits traditional “brick and mortar” retailers from requesting and recording personal identification information from cardholders as a condition of a credit card transaction. Retailers can be subject to significant penalties – up to $1,000 in damages for each violation of the Act. The decision has already prompted the filing of numerous similar actions against other retailers.
Read more...


Enactment of Red Flags Clarification Act Ends Uncertainty about Rule’s Application
President Obama’s signing of the Red Flag Program Clarification Act of 2010 in December 2010 ended a lengthy period of uncertainty that began with the FTC’s 2007 promulgation of its “Red Flags Rule.” The Rule required businesses and organizations that act as “creditors” within the meaning of the Fair Credit Reporting Act (FCRA) to establish policies and procedures for detecting signs of potential identity theft, or “red flags,” and to take specified measures in response. The Rule’s underlying assumption was that a breach may have already occurred, unlike other rules focused on preventing unauthorized access to personal data.
Read more...


Lack of Appropriate Data Security Measures Results in FINRA Fines for Broker-Dealer Firms
On February 16, 2011, the Financial Industry Regulatory Authority (FINRA), an industry self-regulatory group that is overseen by the Securities and Exchange Commission (SEC), fined Lincoln Financial Securities, Inc. (LFS) and an affiliate, Lincoln Financial Advisors Corp. (LFA) for failing to adequately safeguard customer data as required by the SEC and FINRA. LFS was fined $450,000 for violations that occurred over a seven-year period. LFA was fined $150,000 for similar violations that occurred over a three-year period.
Read more...


FTC-Echometrix Consent Agreement Presages FTC Privacy Report Findings & Proposed Framework Re Transparency & Privacy Policies
By Karen L. Neuman
Ari Moskowitz*
In December 2010, the day before releasing its long-anticipated consumer privacy report, the Federal Trade Commission issued its Final Order in FTC v. EchoMetrix, Inc.

The Order effectively previewed how the FTC anticipates implementing and enforcing the framework proposed in the report. The impetus for FTC action in this case involved two software programs sold by EchoMetrix: (1) Sentry Parental Controls and (2) The Pulse. Sentry was a computer program that allowed parents to record and monitor their children’s online activity, collecting such data as website history, chat and instant messages. EchoMetrix subsequently released Pulse, a market research product that collected and analyzed user-generated content about certain products from social media sites. Data collected through Sentry was incorporated into the Pulse database and made available to marketers and other third parties.
Read more...


6th Circuit Grants Fourth Amendment Protections to E-mails Stored by Third-Party Service Providers:
Implications for Social Media & the Cloud

By Karen L. Neuman
Shannon Mackenzie Orr
On December 14, 2010, a federal appeals court ruled in United States v. Warshak that e-mail subscribers have a reasonable expectation of privacy in the contents of their e-mails, and that the government therefore must obtain a search warrant before it can compel a commercial ISP to turn over those contents. The Court also ruled that provisions of the Stored Communications Act (SCA) that allow warrantless access to opened or archived e-mail stored by third-party service providers violate the Fourth Amendment.
Read more...


Physician Prescription Case Could Have Broad Implications for the Commercial Use of Consumer Data
By Karen L. Neuman
Shannon Mackenzie Orr
On December 13, 2010, the U.S. Supreme Court announced that it will review a Vermont law that limits the commercial use of physician prescription information. Beyond the merits, the case, Sorrell v. IMS Health Inc., could have far-reaching implications for the manner and extent to which government may restrict the commercial use of non-public personal information. The case could also mark the beginning of a trend in which businesses whose models depend on the collection and use of personal data challenge government data privacy regulation on First Amendment grounds.
Read more...


Karen Neuman to participate in February 28, 2011 NATOA Webinar on Local Franchise Authority Use of Social Media & Mobile Apps for Distributing Government Content
When the Cable Communications Policy Act of 1984 was enacted, no one could have imagined, nor does the statute reference, such new communications platforms and technologies as social media and mobile apps. These platforms and technologies can be used by local governments to achieve many of the objectives embodied in the Cable Act’s PEG access provisions and incorporated into local cable franchise agreements. Many local franchise authorities (LFAs) are considering augmenting PEG channels with emerging platforms for distributing government content. Karen will address some of the legal issues associated with local government use of these technologies, particularly mobile apps.


Back to Top


Featured Article:
It’s the (App) Economy, Stupid: Privacy, IP & Taxes
(Second in a series about the impact of legal and policy developments on the App Economy)

By Karen L. Neuman
Ari Moskowitz*
Amid the growing presence of “apps” in daily life, the business practices of developers, publishers, and platforms are attracting the attention of policymakers, regulators and the plaintiffs’ bar. Privacy and data ownership have been and will continue to be a central focus of attention. Both new entrants and established players are being targeted for lawsuits with increased frequency, as parties look to the courts to clarify legal rights and obligations. In addition to privacy issues, courts are being asked to resolve a variety of other disputes, such as intellectual property claims. For the foreseeable future, the proliferation of lawsuits and varying judicial decisions will likely increase the level of uncertainty for developers, platforms and content providers concerning their legal obligations, risks and costs.

This uncertainty may be desirable for companies with an established market presence. Nevertheless, as the industry continues to grow and mature, businesses will need to take into account the increased risk and cost of litigation and potential liability. The following summary of recently filed cases provides some indication of the types of legal challenges that businesses in the “app economy” can expect to face in the coming years.

Privacy and Data Ownership. The integration of apps with geolocation (including “hyperlocation”) services and technologies, advertising (including interstitials), social media, cloud-based data storage services, and near field communication (a mobile payments technology) raises new questions about privacy and data ownership. Device and platform fragmentation adds another layer of complexity to the privacy/data ownership debate, because the acquisition of consumer data is integral to monetizing content and attracting and retaining users.

In December 2010 a class action1 was filed in the Northern District of California against Apple and a number of app developers. The complaint alleges that apps on Apple’s iPhone and iPad collected personal information from users and transmitted it to third-party advertising networks. The personal data claimed to have been collected includes the device’s unique identifier (the “UDID”), as well as age, gender, and location. The plaintiffs allege that this violates Apple’s published policy prohibiting apps from transmitting personal data without the user’s consent.2

Another case currently making its way through the courts is In re Zynga Privacy Litigation.3 This class action was consolidated from a number of complaints against Zynga, some of which also named Facebook as a defendant, arising from privacy concerns over Zynga’s Facebook apps. The complaints allege that Zynga, a developer and provider of game apps such as FarmVille, shared information it collected from its Facebook users with third parties. The plaintiffs claim that this violates Zynga’s agreement with Facebook, Facebook’s privacy policies, and laws including the Electronic Communications Privacy Act and the Stored Communications Act. The complaints against Facebook allege that when users clicked on an ad or app on Facebook, a referrer header was sent to the advertiser or app developer that allowed it to locate and view the user’s Facebook page. In May 2010, prior to the filing of the lawsuits, Facebook fixed the referrer-header behavior referenced in the complaints.4

Some lawsuits have not been “app specific” but have implications for the ability of developers to collect information about users for targeting in-app advertising and other purposes. For example, in Pineda v. Williams-Sonoma Stores, Inc.,5 the California Supreme Court ruled that zip codes are personal information when requested by traditional “brick and mortar” retail businesses engaged in credit card transactions. The Court observed that, unlike with online purchases, zip code information is unnecessary to complete an in-store purchase. Although the Court distinguished between online and traditional retailers, one can easily imagine a lawsuit challenging the need to collect zip codes for online downloads of mobile apps. Developers and other publishers will want to be able to show a clear business purpose for seeking this information from users who access and download apps in order to steer clear of this ruling and potential efforts to expand it.

Lawmakers are paying close attention to these actions. On February 8, 2011, some Democratic members of Congress asked the FTC to initiate an investigation into “in-app purchases” by children using children’s game apps. One stated concern is that children are able to figure out their parents’ passwords and then make purchases without their parents’ knowledge or consent. The FTC has not yet indicated whether it will open an investigation. However, as near field communication technologies and the in-app purchase model for monetizing apps continue to be adopted, developers, publishers and platforms will need to evaluate whether their privacy policies provide enough information about this issue. The timing of the lawmakers’ request is particularly significant because the FTC is currently reviewing the Children’s Online Privacy Protection Act rule, including the possibility of extending the rule’s application to mobile services and tools like mobile apps.

In order to reduce the potential for becoming the target of a lawsuit, app platforms and developers, at a minimum, should examine their data collection and protection practices, clarify contractual data ownership issues, and consider the adoption of standards and industry best practices.

Beyond Privacy:

Intellectual Property. In addition to privacy, intellectual property disputes will play a role in defining rights and responsibilities in the app ecosystem – particularly as advertising, social games, and virtual worlds incorporate third party intellectual property.

Copyright. One example of how courts are being asked to resolve IP issues is an action pending in a Pennsylvania federal court, Hershey v. Hottrix6. The case involves copyright claims over simulated drinking apps. Hershey initiated the action, seeking a declaratory judgment that its Hershey’s Syrup App does not infringe Hottrix’s copyrights in the latter’s iMilk App. In both apps, the iPhone user simulates pouring and drinking a glass of milk by tilting the phone. The apps differ in various respects: Hershey’s app allows the user to add chocolate syrup and drink from a straw, while Hottrix’s app allows the user to shake the milk, turning it into cheese. Nevertheless, the court, in denying a motion to dismiss, concluded that Hottrix had pleaded sufficient facts to allow its claims to proceed. Significantly, the role of the app economy in commerce did not go unnoticed by the court.

Another closely watched case is Viacom v. YouTube.7 Last year YouTube won summary judgment against Viacom’s claim that YouTube infringed Viacom’s copyrights by hosting videos posted to its site without the permission of Viacom, which held the copyrights to those videos. The court found that YouTube was protected under the safe harbor provisions of the Digital Millennium Copyright Act8 because it removed infringing videos upon receiving notice from Viacom. On December 3, 2010, Viacom appealed, and the matter is pending before the Second Circuit. The outcome of this case could be significant for developers that integrate third-party content into their applications.

Patent. Patent disputes threatening to impact device manufacturers, developers and platforms are also winding their way through the courts.

In December, Nokia filed suit9 in three European countries – Great Britain, the Netherlands, and Germany – alleging that Apple violated its patents covering on-device app stores and touch interfaces. These lawsuits are only the latest in a battle over patents that began in October 2009, when Nokia first sued Apple in Delaware federal court and Apple countersued.10 Other companies that own app platforms, such as Google with its Android operating system, are also involved in patent disputes.11 Although these lawsuits are primarily of concern to platforms, they could eventually have ramifications for developers. In one suit against Google and a variety of handset manufacturers, Gemalto S.A. alleges that the defendants violate its patent on a technology that allows Java to run on mobile phones.12 While Gemalto has not sued any developers, its complaint alleges that “android applications and the development of such applications using the Android [Software Development Kit]” infringe on its patents. App developers should watch these patent suits closely for claims, particularly software patent claims, that could be applied to them in the future.

Tax Policy. A state sales tax case that is being closely watched in some quarters could affect certain businesses in the app industry, creating potential barriers to entry and deterring innovation.

Last year, Amazon and Overstock.com challenged13 New York’s “Amazon” law, which requires online retailers with in-state “affiliates” to collect sales tax on their New York sales when those affiliate websites are compensated to advertise for the retailer and generate more than $10,000 in referred sales per tax year. (Subsequent implementing guidance creates a rebuttable presumption that can exempt a retailer from the law’s collection obligations if its “resident” affiliate websites can demonstrate that they did not solicit sales in New York.)

Neither Amazon nor Overstock.com owns property, maintains an office, or has any employees in New York, and neither maintains any other presence in the state. However, each company created an “affiliates” program whereby individuals could link to Amazon or Overstock from their own websites; the affiliates were paid a commission on sales of the retailers’ merchandise. The Court rejected the retailers’ facial challenges to the sales tax law, including the facial challenge to the law’s constitutionality, but preserved certain “as applied” claims. The finding that the law is facially constitutional could encourage cash-strapped states seeking new sources of revenue to adopt similar laws. Indeed, such laws have already been enacted in Rhode Island and North Carolina, and similar legislation is pending in several other states, including Connecticut, Maryland, Minnesota and Tennessee.

Depending on statutory language, it is conceivable that laws intended to capture proceeds from affiliate marketing programs could be extended to apps, in which an app links to a retailer’s website in exchange for a commission on referred sales, particularly given the trend toward in-app purchases and payments. If so, a number of complicated questions are likely to arise. For example: What if the sale happened through an Amazon app on Facebook? What if both Amazon and Facebook have affiliates in the state? Who should be collecting the tax? Does it matter how the app works, i.e., whether the transaction goes through Facebook or directly through Amazon? Which, if any, of these entities can be seen as having solicited sales?

Accordingly, affiliate marketing or similar programs, if incorporated into app business models, could subject developers, publishers or platforms to prohibitive, unanticipated tax liability.

At a minimum, developers, publishers and platforms should review the terms of any contracts with out-of-state retailers, as well as their own practices regarding marketing or sales of the retailers’ merchandise, to ascertain whether they can be seen as having the required nexus for state tax liability.

Conclusion.

Over the next few years, developers, publishers and platforms can expect a period of disruption and uncertainty as new entrants and established businesses look to the courts to define legal rights and obligations. Much of the litigation has been, and will continue to be, focused on privacy, data ownership and security. Policymakers have been struggling in other contexts (notably telecommunications and online services) to craft a regulatory framework that addresses privacy concerns without stifling innovation. Because this process has not played out in the policy arena as quickly as some would like, the courts have been tasked with resolving these issues. It is almost certain, however, that obtaining clarity in the law through litigation will itself take many years.

In addition to privacy, litigation involving intellectual property rights can be expected to create important guideposts for developers, publishers and platforms. Some businesses may have to decide whether resorting to the courts is the best way to protect intellectual property and assets, while others may want to consider business practices to minimize the risk of becoming the target of a lawsuit.


*Ari Moskowitz is a third-year law student at George Washington University and a Law Clerk at St. Ledger-Roty & Olson, LLP. He previously interned at the NTIA’s Internet Policy Task Force, where he worked on the Department of Commerce privacy “Green Paper” released in December 2010.

1 Lalo v. Apple, Inc., No. 10-cv-05878 (N.D. Cal. filed Dec. 23, 2010).
2 See also Chiu v. Apple, Inc., No. 11-cv-00407 (N.D. Cal. filed Jan. 27, 2011) (making similar allegations regarding Apple’s disclosure of users’ UDIDs to third parties).
3 In re Zynga Privacy Litigation, No. 10-cv-4680 (N.D. Cal. filed Oct. 18, 2010).
4 Facebook, Protecting Privacy with Referrers, May 24, 2010, http://www.facebook.com/note.php?note_id=392382738919.
5 Pineda v. Williams-Sonoma Stores, Inc., No. S178241, slip op. (Cal. Feb. 10, 2011).
6 Hershey Co. v. Hottrix, LLC, No. 10-cv-01178 (M.D. Pa. filed June 2, 2010).
7 Viacom Int’l, Inc. v. YouTube, Inc., 718 F. Supp. 2d 514 (S.D.N.Y. 2010).
8 17 U.S.C. § 512(c).
9 Press Release, Nokia, Nokia files patent infringement complaints against Apple in the UK, Germany and the Netherlands (Dec. 16, 2010), available at http://press.nokia.com/2010/12/16/nokia-files-patent-infringement-complaints-against-apple-in-the-uk-germany-and-the-netherlands-2/.
10 Nokia Corp. v. Apple Inc., No. 09-cv-00791 (D. Del. filed Oct. 22, 2009).
11 See, e.g., Oracle America, Inc. v. Google Inc., No. 10-cv-3561 (N.D. Cal. filed Aug. 12, 2010) (alleging that Google violated Oracle’s patents for various Java technologies; Google recently requested that the Patent and Trademark Office reexamine the validity of many of those patents).
12 Gemalto S.A. v. HTC Corp., No. 10-cv-00561 (E.D. Tex. filed Oct. 22, 2010).
13 Amazon.com, LLC v. New York State Dep’t of Taxation & Finance, No. 601247/08 (N.Y. App. Div. Nov. 4, 2010); Overstock.com, Inc. v. New York State Dep’t of Taxation & Finance, No. 107581/08 (N.Y. App. Div. Nov. 4, 2010).

Back to Top


SPRING FORWARD/LOOK BACK: Time to Evaluate Data Security Practices for Compliance with the Massachusetts Data Security Act Rules
This March many of us will be reminded to set our clocks forward one hour for daylight saving time. This year the annual rite will also mark the one-year anniversary of the compliance deadline for regulations that implement the Massachusetts Data Security Act.1 Now is a useful time for businesses to review their data collection practices and security programs for compliance with the rules.

By way of reminder, the rules apply to all nongovernmental entities that collect and retain personal information, in paper or electronic form, about any resident of the Commonwealth of Massachusetts in connection with the provision of goods and services or for purposes of employment. The term “person” includes non-governmental corporations and other legal entities,2 as well as service providers that “receive, store, maintain, process, or otherwise are permitted access to personal information through its provision of services directly to” an entity subject to the law.3 The rules apply irrespective of whether the service provider “resides” in the Commonwealth of Massachusetts.

Entities subject to the rules are required to create and implement a “written, comprehensive information security program” (WISP) that takes into account the particular business’s size, scope, available resources, the nature and quantity of data collected or stored, and the need for security. The rules mandate that the WISP contain, to the extent technically feasible, the following elements:

  • (1) Secure user authentication protocols including:
    • (a) Control of user IDs and other identifiers;
    • (b) A reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices;
    • (c) Control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect;
    • (d) Restricting access to active users and active user accounts only; and
    • (e) Blocking access to user identification after multiple unsuccessful attempts to gain access or the limitation placed on access for the particular system;

  • (2) Secure access control measures that:
    • (a) Restrict access to records and files containing personal information to those who need such information to perform their job duties; and
    • (b) assign unique identifications plus passwords, which are not vendor supplied default passwords, to each person with computer access, that are reasonably designed to maintain the integrity of the security of the access controls;

  • (3) Encryption of all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly.

  • (4) Reasonable monitoring of systems, for unauthorized use of or access to personal information;

  • (5) Encryption of all personal information stored on laptops or other portable devices;

  • (6) For files containing personal information on a system that is connected to the Internet, there must be reasonably up-to-date firewall protection and operating system security patches, reasonably designed to maintain the integrity of the personal information.

  • (7) Reasonably up-to-date versions of system security agent software which must include malware protection and reasonably up-to-date patches and virus definitions, or a version of such software that can still be supported with up-to- date patches and virus definitions, and is set to receive the most current security updates on a regular basis.

  • (8) Education and training of employees on the proper use of the computer security system and the importance of personal information security.4

Representatives of the Massachusetts Attorney General’s Office and the Office of Consumer Affairs and Business Regulation recently suggested in public remarks5 that the following factors could trigger an investigation:

  • Knowledge by the reporting entity of the breach yet failure to notify affected individuals as required by the Notice Law.
  • No written information security program (WISP), or one that cannot be produced.
  • A WISP that is inadequate or has significant gaps because of a lack of due diligence in the risk assessment process.
  • The compromised data was stored or maintained in circumstances not compliant with the “reasonable” security required by the Regulations.
  • Unfairness or deception around the purpose for which the data was originally collected.
  • Collected data that was subsequently used for purposes not disclosed to consumers, or collection that was itself not disclosed, resulting in unfairness or deception to Massachusetts residents.

In addition to the risk of an investigation following a breach of their own customer or employee databases, businesses may also risk unwanted attention in the event of a breach of a third-party vendor’s database (such as a payment or payroll partner). Accordingly, businesses should carefully review the data collection, management and protection practices of vendors when contracting for services, and should take the risk of a third-party vendor’s data breach into account when creating a WISP and related breach response policies. In the event of such a breach, the risk of an investigation by the state can be reduced by taking several measures, including:

  • Notifying the AG of the vendor’s security breach; and
  • Producing evidence of due diligence in selecting the vendor, or a contract that addresses the vendor’s obligations to protect the security of personal information received from the business.

Conclusion.

The one-year anniversary of the Massachusetts data protection regulations compliance deadline is rapidly approaching. A wide swath of organizations is subject to these rules. Now is the time for businesses to carefully assess risk and evaluate their written data security policies, third party vendor agreements, and employee training programs to ensure compliance.


1 MASS. GEN. LAWS ch. 93H (2010); 201 MASS. CODE REGS. 17.05 (2010)
2 MASS. GEN. LAWS ch. 93H, § 1 (2010)
3 201 MASS. CODE REGS. 17.02 (2010)
4 Id. at 17.04
5 See Scott Shafer, Chief, Consumer Prot. Div., Mass. Attorney General, Address at International Association of Privacy Professionals KnowledgeNet: Boston (Dec. 13, 2010), https://www.privacyassociation.org/publications/2011_01_07_mass._officials_discuss_data_security_regs.

Back to Top


Al Franken to Chair Senate Subcommittee on Privacy, Technology & the Law
On February 14, 2011, Senate Judiciary Committee Chair Patrick Leahy (D-Vt.) announced the creation of a new Senate subcommittee on Privacy, Technology and the Law, to be chaired by Sen. Al Franken (D-Minn.). Other Democratic members will include Chuck Schumer of New York, Sheldon Whitehouse of Rhode Island, and Richard Blumenthal of Connecticut. Republican members will include Orrin Hatch of Utah and Lindsey Graham of South Carolina, with Tom Coburn (R-Okla.) serving as the panel’s ranking member.

According to a press release issued by Senator Franken’s office, the Subcommittee’s jurisdiction will include oversight of laws and policies governing the collection, protection, use, and dissemination of commercial information by the private sector, including online behavioral advertising; social networking and other online privacy issues; enforcement and implementation of commercial information privacy laws and policies; use of technology by the private sector to protect privacy, enhance transparency and encourage innovation; privacy standards for the collection, retention, use and dissemination of personally identifiable commercial information; and privacy implications of new or emerging technologies.

The Subcommittee’s creation comes at a time of accelerated attention to privacy in Congress (where at least three privacy bills were recently introduced) and at the FTC, where the agency is reviewing comments submitted in response to its draft privacy report. A final version of the report, which could include policy recommendations, is expected in a few months. In addition, FTC Chairman Jon Leibowitz recently indicated that the agency has initiated a number of privacy investigations and that more enforcement actions are in the works.

Privacy has garnered significant attention in other quarters, including the Department of Commerce (which released its privacy “Green Paper” shortly after the release of the FTC Report), the Department of Energy (which released its report on Privacy Issues & the Smart Grid in October 2010) and the FCC (which addressed privacy issues in its National Broadband Plan). The Consumer Financial Protection Bureau created by the Dodd-Frank financial reform law will focus on a wide swath of privacy issues involving the financial services industry.

It is difficult to predict what role the new Subcommittee will play in policymaking, as other Senate committees (Commerce and Banking) already have jurisdiction over many of the privacy and data security issues mentioned in Senator Franken’s press release. But one thing is clear: Congress has struggled in previous sessions to come up with a framework for protecting privacy in the digital age, and the 112th Congress promises, at a minimum, to match that effort.

Back to Top


California Supreme Court Rules Zip Code is Personal Information: Implications beyond Brick & Mortar Credit Card Transactions
On February 10, 2011, the California Supreme Court ruled in Pineda v. Williams-Sonoma1 that zip codes are “personal identification information” under the state’s Song-Beverly Credit Card Act of 1971.2 The Act prohibits traditional “brick and mortar” retailers from seeking and recording personal identification information from credit card holders as a condition of credit card transactions. Retailers can be subject to significant fines – up to $1,000 in damages for each violation of the Act. The decision has already prompted the filing of numerous similar actions against other retailers.

The decision’s immediate practical effect is to prohibit retailers from collecting and recording zip code information during credit card transactions, creating a significant impediment to the ability of retailers to generate catalog sales from existing customers. The decision’s long-term impact is unclear. It is plausible, however, that efforts will be made to apply Pineda to online credit card transactions through the filing of lawsuits in California, and possibly in other jurisdictions with similar statutes. If successful, these lawsuits could have significant consequences for a wide swath of businesses engaged in online credit card transactions, including mobile purchases and virtual, in-app purchases over mobile devices.

The case originated when the Plaintiff made a credit card purchase at a Williams-Sonoma store. During the transaction she was asked to provide her zip code. The Plaintiff did so, believing it was necessary to complete the purchase. The Complaint alleged that Williams-Sonoma subsequently stored her zip code for marketing purposes. The trial court dismissed the Complaint after concluding that a person’s zip code alone is not personal identification information within the meaning of the statute. The Court of Appeal affirmed. The Supreme Court reversed.

The Song-Beverly Credit Card Act limits the collection of personal information from consumers during credit card transactions. The Act provides:

  • “[N]o person or … corporation that accepts credit cards for the transaction of business shall . . . (2) Request, or require as a condition to accepting the credit card as payment … for goods or services [] the cardholder to provide personal identification information, which the person … or corporation accepting the credit card writes, causes to be written, or otherwise records...”3
  • “Personal identification information” is defined as “information concerning the cardholder, other than information set forth on the credit card, and including, but not limited to, the cardholder’s address and telephone number.”4 [emphasis added]

The Court used the dictionary and applied traditional principles of statutory construction to interpret key terms in the Act. Examining the law’s legislative history, including its “robust” consumer protection purpose, the Court interpreted the term “address” to include such “components” as a person’s zip code. Alone or together, address components could be linked to a specific credit card number and used to identify an individual. Thus, zip codes are personal identification information under the Act.

Citing subsequent amendments to the Act that included a prohibition against recording consumer identification information, the Court also determined that the Act’s legislative history "demonstrates the Legislature intended to provide [added] consumer protections by prohibiting retailers from soliciting and recording information about the cardholder that is unnecessary to the credit card transactions." The Court found that zip code is used by retailers to create databases for their own marketing uses or to sell to third parties. Hence, this information is unnecessary to credit card transactions.

Retailers with operations in California should review their information collection and retention practices in light of this decision, and stop collecting zip code information from customers during offline credit card transactions. Online retailers, including businesses that engage in transactions involving new technologies (mobile phone payments or “in-app” purchases), should be familiar with Pineda and undertake a careful review of their data collection and retention practices to make sure those practices are in line with the decision.


1 Pineda v. Williams-Sonoma Stores, Inc., No. S178421, 2011 WL 446921, _ Cal. 4th _ (Cal. Feb. 10, 2011).
2 California Civil Code §1747.08.
3 Id., §1747.08(a)(2)
4 Id., §1747.08(b).

Back to Top


Enactment of Red Flags Clarification Law Ends Uncertainty about Rule's Application
President Obama’s signing of the Red Flag Program Clarification Act of 2010 in December 2010 ended a lengthy period of uncertainty that began with the FTC’s 2007 promulgation of its “Red Flags Rule.” The Rule required businesses and organizations that act as "creditors" within the meaning of the Fair Credit Reporting Act (FCRA) to establish policies and procedures for detecting signs of potential identity theft, or “red flags,” and to take specified measures in response. The Rule’s underlying assumption was that a breach may have already occurred, unlike other rules, whose focus was on preventing unauthorized access to personal data.

The term "creditor" was broadly defined to include "any person who regularly extends, renews, or continues credit or any person who regularly arranges for the extension, renewal, or continuation of credit; or any assignee of any original creditor who participates in the decision to extend, renew, or continue credit."2 Accordingly, a broad swath of businesses were surprised to discover that they were covered under the rule, including lawyers, medical professionals, “mom and pop” retailers that routinely bill in arrears for goods and services, and even veterinarians. The result was a stream of requests for exemption from the Rule’s application, which in turn, resulted in numerous FTC decisions to delay enforcement.

The issue was addressed by the 111th Congress, culminating in the Clarification law, which limits application of the Red Flags Rule to “creditors” that regularly and in the ordinary course of business: 1) obtain or use consumer reports, directly or indirectly, in connection with a credit transaction; 2) furnish information to certain consumer reporting agencies in connection with a credit transaction; or 3) advance funds to or on behalf of a person, based on a person’s obligation to repay the funds or on repayment from specific property pledged by or on the person’s behalf.

The amendment excludes creditors “that advance funds on behalf of a person for expenses incidental to a service provided by the creditor to that person.” This exclusion addresses the concern that the original definition of creditor in the FCRA improperly extended the Red Flags Rule’s scope to businesses that were not typically considered to be creditors, including health care providers and law firms. These entities may still be subject to the Rule if they offer or maintain accounts that are subject to a “reasonably foreseeable risk of identity theft.”

At a minimum, businesses should assess their business practices – including billing practices – to ascertain whether they are covered by the amended definition of "creditor". They should also assess their current programs for detecting and responding to potential identity theft to ensure that these programs comply with the Rule.

Back to Top


Lack of Appropriate Data Security Measures Results in FINRA Fines for Broker-Dealer Firms
On February 16, 2011, the Financial Industry Regulatory Authority (FINRA), an industry self-regulatory group that is overseen by the Securities and Exchange Commission (SEC), fined Lincoln Financial Securities, Inc. (LFS) and an affiliate, Lincoln Financial Advisors Corp. (LFA) for failing to adequately safeguard customer data as required by the SEC and FINRA. LFS was fined $450,000 for violations that occurred over a seven-year period. LFA was fined $150,000 for similar violations that occurred over a three-year period.

Entities subject to FINRA and SEC rules, including broker-dealers, are required to have written rules and procedures in place to safeguard customer data. In particular, Rule 30 of Regulation S-P requires that every broker-dealer adopt written policies and procedures that address administrative, technical, and physical safeguards for the protection of customer records and information. The policies and procedures must be reasonably designed to:

  • (1) insure the security and confidentiality of customer records and information;
  • (2) protect against any anticipated threats or hazards to the security or integrity of customer records or information; and
  • (3) protect against unauthorized access to or use of customer records or information that could result in substantial harm or inconvenience to any customer.1

The rules specifically prohibit some of the practices at issue here, including the use of shared logins.

LFS entered into a consent decree under which it acknowledged failing to adequately protect customer records and information in OmniSource, its electronic portfolio management system, between 2002 and 2009. LFS acknowledged having engaged in such practices as letting employees use shared logins, including usernames and passwords, to access customer data, including names, social security numbers, birth dates, e-mail addresses and account balances. The passwords were not changed, and remained in place even after employees who knew the passwords left the companies. From 2002 to 2009, 1 million customer account records were accessed through the shared logins. In addition, LFS acknowledged that it failed to establish procedures mandating that field representatives install anti-virus software and other protection on representative-owned computers used to conduct LFS securities-related business off-site. LFS also failed to audit representative-owned computers to confirm the installation of security software or look for indicia of potential or actual breaches.

The Letter of Acceptance, Waiver and Consent clearly spells out the measures that FINRA considers to be required for compliance with its data protection rules. Entities subject to FINRA and the SEC should review their data security policies and practices, including employee training, to ensure that they are not engaging in the practices that led to the fines in this instance and that appropriate measures are being taken to safeguard customer data in compliance with the rules.


1 17 CFR 248.30.

Back to Top


FTC-Echometrix Consent Agreement Presages FTC Privacy Report Findings & Proposed Framework Re Transparency & Privacy Policies
By Karen L. Neuman
Ari Moskowitz*
In December 2010, the day before releasing its long-anticipated consumer privacy report, the Federal Trade Commission issued its Final Order in FTC v. EchoMetrix, Inc.1

The Order effectively previewed how the FTC anticipates implementing and enforcing the framework proposed in its December 2010 consumer privacy report. The impetus for FTC action in this case involved two software programs sold by EchoMetrix: (1) Sentry Parental Controls and (2) The Pulse. Sentry was a computer program that allowed parents to record and monitor their children’s online activity. Sentry collected such data as website history, chat and instant messages. EchoMetrix subsequently released Pulse, a market research product that collected and analyzed user-generated content about certain products from social media sites. Data collected through Sentry was incorporated into the Pulse database and made available to marketers and other third parties.

When parents set up Sentry on a computer they were asked to enter a name, mailing and email addresses, and the ages and genders of the children who would be using the monitored computer. Parents were notified through the privacy policy that information collected from the monitored computer would be stored on EchoMetrix's servers and accessible to parents anywhere they could log in online to their Sentry account. The only indication to parents that the collected information would be shared with third parties, for example through Pulse, was a sentence in the End User License Agreement (EULA), which parents must accept to use Sentry. This sentence was found after scrolling down about 30 paragraphs: “[Sentry] uses information for the following general purposes: to customize the advertising and content you see, fulfill your requests for products and services, improve our services, contact you, conduct research, and provide anonymous reporting for internal and external clients.”

The FTC action focused on the relationship between Sentry and Pulse, and the specifics of EchoMetrix’s disclosures to its customers in the Sentry EULA about how their personal data was captured and shared with or used by third parties. The FTC concluded that the only notice given to Sentry customers that their online activities could be shared with third parties was that “vague” statement 30 paragraphs down in the EULA. The FTC also noted that the EULA was visible in a small scroll box when customers initially registered the software, but after registration the EULA was difficult to find and review. Customers who could find the EULA were given a choice to opt out of a "collection process." The FTC charged that EchoMetrix’s failure to "adequately disclose" Pulse, its interaction with Sentry, and how collected information would be shared between them was a deceptive practice under the FTC Act.

The consent decree lays out an approach that closely tracks the proposed framework in the FTC’s privacy report. As we noted in December’s Privacy & Information Law Update, the FTC report highlighted three major approaches to improved privacy protection: (1) Privacy by design, including collecting only necessary data and keeping that data only as long as necessary; (2) Clearer and more streamlined choices for data collection and use, such as offering the choice whether to allow a particular collection or use of data in an appropriate context. For example, a business would not need to explicitly seek its customers' consent to collect shipping addresses to ship purchased products. However, if the company also wishes to share that address with direct marketers, it should seek consent when and where the customer provides their shipping information; and (3) Promoting transparency, particularly by making privacy policies and license agreements clear and concise.

The consent decree embodied in the EchoMetrix Final Order2 includes stipulations along these lines. EchoMetrix agreed to restrictions on the use of information collected through Sentry, to destroy any Sentry-collected information already made available to third parties through Pulse, and to keep records of all complaints, as well as all versions of its privacy policies and EULAs. Privacy by design is promoted by restricting the flow of information to third parties, as that sharing is not necessary for the functioning of Sentry. The requirement that data be destroyed serves a similar purpose. The restrictions on use also highlight the second of the FTC's proposed approaches - clarity of choice. The use restrictions distinguish between the collected data's primary use, supporting Sentry, and secondary use, market research through Pulse. It is worth noting that the FTC's stated reason for bringing this action was a perceived deficiency in the way EchoMetrix handled the FTC's third approach - a lack of transparency in EchoMetrix's Sentry EULA and privacy policies. While the FTC did not require a particular EULA from EchoMetrix to resolve the case, it pressed on privacy by design and clarity of choice to compensate for those deficiencies.

This case provides important guidance for businesses that collect information (directly or through the use of multiple online products) from their customers:

  • Businesses that collect information on children must be extraordinarily careful about how that information is collected, shared and used; and provide meaningful notice to parents.
  • Lengthy, boilerplate terms of service may no longer satisfy the Federal Trade Commission Act’s prohibition against deceptive practices. The FTC will look not only at whether the customer was given notice, but also at what form that notice took. Terms of service and EULAs presented at the outset of a registration process that cover everything from copyright terms to arbitration clauses to data security policies may not be considered sufficient notice of any one single term. Lengthy, everything-but-the-kitchen-sink legal documents are now subject to being treated as a deceptive practice.
  • A customer’s decision whether to accept or reject a secondary use of their data should be context-driven and clear. Businesses should present customers with the choice of how they will permit their information to be used at the time and place that information is collected. Businesses can simplify these choices by only asking for explicit permission when the proposed use is secondary – such as using a shipping address for direct marketing.

The release of the FTC’s privacy report could result in an increase in similar enforcement actions against businesses whose terms of service, EULAs and privacy policies fail to adequately disclose actual information collection, sharing and use practices, or are otherwise not sufficiently transparent. Businesses should undertake a thorough review of their data collection, sharing and use practices in light of EchoMetrix and the FTC’s privacy report. Privacy policies, terms of use and EULAs should be updated to reflect adherence to the privacy protection framework embodied in these proceedings.


*Ari Moskowitz is a third-year law student at George Washington University and a Law Clerk at St. Ledger-Roty & Olson, LLP. He previously interned at the NTIA’s Internet Policy Task Force, where he worked on the Department of Commerce Privacy “Green Paper” that was released in December, 2010.

1 Federal Trade Commission v. Echometrix, Inc., No. 10-cv-05516 (E.D.N.Y. 2010), Stipulated Final Order for Permanent Injunction and Other Equitable Relief.
2 Id.

Back to Top


6th Circuit Grants Fourth Amendment Protections to E-mails Stored by Third-Party Service Providers:
Implications for Social Media & the Cloud

By Karen L. Neuman
Shannon Mackenzie Orr
On December 14, 2010, a federal appeals court ruled in U.S. v. Warshak1 that e-mail subscribers have a reasonable expectation of privacy in the contents of their e-mails, and therefore, the government must obtain a search warrant before it can compel a commercial ISP to turn over the contents of a subscriber’s e-mails. The Court also ruled that provisions of the Stored Communications Act (SCA)2 that allow warrantless access to opened or archived e-mail stored by third-party service providers violate the Fourth Amendment.

The impact of this decision remains to be seen as courts continue to be asked to apply increasingly outdated laws, including the ECPA, to new and evolving communications technologies and services. The decision’s immediate practical impact is that, at least in the 6th Circuit, a warrant based on probable cause is now required before the government can compel an Internet Service Provider (ISP) to disclose customer e-mail communications, including opened or archived customer e-mails. To the extent the 6th Circuit’s reasoning is persuasive for other circuits, the same result could obtain in those circuits.

The SCA.

The SCA, a 1986 amendment to the Electronic Communications Privacy Act (ECPA)3, requires investigators to obtain a search warrant for electronic communications that have been stored for 180 days or less. After 180 days the government can obtain sought-after communications by securing an evidence preservation order, and then obtaining a subpoena for the preserved communications. The procedural protections that accompany the issuance of a subpoena are significantly less than those that govern the issuance of a search warrant.

A search warrant is issued by a judge and requires a law enforcement official to provide a sworn affidavit citing probable cause (facts and circumstances within the officer’s knowledge that would cause a reasonable person to believe that criminal activity is occurring or that evidence of a crime may be found). By contrast, a subpoena may be issued under the less rigorous “reasonableness” standard – in other words, upon a showing that there is a reasonable possibility that the evidence will provide information germane to the investigation. Although subpoenas issued under the SCA can be challenged in theory, in practice there is often no opportunity to do so before the evidence is seized by the government, particularly since targets are not required to be notified before the subpoenas issue.

Warshak.

Warshak arose from the convictions of Warshak and his codefendants on fraud and related charges stemming from the marketing of their company’s herbal supplement products. Thousands of the defendants’ e-mails were seized by the government from the defendants’ e-mail service providers pursuant to a subpoena under §2703(d) of the SCA after the e-mails were preserved pursuant to a sealed evidence preservation order. The subpoena required the defendants’ ISPs to turn over both account information and the e-mails’ contents to the government. The ISPs were barred from disclosing the subpoena, and Warshak was not notified about it by the government until more than a year later.

The trial court rejected Warshak’s motion to suppress the e-mail evidence and the defendants were convicted.

Warshak appealed, arguing that the government’s seizure of his e-mails violated both the SCA and his Fourth Amendment right to be protected against unreasonable searches and seizures. The Court agreed with respect to the Fourth Amendment claim, concluding that the SCA is unconstitutional to the extent it permits the government to obtain warrantless access to stored individual e-mails.

The Court found that Warshak had a subjective expectation of privacy in the seized e-mails, which contained “sensitive and often damning” details of “his entire business and personal life” – an expectation that society is prepared to recognize as reasonable. Noting their prominent role in modern communication, and their similarities with traditional forms of communication, the Court found that e-mails require strong Fourth Amendment protection – to afford e-mails less protection would defy common sense.4

The Court cited the Fourth Amendment protections historically accorded to traditional forms of communication and reasoned that modern communications methods, like e-mail, are no different from letters traveling via post or telephone calls traveling via telephone wires. Phone calls are protected against warrantless wiretaps even though phone companies have the capacity to monitor calls; letters are protected even though the postal service can easily open and read them. Thus, “the threat or possibility of access is not decisive when it comes to assessing the reasonableness of an expectation of privacy.”5 Likewise, the Court noted that when an individual sends an e-mail he or she does so for the purpose of delivery and storage, not for the ISP’s use in the ordinary course of business or for other purposes. Therefore, “a subscriber enjoys a reasonable expectation of privacy in the contents of e-mails that are stored with, or sent or received through, a commercial ISP.”6

On the basis of this analysis, the Court concluded that: 1) users have a reasonable expectation of privacy in private e-mail accounts and the contents of messages that are stored on a third-party server, and therefore, 2) these e-mails are protected under the Fourth Amendment. Accordingly, government requests for disclosure of a user’s e-mail from an ISP are subject to the Fourth Amendment’s warrant requirement.

The government had argued that the “search” and seizure of the e-mails was constitutional because: 1) the ISP’s Terms of Service (TOS) agreement notified users that the ISP retained the right to access users’ e-mails for various business and other purposes; and 2) Warshak had no reasonable expectation of privacy in his e-mails because he consented to third-party storage. The government also argued that disclosure of e-mails could be compelled from third-party service providers under Section 2703(d) of the SCA, which expressly authorizes the disclosure of electronically stored e-mails by administrative subpoena.

The Court rejected each of these arguments. It held that even where a statute like the SCA authorizes government access to electronic communications, the ability to do so will not circumvent the warrant requirement where the statutory due process scheme is unconstitutional.7

The Court also rejected the argument that the TOS put Warshak on notice that the ISP reserved a right of access to users’ e-mail, thereby vitiating any expectation of privacy. The Court compared an ISP’s TOS to the right of access a landlord reserves in a lease or a hotel owner retains over rooms for purposes of maintenance or inspection (neither of which defeats an expectation of privacy so as to permit warrantless access by law enforcement agents). By the same token, the language in the TOS allowing ISP access to customer e-mails for specified purposes did not defeat Warshak’s reasonable expectation of privacy. The Court refused to create a bright-line rule, however, acknowledging that there could be circumstances under which a user’s privacy expectation could be defeated – for example, when an ISP “expresses an intention to ‘audit, inspect and monitor’ its subscriber’s emails.”8 The Court left that issue to future litigants.

The Court similarly rejected the government’s assertion that Warshak lacked an expectation of privacy in his e-mails because they were stored by a third party. It also distinguished precedent9 finding no reasonable expectation of privacy in the contents of bank records, such as deposit slips or checks, from the potentially unlimited variety of “confidential communications” at issue in Warshak. In the Court’s view, bank customers “take the risk of revealing personal information to others” by voluntarily giving business record information to banks for use by personnel solely to provide banking services.

By contrast, an ISP that hosts e-mail accounts containing confidential information acts solely as “an intermediary, not [as] the intended recipient of the e-mail” itself,10 providing a service that is essential to the communication without using the e-mails’ contents to provide customer services.

Warshak in Context.

Warshak is another example of how courts are being asked to resolve ambiguities in laws that do not reference, and could not have foreseen, today’s communications technologies and services, which include third-party hosted content and public communications to defined groups of people over social media platforms.

For example, in Crispin v. Audigier,11 a federal district court concluded that private social media messages, as opposed to what the Court considered to be public social media wall postings, are not discoverable under the SCA. This outcome turned largely on the Court’s analysis and classification of certain ISP services. The Court also undertook an extensive analysis of the SCA noting, in the process, the difficulty of applying a statute that was enacted over two decades ago to today’s communications technologies and users’ practices.

In City of Ontario v. Quon,12 the Supreme Court ruled that a Police Department’s search of an employee’s Department-provided mobile communications device was reasonable under the Fourth Amendment where the employee had notice that he lacked an expectation of privacy in communications (text messages) transmitted over the workplace device. Interestingly, the Court declined to review a separate aspect of the Ninth Circuit’s decision, which had ruled that a service provider who discloses opened and stored text messages in the absence of a warrant is liable under the SCA. Moreover, the Court appeared to invite further litigation involving new communications technologies to better understand changes in “information transmission” technology and what “society accepts as proper behavior.”

One question not directly addressed by Warshak involves circumstances where a third party’s role as an intermediary is blended with other functions, such as the use of algorithms or other technologies by ISPs to scan e-mails or other communications in order to serve targeted advertisements.

A federal district court in Texas may have the opportunity to apply Warshak’s reasoning to answer this question in Dunbar v. Google.13 The Plaintiff in this case alleges that Google violated the ECPA by scanning Gmail users' e-mails and using that information to serve targeted ads that appear on those users’ computer screens. Interestingly, the Complaint contends that Google’s actions make it more than an intermediary. In another action,14 a federal district court rejected an argument made on behalf of a class that an ISP’s sharing of e-mail communications with a third-party ad serving company violated the ECPA’s prohibition against “procuring” e-mail intercepts. The court reasoned that the Plaintiffs had ample notice and opportunity to opt out of the intercepts and consented to them.

Conclusion.

Although Warshak arose in the context of e-mail communications, it appears to recognize a strong privacy interest in third party storage of digital data. Accordingly, Warshak could have significant legal implications for cloud computing, including: 1) the discoverability of electronic communications and other personal data hosted by third parties; 2) assignment of liability for businesses that use the cloud to make unauthorized content available online; and 3) clarification of data ownership when that data is stored in or accessed from third party cloud service providers.

Commercial ISPs, cloud service providers and social media service providers should take a close look at their privacy policies and terms of use in light of Warshak. It may be that those policies can be used to commercial advantage to differentiate one service provider from another by tailoring the policies to users’ expectations and promoting post-Warshak protections. By the same token, businesses should examine the policies of their service providers to ensure the desired level of privacy protection for employee communications. In addition, all organizations should closely monitor policy and judicial developments, including ECPA reform and other court decisions addressing ECPA protections for stored or in-transit communications.


1 U.S. v. Warshak, No. 06-4092, 2010 WL 5071766 (6th Cir. Dec. 14, 2010).
2 The Stored Communications Act, 18 U.S.C. § 2701 (2010).
3 The Electronic Communications Privacy Act, 18 U.S.C. §§ 2510-2522 (2010).
4 Warshak, 2010 WL 5071766 at 19.
5 Id. at 21.
6 Id. at 23.
7 Id. at 23.
8 Id. at 22.
9 United States v. Miller, 425 U.S. 435 (1976).
10 Warshak, 2010 WL 5071766 at 23.
11 Crispin v. Audigier, 717 F. Supp. 2d 965 (C.D. Cal. 2010).
12 City of Ontario v. Quon, 130 S.Ct. 2619, 177 L.Ed.2d 21 (2010).
13 Dunbar v. Google, 5:2010cv00194 (E.D. Tex., filed Nov. 17, 2010).
14 Mortensen v. Bresnan Communications, CV 10-13-BLG-RFC (D. Mont. Dec. 13, 2010).

Back to Top


Physician Prescription Case Could Have Broad Implications for the Commercial Use of Consumer Data
By Karen L. Neuman
Shannon Mackenzie Orr
On December 13, 2010, the U.S. Supreme Court announced that it will review a Vermont law that limits the commercial use of physician prescription information. In addition to the merits, the case, Sorrell v. IMS Health Inc.,1 could have far-reaching implications for the manner and extent to which government may restrict the commercial use of non-public personal information. This case could also mark the beginning of a trend by businesses that incorporate the collection and use of personal data into their business models to challenge government data privacy regulation on First Amendment grounds.

Sorrell involves a challenge by a pharmaceutical industry trade group and three data-mining companies to Vermont’s Prescription Confidentiality Law.2 The law was enacted in 2007 to address privacy and other concerns associated with the marketing practice known as “detailing.”

“Detailing” involves the purchase of physician prescription data by data-mining companies from pharmacies (which are required by law to maintain records about both the prescribing physician and the patient). The data is then combined with additional information available in other databases and sold to drug companies for use in their brand-name prescription drug marketing campaigns directed at individual physicians. The data is also used to monitor the marketing success of drug companies and individual detailers. While the data does not include patient names, it does include information that identifies specific physicians. Privacy advocates contend that the information can easily be combined with other available data to identify individual patients and reveal their prescription drug histories.

Troubled by perceived threats to medical privacy associated with detailing and the rising health care costs attributed to the widespread use of non-generic, name-brand pharmaceutical drugs, the Vermont Legislature passed the Prescription Confidentiality Law. The law prohibits any health insurer, self-insured employer, electronic transmission intermediary or pharmacy from selling or otherwise using “prescriber-identifiable information for marketing or promoting a prescription drug” without the doctor’s consent.3 The law further prohibits pharmaceutical manufacturers and marketers from using “prescriber-identifiable information for marketing or promoting a prescription drug” without prior physician consent.4

The plaintiffs filed suit in federal district court claiming the law infringed their free speech rights in violation of the First Amendment. The district court upheld the law, and the plaintiffs appealed. The Second Circuit reversed, ruling that the statute unconstitutionally restricts the plaintiffs’ commercial speech. This ruling created a conflict with the First Circuit,5 which upheld nearly identical laws in Maine and New Hampshire. The Vermont Attorney General petitioned the Supreme Court to resolve the conflict, and on January 7, 2011, Vermont’s cert petition was granted.

Although this case arises in the context of medical privacy, its outcome could have a significant impact on the ability of other businesses, including online retailers or other service providers, to share customer or sensitive data with third parties for marketing or other commercial purposes.

Commercial speech enjoys less protection than other forms of speech. Presumably, the Court will first have to address the question of whether commercial transactions involving the aggregation and sale of personal or sensitive data constitute “commercial speech” under the First Amendment.

In the past, the Court has invalidated laws restricting commercial speech in similar contexts as either too broad (tobacco advertising directed to minors)6 or as achievable through less restrictive means (state rules restricting pharmaceutical price advertising).7 Given the integral medical privacy issues in this case, the Court can be expected to evaluate: 1) whether Vermont has a substantial interest in restricting the sale and use of the data, and 2) whether the law directly advances this interest without being more restrictive than necessary.

It is difficult to predict the outcome of this case, especially since the First Amendment and privacy issues are so entwined. The Court has previously been reluctant to make broad privacy pronouncements in other contexts. It remains to be seen whether the Court will treat this case as an opportunity to fashion a more unified approach to privacy protection, at least with respect to the commercial use of personal or sensitive data. State governments considering similar consumer protection legislation will want to articulate an underlying government interest that demonstrates a clear nexus to the harm to be protected against, and to show that the law is narrowly drawn to advance that interest. To the extent businesses are able to influence the regulatory environment, they will want to demonstrate that the commercial use of personal data can be consistent with consumer protection interests while protecting consumer privacy.


1 Sorrell v. IMS Health Inc., No. 10-779, 2010 WL 4723183 (2d Cir. Dec. 13, 2010).
2 Prescription Drug Cost Containment Law, Vt. Stat. Ann. tit. 18, § 4631 (2007-2008).
3 Id. § 4631 (d).
4 Id. § 4631 (d).
5 See IMS Health Inc. v. Ayotte, 550 F.3d 42 (1st Cir. 2008), cert. denied, 129 S. Ct. 2864 (2009) and IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010).
6 Lorillard Tobacco Co. v. Reilly, 533 U.S. 525 (2001).
7 Va. State Bd. of Pharmacy v. Va. Citizens Consumer Council, Inc., 425 U.S. 748 (1976).

Back to Top


Karen Neuman to participate in February 28, 2011 NATOA Webinar on Local Franchise Authority Use of Social Media & Mobile Apps for Distributing Government Content
When the Cable Communications Policy Act of 1984 was enacted, no one could have imagined, and the statute does not reference, new communications platforms and technologies such as social media and mobile apps. These platforms and technologies can be used by local governments to achieve many of the objectives embodied in the Cable Act’s PEG Access provisions and incorporated into local cable franchise agreements. Many local franchise authorities (LFAs) are considering augmenting PEG channels with emerging platforms for distributing government content. Karen will address some of the legal issues associated with local government use of these technologies, particularly mobile apps.

Back to Top


Copyright © 2010 St. Ledger-Roty & Olson, LLP.
1250 Connecticut Avenue, N.W., Suite 200, Washington, D.C. 20036