Want data? Learn to share

Unless you’ve been living under a rock, you’ll know that privacy and the use of consumer data have been rearing up as a major issue in our increasingly digital world. Recently I wrote about how troubling I find it that marketers are treating control over data use as something that’s being taken away from us. (I argue it was never ours to begin with.)

I feel like we all know the drill. Consumer data use needs to be an exchange of cost (sharing data about yourself) and benefit (personalised content, more relevant advertising.) And while we can’t expect consumers to understand the technology as deeply as we do, it is important they understand it well enough to not be thoroughly creeped out. Why is targeted advertising so creepy? Because people genuinely believe there is some central repository of data that knows everything about them. (After all, how else could the same ad follow them around so many websites?!) So yes, education is absolutely critical.

However, if we really want consumers to understand the benefit of data use, we need to start providing data to help our customers make better decisions – not just use it for our own business.

If companies can 1) provide data to benefit customers, 2) help customers understand that this is powered by data, and how their data contributes, and 3) be transparent about how they’re using it (transparency = less scary!), we can move towards a more educated discussion of privacy and data use.

So what are some examples of using data to benefit customers?

  • Recommendations: The Amazon-esque “customers who looked at X also looked at Y” is a classic use of data to benefit shoppers.
  • Valuation and forecasts: For example, automotive valuation companies like Kelley Blue Book or home buying sites like Zillow use historical data and predictive models to better inform consumers in the buying process.
  • Credit scores: Companies like Credit Karma use customers’ own information to provide credit scores and help customers make better financial decisions.
  • Editor and user ratings: Companies like CNet inform customers via their editorial reviews, and a wealth of sites like Amazon and Newegg provide user ratings to help inform buying decisions.
  • Price alerts: Alerting customers to a shift in a data point (price) can help customers find the best deal – whether it be on merchandise, travel or more.

This list is by no means exhaustive, and I don’t think we’ve truly begun to explore how this explosion of data can benefit consumers and businesses alike. The problem with the current privacy debate is that there’s everything in it for businesses, and not nearly enough benefit for the consumer.

What can your company do to share the benefit of your data with your customers?

Digital Privacy: Benefits, Education and Building the Road to Trust

Last week, I attended the Direct Marketing Association conference in Las Vegas, and one of the things that struck me was the attitude of marketers to privacy. At one session, an impassioned speech was given by the president of the DMA to “rise up” against the “privacy zealots”, and that failure to do so would be the end of the relevant internet.

In short, marketers seem to perceive restrictions in data use as something being taken away from us.

So I have to wonder: Do we even have a right to react that way? Was carte blanche access to customers’ data ever really ours? Or were we just lucky to skate by using it so far?

Who owns the data?

Organisations capture and store customer data. However, the data represents the actions of consumers. So, who “owns” it?

As an analogy, consider medical information. A doctor’s office or hospital captures and stores this information. But at any time, a patient has the right to request it, and has control over whether one doctor can share it with another. If digital consumer data is treated in this way, it seems that a consumer may have similar control.

Education and Trust

The exchange of data for benefit is one that requires education and trust. And herein lies the issue. To date, we (digital marketers, analysts) have not done a good job of either.

By using data in the ways we have, without having educated along the way, we created this environment. Until the privacy uproar began, there was little or nothing done by marketers to put limits in place. We had to respond to raps on the knuckles before taking (what appear to be) begrudging actions. No wonder consumers don’t trust us!

We have failed to educate, and as the developers of new and ill-understood technologies, education was our responsibility. After all, when air travel first became possible, customers had to weigh a benefit (getting there quickly) versus the risk (being harmed.) For airlines to succeed, they had to educate consumers about both the benefits and risks of air travel. Of course, I’m not arguing this was done out of the goodness of the travel industry’s hearts – air travel is a very clear “opt in” situation, and success required it.

But had we educated – what a difference it might have made already. Think about what consumers find “creepy”. Going to a grocery store website, ordering groceries online, then having the store recommend recipes to use these ingredients? Perfectly okay. Being followed on the internet by advertising related to products you’ve previously browsed? Creepy. Why? Because consumers don’t understand why they are seeing these ads. They understand how their actions led to recipe recommendations. Re-targeting, on the other hand, makes them feel as though they are being watched on every site they visit, and that this information is somehow stored in some central “spy” repository.

Too complex to educate?

Yes, there is some complexity in how digital marketing technologies work. However, education is not impossible, and complexity is no excuse.

Let’s consider another medical example. Doctors need to seek informed consent prior to patients undergoing medical treatments. Medical procedures are often far more complex than digital tracking, and the consequences of making the wrong choice can be severe. (We are literally talking life and death!) If doctors can educate patients regarding complex medical procedures, we can certainly educate regarding digital marketing and tracking.

Balancing benefits

Education is, of course, critical. The consumer must understand the two sides of the equation: the concerns they have with digital tracking and targeting, and the benefits they receive from it.

But marketers must ensure there are two sides of the equation. Every use we make of data needs to have some benefit for the consumer, or all the education in the world won’t help us.

Fighting for the consumer

Marketers are fighting for their right to market.

Privacy advocates are fighting for tougher restrictions.

What we need is to fight for the consumer. To empower them with education, so they can balance the risks and benefits, and reach their own decisions. Only by providing benefits, unbiased education and allowing consumers to reach a decision (then truly respecting that decision) can we build the trust we’ll need to move forward.

[Credit: Thanks to Bryan Pearson for the great food for thought at DMA2012. You can read more about consumer perception of targeting and tracking in LoyaltyOne’s 2012 Privacy Research: http://loyalty.com/knowledge/articles/2012-privacy-research]

KeystoneATX: Privacy and Personalisation and Data, Oh My!

I had the pleasure of participating in Keystone Solution’s first-ever Speaker Series in Austin, TX this week, and wanted to share a few takeaways from the day.

The focus of the event was on privacy – or, more specifically, how we can create relevant experiences for users without stalking them. It was a great opportunity to hear different viewpoints and reconsider my own, and I came away with a number of valuable points:

1. Education of users is critical, to allow them to make informed decisions about whether the benefits they’re receiving are sufficient for them to consent to tracking. It is important that decisions not be made out of misinformation and fear.

2. But the crux of users’ issues with privacy tends to center around control. Users are made to feel they are being “held ransom” (“We have your data!”) and the uproar that takes place normally follows removing control from users. (For example, Facebook changes that make information (retroactively) public by default, or persistent cookies that override user choice to delete.)

3. Not too long ago, there was fear about buying online. While privacy concerns may never disappear, over time (and with education and control options) people may start to get more comfortable with tracking, just as they did with online shopping.

4. No one had the “magic bullet” for easing user privacy concerns, but there were really two threads emerging – 1) Concern about government regulation, due to the nuance of the issues. If legislators are not educated enough to distinguish between types of tracking, types of data, and the distinction between collection and use, blanket measures are likely to result. 2) Industry self-regulation will require a widespread commitment, as one company, vendor or individual can create a great deal of negative publicity. (And likely then lead to regulation.)

5. Tracking and personalisation get creepy when you’re doing it for the brand’s benefit, not the consumer’s. The best brands use data to make you feel special. However, too many companies don’t trust that doing things the right way will pay off, and try to take shortcuts or rely on the “magic bullet”.

6. Those collecting data and members of the analytics industry need to act as stewards of the data (and signing the WAA Code of Ethics is a great start!)

7. Most seemed to agree that while the issue has become tracking, the focus should be on data use rather than collection. (You don’t arrest someone for murder because they have a gun and motive… they have to take action.)

8. Similarities to offline were discussed (for example, grocery store loyalty/discount cards, cable box and ISP tracking.) But the question is, are users aware of all of this tracking? The internet is not necessarily causing privacy concerns, but online tracking is simply more public. In a way, online tracking might even be raising awareness of the offline tracking that takes place.

9. There is often an argument of, “If you don’t like tracking, don’t visit the website.” However, is this reasonable? Is it truly consent if you don’t really have a choice? For example, you either submit to TSA procedures or never fly, which may even be required for a person’s job. Similarly, reading Terms and Conditions of software gives you only the choice to not use it – you have no option to negotiate with the company about those terms.

10. There is a gap between user privacy concerns and the amount of social sharing that users are doing. The responses to Facebook changes really show the kind of ownership that people feel over their Facebook experience, and that users forget that ultimately, they’re playing in Zuckerberg’s sandbox, not their own. Users will need to keep in mind that no matter the network, if it’s on the internet, it could be public, and exercise caution in sharing.

11. In the end, users want more control over their privacy, but the appropriate mechanism for that has not been figured out. Users may be willing to share data with some sites and not others, and provide personally identifiable information to some companies and not others. (For example, maybe I trust Zappos but not ShoezWarehouze.) But how do we allow control of these levels of privacy without creating a complicated, unusable system of permissions?

In the end, no one had the solution, but that’s not really the point. This will be an ever-evolving process, and it’s important we talk through the issues, to ensure we don’t oversimplify in our desire to solve for the benefit of all.

And just for laughs, a few quotes of the day:

  • “Privacy is a mean ass topic” – @EvanLaPointe
  • Emer is scared of her “bottom” being invaded by the US government. (Re: TSA measures)  – @Exxx
  • “I wouldn’t mind the government being involved if I didn’t think they’d fail miserably!” – @Jenn_Kunz
  • “Common sense is not terribly common” – @jdaysy
  • “If only ‘good tracking’ and ‘bad tracking’ were as easily understood as ‘good touching’ and ‘bad touching’”  – @aprilewilson
  • “We have an unsexy community. Well, everyone here is sexy, but you know what I mean…” – @keithburtis
  • “At @KeystoneSocial, we make sushi unhealthy.” – @mgellis
  • The @KeystoneSocial motto: “Tread softly and carry a Kyle.” – @mgellis

 

Whose responsibility is online privacy?

Kissmetrics and a variety of its clients have been center stage in the news lately for tracking unique visitor behaviour despite users clearing their cookies. Shortly after the story broke, a number of high profile clients removed Kissmetrics tracking, arguably “throwing them under the bus” in the process. Now, Kissmetrics and more than twenty of its customers are facing a class action lawsuit claiming the tracking violates privacy laws. Notably, there was a similar lawsuit in 2009 over the use of “zombie cookies”, with some of the same businesses named as defendants.

This got me thinking, and into a rather lengthy debate/rant/conversation with fellow industry member Lee Isensee, which helped to shape (and refine somewhat!) a few thoughts around the responsibilities of the organisation tracking vs. the vendor providing tracking capabilities. While I find myself defensive of vendors and organisations that are being respectful of customers’ privacy, in line with the WAA Code of Ethics, the real question is:

Whose responsibility is it to protect consumer privacy – the business using the tracking, or the vendor providing a solution or product?

I can’t help but think – if you, as a company:

  • Choose a method of tracking that (many argue) violates users’ privacy and wishes
  • Don’t disclose the level of detail being collected, or how it will be used
  • Face legal action as a result of that tracking, and settle by agreeing not to use that technology again
  • Later, face accusations of similar tracking (similarly intentioned, though the mechanics perhaps differ)
  • But sever ties with the vendor, essentially blaming them, while claiming your company takes user privacy seriously

What conclusion is there to draw from that? Does it suggest that you, as a business, want to do that kind of tracking, and seek out vendors who provide those capabilities? (It’s a little hard to argue the “but we didn’t know” defense if you’ve faced legal action for this type of thing before.)

If that’s the case (and I understand this is a little difficult in the current climate) why not stand by this kind of tracking, disclose the approach and method, and explain the consumer benefits of it? Why claim to be privacy conscious and blame the vendor when your company faces a major privacy backlash? You previously chose to engage in this kind of tracking (and faced the repercussions!) – what leads you to do so again?

So if a business is inclined to this kind of tracking, what is the responsibility of the vendor providing it? Do they own a customer’s implementation (post initial engagement) or chosen use of the data? Do they owe a duty to the customers of their clients? What legal duty do they owe? Do they owe a duty to allow opt-out? Or is that in the hands of the company doing the tracking? What ethical duty do we impose? (And how far does that go? To the vendors that support the vendor? Ah, forget it. I’m hearing an Adam Carolla “slippery slope” rant starting as it is.)

I’d argue there’s one level of responsibility, and it falls squarely to the company itself. A business decides what kind of tracking to do, and which vendors to use. They owe a duty to their customers. If a vendor is found to use “unsavoury” practices, to actively recommend those practices in collusion with the business, and to disregard industry-accepted practices, isn’t it the responsibility of the business to have thoroughly evaluated that vendor?

Something along the lines of: we don’t sue gun companies for homicides. The analytics vendor sells the gun, the implementation is the bullet, the business is the person holding the gun … who ultimately made the choice to shoot the customer?¹

Am I way off base? Where do you think this responsibility lies?

 

¹  I can’t take the credit for all of this. Thanks Lee for boiling it down to a simple analogy.

California Privacy Bill: Commentary and Summary

Consumer privacy has been a hot topic for the past few months. Late last year, the Federal Trade Commission issued their report endorsing a “Do Not Track” option. The Commerce Department weighed in on privacy, leaning towards industry self-regulation. With the buzz privacy was receiving in the media, several browsers soon released updated privacy options. However, without any action by tracking companies to read these privacy settings and apply them in determining whether an individual’s behavior is tracked, these browser updates are essentially futile.

In April, the Commercial Privacy Bill of Rights Act of 2011, proposed by Senators Kerry and McCain, came out recommending:

  • Clear data collection notice, including the purpose of collection and the ability to opt-out.
  • Individuals to be able to access their information, and request cessation of its use and distribution.
  • An opt-out for all data, with an opt-in for sensitive personally identifiable information.
  • Companies to collect only the data they need to provide a transaction or service.
  • Companies bind third parties they share data with as to what they can do with the data.
  • No private right of action.
  • Voluntary “safe harbor” program that would allow companies to exempt themselves from the requirements if they had procedures in place that were just as good.

Now, the California Senate Judiciary has cleared proposed Privacy Bill SB 761. The Bill calls for the Attorney General to adopt new regulations for consumer opt-out of tracking.

Commentary

While the intention to protect consumer privacy is to be commended, there are some concerns with the proposed Bill.

For a summary of the Bill, please read below under The Bill: In Summary.

State laws in an online world

Global businesses already face a minefield when it comes to international laws. For example, European privacy laws are stricter – Google Analytics has even faced prohibition in Germany.

The reality is, online business does not have a simple physical presence, where state boundaries are easily applied. The draft Bill applies to entities doing business in California with a consumer in California. This appears general enough that it is not limited to simply companies located in California. Has a situation therefore been created where companies across the United States (and arguably, the world) must comply with these proposed regulations for California users?

Recognizing California Users’ Opt-Out

How are companies to detect what state a user is located in, to ensure compliance? Determining that a user is a California consumer, and must therefore be provided with a recognized opt-out, would require capturing (and responding to, in real time) an IP or physical address – the very data collection that users could be trying to opt out of.
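To illustrate the circularity, here is a minimal sketch in Python. The IP-to-state table is a purely hypothetical stand-in for a real geolocation service (which would map IP ranges, not exact addresses, and imperfectly at that); the point is that answering “does this user get California protections?” requires collecting the IP address in the first place.

```python
# Hypothetical sketch: deciding whether California opt-out rules apply
# to a request. The lookup table below is an illustrative stand-in for
# a real geolocation service; the addresses are TEST-NET examples.
IP_TO_STATE = {
    "203.0.113.7": "CA",
    "198.51.100.2": "TX",
}

def must_honor_california_opt_out(ip_address: str) -> bool:
    """Return True if the visitor appears to be in California.

    Note the catch: to answer the question at all, we have to look at
    (and therefore collect) the visitor's IP address - the very data
    the user may be trying to opt out of sharing.
    """
    state = IP_TO_STATE.get(ip_address)  # the act of collection itself
    return state == "CA"
```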

First Party vs. Third Party Data

The Bill does not clearly distinguish between third party tracking done by companies such as ad networks, and first party web analytics for the purposes of site optimization and business analysis.

In one way, the Bill can be read as prohibiting web analytics data capture: the type of data covered includes “Internet Web Sites and content from Internet Web Sites accessed”, including time and date of access – the foundation of first party web data capture.

On the flipside, there are exceptions that can be interpreted as permitting web analytics.

For example, an exception to regulation is granted if analysis of that consumer data is not the company’s primary business. Does this allow for web analytics? Company X sells mobile phones, but analyzes their site visitors. That analysis is not their business, phones are. Is web analysis permitted?

There are also exceptions made for “commonly accepted business practices.” Among them is using data to improve products, services or operations. Arguably, first party web analysis for the purposes of site optimization falls into this. Is it therefore permitted?

An exception is also granted for businesses not collecting or storing sensitive information (defined as medical, health, race or ethnicity, religious, sexual, biometric or social security information.) Does this therefore mean that first party web analytics is permitted for companies not storing any of this type of data? What if a company collects this data but it is completely separate from their online data?

One hopes a revised Bill, or the regulations themselves, would speak more clearly to first versus third party data capture. The current draft would likely lead to litigation to resolve the issue of first party web and business analytics.

Add your commentary

I’d love to hear your thoughts – good, bad or ugly. Please leave a comment!

The Bill: In Summary

California law already requires companies to notify Californian users of their privacy policy. However, the proposed SB 761 requires the Attorney General to adopt new regulations to allow for consumer opt-out of tracking.

Who would these regulations apply to?

  • Any person or entity doing business in California that uses online data collected from a consumer in the state.

What would these regulations require?

  • A company to provide California consumers with a method to opt out of data collection, storage and use.
  • Disclosure of information on data practices and on the parties to whom this data may be disclosed.
  • Prohibition of data collection or use if a consumer has opted out.

In addition, the Attorney General may also require that covered entities provide:

  • A way for consumers to access their information.
  • Data retention and security policies, presented in an easy-to-understand way.

What type of data is covered?

  • Web sites and what content is accessed
  • Time and date of access
  • Geolocation
  • Method of access (for example, device or browser being used)
  • IP address
  • Personally Identifiable Information:
    • Name
    • Address
    • Email address or username
    • Phone or fax number
    • Government ID number (e.g. passport number, driver’s license)
    • Financial account numbers and the security codes used to grant access to these accounts
    • Sensitive information:
      • Information related to medical history and health
      • Race or ethnicity
      • Religious beliefs and affiliation
      • Sexual orientation or behavior
      • Financial information such as income and assets.
      • Biometric information such as fingerprints
      • Social security information.
      • Does not include:
        • Business information such as business email address or business phone number.

Exemptions from regulation:

  • Regulations should not interfere with a commercial relationship where the consumer has expressly opted in for those purposes. However, the Bill specifically states that if that business is online advertising and marketing, the regulations may affect that relationship.
  • Federal, State or Local government
  • Smaller businesses collecting information from fewer than 15,000 individuals total, or 10,000 in a 12 month period.
  • Business not collecting or storing sensitive information.
  • Companies not using the information to study, monitor or analyze the behavior of individuals as their primary business.
  • May be an exemption for commonly accepted business practices:
    • Customer service and support.
    • Using data to improve products, services or operations
    • Basic business functions such as accounting, inventory, QA.
    • Defending rights or property.
    • Complying with law, such as court order, subpoena (etc.)

Consequences of non-compliance:

  • Individuals are able to take civil action for damages between $100 and $1000 per breach, plus punitive damages the court may allow.

To read the full bill, please visit: http://info.sen.ca.gov/cgi-bin/postquery?bill_number=sb_761&sess=CUR&house=B&site=sen

To contribute your thoughts, please comment!

What Your Company Needs to Know About Potential Online Privacy Regulation

[Part 1 originally published in Colorado Biz on 01.18.11.
Blog post updated on 03.06.11 to include Part 2 of the article.]

The internet is a wonderfully measurable place. Businesses are able to use online data to drive strategy and measure return on investment. However, the wealth of data that makes the online world a prime space for analysis is also leading to consumer concerns over privacy. Facebook privacy controls, data capture, ad targeting and mobile applications have all been the subject of privacy discussions in the media.

A recent survey by USA Today and Gallup suggests that only 35% of respondents believe that “the invasion of privacy involved [in behaviorally targeted online ads] is worth it to allow free access to websites”, with younger respondents (40%) being more willing to accept this than older respondents (31%). However, while only 14% would allow all companies to target ads to them, another 47% would be willing to allow the advertisers they choose to target ads.

With so many concerns out there about data capture and privacy, what is a company to do to ensure their behavior and data practices are not called into question – or made front-page news?

What kind of data does your company use?

First, your company needs to differentiate between types of data capture, understand what you are leveraging, and the current climate around different types of data use.

Understanding the current landscape

Recently, the Federal Trade Commission released their draft report on Consumer Privacy. The FTC’s report distinguished first and third party data capture, with different views as to what consent and regulation should be required for each.

First party data includes web analysis done through tools such as Google Analytics, Webtrends and Adobe Omniture, for the purpose of improving consumers’ online experience and a company’s profitability online. First party data use also includes first-party marketing: a company recommending products or services based on a consumer’s previous purchases. The FTC recommended that this type of data capture not require specific consent, as these are considered commonly accepted business practices.

Third party data capture, however, is considered separately. This includes companies that deal in the buying and selling of information – for example, ad networks that trade data to deliver highly-targeted advertising. The FTC’s main concern regarding third party tracking is not banning the practice, but rather allowing for informed consumer choice. While the FTC declined to declare opt-in or opt-out as the appropriate method for expressing consumer choice, it did call for a Do Not Track mechanism, enforced through either legislation or industry self-regulation.
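To make the idea concrete, here is a minimal server-side sketch of honoring a Do Not Track signal. It assumes the `DNT: 1` HTTP header that browsers later adopted for this purpose; this is an illustration of the mechanism, not anything specified in the FTC report itself.

```python
# Illustrative sketch of honoring a Do Not Track signal server-side.
# Assumes the browser sends the "DNT: 1" HTTP header; in practice,
# enforcement would live inside the ad-serving / analytics pipeline.

def tracking_allowed(headers: dict) -> bool:
    """Return False if the request carries a Do Not Track signal."""
    # Header names are case-insensitive in HTTP; normalise before lookup.
    normalised = {name.lower(): value.strip() for name, value in headers.items()}
    return normalised.get("dnt") != "1"
```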

Legislative vs. Self-Regulatory approaches

The FTC’s recommendations open the door for potential legislation of online privacy and data capture. However, the Commerce Department has recently recommended self-regulation.

The Commerce Department disfavored prescriptive rules, noting the need for an approach that allows for rapid evolution and innovation, while enhancing trust. The Department called for voluntary but enforceable codes of conduct that promote transparency in data collection and use, and recommended enlisting the “expertise and knowledge of the private sector” in this regard.

The web analytics industry in fact recommended this very thing back in September 2010. A voluntary code of ethics for web analytics practitioners was proposed and drafted by Eric T. Peterson and John Lovett of Web Analytics Demystified, in conjunction with the Web Analytics Association, and a second initiative has begun regarding consumer education.

New medium, same challenges

While online data and privacy may seem new and uncharted territory, this is simply a new medium for similar challenges faced off-line. For example, consumer acceptance of tracking and targeted advertising in exchange for free online content is not too different to accepting grocery store data capture via loyalty cards in exchange for discounts. The difference is that online data capture is newer, without a well-established procedure for privacy safeguards, and with a lack of education about what the benefits or exchanges for tracking may be.

What is required in the industry is two-fold:

  1. Finding the appropriate way (not necessarily legislatively) to establish and regulate those safeguards online; and
  2. Educating consumers about the types of data capture and use, and potential benefits, to allow for informed consent.

How can a company protect itself?

So how can a company protect itself, in light of the current uncertainty around online privacy?

Safeguard consumer privacy as if you were already legislated to. This has two benefits:

  1. If enough companies voluntarily safeguard consumer privacy, legislation may not be needed, leaving flexibility for companies to find the right way to protect privacy within their own business model; and
  2. If legislation does occur in the future, your company should not require major changes to your privacy model to be in line with the requirements.

Follow FTC recommendations and integrate privacy considerations into your company’s daily business practice, by:

  1. Taking reasonable precautions to protect the safety and ensure the accuracy of data;
  2. Only collecting data required for a specific, legitimate business need (rather than capturing data “in case” it can later be monetized); and
  3. Ensuring your data retention periods are reasonable.

For first party data capture and marketing, ensure that you have a plain-language, non-legalese privacy policy that allows consumers to understand what data you’re capturing and how you use it, and that clearly distinguishes your first-party data use from third-party data use.

For any third-party data capture, make your practices transparent (this means not burying information behind legal jargon!) and educate your consumers. Advise what data is being captured, the benefits to the consumer, and provide an easy way to opt out. (A hint: if the only benefits are for your company, and not the consumer, you should expect a high opt-out rate.)
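As a sketch of what honoring such an opt-out might look like in practice: the cookie name `optout` and value `1` below are assumptions for illustration only – a real implementation would follow whatever mechanism your opt-out page actually sets.

```python
# Hypothetical sketch: gate third-party tag firing on an opt-out cookie.
# The cookie name "optout" and value "1" are illustrative assumptions,
# not a standard; substitute whatever your opt-out mechanism records.

def should_fire_third_party_tags(cookies: dict) -> bool:
    """Only fire third-party tracking if the user has not opted out."""
    return cookies.get("optout") != "1"
```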

Additional considerations

For companies in business overseas, keep in mind that privacy laws may differ between countries. For example, Europe’s privacy laws are already stricter than those of the United States, and will potentially receive a further overhaul in 2011 to modernize the 1995 Data Protection Directive.

Prepare for the future

Online privacy is not likely to quiet down in the coming months. However, by being proactive and considering consumer privacy in your daily and long-term business strategy, your company can set itself up on the right side of any proposed legislation or self-regulation.