Andreas Weigend, Social Data Revolution | MS&E 237, Stanford University, Spring 2011 | Course Wiki

Class_11: Individual Privacy

Date: May 3, 2011
Audio: weigend_stanford2011.11_2011.05.3.mp3
Other:
Initial authors: [Louis Lecat, lecat@stanford.edu], [Léo Grimaldi, leogrim@stanford.edu], [Aldo Briano, abriano@stanford.edu]

Key Points

Brad Rubenstein (see also his personal web page) led a discussion on individual privacy. The key ideas for approaching privacy issues were:
  • Privacy and Trust: Who we trust, how it affects us and what can cause a loss or regain of trust
  • The Irrationality of Privacy: How we make privacy decisions, how irreversible they are and what we really have to hide
  • Privacy Tradeoffs: The benefits of sharing data, how we measure them, and the effects of social norms on our decisions
  • The Privacy Ecosystem: Normalization issues, the effect of choices on our community, advocacy and collective action

Privacy And Trust

1. Whom do you trust?

Trust is not an absolute or static feeling, nor something we typically sit down and think about. Some compare trust to a gut feeling, others to rational thought. Interestingly, we saw in class how differently our classmates thought about trust.
Situational
Do you trust doctors, lawyers, and accountants more than your neighbors? What is it about their situation that sets them apart?
One answer: they stand to lose their livelihood if they mishandle your data.
Unconditional
Is there such a thing as unconditional trust?
Why would you trust a company rather than another with your data?
Data privacy, and whether to trust the social networks where the data lives, has become a highly debated issue.

2. The indicators of trust, or distrust

  1. Predictability
  2. Value exchange
  3. Delayed reciprocity
  4. Exposed vulnerabilities
  5. Whom you are working with
  6. What data you are sharing

3. Losing and regaining trust - incidents and their effects. What works?


Every company suffers some type of privacy scandal, from Facebook changing its privacy policy at will, to Apple's iPhone location data being exposed, to Sony's PlayStation Network shutdown. Some companies try to hide the bad press, while others embrace the problem and try to make things better by acknowledging their customers' pain and offering apologies or consolation gifts. For the customer, the important questions are: Does your opinion of the company change? How many breaches does it take before a person stops using the system, or stops caring about the data? What actions can a company take to regain trust?

Every time privacy is breached, there are multiple possible outcomes and consequences. The first is data theft: credit card information can cost you money if your secret numbers are used by someone else. Another common example is identity theft, which can result from leaking sensitive information such as social security numbers. In the case of the Sony leak, the company announced that it would not ask for this information, in order to mitigate data theft. Other situations include taking the leaked data and correlating it with publicly available data from other social networks to make inferences (a toy sketch of this follows the links below). Here are two examples of public privacy issues:
Steve Jobs admits mistakes with data
Google sued over Android location tracking
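
To make the correlation scenario concrete, here is a minimal, purely hypothetical sketch: it joins an invented "leaked" record set against invented public profile data on a shared attribute (all names, fields, and values are made up for illustration).

```python
# Toy re-identification sketch: correlate a leaked dataset with public
# profiles via a shared attribute. All data here is invented.

leaked = [  # e.g., from a breached service
    {"email": "jdoe@example.com", "card_last4": "4242"},
    {"email": "asmith@example.com", "card_last4": "1337"},
]

public_profiles = [  # e.g., collected from a social network
    {"email": "jdoe@example.com", "name": "John Doe", "city": "Palo Alto"},
]

# Index the public data by the shared key (email), then join.
by_email = {p["email"]: p for p in public_profiles}
for record in leaked:
    profile = by_email.get(record["email"])
    if profile:
        # The merged record now ties a real identity to the leaked data.
        print(profile["name"], profile["city"], "- card ending", record["card_last4"])
```

The point is that neither dataset is very damaging on its own; the join is what creates the harm.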

4. Privacy

One of the readings in Dog Food 2 discussed research from Cambridge University showing that "many of the 45 social networks they studied buried their privacy statements in obscure corners of their sites". The concern is that alerting people to privacy as a potential issue might make them less inclined to share their personal information.
[Image: Diaspora logo]
Now, developers have started to create alternative systems that focus mainly on data privacy and give users full ownership of their data. For instance, Diaspora is a decentralized social network where each person hosts and owns their own data: there are no centralized servers like the ones you would find at Facebook, and no single company controls the users' data. This idea that "privacy concerns and risks should not lie in the hands of Facebook's will" came from four NYU students, who managed to raise $200K in a few weeks through Kickstarter. Yet the ultimate question remains: do people really care about privacy?

On the other hand, some companies, such as Google, try to define what transparency should mean for social data. They allow users to view, edit, delete, manage, and port any data the company holds about them. Rather than being obscure about their privacy policy and statements, they allow full access to all of your data at any time. Check out your own privacy tools here: Google Privacy Tools
[Image: Google privacy tools screenshot]


The Irrationality Of Privacy

1. How do we decide what matters? How do we make decisions for our own privacy?

The setup

When it comes to making privacy decisions, many factors come into play. One makes a fairly sophisticated mental calculation before reaching a decision: "What is the value in it for me? Am I violating social norms? What is the risk in the short term? In the long term?" The setup is a really important part of it as well. For instance, when Andreas asked in class whether we all agreed to share our Facebook data for the assignment, the fact that we were asked unexpectedly, and had to come up with an answer quickly, might have caused stress for some people. The social pressure of being part of the class community very likely played an important role in people's responses, all the more so because the class had to come up with one answer for everybody.

The elements of the decision

As we are making this decision to divulge something or not, what comes into that calculus? What are the real reasons why we are going to take the leap and share the data or not? Everyone has a different answer here, and some need bigger incentives than others to share the same data. Here are some key insights and comments from students in the class:
[Image: Gaping Void cartoon, "trust me / trust you"]
  • Making the decision of sharing Facebook data with the class was fairly easy: the incentive was simply to get school work done, and on the other hand students knew that the data would stay in the Stanford community, which was a strong enough guarantee.

  • Some students have rules they do not break, such as not sharing their friend's information without their permission. But how far does this hold? We might be able to come up with sufficiently big incentives for them to give up their rule.

The question that naturally comes up thereafter is how much are you relying on your friends to keep your information private? You might be protecting theirs while they just do not mind giving out yours.

2. The irreversibility of privacy choices - the persistence of data

First, the web is a public forum: once the data is out there, you cannot take it back. The class had a great discussion about the implications of data leaks. Even though some might argue that no one is willing to work with people who discriminate based on private data, a student made the interesting point that, when choosing between two similar profiles, not hiring the person who has 1,000 pictures of themselves partying within the past week is less a personal decision than a seemingly rational one.

[Image: "Facebook fired" cartoon]
Second, one interesting suggestion was that if data leaks are really small, one might still be able to protect one's information, but after a certain number of leaks, protecting the data no longer seems worth the effort. So how many leaks does it take to stop caring about protecting the information? Where is the threshold at which one stops taking action against these leaks?

Discussing, as an example, the implications of old personal information interfering with your career plans, some students came up with the hypothesis that at some point, having that kind of data public on the web will become the norm for most of us, and nobody will pay attention to it as we do now. As every politician and important businessperson will be in the same situation, people may simply stop attacking them based on private facts, and we might avoid the increasingly imminent "privacy war".

3. "What have you got to hide?"

[Image: eraser cartoon]
On Wired, a reader had the following reaction to the iPhone location-tracking issue:

"So, when your old lady checks the backup file and assumes that you routinely go to a titty bar instead of the pub with the guys (that's across the street), you have no problem with her jumping you with a frying pan when you walk in the door on your 'night out with the guys'. It's not always the facts that get you in trouble, but more the assumptions that can wreak havoc.

You don't have to be guilty to be assumed to be guilty. You have a right to privacy. Erode that one, and you might as well let all the others go too. There's a reason the GPS is turned off on my phone. There's a reason I clear my txt messages regularly. There's a reason I don't keep my contacts on my phone. It's my business and no one else's."

In a world where people can misinterpret everything, what should we choose to hide or share?

Is it fair that data we share is used against us? Can shared information be reliable? Some insurance companies rely on shared data online to profile their customers. Here is a tale of a woman who lost insurance coverage based on information on her Facebook profile.

In class, most reactions were based on the assumption that a data leak, or someone getting their hands on our information, is very unlikely. But do we really agree on the definition of "unlikely"? What is unlikely for one individual becomes very likely across an entire population following the same behaviors and living with the same standards and conventions. Hence the question becomes: do you want to live in a society where a small number of people have to bear the consequences of "unlikely problems" for the rest of us? Are you willing to take the risk of becoming one of them?
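
To make that intuition concrete, here is a back-of-the-envelope calculation (the per-person probability is invented for illustration): even if each individual's risk is tiny, the chance that somebody in a large population is affected approaches certainty.

```python
# "Unlikely for me" vs. "likely for somebody": an illustrative calculation.
p = 1e-5        # assumed chance a given person suffers a damaging leak this year
n = 1_000_000   # population sharing the same habits and exposure

p_nobody = (1 - p) ** n        # probability that no one at all is affected
p_someone = 1 - p_nobody       # probability that at least one person is
expected_victims = p * n       # expected number of people affected

print(f"P(at least one victim) = {p_someone:.4f}")         # ~1.0000
print(f"Expected victims       = {expected_victims:.0f}")  # 10
```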

At the end of the day, we are responsible for creating norms for our communities and have to shape the world we will be living in a few years from now. [This relates to the Black Swan Debate and how rational decision making becomes enormously difficult when handling low probabilities. More info here: Black Swan Theory]

Drawing on previous sessions and experience, it seems that what people really care about hiding are things that are not politically correct, which they treat very differently from illegal actions. For instance, they would be more inclined to share that they got a speeding ticket on I-280 than to display non-'PC' comments or pictures of themselves.

Eventually, the class came to the conclusion that hiding everything, although it seems to be the standard behavior today, might become obsolete in the near future. For instance, would you be willing to hire someone that you cannot find on Facebook or anywhere on the web?

Privacy Tradeoffs

As an entrepreneur, it is very useful to understand which privacy tradeoffs are acceptable to users, in order to maximize participation in your service.

1. What do you willingly trade for loss of privacy?


[Image: Facebook Connect graphic]

Facebook Connect enables you to use your Facebook account to sign in to third-party websites, software, or applications. Why provide your information to these services?
- It is convenient, as it saves you from setting up a new dedicated account.
- It allows these external services to integrate with Facebook and "socialize" your user experience.
- It may even be more secure, since you get one place that centralizes your privacy settings across the web.
But is it really a good thing, or does it simply help Facebook connect your "digital dots" and track your activity more efficiently?

Technical details can be found on the Facebook Developers page.
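
For readers curious about the mechanics, here is a minimal sketch of the OAuth 2.0 authorization-code flow that Facebook Connect is built on. The endpoints follow Facebook's documentation from around this time, but treat them and all placeholders (app ID, secret, redirect URI) as illustrative; check the Developers page for current details.

```python
# Sketch of the OAuth 2.0 "authorization code" flow behind Facebook Connect.
# Endpoints per Facebook's docs circa 2011; placeholders are hypothetical.
import urllib.parse
import urllib.request

APP_ID = "YOUR_APP_ID"
APP_SECRET = "YOUR_APP_SECRET"
REDIRECT_URI = "https://example.com/fb_callback"

# Step 1: send the user to Facebook to approve the requested permissions.
auth_url = "https://www.facebook.com/dialog/oauth?" + urllib.parse.urlencode({
    "client_id": APP_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "email",  # the data the third-party site is asking for
})
print("Redirect the user to:", auth_url)

# Step 2: Facebook redirects back with ?code=...; the site exchanges it
# server-side for an access token. Crucially, the user never types a
# Facebook password into the third-party site.
def exchange_code_for_token(code: str) -> str:
    token_url = "https://graph.facebook.com/oauth/access_token?" + urllib.parse.urlencode({
        "client_id": APP_ID,
        "client_secret": APP_SECRET,
        "redirect_uri": REDIRECT_URI,
        "code": code,
    })
    with urllib.request.urlopen(token_url) as resp:
        return resp.read().decode()  # contains the access token
```

The privacy-relevant detail is the "scope" parameter: it spells out exactly which data the user is trading for the convenience of a single log-in.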


[Image: screenshot of activating Facebook's text-messaging service]

Facebook has recently unified texts, e-mails, and former Facebook messages into a single service. It enables you to send free texts to your friends. Of course, this requires providing Facebook with your phone number.
In addition to the text-message service, Facebook lets you set up a two-tier log-in process that requires every new log-in to enter a security code sent via SMS to your cellphone. Again, this means you must give Facebook your phone number.
There is no such thing as a free lunch, is there?
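
For illustration, here is a minimal sketch of how such an SMS log-in code might be generated and verified server-side. This is not Facebook's actual implementation; the SMS delivery is stubbed out.

```python
# Minimal sketch of SMS-based two-step log-in (not Facebook's actual code).
import hmac
import secrets

def issue_login_code() -> str:
    """Generate a short one-time code to text to the user's phone."""
    return f"{secrets.randbelow(1_000_000):06d}"  # e.g. "042917"

def send_sms(phone_number: str, message: str) -> None:
    # Stub: a real service would call an SMS gateway here.
    print(f"SMS to {phone_number}: {message}")

def verify(submitted: str, expected: str) -> bool:
    # Constant-time comparison avoids leaking the code through timing.
    return hmac.compare_digest(submitted, expected)

code = issue_login_code()
send_sms("+1-555-0100", f"Your login code is {code}")
assert verify(code, code)
```

Note the trade: the security feature only works because the service holds your phone number.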

[Image: Gmail logo]
Who would have thought that users would agree to share all of their correspondence with Google and let the company run contextual ads based on it? Yet Gmail is so well designed that users accept the privacy trade-off.

Targeted ads turn out to be not so annoying and a small price to "pay" for the service.
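
As a toy illustration of what "contextual" means here (nothing like Gmail's real system, which is far more sophisticated), an ad picker might simply score candidate ads by keyword overlap with the message text:

```python
# Toy contextual-ad matcher: pick the ad whose keywords best overlap the
# message. Purely illustrative; the ads and keywords are invented.
ads = {
    "Cheap flights to Hawaii": {"flight", "hawaii", "vacation"},
    "Running shoes sale":      {"running", "marathon", "shoes"},
}

def pick_ad(email_text: str) -> str:
    words = set(email_text.lower().split())
    return max(ads, key=lambda name: len(ads[name] & words))

print(pick_ad("Training for the marathon, I need new shoes"))
# -> "Running shoes sale"
```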

[Image: privacy illustration]


"If users knew how the data were used, they would probably be more impressed than alarmed" (The Economist, //Data, Data Everywhere//)

As a consequence, companies would be well advised to emphasize their good use of our personal data, as doing so would often help defuse privacy concerns. Prof. Weigend has indeed previously described the key feature of a good incentive system: "People need to see that they profit from the outcome in some way if they are willing to put in the effort to contribute truthfully" (see HBR).

In the end, privacy paranoia may hinder online innovation... unless companies are smart enough to highlight the benefits to users. Users should be smart enough in return to accept the privacy trade-off; otherwise, they should "be ready to suffer the consequences", F. Manjoo warns, as we already depend on so many great services powered by our personal data!

2. How do you measure what you get?

It is not always easy to measure the exact benefits that come with a reduction of online privacy. A service such as Facebook Connect first provides a simplified and enriched (social) user experience on third-party services. But sometimes unexpected benefits outweigh these primary incentives. Indeed, Facebook Connect can positively impact the way people behave online, and thus underlines the importance of identity on the Internet.

[Image: Facebook and journalists illustration]
For instance, enabling people to connect on news websites or blogs using their Facebook profiles:

  • Drives content quality:
Under their "real" identity, people are less inclined to write hasty/stupid comments, all the more so if these appear in their friends' FB newsfeed.

  • Drives activity:
People are more inclined to comment as they know that their friends will be able to see their insights.

Facebook actually offers a new "distribution channel" for people's personal content from these third-party websites. Quantifying the number of your friends who will be able to see your witty comments on Slate or your clever answers on Quora thanks to Facebook Connect may be a good measure of what you get in the privacy tradeoff with the service.
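
A rough way to put a number on that distribution channel, under invented assumptions about feed visibility:

```python
# Back-of-the-envelope reach estimate for a Facebook Connect comment.
# The visibility rate is an invented assumption, not a Facebook figure.
friends = 400
p_feed_shows_story = 0.05  # assumed chance a given friend's feed surfaces it

expected_viewers = friends * p_feed_shows_story
print(f"Expected friends reached per comment: {expected_viewers:.0f}")  # 20
```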

The Privacy Ecosystem

When do advocacy and collective action matter?

There are long-standing organizations that exist solely to advocate for everybody's data protection and to push back against reductions in privacy at the scale of society:
  • Reputation.com is an Online Reputation Management System that helps you "Monitor the Internet / Protect your Personal Information / Define your Image / Defend your Reputation"
  • The Electronic Frontier Foundation "fights in the courts and Congress to extend your privacy rights into the digital world, and supports the development of privacy-protecting technologies"
  • The ACLU "monitors the interplay between cutting-edge technology and civil liberties, actively promoting responsible uses of technology that enhance privacy and freedom, while opposing those that undermine our freedom and move closer to a surveillance society"
  • Tor is intended to protect internet users' personal freedom, privacy, and ability to conduct confidential business by keeping their internet activities from being monitored. It is composed of client software and a network of servers that can hide information about users' locations and other factors that might identify them. Using the system makes it more difficult to trace internet traffic back to the user, including visits to websites, online posts, instant messages, and other forms of communication (a minimal usage sketch follows this list). Freenet is another implementation of the same idea.
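
As a usage sketch: a local Tor client listens as a SOCKS5 proxy on 127.0.0.1:9050 by default, and applications route their traffic through it. The Python example below assumes the third-party PySocks package is installed and Tor is running locally.

```python
# Route HTTP traffic through a locally running Tor client (SOCKS5 proxy
# on 127.0.0.1:9050 by default). Assumes: pip install pysocks, Tor running.
import socket
import urllib.request

import socks  # provided by the PySocks package

socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 9050)
socket.socket = socks.socksocket  # route new sockets through the proxy

# The Tor exit node, not your machine, now appears as the request's origin.
with urllib.request.urlopen("https://check.torproject.org/") as resp:
    print(resp.read(300))
```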

[Image: Mark Zuckerberg portrait (credit below)]
Facebook founder Mark Zuckerberg said that "The greater openness social networks bring to human interactions is probably the greatest transformative force in our generation, absent a major war." (See //A World of Connections// by The Economist.) We are probably facing a radical transformation of the concept of privacy. Social norms appear to be shifting: people are willing to share information about themselves more widely, all the more so if they can benefit from enhanced services in return. Is this really a long-term trend, or will privacy catch up?

Some think that the reduction in privacy is directly correlated with the advancement of technology, and that privacy is likely to lose the battle anyway. But we still need someone to read and scrutinize that 35-page iTunes user agreement before we accept it. Thus, though social norms are indeed changing, collectives can still help reduce the information asymmetry between companies and customers and make sure there is no unfair use of our personal data. (Picture by J. O'Dell, C. Tsevis)

This aggregator maintains a list of websites that store your password in plaintext or in some reversible form. This is usually revealed when a password-reset email echoes your password back to you, which would be impossible with a one-way hash (see the sketch below).
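
For contrast, here is a minimal sketch of the salted one-way scheme that the one-way hash refers to, using only Python's standard library (a production system should use a vetted password-hashing library). A site storing passwords this way cannot echo your password back, because it never keeps the password at all.

```python
# Salted one-way password storage: only the salt and derived hash are kept,
# so the original password can be checked but never recovered or emailed.
import hashlib
import hmac
import os

ITERATIONS = 100_000  # slows down brute-force guessing

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest  # store these; discard the password itself

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert check_password("hunter2", salt, digest)
assert not check_password("wrong-guess", salt, digest)
```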

A little more background information on search-engine privacy from Wired.com:
How to Foil Search Engine Snoops

Conclusion


[Image: privacy settings]