As mentioned before, the game is nothing more than a tool to explore issues related to privacy and the sharing of personal information. During our pilots we found that 30 minutes is about the right duration, as the first complex interactions based on memory of previous exchanges appear more prominently after the 20th minute.
We decided to split the group of participants into three sub-groups and to discuss privacy within each of them. We had two basic questions in mind:
- What privacy concerns did the game raise?
- How do these relate to real-life concerns?
Another group during the debrief
Each group had a slightly different take on the issues – as is normal. However, here is our attempt at unifying and categorising all the issues and concerns raised during the workshop:
Theft of data (overhearing, snooping, …) outrages people and undermines any future trust.
What is true? How can I validate it?
It is very difficult to verify whether information is correct unless a person already has access to trusted channels.
What happens if information is not outright false, but merely inaccurate? During the game (as in the early days of the Internet), people could falsify information or make up data.
Individual attitudes towards sharing information also play a role.
People with more 'private' attitudes felt pressure to share information in order not to be excluded from the game. This led to sub-optimal exchanges and, at times, to moments of embarrassment.
Appropriateness for purpose
“Does a shop need to know when I had my first kiss?”
The age of one's first kiss was one of the pieces of information to be traded during the game. While asking such a question was perceived as normal and expected during the game, participants were quick to point out that it would be very odd and inappropriate in other situations.
People felt uncomfortable when they could not connect the questions asked by others with the purpose of the game. More generally, the information a system requests and gathers should match the main purpose of the service and the idea people have of its mechanisms.
Each context has unwritten ‘etiquette’
Face-to-face trading of personal information with new acquaintances generated a bit of anxiety. In these cases, people couldn't rely on well-defined real-life 'rules', or on any other form of long-term commitment with their counterparts.
Someone was “not comfortable to ask a woman some specific personal information”.
Overlapping contexts generate fear of loss of control
Some participants considered the questions asked during the game 'safe enough': no politics, no job-related questions,… At the same time, other participants noticed the overlap between private and professional spheres. This is something – they mentioned – that often happens in real life.
The game created a different world, governed by its own rules. This fostered ‘game intimacy’, which is something different from ‘true intimacy’.
Understanding the flow of information and its consequences
Most people were not aware, the first time they gave their information to someone, that it would be passed on to others. What are the consequences of making something public? They realised it only if and when they had a tangible example: information about themselves that they saw in the hands of other participants. At that moment, it is far too late to try to make the information private again.
In the long term, relationships are established and reputation is formed. This happened during the game, after the first 20 minutes. In real life it's a slower process, but it is the foundation of any 'trade'.
Affiliation among people based on ‘loose’ commonalities
Building a team was one of the main strategies seen during the game: teams shared folders full of information about the other participants, and sometimes even money. This occasionally generated a proper 'network of trust'. These were far from 'solid' groups: participants based their affiliation on gut feelings ("if I like you, I agree to swap info; otherwise not") or secondary elements (e.g.: "I used the interestingness of responses" [SIC]).
Structural barriers to open sharing
Technological barriers slowed down data sharing, replication, and mining. This was true for the game (e.g. folders are physical objects, owned by one person). It is less and less true in a digital environment.
The game also imposed very rigid time barriers. In a digital context, information overload is one of the new barriers.
Unbalanced power (AKA: feeling powerless)
There have been cases in the social network space where rules changed without any visible notice; this generates frustration. While confined spaces generate 'loose' commonalities and 'game intimacies', they also generate tensions with the space keepers, who are perceived as authorities: "I felt uncomfortable leaving the information with the facilitators at the end of the game: what are you going to do with it?".
On the other hand, authorities can be useful: "I felt alone: there was no arbiter supervising the game". Everything was left to individuals' ethics, fairness, and moral codes.
Granularity of information
The ability to be more or less specific in answering questions made the game more interesting: some participants mentioned it as something the game allowed that – by contrast – is rarely allowed in digital interactions.
Real value of data
It is very difficult to identify the value of data and information, as it changes depending on who is going to get the data (it differs from person to person, and between a person and an organisation,…).
Also, the value of data changes according to the level of intimacy: in the game, the fact that the information wasn’t extremely intimate (e.g. age of first kiss), reduced its value.
Tell me and I will forget / make me experience it and I will understand
Exposure to the mechanics of how data can be exchanged (e.g. reading the rules) was not enough. People understood the 'real value' of their information only during the game, by playing and finding themselves in the middle of the action. During one of the pilots we ran, people discovered it only by overhearing others' conversations, when their own name was mentioned and information about them was traded by someone else.
Restrictions augment value (AKA: commoditisation of information)
During the course of the game, some participants perceived that data was losing value, as it was becoming so widespread that nobody else was interested anymore: “oh, I already know you – useless”.
My data is more valuable than yours
“Sharing my profile’s information was more sensitive than sharing information I had gathered about other people”: information about oneself had more value than information about someone else.
“Trading info about others was much faster”. At a certain point, information about others became almost a commodity.
Symmetry of exchange / what’s in it for me?
The perception that the exchange would benefit both parties was essential to open sharing of information: “Others wanted more from me than they were prepared to give”. In general, participants felt more comfortable when they had a clear understanding of why certain information was being asked for, and when there was a reciprocal commitment (e.g.: “I don’t want to be taken advantage of – even if I am profiting in some way”). Being explicit in the questions (and the motivations behind them) was appreciated: “more comfortable with direct questions, if it’s clear what the trade is”; “theft outraged me”.