What Wearables Mean for Privacy

The world is currently experiencing a surge in wearable technology. Every week a new press release hits the front pages announcing some major company’s new line of sleek wearables that automatically interface with everything else you own. At the same time, app ecosystems are springing up, with everyone desperate to build the killer application that adds value to our lives. Those of us who were researching wearable devices long before the hype know that the really interesting aspect of wearables is not only what they can do for us, but how well situated they are to help our technological environments understand us better.

However, the current app ecosystem for mobile devices dictates that “he who builds the software owns the data.” We can either use the software and accept the fine print, or not use it at all. The crux: the people who make our technology can now understand us better than we understand ourselves, in a fashion that goes beyond our own powers of self-reflection. And consumers have no way of knowing how much information they are really giving away, and currently no way of controlling it.

“Wearable devices are perfectly positioned to observe our behavior, actions, interactions, and even character and emotional state”

Alex (Sandy) Pentland, a researcher at MIT, is one of the most highly cited computer scientists of all time. I’ve cited him myself, often. His work focuses mostly on using technology, especially wearables, to understand the complex facets of human social interaction. A recent article in The Verge names him the “Godfather of Wearables,” a title he certainly deserves. The field is a hybrid of psychology, sociology, mathematics, physics, and computer science, which researchers have now begun to call “Computational Social Science.”

“Algorithms, computers and sensors can give us insight into aspects of social interaction, and into ourselves, that are beyond the abilities of our own perception”

Pentland’s research is so interesting because, for the first time, he demonstrated that algorithms, computers, and sensors can give us insight into aspects of social interaction, and into ourselves, that are beyond the abilities of our own perception. In essence, these devices can understand us better than we understand ourselves! The Verge article mentions his “Sociometer,” which extracts cues from human speech patterns and behavior to predict the outcome of a social interaction before the individuals involved know it themselves, for example whether a salary negotiation will succeed. I remember a great keynote talk of his in which he demonstrated the power of these analytics to evaluate the effectiveness of a company based only on inter-department, person-to-person interaction. Even more interesting, the work resulted in Pentland’s book “Honest Signals,” in which the insight machines gain about humans is given back to the humans. And, even more spectacular, some of it is actionable! Reading that book helped me improve my presentation style, for example, by showing me how I can appear excited about the content and sure of myself at the same time.

But Pentland is not the only one working in this direction. Scientists all over the world are working to make sense of the tremendous amount of data generated by mobile and wearable devices, and the results are astonishing. For example, monitoring how you interact with your phone and which apps you use allows your personality type to be inferred. A smartwatch with a sensor on its back can recognize your stress level at any given time. Tracking eye movements can tell us not only what you are looking at, but what kind of information you are consuming and how you are consuming it. An app on your phone, watch, hearing aid, or other wearable device can tell someone whether you would be a good future employee based only on your physical, non-verbal behavior. Wearable devices can even be used to detect depression, which could be a very useful diagnostic tool. If you would like to stay abreast of what is happening at this level, keep an eye on the annual International Symposium on Wearable Computers (ISWC), coming up this year in Seattle in September.
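To make the stress example above concrete, here is a deliberately toy sketch of how a handful of raw sensor readings can be collapsed into a guess about an internal state. The feature names, baselines, and weights are all invented for illustration; a real system would learn them from labeled data, but the shape of the computation is this simple.

```python
# Toy illustration (NOT a real model): how little raw wearable data it
# takes to estimate an internal state. All baselines and weights below
# are invented assumptions standing in for a trained model.

def stress_score(heart_rate_bpm, skin_conductance_us, motion_variance):
    """Combine raw sensor readings into a crude 0..1 'stress' estimate."""
    # Normalize each signal against an assumed resting baseline,
    # clamping to the range 0..1.
    hr = max(0.0, min(1.0, (heart_rate_bpm - 60) / 60))       # 60-120 bpm
    sc = max(0.0, min(1.0, (skin_conductance_us - 2) / 10))   # 2-12 microsiemens
    mv = max(0.0, min(1.0, motion_variance / 5))              # arbitrary units
    # Weighted sum: the weights are placeholders, not fitted values.
    return 0.5 * hr + 0.35 * sc + 0.15 * mv

print(stress_score(110, 9.0, 1.0))  # elevated readings -> higher score
print(stress_score(65, 2.5, 0.2))   # resting readings -> lower score
```

The unsettling point is not the arithmetic, which is trivial, but that the inputs come from sensors most wearers never think twice about sharing.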

“it is almost impossible to know what an app knows about you based on the permissions it requests”

So what does this all mean?  The first thing to note is that any device you wear or carry with you can tell someone else amazing and incredibly insightful things about you.  The second is that, as a layperson, it is nearly impossible for you to tell what can be gleaned about your psyche from any basic type of raw data (motion sensing, skin resistance, location, app usage, etc.): it is almost impossible to know what an app knows about you based on the permissions it requests. For the scientists working on this technology, it is both incredibly exciting and unbelievably scary.  We create the technology that makes all of this possible, but we are not the ones who decide how it will be used.  Used correctly, these devices and systems have the power to make our lives more enjoyable and improve our experience of the world.  Used incorrectly, they make us completely naked in the eyes of an observer we may not be aware of, know personally, or trust in any fashion.

“Having our internal psychological workings exposed, means we can be manipulated by everybody”

And this is not physical nakedness.  However embarrassing or uncomfortable that might be for some people, this kind of naked would be far worse: our psyche would be naked. As the research demonstrates, nobody needs access to your brain waves to figure out what you are thinking, because your body displays this information in a humanly imperceptible but technically measurable way, like a lie detector for “honest signals.”  As children, we learn to pick up on these signals and understand what is going on in the minds of the people around us. We also learn to use that knowledge to manipulate people (every child knows to put their parents in a good mood before asking for something big or risky).  But having our internal psychological workings exposed means we can be manipulated by everybody.  In a previous article I talked about how personal information is acquired and resold by data brokers on the personal information market. Imagine if telemarketers could pay to be alerted by your wearable devices whenever you slip into a minor depression, and then use that to their advantage.  Do we really want that kind of information on the market?

From a privacy perspective this sounds very alarmist and menacing.  But the technology itself is neither intrinsically good nor bad; it only becomes bad when it is not combined with a conscious decision-making process.  I am personally thrilled that wearables are finally finding the traction they deserve, and I use them myself.  But it should be clear to consumers that a danger exists, and that they are probably not capable of recognizing it. The potential to create a world that very few would want to live in is very real, and for those in the know it is also very possible with the technology we have now. The solution lies in controlling the flow of data and deciding where the power of decision making lies.  Pentland himself advocates this approach: users decide what data goes where, instead of the all-or-nothing, “agree vs. decline” model we currently have in the mobile app ecosystem.
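The difference between the two models above can be sketched in a few lines. This is a minimal, hypothetical illustration of per-datum, per-recipient consent, the kind of fine-grained control Pentland argues for; the data types, recipient names, and policy shape are all assumptions made up for the example, not any real platform’s API.

```python
# Sketch of fine-grained consent: the user holds a policy mapping each
# data type to the recipients allowed to receive it, instead of one
# all-or-nothing "agree" button. All names here are illustrative.

user_policy = {
    "heart_rate": {"fitness_app": True,  "ad_network": False},
    "location":   {"fitness_app": True,  "ad_network": False},
    "app_usage":  {"fitness_app": False, "ad_network": False},
}

def may_share(policy, data_type, recipient):
    """Deny by default: share only what the user explicitly allowed."""
    return policy.get(data_type, {}).get(recipient, False)

print(may_share(user_policy, "heart_rate", "fitness_app"))   # allowed
print(may_share(user_policy, "heart_rate", "ad_network"))    # denied
print(may_share(user_policy, "eye_tracking", "ad_network"))  # unlisted -> denied
```

The deny-by-default lookup is the important design choice: a data type the user never ruled on is withheld, which is the opposite of today’s blanket-consent model.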

But it is not that simple, since the person making the call about who can obtain what data is not aware of the latent meaning within that data.  It is necessary to empower users with their own data, and also with the tools and understanding they need to weigh the value of giving someone their data against the services and software they receive in return.  The difficulty lies in making individuals aware of a danger they cannot perceive, and convincing them that it is something they need to pay attention to.  The problem is of a social, economic, and political nature, and therefore does not have a technological solution.  Rather, a grassroots consumer movement, driven by an awakening to these dangers, must bring about the economic and political change necessary to realize true personal data empowerment.
