This past year has seen the launch of several products that claim to predict what we want. Most of these products are content with anticipating our immediate desires. Google Now, named innovation of the year for 2012, suggests the quickest way home or recommends places for lunch depending on where you are and what time it is. MindMeld, a voice-call app dubbed ‘Siri on steroids’ that also launched last year, listens to your conversation and predicts the online content you might want to see in the next ten seconds.
However, it looks as if predictive computing is about to get bolder. Where are you going to be 285 days from now at 2pm? Microsoft researchers claim to know, using big datasets to predict your location far into the future with, apparently, an accuracy of more than 80 percent. Armed with this knowledge, the researchers imagine commercial applications like adverts that say “Need a haircut? In four days, you will be within 100 meters of a salon that will have a $5 special at that time.”
As anticipatory computing moves from thinking about the next ten seconds to worrying about next year, what will this mean for privacy? Setting aside for the moment the accuracy of such predictions, the point, as Eric Siegel says, is that ‘predictive analytics reveals a future often considered private’. In other words, when companies start using predictive analytics, privacy stops being just about controlling what historical information is held about you; it also comes to cover the speculations they make about your future self.
It could be a supermarket predicting you are pregnant before you’ve told your family, or your employer anticipating whether you are one of the employees most likely to quit. These are important life events that you probably don’t want other people betting on until you yourself know. You almost certainly don’t want people speculating and getting it wrong.
And in this new era of anticipatory computing, the less data you share, the less accurate the predictions made about you become. Of course, no one really claims to be able to predict the future accurately, which is why the emerging market around anticipatory computing talks about relevancy instead. The more personal data Google Now can access about you, the more ‘relevant’ its predictions about what you want become.
Receiving irrelevant recommendations is irritating, but it’s not life-changing. The real issue for the privacy debate is this coming scenario: if you block access to your personal data, a company will simply use what little data it can find to make a prediction about you anyway. This could have real consequences, such as increasing your insurance premium or making it harder to secure a mortgage.
So what’s at stake in predictive analytics is more than the privacy of the personal data used to make predictions about your future. We need to start thinking about what happens when the companies making predictions about us get it wrong. The privacy battleground will move from the volume of data held about you to the statistical assumptions hidden within the algorithms that decide how your future self is presented to the rest of the world.