Facebook’s announcement that it is testing a digital assistant called “M” means that each of the “big five” technology companies is now in the digital assistant game. Facebook M joins Apple’s venerable Siri app, along with Google Now and Microsoft’s Cortana. Even Amazon has the Echo, a voice-activated internet of things appliance.
These assistants might revolutionise how we interact with our digital devices, our homes, and the world. They promise to effortlessly help us find, and even predict, what we want. Facebook says M will make advances by leveraging its unmatched database of personal information, coupled with invisible human “trainers”. By applying artificial intelligence to your Facebook data, M could help you buy gifts, book travel, and reserve tables at restaurants.
But there’s a catch. And it strikes at the heart of what’s at stake in our digital lives. Just like human assistants, M and its competitors work better the more they know about you. The more you invest yourself and your data in its world, the more effective M will likely be. For M to work its best, we must trust it with everything.
When our lives and corporate interests are cut loose
M’s demands for our trust will expose us to Facebook more than ever before. Consider what this could mean in practice. Linked to Facebook, M could accidentally share sensitive details about our lives with others. In an age when data breaches seem inevitable, a hack of M could make Ashley Madison look like small potatoes.
More fundamentally, will M be acting in our best interests, or will its loyalties lie with Facebook and its advertiser clients? We might never know, because the important details of how these technologies work are shrouded in secrecy, and Facebook is under little obligation to be transparent about how M works.
One reaction to this might be that M is creepy, but creepiness tells us little about whether a technology will improve or worsen our lives. Similarly, M will be touted as resting on user choice, expressed through consent to terms of service. In practice, no one reads these policies, such as Apple’s infamously long and wordy document. And they shouldn’t be expected to. These long, complex, take-it-or-leave-it terms, like other problems in consumer protection, usually just saddle us with the risks of our valuable personal data being disclosed.
Read the full piece at The Guardian.