Over the last two years, two troubling incidents at large companies have highlighted the challenge facing any modern business that relies on vast amounts of aggregated data to provide a compelling service.
The first was Facebook’s use of data to test how people behaved when they were made upset.
The second was Uber’s threat to use information about how customers use its service for less than admirable purposes.
In both cases, customers had – unknowingly – handed over vast amounts of private information to a service provider that could use that data in ways that didn’t align with their best interests.
Although it’s easy to demonize Facebook and Uber, let’s not. Big Data is an industry that has exploded, and we are all collectively learning how to behave. Every industry has gone through a maturation phase. And my goal is to suggest a path for good because I believe in the value of these services.
The danger for service providers is twofold.
The first is that customers will begin to distrust service providers and stop using their services. As someone who is very private about certain personal topics, I find that Facebook’s confusing and ever-changing approach to privacy makes it difficult to use the service for topics where privacy matters. This is a reversal from my early days with FB, when its commitment to my privacy was the reason I used the product.
The second – and more likely – danger is that customers will demand, or gravitate to, services that protect their privacy, and that incumbents will step in to provide that security. As a result, the ability to use data to deliver better and more personalized services will suffer. Almost as if to prove my point, WhatsApp now has a fully encrypted messaging channel. With fully encrypted messaging, SPAM becomes a much harder problem to solve, and ad-supported free mail services become much harder to monetize – this is not necessarily a better world for consumers…
What needs to be done?
I believe that people are decent. The world works not because we police it but because decent people know what they should do. We have morals and standards, and decent human beings gravitate to them.
With Big Data comes the power to know more about your customers and to manipulate them in ways that are not always aligned with their wants and desires, so the danger of unintended evil is great.
Much like some medical experiments are not done because we think they are wrong, some data uses are wrong.
I don’t think regulation is necessary at this time; I think regulation would hurt the industry. What I do think is needed is a Hippocratic Oath for Data Scientists. If we agree on what acceptable behavior is, then most decent people will behave in the right way, and the bad ones will be easily identified as bad actors.
More laws won’t protect us from bad people; decent people knowing what is right will protect us.
In that spirit, let me offer an oath.
- I will do no intentional harm. I will not knowingly manipulate people to be unhappy or sad or miserable without their explicit, clear, and obvious consent.
- I will never use our data in ways that are not aligned with the customer’s needs.
- The company is not the customer, and when I must choose between the customer’s needs and the company’s, I will always choose the customer. My job is to protect the user’s data, not the company’s survival.
I suspect that if those three rules existed and we relied on the basic decency of human beings, the recent – and justifiable – outrage would be far more muted, because the incidents that provoked it would not have happened.
We are a new industry, and as a new industry, we are figuring things out, and figuring things out means making mistakes.
Let’s take this opportunity where we made mistakes to make things better.