With Big Data Comes Big Responsibility
by HBR Editors

Big data and the “internet of things”—in which everyday objects can send and receive data—promise revolutionary change to management and society. But their success rests on an assumption: that all the data being generated by internet companies and devices scattered across the planet belongs to the organizations collecting it. What if it doesn’t?
Alex “Sandy” Pentland, the Toshiba Professor of Media Arts and Sciences at MIT, suggests that companies don’t own the data, and that without rules defining who does, consumers will revolt, regulators will swoop down, and the internet of things will fail to reach its potential. To avoid this, Pentland has proposed a set of principles and practices to define the ownership of data and control its flow. He calls it the New Deal on Data. It’s no less ambitious than it sounds. In this edited conversation with HBR senior editor Scott Berinato, Pentland talks about how the New Deal is being received and how it’s already working—in a little town in the Italian Alps.
HBR: How did you come to be concerned about data collection and privacy?
Pentland: In my research at the Media Lab, I use wearable sensor technology that measures tone of voice, movement, gesticulation—innate behaviors—to collect very personal data about how people communicate with one another. When I started that work, I was impressed by the power of the data being generated, but I also saw very quickly how it could be abused.
Collectively, we now have data that could help green the environment, create transparent government, deal with pandemics, and, of course, lead to better workers and better service for customers. But obviously someone or some company can abuse that.
And the internet of things exacerbates this concern?
I think so. Data is data, regardless of how it’s generated. If anything, the internet of things helps people see what they’re actually doing. When you pick up the kids, how fast do you drive? How much food do you eat in a week? How much time do you spend in your kitchen? In your bedroom? Data points like these make people feel invaded. As sensors are built into more and more products, there’s a sense of being increasingly spied on.
So when consumers become aware of data collection, they start to ask, “Is it really okay that I’m letting a company collect information about my workouts and heart rate?”
Yes. And some consumers may decide they’re fine with it. But right now there’s no notification that people are spying on you, collecting data. It’s a big, ongoing battle among industry, regulators, and consumer groups: Do you have the right to know what people are collecting?
You believe that people do have a right to know. How did the idea of the New Deal on Data form?
I thought we needed to create a win for customers and citizens, a win for companies, and a win for government. In 2008 I wrote a policy piece for the World Economic Forum, which continued as a series of meetings and follow-up pieces. It laid out the power of this data, and the disaster scenarios, and the idea of a total reset: the New Deal on Data.
The term “New Deal” has historical resonance and connotes huge ambition. Was that deliberate?
Yes—that’s exactly why I chose it. The original New Deal in the United States was a reset, and it turned out to be a pretty good thing for at least 50 years. It really changed the way people thought.
What, specifically, is the New Deal on Data?
It’s a rebalancing of the ownership of data in favor of the individual whose data is collected. People would have the same rights they now have over their physical bodies and their money.
So it’s not just guidelines—you’re proposing rules under which people can control data about themselves?
Yes. The New Deal would give people the ability to see what’s being collected and opt out or opt in. Imagine you had a dashboard that showed what your house knows about you and what it shares, and you could turn it off or on. Maybe there’d be some best practices concerning that data. Then people wouldn’t get so flipped out, because they’d know what was going on and why it was going on, and they could control it.
Transparency is key. The data being recorded about you will form a fairly complete picture of your life. You need somewhere to store and manage it, because it’s very valuable when it’s together in one place. Seeing all the patterns of your life allows you to personalize medicine, personalize insurance, personalize finances. The question is, Who’s going to hold the complete picture? Some credit-rating service? I hope not. Google? No. Is it going to be the individual? I hope that’s the way we end up going.
Are companies afraid that if they’re transparent with customers, then customers will opt out?
A lot of companies are afraid that this kind of regulation will kill their business models, and in some cases they may be right. Many telecommunications companies, for instance, have tried to get permission from customers to share data. They’ve spent hundreds of millions of dollars on it and have gotten basically nowhere. Look what happened when Do Not Call became an option.
So some businesses may disappear, but that’s probably good—the economy will be healthier if the relationship between companies and consumers is more respectful, more balanced. I think that’s much more sustainable and will prevent disasters.
You mean real disasters, not your garden-variety data breach?
I don’t mean just credit cards being stolen. I mean people selling data out the back, and criminals using it for some enterprise that affects critical systems, and people dying as a result. If that kind of disaster happened, there would be an overreaction: Shut it down. You’d see very strong regulation passed overnight, and a lot of companies would be in deep trouble. That’s what I’d like to avoid, because big data and the internet of things can create a lot of positive change. The New Deal gives customers a stake in the new data economy; that will bring first greater stability and then eventually greater profitability as people become more comfortable sharing data.
The internet of things seems to put data generation and collection into exponentially more mission-critical systems: supply chains, power grids, cars, food, health care. As the data gets closer to our physical selves…
It already is. For example, something like two million people are running around with wireless pacemakers in them. Somebody can wirelessly look at your heart rate, and that’s already yielding great improvements in health care. I can also see terrible outcomes if the owner of the heart doesn’t control that data.
The New Deal rebalances the ownership of data in favor of the individual whose data is collected.
Businesses are investing billions in strategies that rely on unfettered access to data. Google bought Nest. Facebook acquired WhatsApp. Wearable health tech is taking off. These companies want to own all the data about consumers’ health, location, preferences, and behaviors.
Well, if you’re an internet company, you look at the New Deal on Data and you say, “This is nuts.” These companies are going to have to work hard to show their customers the value in collecting all this data. I was at a World Economic Forum meeting where Nest was explaining what it does, and there was practically a revolution in the room. You mean Google is now going to know the temperature in my kitchen and when I went into the living room? It was like “Over our dead bodies!”
People are OK with sharing data if they believe they’ll benefit from it and it’s not going to be shared further in ways they don’t understand. That has shaped some of the legislation that’s coming out of six years of work on the New Deal on Data, such as the Consumer Privacy Bill of Rights proposed by the Obama administration and the EU’s data protection directives. This kind of legislation is going to kill a lot of weird “Let’s collect everything about everybody” strategies.
Those strategies may not be as good as the companies believe, either. I don’t think companies realize that the costs of a “grab all the data” strategy are very high. They’re taking on huge amounts of risk in the form of data breaches and damage to critical systems. Not only is it expensive to maintain security, but breaches will become increasingly expensive. The FTC has made it very clear that it’s going to come down hard on them. And along with financial risk, there’s brand risk. Target has really suffered from a breach that wasn’t even its fault. It just failed to catch somebody who was inserting a little bit of software—accessed through an HVAC system, speaking of the internet of things. I think companies don’t realize that these strategies contain poison that can come back and bite them.
Then again, if you’re in a regulated industry, such as telco, or banking, or health care, you need a license to operate and you haven’t really been able to monetize your data. The regulator says you can’t get into the data business; it’s not your data. Regulators are now beginning to say to these enterprises, you can get into the data business if you respect the New Deal on Data. The key is that they have to abandon the trick-and-trap of internet companies’ EULAs [end-user license agreements] and complex terms and conditions, where we all have to click “I Agree.” The New Deal actually engages customers, which could be much more valuable in the long run, because you’re building trust. But I can see how for internet companies it’s a little scary to start down this road.
How do you answer the CEO who says, “Look, it’s good you’re thinking about this, Sandy, but you’re hampering innovation. We need to collect this data. We’ll figure out how to make it secure and make sure people are comfortable with what we’re collecting.”
I think that’s exactly wrong. Look at the financial industry, for instance. Starting in the 1800s, it was basically unregulated, and we had booms and busts that destroyed huge swaths of the economy and ruined many families and communities. That’s where we are with personal data. People say personal data is the new oil of the internet. What they mean is that it’s a new asset class, a new value, a new money. And we don’t have the regulations to treat it like the value class it is. We need data banks. We need data auditing. We need them in order to avert huge disasters from breaches and attacks and class action suits. The FTC has said it’s going to go after data collection, and the only thing that limits it is that it doesn’t have enough lawyers. But it may get more lawyers if it collects a lot of big fines.
Who’s going to hold the complete picture of your life? Some credit-rating service? Google?
What happened when finance was regulated? Less volatility, greater trust, and more-successful financial institutions. People will be more willing to share data if they’re confident that it’s safe to do so. Right now there’s lots of data you would never share. You’d never share the location of your kids. You’d never share certain financial things you do. If this were a regulated industry, you might feel comfortable sharing personal data.
Is there any evidence that data is like money in this way?
Yes. We’ve set up some safe-harbor areas in Europe—cities that run by different rules than the rest of Europe. In Trento, Italy, hundreds of families are living with the New Deal on Data. They get notification and control of data generated about them. It’s securely shared in an auditable way. And guess what? These people share a lot more than people who don’t live under New Deal rules, because they trust the system and recognize the value in sharing. Being confident about your personal data makes for a better economy, not a worse one.
Trento is a collaboration with Telecom Italia and Telefónica (disclosure: I sit on the advisory board of Telefónica), and it is a way of anticipating data protection regulation in the EU. We’ve used software called openPDS, developed by my group at MIT. It stands for “open personal data store,” and it allows people to see what data companies have and to share data in a secure, safe way. The idea of the Trento experiment is to ask how people feel about this way of sharing data, how they use it.
A big question is, Does an openPDS environment generate more innovative services or fewer? The answer seems to be that you get more, because consumers have explicitly trusted you with their data, so you can offer services you never would have been able to offer before because it wasn’t safe. A simple example is sharing financial data with peers. When anonymous sharing is safe, you get new sorts of applications that rely on the trust and security of the openPDS platform.
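To make the openPDS idea concrete, here is a minimal sketch of a consent-gated personal data store. This is a hypothetical illustration written for this article, not the actual openPDS API; the class and method names are invented. It captures the three properties Pentland describes: the individual holds the data, third parties see only what has been opted in, and every access is auditable.

```python
# Hypothetical sketch of a consent-gated personal data store, in the spirit
# of openPDS. Not the real openPDS library or its API.
from dataclasses import dataclass, field
from typing import Any, Optional

@dataclass
class PersonalDataStore:
    """Holds one person's data; requesters only see opted-in categories."""
    _data: dict = field(default_factory=dict)     # category -> value
    _consent: set = field(default_factory=set)    # categories shared
    _audit: list = field(default_factory=list)    # (requester, category, granted)

    def put(self, category: str, value: Any) -> None:
        self._data[category] = value

    def opt_in(self, category: str) -> None:
        self._consent.add(category)

    def opt_out(self, category: str) -> None:
        self._consent.discard(category)

    def request(self, requester: str, category: str) -> Optional[Any]:
        # Consent is checked on every request, and every request — granted
        # or denied — lands in the audit log the individual can inspect.
        granted = category in self._consent and category in self._data
        self._audit.append((requester, category, granted))
        return self._data[category] if granted else None

    def audit_log(self) -> list:
        return list(self._audit)

pds = PersonalDataStore()
pds.put("heart_rate", 72)
pds.put("location", "Trento")
pds.opt_in("heart_rate")                          # share health data only
print(pds.request("doctor_app", "heart_rate"))    # 72
print(pds.request("ad_network", "location"))      # None — never opted in
print(len(pds.audit_log()))                       # 2 — both requests recorded
```

The design choice mirrors the “dashboard” Pentland imagines: opting out is a one-line call that takes effect on the next request, and the audit log is what makes sharing “securely shared in an auditable way” rather than a one-time EULA click.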
Can this scale?
Absolutely. We’ve had extensive discussions with companies that have hundreds of millions of customers, and yes, it can.
Are you hopeful the New Deal will truly come into being?
I’m quite hopeful, actually, because people are fed up. They’re cynical. If you ask what they worry about, identity theft comes in ahead of nuclear war. They don’t do much about it because they don’t see that they can do much, but the New Deal is a good, plausible thing we can do today. Regulators believe in it. Computer scientists believe in it. Smart people who are the heads of tech companies think we can do this. They may not be in favor of it, but they think we can do it. It simply requires that creative businesspeople harness the will of consumers in order to construct a value proposition better than the current steal-all-your-data paradigm. We’ve just got to push on through.