Are tech companies doing enough to protect customer rights and privacy?

When Apple CEO Tim Cook said his company would not comply with the FBI’s demand that it help unlock an encrypted iPhone used by one of the San Bernardino shooters, it set off an intense debate about how the right to privacy and other civil liberties apply to the digital technologies we rely on every day. The ensuing controversy has also highlighted how little most people know about who has access to our personal data and how that information is used—and even why we should care.
 
These are issues that Rebecca MacKinnon—researcher, Internet freedom advocate, and author of Consent of the Networked: The Worldwide Struggle for Internet Freedom, one of the most dog-eared books on my shelf—has been drawing attention to for a long time. Last year, MacKinnon launched the Ranking Digital Rights Corporate Accountability Index, which details how some of the world’s most powerful Internet and telecommunications companies fare when it comes to protecting their customers’ freedom of expression and privacy. The first of its kind, the Index helps us see how corporate practices shape the lives of billions of people around the world, and understand the kinds of questions we need to be asking about the digital products and services we rely on.
 
To MacKinnon, the fate of the Internet is a shared responsibility. She urges us to stop thinking of ourselves as passive “users” of technology who “consume content” and are “served ads,” and instead assert our rights—to information, access, and privacy—in the same way that people throughout history have fought for civil and human rights. We now live in a world where a person’s ability to get a loan can be determined by an algorithm that scrutinizes their social network, and the difference between Apple’s and Google’s approaches to encryption on mobile devices is the difference between being surveilled and not—with people living in the Global South or in low-income U.S. communities typically on the losing side. With our political lives so dependent on privately owned digital services and platforms, MacKinnon’s work reveals how the dynamics of global power are changing.

Though we spoke before the news broke about Apple’s dispute with the FBI, that news makes MacKinnon’s case for transparency and accountability even more timely. As she told me, “When you see Tim Cook from Apple standing up to the NSA, he’s doing the right thing. But it also happens to be really good for his business.”

 
Let’s start with the basics: Why did you create the Ranking Digital Rights Corporate Accountability Index, and what are your goals for it?
 
In Consent of the Networked, I talk about how power and human rights—our traditional concepts of how you protect rights and have democratic governance—are being challenged in the Internet age. Increasingly, companies are exercising a kind of private sovereignty over people’s lives. Google cannot put you in jail, but the decisions companies like Google and Facebook and AT&T and Vodafone are making—about their design, what data is collected about us, who it’s shared with, how they handle their relationships with governments, what private rules they set for what you can and cannot do with these services and platforms—increasingly shape what you’re able to do in real life. They shape who you know, what you learn, political outcomes, what the government knows about you. And so companies have a responsibility. If these platforms and services are not run and developed with a commitment to users’ rights, people’s rights and freedoms are going to be abused.

There is actually a great deal of variation in how governments regulate companies, and in companies’ own practices and norms. A global ranking that compares the policies and practices of a sampling of companies across the world helps make that variation visible.

 
How do companies themselves benefit from respecting people’s privacy and abiding by human rights standards?

The idea that a company’s only obligation is to maximize profits—that they shouldn’t care about anything else—is no longer a mainstream view. Increasingly, there’s awareness that companies need to behave sustainably and contribute to an environment in which our children and grandchildren can live.

If you want your brand to be trusted and people to feel good about using your products, if you want a prosperous and creative community in which to operate, it’s in your long term interest to act responsibly. With Internet and telecommunications, if people don’t trust your service or your products, that’s not in your long-term business interest.

In the post-Snowden era, we’re seeing that a number of companies have been identifying privacy and security as a selling point, and feeling that respecting and defending users’ rights is a profitable way to go. When you see Tim Cook from Apple standing up to the NSA, he’s doing the right thing. But it also happens to be really good for his business.

 
Who did you develop the Ranking Digital Rights Index for?

The Index connects advocates with clear information about what companies are doing, and how they can improve freedom of expression and privacy protections for their users. Based on that information, those advocates can approach the companies with very specific ideas about how they should improve those policies.

We also provide comparisons between companies, and concrete examples of what good practice looks like. Sometimes companies say they do a particular thing for a justifiable reason; they can’t do it any differently. But then the ranking points to other companies who do it differently and better—and the first company didn’t know about it. We’re also finding it useful as a source of concrete data for specific conversations that relate to company policy and practice in response to emerging events.
 
Among American companies, transparency about government requests has improved a lot in the past few years. But transparency about terms of service enforcement and private requests, especially to remove content, is much weaker. That helps to point to concrete weaknesses in company practice that can form the basis of conversations about how to improve.

 
Soon after the Index was released, Yahoo’s Business and Human Rights program announced that it is studying the results of the Index and looking forward to discussing how Yahoo’s policies affect users’ privacy and free expression. What can you tell us about reactions from Yahoo and some of the other companies included in the Index so far?

Yahoo, Google, Facebook, and Microsoft have all said publicly, to varying degrees, that they found this exercise useful. It sparked internal conversations about how their policies and practices are perceived externally, and about what they want to do in response to our evaluation of them.

We have seen a couple of changes already. Less than a month after we published the Index, Facebook came out with a new transparency report. They actually clarified some things that we dinged them for being unclear about in their previous transparency reporting. I think we had something to do with that change. Too late to be included in the Index, Microsoft came out with a new transparency report on content removal. They’d previously done a transparency report on government requests for user information, but hadn’t reported on content removal and restrictions.

 
Why is a transparency report like that important?

Users need to understand what demands governments are making of companies to hand over their data, or to restrict their speech or their access to content. And they need to understand how companies are responding to those demands. The most insidious kind of censorship is when people don’t know the censorship is happening. That happens in a lot of places where governments are demanding content be taken down and there’s no awareness that it’s happening.

And so transparency about the circumstances under which people’s accounts are deactivated, under which people’s speech is removed from a platform, and under which—and on whose authority—content or access to content is blocked: that’s fundamentally important to freedom of expression, and to holding accountable those who limit our freedom of expression.
 
Ideally we want to see transparency on both sides: companies tend to be much more open with the public about what they’re giving governments than governments are about what they’re asking those companies for. But fundamentally, this is about making sure that the companies we’re sharing our information with are exercising their power responsibly, not abusing it.

 
We often hear people say they aren’t worried about privacy because they aren’t doing anything wrong and don’t have anything to hide. Yet when they understand that data can be collected about them or about their children—data that might affect their ability to get a loan or a job—that makes them think about it differently. Is part of this initiative about changing people’s relationship to privacy?

Absolutely. People are increasingly becoming aware that you don’t have to be a dissident or somebody who’s “up to something” to get ensnared in a set of unfortunate consequences. There are concerns about whether being from an immigrant family and having relatives in certain parts of the world is going to make you a target for surveillance, or whether your rights are going to be infringed upon by association with a friend of a friend in your Facebook network.

Growing numbers of people are starting to recognize that the small details really matter: that whether Tim Cook of Apple decides to encrypt the iPhone’s data or not is one of many choices that make a real difference. And there are actions we can take to move things in a direction that makes people’s rights a priority.
