Biden’s Peloton, Your Fitbit, And Our Digital Privacy
Is it risky for the president of the United States to use smart products like a Peloton or a Fitbit? And if so, what does that say about the personal privacy of all the rest of us?
And will we ever have a global solution to privacy across our entire digital lives?
“A little bit of tape goes a long way when you want to maintain your privacy,” former Googler and current Sourcepoint CEO Ben Barokas told me in a recent TechFirst podcast. “90% of users are really not going to care, and they’re going to happily give their data and their attention to places that they trust.”
Barokas left Google to found Sourcepoint, a company focused on building a “sustainable media ecosystem” where people make choices about what’s private, what they’ll share, and what they’ll get back in return for sharing that.
In other words, for building a more open way to trade privacy for product: free email, free search, free social, free messaging, and so on.
Asked why he left the adtech giant and raised $47 million for the startup, Barokas is brief and humorous: “penance for my sins.”
“It’s not that I look at what I did previously as exploiting people’s data,” he told me. “That being said, I’ve seen vendors, I’ve seen actors, I’ve understood processes that were not as above board as they should be. And I feel that users need to have a better understanding of the transactions that occur as they travel around the web and they go from app to app.”
Which brings us back to Peloton and Fitbit. Or, perhaps to the phone in your pocket or the websites you surf or the apps that you use.
All of them collect data on you. All of them use that data to improve their services. All of them transfer that data over the internet to servers in the cloud. But many of them may also use your data to target you for ads. And some of them also package that data to sell to others.
Peloton, of course, is a paid service. Your data stays within that company. And yet having the data travel on the internet was enough of a risk for Michelle Obama to rip the camera out of her machine. Fitbit, owned by Google, has made assurances that its data will not be commingled with Google’s other data on you, and that it won’t be used for ad targeting. But many other services are not paid, or at least not explicitly. Those are the services that collect and leak the kind of data that lets companies track who stormed the Capitol, see where a president is moving in near real time, or reveal the location of a secret US military base in the Middle East.
“Even if you’re using a virtual private network in order to encapsulate your data on the way to the servers, the question is once somebody has your identifier for advertising or your unique ID from your mobile device, and then you’ve given them permission to give them Lat/Long or not only your location coordinates, but any number of browsing behaviors … once that gets to someone else’s servers, it’s very easy to copy that data and send it along,” says Barokas.
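The point Barokas is making is structural, not cryptographic: a VPN protects data in transit, but once a request carrying your advertising identifier and coordinates lands on someone else's server, duplicating and relaying that record is trivial. A minimal, hypothetical sketch (the payload fields and partner names are invented for illustration):

```python
# Hypothetical sketch of the problem Barokas describes: once an ad
# request carrying a device identifier and location coordinates reaches
# any server, nothing technical stops that server from copying the
# record and passing it along to other parties.

import copy

def handle_ad_request(payload: dict, downstream_partners: list) -> list:
    """Receive a request and fan copies of it out to 'partners'."""
    forwarded = []
    for partner in downstream_partners:
        record = copy.deepcopy(payload)   # duplication costs nothing
        record["relayed_to"] = partner
        forwarded.append(record)
    return forwarded

request = {
    "idfa": "ABCD-1234",                  # advertising identifier
    "lat_long": (38.8977, -77.0365),      # location coordinates
    "browsing": ["news", "fitness"],      # behavioral signals
}

copies = handle_ad_request(request, ["dsp.example", "broker.example"])
print(len(copies))  # 2 copies, each carrying the same ID and location
```

Encryption in transit never enters into it; the copy happens after the data has been legitimately delivered.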
Privacy advocates have won legislation like GDPR in Europe and CCPA in California to ensure that data use is open, sanctioned, and limited, but the reality is that most of us click through the “we collect your data” screens without a moment’s thought to get to what we want.
And for even more vulnerable groups like teens who just want their apps and games and videos, or seniors who may not fully grasp what is at stake, those warnings and clickthroughs probably mean even less. Which, of course, leads to situations like Cambridge Analytica and the weaponization of personal data to manipulate populations.
“The value exchange has to be very clear and everyone should have the choice to either pay with data and attention — which we normally understand to be advertising, but doesn’t always have to be advertising, that can be bifurcated — or you can pay with fiat currency,” Barokas says. “But understanding that there is a transaction that’s occurring every time digital utility is being utilized, and every time there is content that is being consumed, must be clear, and that must be understood globally. And once that’s understood, then everyone has made a free choice.”
My question, however, is how realistic that is at our current level of technological sophistication. Knowing how I treat GDPR and tracking notifications (annoyed, clicking through as quickly as possible), I’m not making truly intelligent choices.
What we need is functionality in the browser itself that sets global standards for what we are willing to give and what we are not, so we can make a one-time, updatable, intelligent choice that our technology then mediates with every website we engage with. And an equivalent operating-system-level function on our phones for all our mobile apps, of course.
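An early version of this idea already exists: the Global Privacy Control proposal, under which a browser attaches a `Sec-GPC: 1` header to requests when the user has set a one-time opt-out preference. A sketch of how a site might honor such a signal (the function names and responses here are illustrative, not part of any spec):

```python
# Sketch of a server honoring a browser-level privacy signal, modeled
# on the Global Privacy Control proposal: the browser sends the header
# "Sec-GPC: 1" once the user sets a global opt-out preference, and the
# site adjusts its behavior without showing another consent dialog.

def user_opted_out(headers: dict) -> bool:
    """Return True when the GPC-style opt-out signal is present."""
    return headers.get("Sec-GPC") == "1"

def handle_request(headers: dict) -> str:
    if user_opted_out(headers):
        return "serving page without third-party trackers"
    return "serving page with consent dialog"

print(handle_request({"Sec-GPC": "1"}))
```

The appeal is exactly what the paragraph above asks for: the choice is made once, in the browser, and the technology mediates it with every site, rather than every site re-asking through its own clickthrough.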