Mark Zuckerberg survived the gauntlet. In the face of substantial pressure from several members of Congress to testify personally on the Facebook/Cambridge Analytica mess, this week he sat through ten hours of questioning from the Senate Judiciary and Commerce Committees and the House Energy & Commerce Committee.
And, in my opinion, he did a decent job. He was calm, and generally seemed in control, despite some very aggressive lines of questioning. Sure, there are the Twitter jokes about him seeming robotic, but the baseline for emotional connection at a Congressional hearing isn’t all that high.
But I still think he missed a big opportunity to steer Facebook, and maybe even the entire tech industry, back onto a course of trust with internet users and government policymakers all around the world. The tech industry is on the hot seat right now on a host of fronts — in my last post I referenced my frequent characterization of last year, 2017, as the year of “everybody attacks tech.” Contributing to that tension are the robust data collection and use practices common to many successful businesses in the industry, practices with which Facebook is essentially synonymous. Users rarely have meaningful options to escape from these practices without sacrificing some or all of the features and benefits of online services.
I used to say that the culture of Silicon Valley is “collect all the data, store it forever, and figure out how to monetize it later.” Things are getting better, but we have a ways yet to go. Meanwhile, governments are starting to figure it out, and concerns are growing. So policymakers, particularly in the European Union but increasingly in the United States, are aggressively targeting tracking with an eye toward strengthening legal and regulatory frameworks.
We learned from the SESTA/FOSTA process that the tech industry can’t simply stonewall in a major political fight. We can’t play ostrich, claiming there’s no problem and waiting for it to go away. We have to listen, understand, and help shape the future together with policymakers and the public — or that future will be built without our input.
With that context in mind, here’s what I think Zuckerberg should have opened with. (Caveat: These aren’t my opinions about Facebook’s practices — this is a hypothetical of how I can imagine Zuckerberg describing Facebook’s practices.)
Facebook is a business. Our primary source of revenue is selling advertising. The ads we sell are targeted based on what our users say and do online.
Targeting ads is a valuable practice for us. It means we are able to invest in new experiences, and to acquire companies like WhatsApp and Instagram, to deliver a combined experience that is even more awesome for our users.
Using targeting makes ads more relevant, which many users appreciate. Some don’t, and we give them controls to change targeting for the ads they see. Others don’t appreciate the practices that we engage in, behind the scenes, to make targeting possible — our data collection and use practices.
We try to help people understand, and hopefully not feel uncomfortable with, our data collection and use practices. We document them and talk about them publicly, and we believe that what we do is state of the art within our industry when it comes to transparency and user empowerment. But that doesn’t mean we can’t do more.
We’ve made mistakes along the path to where we are today. We’re sorry for those mistakes. We know the trust of our users is paramount to the continued success of our business, and we will work to do better. Here’s some of what we have already done to rectify our mistakes and make them less likely to recur: We’re investigating apps that had access to large amounts of information and will ban any developers who refuse an audit or are found to have misused data. We’re limiting developers’ data access in additional ways going forward, which I’ll be happy to explain in more detail if you are interested. And we’re improving the tools users have to view and control the use of their data, both by Facebook and by apps running on our platform.
We currently give Facebook users many options to change our data practices as applied to them — for example, we allow users to prevent third-party platform access entirely. And we are actively exploring alternative business models that would take that empowerment even further, offering those users who so choose even more aggressive separation from our data collection and use practices. It’s important to do that in a way that still allows us, as a business, to provide a sustainable and complete service package that includes the full network and platform benefits every other Facebook user can experience.
We believe that the value proposition we offer in our core, free-to-the-user, advertising-supported business is and will continue to be the right model for the majority of our users. But we also know that users want and deserve to have meaningful privacy options. And, regardless of the choices our users make, it’s paramount for us to make our data collection and use practices as privacy protective and secure as possible.
That’s where the GDPR comes in. The European Union passed a strong privacy law that we and every other tech company must abide by. We opposed this law, not because we don’t care about privacy, but because we felt we could do a better job on our own of figuring out how to make our core business as privacy protective as possible. Reflecting on the Cambridge Analytica mistake and breach of trust, we should accept a dose of humility as to our own abilities here. Consequently, I’m happy to say that we’ve reversed our position on the GDPR, we embrace its principles, and we will voluntarily extend the practices and protections we offer to European Facebook users to Facebook users in the U.S. and around the world.
This alternative-universe opening is, of course, vastly different from Zuckerberg’s actual opening statement before the Senate, reprised in tone in the House. It’s different in both style and substance.
Stylistically, this alternative emphasizes that Facebook is a business, and a successful one, rather than a community, without dwelling on the dorm-room origin story. Facebook’s interest in privacy is then framed not as altruism (which doesn’t land well) but as business necessity: as a business, Facebook depends on delivering an experience that makes its users want to stay active.
Substantively (and stylistically), this alternative is extremely direct about the ultimate root cause, the sine qua non, of Facebook’s troubles: its data collection and use practices. By being upfront about that rather than avoidant, it meets the challengers head-on, shows an understanding of their concerns, shares with them a healthy dose of reality, and commits to presenting users with compelling transparency and more meaningful privacy options.
This alternative doesn’t touch on artificial intelligence, Russia, or the opioid epidemic, all of which played major roles in the hearings. But the same principles apply to those topics: be open about the business case and concerns, and about the business need to protect and empower users. And don’t hide under the cloak of technology. Sure, technology is still pretty opaque to many of the people asking questions, but it’s not the shield it used to be.
Would this approach make Facebook into a beloved mega-company, and Zuckerberg into an American hero? No. Because there’s no such thing anymore, not in 2018. And, that’s ok. Leave the days of the dorm room and the garage behind, tech industry; we’re the incumbents now. There are good, responsible ways to be a successful business, ways that will keep profits going and growing, and make policymakers champions rather than enemies. Embrace those.
Ultimately, I’d like to see Facebook and other tech companies go even further than that and, while maintaining their identity as businesses, radically reorient those businesses around users — treating transparency and empowerment as necessary for user retention. But that’s the next goal to pursue, and we have to establish a stronger foundation of trust before such an approach could be pursued honestly.