Interoperability: Questions of principle and principal

Chris Riley
Apr 27, 2018

In a previous post, I wrote that competition law and policy ought to push for more transparent, third-party accessible APIs to promote interoperability, including in vertical merger reviews such as the recent Facebook/WhatsApp integration. Focusing on the nondiscriminatory offering of APIs to enable interoperability encourages digital platforms to remain platforms on which other services can build. In contrast, letting the modern API ecosystem shrivel would convert today’s platforms into tomorrow’s gatekeepers, threatening internet openness.

In this post, I want to dig into the interoperability APIs concept a little bit more, to help unpack what it would look like in practice. APIs have been getting a lot of attention over the past few weeks, and not the good kind. The recent Facebook hearings in Congress centered on an early version of Facebook’s Graph API, one that allowed users to authorize the sharing of not only their own data, but also their friends’ data, with third-party developers. This wasn’t a data breach, and Facebook didn’t “sell” data to Cambridge Analytica. This was an API working according to (bad) design. One unfortunate possible outcome of this incident would be for Facebook, and other companies with substantial data pools, to shut the doors on their technical mechanisms for authorized sharing. Carried to an extreme, that would mean the end of the platform economy.

All those edges inside the blue circle? They’re probably API calls.

It’s not a huge stretch to say that APIs power the internet. Heck, there’s even an API Evangelist. And APIs have been the subject of competition challenges before. A few years ago, the Federal Trade Commission (FTC) conducted an antitrust investigation of Google based in part on the conditions the company imposed on its AdWords API. The FTC had substantial competition concerns with conditions that restricted advertisers from running the same campaigns on multiple search engines.

The AdWords API inquiry was the smallest of the three pieces of the FTC’s investigation, however. Both the FTC’s dismissal of charges related to Universal Search and the paired settlement on essential patents in the same proceeding took higher billing. Yet, as a result of the investigation, Google made concrete changes to the terms around its core advertising API, removing those controversial pieces. The business community around search took notice, and although the conditions were only legally required for five years under the settlement, Google announced late last year that it would extend them indefinitely.

The FTC’s AdWords inquiry drives home the importance of nondiscriminatory practices in the offering of APIs — a piece of the policy puzzle that I didn’t get into in my last post. A nondiscrimination standard for evaluating the terms and conditions of such APIs sets a high bar. But if the goal is to empower downstream integrators and create meaningful competitive horizontal choices for users, that bar must be set.

With the right scale of API provisioning from a competition policy perspective, platforms let others build businesses on their shoulders and give users more opportunities and choices for their online experience, without making available the scale of data that results in widespread privacy concerns or undermines the platforms’ own investment and competitiveness. The platforms then benefit by unlocking downstream innovation and future revenue opportunities.

So, the emerging principle in this context is that digital platforms should offer sufficient third-party accessible APIs to enable interoperability (the core challenge), and should offer them on a transparent, well-documented, nondiscriminatory basis (to avoid the AdWords dynamic of making them a tool for anticompetitive behavior), subject to individual user authorization (to try to prevent Cambridge Analytica-style mass sharing).
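
To make that principle a bit more concrete, here is a minimal sketch in Python of the two checks it implies on the platform side. Everything in it (the InteropGateway class, the scope names, the fields) is hypothetical and illustrative, not any real platform’s API; the point is simply that “nondiscriminatory” and “user-authorized” translate into two distinct gates: one applied uniformly to every developer, and one controlled by the individual user.

```python
# Hypothetical sketch: every third-party call passes two gates.
# Gate 1 (nondiscrimination): any developer who accepts the same published
#   terms gets registered; there is no case-by-case gatekeeping.
# Gate 2 (user authorization): the individual user grants specific scopes,
#   and only data within those scopes is released.

from dataclasses import dataclass, field


@dataclass
class Integration:
    app_id: str
    accepted_standard_terms: bool  # the same public terms for everyone


@dataclass
class UserGrant:
    user_id: str
    app_id: str
    scopes: set = field(default_factory=set)  # e.g. {"friends.read", "feed.read"}


class InteropGateway:
    def __init__(self):
        self.integrations = {}  # app_id -> Integration
        self.grants = {}        # (user_id, app_id) -> UserGrant

    def register(self, integration: Integration):
        # Nondiscriminatory: registration succeeds for anyone who accepts
        # the published terms.
        if not integration.accepted_standard_terms:
            raise PermissionError("published standard terms not accepted")
        self.integrations[integration.app_id] = integration

    def authorize(self, grant: UserGrant):
        # The user, not the platform, decides which scopes this app gets.
        self.grants[(grant.user_id, grant.app_id)] = grant

    def fetch(self, app_id: str, user_id: str, scope: str) -> str:
        if app_id not in self.integrations:
            raise PermissionError("unknown or unregistered integration")
        grant = self.grants.get((user_id, app_id))
        if grant is None or scope not in grant.scopes:
            raise PermissionError(f"user has not granted scope '{scope}'")
        return f"{scope} data for {user_id}, released to {app_id}"
```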

Knowing we want APIs doesn’t answer all of the questions.

Which gets us to the question: APIs that provide access to what, exactly? To achieve the goals envisioned here, the offered access should include whatever core functionality and data are needed to enable effective interoperability, and not more. But that scope will probably vary greatly from service to service, depending not only on the platform offering the APIs, but also on the nature of the interoperability itself, including whether it is horizontal (a service competing with some ‘core’ business of the platform’s) or vertical (a service that depends on its integration with the platform’s ‘core’ business in order to exist or succeed).

Given these variances, it seems hard to offer a universal articulation of sufficiency in greater technical detail than the general principle. But I’m hoping that a case study will help illustrate some factors that could be taken into account by an adjudicator down the road trying to interpret that principle in practice. Because of its profile and the public understanding that has been built (at least a little) by recent news, I’ll use Facebook as the core of my case study in this post. (Aside: If your experience of Facebook differs from mine as to what’s ‘necessary’, and you wish to offer a different view of ‘core functionality’ — let me know! @mchrisriley here or on Twitter.)

Facebook is, in many ways, one of the hardest cases, because Facebook’s service is so complex. For my user experience, it includes, at minimum: 1) my News Feed of status updates, pictures, video, events, etc. posted by my friends; 2) the archive of my posts and photos of me, some of which tag other Facebook users, and some of which have comments and reactions left by other Facebook users; 3) events, pages, groups and other quasi-ancillary community style features; 4) my friends’ individual Facebook pages / “walls”, which I value the ability to write on (to say Happy Birthday, e.g.) and to peruse from time to time to catch anything that might not have been presented in my News Feed including their pictures; 5) Facebook messenger (not that I use it much); and 6) third-party apps built on Facebook’s platform.

There’s a lot of data out there.

The complexity of Facebook, and the vast amount of info so many of us voluntarily share to enrich the many dimensions of the experience, is what creates privacy risks. The specific problems with Facebook’s Graph 1.0 API related to the scope of data that could be authorized for sharing, including but not limited to ‘extended permissions’. Basically, everything you could see, you could share — and third parties were able to aggregate enough of that data to raise major red flags. But an ‘everything you can see, you can share’ principle goes well beyond ‘only that which is necessary to enable effective interoperability’.

With that context in mind: what do I, as a Facebook user, want to be able to authorize Facebook to share with a third party service to enable effective interoperability? Here are some examples:

  • If the third party service I want to use with Facebook is a messaging app, then I want to be able to share my friend list, to find out if any of them are also using the same messaging app. I don’t want the messaging app to be able to read from, or write to, my News Feed. But I would like to be able to authorize it to access photos of me and photos I’ve uploaded so that I can send them over the service. Ideally, I’d like to be able to read my Facebook messages in the app as well, and write messages to my Facebook friends in a way they can read them through the standard Facebook messenger interface.
  • If the third party service is a news aggregator, then I probably want to be able to authorize the service to read from my News Feed. But I don’t necessarily want it to see my friend list, and I don’t want it to be able to read my Facebook messages (or send any!).
  • If the third party service is a social media aggregator, I may want to be able to authorize both read and write access to my News Feed, so that I can write one update and post to Facebook, Twitter, LinkedIn, and other services with a single stroke.
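
These three cases suggest that ‘effective interoperability’ is really a question of scope granularity: each integration gets a narrow, purpose-fit slice of access. A rough sketch of that mapping, using hypothetical scope names rather than Facebook’s actual Graph API permissions:

```python
# Hypothetical, illustrative scope names; not Facebook's real permission model.
INTEGRATION_SCOPES = {
    "messaging_app": {
        "friends.read",    # find which friends also use the app
        "photos.read",     # send my photos over the service
        "messages.read",   # ideally, read my Facebook messages in the app
        "messages.write",  # ...and reply so friends see it in Messenger
        # notably absent: any News Feed access
    },
    "news_aggregator": {
        "feed.read",       # read the stories surfacing in my News Feed
        # notably absent: friends list, messages
    },
    "social_media_aggregator": {
        "feed.read",
        "feed.write",      # write one update, post it everywhere at once
        # notably absent: messages, most profile data
    },
}
```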

It gets harder to stretch beyond these three, thinking about hypotheticals that exist in poorly defined (or totally undefined) ancillary market segments. Imagine, for example, that I wrote an app that linked with Facebook and automatically posted on my behalf to say ‘happy birthday’ whenever a Facebook friend of mine had a birthday. (I’m not saying I would use such an app, but if it existed, somebody would.) Maybe it even has a business model behind it, e.g. it serves me ads for birthday gifts — even tailored ones, perhaps mining available data about the person celebrating their birthday to suggest better gifts. Do we care about this actual or potential economic market? And if so, should I be able to share my friends’ birthdays with this service to make it work, or is that a bridge of sharing too far?

I’m going to posit that the “core functionality” of Facebook as a social media digital platform, for purposes of effective cross-market-segment interoperability — setting aside Facebook Messenger as a platform in its own right, to keep things cleaner — includes 1) reading and writing of the pictures, video, and status text the user posts, is tagged in, and sees in the user’s News Feed, and 2) the user’s list of Facebook friends. I don’t think that most other data about the user or about the user’s friends needs to be considered for these purposes (e.g. birthdays, group membership, religious affiliation, salary information), and the risk/privacy tradeoff cuts against offering this data.
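
Expressed in the same hypothetical scope vocabulary as the sketches above, the posited baseline is a short list, with everything else excluded by default:

```python
# Hypothetical sketch of the posited "core functionality" boundary.
CORE_INTEROP_SCOPES = {
    "feed.read",     # pictures, video, and status text the user sees
    "posts.read",    # what the user posts or is tagged in
    "posts.write",   # let a third party post on the user's behalf
    "friends.read",  # the user's list of Facebook friends
}

EXCLUDED_BY_DEFAULT = {
    "friends.birthdays",
    "groups.membership",
    "profile.religion",
    "profile.salary",
    # ...and, generally, data about the user's friends beyond the list itself
}
```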

This may feel like an aggressive set of expectations: imagining that a successful company would willingly make this data available. But that’s only because we’re looking at the “mature” Facebook business. It’s natural for an early-stage tech company to start out by offering this kind of openness, and by taking advantage of the openness that others offer. That’s what it means to be part of the platform economy. Just as entrepreneurs and new businesses depend on reasonable access to others’ APIs, so too do they typically offer their own on reasonable (even free) terms, to encourage others to build on top of and with their products and services. It increases their visibility within a crowded tech ecosystem, helps them acquire new users, and can lead to new revenue streams from downstream innovations over time.

Growth changes things.

But then, at some point along the curve of scaling up, these incentives change. Looking just at the theoretical dimension: at some point, the benefits that third parties get from openness can exceed the benefits that the platform derives from that openness. At that point, the platform will take steps to internalize the externalities it has been generating — either by shutting down the APIs that offer that interoperability, or by charging a monopoly rent for their use. Some folks point to Slack today as evidence of this phenomenon in action (though there are usually arguments about security or user experience that make the reality a bit more complicated).

When a platform shuts down an external-facing API because downstream businesses get a greater relative benefit from the access (and thus a competitive edge), it forecloses the possibility of revenue from charging for access to that API, along with the future growth that access could generate. The same effect occurs if the API is offered but on discriminatory terms; the result is a tax on downstream growth and a net loss for the overall economy.

Take a step back. Imagine if the APIs at stake in the Google AdWords FTC case, or Facebook / Cambridge Analytica, were just … removed. (To be clear, neither Google nor Facebook has indicated any consideration of that.) There’s ample reason to view such a move, should it happen, as an anticompetitive act under current U.S. law. The reverse of this is the positive story I want to drive home — if we want competition policy to actually promote competition, offering effective interoperability via APIs must be a core tool in the toolkit.

We need to have somebody who’s able to bring the hammer, when needed.

More important than precision on the what is having an answer to the who — as in, who will make sure a policy of interoperability through APIs is maintained. We must have a regulatory body capable of evaluating specific contexts, and empowered to say ‘yes, this decision not to offer interoperability on nondiscriminatory terms constitutes a competitive harm that must be addressed.’

On some level, the notion of interoperability via nondiscriminatory access to APIs is broader even than competition policy. I think we can all agree Facebook was right to block Cambridge Analytica from its APIs. But what if it hadn’t been? Imagine if Cambridge Analytica had done nothing wrong, but its services were heavily used by the Republican party and not at all by the Democrats, and Facebook had decided not to grant it API access, while making the opposite decision for the hypothetical, Democrat-aligned Oxford Rhetorica. Is there a government actor who could hear such a dispute? Is there a standard that says Facebook should not be so partisan? I’ve been talking about interoperability and nondiscrimination in the context of competition law, and it’s not even clear we have the legal framework today to advance these competition goals. It’s much less clear whether any broader consumer protection or public interest standard could be applied by a government agency.

But I can see a light at the end of the tunnel here. It’s not hard to imagine a regulator, armed with some version of a public interest standard, empowered to evaluate closing-up actions, like Slack’s or the hypotheticals I’ve offered, and make a distinction between those that are on balance good because they create maximal efficiency and incentive to invest, and those that are on balance bad because of their downstream harm or, even more proximately, their anti-competitive impact on current or future competitors. It’s also straightforward to take this standard into merger review, and imagine conditioning future mergers — perhaps Apple and Shazam? — on the offering of APIs on nondiscriminatory terms to prevent leveraging of the combination in ways that stifle competition.

Is this going to fix all the competition problems we see in the tech sector? No. But it’s a solid foundation that echoes the core spirit of the internet, and gets us ahead of the next wave of potential gatekeepers in the internet ecosystem.

