Chrome’s position on the ESCAPE workshop
Jeffrey Yasskin, Chrome team, 2019-06-05
I have been developing the Web Packaging specifications that live in https://github.com/WICG/webpackage. As of June 2019, Chrome has shipped an initial version of “Signed Exchanges” and is working on an implementation of “Bundles”. Once the whole system is implemented, we expect it to allow publishers to sign bundles of content that can be distributed to customers in a variety of new ways, including peer-to-peer and via potentially-untrusted distributor websites, some of which will be large aggregators like Google or Facebook.
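To make the trust model concrete, the sketch below is a much-simplified Go illustration, not the real application/signed-exchange format (which uses CBOR serialization, certificate chains, and validity/expiry metadata per the specs in the repository above); the Exchange type and digest helper are hypothetical. It demonstrates the one property that matters for distribution: the publisher signs the URL, headers, and payload once, and a client attributes the content to the publisher's origin only if that signature verifies, no matter which distributor handed over the bytes.

```go
// Illustrative only: a much-simplified model of the check a client performs on
// a signed exchange. The real format uses CBOR encoding, certificate chains,
// and validity/expiry metadata as specified in WICG/webpackage; the Exchange
// type and digest helper here are hypothetical.
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/sha256"
	"fmt"
	"sort"
)

// Exchange is a toy stand-in for a signed HTTP request/response pair.
type Exchange struct {
	RequestURL      string            // the publisher URL the content claims to live at
	ResponseHeaders map[string]string // signed response headers
	Payload         []byte            // signed response body
	Signature       []byte            // publisher's signature over the fields above
}

// digest hashes the signed fields in a stable order. (The real spec defines an
// exact serialization; this is only enough for the demonstration.)
func digest(e *Exchange) []byte {
	h := sha256.New()
	h.Write([]byte(e.RequestURL + "\n"))
	keys := make([]string, 0, len(e.ResponseHeaders))
	for k := range e.ResponseHeaders {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		h.Write([]byte(k + ":" + e.ResponseHeaders[k] + "\n"))
	}
	h.Write(e.Payload)
	return h.Sum(nil)
}

// verify reports whether the exchange was signed by the holder of pub (the
// publisher), regardless of which distributor delivered the bytes.
func verify(e *Exchange, pub *ecdsa.PublicKey) bool {
	return ecdsa.VerifyASN1(pub, digest(e), e.Signature)
}

func main() {
	// The publisher signs its content once...
	key, _ := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	ex := &Exchange{
		RequestURL:      "https://publisher.example/article",
		ResponseHeaders: map[string]string{"content-type": "text/html"},
		Payload:         []byte("<h1>Hello</h1>"),
	}
	ex.Signature, _ = ecdsa.SignASN1(rand.Reader, key, digest(ex))

	// ...and any distributor (an aggregator, a CDN, a peer on a local network)
	// can hand the bytes to a client, which attributes them to
	// publisher.example only if the signature checks out.
	fmt.Println("attribute to", ex.RequestURL, "->", verify(ex, &key.PublicKey))
}
```

That same check is what makes the peer-to-peer and aggregator scenarios below trustworthy: the integrity guarantee travels with the bytes rather than with the connection they arrived over.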
We expect this new distribution system to have several positive effects in connecting publishers with their readers:
- When readers have restricted access to the global internet, whether through flaky connectivity, expensive connectivity, or government censorship, packages can be shared peer-to-peer, and the recipient’s browser can show a trustworthy description of the content’s publisher in its URL bar. The content’s JavaScript will also be able to share storage with resources loaded directly over the internet and to make same-origin requests to its publisher’s servers, unlike plain HTML files loaded from disk. At this workshop we’re hoping to validate whether publishers care about having that trustworthy URL when their content is viewed offline.
- Where publishers currently cede control to aggregators by writing articles in the AMP, MIP, Instant Articles, or Apple News formats and letting those aggregators re-host the articles in walled or fenced gardens, packaging may give publishers back the ability to run JavaScript under their own origins, along with the assurance that the aggregator hasn’t modified their content. It’s important to recognize that the Chrome team has no sticks that could force these aggregators to move away from their existing formats, but we believe Web Packaging satisfies the requirements that led several of the aggregators to mandate particular formats in the first place. As a demonstration of the efficacy of this carrot, Google Search has committed to moving away from its AMP requirement as standard features arrive that support the same UX goals.
- When publishers create content intended to be seen as a subresource of another top-level page, for example in a mashup, packages can allow the top-level page to serve that content more quickly by avoiding the need to create a connection to the origin server.
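As a rough sketch of the distributor’s side of that last case (the route, file path, and port below are hypothetical), a mashup’s own server can hand out a previously fetched, publisher-signed copy of the subresource; the browser gets the bytes without opening a connection to the publisher, and the signature check illustrated above is what still lets it credit the publisher’s origin.

```go
// Illustrative only: a mashup site serving a cached, publisher-signed
// subresource from its own server. The route and file path are hypothetical.
package main

import (
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/widgets/chart.sxg", func(w http.ResponseWriter, r *http.Request) {
		// Media type for version b3 signed exchanges, the version shipping in 2019.
		w.Header().Set("Content-Type", "application/signed-exchange;v=b3")
		http.ServeFile(w, r, "cache/chart.sxg")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```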
While designing Web Packaging and planning for future deployment, the Chrome team wants to make technical decisions that bias the web’s evolution away from centralization as much as possible, even if that reduces Google’s influence. In order to do that, we need to identify concrete mechanisms by which this new distribution system might cause centralization or decentralization of, or shift power between, aggregators, CDNs, publishers, advertisers, end-users, and other kinds of entities. For example:
- If the aggregators move away from their requirements to use particular formats that they then re-host, that shifts power from those aggregators to the publishers who had been opting into those formats, as described above. There is a more ambiguous effect on publishers who hadn’t opted into the formats.
- If we make packages explicitly list the origins that are allowed to distribute them, that is likely to increase aggregator centralization and limit any decentralizing effect on CDNs. Even an optional allowlist risks increasing centralization, though the risk is lower than with a mandatory one.
- If somebody finds a way to privately discover and fetch geographically-nearby packages that contain particular URLs, that is likely to have a decentralizing effect on CDNs. However, this currently appears to be a very hard problem to solve.
- The question of exposing the distribution URL of a package to that package’s publisher is more complex. Doing so would allow publishers to distinguish metrics for loads from servers they control vs servers they don’t, might allow them to fix loads from misbehaving distributors, might compromise user privacy about how they arrived at a link, and might increase centralization by helping already-powerful aggregators require that publishers redirect links back to the aggregator’s server. We could use publisher input on this sort of design question, in addition to the centralization questions.
It seems unlikely that any decisions we make in a packaging or distribution system will affect the considerations aggregators use when deciding how to rank recommendations, or the power those decisions give them over publishers. A new packaging system is, however, likely to change the details of how those considerations apply: for example, if packaged content loads faster, the existing preference for fast content may lead to higher rankings for publishers that can package their content.
In this workshop, we’re hoping to learn about other centralizing or decentralizing mechanisms that the publishing community is aware of, and to make sure we build a system that serves users', authors', and publishers' needs.