Cloudflare has revealed ambitious plans to launch, within the next year, a marketplace where website owners can sell AI model providers access to their site’s content. The move is the culmination of Cloudflare CEO Matthew Prince’s vision to give content creators more control over how AI bots interact with their websites.
“If you don’t compensate creators one way or another, then they stop creating, and that’s the bit which has to get solved,” Prince explained in an interview with TechCrunch.
As a first step toward this initiative, Cloudflare on Monday introduced AI Audit, a free set of observability tools. The feature gives website owners a dashboard of analytics showing how, why, and when AI models are crawling their sites. AI Audit also lets website owners block unwanted AI scrapers with a single click, giving them the power to shut out all AI bots or to selectively allow access to the scrapers they consider valuable.
The AI Audit tool also lets website owners monitor which AI scrapers are visiting their sites. A demo shared with TechCrunch highlighted the ability to track scrapers from companies like OpenAI, Meta, and Amazon, giving website owners unprecedented insight into their site’s interactions with AI models.
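Cloudflare has not published how the dashboard works under the hood, but the underlying signal is visible to anyone with server logs: the major AI crawlers announce themselves through documented user-agent strings such as GPTBot (OpenAI), CCBot (Common Crawl), and PerplexityBot. As a rough sketch of the same idea, assuming a standard combined-format access log at a hypothetical path and an illustrative, non-exhaustive list of crawler names, a site owner could tally that traffic themselves:

```python
import re
from collections import Counter

# Illustrative (not exhaustive) user-agent substrings for known AI crawlers.
# Each vendor documents its own crawler names; this mapping is an assumption
# made for the example, not a definitive registry.
AI_CRAWLERS = {
    "GPTBot": "OpenAI",
    "CCBot": "Common Crawl",
    "PerplexityBot": "Perplexity",
    "ClaudeBot": "Anthropic",
    "Amazonbot": "Amazon",
}

# In the nginx/Apache combined log format, the user agent is the
# last quoted field on each line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def tally_ai_crawlers(log_path: str) -> Counter:
    """Count requests per AI crawler in a combined-format access log."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = UA_PATTERN.search(line)
            if not match:
                continue
            user_agent = match.group(1)
            for marker, vendor in AI_CRAWLERS.items():
                if marker in user_agent:
                    counts[f"{vendor} ({marker})"] += 1
                    break
    return counts

if __name__ == "__main__":
    # "access.log" is a hypothetical path used for illustration.
    for crawler, hits in tally_ai_crawlers("access.log").most_common():
        print(f"{crawler}: {hits} requests")
```

What AI Audit adds on top of this kind of raw counting, per the demo, is Cloudflare’s own bot identification and the one-click controls tied to it.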
Cloudflare is tackling a major issue facing the AI industry: how smaller publishers can survive when AI tools like ChatGPT gather information from their sites without offering any compensation. Currently, AI model providers scrape content from countless smaller websites to power their language models. While major publishers like TIME, Condé Nast, and The Atlantic have struck licensing deals with OpenAI, most smaller websites are left uncompensated, despite contributing valuable content to these models.
Earlier this year, the issue gained attention when AI-powered search startup Perplexity was accused of scraping websites that had specifically opted out of being crawled through the Robots Exclusion Protocol. In response, Cloudflare introduced a feature allowing customers to block all AI bots with one click, a move Prince says grew out of frustration among creators who felt their content was being exploited.
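The Robots Exclusion Protocol itself is nothing more than a plain-text robots.txt file served at a site’s root, and opting out of AI crawlers means listing their published user agents with a Disallow rule. A minimal example follows; the crawler names are ones the respective companies document, but compliance is voluntary, which is exactly what the Perplexity dispute was about:

```
# robots.txt — ask documented AI crawlers not to crawl anything
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```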
Many website owners have complained that AI bots scrape their sites at volumes high enough to hurt server performance, with some comparing the traffic to DDoS attacks. Beyond the frustration, heavy scraping can drive up cloud costs and degrade service.
Cloudflare’s AI Audit tool aims to address these issues by offering selective control. Website owners can block unwanted AI scrapers, like Perplexity, while allowing others, like OpenAI, to access their content if beneficial.
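Cloudflare has not detailed how AI Audit implements this selective control, but conceptually it maps onto the kind of custom rule Cloudflare customers can already write in its rules language: match a crawler’s self-reported user agent and block only that traffic. A hedged sketch, assuming the crawler identifies itself with its documented "PerplexityBot" string:

```
Expression: (http.user_agent contains "PerplexityBot")
Action:     Block
```

Requests from crawlers that don’t match the expression, such as GPTBot, pass through unaffected; AI Audit’s pitch is to expose this kind of decision as a per-scraper toggle rather than a hand-written rule.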
Even large publishers with content licensing deals have limited insight into how often AI models like ChatGPT scrape their sites. Prince notes that this lack of transparency can affect whether publishers are receiving fair compensation from their licensing agreements.
Cloudflare’s upcoming marketplace will offer small publishers a chance to set their own terms for AI model providers wanting to scrape their content. This opportunity, previously limited to major platforms like Reddit and Quora, could allow smaller sites to strike deals similar to those secured by larger publishers.
While Cloudflare has not yet shared specific details on how this marketplace will function, Prince suggests that websites could charge based on the frequency of scraping or set a monetary price for access. Alternatively, websites could negotiate for attribution rather than payment. The finer points of this system remain to be worked out, and it’s unclear how eager AI model providers will be to pay for content they currently obtain for free.
Despite the potential resistance from AI companies, Prince believes that this approach is ultimately beneficial for the entire AI ecosystem. The current model, where some AI companies don’t compensate content creators at all, is unsustainable, and Cloudflare’s marketplace could represent a significant step toward resolving this imbalance.