Answer and attribution bar on Gist for a question about the new Superman movie, and Prorata chief business officer Annelies Jansen
Prorata’s chief business officer has set out four things the AI start-up needs to prove to publishers as it builds up to a wider launch of its products.
Prorata, which has received investment from Mail owner DMG Media and signed almost 100 publisher agreements, promises to share revenue with creators, split in proportion to how much of each creator’s content is used in each AI-generated answer.
The start-up, which has a team of 80 people globally, has created an AI answer engine called Gist, just launched its first AI answer widget for a publisher, and is building an AI advertising tool to ensure it brings in the revenue it promises to split with creators.
Prorata will keep 50% of the revenue made from advertising alongside its AI answers and the remaining 50% will be shared proportionally among the publishers whose content is used to create the response.
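In code terms, the split Prorata describes works out as follows. This is a minimal sketch of the stated 50/50 model; the revenue figure and the attribution percentages are hypothetical, for illustration only:

```python
def split_revenue(ad_revenue, contributions):
    """Split ad revenue for one AI answer: the platform keeps 50%,
    and the remaining 50% is shared among publishers in proportion
    to each one's attributed contribution to the answer."""
    publisher_pool = ad_revenue * 0.5
    total = sum(contributions.values())
    return {pub: publisher_pool * share / total
            for pub, share in contributions.items()}

# Hypothetical attribution for one answer: 60% / 30% / 10%
payouts = split_revenue(10.00, {"A": 0.6, "B": 0.3, "C": 0.1})
```

At those assumed figures, a $10 answer leaves a $5 publisher pool, paid out as $3, $1.50 and $0.50 respectively.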
Prorata’s 100 publisher agreements give it access to more than 500 titles to create what Jansen described as “arguably the largest database of licensed content used for gen AI answers”.
It uses that database to generate answers that can appear anywhere “within the trusted content world” of its publisher partners who integrate a Gist widget.
Chief business officer Annelies Jansen told Press Gazette one of the major principles Prorata needs to prove is that “AI engagement on a decentralised level is something that users want” and would use across publisher websites, adding that it’s “not a replacement of the search button in the top right corner”.
“This is about bringing new behaviour to existing users, but also using a network to drive users across a network of quality journalism, quality content.”
The second is that it can create a network effect that benefits all the publishers involved equally, via its model of attributing where each part of an AI-generated answer came from.
“In, let’s say mid next year, I want to be able to have a series of use cases that say, on average, publishers who’ve integrated Gist.ai have been reporting plus double-digit growth of engagement on their own media channels. That’s my goal.”
The percentage of content from each publisher is displayed to users through an attribution bar, which Jansen hopes could become an industry standard because society is “owed that transparency”.
The third – which Jansen said “for some people would be number one” – is for Prorata’s AI ad unit to outperform existing ad units.
“Because our case study is plus 20% more engagement and plus 50% more CPC [cost per click], that is incremental to the industry.”
And finally, Jansen said, is the need to prove to publishers that Prorata is “very easy” to work with, making it “the easiest onboarding journey they’ve ever experienced with a tech partner”.
She promised a partner portal “that will be a destination in its own right, where our partners will be able to see not just what is the value of my content, but how’s it being used? Where is it coming from?…
“One of our goals is that in the editorial room you’ll have a new line on your Chartbeat which says traffic from AI and the average revenue per answer is up week on week, because it needs to be embedded in the workflow, and it needs to be super easy.”
Other AI publisher deals ‘not a systematic approach to bringing fair value’
Recalling the thinking of Prorata’s founder, Jansen said: “He thought: ‘This is weird. How is taking an LLM to court or trying to strike a deal going to build a sustainable business model for the whole industry, for all content creators?’”
She continued: “He thought: ‘Well, this is not going to help 99.9% of all people out there, because a deal actually requires you to bring in your legal team, and it’s not a systematic approach to how you bring fair value to the content used in a gen AI answer, and that’s where the idea for Prorata came from.
“The notion that you share revenue with the content creators is not new in the digital age. We see it in Spotify, we see it with YouTube or see it with Apple News, but in gen AI it’s just harder. It’s harder because you can’t count. And so we developed a system to count…
“We believe attribution is necessary to count contribution of original content in a gen AI answer. That’s why attribution is the foundation of our business, to be able to count it so you can share it. And I would say that, in that, we are different from deal makers.”
Publisher AI answer engines
Gist was built as an answer engine as a proof of concept to “prove to the market that attribution can work”, Jansen said, but a lot of the focus is now on building up the tech on publishers’ own sites.
Prorata’s Gist AI answer engine, currently in beta
Adweek launched its AI “companion” answer agent specifically for Cannes Lions in June, which was able to draw on the publisher’s own news updates about the festival as well as lifestyle and travel content from the likes of Complex, Skift and Atlas Obscura to help attendees make the most of their time in France.
Jansen said publishers can display the Gist widget in multiple ways, whether as an “ask me anything”, an article summarisation or related questions, and in various places on an article page.
Jansen said the idea is that people will stay on the sites longer with this interactive experience and that it can “keep them away from” the big tech platforms.
“If your experience in a media channel is not fulfilling, you often go to Google and now OpenAI or ChatGPT, Perplexity – can we, by providing a product to the industry to collaborate, not collaboration through negotiation, but collaboration through product solution, can we actually create an additional layer and network that lives between my own media channel and the big tech general environment?”
Traffic versus engagement question for publishers
But fewer people are going directly to publisher websites in the first place. The latest Reuters Institute Digital News Report found that less than a quarter (22%) of 18 to 24-year-olds globally said news websites and apps were their main source of news compared to 44% relying on social media and video networks. Among 25 to 34-year-olds the split was 26% newsbrands and 38% social media.
Jansen said in response that although Prorata is only working in text currently, they are building the tech to work in video and audio. The company has signed agreements with the likes of Sky and Universal Music Group.
Jansen also said the aim is for the widgets to bring incremental traffic gains for publishers, who could choose whether to prioritise traffic or engagement.
“For some publishers, they actually realise that there is so much incremental usage of their content in answers and the monetisation through the ad brings a significant revenue stream that they are less focused on the click through from the answers… for those publishers who have a subscription product, then that click through to their own channel is much more important.”
Asked about the potential disadvantage of users clicking sources in the answer widget, thereby leaving the publisher site, Jansen said the key will be for product people at both Prorata and the publisher to work together to find a “healthy medium”.
“What we see from our launch last week – related questions at the bottom of an answer, not surprisingly, are a really good way to keep people on site because they’ll spend more time on the site and every time they click on the related prompts, a new page opens up, a new opportunity to keep people inside. That’s also why it is so important that Gist.ai in terms of traffic count, lives within the publisher environment.”
Jansen said that AI answer engines are less satisfying for the user when they are based on only one publisher’s content because “in general, their questions are broader than what the publisher actually has to offer, even if you have an archive going back 150 years”.
Instead Prorata is trying to encourage publishers to be open to collaboration. Under the agreements they have been signing, a Gist answer widget on one publisher’s website can show content sourced from hundreds of other providers.
For this reason, publishers signing agreements with Prorata must meet its content quality standards so media companies do not unknowingly show untrustworthy information as a result of their mutual association.
“We are not afraid to publish an answer with contribution from other people in the ecosystem,” Jansen said. “As you can tell, I’m trying to avoid the word competitor because I think these publishers are not competing with each other anymore. They’re actually competing with big tech.”
She also said: “Every publisher on their own trying to solve for this will make it much tougher to turn the tide.”
But Prorata changed strategy, Jansen said, upon thinking: “why don’t we just go back to what publishers really understand, which is advertising”.
Prorata AI ad unit: Currently building supply and demand
Prorata is building an agentic AI ad unit that creates ads from material uploaded by advertisers and that also decides how and where the ads should show up alongside relevant content.
Jansen said this means the adverts are created “on the fly in real time, which is highly contextual, because it has the semantic understanding of both sides of the spectrum”.
These adverts will be able to appear on every Gist answer. They are undergoing a pilot in the US currently and will roll out in the US and UK in September. DMG Media, Time, Fortune, Lee Enterprises, Adweek and The Arena Group are also testing the ads on their own platforms.
In the meantime, Jansen said, they are working on building supply and demand – speaking to both publishers and advertisers at the recent Cannes Lions festival for example – so they can get over the “cold start problem” and create an ad business that “generates substantial money for publishers”.
Why Prorata says it signs agreements, not deals
Jansen opposes the use of the word “deals” in relation to Prorata’s agreements with publishers, saying “a deal’s a negotiation” whereas Prorata, which does not pay any money upfront but pins everything on the revenue share, has the same terms for all its partners.
Prorata was a partner at last month’s PPA Awards in London where it had a full-page advert in the programme stating: “They scrape and steal. We credit and compensate.”
Jansen said they have been deciding internally “how aggressive should we be in highlighting that your content is being stolen every day?… With the acceleration of the growth of users for LLMs, and no real solution yet to stop that scraping without proper compensation, we’ve actually become a little bit more outspoken, I would say, in our messaging.”
She added: “I think it is important to wake up or continue to let the industry know that if we don’t do anything, it’s not going to go away. This is not one where you think it’s going to be solved by the Government, or you think it’s going to be solved by the regulator, and I as a publisher could just continue to go on and do what I do.
“This needs to be solved by awareness and education on how you can actually participate in the incremental engagements. And so we make it easy to work with us, it’s for the benefit of the publishing ecosystem. We’re here to bring more money to the publishers.”
Despite her opposition to the deals, Jansen said they have actually helped foster an understanding on two key points.
She said they “create awareness for new user engagement tools, and also create realisation that the deals are only there for the happy few. The deals are not licensing deals. The deals are get-out-of-jail deals…”
Jansen said building an AI website chatbot often appeared to be the easiest thing to do with the tech credits that come with those deals, and claimed this showed publishers that using only their own body of content is “often not enough”.
“So we’ve seen examples of publishers with deals who have added content to their corpus, who have gone out and said ‘I want to license your content’ or simply… have added Wikipedia.”
Input versus output
Something else Jansen said is often misunderstood is that Prorata’s licensing and attribution system relates only to output – not to input (AI training and scraping), which most publishers oppose when done without permission.
“What is not helpful is that usually the two use cases, input and output, are put by journalists into one problem to solve for. And I think that’s not helpful for the regulators. It’s not helpful for anybody that needs to make a decision in the industry.
“Because if you think it’s one problem, then you’re also looking for one solution. And if you’re able to decouple them and say, okay, there is one problem to solve for which is my content has been stolen and used for training, that’s an input problem, it’s stolen. How do I get value for whatever it’s stolen?”
There’s been much talk recently – especially among politicians – about productivity. And for good reason: Australia’s labour productivity growth sits at a 60-year low.
To address this, Prime Minister Anthony Albanese has convened a productivity round table next month. This will coincide with the release of an interim report from the Productivity Commission, which is looking at five pillars of reform. One of these is the role of data and digital technologies, including artificial intelligence (AI).
But what do we really know about how AI impacts productivity?
What is productivity?
Put simply, productivity is how much output (goods and services) we can produce from a given amount of inputs (such as labour and raw materials). It matters because higher productivity typically translates to a higher standard of living. Productivity growth has accounted for 80% of Australia’s income growth over the past three decades.
Productivity can be thought of as individual, organisational or national.
Your individual productivity is how efficiently you manage your time and resources to complete tasks. How many emails can you respond to in an hour? How many products can you check for defects in a day?
Organisational productivity is how well an organisation achieves its goals. For example, in a research organisation, how many top-quality research papers are produced?
National productivity is the economic efficiency of a nation, often measured as gross domestic product per hour worked. It is effectively an aggregate of the other forms. But it’s notoriously difficult to track how changes in individual or organisational productivity translate into national GDP per hour worked.
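All three levels reduce to the same ratio of output to input. A minimal sketch with hypothetical figures (the email counts, GDP total and hours worked below are illustrative assumptions, not official statistics):

```python
def labour_productivity(output, hours_worked):
    """Productivity as output produced per unit of labour input."""
    return output / hours_worked

# Individual level: 40 emails handled in an 8-hour day
emails_per_hour = labour_productivity(40, 8)

# National level: GDP per hour worked, e.g. an assumed $2.5tn
# economy produced with 22 billion hours of labour
gdp_per_hour = labour_productivity(2.5e12, 22e9)
```

At these assumed figures, the individual rate is 5 emails per hour and the national figure is roughly $114 of GDP per hour worked.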
AI and individual productivity
The nascent research examining the relationship between AI and individual productivity shows mixed results.
A 2025 real-world study of AI and productivity involved 776 experienced product professionals at US multinational company Procter & Gamble. It found that individuals randomly assigned to use AI performed as well as a two-person team working without it. A similar study in 2023 with 750 consultants from Boston Consulting Group found tasks were 18% faster with generative AI.
A 2023 paper reported on an early generative AI system in a Fortune 500 software company used by 5,200 customer support agents. The system showed a 14% increase in the number of issues resolved per hour. For less experienced agents, productivity increased by 35%.
But AI doesn’t always increase individual productivity.
A survey of 2,500 professionals found generative AI actually increased workload for 77% of workers. Some 47% said they didn’t know how to unlock productivity benefits. The study points to barriers such as the need to verify and/or correct AI outputs, the need for AI upskilling, and unreasonable expectations about what AI can do.
A recent CSIRO study examined the daily use of Microsoft 365 Copilot by 300 employees of a government organisation. While the majority self-reported productivity benefits, a sizeable minority (30%) did not. Even those workers who reported productivity improvements expected greater productivity benefits than were delivered.
Prime Minister Anthony Albanese has convened a productivity round table in August. Lukas Coch/AAP
AI and organisational productivity
It’s difficult, if not impossible, to attribute changes in an organisation’s productivity to the introduction of AI. Businesses are sensitive to many social and organisational factors, any one of which could be the reason for a change in productivity.
Nevertheless, the Organisation for Economic Co-operation and Development (OECD) has estimated the productivity benefits of traditional AI – that is, machine learning applied for an industry-specific task – to be zero to 11% at the organisational level.
A 2024 summary paper cites independent studies showing increases in organisational productivity from AI in Germany, Italy and Taiwan.
In contrast, a 2022 analysis of 300,000 US firms didn’t find a significant correlation between AI adoption and productivity, but did for other technologies such as robotics and cloud computing. Likely explanations are that AI hasn’t yet had an effect on many firms, or simply that it’s too hard to disentangle the impact of AI given it’s never applied in isolation.
AI productivity increases can also sometimes be masked by the additional human labour needed to train or operate AI systems. Take Amazon’s Just Walk Out technology for shops, which reportedly depended on human reviewers checking transactions behind the scenes.
More generally, think about the unknown number (but likely millions) of people paid to label data for AI models.
Amazon’s Just Walk Out technology intended to reduce labour as customer purchases would be fully automated. John G. Mabanglo/EPA
AI and national productivity
The picture at a national level is even murkier.
Clearly, AI hasn’t yet impacted national productivity. It can be argued that technology developments take time to affect national productivity, as companies need to figure out how to use the technology and put the necessary infrastructure and skills in place.
The common narrative around AI and productivity is that AI automates mundane tasks, making us faster at doing things and giving us more time for creative pursuits. This, however, is a naive view of how work happens.
Just because you can deal with your inbox more quickly doesn’t mean you’ll spend your afternoon on the beach. The more emails you fire off, the more you’ll receive back, and the never-ending cycle continues.
Imagine a world in which AI isn’t simply about speeding up tasks but proactively slows us down, to give us space to be more innovative, and more productive. That’s the real untapped opportunity with AI.
Watch: Beverly Morris flushes her toilet using a bucket because of low water pressure
When Beverly Morris retired in 2016, she thought she had found her dream home – a peaceful stretch of rural Georgia, surrounded by trees and quiet.
Today, it’s anything but.
Just 400 yards (366m) from her front porch in Mansfield, Georgia, sits a large, windowless building filled with servers, cables, and blinking lights.
It’s a data centre – one of many popping up across small-town America, and around the globe, to power everything from online banking to artificial intelligence tools like ChatGPT.
“I can’t live in my home with half of my home functioning and no water,” Ms Morris says. “I can’t drink the water.”
She believes the construction of the centre, which is owned by Meta (the parent company of Facebook), disrupted her private well, causing an excessive build-up of sediment. Ms Morris now hauls water in buckets to flush her toilet.
She says she had to fix the plumbing in her kitchen to restore water pressure. But the water that comes out of the tap still has residue in it.
“I’m afraid to drink the water, but I still cook with it, and brush my teeth with it,” says Ms Morris. “Am I worried about it? Yes.”
Meta, however, says the two aren’t connected.
In a statement to the BBC, Meta said that “being a good neighbour is a priority”.
The company commissioned an independent groundwater study to investigate Morris’s concerns. According to the report, its data centre operation did “not adversely affect groundwater conditions in the area”.
While Meta disputes that it has caused the problems with Ms Morris’s water, there’s no doubt, in her estimation, that the company has worn out its welcome as her neighbour.
“This was my perfect spot,” she says. “But it isn’t anymore.”
Huge data centres are being built across the state of Georgia
We tend to think of the cloud as something invisible – floating above us in the digital ether. But the reality is very physical.
The cloud lives in over 10,000 data centres around the world, most of them located in the US, followed by the UK and Germany.
With AI now driving a surge in online activity, that number is growing fast. And with them, more complaints from nearby residents.
The US boom is being challenged by a rise in local activism – with $64bn (£47bn) in projects delayed or blocked nationwide, according to a report from pressure group Data Center Watch.
And the concerns aren’t just about construction. It’s also about water usage. Keeping those servers cool requires a lot of water.
“These are very hot processors,” Mark Mills of the National Center for Energy Analytics testified before Congress back in April. “It takes a lot of water to cool them down.”
Many centres use evaporative cooling systems, where water absorbs heat and evaporates – similar to how sweat wicks away heat from our bodies. On hot days, a single facility can use millions of gallons.
One study estimates that AI-driven data centres could consume 1.7 trillion gallons of water globally by 2027.
Few places illustrate this tension more clearly than Georgia – one of the fastest-growing data centre markets in the US.
Its humid climate provides a natural and more cost-effective source of water for cooling data centres, making it attractive to developers. But that abundance may come at a cost.
Gordon Rogers is the executive director of Flint Riverkeeper, a non-profit advocacy group that monitors the health of Georgia’s Flint River. He takes us to a creek downhill from a new construction site for a data centre being built by US firm Quality Technology Services (QTS).
George Dietz, a local volunteer, scoops up a sample of the water into a clear plastic bag. It’s cloudy and brown.
“It shouldn’t be that colour,” he says. To him, this suggests sediment runoff – and possibly flocculants. These are chemicals used in construction to bind soil and prevent erosion, but if they escape into the water system, they can create sludge.
QTS says its data centres meet high environmental standards and bring millions in local tax revenue.
While construction is often carried out by third-party contractors, local residents are the ones left to deal with the consequences.
“They shouldn’t be doing it,” Mr Rogers says. “A larger wealthier property owner does not have more property rights than a smaller, less wealthy property owner.”
Tech giants say they are aware of the issues and are taking action.
“Our goal is that by 2030, we’ll be putting more water back into the watersheds and communities where we’re operating data centres, than we’re taking out,” says Will Hewes, global water stewardship lead at Amazon Web Services (AWS), which runs more data centres than any other company globally.
He says AWS is investing in projects like leak repairs, rainwater harvesting, and using treated wastewater for cooling. In Virginia, the company is working with farmers to reduce nutrient pollution in Chesapeake Bay, the largest estuary in the US.
In South Africa and India – where AWS doesn’t use water for cooling – the company is still investing in water access and quality initiatives.
In the Americas, Mr Hewes says, water is only used on about 10% of the hottest days each year.
Still, the numbers add up. A single AI query – for example, a request to ChatGPT – can use about as much water as a small bottle you’d buy from the corner shop. Multiply that by billions of queries a day, and the scale becomes clear.
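That back-of-the-envelope arithmetic can be made explicit. The figures below are assumptions drawn loosely from the comparison above – a 500ml bottle per query and a billion queries a day – not measured data:

```python
ML_PER_QUERY = 500                # assumed: one small bottle per AI query
QUERIES_PER_DAY = 1_000_000_000   # assumed: one billion queries a day

litres_per_day = ML_PER_QUERY * QUERIES_PER_DAY / 1000

# An Olympic swimming pool holds roughly 2.5 million litres
olympic_pools_per_day = litres_per_day / 2_500_000
```

Under those assumptions, daily usage comes to 500 million litres – about 200 Olympic swimming pools of water every day.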
Gordon Rogers takes regular water samples to monitor the health of Georgia’s Flint River
Prof Rajiv Garg teaches cloud computing at Emory University in Atlanta. He says these data centres aren’t going away – if anything, they’re becoming the backbone of modern life.
“There’s no turning back,” Prof Garg says.
But there is a path forward. The key, he argues, is long-term thinking: smarter cooling systems, rainwater harvesting, and more efficient infrastructure.
In the short term, data centres will create “a huge strain”, he admits. But the industry is starting to shift toward sustainability.
And yet, that’s little consolation to homeowners like Beverly Morris – stuck between yesterday’s dream and tomorrow’s infrastructure.
Data centres have become more than just an industry trend – they’re now part of national policy. President Donald Trump recently vowed to build the largest AI infrastructure project in history, calling it “a future powered by American data”.
Back in Georgia, the sun beats down through thick humidity – a reminder of why the state is so attractive to data centre developers.
For locals, the future of tech is already here. And it’s loud, thirsty, and sometimes hard to live next to.
As AI grows, the challenge is clear: how to power tomorrow’s digital world without draining the most basic resource of all – water.
Correction: This article originally said that Beverly Morris lives in Fayette County, Georgia, and has been amended to explain that she lives in Mansfield, Georgia.
Opinions expressed by Entrepreneur contributors are their own.
We’re witnessing a pivotal moment in the evolution of search. Search engine optimization (SEO) has become more complex and dynamic than ever as Google’s Search Generative Experience (SGE) and other AI-powered summary tools become the face of the search experience.
With the rise of AI and social media platforms as primary search channels, traditional SEO tactics are falling short. If AI summaries become the new gatekeepers of online discovery, your brand’s visibility depends on more than just ranking on page one. You’ll need to optimize for how these algorithms synthesize, repurpose and favor content. That means prioritizing credibility, clarity and domain relevance.
In this regard, 2025 is shaping up to be a turning point. As the SEO landscape shifts, brands need to rethink everything from their domain strategy to their presence in AI-generated search results to stay competitive. Ultimately, if your brand isn’t seen as a clear expert in your field, you risk becoming invisible online.
Disappear or adapt: Why you need to invest in organic AI optimization
As AI-driven search continues to evolve, brands will face a choice: Invest in more intelligent, AI-optimized SEO or become increasingly overlooked in search results. Brands are confronting heightened competition for limited visibility within AI-generated results. In response, forward-thinking brands are approaching AI search as a distinct optimization channel.
This approach requires updating the website structure and content to align with how AI systems parse information. As a result, brands will want to make fresh content part of their SEO strategy. This involves regularly updating cornerstone pages, refreshing stats and maintaining an active publishing cadence because AI craves relevance and recency. On the technical side, they’ll also need to invest in optimizing their sites with structured data, schema markup and clear metadata to make content easier for AI models to understand, surface and cite.
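As a minimal illustration of the kind of structured data involved, here is schema.org Article markup expressed as JSON-LD – the format search engines and AI crawlers parse from a page’s `<script type="application/ld+json">` tag. The headline, dates and author below are placeholders:

```python
import json

# Build a minimal schema.org Article object; all values here are
# placeholders standing in for a real page's metadata.
markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2025-01-15",
    "dateModified": "2025-06-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Serialise to the JSON-LD string that would be embedded in the page
json_ld = json.dumps(markup, indent=2)
```

Keeping `dateModified` current is one concrete way to signal the freshness that AI-driven search appears to reward.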
Your domain name might be holding you back
One of the easiest ways to stand out in AI-generated search is by leveraging a strategic domain name. In an AI-powered ecosystem, short, descriptive and memorable domains can provide an edge by standing out, signaling relevance and credibility to both prospective customers and algorithms.
By adopting a domain closely aligned with the interests of your target audience, you’re helping generative AI search better identify the purpose of your website, while strengthening the authority and clarity of your services for AI.
Where social media search comes into play
Today, social media platforms like TikTok and Instagram are channels where people — especially Gen Z — begin their search journeys. Why? They want to see a product, hear about it and watch someone use it.
To meet this increase in social search, work to align your SEO, marketing and social media strategies around shared messaging and content. Starting this July, Instagram will allow public posts to be indexed by search engines. Brands that treat social media content as a standalone channel, separate from SEO, may miss out on this discoverability opportunity. An integrated, cross-platform strategy reinforces your authority across all discovery channels, AI included.
But here’s the wildcard: with more discussion around regulation and algorithm shifts, social media platforms are also becoming increasingly unpredictable. So what happens if platforms get banned for certain users or decline in popularity? Will more consumers default back to Google and Amazon? The answer isn’t clear, but one thing is: Those that align and optimize for visibility across all search channels will be better positioned for success.
The future of search revolves around clarity, credibility and relevance
At its core, SEO has always centered around making your brand easier to discover. But in this new age of AI and social-driven discovery, clarity, credibility and relevance matter more than ever.
That’s why businesses need to treat their digital identity and everything it touches — including their domain, content and brand messaging — as a holistic ecosystem. Your domain name should reflect who you are. Your content should prove what you know. And your online presence should signal relevance, credibility and authority to machines and humans alike.
The brands that thrive in this new search era will be the ones that adapt quickly, invest smartly and make their digital identities crystal clear.