Software engineer Jeff Schomay has been experimenting with techniques to transform game visuals using artificial intelligence-powered rendering. In a recent blog post, he shared screens from Thunder Lizard, his “old-school ASCII RPG style game,” showing various AI models reconstructing the title’s basic character-based visuals as far plusher full-motion graphics. However, quite a few compromises were needed to achieve a ‘playable’ AI-enhanced game at 10 fps and 1 ms latency.
The current game
You can play Thunder Lizard, using the current ASCII rogue-like engine visuals, right now, via that link. In brief, you use the cursor keys to navigate the prehistoric landscape, attempting to catch and eat smaller dinosaurs while avoiding larger ones. You grow as you eat smaller prey, but you must flourish enough to become the most powerful dinosaur before the volcano destroys the entire landscape.
Wouldn’t this game be more fun with snazzy pixel art visuals? Schomay must have thought so, too. And you can see various samples of AI-rendered alternatives to the ASCII throughout his blog post.
Cake and eat it – not possible
As Schomay put it in his blog post, there is a “need for speed” in gaming that isn’t yet met by many AI renderers, at least among those available to indie devs.
We get to see some fantastic-looking prehistoric landscape renders, which would really add something to Thunder Lizard. However, the most handsome of the AI game graphics renderers came in at “a four-second latency, it was unplayable,” as Schomay put it.
The compromise
You can read much more about Schomay’s efforts with various AI models in the linked blog, but for actual playability the choice was ultimately Fal.ai. This model beat the other contenders for its “very fast generation times, source image adherence, and decent look and feel,” according to the software engineer.
Though the resulting (AI) engine manages just 10 fps and 1 ms latency, it was the best option in this case, according to Schomay’s testing.
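The numbers tell the story under a simple, synchronous model (the functions below are an illustrative sketch, not Schomay’s code): if the game blocks on the renderer for every frame, the renderer’s per-frame latency caps the achievable frame rate.

```python
# Illustrative only: relate per-frame AI render latency to achievable frame
# rate in a simple synchronous pipeline (generate a frame, display it, repeat).

def frame_budget_ms(target_fps: float) -> float:
    """Time available to produce each frame at a given target frame rate."""
    return 1000.0 / target_fps

def max_synchronous_fps(render_latency_ms: float) -> float:
    """If every frame blocks on the renderer, latency caps the frame rate."""
    return 1000.0 / render_latency_ms

# A 10 fps target leaves a 100 ms budget per frame.
print(frame_budget_ms(10))        # 100.0
# A renderer with four-second latency manages one frame every four seconds.
print(max_synchronous_fps(4000))  # 0.25
```

Under that assumption, a four-second latency yields a quarter of a frame per second, while hitting 10 fps demands each frame inside a 100 ms budget, which is why only the fastest model was a playable choice.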
Pixel peepers might feel a little let down by the AI visuals that eventually won out. However, Schomay ends his blog on an upbeat note, stating that “it is amazing to see AI image generation happen in real-time.” He is also excited about the future, though he hopes strides can be made in frame-to-frame consistency.
We’ve previously reported on the use of AI in developing and enhancing games and graphics. Nvidia, for example, shared details of its RTX-reliant, AI-enhanced development tools earlier this year.
A group of Democratic lawmakers announced the breakthrough to their colleagues four days into the recent special session — a tentative agreement struck to rewrite the state’s much-contested artificial intelligence regulations that would put an end to more than a year of protracted debate.
The Sunday night deal, reached after days of uncertainty during a session otherwise focused on the state’s budget, would provide a path toward firmer tech oversight.
But by early the next afternoon, the framework had collapsed. Major tech companies and a leading business group hadn’t agreed to a key provision that would hold companies liable if their AI-powered products were responsible for discriminating against people who, say, had applied for a job or a loan. That lawmakers had announced a deal came as a shock.
Anger and intensive lobbying followed. Some groups that had supported the framework began to back away, just hours after the negotiations had appeared to be reaching the finish line. When one Democratic lawmaker called a lobbyist friend to talk about the agreement, the lobbyist replied: “There is no (expletive) deal.”
“The screaming was palpable,” said Loren Furman, the president and CEO of the Colorado Chamber of Commerce, which had been negotiating with Google and another lawmaker. Neither Google nor the chamber had agreed to the framework.
Last week’s deal-that-wasn’t renewed concerns about the influence of major technology companies in the state Capitol, alongside broader frustrations that lawmakers and business groups had been tasked — under pressure from Gov. Jared Polis — with regulating a massive industry in a matter of days.
For progressive lawmakers and their allies, there were echoes of the regular session in the spring, when lawmakers overwhelmingly passed legislation regulating other tech sectors — social media, autonomous vehicles, ride-hailing apps — only for Polis to step in and veto them.
“I believe that the governor’s office, and the influence that the tech industry has over the governor’s office, is a huge problem — particularly when our constituents are rightfully and incredibly concerned about the influence of AI,” said Rep. Javier Mabrey, a Denver Democrat who was involved in the special session’s AI negotiations.
Last week, instead of rewriting the state’s current AI rules, which would have taken effect in February without any intervention, lawmakers simply delayed them. They’ll now take effect on June 30, more than two years after they were signed into law — an action that had set off immediate calls for changes to them by tech companies, government entities that use AI, and by Polis, who’d half-heartedly signed the bill.
The delay means lawmakers will, once again, take up the issue in the coming months, now that they’ve bought themselves more time.
The regulations, as now written, will require risk assessments and disclosures from companies that develop and use AI. Amid a tug-of-war over how the rules should be amended, the potential framework that emerged early last week demonstrated that there was likely a path out of the gridlock. Negotiators on either side were optimistic about the next round of talks.
Other lawmakers and negotiators, though, argued that the technology companies and the chamber were key groups that needed to be more directly involved in approving the terms of the deal before it was announced.
Furman said the technology sector is a large and growing part of the state’s economy; policymakers can’t close the state off to those companies, she argued. She and others objected to the notion that there was ever a deal, given that they hadn’t agreed to it, and criticized the anti-tech comments from other lawmakers.
“What I’m learning in this building is that you have to stakehold with all organizations, and if there are people who are not at the table, then when you think you have a deal, you might not,” said Rep. Michael Carter, an Aurora Democrat. He’d also sponsored an AI bill and was involved in the final negotiations.
Gov. Jared Polis, left, signs bills passed during the special legislative session at a media event where he also announced actions to help close a budget deficit caused by the federal tax bill, in the Governor’s Residence at Boettcher Mansion in Denver on Thursday, Aug. 28, 2025. Next to him is Mark Ferrandino, director of the Office of State Planning and Budgeting. (Photo by Hyoung Chang/The Denver Post)
Regulating tech
Lawmakers in the legislature’s Democratic majorities broadly agree that companies should be held responsible if they make or use AI that discriminates against people during resume screenings, reviews of loan applications or health care decisions.
And legislators argue that the August negotiations came close to figuring out how to do that without running afoul of Polis’ desire to protect Big Tech from regulations that, he has said, may hurt Colorado’s ambitions of becoming an industry hub.
Polis, whose office declined an interview request, told reporters Thursday that his recent tech-friendly vetoes — and the ongoing debate about AI regulations — showed that such policymaking was “complicated and it’s important to get it right.”
A frequent AI user, Polis made his fortune in the early dot-com rush. Earlier this year, he supported a tech-backed effort in Congress that would have blocked states from regulating AI for the next decade.
“Coloradans are forward-looking (and) want to be on the cutting edge of adapting different technologies in our daily lives and work lives,” Polis said. “And at the same time, we want to be protected from discrimination. We want to be safe when we’re taking a car somewhere.”
But AI systems aren’t so complicated that they can’t be regulated, said Suresh Venkatasubramanian, a Brown University professor who has called for tighter, consumer-focused guardrails on artificial intelligence and who co-authored “A Blueprint for an AI Bill of Rights” for former President Joe Biden’s administration.
State legislators don’t have armies of staff members to talk through the particulars of artificial intelligence or algorithmic decision-making, he said. They’re often overwhelmed by better-financed lobbyists and tech companies that, for all their futurist ideals, are ultimately businesses trying to maximize profits.
Late this month, when lawmakers returned to Denver to discuss the regulations, more than 150 lobbyists and firms registered a position on the bill, with companies like Google, Amazon and Workday as well as national tech groups all present.
The companies “have a lot of power to show up with a couple of key lobbyists and to overwhelm the abilities that the legislatures have to think about this,” Venkatasubramanian said, arguing that the companies’ strategy is to delay and prevent. “And they’ve done this successfully a number of times. They come with full force, saying, ‘We’re tech companies, you need to believe us or else we’ll leave your state.’ People don’t know how to respond to that.”
That’s what happened when Rep. Jenny Willford, a Northglenn Democrat, sponsored a bill this year to require more safety measures of ride-hailing companies like Uber and Lyft. Willford has said she sponsored the bill because, she says, she was sexually assaulted by a Lyft driver last year.
The companies said the bill’s provisions couldn’t be implemented. Uber threatened to leave the state if the bill passed, and it asked drivers and users who opened its app to write to their lawmakers to oppose it. Uber had also made the same threat in Minneapolis over a proposed minimum wage increase.
“The bottom line is that this bill was ultimately about accountability and liability,” said Willford, who was the lawmaker who was told by a lobbyist that no AI deal had been reached. “Them having liability for the harm that they’ve caused and the actions they did not take in order to keep people safe. We saw that with the AI bill this time around.”
Sen. Lindsey Daugherty, an Arvada Democrat who sponsored the vetoed bill that would have regulated social media companies, said her proposal was more about protecting kids than cracking down on tech companies. Still, she said, its veto underscored that tech bills had an easier path to becoming law if the industry and Polis were on your side.
State Rep. Jenny Willford speaks about her bill, HB25-1291, with the support of Rep. Meg Froelich, left, during the last day of the 2025 regular legislative session in the Colorado State Capitol in Denver on May 7, 2025. (Photo by Helen H. Richardson/The Denver Post)
Weighing compliance, discrimination risks
Furman, the state chamber’s CEO, argued that some tech bills had to be vetoed because companies fundamentally couldn’t comply with them. As for AI companies, she said they weren’t opposed to negotiating liability — including who should bear the legal cost if an AI system used by a company or a hospital is found to discriminate against a job applicant or a patient.
But she said they couldn’t reach an agreement during a short special session.
“It takes a surgical approach to figure out how to get your parties on the same side on this,” she said. “If we were in a regular session, we probably could’ve gotten there.”
The task of rewriting the state’s AI rules in a week was likely doomed from the start, agreed Senate Majority Leader Robert Rodriguez, who sponsored the initial rules and the recent attempt to rewrite them. He said tech companies were reasonable with him during negotiations, but the lobbying fight had been vicious.
Other lawmakers and negotiators were frustrated that the AI bill was part of the special session at all.
“I was ready to do the work, and conversations were happening,” Rodriguez said. “But once you put it on the special session calendar, then you have to come up with policy (and) start trying to negotiate in a short period of time.”
The need for the regulations is real, Venkatasubramanian said, even if tech companies will wield considerable muscle to block them. Artificial intelligence systems and automated screening programs are prominent across various services in the United States, and they’re trained on data from an imperfect society.
Bias may not even be intentional: He pointed to a 2019 paper about a common medical algorithm used to determine access to high-risk treatment. Researchers found that it “routinely lets healthier whites into the programs ahead of blacks who are less healthy,” according to the University of Chicago.
That was because the model was built on data that based the need for future treatment on the cost of prior treatment, Venkatasubramanian explained. Because white people had access to more expensive treatment for a variety of systemic reasons, the model picked up on that pattern and directed them toward better care.
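That proxy effect can be reproduced in a tiny, hypothetical simulation (the groups, severities, and spending factors below are invented for illustration and are not from the 2019 study): when one group historically spends less at the same severity of illness, ranking patients by predicted cost admits a healthier patient from the higher-spending group ahead of a sicker patient from the other.

```python
# Illustrative simulation of proxy-variable bias: a program that ranks patients
# by historical *cost* (standing in for a model trained on cost) instead of by
# actual illness severity. All numbers here are invented for illustration.

# Group "B" historically spends less than group "A" at the same severity,
# standing in for unequal access to expensive care.
SPEND_FACTOR = {"A": 1.0, "B": 0.6}

def historical_cost(group: str, severity: int) -> float:
    """Past treatment cost, used as a proxy label for future need."""
    return severity * SPEND_FACTOR[group]

patients = [("A", 5), ("B", 7), ("A", 3), ("B", 9)]  # (group, severity)

# Admit the two highest-ranked patients under each criterion.
admit_by_cost = sorted(patients, key=lambda p: historical_cost(*p), reverse=True)[:2]
admit_by_need = sorted(patients, key=lambda p: p[1], reverse=True)[:2]

print(admit_by_cost)  # [('B', 9), ('A', 5)] -- the severity-7 patient is skipped
print(admit_by_need)  # [('B', 9), ('B', 7)]
```

Ranking by need would admit the two sickest patients; ranking by the cost proxy bumps the severity-7 patient from the lower-spending group in favor of a healthier, higher-spending one, the same pattern the researchers observed.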
The bias in AI systems and algorithms has been studied and understood for nearly two decades, he said. Mitigating that risk was the goal of labor unions, advocacy groups and progressive lawmakers during the special session.
They said they wanted to give users of those services the ability to see which data informed AI’s decisions about them, to correct it when it was wrong, and to pursue lawsuits if the artificial intelligence system or the company that deployed it used the technology to discriminate.
The bill also would’ve cut some of the regulatory requirements from the state’s current regulations.
“This is going to be one of the most important public policy fights in our generation,” said David Seligman, a Democratic attorney general candidate whose nonprofit law firm was involved in the AI negotiations. “And this is going to be the shape of it: whether we can hold these companies accountable, whether we’re going to share in that wealth.”
Those negotiations will now continue into next year. Despite raw nerves from last week, officials from all sides were optimistic.
Rep. Brianna Titone, an Arvada Democrat who also sponsored the legislation, said she’d already taken the first steps to start drafting a new bill. The Colorado Technology Association said the delay provided “the opportunity to work collaboratively on practical solutions that strengthen consumer trust, safeguard jobs, and preserve Colorado’s competitiveness.”
Furman said the competing interests needed an “adult in the room” — a mediator — to help them work through the liability disagreement. But she said ground had been gained elsewhere, and she argued that Big Tech firms weren’t flatly opposed to regulation.
“If we’d spent five months working through this, we might get there,” she said. “But people want to have a narrative for political reasons, a lot of times, and they don’t want to pay attention to some of the areas where there could be consensus — and (instead) they point the finger and blame.”