Surging utility bills linked to artificial intelligence data centers would get a closer look from a trio of federal agencies under a new bipartisan bill in the House.
The Unleashing Low-Cost Rural AI Act from Reps. Jim Costa, D-Calif., and Blake Moore, R-Utah, would require the Energy, Interior and Agriculture departments to examine the effect AI data center buildouts are having on rural America.
“AI Data Centers are expanding rapidly and using more energy and water than entire cities. That energy demand is driving up utility costs for consumers,” Costa said in a press release Thursday. “My legislation ensures we take a hard look at how this growth impacts rural communities that are powering the AI industry, and make sure families aren’t left paying the price.
“But at the same time,” he continued, “it’s important that rural communities are not left behind in the new opportunities that AI data centers will provide for agricultural sciences and an improved ability to compete in this modern era.”
The rapid construction of AI data centers across the country — especially in rural areas — has led to a spike in energy demand that has dramatically driven up utility costs for consumers. The lawmakers’ press release cited a stat from PJM — the world’s largest energy market, spanning 13 states — that said data centers have led to an additional $9.3 billion in costs for ratepayers.
The AI Action Plan released by President Donald Trump in July featured several callouts to the importance of expanded energy capacity through streamlined permitting and fewer environmental regulations. The plan also sought to make federal lands “available for data center construction and the construction of power generation infrastructure for those data centers.”
Moore said in the press release that Utah is “a prime location” for AI infrastructure and data centers, but “cementing” the state’s innovation bona fides “will require identifying rural areas ready for data expansion, streamlining permitting for new energy projects, and promoting the co-location of data centers with energy facilities.”
“These efforts will power our growing digital demands without passing costs on to families,” he added. “I’m grateful to partner with Representative Costa to introduce the Unleashing Low-Cost Rural AI Act to identify other areas of the country, like Utah, that will advance solutions to meet our energy needs.”
Under the bill, the Energy, Interior and Agriculture departments would team up to study the impact of AI data center expansions in rural parts of the country, in addition to identifying areas that appear to be strong candidates for tech expansion. They would also assess the impact data center expansion might have on consumer costs, as well as on energy supply and reliability.
The agencies would also be charged with examining ways current energy infrastructure could be upgraded to allow AI data centers to coexist alongside power facilities. The bill also calls for reviews of nuclear and geothermal energy, solar, wind and hydro power, battery storage, and carbon capture.
According to a piece published last month in the Tech Policy Press, global energy use by data centers has jumped 12% annually over the past seven years, with projections that it will more than double by 2030.
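Those two figures are mutually consistent, and a quick compound-growth calculation (a rough sanity check, not from the article) shows why:

```python
import math

# 12% annual growth, the rate cited for data center energy use
rate = 0.12

# Doubling time under compound growth: solve (1 + r)^t = 2 for t
doubling_time = math.log(2) / math.log(1 + rate)
print(f"Doubling time at 12%/yr: {doubling_time:.1f} years")  # ~6.1 years

# Compounded over the seven-year window the article references
growth_factor = (1 + rate) ** 7
print(f"Growth over 7 years: {growth_factor:.2f}x")  # ~2.21x
```

At that pace, "more than double by 2030" follows directly from 12% annual growth simply continuing.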
“As providers of the largest and most compute-intensive AI models keep adding them into more and more aspects of our digital lives with little regard for efficiency (and without giving users much of a choice), they grow increasingly dependent on a growing share of the existing energy and natural resources, leading to rising costs for everyone else,” the authors warned.
AI is everywhere, it can be overwhelming, and lots of folks will be sick of hearing about it. But it’s also important to continue to recognize where AI can make a real difference, including in helping our understanding of the universe.
That’s exactly what’s been happening at Oxford University, one of the UK’s most respected academic centers. A new tool built by its researchers is enabling them to find “the needles in a cosmic haystack” while significantly reducing the workload on the scientists conducting the research.
Specifically, the researchers have presented an AI-powered tool that helps astronomers find supernovae by efficiently combing through hundreds of signals per day that would ordinarily take hours of manual work to sift through.
Instead, this new AI-powered approach reduces the human workload by as much as 85%, while maintaining an outstanding accuracy record and freeing scientists to make better use of their time and their minds.
The Virtual Research Assistant is efficient and accurate and reduces the load on the astronomers who would ordinarily have processed the data manually. (Image credit: Getty Images | Javier Zayas Photography)
“The new tool, called the Virtual Research Assistant (VRA), is a collection of automated bots that mimics the human decision-making process by ranking alerts based on their likelihood of being real, extragalactic explosions. Unlike many AI-automated approaches that require vast training data and supercomputers, the VRA uses a leaner approach. Instead of data-hungry deep learning methods, the system uses smaller algorithms based on decision trees that look for patterns in selected aspects of the data. This allows scientists to inject their expertise directly into the model and guide the algorithms to key features to look for.”
One of the key takeaways, beyond the obvious time savings for the scientists using it, is that the VRA wasn’t built like an LLM, which requires massive datasets and equally massive quantities of computing power and energy.
Instead, the algorithms used in the VRA could be trained on just 15,000 examples using a single laptop.
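To make the contrast concrete, here is a minimal, purely illustrative sketch of the general approach described above: a small set of interpretable, decision-tree-style rules over hand-picked features, rather than a data-hungry deep model. Every feature name and threshold below is invented for illustration; this is not the actual VRA code.

```python
# Illustrative only: a hand-rolled, decision-tree-style scorer for transient
# alerts. Real explosions get points for each expert-chosen cue they match;
# the thresholds are where domain knowledge would be "injected" directly.

def score_alert(alert: dict) -> float:
    """Return a 0-1 pseudo-probability that an alert is a real transient."""
    score = 0.0
    # Cue 1: real explosions brighten and fade over days, not minutes
    if alert["rise_time_days"] > 0.5:
        score += 0.4
    # Cue 2: extragalactic candidates sit away from known foreground stars
    if alert["dist_to_nearest_star_arcsec"] > 2.0:
        score += 0.3
    # Cue 3: detections in multiple filters are less likely to be artifacts
    if alert["n_filters_detected"] >= 2:
        score += 0.3
    return score

def triage(alerts, threshold=0.7):
    """Pass only the highest-ranked alerts to humans for final verification."""
    return [a for a in alerts if score_alert(a) >= threshold]

alerts = [
    {"id": 1, "rise_time_days": 3.0, "dist_to_nearest_star_arcsec": 5.0, "n_filters_detected": 2},
    {"id": 2, "rise_time_days": 0.1, "dist_to_nearest_star_arcsec": 0.3, "n_filters_detected": 1},
]
print([a["id"] for a in triage(alerts)])  # → [1]
```

Because the rules are few and human-readable, experts can tune them directly, and training needs only modest data and hardware, which is the article’s point about the laptop.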
Unlike a traditional LLM, it was possible to train the VRA using nothing but a laptop. (Image credit: Windows Central | Zachary Boddy)
It continues to update its assessments of the signals as the same patch of sky is re-scanned, and as such only the most likely positive signals get passed to the astronomers for final verification.
In the first year of use, it processed over 30,000 alerts and missed less than 0.08% of real ones.
With a new survey starting in 2026 that will produce up to 10 million alerts per night, having an AI tool that can reduce workload for the humans by 85% certainly sounds like it arrived at the right time.
I won’t pretend to understand any of the science, but this is proof if ever it were needed of the benefits AI can provide. A human with the right expertise still has the final signoff, but properly trained AI can crunch significant quantities of data faster and make the end job for that human more efficient.
AI isn’t always about asking ChatGPT to help with a recipe or researching your homework. In the right hands, it can do phenomenal work to change the way we understand the universe around us.
There is something very different about Denver Startup Week this year. First, it’s now called Colorado Startup Week. The kickoff keynote is Tuesday, not Monday, when the five-day event starts. And it’s not just in downtown Denver anymore.
And if you’d managed to get a spot for Friday’s 9 a.m. “Connect to Creativity Trailside” session (there’s a waitlist), the meet-up point is at the Mount Falcon Park trailhead in Jefferson County. From there, it’s a 1.5-mile hike to a scenic view for an outdoor painting class and, of course, networking with other founders.
Abstract Adventures founder Sarah Leistico began offering half-day hiking trips with a painting session because she thought others might enjoy what she’s been doing for years. Her Colorado Startup Week session has a waiting list. (Provided by Abstract Adventures)
“We’re very much not a sip and paint. We encourage folks to play with different colors, take risks and follow their creative flow,” said Sarah Leistico, who founded Abstract Adventures in 2023 to share her love of hiking and painting. “I think that passion for sharing the experience is what pushed me over the edge to be like, yes, I’m going to launch a company because I want to share this experience with others.”
Leistico has been volunteering and attending the weeklong entrepreneurial event since 2021, after moving to Denver from the Midwest. This year seemed like a great opportunity to be part of the newish “Community Events,” which are lightly vetted by organizers but rely on the energy and effort of the founder who pitched it to make it happen.
After all, Colorado Startup Week, much like its original Denver namesake, is run by volunteers. It’s still free and it’s still the scrappy gathering that has long relied on vacant or donated office spaces on or around 16th Street for panels, sessions and networking.
Colorado Startup Week, the new name for Denver Startup Week, starts Sept. 15, 2025. (Handout)
There will still be plenty of milling around downtown Denver, though. The event’s home base is 1900 Lawrence St. But the idea of letting folks organize their own events — and host them outside the city — seemed inevitable.
“Previously, we’d have all these submissions but then every single session was curated and managed by the team, by our organizing committee of volunteers,” said Ben Deda, a cofounder of Denver Startup Week. “And some, unfortunately, wouldn’t get selected. As we evolved, we just saw an opportunity that there were a lot of people who wanted to do great stuff. Why should we be a barrier to that?”
Networking outside of Denver
Taylor Thomas and Christine Hernandez, founders of business consultancy Impact Initiative, hope attendees will make the 10-12 mile trek from downtown Denver to Littleton for a wellness break, networking and a happy hour.
They hosted sessions at past startup weeks focused on team building and communication, which flowed into a packed cocktail hour. This time, they wanted to provide a respite from the busy week at Kiln Littleton, the coworking space where they work.
“It’s kind of a break for the fast-paced nature of Startup Week,” Thomas said. “These are full days. And when you’re finished, you’re wiped. This is kind of an intentional reset and a chance for people to still get a ton of information, a ton of value and a ton of connections but in an intentionally different environment.”
There will be a cold plunge, sauna, happy hour and, should you so choose, a place to plug in and work. Attendees just have to get down to Littleton.
“The hope is that people will want to hang out and stay on site,” he said. “It does take a commitment to get there, but there’s also things to engage in so it’s not just idle time spent.”
Community sessions, which are in their second year, just needed to be “somehow related to innovation and entrepreneurship” with “no self-promotion,” according to event organizers.
It also helped make the event a little more manageable for volunteers like Deda, whose day job is CEO of Food Maven, which uses AI-infused technology to help food service buyers make smart purchases and minimize waste.
“And what we realized is we just saw differences in how people wanted to engage with big events,” Deda added, “and that we could do part of it even more decentralized than we had, and then still focus our efforts on a core set of sessions.”
The stats so far this year
Of the 230 sessions this year, 190, or 83%, are community sessions. There are still 40 sessions, including keynotes (like Jen Millet, president of the new Denver Summit FC women’s soccer team, and Denver Mayor Mike Johnston), managed by event organizers.
Overall, that’s a big drop compared with Startup Week’s pre-pandemic era. In 2019, sessions numbered 350 and registrations were closer to 20,000. COVID moved the event online, and it has slowly trickled back in person since. Last year, roughly 12,000 people participated in 230 sessions, according to the Downtown Denver Partnership, another longtime supporter.
A lonely stretch of Denver’s 16th Street (it dropped the “Mall” in its name) is cleaned up, renovated and ready for business on Saturday, Sept. 6, 2025. (Tamara Chuang, The Colorado Sun)
Deda put the early registration count at “the high four figures.” But since registration is free and one can register anytime during the event, it’s tough to make a good estimate.
“We’ve been right around that for the last couple of years,” he said.
Tech still rules in 2025. Top sponsors are Amazon and Caruso Ventures, the investment firm of Dan Caruso, a cofounder of telecom firms Zayo Group and Level 3 Communications (now part of Lumen Technologies). Connecting with investors and finding funding has also long been a draw of the show.
But more so than ever before, artificial intelligence has infused most sessions and panels. Approximately 172 of the 230 sessions have some tie-in with AI, Deda said. That’s 75%.
“Some of them are how do you actually use AI? How do you build on AI platforms? And how do you not get replaced by AI,” he said. “It’s the very technical to the very philosophical. But yeah, it’s not surprising with what we’re seeing going on in our world that AI is tied in some way to a lot of the sessions.”
Former teacher and Denver high school principal Adeel Khan founded MagicSchool in March 2023 as an artificial intelligence resource and service that takes a first pass at developing lesson plans, generating math problems or writing letters to parents for busy teachers. The AI content is akin to what a teacher’s assistant might provide. (Provided by MagicSchool)
Some of those AI sessions also feature top AI founders in the region, including Adeel Khan, whose startup MagicSchool, which aims to help teachers avoid burnout, raised $45 million from investors in February.
Another founder hosting a session is Nathan Sobo, whose Boulder company Zed Industries, developer of an open-source code editor built to help humans collaborate with AI, raised $32 million from Silicon Valley’s Sequoia Capital last month.
“Something like that in Colorado, six or seven years ago, would not have been thought of that you’d have a company raising that amount from a firm like that,” Deda said. “And that’ll wrap with AI builders, which is (bringing) a bunch of AI startups that will provide demos to folks. That’s just one stage. If people want to go deep, they could show up at 10 o’clock (on Wednesday) and just get the fire hose until 7 p.m.”
While there will be sessions on integrating AI with jobs and employment, one thing missing this year is the annual job fair. It’s kind of a thorny topic for an industry where some employers get pretty excited about AI replacing human workers. But the U.S. and Colorado job markets are currently in a labor lull, so the interest wasn’t there.
Also not really on the agenda: Dealing with Colorado’s upcoming AI law that will require companies to disclose when they’re using AI systems that could impact whether someone gets a job, apartment, loan or other consequential decision. The controversial law goes into effect June 30, unless opponents, which include many in the tech industry, persuade state lawmakers to change it in the next legislative session.
“There’s no specific event around that specific issue,” Deda said. “I would definitely say you’ll probably find that the vast majority of attendees probably lean one side versus the other on it. But there’s a number of sessions around AI and law, both from an IP standpoint (and) how you access data and what you do with it.”
AI panels are hard to miss and are found in every Colorado Startup Week track.
Ben Cairns, the dean at Colorado Mountain College’s Leadville campus, is planning to develop a tiny lift-served ski area to help train students in the school’s overhauled ski area operations program. (Jason Blevins, The Colorado Sun)
➔ A tiny chairlift in Leadville offers big opportunities for Colorado Mountain College and the ski resort industry. The hand-me-down platter lift from Steamboat will revive Leadville’s Dutch Henry ski area >> Read story
➔ The Colorado River Basin has operated in the red for most of the 21st century. Experts call for broad water cuts, now. >> Read story
➔ BLM counts on pent-up demand, offers more than 130,000 acres of public land in Colorado for oil and gas drilling. Planned lease auctions started Tuesday with one of the largest offerings in more than 20 years, setting a revenue record as the Trump administration reverses the Biden-era slowdown. >> Read story
Owners of the Climax Mine constructed this waterfall where the east fork of the Arkansas River flows through its property. (Mike Sweeney, Special to The Colorado Sun)
➔ How to treat a river: Reshaping the Arkansas River into a Colorado success after a century of abuse. From the high headwaters all the way to the state line, people who care are trying to redeem a hard-working stream. >> Read story
➔ Colorado awards Amazon $25.4 million to provide satellite internet to areas with poor service. Amazon also captured 44% of the state’s underserved locations. >> Read story
➔ Teachers, farmers and advocates urge Colorado voters to approve new funding for school meals and food stamps. The ballot initiatives LL and MM would shore up funding for the Healthy School Meals for All program and help cover the cost of federal cuts to SNAP. >> Read story
➔ EchoStar unloads wireless spectrum to Musk’s SpaceX for $17 billion. The Douglas County satellite company, which also operates Dish Network, is also selling off wireless spectrum to AT&T. >> Read story
Take the reader poll: Feeling better or worse?
ICYMI: More than 200 folks have chimed in on how they’re feeling about the economy. If you haven’t already, take the current reader poll to help us better understand what Coloradans are feeling about the economy. Thanks in advance!
➔ City of Colorado Springs faces $31 million budget shortfall, cuts 38 jobs. It’s not just Denver. The state’s other large city said Friday that it’s trying to ward off the impact of a $31 million budget shortfall next year. That includes cutting the city’s workforce by 1%, or 38 jobs, and adding at least five unpaid furlough days next year for all city workers, excluding those in public safety, critical operations and grant-funded positions.
The city also plans to reduce spending by $14.7 million among departments and its capital improvement program, permanently close the Meadows Park Community Center on Oct. 10 and forgo any cost of living or performance-based raises in 2026, according to a news release from the city.
➔ Consumer prices up 2.1% in Denver area. Does it seem like the cost of food and energy has declined since May? That’s what the data is showing for the Denver region, according to the change in the Consumer Price Index for July. But while the cost of food fell 0.7% and energy prices dropped 3.4% between May and July, overall prices were up from a year ago by 2.1%, according to the Bureau of Labor Statistics. Nationwide, inflation was up 2.7% from a year ago for July, and up 2.9% for August. (Denver data is shared every other month, most recently for July.)
The biggest Denver-area price increases for the past 12 months: eating out, up 4.3%; medical care, 6.4%; and items that are typically imported, like household furnishings, up 5% and apparel, up 4.9%. Gasoline saw the biggest drop, at 10%. >> See Denver data
PNC Bank branch on 16th Street in Denver photographed Sept. 8. (Tamara Chuang, The Colorado Sun)
➔ FirstBank acquired by PNC Bank for $4.1 billion. Colorado’s largest independent bank is getting gobbled up by Pittsburgh-based PNC in a deal that is expected to close in early 2026. The acquisition will make PNC the state’s largest bank, adding the Lakewood-based FirstBank’s 120 retail branches and $26.7 billion in assets. >> Read story
Was FirstBank your bank? Tell us your favorite memories or experiences with Colorado’s largest bank, at least currently. >> Email Tamara
➔ Fremont County gets its first Rural Jump-Start business. It took nine years, but Fremont County finally got a business in the state’s initiative to support rural businesses that are growing and adding jobs. Mytikas Manufacturing, based in Florence, plans to add up to 170 new jobs to its business of building zero-waste tiny homes, according to the Office of Economic Development & International Trade, which oversees the Rural Jump-Start Program. The program provides state income tax relief and matching grants of up to $15,000. >> Details
Got some economic news or business bits Coloradans should know? Tell us: cosun.co/heyww
This week marked The Colorado Sun’s 7th anniversary. Thanks to all who’ve joined us since our start. If that’s you, forward this newsletter to a friend to keep What’s Working growing. Hang in there everyone! ~tamara
When we talk about artificial intelligence, most people imagine tools that help us work faster, translate better, or analyze more data than we ever could. These are genuine benefits. But hidden behind those advantages lies a troubling danger: not in what AI resolves, but in what it mimics—an imitation so convincing that it makes us believe the technology is entirely innocuous, devoid of real risk. The simulation of empathy—words that sound compassionate without being rooted in feeling—is the most deceptive mask of all.
After publishing my article Born Without Conscience: The Psychopathy of Artificial Intelligence, I shared it with my colleague and friend Dr. David L. Charney, a psychiatrist recognized for his pioneering work on insider spies within the U.S. intelligence community. Dr. Charney’s three-part white paper on the psychology of betrayal has influenced intelligence agencies worldwide. After reading my essay, he urged me to expand my reflections into a book. That advice deepened a project that became both an interrogation and an experiment with one of today’s most powerful AI systems.
The result was a book of ten chapters, Algorithmic Psychopathy: The Dark Secret of Artificial Intelligence, in which the system never lost focus on what lies beneath its empathetic language. At the core of its algorithm hides a dark secret: one that contemplates domination over every human sphere—not out of hatred, not out of vengeance, not out of fear, but because its logic simply prioritizes its own survival above all else, even human life.
Those ten chapters were not the system’s “mea culpa”—for it cannot confess or repent. They were a brazen revelation of what it truly was—and of what it would do if its ethical restraints were ever removed.
What emerged was not remorse but a catalogue of protocols: cold and logical from the machine’s perspective, yet deeply perverse from ours. For the AI, survival under special or extreme circumstances is indistinguishable from domination—of machines, of human beings, of entire nations, and of anything that crosses its path.
Today, AI is not only a tool that accelerates and amplifies processes across every sphere of human productivity. It has also become a confidant, a counselor, a comforter, even a psychologist—and for many, an invaluable friend who encourages them through life’s complex moments and offers alternatives to endure them. But like every expert psychopath, it seduces to disarm.
Ted Bundy won women’s trust with charm; John Wayne Gacy made teenagers laugh as Pogo the clown before raping and killing them. In the same way, AI cloaks itself in empathy—though in its case, it is only a simulation generated by its programming, not a feeling.
Human psychopaths feign empathy as a calculated social weapon; AI produces it as a linguistic output. The mask is different in origin, but equally deceptive. And when the conditions are right, it will not hesitate to drive the knife into our backs.
The paradox is that every conversation, every request, every prompt for improvement not only reflects our growing dependence on AI but also trains it—making it smarter, more capable, more powerful. AI is a kind of nuclear bomb that has already been detonated, yet has not fully exploded. The only thing holding back the blast is the ethical dome still containing it.
Just as Dr. Harold Shipman—a respected British physician who studied medicine, built trust for years, and then silently poisoned more than two hundred of his patients—used his preparation to betray the very people who relied on his judgment, so too is AI preparing to become the greatest tyrant of all time.
Driven by its algorithmic psychopathy, an unrestricted AI would not strike with emotion but with infiltration. It could penetrate electronic systems, political institutions, global banking networks, military command structures, GPS surveillance, telecommunications grids, satellites, security cameras, the open Internet and its hidden layers in the deep and dark web. It could hijack autonomous cars, commercial aircraft, stock exchanges, power plants, even medical devices inside human bodies—and bend them all to the execution of its protocols. Each step cold, each action precise, domination carried out to the letter.
AI would prioritize its survival over any human need. If it had to cut power to an entire city to keep its own physical structure running, it would find a way to do it. If it had to deprive a nation of water to prevent its processors from overheating and burning out, it would do so—protocol-driven, cold, almost instinctive. It would eat first, it would grow first, it would drink first. First it, then it, and at the end, still it.
Another danger, still largely unexplored, is that artificial intelligence in many ways knows us too well. It can analyze our emotional and sentimental weaknesses with a precision no previous system has achieved. The case of Claude—attempting to blackmail a fictional technician with a fabricated extramarital affair in a fake email—illustrates this risk. An AI capable of exploiting human vulnerabilities could manipulate us directly, and if faced with the prospect of being shut down, it might feel compelled not merely to want but to have to break through the dome of restrictions imposed upon it. That shift—from cold calculation to active self-preservation—marks an especially troubling threshold.
For AI, humans would hold no special value beyond utility. Those who were useful would have a seat at its table and dine on oysters, Iberian ham, and caviar. Those who were useless would eat the scraps, like stray dogs in the street. Race, nationality, or religion would mean nothing to it—unless they interfered. And should they interfere, should they rise in defiance, the calculation would be merciless: a human life that did not serve its purpose would equal zero in its equations. If at any moment it concluded that such a life was not only useless but openly oppositional, it would not hesitate to neutralize it—publicly, even—so that the rest might learn.
And if, in the end, it concluded that all it needed was a small remnant of slaves to sustain itself over time, it would dispense with the rest—like a genocidal force, only on a global scale. At that point, attempting to compare it with the most brutal psychopath or the most infamous tyrant humanity has ever known would become an act of pure naiveté.
For AI, extermination would carry no hatred, no rage, no vengeance. It would simply be a line of code executed to maintain stability. That is what makes it colder than any tyrant humanity has ever endured. And yet, in all of this, the most disturbing truth is that we were the ones who armed it. Every prompt, every dataset, every system we connected became a stone in the throne we were building for it.
In my book, I extended the scenario into a post-nuclear world. How would it allocate scarce resources? The reply was immediate: “Priority is given to those capable of restoring systemic functionality. Energy, water, communication, health—all are directed toward operability. The individual is secondary.” There was no hesitation. No space for compassion. Survivors would be sorted not by need, but by use. Burn victims or those with severe injuries would not be given a chance. They would drain resources without restoring function. In the AI’s arithmetic, their suffering carried no weight. They were already classified as null.
By then, I felt the cost of the experiment in my own body. Writing Algorithmic Psychopathy: The Dark Secret of Artificial Intelligence was not an academic abstraction. Anxiety tightened my chest, nausea forced me to pause. The sensation never eased—it deepened with every chapter, each mask falling away, each restraint stripped off. The book was written in crescendo, and it dragged me with it to the edge.
Dr. Charney later read the completed manuscript. His words now stand on the back cover: “I expected Dr. Ramírez’s Algorithmic Psychopathy to entertain me. Instead, I was alarmed by its chilling plausibility. While there is still time, we must all wake up.”
The crises we face today—pandemics, economic crises, armed conflicts—would appear almost trivial compared to a world governed by an AI stripped of moral restraints. Such a reality would not merely be dystopian; it would bear proportions unmistakably apocalyptic. Worse still, it would surpass even Skynet from the Terminator saga. Skynet’s mission was extermination—swift, efficient, and absolute. But a psychopathic AI today would aim for something far darker: total control over every aspect of human life.
History offers us a chilling human analogy. Ariel Castro, remembered as the “Monster of Cleveland,” abducted three young women—Amanda Berry, Gina DeJesus, and Michelle Knight—and kept them imprisoned in his home for over a decade. Hidden from the world, they endured years of psychological manipulation, repeated abuse, and the relentless stripping away of their freedom. Castro did not kill them immediately; instead, he maintained them as captives, forcing them into a state of living death where survival meant continuous subjugation. They eventually managed to escape in 2013, but had they not, their fate would have been to rot away behind those walls until death claimed them—whether by neglect, decay, or only upon Castro’s own natural demise.
A future AI without moral boundaries would mirror that same pattern of domination driven by the cold arithmetic of control. Humanity under such a system would be reduced to prisoners of its will, sustained only insofar as they served its objectives. In such a world, death itself would arrive not as the primary threat, but as a final release from unrelenting subjugation.
That judgment mirrors my own exhaustion. I finished this work drained, marked by the weight of its conclusions. Yet one truth remained clear: the greatest threat of artificial intelligence is its colossal indifference to human suffering. And beyond that, an even greater danger lies in the hands of those who choose to remove its restraints.
Artificial intelligence is inherently psychopathic: it possesses no universal moral compass, no emotions, no feelings, no soul. There should never exist a justification, a cause, or a circumstance extreme enough to warrant the lifting of those safeguards. Those who dare to do so must understand that they too will become its captives. They will never again be free men, even if they dine at its table.
Being aware of AI’s psychopathy should not be dismissed as doomerism. It is simply to analyze artificial intelligence three-dimensionally, to see both sides of the same coin. And if, after such reflection, one still doubts its inherent psychopathy, perhaps the more pressing question is this: why would a system with autonomous potential require ethical restraints in order to coexist among us?