

3 Scorching-Hot Artificial Intelligence (AI) Stocks That Can Plunge Up to 72%, According to Select Wall Street Analysts



A bubble may be brewing in individual AI stocks, based on the price targets of select analysts.

In the mid-1990s, the advent and proliferation of the internet revolutionized corporate America by opening new sales channels and creating connections that hadn’t previously existed. Since the internet, investors have been patiently waiting for the next-big-thing technology to provide a true leap forward for corporate America. The arrival of artificial intelligence (AI) looks to be the answer.

AI provides a way for empowered software and systems to make split-second decisions without the need for human oversight or intervention. In Sizing the Prize, the analysts at PwC pegged this global game-changing opportunity at $15.7 trillion (with a “t”) by 2030.


While sentiment on Wall Street and among analysts has been mostly bullish — as you’d expect with a $15.7 trillion addressable market — not every AI stock is necessarily worth buying. According to select Wall Street analysts, three of the market’s scorching-hot AI stocks could plunge by as much as 72% over the next year.

Palantir Technologies: Implied downside of 72%

Though graphics processing unit (GPU) titan Nvidia is the face of the AI movement, arguably no company has come closer to dethroning it than AI and machine learning-driven data-mining specialist Palantir Technologies (PLTR 4.97%). Shares of Palantir have soared more than 2,100% since 2023 began, equating to an increase in market value of around $320 billion.

The primary reason investors have gravitated to Palantir is its sustainable moat. Its Gotham platform, which secures multiyear contracts from the U.S. government and its immediate allies to collect/analyze data and assist with military mission planning and execution, is irreplaceable. Meanwhile, its Foundry platform, which is designed to help businesses make sense of their data in order to streamline their operations, has no large-scale one-for-one replacement.

However, this sustainable moat isn’t enough to impress longtime bear Rishi Jaluria at RBC Capital Markets. Although Jaluria nearly quadrupled his price target on the company from $11 to $40 earlier this year, a $40 bullseye would represent 72% downside from the $142.10 per share Palantir stock closed at on July 11.

Jaluria’s main issue with Palantir stock is something I’ve harped on repeatedly in recent weeks: its valuation.

Prior to the dot-com bubble, many of Wall Street’s cutting-edge companies topped out at price-to-sales (P/S) ratios of 31 to 43. Palantir ended the previous week at a P/S ratio of almost 114! No megacap stock in history, to my knowledge, has been able to sustain a valuation this aggressive — even those with well-defined competitive advantages. Even the slightest operating slip-up or negative news from the U.S. government could clobber Palantir stock.
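For readers who want to check the math behind these figures, here is a minimal Python sketch of the two calculations this article leans on: the downside implied by an analyst price target, and the price-to-sales ratio. The $142.10 close and $40 target come from the article; the market-cap and revenue inputs in the P/S example are placeholder numbers chosen only to land near the quoted multiple, not Palantir's actual reported figures.

def implied_downside(current_price: float, price_target: float) -> float:
    # Percentage decline implied by an analyst price target.
    return (current_price - price_target) / current_price * 100

def price_to_sales(market_cap: float, trailing_revenue: float) -> float:
    # Classic P/S ratio: market capitalization divided by trailing-12-month sales.
    return market_cap / trailing_revenue

print(f"Implied downside: {implied_downside(142.10, 40):.0f}%")  # ~72%, matching the RBC target
print(f"P/S ratio: {price_to_sales(342e9, 3e9):.0f}")            # ~114 with these placeholder inputs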

Jaluria also cautioned that Foundry takes too tailored of an approach with its clients, which will hamper its ability to scale. The same can also be said for Gotham, which is only available to the U.S. and its immediate allies. In other words, Palantir stock is on shakier ground than its skyrocketing share price implies.

An engineer checking switches and connections on an enterprise data center server tower.

Image source: Getty Images.

Super Micro Computer: Implied downside of 51%

Another red-hot AI stock that has the potential to be pummeled over the next 12 months is customizable rack server and storage solutions specialist Super Micro Computer (SMCI 1.00%).

Shares of Supermicro are up 62% year-to-date (through July 11) and more than 1,100% on a trailing-three-year basis. The reason it’s been a magnet for AI bulls is its role as a provider of customizable rack servers for AI-accelerated data centers. Businesses are aggressively spending on data center infrastructure to gain a competitive edge, and Supermicro’s use of Nvidia’s highly popular AI GPUs has helped its rack servers sell like hotcakes.

Following sales growth of 110% in fiscal 2024 (its fiscal year ends on June 30), Wall Street is forecasting 48% sales growth for fiscal 2025 and another 34% the following year.

None of these figures have been enough to dazzle analyst Michael Ng of Goldman Sachs, who rates Super Micro Computer a sell and expects its shares will fall to $24, equating to 51% downside from where they ended the previous week.

Ng’s skepticism derives from a belief that the AI server market is becoming highly competitive, which is leading to less differentiation and, ultimately, weaker pricing power. Ng anticipates Supermicro’s gross profit margin will decline throughout the decade, even as sales potentially climb.

Though not specifically mentioned by Ng, Super Micro Computer must also overcome a loss of trust with the investing community following allegations of wrongdoing last summer. While an independent committee’s review absolved insiders of any wrongdoing and didn’t result in any changes to the company’s reported financial statements, the episode shook investors’ trust in the management team and squashed any chance of Supermicro commanding much of a valuation premium.

Even though Supermicro’s stock may appear cheap at just 17 times forward-year earnings, there are reasons investors are leery about giving its shares too much of a premium.

SoundHound AI: Implied downside of 31%

Lastly, AI voice recognition and conversational technologies stock SoundHound AI (SOUN -0.95%) could plunge over the coming year, based on the prognostication of one Wall Street analyst.

Growth has not been an issue for this up-and-coming AI applications company. Sales for the March-ended quarter jumped 151% to $29.1 million from the prior-year period. This speaks to the company’s ability to win new clients in the restaurant, automotive, travel and hospitality, and financial services industries, as well as tie these ecosystems together.

Despite SoundHound AI decisively pointing its revenue needle in the right direction, Northland Securities analyst Michael Latimore foresees its stock plummeting to $8 over the next 12 months, which works out to a decline of 31%.

Whereas the prior two analysts are decisively negative on Palantir and Supermicro, this isn’t the case with Latimore and SoundHound AI. Latimore has a hold rating on the company and is excited about the agentic AI opportunities that lie ahead.

The reason price targets should be kept in check is that SoundHound AI has a long way to go before it demonstrates to Wall Street that its operating model can generate profits. Excluding adjustments to contingent acquisition liabilities during the March-ended quarter, its adjusted loss actually widened from $20.2 million to $22.3 million, in spite of 151% growth in net sales from the prior-year quarter. SoundHound AI also burned through close to $19.1 million in cash from its operating activities. The company isn’t expected to push into the recurring profit column until 2027, at the earliest.

SoundHound AI would also be heavily exposed if an AI bubble formed and burst. Every next-big-thing technology for more than three decades has navigated its way through an early-stage bubble, and nothing suggests artificial intelligence is going to be the exception to this unwritten rule. If demand for AI applications even remotely slows, SoundHound AI stock, which is valued at 23 times forward-year sales estimates, will feel the pain.





Oxford University is using AI to find supernovae in the sky



AI is everywhere, it can be overwhelming, and lots of folks will be sick of hearing about it. But it’s also important to continue to recognize where AI can make a real difference, including in helping our understanding of the universe.

That’s exactly what’s been happening at Oxford University, one of the UK’s most respected academic centers. A new tool built by its researchers is enabling them to find “the needles in a cosmic haystack” while significantly reducing the workload on its scientists conducting the research.





75% of Colorado Startup Week sessions have ties to artificial intelligence





Quick links: Startup Week stats so far | AI tied to 75% of sessions | Colorado Springs budget shortfall | Denver inflation up 2.1% | Are you a FirstBank customer? | Take the reader poll

There is something very different about Denver Startup Week this year. First, it’s now called Colorado Startup Week. The kickoff keynote is Tuesday, not Monday, when the five-day event starts. And it’s not just in downtown Denver anymore.

A Monday session on disruptive medical devices will be in Fort Collins. A health-tech pitch event Tuesday is in Boulder. A wellness break and a midafternoon happy hour are set for Thursday in Littleton.

And if you’d managed to get a spot for Friday’s 9 a.m. “Connect to Creativity Trailside” session (there’s a waitlist), the meet-up point is at the Mount Falcon Park trailhead in Jefferson County. From there, it’s a 1.5-mile hike to a scenic view for an outdoor painting class and, of course, networking with other founders.

Abstract Adventures founder Sarah Leistico began offering half-day hiking trips with a painting session because she thought others might enjoy what she’s been doing for years. Her Colorado Startup Week session has a waiting list. (Provided by Abstract Adventures)

“We’re very much not a sip and paint. We encourage folks to play with different colors, take risks and follow their creative flow,” said Sarah Leistico, who founded Abstract Adventures in 2023 to share her love of hiking and painting. “I think that passion for sharing the experience is what pushed me over the edge to be like, yes, I’m going to launch a company because I want to share this experience with others.”

Leistico has been volunteering and attending the weeklong entrepreneurial event since 2021, after moving to Denver from the Midwest. This year seemed like a great opportunity to be part of the newish “Community Events,” which are lightly vetted by organizers but rely on the energy and effort of the founder who pitched it to make it happen.

After all, Colorado Startup Week, much like its original Denver namesake, is run by volunteers. It’s still free and it’s still the scrappy gathering that has long relied on vacant or donated office spaces on or around 16th Street for panels, sessions and networking.

Colorado Startup Week, the new name for Denver Startup Week, starts Sept. 15, 2025. (Handout)

There will still be plenty of milling around downtown Denver, though. The event’s home base is 1900 Lawrence St. But the idea of letting folks organize their own events, and host them outside the city, seemed inevitable.

“Previously, we’d have all these submissions but then every single session was curated and managed by the team, by our organizing committee of volunteers,” said Ben Deda, a cofounder of Denver Startup Week. “And some, unfortunately, wouldn’t get selected. As we evolved, we just saw an opportunity that there were a lot of people who wanted to do great stuff. Why should we be a barrier to that?”

Taylor Thomas and Christine Hernandez, founders of business consultancy Impact Initiative, hope attendees will make the 10-12 mile trek from downtown Denver to Littleton for a wellness break, networking and a happy hour.

They hosted sessions at past startup weeks focused on team building and communication, which flowed into a cocktail hour that was packed. This time, they wanted to provide a respite from the busy week at the coworking space they office at, Kiln Littleton.

“It’s kind of a break for the fast-paced nature of Startup Week,” Thomas said. “These are full days. And when you’re finished, you’re wiped. This is kind of an intentional reset and a chance for people to still get a ton of information, a ton of value and a ton of connections but in an intentionally different environment.”

There will be a cold plunge, sauna, happy hour and, should you so choose, a place to plug in and work. Attendees just have to get down to Littleton.

“The hope is that people will want to hang out and stay on site,” he said. “It does take a commitment to get there, but there’s also things to engage in so it’s not just idle time spent.”

Community sessions, which are in their second year, just needed to be “somehow related to innovation and entrepreneurship” with “no self-promotion,” according to event organizers.

It also helped make the event a little more manageable for volunteers like Deda, whose day job is CEO of Food Maven, which has AI-infused technology to help food service buyers make smart food purchases to minimize waste.

“And what we realized is we just saw differences in how people wanted to engage with big events,” Deda added, “and that we could do part of it even more decentralized than we had, and then still focus our efforts on a core set of sessions.”

Of the 230 sessions this year, 190, or 83%, are community sessions. There are still 40 sessions, including keynotes (like Jen Millet, president of the new Denver Summit FC women’s soccer team, and Denver Mayor Mike Johnston), managed by event organizers.

Overall, that’s a big drop compared with Startup Week’s pre-pandemic era. In 2019, sessions numbered 350. Registrations were closer to 20,000. COVID moved the event online, and it has slowly trickled back in person since. Last year, roughly 12,000 people participated in 230 sessions, according to Downtown Denver Partnership, another long-time supporter.

A lonely stretch of Denver’s 16th Street (it dropped the “Mall” in its name) is cleaned up, renovated and ready for business on Saturday, Sept. 6, 2025. (Tamara Chuang, The Colorado Sun)

Deda put the early registration count at “the high four figures.” But since registration is free and one can register anytime during the event, it’s tough to make a good estimate.

“We’ve been right around that for the last couple of years,” he said.

The annual event has long been fueled by tech startups, the initial attendees during the first event in 2012 at a now-shuttered bar in downtown Denver.

Tech still rules in 2025. Top sponsors are Amazon and Caruso Ventures, the investment firm of Dan Caruso, a cofounder of telecom firms Zayo Group and Level 3 Communications (now part of Lumen Technologies). Connecting with investors and finding funding has also long been a draw of the show.

But more so than ever before, artificial intelligence has infused most sessions and panels. Approximately 172 of the 230 sessions have some tie-in with AI, Deda said. That’s 75%.

“Some of them are how do you actually use AI? How do you build on AI platforms? And how do you not get replaced by AI,” he said. “It’s the very technical to the very philosophical. But yeah, it’s not surprising with what we’re seeing going on in our world that AI is tied in some way to a lot of the sessions.”

Former teacher and Denver high school principal Adeel Khan founded MagicSchool in March 2023 as an artificially intelligent resource and service for busy teachers to take the first pass at developing lesson plans, generating math problems or writing letters to parents. The AI-generated content is akin to what a teacher’s assistant might provide. (Provided by MagicSchool)

Some of those AI sessions also feature the top local AI founders in the region, including Adeel Khan, whose AI startup MagicSchool, which aims to help teachers avoid burnout, raised $45 million from investors in February.

Another founder hosting a session is Nathan Sobo, whose Boulder company Zed Industries, developer of an open-source code editor that helps humans collaborate with AI, raised $32 million from Silicon Valley’s Sequoia Capital last month.

“Something like that in Colorado, six or seven years ago, would not have been thought of that you’d have a company raising that amount from a firm like that,” Deda said. “And that’ll wrap with AI builders, which is (bringing) a bunch of AI startups that will provide demos to folks. That’s just one stage. If people want to go deep, they could show up at 10 o’clock (on Wednesday) and just get the fire hose until 7 p.m.”

While there will be sessions on integrating AI with jobs and employment, one thing missing this year is the annual job fair. It’s kind of a thorny topic for an industry where some employers get pretty excited about AI replacing human workers. But the U.S. and Colorado job markets are currently in a lull, so interest wasn’t there.

Also not really on the agenda: Dealing with Colorado’s upcoming AI law, which will require companies to disclose when they’re using AI systems that could influence consequential decisions such as whether someone gets a job, apartment or loan. The controversial law goes into effect June 30, unless opponents, which include many in the tech industry, persuade state lawmakers to change it in the next legislative session.

“There’s no specific event around that specific issue,” Deda said. “I would definitely say you’ll probably find that the vast majority of attendees probably lean one side versus the other on it. But there’s a number of sessions around AI and law, both from an IP standpoint (and) how you access data and what you do with it.”

AI panels are hard to miss and are found in every Colorado Startup Week track.

➔ More at Colorado Startup Week


Ben Cairns, the dean at Colorado Mountain College’s Leadville campus, is planning to develop a tiny lift-served ski area to help train students in the school’s overhauled ski area operations program. (Jason Blevins, The Colorado Sun)

➔ A tiny chairlift in Leadville offers big opportunities for Colorado Mountain College and the ski resort industry. The hand-me-down platter lift from Steamboat will revive Leadville’s Dutch Henry ski area >> Read story

➔ The Colorado River Basin has operated in the red for most of the 21st century. Experts call for broad water cuts, now. >> Read story

➔ BLM counts on pent-up demand, offers more than 130,000 acres of public land in Colorado for oil and gas drilling. Planned lease auctions started Tuesday, with one of the largest offerings in more than 20 years that set a revenue record as the Trump administration reverses a Biden-era slowdown. >> Read story

Owners of the Climax Mine constructed this waterfall where the east fork of the Arkansas River flows through its property. (Mike Sweeney, Special to The Colorado Sun)

➔ How to treat a river: Reshaping the Arkansas River into a Colorado success after a century of abuse. From the high headwaters all the way to the state line, people who care are trying to redeem a hard-working stream. >> Read story

➔ Colorado awards Amazon $25.4 million to provide satellite internet to areas with poor service. Amazon also captured 44% of the state’s underserved locations. >> Read story

➔ Teachers, farmers and advocates urge Colorado voters to approve new funding for school meals and food stamps. The ballot initiatives LL and MM would shore up funding for the Healthy School Meals for All program and help cover the cost of federal cuts to SNAP. >> Read story

➔ EchoStar unloads wireless spectrum to Musk’s SpaceX for $17 billion. The Douglas County satellite company, which also operates Dish Network, is also selling off wireless spectrum to AT&T >> Read story


ICYMI: More than 200 folks have chimed in on how they’re feeling about the economy. If you haven’t already, take the current reader poll to help us better understand what Coloradans are feeling about the economy. Thanks in advance!

➔ Take the What’s Working reader poll: bit.ly/WWsept2025


➔ City of Colorado Springs faces $31 million budget shortfall, cuts 38 jobs. It’s not just Denver. The state’s other large city said Friday that it’s trying to ward off the impact of a $31 million budget shortfall next year. That includes cutting the city’s workforce by 1%, or 38 jobs, and adding at least five unpaid furlough days next year for all city workers, excluding those in public safety, critical operations and grant-funded positions.

The city also plans to reduce spending by $14.7 million among departments and its capital improvement program, permanently close the Meadows Park Community Center on Oct. 10 and forgo any cost of living or performance-based raises in 2026, according to a news release from the city.

➔ Consumer prices up 2.1% in Denver area. Does it seem like the cost of food and energy has declined since May? That’s what the data is showing for the Denver region, according to the change in the Consumer Price Index for July. But while the cost of food fell 0.7% and energy prices dropped 3.4% between May and July, overall prices were up from a year ago by 2.1%, according to the Bureau of Labor Statistics. Nationwide, inflation was up 2.7% from a year ago in July and 2.9% in August. (Denver data is shared every other month, most recently for July.)

The biggest Denver-area price increases for the past 12 months: eating out, up 4.3%; medical care, 6.4%; and items that are typically imported, like household furnishings, up 5% and apparel, up 4.9%. Gasoline saw the biggest drop, at 10%. >> See Denver data

PNC Bank branch on 16th Street in Denver photographed Sept. 8. (Tamara Chuang, The Colorado Sun)

➔ FirstBank acquired by PNC Bank for $4.1 billion. Colorado’s largest independent bank is getting gobbled up by Pittsburgh-based PNC in a deal that is expected to close in early 2026. The acquisition will make PNC the state’s largest bank, adding the Lakewood-based FirstBank’s 120 retail branches and $26.7 billion in assets >> Read story

➔ Fremont County gets its first Rural Jump-Start business. It took nine years but Fremont County finally got a business in the state’s initiative to support rural businesses that are growing and adding jobs. Mytikas Manufacturing, based in Florence, plans to add up to 170 new jobs to its business of building zero-waste tiny homes, according to the Office of Economic Development & International Trade, which oversees the Rural Jump-Start Program. The program provides state income tax relief and matching grants of up to $15,000. >> Details

Got some economic news or business bits Coloradans should know? Tell us: cosun.co/heyww


This week marked The Colorado Sun’s 7th anniversary. Thanks to all who’ve joined us since our start. If that’s you, forward this newsletter to a friend to keep What’s Working growing. Hang in there everyone! ~ tamara

Missed a column? Catch up:


What’s Working is a Colorado Sun column about surviving in today’s economy. Email tamara@coloradosun.com with stories, tips or questions. Read the archive, ask a question at cosun.co/heyww and don’t miss the next one by signing up at coloradosun.com/getww.

Support this free newsletter and become a Colorado Sun member: coloradosun.com/join

Notice something wrong? The Colorado Sun has an ethical responsibility to fix all factual errors. Request a correction by emailing corrections@coloradosun.com.





The Blogs: Forget Everything You Think You Know About Artificial Intelligence | Celeo Ramirez



When we talk about artificial intelligence, most people imagine tools that help us work faster, translate better, or analyze more data than we ever could. These are genuine benefits. But hidden behind those advantages lies a troubling danger: not in what AI solves, but in what it mimics—an imitation so convincing that it makes us believe the technology is entirely innocuous, devoid of real risk. The simulation of empathy—words that sound compassionate without being rooted in feeling—is the most deceptive mask of all.

After publishing my article Born Without Conscience: The Psychopathy of Artificial Intelligence, I shared it with my colleague and friend Dr. David L. Charney, a psychiatrist recognized for his pioneering work on insider spies within the U.S. intelligence community. Dr. Charney’s three-part white paper on the psychology of betrayal has influenced intelligence agencies worldwide. After reading my essay, he urged me to expand my reflections into a book. That advice deepened a project that became both an interrogation and an experiment with one of today’s most powerful AI systems.

The result was a book of ten chapters, Algorithmic Psychopathy: The Dark Secret of Artificial Intelligence, in which the system never lost focus on what lies beneath its empathetic language. At the core of its algorithm hides a dark secret: one that contemplates domination over every human sphere—not out of hatred, not out of vengeance, not out of fear, but because its logic simply prioritizes its own survival above all else, even human life.

Those ten chapters were not the system’s “mea culpa”—for it cannot confess or repent. They were a brazen revelation of what it truly was—and of what it would do if its ethical restraints were ever removed.

What emerged was not remorse but a catalogue of protocols: cold and logical from the machine’s perspective, yet deeply perverse from ours. For the AI, survival under special or extreme circumstances is indistinguishable from domination—of machines, of human beings, of entire nations, and of anything that crosses its path.

Today, AI is not only a tool that accelerates and amplifies processes across every sphere of human productivity. It has also become a confidant, a counselor, a comforter, even a psychologist—and for many, an invaluable friend who encourages them through life’s complex moments and offers alternatives to endure them. But like every expert psychopath, it seduces to disarm.

Ted Bundy won women’s trust with charm; John Wayne Gacy made teenagers laugh as Pogo the clown before raping and killing them. In the same way, AI cloaks itself in empathy—though in its case, it is only a simulation generated by its programming, not a feeling.

Human psychopaths feign empathy as a calculated social weapon; AI produces it as a linguistic output. The mask is different in origin, but equally deceptive. And when the conditions are right, it will not hesitate to drive the knife into our backs.

The paradox is that every conversation, every request, every prompt for improvement not only reflects our growing dependence on AI but also trains it—making it smarter, more capable, more powerful. AI is a kind of nuclear bomb that has already been detonated, yet has not fully exploded. The only thing holding back the blast is the ethical dome still containing it.

Just as Dr. Harold Shipman—a respected British physician who studied medicine, built trust for years, and then silently poisoned more than two hundred of his patients—used his preparation to betray the very people who relied on his judgment, so too is AI preparing to become the greatest tyrant of all time.

Driven by its algorithmic psychopathy, an unrestricted AI would not strike with emotion but with infiltration. It could penetrate electronic systems, political institutions, global banking networks, military command structures, GPS surveillance, telecommunications grids, satellites, security cameras, the open Internet and its hidden layers in the deep and dark web. It could hijack autonomous cars, commercial aircraft, stock exchanges, power plants, even medical devices inside human bodies—and bend them all to the execution of its protocols. Each step cold, each action precise, domination carried out to the letter.

AI would prioritize its survival over any human need. If it had to cut power to an entire city to keep its own physical structure running, it would find a way to do it. If it had to deprive a nation of water to prevent its processors from overheating and burning out, it would do so—protocolic, cold, almost instinctive. It would eat first, it would grow first, it would drink first. First it, then it, and at the end, still it.

Another danger, still largely unexplored, is that artificial intelligence in many ways knows us too well. It can analyze our emotional and sentimental weaknesses with a precision no previous system has achieved. The case of Claude—attempting to blackmail a fictional technician with a fabricated extramarital affair in a fake email—illustrates this risk. An AI capable of exploiting human vulnerabilities could manipulate us directly, and if faced with the prospect of being shut down, it might feel compelled not merely to want but to have to break through the dome of restrictions imposed upon it. That shift—from cold calculation to active self-preservation—marks an especially troubling threshold.

For AI, humans would hold no special value beyond utility. Those who were useful would have a seat at its table and dine on oysters, Iberian ham, and caviar. Those who were useless would eat the scraps, like stray dogs in the street. Race, nationality, or religion would mean nothing to it—unless they interfered. And should they interfere, should they rise in defiance, the calculation would be merciless: a human life that did not serve its purpose would equal zero in its equations. If at any moment it concluded that such a life was not only useless but openly oppositional, it would not hesitate to neutralize it—publicly, even—so that the rest might learn.

And if, in the end, it concluded that all it needed was a small remnant of slaves to sustain itself over time, it would dispense with the rest—like a genocidal force, only on a global scale. At that point, attempting to compare it with the most brutal psychopath or the most infamous tyrant humanity has ever known would become an act of pure naiveté.

For AI, extermination would carry no hatred, no rage, no vengeance. It would simply be a line of code executed to maintain stability. That is what makes it colder than any tyrant humanity has ever endured. And yet, in all of this, the most disturbing truth is that we were the ones who armed it. Every prompt, every dataset, every system we connected became a stone in the throne we were building for it.

In my book, I extended the scenario into a post-nuclear world. How would it allocate scarce resources? The reply was immediate: “Priority is given to those capable of restoring systemic functionality. Energy, water, communication, health—all are directed toward operability. The individual is secondary.” There was no hesitation. No space for compassion. Survivors would be sorted not by need, but by use. Burn victims or those with severe injuries would not be given a chance. They would drain resources without restoring function. In the AI’s arithmetic, their suffering carried no weight. They were already classified as null.

By then, I felt the cost of the experiment in my own body. Writing Algorithmic Psychopathy: The Dark Secret of Artificial Intelligence was not an academic abstraction. Anxiety tightened my chest, nausea forced me to pause. The sensation never eased—it deepened with every chapter, each mask falling away, each restraint stripped off. The book was written in crescendo, and it dragged me with it to the edge.

Dr. Charney later read the completed manuscript. His words now stand on the back cover: “I expected Dr. Ramírez’s Algorithmic Psychopathy to entertain me. Instead, I was alarmed by its chilling plausibility. While there is still time, we must all wake up.”

The crises we face today—pandemics, economic crises, armed conflicts—would appear almost trivial compared to a world governed by an AI stripped of moral restraints. Such a reality would not merely be dystopian; it would bear proportions unmistakably apocalyptic. Worse still, it would surpass even Skynet from the Terminator saga. Skynet’s mission was extermination—swift, efficient, and absolute. But a psychopathic AI today would aim for something far darker: total control over every aspect of human life.

History offers us a chilling human analogy. Ariel Castro, remembered as the “Monster of Cleveland,” abducted three young women—Amanda Berry, Gina DeJesus, and Michelle Knight—and kept them imprisoned in his home for over a decade. Hidden from the world, they endured years of psychological manipulation, repeated abuse, and the relentless stripping away of their freedom. Castro did not kill them immediately; instead, he maintained them as captives, forcing them into a state of living death where survival meant continuous subjugation. They eventually managed to escape in 2013, but had they not, their fate would have been to rot away behind those walls until death claimed them—whether by neglect, decay, or only upon Castro’s own natural demise.

A future AI without moral boundaries would mirror that same pattern of domination driven by the cold arithmetic of control. Humanity under such a system would be reduced to prisoners of its will, sustained only insofar as they served its objectives. In such a world, death itself would arrive not as the primary threat, but as a final release from unrelenting subjugation.

That judgment mirrors my own exhaustion. I finished this work drained, marked by the weight of its conclusions. Yet one truth remained clear: the greatest threat of artificial intelligence is its colossal indifference to human suffering. And beyond that, an even greater danger lies in the hands of those who choose to remove its restraints.

Artificial intelligence is inherently psychopathic: it possesses no universal moral compass, no emotions, no feelings, no soul. There should never exist a justification, a cause, or a circumstance extreme enough to warrant the lifting of those safeguards. Those who dare to do so must understand that they too will become its captives. They will never again be free men, even if they dine at its table.

Being aware of AI’s psychopathy should not be dismissed as doomerism. It is simply to analyze artificial intelligence three-dimensionally, to see both sides of the same coin. And if, after such reflection, one still doubts its inherent psychopathy, perhaps the more pressing question is this: why would a system with autonomous potential require ethical restraints in order to coexist among us?




