Since Applied Digital announced plans to build a $3 billion AI data center north of Fargo, readers have asked several questions about the facility.
The Forum spoke this week with Applied Digital Chairman and CEO Wes Cummins about the 280-megawatt facility planned for east of Interstate 29 between Harwood, North Dakota, and Fargo. The 160-acre center will sit on 925 acres near the Fargo Park District’s North Softball Complex.
The Harwood City Council voted unanimously on Wednesday, Sept. 10, to rezone the land for the center from agricultural to light industrial. With the vote also came final approval of the building permit for the center, meaning Applied Digital can break ground on the facility this month.
“We’re grateful for the City of Harwood’s support and look forward to continuing a strong partnership with the community as this project moves ahead,” Cummins said after the vote.
Applied Digital CEO and Chairman Wes Cummins talks about his company and its plans for Harwood, North Dakota, during a meeting on Tuesday, Sept. 2, 2025, at the Harwood Community Center.
Alyssa Goelzer / The Forum
Applied Digital plans to start construction this month and open partially by the end of 2026. The facility should be fully operational by early 2027, the company said.
The project should create 700 construction jobs while the facility is built, Applied Digital said. The center will need more than 200 full-time employees to operate, the company said. The facility is expected to generate tax revenue and economic growth for the area, but those estimates have not been disclosed.
Here are some questions readers had about the facility.
What will the AI data center be used for?
Applied Digital said it develops facilities that provide “high-performance data centers and colocation solutions for artificial intelligence, cloud, networking, and blockchain industries.” Data centers like this one run the applications that make AI products work, Cummins said.
“ChatGPT runs in a facility like this,” he said. “There’s just enormous amounts of servers that can run GPUs (graphics processing units) inside of the facility and can either be doing training, which is making the product, or inference, which is what happens when people use the product.”
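Cummins’ distinction between training (making the product) and inference (using it) describes the two workloads the facility’s GPU servers would run. As a loose, purely illustrative Python sketch rather than anything from Applied Digital, training repeatedly adjusts a model’s parameters against known examples, and inference then applies the finished parameters to new requests:

```python
# Toy illustration of training versus inference; real AI data centers run the same two
# phases across thousands of GPU servers and vastly larger models.

# Made-up example data: pretend the "right answer" is output = 2 * input.
inputs = [1.0, 2.0, 3.0, 4.0]
targets = [2.0, 4.0, 6.0, 8.0]

# Training: repeatedly nudge the model's single parameter to shrink prediction error.
weight = 0.0
learning_rate = 0.01
for _ in range(1000):
    for x, y in zip(inputs, targets):
        error = weight * x - y
        weight -= learning_rate * error * x   # gradient-descent update

# Inference: use the trained parameter to answer a new request, as when a user sends
# a prompt to a deployed product like a chatbot.
new_input = 5.0
print(f"trained weight = {weight:.3f}, prediction for {new_input} = {weight * new_input:.3f}")
```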
Applied Digital’s $3 billion data center will be constructed just southeast of the town of Harwood, North Dakota.
Map by The Forum
Applied Digital hasn’t announced what tenants would use Polaris Forge 2, the name for the Harwood facility. At a Harwood City Council meeting, Cummins said the company markets to companies in the U.S. like Google, Meta, Amazon and Microsoft.
“The demand for AI capacity continues to accelerate, and North Dakota continues to be one of the most strategic locations in the country to meet that need,” he said. “We have strong interest from multiple parties and are in advanced negotiations with a U.S. based investment-grade hyperscaler for this campus, making it both timely and prudent to proceed with groundbreaking and site development.”
AI data centers need significant amounts of electricity to operate, Cummins said. Other centers have traditionally been built near heavily populated areas, but that isn’t necessary, he said.
North Dakota produces enough energy to export it out of state, Cummins said. The Fargo area also has the electrical grid in place to connect to that energy, he said.
“A lot of North Dakotans, especially the leaders of North Dakota, want to better utilize the energy produced by North Dakota for economic benefit inside of the state versus exporting it to neighboring states or to Canada,” he said.
North Dakota’s cold climate will also help keep the center cool for much of the year, meaning the facility will use significantly less power for cooling than it would in a warmer state like Texas, Cummins said.
“We get much more efficiency out of the facility,” he said. “Those aspects make North Dakota, in my opinion, an ideal place for this type of AI infrastructure.”
The Harwood, North Dakota, elevator on Thursday, Aug. 28, 2025, looms behind the land designated for the construction of Applied Digital’s 280-megawatt data center.
David Samson / The Forum
How much water will the center use?
Cummins acknowledged other AI data centers around the world use millions of gallons of water a day. Applied Digital designed a closed-loop system so the North Dakota centers use as little water as possible, Cummins said.
He compared the cooling system to a car radiator. The centers will circulate a glycol coolant through the facilities and servers, Cummins said. After cooling the equipment, the liquid passes through chillers, much like a heat pump outside a house. Once cooled, the liquid recirculates in a continuous loop, he said.
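A rough way to picture that closed loop, using made-up numbers rather than anything from Applied Digital’s design, is a coolant that warms as it passes the servers and is cooled again by the chillers, settling at a stable temperature with no water evaporated or discharged:

```python
# Illustrative closed-loop sketch (hypothetical numbers, not Applied Digital's design):
# coolant picks up heat from the servers, a chiller rejects that heat to the outside air,
# and the same liquid circulates indefinitely, so no water is consumed in cooling.

coolant_temp = 25.0      # deg C, temperature of the circulating glycol loop
outside_temp = 10.0      # deg C, ambient air the chillers reject heat into
server_heat = 2.0        # deg C gained per pass as the coolant absorbs server heat
chiller_factor = 0.15    # fraction of the coolant-to-ambient gap removed per pass

for cycle in range(50):
    coolant_temp += server_heat                                      # heat picked up in the data hall
    coolant_temp -= chiller_factor * (coolant_temp - outside_temp)   # heat rejected by the chiller

print(f"loop settles near {coolant_temp:.1f} C; the same liquid is reused, so water use stays minimal")
```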
People who operate the facility will use water for bathroom breaks and drinking, much like a person in a house or a car, he said.
“The data center, even with the immense size, we expect it to use the same amount of water as roughly a single household,” he said. “The reason is the people inside.”
Duncan Alexander and dog Valka protest a proposed AI data center before a Planning and Zoning meeting on Tuesday, Sept. 2, 2025, in Harwood, North Dakota.
Alyssa Goelzer / The Forum
Will the AI center increase electricity rates?
Applied Digital claims that electricity rates will not go up for local residents because of the data center.
“Data centers pay a large share of fixed utility costs, which helps spread expenses across more users,” the company said.
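The mechanism the company is describing is simple arithmetic: the grid’s fixed costs get divided across more kilowatt-hours sold. The sketch below uses entirely hypothetical numbers, not figures from Applied Digital or any utility, just to show the direction of the effect:

```python
# Hypothetical arithmetic only -- none of these numbers come from Applied Digital or a
# utility -- showing how a large new customer can spread fixed grid costs more thinly.

fixed_grid_cost = 100_000_000.0     # annual fixed cost of lines, substations, etc. ($)
household_kwh = 10_000.0            # annual use of one typical household (kWh)
households = 500_000                # existing residential customers
data_center_kwh = 1_500_000_000.0   # annual use of a hypothetical large data center (kWh)

def fixed_cost_per_kwh(total_kwh: float) -> float:
    """Fixed cost spread across every kilowatt-hour sold."""
    return fixed_grid_cost / total_kwh

before = fixed_cost_per_kwh(households * household_kwh)
after = fixed_cost_per_kwh(households * household_kwh + data_center_kwh)

print(f"fixed-cost share per kWh: ${before:.4f} before vs. ${after:.4f} after")
# With these made-up numbers the per-kWh fixed-cost share drops from $0.0200 to about
# $0.0154; actual rate outcomes depend on the utility and its regulators.
```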
Applied Digital’s center in Ellendale, North Dakota, much like the one to be built in Harwood, uses power produced in the state, Cummins said. The Ellendale center, which draws about 200 megawatts of power, saved ratepayers $5.3 million in 2023 and $5.7 million last year, he said.
“Utilizing the infrastructure more efficiently can actually drive rates down,” Cummins said, adding he expects rate savings for Harwood as well.
How much noise will the center make?
Applied Digital’s concrete walls should contain the noise from computers, Cummins said. What residents will hear is fan noise from the heat pumps used to cool the facility, he said.
“It will sound like the one that runs outside of your house,” he said, describing the minimal noise he expects from the facility.
The loudest noise will come during construction of the facility, Cummins said.
The facility will cover only 160 acres, but Applied Digital is buying 925 acres of land, with the rest of the space serving as a sound buffer, he said. People who live nearby may hear some sound, he acknowledged.
“If you’re a half mile or more from the facility, you will very unlikely hear anything,” he said.
About 300 people showed up to a town hall meeting on Monday, Aug. 25, 2025, at the Harwood Community Center to listen and to discuss a new AI data center that is planned to be built in Harwood, North Dakota.
Chris Flynn / The Forum
Has Applied Digital conducted an environmental study?
The facility won’t create emissions or other hazards that would require an environmental impact study, Cummins said.
Why move so fast to approve the facility?
Some have criticized Applied Digital and the Harwood City Council for pushing the approval through so quickly. Applied Digital announced the project in mid-August, and the city approved it in less than a month.
Cummins acknowledged that concern but noted the industry is moving fast. The U.S. is competing with China to create artificial intelligence, an industry that is not going away, Cummins said.
“I do believe we are in a race in the world for super intelligence,” he said. “It’s a race amongst companies in the U.S., but it’s also a race against other countries. … I do think it’s very important the U.S. win this AI race to super intelligence and then to artificial general intelligence.”
Applied Digital said it wanted to finish foundation and grading work on the project before winter sets in, meaning it needed an expedited approval timeline.
People in Harwood have shown overwhelming support, Cummins said, adding that protesters mostly came from other cities.
“I can’t think of a project that would spend this amount of money and have this kind of economic benefit for a community and a county and a state and have this low of a negative impact,” he said. “I think these types of projects are fantastic for these types of communities.”
You’ve probably encountered images in your social media feeds that look like a cross between photographs and computer-generated graphics. Some are fantastical — think Shrimp Jesus — and some are believable at a quick glance — remember the little girl clutching a puppy in a boat during a flood?
These are examples of AI slop, low- to mid-quality content — video, images, audio, text or a mix — created with AI tools, often with little regard for accuracy. It’s fast, easy and inexpensive to make this content. AI slop producers typically place it on social media to exploit the economics of attention on the internet, displacing higher-quality material that could be more helpful.
AI slop has been increasing over the past few years. As the term “slop” indicates, that’s generally not good for people using the internet.
AI slop’s many forms
The Guardian published an analysis in July 2025 examining how AI slop is taking over YouTube’s fastest-growing channels. The journalists found that nine out of the top 100 fastest-growing channels feature AI-generated content like zombie football and cat soap operas.
Listening to Spotify? Be skeptical of that new band, The Velvet Sundown, that appeared on the streaming service with a creative backstory and derivative tracks. It’s AI-generated.
In many cases, people submit AI slop that’s just good enough to attract and keep users’ attention, allowing the submitter to profit from platforms that monetize streaming and view-based content.
The ease of generating content with AI enables people to submit low-quality articles to publications. Clarkesworld, an online science fiction magazine that accepts user submissions and pays contributors, stopped taking new submissions in 2024 because of the flood of AI-generated writing it was getting.
These aren’t the only places where this happens — even Wikipedia is dealing with AI-generated low-quality content that strains its entire community moderation system. If the organization is not successful in removing it, a key information resource people depend on is at risk.
Harms of AI slop
AI-driven slop is making its way upstream into people’s media diets as well. During Hurricane Helene, opponents of President Joe Biden cited AI-generated images of a displaced child clutching a puppy as evidence of the administration’s purported mishandling of the disaster response. Even when it’s apparent that content is AI-generated, it can still be used to spread misinformation by fooling some people who briefly glance at it.
AI slop also harms artists by causing job and financial losses and crowding out content made by real creators. The algorithms that drive social media consumption often do not distinguish this lower-quality AI-generated content, and it can displace entire classes of creators who previously made their livelihood from online content.
Wherever the option is enabled, you can flag content that’s problematic, and on some platforms you can add community notes to provide context. For clearly harmful content, you can report it to the platform.
Along with forcing us to be on guard for deepfakes and “inauthentic” social media accounts, AI is now leading to piles of dreck degrading our media environment. At least there’s a catchy name for it.
🤖 Researchers explore whether AI needs a physical body to achieve true intelligence.
🧠 The concept of embodied cognition suggests that sensing, acting, and thinking are interconnected.
🐙 Soft robotics, inspired by creatures like the octopus, offer a new path for developing adaptive AI.
🔄 Autonomous physical intelligence (API) allows materials to self-regulate and make decisions independently.
In the realm of artificial intelligence (AI), the concept of whether machines require physical bodies to achieve true intelligence has long been a topic of debate. Popular culture, from Rosie the robot maid in “The Jetsons” to the empathetic C-3PO in “The Empire Strikes Back,” has offered diverse interpretations of robots and AI. However, these fictional portrayals often overlook the complexities and limitations faced by real-world AI systems. With recent advancements in robotics and AI, researchers are revisiting the question of embodiment in AI, exploring whether a physical form could be essential for achieving artificial general intelligence (AGI). This exploration could redefine our understanding of cognition, intelligence, and the future of AI technology.
The Limits of Disembodied AI
Recent studies have highlighted the shortcomings of disembodied AI systems, particularly in their ability to perform complex tasks. A study from Apple on Large Reasoning Models (LRMs) found that while these systems can outperform standard language models in some scenarios, they struggle significantly with more complex problems. Despite having ample computing power, these models often collapse under complexity, revealing a fundamental flaw in their reasoning capabilities.
Unlike humans, who can reason consistently and algorithmically, these AI models lack internal logic in their “reasoning traces.” Nick Frosst, a former Google researcher, emphasized this discrepancy, noting that current AI systems merely predict the next most likely word rather than truly think like humans. This raises concerns about the viability of disembodied AI in replicating human-like intelligence.
“What we are building now are things that take in words and predict the next most likely word … That’s very different from what you and I do,” Frosst told The New York Times.
The limitations of disembodied AI underscore the need for exploring alternative approaches to achieve true cognitive abilities in machines.
Historically, artificial intelligence was developed under the paradigm of Good Old-Fashioned Artificial Intelligence (GOFAI), which treated cognition as symbolic logic. This approach assumed that intelligence could be built by processing symbols, akin to a computer executing code. However, real-world challenges exposed the limitations of this model, leading researchers to question whether intelligence could be achieved without a physical body.
Research from various disciplines, including psychology and neuroscience, suggests that intelligence is inherently linked to physical interactions with the environment. In humans, the enteric nervous system, often referred to as the “second brain,” operates independently, illustrating that intelligence can be distributed throughout an organism rather than centralized in a brain.
This has led to the concept of embodied cognition, where sensing, acting, and thinking are interconnected processes. As Rolf Pfeifer, Director of the University of Zurich’s Artificial Intelligence Laboratory, pointed out, “Brains have always developed in the context of a body that interacts with the world to survive.” This perspective challenges the traditional view of cognition and suggests that a physical body might be crucial for developing adaptable and intelligent systems.
Embodied Intelligence: A Different Kind of Thinking
The exploration of embodied intelligence has prompted researchers to consider new approaches to AI development. Cecilia Laschi, a pioneer in soft robotics, advocates for the use of soft-bodied machines inspired by organisms like the octopus. These creatures demonstrate a form of intelligence that is distributed throughout their bodies, allowing them to adapt and respond to their environments without centralized control.
Laschi argues that smarter AI requires softer, more flexible bodies that can offload perception, control, and decision-making to the physical structure of the robot itself. This approach reduces the computational demands on the main AI system, enabling it to function more effectively in unpredictable environments.
In a May special issue of Science Robotics, Laschi explained that “motor control is not entirely managed by the computing system … motor behavior is partially shaped mechanically by external forces acting on the body.” This suggests that behavior and intelligence are shaped by experience and interaction with the environment, rather than pre-programmed algorithms.
The field of soft robotics, which employs materials like silicone and special fabrics, offers promising possibilities for creating adaptive, real-time learning systems. By integrating flexibility and adaptability into the physical form of AI, researchers are paving the way for machines that can think and learn in ways similar to living organisms.
Flesh and Feedback: How to Make Materials Think for Themselves
The development of soft robotics is also advancing the concept of autonomous physical intelligence (API), where materials themselves exhibit decision-making capabilities. Ximin He, an Associate Professor of Materials Science and Engineering at UCLA, has been at the forefront of this research, designing soft materials that not only react to stimuli but also regulate their movements using built-in feedback.
He’s approach involves embedding logic directly into the materials, allowing them to sense, act, and decide autonomously. This method contrasts with traditional robotics, which relies on external control systems to analyze sensory data and dictate actions. By incorporating nonlinear feedback mechanisms, soft robots can achieve rhythmic, controlled behaviors without external intervention.
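As a generic illustration of that idea rather than a model of He’s actual materials, the classic Van der Pol oscillator shows how a simple nonlinear feedback rule can settle into a steady, self-sustained rhythm with no external controller driving it:

```python
# Generic illustration of nonlinear feedback producing self-sustained rhythm, using the
# textbook Van der Pol oscillator -- not a model of He's materials. The feedback term
# pushes small motions outward and damps large motions, so the system locks into a
# steady oscillation without any external controller.

mu = 1.0          # strength of the nonlinear feedback
x, v = 0.1, 0.0   # small initial displacement and velocity
dt = 0.01

peaks = []
prev_v = v
for step in range(20000):
    a = mu * (1 - x * x) * v - x    # acceleration depends nonlinearly on the state
    v += a * dt
    x += v * dt
    if prev_v > 0 and v <= 0:       # velocity sign change marks a peak of the rhythm
        peaks.append(x)
    prev_v = v

print(f"oscillation settles with peak amplitude near {peaks[-1]:.2f}")
```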
He’s work has demonstrated the potential for soft materials to self-regulate their movements, a significant advancement toward creating lifelike autonomy in machines. This approach opens up new possibilities for AI systems that can adapt and respond to their environments in more natural and intuitive ways.
By integrating sensing, control, and actuation at the material level, researchers are moving closer to developing machines that can independently decide, adapt, and act, paving the way for a new era of intelligent robotics.
As researchers continue to explore the potential of embodied intelligence and soft robotics, the future of AI appears increasingly promising. These innovations could lead to breakthroughs in fields ranging from medicine to environmental exploration, offering machines that are not only intelligent but also capable of understanding and interacting with the world in new ways. However, questions remain about how these technologies will be integrated into society and the ethical implications of creating machines with lifelike autonomy. As we move forward, how will the intersection of AI and physical embodiment redefine our relationship with technology and the world around us?