
CEO of company behind Harwood AI data center answers commonly asked questions



HARWOOD, N.D. — As a Texas company prepares to break ground this month on a $3 billion artificial intelligence data center north of Fargo, readers have asked several questions about the facility.

The Forum spoke this week with Applied Digital Chairman and CEO Wes Cummins about the 280-megawatt facility planned for a site east of Interstate 29 between Harwood, North Dakota, and Fargo. The 160-acre center will sit on 925 acres near the Fargo Park District’s North Softball Complex.

The Harwood City Council voted unanimously on Wednesday, Sept. 10, to rezone the land for the center from agricultural to light industrial. With the vote also came final approval of the building permit for the center, meaning Applied Digital can break ground on the facility this month.

“We’re grateful for the City of Harwood’s support and look forward to continuing a strong partnership with the community as this project moves ahead,” Cummins said after the vote.

Applied Digital CEO and Chairman Wes Cummins talks about his company and its plans for Harwood, North Dakota, during a meeting on Tuesday, Sept. 2, 2025, at the Harwood Community Center.

Alyssa Goelzer / The Forum

Applied Digital plans to start construction this month and open partially by the end of 2026. The facility should be fully operational by early 2027, the company said.

The project should create 700 construction jobs while the facility is built, Applied Digital said. The center will need more than 200 full-time employees to operate, the company said. The facility is expected to generate tax revenue and economic growth for the area, but those estimates have not been disclosed.

The facility has generated questions and protest.

Here are some questions readers had about the facility.

What will the AI data center be used for?

Applied Digital said it develops facilities that provide “high-performance data centers and colocation solutions for artificial intelligence, cloud, networking, and blockchain industries.” Facilities like this one run the applications that make AI tools work, Cummins said.

“ChatGPT runs in a facility like this,” he said. “There’s just enormous amounts of servers that can run GPUs (graphics processing units) inside of the facility and can either be doing training, which is making the product, or inference, which is what happens when people use the product.”


Applied Digital’s $3 billion data center will be constructed just southeast of the town of Harwood, North Dakota.

Map by The Forum

Applied Digital hasn’t announced what tenants would use Polaris Forge 2, the name for the Harwood facility. At a Harwood City Council meeting, Cummins said the company markets to companies in the U.S. like Google, Meta, Amazon and Microsoft.

“The demand for AI capacity continues to accelerate, and North Dakota continues to be one of the most strategic locations in the country to meet that need,” he said. “We have strong interest from multiple parties and are in advanced negotiations with a U.S.-based investment-grade hyperscaler for this campus, making it both timely and prudent to proceed with groundbreaking and site development.”

AI data centers need significant amounts of electricity to operate, Cummins said. Other centers have traditionally been built near heavily populated areas, but that isn’t necessary, he said.

North Dakota produces more energy than it uses and exports the surplus out of state, Cummins said. The Fargo area also has the electrical grid in place to connect to that energy, he said.

“A lot of North Dakotans, especially the leaders of North Dakota, want to better utilize the energy produced by North Dakota for economic benefit inside of the state versus exporting it to neighboring states or to Canada,” he said.

North Dakota’s cold climate for much of the year will also keep the center cooler than it would be in states like Texas, meaning the facility will use significantly less power for cooling than it would in warmer states, Cummins said.

“We get much more efficiency out of the facility,” he said. “Those aspects make North Dakota, in my opinion, an ideal place for this type of AI infrastructure.”

A foreground of a leafy crop stretches toward the horizon, where metal grain elevators and metal storage buildings stand against a gray sky.

The Harwood, North Dakota, elevator on Thursday, Aug. 28, 2025, looms behind the land designated for the construction of Applied Digital’s 280-megawatt data center.

David Samson / The Forum

How much water will the center use?

Cummins acknowledged other AI data centers around the world use millions of gallons of water a day. Applied Digital designed a closed-loop system so the North Dakota centers use as little water as possible, Cummins said.

He compared the cooling system to a car radiator. A glycol coolant will circulate through the facilities and servers, Cummins said. After cooling the equipment, the liquid passes through chillers, much like a heat pump outside a house. Once cooled, the liquid recirculates in a continuous loop, he said.

People who operate the facility will use water for bathroom breaks and drinking, much like a person in a house or a car, he said.

“The data center, even with the immense size, we expect it to use the same amount of water as roughly a single household,” he said. “The reason is the people inside.”


Duncan Alexander and dog Valka protest a proposed AI data center before a Planning and Zoning meeting on Tuesday, Sept. 2, 2025, in Harwood, North Dakota.

Alyssa Goelzer / The Forum

Will the AI center increase electricity rates?

Applied Digital claims that electricity rates will not go up for local residents because of the data center.

“Data centers pay a large share of fixed utility costs, which helps spread expenses across more users,” the company said.

Applied Digital’s center in Ellendale, North Dakota, much like the one to be built in Harwood, uses power produced in the state, Cummins said. The Ellendale center, which draws about 200 megawatts, saved ratepayers $5.3 million in 2023 and $5.7 million last year, he said.

“Utilizing the infrastructure more efficiently can actually drive rates down,” Cummins said, adding he expects rate savings for Harwood as well.

How much noise will the center make?

Applied Digital’s concrete walls should contain the noise from computers, Cummins said. What residents will hear is fan noise from the heat pumps used to cool the facility, he said.

“It will sound like the one that runs outside of your house,” he said, describing the minimal noise he expects the facility to create.

The loudest noise will be construction of the facility, Cummins said.

The facility will cover only 160 acres, but Applied Digital is buying 925 acres of land, with the rest of the space serving as a sound buffer, he said. People who live nearby may hear some sound, he acknowledged.

“If you’re a half mile or more from the facility, you will very unlikely hear anything,” he said.

About 300 people showed up to a town hall meeting on Monday, Aug. 25, 2025, at the Harwood Community Center to listen and to discuss a new AI data center that is planned to be built in Harwood, North Dakota.

Chris Flynn / The Forum

Has Applied Digital conducted an environmental study?

The facility won’t create emissions or other hazards that would require an environmental impact study, Cummins said.

Why move so fast to approve the facility?

Some have criticized Applied Digital and the Harwood City Council for pushing the approval process so quickly. Applied Digital announced the project in mid-August, and the city approved it in less than a month.

Cummins acknowledged that concern but noted the industry is moving fast. The U.S. is competing with China to create artificial intelligence, an industry that is not going away, Cummins said.

“I do believe we are in a race in the world for super intelligence,” he said. “It’s a race amongst companies in the U.S., but it’s also a race against other countries. … I do think it’s very important the U.S. win this AI race to super intelligence and then to artificial general intelligence.”

Applied Digital said it wanted to finish foundation and grading work on the project before winter sets in, meaning it needed an expedited approval timeline.

People in Harwood have shown overwhelming support, Cummins said, adding that protesters mostly came from other cities.

“I can’t think of a project that would spend this amount of money and have this kind of economic benefit for a community and a county and a state and have this low of a negative impact,” he said. “I think these types of projects are fantastic for these types of communities.”






AI slop is on the rise — what does it mean for how we use the internet?



You’ve probably encountered images in your social media feeds that look like a cross between photographs and computer-generated graphics. Some are fantastical — think Shrimp Jesus — and some are believable at a quick glance — remember the little girl clutching a puppy in a boat during a flood?

These are examples of AI slop, low- to mid-quality content — video, images, audio, text or a mix — created with AI tools, often with little regard for accuracy. It’s fast, easy and inexpensive to make this content. AI slop producers typically place it on social media to exploit the economics of attention on the internet, displacing higher-quality material that could be more helpful.




How to talk to your teen about AI: NPR



Parents should broach the AI conversation with their children when they are elementary school-age, before they encounter AI through their friends at school or in other spaces, says Marc Watkins, a lecturer at the University of Mississippi who researches AI and its impact on education.

Eva Redamonti for NPR



Nicholas Munkbhatter started using ChatGPT shortly after the artificial intelligence chatbot was released in late 2022. He was 14 at the time, and he says, “I would use it for almost everything, like math problems.”

At first, Munkbhatter, who is from Sacramento, Calif., thought it was amazing. But then, he says, he started to see downsides: “I realized it was just giving me an answer without helping me go through the actual process of learning.”

Many kids and teens use ChatGPT and other generative artificial intelligence models like Claude or Google Gemini for everything from dealing with math homework to coping with a mental health crisis, often with little to no guidance from adults. Education and child development experts say parents must take the lead in helping children understand this new technology. 

“Having conversations now about what is ethical, responsible usage of AI is important, and you need to be a part of that if you are a parent,” says Marc Watkins, a lecturer at the University of Mississippi who researches AI and its impact on education.

While early evidence suggests the technology could bolster student learning if deployed correctly, ongoing research and stories about teenagers who died by suicide after talking to AI chatbots indicate significant risks to young users.

Experts share advice on how to talk to kids about AI, including its potential benefits and harms.

Start the conversation early 

Broach the conversation when children are elementary-school age, Watkins says, before they encounter AI through their friends at school or in other spaces.

To guide these discussions, Watkins says to budget time each week to learn about AI and try the tools for yourself. That might mean listening to a podcast, reading a newsletter or experimenting with platforms like ChatGPT.

To explain how AI works to your kids, Watkins recommends playing a Google game called Quick, Draw! Players receive a drawing prompt, and the game’s neural network tries to guess what you’re drawing by recognizing patterns in doodles from thousands of other players.

Watkins says it’s a way to show kids that AI is only as good as the data it’s trained on. It mimics how humans write and create content, but it doesn’t think or understand things the way people do.

Use AI together 

Since the technology is still evolving, parents are often learning about it alongside their children. Ying Xu, an assistant professor at the Harvard Graduate School of Education, who researches AI, says parents can use this as an opportunity to explore it together.

For example, the next time your child asks you a question, type it into an AI chatbot and discuss the response, Xu says. “Is it helpful? What felt off? How do you think this response was generated?”

Parents should also reinforce that AI can make mistakes. Xu says parents can teach kids to fact-check information that AI chatbots provide by using other sources.

Explore its possibilities 

If your kid is using AI for homework help, keep an open mind.

Research has shown that some AI tools can have a positive impact on learning. Xu worked with PBS Kids to design interactive, AI-powered digital versions of popular kids shows. She found that children who watched the AI versions were more engaged and learned more compared with children who watched the traditional broadcast version of the show.

Meanwhile, Munkbhatter, the teenager from Sacramento, says AI has been a helpful learning aid and brainstorm partner — so long as he doesn’t use it to do all the work for him.

Now, if he gets stuck on a challenging math problem, he says he asks ChatGPT: “What’s the first step I should take when looking at a problem like this? How should I think about it?”

Munkbhatter also says he provides his class notes to ChatGPT and asks it to quiz him on the subject matter. “I make sure that it only gives me the question itself rather than the question and the answer at the same time.”

Understand the risks

We don’t yet know how generative AI will impact child development in the long term, but there are some present dangers.

Dr. Darja Djordjevic, a faculty fellow at Stanford University’s Brainstorm: The Stanford Lab for Mental Health Innovation, is working with the group Common Sense Media to study how popular AI models respond to users who show symptoms of psychiatric disorders that affect teens. The research hasn’t been released yet, but Djordjevic shared some of her findings with NPR.

“What we found was that the AI chatbots could provide good general mental health information, but they demonstrated concerning gaps in recognizing serious conditions,” Djordjevic says.

At times, she says, AI chatbots provided unsafe responses to questions and statements about self-harm, substance use, body image or eating disorders and risk-taking behaviors. She says they also generated sexually explicit content.

NPR reached out to OpenAI, the company behind ChatGPT, about these concerns. We were directed to a recent post on the company’s website that says OpenAI is “continuing to improve how our models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input.”

The post says ChatGPT is also trained to direct users expressing suicidal intent to professional help.

Warning signs a child is spending too much time with AI include increased time alone with devices or talking about an AI chatbot as if it were a real friend.

“That’s a warning sign that the conversation about these being AI tools and not people needs to be nurtured again,” Djordjevic says.

Set reasonable household rules around AI 

You may be wondering how to enforce these boundaries at home. Experts share their tips.

Co-write the AI rules with your kids, Djordjevic says. Identify safe uses of AI together — like for homework help with a parent’s supervision or as a creative outlet — and limit the amount of time your child uses it. And check in regularly on how the use of AI is making your child feel.

Don’t prohibit your kids from using AI — but do set limits. “Bans don’t generally work, especially with teens,” Watkins says. “What works is having conversations with them, putting clear guidelines and structure around these things and understanding the do’s and don’ts.” Parents should feel empowered to ban clearly dangerous uses, like if a child is harming themselves and an AI chatbot encourages the behavior, Djordjevic says. 

Make time for real life. Prioritize time spent outside with real people, away from devices, Djordjevic says. That could include joining a sports team and scheduling regular family activities.

Trust that your conversations will make a difference. As overwhelmed as parents might feel navigating AI, Watkins emphasizes that taking time to talk with kids can have real impact: “They’re not going to remember an ad from an AI chatbot. They’re going to remember a conversation you had with them. And that gives you a lot of agency, a lot of power in this.”

This episode of Life Kit was produced by Clare Marie Schneider. It was edited by Malaka Gharib. The visual editor is Beck Harlan.

Want more Life Kit? Subscribe to our weekly newsletter and get expert advice on topics like money, relationships, health and more. Click here to subscribe now.


