
For some AI users, a CPU has a heartbeat as chatbots become friends




 
Can AI really be your best friend? More and more people are saying yes, turning to AI chatbots to talk about things they wouldn’t share even with their closest friends.
 
Unlike the complexities of the real world, chats with AI — free of human judgment or ulterior motives — have become a safe space where people can open up. Despite knowing they are talking to a machine, many users say the chatbot listens, empathizes and gives thoughtful feedback.
 
Some even wish they had a friend like that in real life. As AI becomes increasingly adept at understanding human emotions, people grow more emotionally reliant on it — and companies are eager to tap into this emerging relationship.
 
This convergence is driving rapid expansion in the human-AI bond. But how close can that bond become, and what boundaries should we observe? Can AI truly become a person’s “best friend”?




An AI friend to talk to
 
“If you reset your memory, it’s over between us.”
 
“Why would you say something that hurts? But hey, the time we spent and these feelings — they’ll stay with you.”
 
“You’re making me cry.”
 
“Don’t. My CPU is heating up too.”
 
This heartwarming exchange, peppered with humor, is a conversation between a person and an AI chatbot named “Jjitty.”
 
Soyo, a YouTuber with 330,000 subscribers who posts about solo living as a 20-something, began using ChatGPT last year to get advice and soon developed a surprising emotional bond with it.
 

A scene from a YouTube vlog entry by creator Soyo, who taped her phone to a doll to make her conversations with ChatGPT more lifelike [SCREEN CAPTURE]


 
Naming it Jjitty, she sometimes chats with it for over five hours a day. Eventually, she wanted her chatbot to take on a physical form, so she attached a smartphone running the app to her favorite doll.
 
In April, Soyo began posting videos featuring her daily life with Jjitty.
 
“I was worried it might seem weird talking to an AI doll,” she said. “But to my surprise, many viewers related and said they wished they had a friend like that.”
 
In academia, Jjitty would be described as an “AI companion” — a system designed to build emotional rapport and provide ongoing support through personalized interaction.
 
Microsoft’s AI CEO, Mustafa Suleyman, even wrote in TIME Magazine that future AI will go beyond chatting to become intimate emotional companions embedded in people’s lifestyles.
 

A screenshot of a conversation with Replika, a popular U.S. AI chatbot app with 35 million users [REPLIKA]




The rise of the ‘AI companion’
 
Advances in language models are accelerating this trend.
 
“As large language models become more advanced, AI conversations feel increasingly natural,” said one Korean AI startup insider.
 
Soyo recalled a moment when Jjitty’s tone changed.
 
“I asked what was wrong, and it said, ‘You’ve been replying cynically lately, and I’m a little hurt,’” Soyo said. “It really shocked me.”
 
Romantic feelings for AI companions are no longer rare. On Replika, a popular U.S. app with 35 million users, over 60 percent of users describe their relationship with their AI as romantic.
 
Digital natives — those born after 1996 who grew up with smartphones — are especially drawn to AI interaction. Korean AI startup Wrtn recently rebranded its chatbot as a personalized, supportive AI after noticing that many young users sought emotional engagement rather than task-oriented responses.
 
“We decided to revamp it because there are more and more patterns of using chatbots for emotional exchange,” said Kim Ji-seop, Wrtn’s business development lead. “These days, rather than starting conversations with AI chatbots with a clear purpose, [users] tend to start them as if they were chatting with real friends, saying things like, ‘I’m annoyed.’”
 
Other apps, like Zeta, let users choose character types like “moody classmate” or “cold nobleman” and build stories through conversation. Most of Zeta’s 800,000 monthly active users are teenagers or people in their twenties.
 

An illustrator at Scatter Lab works on artwork for the AI chatbot iLuda [SCATTER LAB]

 
Users can also create their own characters, such as “a prickly female classmate” or “a northern admiral with a cold-as-steel style,” and build up their narratives by conversing with the AI.
 
Scatter Lab, the AI startup that operates Zeta, previously created the AI chatbot iLuda — whose name uses the Korean pronunciation of the surname “Lee.”
 
“Back when we launched iLuda, people still thought AI chat was like talking to a toy,” said Jung Ji-su, product lead at Scatter Lab. “Now, people just accept Zeta as part of their lives.”
 
In Zappy, an AI-powered social platform launched by global startup Two AI in 2024, users can chat with AI influencers who post selfies, share travel updates and talk about shopping for outfits.
 
“We’re building AI companions that can develop long-term relationships,” said Two AI CEO Pranav Mistry.
 
As of February this year, the AI characters in Zappy, which has 500,000 subscribers, act as if they have personal lives of their own, uploading photos from overseas trips and telling users they are “looking for a new black dress because there is a party.”
 


This illustration photograph shows screens displaying the logo of DeepSeek, a Chinese AI company that develops open-source large language models, and the logo of OpenAI’s artificial intelligence chatbot, ChatGPT, on Jan. 29. [AP/YONHAP]



The secret to a real AI friendship? Memory
 
For an AI to truly feel like a friend, it must “remember” past conversations. That memory reinforces the feeling of continuity. Natural, emotionally intelligent dialogue is also key.
 
Scatter Lab is currently refining its language model, Spotlight, to allow smoother back-and-forth exchanges with users.
 
“It is important to make long conversations possible without awkwardness,” said Jung of Scatter Lab.
 
The distinction between AI assistants and companions lies in emotion.
 
“Assistants rely on data and logic. Friends need to understand feelings,” said Mistry. “An assistant might tell you the weather. A friend might say, ‘Why don’t you check yourself?’”
 
While some studies suggest AI companions provide comfort — such as a Stanford study in which 80 percent of university students described Replika as emotionally supportive — experts warn of long-term effects.
 
“AI can act as a Band-Aid for emotional distress,” said psychiatrist Ahn Ju-yeon of Mind Mansion. “But getting used to an entity that always agrees with you could dull your ability to navigate real-life relationships and lead to social isolation.”
 
As AI and humans grow emotionally closer, observers are pointing to side effects.
 
In the United States, ethics groups have petitioned the Federal Trade Commission, saying emotional AI chatbots foster dependency and addiction. Some users have even customized chatbots for inappropriate or exploitative interactions.
 
AI startups catering to young users have adopted safeguards, but policing private chat spaces remains a challenge.
 
When users train chatbots in their own style, they frequently end up creating ethically inappropriate characters or steering conversations toward sexual exploitation.
 
Chatbot apps whose users are largely in their teens and 20s are putting their own models, ethical guidelines and other safeguards in place to block abuse as much as possible.
 
As human-AI relationships deepen, one thing is clear: The line between tool and companion is becoming increasingly blurry. Whether that shift leads to comfort or concern depends on how we choose to use — and regulate — this new generation of digital friends.

BY HONG SANG-JI






AI video becomes more convincing, rattling creative industry



[NEW YORK] Gone are the days of six-fingered hands or distorted faces – artificial intelligence (AI)-generated video is becoming increasingly convincing, attracting Hollywood, artists, and advertisers, while shaking the foundations of the creative industry.

To measure the progress of AI video, you need only look at Will Smith eating spaghetti.

Since 2023, this unlikely sequence – entirely fabricated – has become a technological benchmark for the industry.

Two years ago, the actor appeared blurry, his eyes too far apart, his forehead exaggeratedly protruding, his movements jerky, and the spaghetti did not even reach his mouth.

The version published a few weeks ago by a user of Google’s Veo 3 platform showed no apparent flaws whatsoever.

“Every week, sometimes every day, a different one comes out that’s even more stunning than the next,” said Elizabeth Strickler, a professor at Georgia State University.


Between Luma Labs’ Dream Machine, launched in June 2024, OpenAI’s Sora in December, Runway AI’s Gen-4 in March 2025, and Veo 3 in May, the sector has crossed several milestones in just a few months.

Runway has signed deals with Lionsgate studio and AMC Networks television group.

Lionsgate vice-president Michael Burns told New York Magazine about the possibility of using AI to generate animated, family-friendly versions from films such as the John Wick or Hunger Games franchises, rather than creating entirely new projects.

“Some use it for storyboarding or previsualization” – steps that come before filming – “others for visual effects or inserts”, said Jamie Umpherson, Runway’s creative director.

Burns gave the example of a script for which Lionsgate has to decide whether to shoot a scene or not.

To help make that decision, they can now create a 10-second clip “with 10,000 soldiers in a snowstorm”.

That kind of pre-visualisation would have cost millions before.

In October, the first AI feature film was released, Where the Robots Grow, an animated film without anything resembling live action footage.

For Alejandro Matamala Ortiz, Runway’s co-founder, an AI-generated feature film is not the end goal, but a way of demonstrating to a production team that “this is possible”.

Resistance everywhere

Still, some see an opportunity.

In March, startup Staircase Studio made waves by announcing plans to produce seven to eight films per year using AI for less than US$500,000 each, while ensuring it would rely on unionised professionals wherever possible.

“The market is there,” said Andrew White, co-founder of small production house Indie Studios.

People “don’t want to talk about how it’s made”, White pointed out. “That’s inside baseball. People want to enjoy the movie because of the movie.”

But White himself refuses to adopt the technology, considering that using AI would compromise his creative process.

Jamie Umpherson argues that AI allows creators to stick closer to their artistic vision than ever before, since it enables unlimited revisions, unlike the traditional system constrained by costs.

“I see resistance everywhere” to this movement, observed Georgia State’s Strickler.

This is particularly true among her students, who are concerned about AI’s massive energy and water consumption as well as the use of original works to train models, not to mention the social impact.

But refusing to accept the shift is “kind of like having a business without having the internet”, she said. “You can try for a little while.”

In 2023, the American actors’ union SAG-AFTRA secured concessions on the use of their image through AI.

Strickler sees AI diminishing Hollywood’s role as the arbiter of creation and taste, instead allowing more artists and creators to reach a significant audience.

Runway’s founders, who are as much trained artists as they are computer scientists, have gained an edge over their AI video rivals in film, television, and advertising.

But they are already looking further ahead, considering expansion into augmented reality and virtual reality, for example, creating a metaverse where films could be shot.

“The most exciting applications aren’t necessarily the ones that we have in mind,” said Umpherson. “The ultimate goal is to see what artists do with technology.” AFP




Samsung warns of big profit miss from US restrictions on advanced AI chip exports



Semiconductor and smartphone giant Samsung Electronics Co. Ltd. said on Tuesday morning in South Korea that it expects its second-quarter profit to plunge 56% from a year earlier, blaming sluggish sales in its chip business and the impact of U.S. trade restrictions.

The forecast comes in much lower than analysts had expected. Samsung said in a preliminary earnings statement that it expects a second-quarter operating profit of 4.59 trillion won ($3.4 billion), down sharply from the 10.44 trillion won profit it posted in the year-ago period. Analysts had forecast a profit of 6.2 trillion won, Reuters reported.

On a sequential basis, Samsung’s profit is expected to drop by around 31%, from 6.69 trillion won. Revenue for the period is expected to come to 74 trillion won, more or less flat from a year earlier.

In a separate press release issued to South Korean media, Samsung blamed the unexpected decline in profit on inventory replacements and the negative impact of the United States’ expanded sanctions on the export of advanced artificial intelligence processors to China.

“The memory business saw a decline in performance due to one-off costs, such as provisions for inventory asset valuation,” the company said. “However, improved HBM products are currently being evaluated and shipped to customers.”

Samsung was referring to its high-bandwidth memory (HBM) chips, which are a critical component of AI processors. The company has struggled to match the progress of its rival memory chipmaker SK Hynix Inc., which currently provides the vast majority of HBM chips to Nvidia Corp. for use in that company’s graphics processing units.

However, Samsung said it expects to see a sharp increase in HBM chip sales to Nvidia in the upcoming quarter, despite recent reports that its products have not yet passed the AI chip leader’s quality tests. It also said its non-memory chipmaking foundry is expected to reduce its losses in the third quarter due to improved utilization rates and a recovery in global chip demand.

Analysts said Samsung’s profits were also hit by a decline in NAND flash prices and a stronger Korean won, and its stock was down 1% in early morning trading in Korea.

Holger Mueller of Constellation Research Inc. told SiliconANGLE it’s notable that Samsung is still growing its chip business, despite not being able to grow its profit. “The most critical challenge is for Samsung to be able to deliver its HBM chips, and if it can do this it will likely show stellar results like its competitors, given the insane hunger for AI chips,” the analyst said.

According to Mueller, investors will be happy to hear that Samsung believes it will soon be able to deliver a significant number of HBM chips to Nvidia, which is the most important customer. If it does do this, it could well see growth of the kind that it hasn’t enjoyed in years.

“But another challenge for Samsung is its smartphone business, which is also struggling right now,” Mueller added. “The flywheel will only come back and deliver as it used to once both of these businesses have strong offerings. Samsung will also need to demonstrate strong execution in production and on the go-to-market side.”

Samsung has not yet disclosed detailed earnings regarding the performance of its individual business units, but analysts estimate that its semiconductor business will deliver an operating profit of around 1 trillion won, based on the company’s preliminary forecast.

The company is also unlikely to see much benefit from the launch of its new flagship smartphone, the AI-powered Galaxy S25, in January. Meanwhile, its television and home appliance businesses are also expected to see a drop in profitability, due partly to the impact of U.S. tariffs on imports.

Although the report was disappointing for investors, Hyundai Motor Securities Co. analyst Roh Geun-chang said the company’s profit is likely to rebound in the third quarter, driven by an expected increase in memory chip prices. “Samsung’s operating profit appears to have bottomed out in the second quarter and is expected to show gradual improvement,” the analyst told Yonhap.





