
AI Insights

A new couple’s experiment with ChatGPT: NPR


One recent evening, my new boyfriend and I found ourselves in a spat.

I accused him of giving in to his anxious thoughts.

“It’s hard to get out of my head,” David said. “Mental spiraling is part of the nature of sensitivity sometimes — there’s emotional overflow from that.”

“Well, spiraling is bad,” said I, a woman who spirals.

Our different communication styles fueled the tense exchange. While I lean practical and direct, he’s contemplative and conceptual.

I felt we could benefit from a mediator. So, I turned to my new relationship consultant, ChatGPT.

AI enters the chat

Almost half of Generation Z uses artificial intelligence for dating advice, more than any other generation, according to a recent nationwide survey by Match Group, which owns the dating apps Tinder and Hinge. Anecdotally, I know women who’ve been consulting AI chatbots about casual and serious relationships alike. They gush over crushes, upload screenshots of long text threads for dissection, gauge long-term compatibility, resolve disagreements and even soundboard their sexts.

Kat, a friend of mine who uses ChatGPT to weed out dating prospects, told me she found it pretty objective. Where emotions might otherwise get in the way, the chatbot helped her uphold her standards.

“I feel like it gives better advice than my friends a lot of the time. And better advice than my therapist did,” said Kat, who asked to go by her first name due to concerns that her use of AI could jeopardize future romantic connections. “With friends, we’re all just walking around with our heads chopped off when it comes to emotional situations.”

When apps are challenging our old ways of finding connection and intimacy, it seems ironic to add another layer of technology to dating. But could Kat be on to something? Maybe a seemingly neutral AI is a smart tool for working out relationship issues, sans human baggage.

For journalistic purposes, I decided to immerse myself in the trend.

Let’s see what ChatGPT has to say about this …

Drawing on the theory that couples should seek therapy before major problems arise, I proposed to my boyfriend of less than six months that we turn to an AI chatbot for advice, assess the bot’s feedback and share the results. David, an artist who’s always up for a good experimental project (no last name for him, either!), agreed to the pitch.

Our first foray into ChatGPT-mediated couples counseling began with a question suggested by the bot to spark discussion about the health of our relationship. Did David have resources to help him manage his stress and anxiety? He did — he was in therapy, exercised and had supportive friends and family. That reference to his anxiety then sent him on a tangent.

He reflected on being a “sensitive artist type.” He felt that women, who might like that in theory, don’t actually want to deal with emotionally sensitive male partners.

“I’m supposed to be unflappable but also emotionally vulnerable,” David said.

He was opening up. But I accused him of spiraling, projecting assumptions and monologuing.

While he was chewing over big ideas, I tried to steer the conversation back to our interpersonal friction. That’s where ChatGPT came in: I recorded our conversation and uploaded the transcript to the bot. And then I posed a question. (Our chats have been heavily edited for brevity — it talks a lot.)

David was incredulous. “It feels like a cliché,” he said.

Deflection, I thought. I turned back to ChatGPT and read on:

It was a damning summary. Was I, as ChatGPT suggested, carrying a burnout level of emotional labor at this early stage in the relationship?

Pushing for objectivity

A human brought me back to reality.

“It might be true that you were doing more emotional labor [in that moment] or at the individual level. But there’s a huge bias,” said Myra Cheng, an AI researcher and computer science Ph.D. student at Stanford University.

The material that large language models (LLMs), such as ChatGPT, Claude and Gemini, are trained on — the internet, mostly — has a “huge American and white and male bias,” she said.

And that means all the cultural tropes and patterns of bias are present, including the stereotype that women disproportionately do the emotional labor in work and relationships.

Cheng was part of a research team that compared two datasets of personal advice: one written by humans responding to real-world situations, the other consisting of judgments made by LLMs in response to posts on Reddit’s AITA (“Am I the A**hole?”) advice forum.

The study found that LLMs consistently exhibit higher rates of sycophancy — excessive agreement with or flattery of the user — than humans do.

For soft-skill matters such as advice, sycophancy in AI chatbots can be especially dangerous, Cheng said, because there’s no certainty about whether its guidance is sensible. In one recent case revealing the perils of a sycophantic bot, a man who was having manic episodes said ChatGPT’s affirmations had prevented him from seeking help.

So, striving for something closer to objectivity in the biased bot, I changed my tack.

There it was again: I was stuck doing the emotional labor. I accused ChatGPT of continuing to lack balance.

“Why do you get ‘clear communication’?” David asked me, as if I chose those words.

At this point, I asked Faith Drew, a licensed marriage and family therapist based in Arizona who has written about the topic, for pointers on how to bring ChatGPT into my relationship.

It’s a classic case of triangulation, according to Drew. Triangulation is a coping strategy in which a third party — a friend, parent or AI, for example — is brought in to ease tension between two people.

There’s value in triangulation, whether the source is a bot or a friend. “AI can be helpful because it does synthesize information really quickly,” Drew said.

But triangulation can go awry when you don’t keep sight of your partner in the equation.

“One person goes out and tries to get answers on their own — ‘I’m going to just talk to AI,'” she said. “But it never forces me back to deal with the issue with the person.”

The bot might not even have the capacity to hold me accountable if I’m not feeding it all the necessary details, she said. Triangulation in this case is valuable, she said, “if we’re asking the right questions to the bot, like: ‘What is my role in the conflict?'”

The breakthrough

In search of neutrality and accountability, I calibrated my chatbot once more. “Use language that doesn’t cast blame,” I commanded. Then I sent it the following text from David:

I feel like you accuse me of not listening before I even have a chance to listen. I’m making myself available and open and vulnerable to you.

“What’s missing on my end?” I asked ChatGPT.

After much flattery, it finally answered:

I found its response simple and revelatory. Plus, it was accurate.

He was picking up a lot of slack in the relationship lately. He made me dinners when work kept me late and set aside his own work to indulge me in long-winded, AI-riddled conversations.

I reflected on a point Drew made — about the importance of putting work into our relationships, especially in the uncomfortable moments, instead of relying on AI.

“Being able to sit in the distress with your partner — that’s real,” she said. “It’s OK to not have the answers. It’s OK to be empathic and not know how to fix things. And I think that’s where relationships are very special — where AI could not ever be a replacement.”

Here’s my takeaway. ChatGPT had a small glimpse into our relationship and its dynamics. Relationships are fluid, and the chatbot can only ever capture a snapshot. I called on AI in moments of tension. I could see how that reflex could fuel our discord, not help mend it. ChatGPT could be hasty to choose sides and often decided too quickly that something was a pattern.

Humans don’t always think and behave in predictable patterns. And chemistry is a big factor in compatibility. If an AI chatbot can’t feel the chemistry between people — sense it, recognize that magical thing that happens in three-dimensional space between two imperfect people — it’s hard to put trust in the machine when it comes to something as important as relationships.

A few times, we both felt that ChatGPT gave objective and creative feedback, offered a valid analysis of our communication styles and defused some disagreements.

But it took a lot of work to get somewhere interesting. In the end, I’d rather invest that time and energy — what ChatGPT might call my emotional labor — into my human relationships.





UAPB librarian leads session on artificial intelligence in STEM fields


University of Arkansas at Pine Bluff librarian Shenise McGhee presented on AI-powered smart tools at the 2025 STEM Librarians South Conference hosted by the University of Texas at Arlington.

This annual conference, held virtually and in person, brings together librarians of science, technology, engineering and math from across the United States and beyond to exchange ideas, strategies and innovations in areas such as library instruction, reference services, collection development and outreach, according to a news release.

As a featured panelist during the virtual portion of the July conference, McGhee presented a session titled “Smart Tools: AI-Powered Pathways to STEM Student Success.”

She explored how advancements in artificial intelligence and machine learning are reshaping education, especially in STEM fields, where data-driven decision-making and adaptive learning are increasingly vital. She emphasized how STEM librarians can harness AI tools to enhance student learning, improve academic performance and promote equity in STEM education.

McGhee examined emerging technologies, including AI tutoring systems, intelligent learning platforms and personalized machine learning applications. She demonstrated how these tools can create inclusive learning environments by adapting instruction to meet individual student needs, delivering real-time feedback, automating instructional tasks and predicting student challenges before they arise.

Her presentation also emphasized the critical role of STEM librarians in supporting the ethical use of AI tools, teaching students how to engage with AI critically and effectively in their coursework, and providing access to the digital resources that empower student success. Attendees were offered practical strategies, case studies and best practices for integrating AI into library services and student support initiatives.

In addition, McGhee spotlighted the UAPB STEM Academy, a five-to-six-week summer residential program designed to prepare incoming STEM majors for the academic rigor of college and life on campus. She discussed how the library collaborates with other campus departments to support students through targeted library instruction and services that contribute to academic success.

“STEM librarians are uniquely positioned to guide students through the evolving AI-driven educational landscape,” McGhee said. “By integrating smart tools and inclusive practices, we not only improve outcomes, but we also empower students to thrive.”

For more information, visit:

John Brown Watson Memorial Library

STEM Academy






Oakland Ballers to use artificial intelligence to manage Saturday home game against Great Falls


OAKLAND, Calif. (AP) — Oakland Ballers manager Aaron Miles will leave it to artificial intelligence to decide when to pinch hit or replace his pitcher.

The playoff-bound Ballers of the independent Pioneer League are turning to AI to manage most aspects of Saturday’s home game against the Great Falls Voyagers at Raimondi Park. So it might feel almost like a day off for the skipper, whose lineup and in-game decisions will even be made for him — from a tablet he will have in the dugout providing instructions.


The starting pitcher is already set.

“Luckily it’s only one game. Maybe we’ve done so well that the AI will just keep doing what we’re doing,” Miles joked Wednesday. “Being a 70-win team, we’ve got a very good bench. It’s hard to write a lineup without leaving somebody out that’s really good. This game I’ll be like, ‘Hey, it’s not on me for not writing you in there, it’s on the computer.’ It won’t be my fault if somebody’s not in the lineup, so I guess I’ll enjoy that.”

Yet Miles knows he still might have to step in with some lineup adjustments, because the human element still matters when it comes to someone who could need rest or take a break because of injury or other circumstances.

Co-founder Paul Freedman said the second-year club will produce the first AI-powered professional sporting event. It happens to be Fan Appreciation Day, too.


Last year, during the Ballers’ inaugural season, they had a game in which fans wrote the lineup and chose the uniforms — but Oakland lost. So the Ballers are doing it differently this time by partnering with AI company Distillery to control almost everything.

“The AI won’t be able to do third-base coaching, we don’t have the technology for that yet,” Freedman said. “The human will be responsible for waving somebody home or throwing up the hand. But those kind of situational decisions, we will look to the machine to make the call.”

Freedman figures that with the Ballers having locked up the top playoff seed, this is a perfect opportunity to give AI a try.

And no need for Miles to be concerned with job security, even with the greater potential for Monday-morning quarterbacking when it comes to his moves.


“The good news is Aaron has won 100 games for us and right now our winning percentage is well over 75%, so I think his job is pretty safe,” Freedman said. “And we’re happy with the decisions he’s made, but we do think it’s cool. One of the fun things about being a sports fan is being able to engage in conversations after the game about the key decisions. So this is a breadcrumb for what we think, if it works well, could be part of a fan experience application: something where after a game we highlight the key decisions our manager made and which ones went against the grain, for right or wrong.”

Miles has already experimented with AI a couple of times, though earlier this season one roster showed up as the 2024 group. He expects the AI might end up making smarter decisions based on real-time data.

“I fooled around with this before just for fun, now it’s for real,” he said, “for one game.”

Ballers catcher Tyler Lozano is open-minded to incorporating new elements into the game to complement the analytics — as long as the treasured traditions aren’t lost.


“It’s immersive, it’s definitely involving new technology, new everything. It’s interesting to see what an AI platform or AI software can do for a baseball team,” Lozano said. “There’s always going to be a human element in the game of baseball. I think in sports period there’s going to be some type of human element because you’re live, you’re there. These AI platforms aren’t watching the game or don’t see all of the intricate moments that happen throughout the game and the human element of the player. I don’t think you’re going to lose that.”

___

AP MLB: https://apnews.com/hub/mlb




