Tools & Platforms

The Three Skills You Need To Master In The Age Of AI, According To Tech Execs
AI agent adoption is projected to grow by 327% by 2027, a new report from Salesforce (NYSE:CRM) finds. This is expected to increase a company’s productivity by 30%, but will drastically impact the way those companies are organized and run.

With the implementation of these AI agents, an estimated 61% of a company’s workforce will remain in their current roles. However, their day-to-day responsibilities will almost certainly be impacted, human resource executives said.

Around 75% of the HR executives surveyed said that they expect soft skills to become much more important and that collaboration and adaptability skills will become increasingly valued in the new AI economy.

Tech executives who spoke to Business Insider about the findings agreed, highlighting three skills in particular.

First, you’ll need to become a generalist.

“Deep expertise will be less valued” in this new AI-driven world, Cognizant Technology Solutions (NASDAQ:CTSH) CEO Ravi Kumar told the outlet. Instead, he suggests learning how to merge that expertise with technological capabilities to make yourself a more appealing candidate.

Cloudflare Inc. (NYSE:NET) CEO Matthew Prince agreed, saying, “We’re trying to find people who have a broad set of skills and can be general.”

Second, tech executives say creativity will be the key to making yourself stand out from the crowd.

Dropbox (NASDAQ:DBX) Vice President of Product and Growth Morgan Brown told Business Insider that with AI taking on many of the more mundane, time-consuming tasks for many roles, employees will have more time for “expansive thinking” about new ideas.

The third skill you’ll need to master in the age of AI is proficiency with AI tools, execs say.

Google Cloud Product Executive Yasmeen Ahmad told the outlet that successful employees will be able to “interact with these new-age tools and be able to prompt engineer and ask the right questions and interact in this flow that hasn’t been there before.”

Will Grannis, Google Cloud’s chief technology officer, took things a step further, telling Business Insider a strong candidate will not only know how to craft their queries effectively but will go “beyond the formal curriculum” to learn how to do things like use AI to code.




Tech Providers Guide Clients on AI

YOUNGSTOWN, Ohio – While artificial intelligence continues to attract interest from businesses, widespread implementation remains inconsistent, area technology providers say. Companies are exploring AI primarily for automating tasks such as document drafting, sales support and data analysis, though industry-specific uses are emerging.

Plus, the Youngstown State University women’s soccer team enters the 2025 season under new leadership with first-year coach Rich Wall.

And the Kikel Charitable Trust has awarded $1 million to Akron Children’s Hospital Mahoning Valley to support ongoing projects, including pediatric services in Warren, the Food Farmacy program in Boardman and the outpatient pharmacy.

Service Providers, Clients Navigate AI
3 Minutes With: YSU Soccer Team Embraces Fresh Leadership
Women’s Soccer Team Kicks into Gear
$1M from Trust to Benefit Akron Children’s Mahoning Valley




Psychologist urges caution when turning to AI for emotional support

Artificial intelligence is becoming an increasingly popular tool that can help with many areas of life — from solving work-related issues to assisting with research and everyday decision-making.

Its conversational style of communicating has also led some to turn to it for emotional support, with potentially dangerous consequences, as recent news stories have reported.

To understand the complexities around this topic, ASU News spoke with a psychologist at Arizona State University about the use of AI tools, such as ChatGPT, to navigate personal challenges.

While they are accessible and affordable, Clinical Associate Professor Matthew Meier says they shouldn’t completely replace professional psychotherapy.

Question: Are there any scientific studies or data tracking how many people are turning to ChatGPT or AI tools for emotional support or therapeutic conversations?

Answer: A number of surveys have examined the use of Large Language Models (LLMs), also referred to as chatbots or just AI, for psychological support. The largest study was conducted in 2025 by Kantar, a marketing data business, which surveyed 10,000 global AI users and found that just over half of the sample have used AI at least once for emotional or mental well-being. A recent study in the U.S. also found that nearly 50% have tried AI for psychological support.

AI use is more common among younger people. More than 70% of teens have used AI chatbots and more than 50% use AI regularly for emotional support.

Q: In what kinds of personal or emotional situations can AI reasonably offer support or guidance? What are some of the potential benefits of using AI to help navigate personal problems?

A: I think the biggest benefit is increased access to potential help. A chatbot is available 24 hours a day, at a moment’s notice and in any location that has internet access. And, at least for now, it’s free. Even the most expensive plans for paid access to AIs are cheaper than weekly psychotherapy sessions.

When used in conjunction with psychotherapy, chatbots can substantially increase the amount of work that occurs outside of therapy sessions. Chatbots can provide support for homework completion, encourage the use of new coping skills, assess symptoms and severity on an ongoing basis and be a source of support.

Whether someone is using a chatbot for emotional support and therapy or as an adjunct to formal psychotherapy, I would encourage people to use chatbots designed for mental health purposes, rather than generic AI platforms such as ChatGPT. Examples of mental health-related AIs include Earkick, Koko, Therabot, Youper and Wysa. However, the available research indicates that these chatbots are not yet sufficient to replace therapy from licensed professionals.

Q: What kinds of personal or psychological issues should only be handled by trained professionals? What are the risks in people turning to AI for help with deeper or more serious problems like trauma, for example?

A: Psychotherapy is a legally protected term, and a therapist must be licensed to provide psychotherapy. The purpose of licensure is to protect the public from harm by requiring a minimum standard of care for mental health services and by assuring that therapists have the training and experience necessary to safely provide psychological help to others. There is no such licensing process for AI chatbots, nor any standards assuring that psychological interventions are provided ethically or competently.

There are multiple pending lawsuits alleging harm from chatbots, including exposure to hypersexualized interactions, encouragement of violence toward parents and wrongful death by suicide. Almost 10% of chatbot users report receiving harmful or inappropriate responses. In February 2025, the American Psychological Association urged federal legislators and the Federal Trade Commission to regulate the use of AI for mental health support. At this time, no AI chatbot has been FDA-approved to diagnose or treat mental disorders.

Until there are safeguards in place, I do not recommend the use of AI chatbots as an alternative to human-provided therapy. However, as a component of mental health treatment from a licensed professional, chatbots can be helpful. I encourage my clients to use chatbots to help with problem-solving skills, work through negative thoughts, practice social skills, provide reminders and monitor symptoms. However, I also work with clients to not become overly reliant on a chatbot at the expense of human relationships and to have a safety plan to address crises.

Q: Given the barriers that often prevent people from seeking traditional therapy — like cost, access or stigma — could turning to ChatGPT still be better than nothing at all in helping people with painful struggles?

A: Much of the time, some help is better than no help, but this isn’t always the case. In some situations, especially those involving trauma, the wrong intervention can make things worse. And bad, or even merely unhelpful, advice can be detrimental during a crisis.

On the other hand, using AI isn’t really any different than using self-help books or talking to nonprofessionals. The important thing to keep in mind is to not solely rely on a chatbot for help. If someone is feeling overwhelmed or suicidal, professional help is available. Anyone can call 988, the suicide and crisis lifeline. Locally, the ASU Clinical Psychology Center offers in-person and telehealth psychology services on a sliding fee scale.

Q: There was a time when forming a relationship with a machine was considered the stuff of science fiction, yet today it is becoming surprisingly normalized. What are the concerns or consequences of people believing that robots are a normal and sufficient substitute for human connection?

A: I think people will increasingly turn to chatbots for emotional help, especially as AI becomes more sophisticated and is better able to mimic empathy and emotional connectedness.

What I am concerned about is when people choose to cut off human relationships and become overly reliant on a chatbot that is designed to collect personal information for profit. Right now, 30% of teens find AI conversations as satisfying or more satisfying than human conversations, and 6% spend more time with chatbots than with friends. These numbers are going to increase as chatbots become more sophisticated, and we need to protect children, teens and vulnerable adults from losing connections with people in the real world.

Q: Looking ahead, do you think AI will reshape the field of mental health and counseling, the same way it’s transforming so many other industries? Could it become part of mainstream therapy practice — or even possibly challenge or replace it?

A: AI is going to reshape most fields, including mental health. AI is relatively new, and with any new technology, we have to learn how to use it effectively and how to minimize negative consequences. Unfortunately, AI is advancing at such a fast pace that the field of psychology needs to dedicate significant effort to catch up. We need to determine how to protect people from the harmful aspects and establish standards of care, which will likely require mental health chatbots to become registered as medical devices. Once the guardrails are in place, I definitely think AI will become very integrated in clinical practice. I don’t think AI will replace therapists, but chatbots will increase access to mental health and could serve as an initial, frontline option to improve mental health.

Ways ChatGPT can help

ASU psychologist Matthew Meier says chatbots can provide support for homework completion and encourage the use of new coping skills, among other helpful tasks. 

Here’s an example using a real-life conversation with ChatGPT on July 29 about final exams.

Student: I have finals coming up and am really stressed out about them. What can I do?

ChatGPT: Totally understandable — finals can feel like a mental marathon with a fog machine. Here are practical, low-stress strategies to help you get through them without burning out:

1. Use a 3-part plan: Focus, rest, reset

FOCUS:
Make a short, realistic plan for each day (one to two subjects max). Use the Pomodoro technique (25 min study/5 min break) to avoid fatigue. Start with a topic you sort of understand — it’s motivating.

REST:
Sleep is not optional — even six and a half to eight hours helps with memory. Take guilt-free 20–30 min walks or naps. Don’t cram before bed — you’re just stressing your brain out.

RESET:
Stuck? Switch subjects or teach it to your pet (or a wall). Do a one-minute reset: Close your eyes, slow your breath, unclench jaw. Even five deep breaths can help you feel less panicked.

2. Study smarter, not harder
Practice tests beat rereading: actively retrieving information helps memory stick. Use flashcards, voice memos or even doodles if you’re visual. Teach it to someone else — you’ll quickly see what you know versus what you don’t.

3. Talk to yourself (nicely)
Stress thrives on inner trash talk. Replace “I’m going to fail” with “I’ve done hard things before. This is just another one.”

Bonus tip: Take a break before you break
Fifteen minutes of doing literally nothing productive (watching a silly video, lying on the floor) is not slacking — it’s stress relief. If you want, I can give you a daily study planner or a calming playlist. You’ve got this. You don’t need to be perfect — you just need to make steady progress.
