AI Insights

University Hospitals studying use of Artificial Intelligence to improve lung cancer outcomes


CLEVELAND, Ohio (WOIO) – There are more lung cancer deaths in the United States than breast cancer, colon cancer, and prostate cancer combined.

The problem is that most lung cancers are caught late, at stage three or four, because early tumors can be so difficult for radiologists to detect.

But University Hospitals is conducting a study to see if artificial intelligence can help catch it much earlier.

“We needed a way to find lung cancer early. It’s like finding a needle in a haystack. And that’s where A.I. comes in,” said Dr. Samir Shah, Chief Medical Officer at Qure.ai.

Lung cancer nodules in their early stages can be nearly impossible for radiologists to see. Usually, lung cancer goes undiagnosed until patients start feeling symptoms.

“When they are having cough, chest pain, or blood coming out of the sputum, unfortunately, that stage is stage three or four, and at that time the survival rate is, whatever you do, in single digits,” said radiologist Dr. Amit Gupta.

He and his team at University Hospitals are now studying the use of A.I. to see if they can reach a diagnosis sooner.

“In my initial experience, it can find some of the nodules which are super hidden. And on a busy day, even a trained cardiothoracic radiologist may overlook them,” he said.

Qure.ai acts as another set of eyes in the room for radiologists.

Dr. Shah says their algorithm was trained on large amounts of data gathered over nine years of studying tuberculosis x-rays.

“Those x-rays could be used not only to look at TB, but we started looking at lung nodules as well. And we looked at 5 million of those. That’s more than I would ever read, and maybe 10-fold more than I would ever read in my entire lifetime,” said Dr. Shah.

Through this vast amount of data, the algorithm learned what was and wasn’t a nodule.

Dr. Shah says they’re hopeful this application could result in a stage shift, catching lung cancer in stage one or two, and boosting survival rates to 60-70 percent.

“When it starts small and has not spread, and it can be removed, you’re talking about curative situation, which makes patients happy. But you know, that’s what the goal is for every oncologist. That’s what the goal is for every thoracic surgeon who’s involved, every pulmonologist. That’s what the aim is,” said Dr. Shah.

Qure.ai tackled lung cancer first because of the difficulty of detection. The company says the technology also holds promise for other hard-to-detect diseases, such as breast cancer.

The University Hospitals radiology team on this study expects data collection to continue for another nine to 10 months. If U.H. does move forward, they estimate the tool could be implemented a little more than a year from now.




Intro robotics students build AI-powered robot dogs from scratch


Equipped with a starter robot hardware kit and cutting-edge lessons in artificial intelligence, students in CS 123: A Hands-On Introduction to Building AI-Enabled Robots are mastering the full spectrum of robotics – from motor control to machine learning. Now in its third year, the course has students build and enhance an adorable quadruped robot, Pupper, programming it to walk, navigate, respond to human commands, and perform a specialized task that they showcase in their final presentations.

The course, which evolved from an independent study project led by Stanford’s robotics club, is now taught by Karen Liu, professor of computer science in the School of Engineering, in addition to Jie Tan from Google DeepMind and Stuart Bowers from Apple and Hands-On Robotics. Throughout the 10-week course, students delve into core robotics concepts, such as movement and motor control, while connecting them to advanced AI topics.

“We believe that the best way to help and inspire students to become robotics experts is to have them build a robot from scratch,” Liu said. “That’s why we use this specific quadruped design. It’s the perfect introductory platform for beginners to dive into robotics, yet powerful enough to support the development of cutting-edge AI algorithms.”

What makes the course especially approachable is its low barrier to entry – students need only basic programming skills to get started. From there, the students build up the knowledge and confidence to tackle complex robotics and AI challenges.

Robot creation goes mainstream

Pupper evolved from Doggo, built by the Stanford Student Robotics club to offer people a way to create and design a four-legged robot on a budget. When the team saw the cute quadruped’s potential to make robotics both approachable and fun, they pitched the idea to Bowers, hoping to turn their passion project into a hands-on course for future roboticists.

“We wanted students who were still early enough in their education to explore and experience what we felt like the future of AI robotics was going to be,” Bowers said.

This current version of Pupper is more powerful and refined than its predecessors. It’s also irresistibly adorable and easier than ever for students to build and interact with.

“We’ve come a long way in making the hardware better and more capable,” said Ankush Kundan Dhawan, one of the first students to take the Pupper course in the fall of 2021 before becoming its head teaching assistant. “What really stuck with me was the passion that instructors had to help students get hands-on with real robots. That kind of dedication is very powerful.”

Code come to life

Building a Pupper from a starter hardware kit blends different types of engineering, including electrical work, hardware construction, coding, and machine learning. Some students even produced custom parts for their final Pupper projects. The course pairs weekly lectures with hands-on labs. Lab titles like Wiggle Your Big Toe and Do What I Say keep things playful while building real skills.

CS 123 students ready to show off their Pupper’s tricks. | Harry Gregory

Over the initial five weeks, students are taught the basics of robotics, including how motors work and how robots can move. In the next phase of the course, students add a layer of sophistication with AI. Using neural networks to improve how the robot walks, sees, and responds to the environment, they get a glimpse of state-of-the-art robotics in action. Many students also use AI in other ways for their final projects.

“We want them to actually train a neural network and control it,” Bowers said. “We want to see this code come to life.”

By the end of the quarter this spring, students were ready for their capstone project, called the “Dog and Pony Show,” where guests from NVIDIA and Google were present. Six teams had Pupper perform creative tasks – including navigating a maze and fighting a (pretend) fire with a water pick – surrounded by the best minds in the industry.

“At this point, students know all the essential foundations – locomotion, computer vision, language – and they can start combining them and developing state-of-the-art physical intelligence on Pupper,” Liu said.

“This course gives them an overview of all the key pieces,” said Tan. “By the end of the quarter, the Pupper that each student team builds and programs from scratch mirrors the technology used by cutting-edge research labs and industry teams today.”

All ready for the robotics boom

The instructors believe the field of AI robotics is still gaining momentum, and they’ve made sure the course stays current by integrating new lessons and technology advances nearly every quarter.


This Pupper was mounted with a small water jet to put out a pretend fire. | Harry Gregory

Students have responded to the course with resounding enthusiasm and the instructors expect interest in robotics – at Stanford and in general – will continue to grow. They hope to be able to expand the course, and that the community they’ve fostered through CS 123 can contribute to this engaging and important discipline.

“The hope is that many CS 123 students will be inspired to become future innovators and leaders in this exciting, ever-changing field,” said Tan.

“We strongly believe that now is the time to make the integration of AI and robotics accessible to more students,” Bowers said. “And that effort starts here at Stanford and we hope to see it grow beyond campus, too.”




5 Ways CFOs Can Upskill Their Staff in AI to Stay Competitive


Chief financial officers are recognizing the need to upskill their workforce to ensure their teams can effectively harness artificial intelligence (AI).

According to a June 2025 PYMNTS Intelligence report, “The Agentic Trust Gap: Enterprise CFOs Push Pause on Agentic AI,” all the CFOs surveyed said generative AI has increased the need for more analytically skilled workers. That’s up from 60% in March 2024.

“The shift in the past year reflects growing hands-on use and a rising urgency to close capability gaps,” according to the report.

The CFOs also said the overall mix of skills required across the business has changed. They need people who have AI-ready skills: “CFOs increasingly need talent that can evaluate, interpret and act on machine-generated output,” the report said.

The CFO role itself is changing. According to The CFO, 27% of job listings for chief financial officers now call for AI expertise.

Notably, the upskill challenge is not limited to IT. The need for upskilling in AI affects all departments, including finance, operations and compliance. By taking a proactive approach to skill development, CFOs can position their teams to work alongside AI rather than compete with it.

The goal is to cultivate professionals who can critically assess AI output, manage risks, and use the tools to generate business value.

Among CEOs, the impact is just as pronounced. According to a Cisco study, 74% fear that gaps in knowledge will hinder decisions in the boardroom, and 58% fear they will stifle growth.

Moreover, 73% of CEOs fear losing ground to rivals because of IT knowledge or infrastructure gaps. One of the barriers holding back CEOs is a shortage of skills.

Their game plan: investing in knowledge and skills, upgrading infrastructure and enhancing security.

Here are some ways companies can upskill their workforce for AI:

Ensure Buy-in by the C-Suite

  • With leadership from the top, AI learning initiatives will be prioritized instead of falling by the wayside.
  • Allay any employee concerns about artificial intelligence replacing them so they will embrace the use and management of AI.

Build AI Literacy Across the Company

  • Invest in AI training programs: Offer structured training tailored to finance to help staff understand both the capabilities and limitations of AI models, according to CFO.university.
  • Promote AI fluency: Focus on both technical skills, such as how to use AI tools, and conceptual fluency of AI, such as understanding where AI can add value and its ethical implications, according to the CFO’s AI Survival Guide.
  • Create AI champions: Identify and develop ‘AI champions’ within the team who can bridge the gap between finance and technology, driving adoption and supporting peers, according to Upflow.

Integrate AI Into Everyday Workflows

  • Start with small, focused projects such as expense management to demonstrate value and build confidence.
  • Foster a culture where staff can explore AI tools, automate repetitive tasks, and share learnings openly.

Encourage Continuous Learning

  • Make learning about AI a continuous process, not a one-time event, and encourage staff to stay updated on AI trends and tools relevant to finance.
  • Promote collaboration between finance, IT, and other departments to maximize AI’s impact and share best practices.

Tap External Resources

  • Partner with universities and providers: Tap into external courses, certifications, and workshops to supplement internal training.
  • Consider tapping free or low-cost resources, such as online courses and AI literacy programs offered by tech companies (such as Grow with Google). These tools can provide foundational understanding and help employees build confidence in using AI responsibly.

Read more:

CFOs Move AI From Science Experiment to Strategic Line Item

3 Ways AI Shifts Accounts Receivable From Lagging to Leading Indicator

From Nice-to-Have to Nonnegotiable: How AI Is Redefining the Office of the CFO




Real or AI: Band confirms use of artificial intelligence for its music on Spotify


The Velvet Sundown, a four-person band, or so it seems, has garnered a lot of attention on Spotify. It started posting music on the platform in early June and has since released two full albums, with a few more singles and another album coming soon. Naturally, listeners started to accuse the band of being an AI-generated project, which, as it now turns out, is true.

The band, or music project, called The Velvet Sundown has over a million monthly listeners on Spotify. That’s an impressive debut considering its first album, “Floating on Echoes,” hit the music streaming platform on June 4. Then, on June 19, its second album, “Dust and Silence,” was added to the library. Next week, July 14, will mark the release of the third album, “Paper Sun Rebellion.” Since the debut, listeners have accused the band of being an AI-generated project, and now the owners of the project have updated the Spotify bio to call it a “synthetic music project guided by human creative direction, and composed, voiced, and visualized with the support of artificial intelligence.”

It goes on to state that this project challenges the boundaries of “authorship, identity, and the future of music itself in the age of AI.” The owners claim that the characters, stories, music, voices, and lyrics are “original creations generated with the assistance of artificial intelligence tools,” but it is unclear to what extent AI was involved in the development process.

The band art shows four individuals, suggesting they are the members of the project, but the images are likely AI-generated as well. Interestingly, Andrew Frelon (a pseudonym) initially claimed to be the owner of the AI band, but later confirmed that was untrue and that he had pretended to run its Twitter account because he wanted to insert an “extra layer of weird into this story” of the AI band.

As it stands now, The Velvet Sundown’s music is available on Spotify, with the new album releasing next week. Whether this unveiling causes a spike or a decline in monthly listeners remains to be seen.


