AI Insights
University Hospitals studying use of Artificial Intelligence to improve lung cancer outcomes
CLEVELAND, Ohio (WOIO) – Lung cancer causes more deaths in the United States than breast, colon, and prostate cancers combined.
The problem is that most lung cancers are caught late, at stage three or four, because early tumors can be so difficult for radiologists to detect.
But University Hospitals is conducting a study to see if artificial intelligence can help catch the disease much earlier.
“We needed a way to find lung cancer early. It’s like finding a needle in a haystack. And that’s where A.I. comes in,” said Dr. Samir Shah, Chief Medical Officer at Qure.ai.
Lung cancer nodules in their early stages can be nearly impossible for radiologists to see. Usually, lung cancer goes undiagnosed until patients start feeling symptoms.
“When they are having cough, chest pain, or blood coming out of the sputum, unfortunately, that stage is stage three or four, and at that time the survival rate is, whatever you do, in single digits,” said radiologist Dr. Amit Gupta.
He and his team at University Hospitals are now studying the use of A.I. to see if they can reach a diagnosis sooner.
“In my initial experience, it can find some of the nodules which are super hidden. And on a busy day, even a trained cardiothoracic radiologist may overlook them,” he said.
Qure.ai acts as another set of eyes in the room for radiologists.
Dr. Shah says their algorithm was trained on a large body of data gathered over nine years of studying tuberculosis x-rays.
“Those x-rays could be used not only to look at TB, but we started looking at lung nodules as well. And we looked at 5 million of those. That’s more than I would ever read, and maybe 10-fold more than I would ever read in my entire lifetime,” said Dr. Shah.
From this vast amount of data, the algorithm learned what was and wasn’t a nodule.
Dr. Shah says they’re hopeful this application could result in a stage shift, catching lung cancer in stage one or two, and boosting survival rates to 60-70 percent.
“When it starts small and has not spread, and it can be removed, you’re talking about [a] curative situation, which makes patients happy. But you know, that’s what the goal is for every oncologist. That’s what the goal is for every thoracic surgeon who’s involved, every pulmonologist. That’s what the aim is,” said Dr. Shah.
Qure.ai tackled lung cancer first because of the difficulty of detection. The company says the technology also shows promise against other hard-to-detect diseases, such as breast cancer.
The University Hospitals radiology team expects data collection for the study to continue for another nine to ten months. If U.H. moves forward with the technology, they estimate it could be implemented a little more than a year from now.
Copyright 2025 WOIO. All rights reserved.
AI Insights
Intro robotics students build AI-powered robot dogs from scratch
Equipped with a starter robot hardware kit and cutting-edge lessons in artificial intelligence, students in CS 123: A Hands-On Introduction to Building AI-Enabled Robots are mastering the full spectrum of robotics – from motor control to machine learning. Now in its third year, the course has students build and enhance an adorable quadruped robot, Pupper, programming it to walk, navigate, respond to human commands, and perform a specialized task that they showcase in their final presentations.
The course, which evolved from an independent study project led by Stanford’s robotics club, is now taught by Karen Liu, professor of computer science in the School of Engineering, along with Jie Tan from Google DeepMind and Stuart Bowers from Apple and Hands-On Robotics. Throughout the 10-week course, students delve into core robotics concepts, such as movement and motor control, while connecting them to advanced AI topics.
“We believe that the best way to help and inspire students to become robotics experts is to have them build a robot from scratch,” Liu said. “That’s why we use this specific quadruped design. It’s the perfect introductory platform for beginners to dive into robotics, yet powerful enough to support the development of cutting-edge AI algorithms.”
What makes the course especially approachable is its low barrier to entry – students need only basic programming skills to get started. From there, the students build up the knowledge and confidence to tackle complex robotics and AI challenges.
Robot creation goes mainstream
Pupper evolved from Doggo, built by the Stanford Student Robotics club to offer people a way to create and design a four-legged robot on a budget. When the team saw the cute quadruped’s potential to make robotics both approachable and fun, they pitched the idea to Bowers, hoping to turn their passion project into a hands-on course for future roboticists.
“We wanted students who were still early enough in their education to explore and experience what we felt like the future of AI robotics was going to be,” Bowers said.
The current version of Pupper is more powerful and refined than its predecessors. It’s also irresistibly adorable and easier than ever for students to build and interact with.
“We’ve come a long way in making the hardware better and more capable,” said Ankush Kundan Dhawan, one of the first students to take the Pupper course in the fall of 2021 before becoming its head teaching assistant. “What really stuck with me was the passion that instructors had to help students get hands-on with real robots. That kind of dedication is very powerful.”
Code come to life
Building a Pupper from a starter hardware kit blends different types of engineering, including electrical work, hardware construction, coding, and machine learning. Some students even produced custom parts for their final Pupper projects. The course pairs weekly lectures with hands-on labs. Lab titles like “Wiggle Your Big Toe” and “Do What I Say” keep things playful while building real skills.
CS 123 students ready to show off their Pupper’s tricks. | Harry Gregory
Over the initial five weeks, students are taught the basics of robotics, including how motors work and how robots can move. In the next phase of the course, students add a layer of sophistication with AI. Using neural networks to improve how the robot walks, sees, and responds to the environment, they get a glimpse of state-of-the-art robotics in action. Many students also use AI in other ways for their final projects.
“We want them to actually train a neural network and control it,” Bowers said. “We want to see this code come to life.”
By the end of the quarter this spring, students were ready for their capstone project, called the “Dog and Pony Show,” attended by guests from NVIDIA and Google. Six teams had Pupper perform creative tasks – including navigating a maze and fighting a (pretend) fire with a water pick – surrounded by some of the best minds in the industry.
“At this point, students know all the essential foundations – locomotion, computer vision, language – and they can start combining them and developing state-of-the-art physical intelligence on Pupper,” Liu said.
“This course gives them an overview of all the key pieces,” said Tan. “By the end of the quarter, the Pupper that each student team builds and programs from scratch mirrors the technology used by cutting-edge research labs and industry teams today.”
All ready for the robotics boom
The instructors believe the field of AI robotics is still gaining momentum, and they’ve made sure the course stays current by integrating new lessons and technology advances nearly every quarter.
This Pupper was mounted with a small water jet to put out a pretend fire. | Harry Gregory
Students have responded to the course with resounding enthusiasm, and the instructors expect that interest in robotics – at Stanford and in general – will continue to grow. They hope to expand the course, and they hope the community they’ve fostered through CS 123 will keep contributing to this engaging and important discipline.
“The hope is that many CS 123 students will be inspired to become future innovators and leaders in this exciting, ever-changing field,” said Tan.
“We strongly believe that now is the time to make the integration of AI and robotics accessible to more students,” Bowers said. “And that effort starts here at Stanford and we hope to see it grow beyond campus, too.”
AI Insights
5 Ways CFOs Can Upskill Their Staff in AI to Stay Competitive
Chief financial officers are recognizing the need to upskill their workforce to ensure their teams can effectively harness artificial intelligence (AI).
AI Insights
Real or AI: Band confirms use of artificial intelligence for its music on Spotify
The Velvet Sundown, a four-person band, or so it seems, has garnered a lot of attention on Spotify. It started posting music on the platform in early June and has since released two full albums, with a few more singles and another album coming soon. Naturally, listeners began to accuse the band of being an AI-generated project, which, as it now turns out, is true.
The band or music project called The Velvet Sundown has over a million monthly listeners on Spotify. That’s an impressive debut considering its first album, “Floating on Echoes,” hit the streaming platform on June 4. Its second album, “Dust and Silence,” was added to the library on June 19, and next week, on July 14, a third album, “Paper Sun Rebellion,” will be released. Since the debut, listeners have accused the band of being an AI-generated project, and the project’s owners have now updated the Spotify bio, calling it a “synthetic music project guided by human creative direction, and composed, voiced, and visualized with the support of artificial intelligence.”
It goes on to state that this project challenges the boundaries of “authorship, identity, and the future of music itself in the age of AI.” The owners claim that the characters, stories, music, voices, and lyrics are “original creations generated with the assistance of artificial intelligence tools,” but it is unclear to what extent AI was involved in the development process.
The band art shows four individuals, suggesting they are the people behind the project, but the images are likely AI-generated as well. Interestingly, a man using the pseudonym Andrew Frelon initially claimed to be the owner of the AI band, but later confirmed that was untrue and that he had pretended to run its Twitter account because he wanted to insert an “extra layer of weird into this story.”
As it stands now, The Velvet Sundown’s music is available on Spotify, with the new album releasing next week. Whether this unveiling causes a spike or a decline in monthly listeners remains to be seen.