AI Insights

How a new type of AI is helping police skirt facial recognition bans

“The whole vision behind Track in the first place,” says Veritone CEO Ryan Steelberg, was “if we’re not allowed to track people’s faces, how do we assist in trying to potentially identify criminals or malicious behavior or activity?” In addition to tracking individuals where facial recognition isn’t legally allowed, Steelberg says, it allows for tracking when faces are obscured or not visible. 

The product has drawn criticism from the American Civil Liberties Union, which—after learning of the tool through MIT Technology Review—said it was the first instance they’d seen of a nonbiometric tracking system used at scale in the US. They warned that it raises many of the same privacy concerns as facial recognition but also introduces new ones at a time when the Trump administration is pushing federal agencies to ramp up monitoring of protesters, immigrants, and students.

Veritone gave us a demonstration of Track in which it analyzed people in footage from different environments, ranging from the January 6 riots to subway stations. You can use it to find people by specifying body size, gender, hair color and style, shoes, clothing, and various accessories. The tool can then assemble timelines, tracking a person across different locations and video feeds. It can be accessed through Amazon and Microsoft cloud platforms.

In an interview, Steelberg said that the number of attributes Track uses to identify people will continue to grow. When asked if Track differentiates on the basis of skin tone, a company spokesperson said it’s one of the attributes the algorithm uses to tell people apart but that the software does not currently allow users to search for people by skin color. Track currently operates only on recorded video, but Steelberg claims the company is less than a year from being able to run it on live video feeds.

Agencies using Track can add footage from police body cameras, drones, public videos on YouTube, or so-called citizen upload footage (from Ring cameras or cell phones, for example) in response to police requests.

“We like to call this our Jason Bourne app,” Steelberg says. He expects the technology to come under scrutiny in court cases but says, “I hope we’re exonerating people as much as we’re helping police find the bad guys.” The public sector currently accounts for only 6% of Veritone’s business (most of its clients are media and entertainment companies), but the company says that’s its fastest-growing market, with clients in places including California, Washington, Colorado, New Jersey, and Illinois. 

That rapid expansion has started to cause alarm in certain quarters. Jay Stanley, a senior policy analyst at the ACLU, wrote in 2019 that artificial intelligence would someday expedite the tedious task of combing through surveillance footage, enabling automated analysis regardless of whether a crime has occurred. Since then, lots of police-tech companies have been building video analytics systems that can, for example, detect when a person enters a certain area. However, Stanley says, Track is the first product he’s seen make broad tracking of particular people technologically feasible at scale.




Intro robotics students build AI-powered robot dogs from scratch

Equipped with a starter robot hardware kit and cutting-edge lessons in artificial intelligence, students in CS 123: A Hands-On Introduction to Building AI-Enabled Robots are mastering the full spectrum of robotics – from motor control to machine learning. Now in its third year, the course has students build and enhance an adorable quadruped robot, Pupper, programming it to walk, navigate, respond to human commands, and perform a specialized task that they showcase in their final presentations.

The course, which evolved from an independent study project led by Stanford’s robotics club, is now taught by Karen Liu, professor of computer science in the School of Engineering, in addition to Jie Tan from Google DeepMind and Stuart Bowers from Apple and Hands-On Robotics. Throughout the 10-week course, students delve into core robotics concepts, such as movement and motor control, while connecting them to advanced AI topics.

“We believe that the best way to help and inspire students to become robotics experts is to have them build a robot from scratch,” Liu said. “That’s why we use this specific quadruped design. It’s the perfect introductory platform for beginners to dive into robotics, yet powerful enough to support the development of cutting-edge AI algorithms.”

What makes the course especially approachable is its low barrier to entry – students need only basic programming skills to get started. From there, the students build up the knowledge and confidence to tackle complex robotics and AI challenges.

Robot creation goes mainstream

Pupper evolved from Doggo, built by the Stanford Student Robotics club to offer people a way to create and design a four-legged robot on a budget. When the team saw the cute quadruped’s potential to make robotics both approachable and fun, they pitched the idea to Bowers, hoping to turn their passion project into a hands-on course for future roboticists.

“We wanted students who were still early enough in their education to explore and experience what we felt like the future of AI robotics was going to be,” Bowers said.

This current version of Pupper is more powerful and refined than its predecessors. It’s also irresistibly adorable and easier than ever for students to build and interact with.

“We’ve come a long way in making the hardware better and more capable,” said Ankush Kundan Dhawan, one of the first students to take the Pupper course in the fall of 2021 before becoming its head teaching assistant. “What really stuck with me was the passion that instructors had to help students get hands-on with real robots. That kind of dedication is very powerful.”

Code come to life

Building a Pupper from a starter hardware kit blends different types of engineering, including electrical work, hardware construction, coding, and machine learning. Some students even produced custom parts for their final Pupper projects. The course pairs weekly lectures with hands-on labs. Lab titles like “Wiggle Your Big Toe” and “Do What I Say” keep things playful while building real skills.

CS 123 students ready to show off their Pupper’s tricks. | Harry Gregory

Over the initial five weeks, students are taught the basics of robotics, including how motors work and how robots can move. In the next phase of the course, students add a layer of sophistication with AI. Using neural networks to improve how the robot walks, sees, and responds to the environment, they get a glimpse of state-of-the-art robotics in action. Many students also use AI in other ways for their final projects.

“We want them to actually train a neural network and control it,” Bowers said. “We want to see this code come to life.”

By the end of the quarter this spring, students were ready for their capstone project, called the “Dog and Pony Show,” where guests from NVIDIA and Google were present. Six teams had Pupper perform creative tasks – including navigating a maze and fighting a (pretend) fire with a water pick – surrounded by the best minds in the industry.

“At this point, students know all the essential foundations – locomotion, computer vision, language – and they can start combining them and developing state-of-the-art physical intelligence on Pupper,” Liu said.

“This course gives them an overview of all the key pieces,” said Tan. “By the end of the quarter, the Pupper that each student team builds and programs from scratch mirrors the technology used by cutting-edge research labs and industry teams today.”

All ready for the robotics boom

The instructors believe the field of AI robotics is still gaining momentum, and they’ve made sure the course stays current by integrating new lessons and technology advances nearly every quarter.

This Pupper was mounted with a small water jet to put out a pretend fire. | Harry Gregory

Students have responded to the course with resounding enthusiasm, and the instructors expect that interest in robotics – at Stanford and in general – will continue to grow. They hope to expand the course and to see the community they’ve fostered through CS 123 contribute to this engaging and important discipline.

“The hope is that many CS 123 students will be inspired to become future innovators and leaders in this exciting, ever-changing field,” said Tan.

“We strongly believe that now is the time to make the integration of AI and robotics accessible to more students,” Bowers said. “And that effort starts here at Stanford and we hope to see it grow beyond campus, too.”


5 Ways CFOs Can Upskill Their Staff in AI to Stay Competitive

Chief financial officers are recognizing the need to upskill their workforce to ensure their teams can effectively harness artificial intelligence (AI).

According to a June 2025 PYMNTS Intelligence report, “The Agentic Trust Gap: Enterprise CFOs Push Pause on Agentic AI,” all the CFOs surveyed said generative AI has increased the need for more analytically skilled workers. That’s up from 60% in March 2024.

“The shift in the past year reflects growing hands-on use and a rising urgency to close capability gaps,” according to the report.

The CFOs also said the overall mix of skills required across the business has changed. They need people who have AI-ready skills: “CFOs increasingly need talent that can evaluate, interpret and act on machine-generated output,” the report said.

The CFO role itself is changing. According to The CFO, 27% of job listings for chief financial officers now call for AI expertise.

Notably, the upskilling challenge is not limited to IT. The need for AI upskilling affects all departments, including finance, operations and compliance. By taking a proactive approach to skill development, CFOs can position their teams to work alongside AI rather than compete with it.

The goal is to cultivate professionals who can critically assess AI output, manage risks, and use the tools to generate business value.

Among CEOs, the impact is just as pronounced. According to a Cisco study, 74% fear that gaps in knowledge will hinder decisions in the boardroom, and 58% fear those gaps will stifle growth.

Moreover, 73% of CEOs fear losing ground to rivals because of gaps in IT knowledge or infrastructure. One of the barriers holding CEOs back is a shortage of skills.

Their game plan: investing in knowledge and skills, upgrading infrastructure and enhancing security.

Here are some ways companies can upskill their workforce for AI:

Ensure Buy-in by the C-Suite

  • With leadership from the top, AI learning initiatives will be prioritized instead of falling by the wayside.
  • Allay any employee concerns about artificial intelligence replacing them so they will embrace the use and management of AI.

Build AI Literacy Across the Company

  • Invest in AI training programs: Offer structured training tailored to finance to help staff understand both the capabilities and limitations of AI models, according to CFO.university.
  • Promote AI fluency: Focus on both technical skills, such as how to use AI tools, and conceptual fluency of AI, such as understanding where AI can add value and its ethical implications, according to the CFO’s AI Survival Guide.
  • Create AI champions: Identify and develop ‘AI champions’ within the team who can bridge the gap between finance and technology, driving adoption and supporting peers, according to Upflow.

Integrate AI Into Everyday Workflows

  • Start with small, focused projects such as expense management to demonstrate value and build confidence.
  • Foster a culture where staff can explore AI tools, automate repetitive tasks, and share learnings openly.

Encourage Continuous Learning

  • Make learning about AI a continuous process, not a one-time event: Encourage staff to stay updated on AI trends and tools relevant to finance.
  • Promote collaboration between finance, IT, and other departments to maximize AI’s impact and share best practices.

Tap External Resources

  • Partner with universities and providers: Tap into external courses, certifications, and workshops to supplement internal training.
  • Consider tapping free or low-cost resources, such as online courses and AI literacy programs offered by tech companies (such as Grow with Google). These tools can provide foundational understanding and help employees build confidence in using AI responsibly.

Read more:

CFOs Move AI From Science Experiment to Strategic Line Item

3 Ways AI Shifts Accounts Receivable From Lagging to Leading Indicator

From Nice-to-Have to Nonnegotiable: How AI Is Redefining the Office of the CFO


Real or AI: Band confirms use of artificial intelligence for its music on Spotify

The Velvet Sundown, seemingly a four-person band, has garnered a lot of attention on Spotify. It started posting music on the platform in early June and has since released two full albums, with a few more singles and another album coming soon. Naturally, listeners began to accuse the band of being an AI-generated project, which, as it now turns out, is true.

The band, or music project, called The Velvet Sundown has over a million monthly listeners on Spotify. That’s an impressive debut considering their first album, “Floating on Echoes,” hit the music streaming platform on June 4. Then, on June 19, their second album, “Dust and Silence,” was added to the library. Next week, July 14, will mark the release of the third album, “Paper Sun Rebellion.” Since their debut, listeners have accused the band of being an AI-generated project, and now the owners of the project have updated the Spotify bio, calling it a “synthetic music project guided by human creative direction, and composed, voiced, and visualized with the support of artificial intelligence.”

It goes on to state that this project challenges the boundaries of “authorship, identity, and the future of music itself in the age of AI.” The owners claim that the characters, stories, music, voices, and lyrics are “original creations generated with the assistance of artificial intelligence tools,” but it is unclear to what extent AI was involved in the development process.

The band art shows four individuals, suggesting they are the members of the band, but the images are likely AI-generated as well. Interestingly, Andrew Frelon (a pseudonym) initially claimed to be the owner of the AI band, but later confirmed that was untrue and that he had pretended to run the band’s Twitter account because he wanted to insert an “extra layer of weird into this story.”

As it stands, The Velvet Sundown’s music is available on Spotify, with the new album releasing next week. Whether this unveiling causes a spike or a decline in monthly listeners remains to be seen.
