AI Insights
Singapore police can now seize bank accounts to stop scams
Police in Singapore can now seize control of a person’s bank account and block money transfers if they suspect the person is being scammed, under a new law that took effect on Tuesday.
The move is aimed at addressing a common issue faced by the police where victims often refuse to believe they are being scammed despite warnings, authorities have said.
Lawmakers passed the measure earlier this year, though some members of parliament have described it as intrusive.
Singapore has seen a worsening scam problem, with losses in the island-state surging to a record S$1.1 billion ($860m; £630m) in 2024.
Under the new Protection from Scams Act, the police can order banks to block a potential victim from making transactions if they suspect the person is being scammed.
Police can also block a potential victim’s use of ATMs and credit services.
The decision can be taken by a police officer even if the potential victim does not believe warnings that they are being scammed.
The account owner will still have access to their funds for legitimate purposes, such as paying daily expenses and bills, but can only use the money at the discretion of the police, according to Singapore’s Ministry of Home Affairs (MHA).
The MHA has said that a potential victim’s bank account can be controlled by the police for up to 30 days at a time, with the option for a maximum of five extensions if more time is needed.
Critics of the law have raised concerns over accountability and the possibility of abuse of power. In Parliament in January, some MPs suggested allowing citizens to opt out of the law, or giving people the option to nominate someone else to freeze their transactions instead of the authorities.
But proponents have said that the law is needed to stem the huge losses incurred by victims and to protect them.
The MHA said the decision would be based on the facts offered by the individual and family members. “The restriction order will only be issued as a last resort, after other options to convince the individual have been exhausted,” it said in a statement.
The number of reported scams in Singapore has grown from around 15,600 cases in 2020 to more than 50,000 cases in 2024.
Common scams in Singapore include job and investment scams, and e-commerce fraud where users are duped into paying for items they never receive. Many are also increasingly falling prey to internet love scams, where fraudsters spend months building online relationships before tricking victims into sending money.
The new law is the latest anti-scam measure authorities have rolled out in Singapore. Since 2023, bank users have been able to lock up a portion of the money in their accounts so that it cannot be transferred digitally.
Most banks also offer an emergency “kill switch” that lets customers freeze their bank account immediately if they suspect it has been compromised.
Intro robotics students build AI-powered robot dogs from scratch
Equipped with a starter robot hardware kit and cutting-edge lessons in artificial intelligence, students in CS 123: A Hands-On Introduction to Building AI-Enabled Robots are mastering the full spectrum of robotics – from motor control to machine learning. Now in its third year, the course has students build and enhance an adorable quadruped robot, Pupper, programming it to walk, navigate, respond to human commands, and perform a specialized task that they showcase in their final presentations.
The course, which evolved from an independent study project led by Stanford’s robotics club, is now taught by Karen Liu, professor of computer science in the School of Engineering, in addition to Jie Tan from Google DeepMind and Stuart Bowers from Apple and Hands-On Robotics. Throughout the 10-week course, students delve into core robotics concepts, such as movement and motor control, while connecting them to advanced AI topics.
“We believe that the best way to help and inspire students to become robotics experts is to have them build a robot from scratch,” Liu said. “That’s why we use this specific quadruped design. It’s the perfect introductory platform for beginners to dive into robotics, yet powerful enough to support the development of cutting-edge AI algorithms.”
What makes the course especially approachable is its low barrier to entry – students need only basic programming skills to get started. From there, the students build up the knowledge and confidence to tackle complex robotics and AI challenges.
Robot creation goes mainstream
Pupper evolved from Doggo, built by the Stanford Student Robotics club to offer people a way to create and design a four-legged robot on a budget. When the team saw the cute quadruped’s potential to make robotics both approachable and fun, they pitched the idea to Bowers, hoping to turn their passion project into a hands-on course for future roboticists.
“We wanted students who were still early enough in their education to explore and experience what we felt like the future of AI robotics was going to be,” Bowers said.
This current version of Pupper is more powerful and refined than its predecessors. It’s also irresistibly adorable and easier than ever for students to build and interact with.
“We’ve come a long way in making the hardware better and more capable,” said Ankush Kundan Dhawan, one of the first students to take the Pupper course in the fall of 2021 before becoming its head teaching assistant. “What really stuck with me was the passion that instructors had to help students get hands-on with real robots. That kind of dedication is very powerful.”
Code come to life
Building a Pupper from a starter hardware kit blends different types of engineering, including electrical work, hardware construction, coding, and machine learning. Some students even produced custom parts for their final Pupper projects. The course pairs weekly lectures with hands-on labs. Lab titles like Wiggle Your Big Toe and Do What I Say keep things playful while building real skills.
CS 123 students ready to show off their Pupper’s tricks. | Harry Gregory
Over the initial five weeks, students are taught the basics of robotics, including how motors work and how robots can move. In the next phase of the course, students add a layer of sophistication with AI. Using neural networks to improve how the robot walks, sees, and responds to the environment, they get a glimpse of state-of-the-art robotics in action. Many students also use AI in other ways for their final projects.
“We want them to actually train a neural network and control it,” Bowers said. “We want to see this code come to life.”
By the end of the quarter this spring, students were ready for their capstone project, called the “Dog and Pony Show,” attended by guests from NVIDIA and Google. Six teams had Pupper perform creative tasks – including navigating a maze and fighting a (pretend) fire with a water pick – in front of some of the best minds in the industry.
“At this point, students know all the essential foundations – locomotion, computer vision, language – and they can start combining them and developing state-of-the-art physical intelligence on Pupper,” Liu said.
“This course gives them an overview of all the key pieces,” said Tan. “By the end of the quarter, the Pupper that each student team builds and programs from scratch mirrors the technology used by cutting-edge research labs and industry teams today.”
All ready for the robotics boom
The instructors believe the field of AI robotics is still gaining momentum, and they’ve made sure the course stays current by integrating new lessons and technology advances nearly every quarter.
This Pupper was mounted with a small water jet to put out a pretend fire. | Harry Gregory
Students have responded to the course with resounding enthusiasm, and the instructors expect interest in robotics – at Stanford and in general – to continue to grow. They hope to expand the course, and for the community they’ve fostered through CS 123 to contribute to this engaging and important discipline.
“The hope is that many CS 123 students will be inspired to become future innovators and leaders in this exciting, ever-changing field,” said Tan.
“We strongly believe that now is the time to make the integration of AI and robotics accessible to more students,” Bowers said. “And that effort starts here at Stanford and we hope to see it grow beyond campus, too.”
Why Infuse Asset Management’s Q2 2025 Letter Signals a Shift to Artificial Intelligence and Cybersecurity Plays
The rapid evolution of artificial intelligence (AI) and the escalating complexity of cybersecurity threats have positioned these sectors as the next frontier of investment opportunity. Infuse Asset Management’s Q2 2025 letter underscores this shift, emphasizing AI’s transformative potential and the urgent need for robust cybersecurity infrastructure to mitigate risks. Below, we dissect the macroeconomic forces, sector-specific tailwinds, and portfolio reallocation strategies investors should consider in this new paradigm.
The AI Uprising: Macro Drivers of a Paradigm Shift
The AI revolution is accelerating at a pace that dwarfs historical technological booms. Take ChatGPT, which reached 800 million weekly active users by April 2025—a milestone achieved in just two years. This breakneck adoption is straining existing cybersecurity frameworks, creating a critical gap between innovation and defense.
Meanwhile, the U.S.-China AI rivalry is fueling a global arms race. China’s industrial robot installations surged from 50,000 in 2014 to 290,000 in 2023, outpacing U.S. adoption. This competition isn’t just about economic dominance—it’s a geopolitical chess match where data sovereignty, espionage, and AI-driven cyberattacks now loom large. The concept of “Mutually Assured AI Malfunction (MAIM)” highlights how even a single vulnerability could destabilize critical systems, much like nuclear deterrence but with far less predictability.
Cybersecurity: The New Infrastructure for an AI World
As AI systems expand into physical domains—think autonomous taxis or industrial robots—so do their vulnerabilities. In San Francisco, autonomous taxi providers now command 27% market share, yet their software is a prime target for cyberattacks. The decline in AI inference costs (outpacing historical declines in electricity and memory) has made it cheaper to deploy AI, but it also lowers the barrier for malicious actors to weaponize it.
Tech giants are pouring capital into AI infrastructure—NVIDIA and Microsoft alone increased CapEx from $33 billion to $212 billion between 2014 and 2024. This influx creates a vast, interconnected attack surface. Investors should prioritize cybersecurity firms that specialize in quantum-resistant encryption, AI-driven threat detection, and real-time infrastructure protection.
The Human Element: Skills Gaps and Strategic Shifts
The demand for AI expertise is soaring, but the workforce is struggling to keep pace. U.S. AI-related IT job postings have surged 448% since 2018, while non-AI IT roles have declined by 9%. This bifurcation signals two realities:
1. Cybersecurity skills are now mission-critical for safeguarding AI systems.
2. Ethical AI development and governance are emerging as compliance priorities, particularly in regulated industries.
This divergence reinforces the need for investors to back training platforms and cybersecurity firms that can bridge the skills gap.
Portfolio Reallocation: Where to Deploy Capital
Infuse’s insights suggest three actionable strategies:
- Core Holdings in Cybersecurity Leaders: Target firms like CrowdStrike (CRWD) and Palo Alto Networks (PANW), which excel in AI-powered threat detection and endpoint security.
- Geopolitical Plays: Invest in companies addressing data sovereignty and cross-border compliance, such as Palantir (PLTR) or Cloudflare (NET), which offer hybrid cloud solutions.
- Emerging Sectors: Look to quantum computing security (e.g., Rigetti Computing (RGTI)) and AI governance platforms like DataRobot, which help enterprises audit and validate AI models.
The Bottom Line: AI’s Growth Requires a Security Foundation
The “productivity paradox” of AI—where speculative valuations outstrip tangible ROI—is real. Yet, cybersecurity is one area where returns are measurable: breaches cost companies millions, and defenses reduce risk. Investors should treat cybersecurity as the bedrock of their AI investments.
As Infuse’s letter implies, the next decade will belong to those who balance AI’s promise with ironclad security. Position portfolios accordingly.
JR Research
5 Ways CFOs Can Upskill Their Staff in AI to Stay Competitive
Chief financial officers are recognizing the need to upskill their workforce to ensure their teams can effectively harness artificial intelligence (AI).