AWS is launching an AI agent marketplace next week with Anthropic as a partner

Amazon Web Services (AWS) is launching an AI agent marketplace next week and Anthropic is one of its partners, TechCrunch has exclusively learned.

The AWS agent marketplace launch will take place at the AWS Summit in New York City on July 15, two people familiar with the development told TechCrunch. AWS and Anthropic did not respond to requests for comment.

AI agents are ubiquitous nowadays, and seemingly every investor in Silicon Valley is bullish on startups building them, even if there is some disagreement on exactly what defines an AI agent. The term is used loosely to describe computer programs that can make decisions and perform tasks independently, such as interacting with software, using an AI model on the back end.

AI behemoths such as OpenAI and Anthropic are promoting agents as the next big thing in tech. However, distribution poses a challenge, as most companies offer their agents in silos. AWS appears to be taking a step to address this.

The company’s dedicated agent marketplace will allow startups to offer their AI agents directly to AWS customers. It will also let enterprise customers browse, search for, and install AI agents that fit their requirements from a single location, a source said.

That could give Anthropic — and other AWS agent marketplace partners — a considerable boost.

Anthropic, which already has Amazon’s backing and is reportedly in line for another multibillion-dollar investment from the e-commerce company, views AI’s future primarily in terms of agents — at least for the coming years. Anthropic builds AI agents in-house and enables developers to create them using its API.

AWS’ marketplace would help Anthropic reach more customers, including those who may already use AI agents from its rivals, such as OpenAI. Anthropic’s involvement in the marketplace could also attract more developers to use its API to create more agents, and eventually increase its revenues. The company already hit $3 billion in annualized revenue in late May.

Like any other online marketplace, AWS will take a cut of the revenue that startups earn from agent installations. However, that cut is expected to be small relative to the new revenue streams and customers the marketplace could unlock.

The marketplace model will allow startups to charge customers for agents. The structure is similar to how a marketplace might price SaaS offerings rather than bundling them into broader services, one of the sources said.

Amazon is not the first tech giant to offer a marketplace for agents. In April, Google Cloud introduced an AI Agent Marketplace to help developers and businesses list, buy, and sell AI agents. Microsoft also introduced a similar offering, called Agent Store, within Microsoft 365 Copilot a month later. Similarly, enterprise software providers, including Salesforce and ServiceNow, have their own agent marketplaces.

That said, we have yet to see how successful these marketplaces are for smaller AI startups and enterprises seeking specific AI agents.



Tech Philosophy and AI Strategy – Stratechery by Ben Thompson

A drawing of Apple, Microsoft, OpenAI, Anthropic, Meta, and Google on the AI Tech Philosophy Opportunity Graph (Stratechery)

Welcome back to This Week in Stratechery!

As a reminder, each week, every Friday, we’re sending out this overview of content in the Stratechery bundle; highlighted links are free for everyone. Additionally, you have complete control over what we send to you. If you don’t want to receive This Week in Stratechery emails (there is no podcast), please uncheck the box in your delivery settings.

On that note, here were a few of our favorites this week.

  1. Who Invests and Why? As Mark Zuckerberg and Meta inflame the already raging talent wars, I wanted to explore whether there was a way to understand who was willing to invest to win, and who was not. I came up with two scales: how big the business opportunity is for a given company, and whether that company’s philosophy is about helping users or doing things for them. Not only does this intersection of Tech Philosophy and AI Opportunity explain the actions of Meta and Apple, it also helped me resolve some of my long-standing confusion about Google. Ben Thompson
  2. Apple Searches for an AI Partner. If Apple isn’t going to pay for AI talent, then they need a partner, which is why Apple is considering a partnership with either Anthropic or OpenAI to power a new version of Siri. For one, thinking about what OpenAI and Anthropic would want from a deal with Apple provides a window into the goals distinguishing two of the leading AI labs in the world. As for Apple, the news highlights the corner that they’ve backed themselves into after several years of failed AI efforts internally and one prolonged and very public failure with last year’s Apple Intelligence rollout. The choices now? Either surrender control and branding to OpenAI, or pay big money to Anthropic (a far cry from collecting $20 billion a year from Google for default search placement). In either case, Apple management will have to leave its comfort zone, and looking at the past few years, perhaps that comfort zone was the problem.  Andrew Sharp
  3. Is Xi Jinping on His Way Out? Every week I survey the news to prep for Sharp China, and for about two months now, there’s been a steady thrum of rumors concerning the political fate of Xi Jinping. Connecting the dots between Xi’s unexplained absences from public view, a spate of dismissals of powerful generals from the People’s Liberation Army, and a surprise absence at the BRICS summit in Brazil a few weeks ago, various internet sleuths and commentators are wondering whether Xi’s long-unshakeable hold on power may be waning. For the second half of this week’s episode, Sinocism’s Bill Bishop, who’s been studying the CCP for 30 years, explained why he finds the public evidence unconvincing and the rumor ecosystem increasingly frustrating. It was a rollicking conversation, and one that I caveated with my own note: what’s most remarkable to me about this rumor cycle is that because of the CCP’s unbelievable opacity, there is a hard limit on what any expert can conclusively say about the future of anyone in power, even the big man himself. AS

Stratechery Articles and Updates

Dithering with Ben Thompson and Daring Fireball’s John Gruber

Asianometry with Jon Yu

Sharp China with Andrew Sharp and Sinocism’s Bill Bishop

Greatest of All Talk with Andrew Sharp and WaPo’s Ben Golliver

Sharp Tech with Andrew Sharp and Ben Thompson

This week’s Stratechery video is on Checking In on AI and the Big Five.




Noninvasive brain technology allows control of robotic hands with thought


Noninvasive brain tech is transforming how people interact with robotic devices. Instead of relying on muscle movement, this technology allows a person to control a robotic hand simply by thinking about moving their fingers.

No surgery is required. 

Instead, a set of sensors placed on the scalp detects brain signals, which are then sent to a computer. Because nothing is implanted, the approach is safe and accessible, and it opens new possibilities for people with motor impairments or those recovering from injuries.



A woman wearing noninvasive brain technology (Carnegie Mellon University)

How noninvasive brain tech turns thought into action

Researchers at Carnegie Mellon University have made significant progress with noninvasive brain technology. They use electroencephalography (EEG) to detect the brain’s electrical activity when someone thinks about moving a finger. Artificial intelligence, specifically deep learning algorithms, then decodes these signals and translates them into commands for a robotic hand. In their study, participants managed to move two or even three robotic fingers at once just by imagining the motion, and the system achieved over 80% accuracy on two-finger tasks and over 60% on three-finger tasks, all in real time.
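The article does not include the researchers’ code, but the basic pipeline it describes, windowed EEG signals fed to a trained neural network whose output is mapped to a robotic finger command, can be sketched roughly as follows. The channel count, window length, command labels, and the tiny stand-in network below are illustrative assumptions, not details from the CMU study.

```python
# Illustrative sketch of an EEG-to-robot-command loop; not the CMU system.
# Channel count, window length, and command labels are hypothetical.
import torch
import torch.nn as nn

N_CHANNELS, WINDOW, N_CLASSES = 64, 250, 3      # e.g. 64 electrodes, 1 s at 250 Hz
COMMANDS = ["flex_index", "flex_middle", "flex_index_and_middle"]  # made-up labels

class TinyEEGDecoder(nn.Module):
    """Minimal stand-in for an EEGNet-style convolutional decoder."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 32), padding=(0, 16)),  # temporal filters
            nn.Conv2d(8, 16, kernel_size=(N_CHANNELS, 1)),          # spatial filters across electrodes
            nn.ELU(),
            nn.AdaptiveAvgPool2d((1, 1)),                           # collapse to one feature vector
        )
        self.classify = nn.Linear(16, N_CLASSES)

    def forward(self, x):                        # x: (batch, 1, channels, time)
        return self.classify(self.features(x).flatten(1))

decoder = TinyEEGDecoder().eval()                # a real system would load trained weights here

def decode_window(eeg_window: torch.Tensor) -> str:
    """Map one (channels, time) EEG window to a robotic-finger command."""
    with torch.no_grad():
        logits = decoder(eeg_window.unsqueeze(0).unsqueeze(0))
    return COMMANDS[int(logits.argmax(dim=1))]

# Random data standing in for a streamed, preprocessed EEG signal.
for window in torch.randn(5, N_CHANNELS, WINDOW):
    print(decode_window(window))                 # command would be sent to the robotic hand
```

In a working system, the decoder would of course be trained on recorded EEG rather than left untrained, and the predicted command would be sent to the robotic hand’s controller instead of printed.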


Meeting the challenge of finger-level control

Achieving separate movement for each robotic finger is a real challenge. The brain areas responsible for finger movement are small. Their signals often overlap, which makes it hard to distinguish between them. However, advances in noninvasive brain technology and deep learning have made it possible to pick up on these subtle differences. 

The research team used a neural network called EEGNet and fine-tuned it for each participant. As a result, the system allowed smooth, natural control of the robotic fingers, with movements that closely matched how a real hand works.
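The article does not say exactly how that per-participant fine-tuning was done. A common pattern, sketched below with hypothetical data shapes and training settings, is to start from a decoder pre-trained on other users and briefly continue training on a small calibration set recorded from the new participant.

```python
# Sketch of per-participant fine-tuning; data shapes and settings are hypothetical.
import torch
import torch.nn as nn

N_CHANNELS, WINDOW, N_CLASSES = 64, 250, 3

# Stand-in for a decoder pre-trained on other users (weights would normally be loaded).
decoder = nn.Sequential(
    nn.Flatten(),                                   # (batch, channels * time)
    nn.Linear(N_CHANNELS * WINDOW, 64),
    nn.ELU(),
    nn.Linear(64, N_CLASSES),
)

# Fake calibration set standing in for a short recording from the new participant.
calib_x = torch.randn(120, N_CHANNELS, WINDOW)      # 120 labeled imagined-movement windows
calib_y = torch.randint(0, N_CLASSES, (120,))

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-4)  # small LR: adapt, don't overwrite
loss_fn = nn.CrossEntropyLoss()

decoder.train()
for epoch in range(5):                              # brief fine-tuning pass
    optimizer.zero_grad()
    loss = loss_fn(decoder(calib_x), calib_y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

The small learning rate and short training run reflect the goal of adapting a pre-trained decoder to one person’s signals without overwriting what it has already learned.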


A robotic finger being controlled by noninvasive brain technology (Kurt “CyberGuy” Knutsson)

Why noninvasive brain tech matters for everyday life

For people with limited hand function, even small improvements can make a huge difference. Noninvasive brain technology eliminates the need for surgery because the system is external and easy to use. In addition, this technology provides natural and intuitive control. It enables a person to move a robotic hand by simply thinking about the corresponding finger movements. 


The accessibility of noninvasive brain technology means it can be used in clinics, in homes, and by a wide range of people. For example, it enables participation in everyday tasks, such as typing or picking up small objects, that might otherwise be difficult or impossible. This approach can benefit stroke survivors and people with spinal cord injuries, and it can also help anyone interested in enhancing their abilities.

What’s next for noninvasive brain tech?

While the progress is exciting, there are still challenges ahead. Noninvasive brain technology needs to improve even further at filtering out noise and adapting to individual differences. However, with ongoing advances in deep learning and sensor technology, these systems are becoming more reliable and easier to use. Researchers are already working to expand the technology for more complex tasks. 

As a result, assistive robotics could soon become a part of more homes and workplaces.


Illustration of how the noninvasive brain technology works  (Carnegie Mellon University)

Kurt’s key takeaways

Noninvasive brain technology is opening up possibilities that once seemed out of reach. The idea of moving a robotic hand just by thinking about it could make daily life easier and more independent for many people. As researchers continue to improve these systems, it will be interesting to see how this technology shapes the way we interact with the world around us.


If you had the chance to control a robotic hand with your thoughts, what would you want to try first? Let us know by writing us at Cyberguy.com/Contact


Copyright 2025 CyberGuy.com.  All rights reserved.  


