AI Research
Myths of AI networking — debunked
As AI infrastructure scales at an unprecedented rate, a number of outdated assumptions keep resurfacing – especially when it comes to the role of networking in large-scale training and inference systems. Many of these myths are rooted in technologies that worked well for small clusters. But today’s systems are scaling to hundreds of thousands – and soon, millions – of GPUs. Those older models no longer apply. Let’s walk through some of the most common myths – and why Ethernet has clearly emerged as the foundation for modern AI networking.
Myth 1: You cannot use Ethernet for high-performance AI networks
This myth has already been busted. Ethernet is now the de facto networking technology for AI at scale. Most, if not all, of the largest GPU clusters deployed in the past year have used Ethernet for scale-out networking.
Ethernet delivers performance that matches or exceeds what alternatives like InfiniBand offer – while providing a stronger ecosystem, broader vendor support, and faster innovation cycles. InfiniBand, for example, wasn’t designed for today’s scale. It’s a legacy fabric being pushed beyond its original purpose.
Meanwhile, Ethernet is thriving: multiple vendors are shipping 51.2T switches, and Broadcom recently introduced Tomahawk 6, the industry’s first 102.4T switch. Ecosystems for optical and electrical interconnect are also mature, and clusters of 100K GPUs and beyond are now routinely built on Ethernet.
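For a rough sense of what those switch capacities mean for cluster size, here is a back-of-the-envelope sketch. The port configurations (a 51.2T switch used as 64 x 800 GbE, a 102.4T switch as 128 x 800 GbE) and the fully non-blocking Clos assumption are illustrative simplifications, not vendor specifications or a reference design.

```python
# Back-of-the-envelope Clos sizing for Ethernet AI fabrics.
# Port counts are illustrative configurations, and real fabrics are
# rarely built fully non-blocking end to end.

def max_endpoints(radix: int, tiers: int) -> int:
    """Endpoint count of a non-blocking folded Clos (fat tree):
    2 tiers -> radix^2 / 2, 3 tiers -> radix^3 / 4."""
    return radix ** 2 // 2 if tiers == 2 else radix ** 3 // 4

for label, radix in [("51.2T switch as 64 x 800G", 64),
                     ("102.4T switch as 128 x 800G", 128)]:
    for tiers in (2, 3):
        print(f"{label}, {tiers}-tier fabric: up to "
              f"{max_endpoints(radix, tiers):,} 800G endpoints")
```

At 800 GbE per endpoint, doubling the switch radix grows a three-tier fabric from roughly 65,000 to over 500,000 ports – the kind of headroom behind the 100K-GPU clusters mentioned above.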
Myth 2: You need separate networks for scale-up and scale-out
This was acceptable when GPU nodes were small. Legacy scale-up links originated in an era when connecting two or four GPUs was enough. Today, scale-up domains are expanding rapidly. You’re no longer connecting four GPUs – you’re designing systems with 64, 128, or more in a single scale-up cluster. And that’s where Ethernet, with its proven scalability, becomes the obvious choice.
Using separate technologies for local and cluster-wide interconnect only adds cost, complexity, and risk. What you want is the opposite: a single, unified network that supports both. That’s exactly what Ethernet delivers – along with interface fungibility, simplified operations, and an open ecosystem.
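To make the unified-fabric argument concrete, the sketch below does the port math for a hypothetical 64-GPU scale-up domain served by the same 102.4 Tb/s-class Ethernet switches used for scale-out. The per-GPU bandwidth targets and port speeds are assumptions chosen for illustration, not a reference design.

```python
# Port budgeting for a hypothetical unified Ethernet fabric.
# All bandwidth targets below are assumed for illustration.

GPUS_PER_DOMAIN     = 64
SCALE_UP_PER_GPU_G  = 6_400    # Gb/s per GPU inside the domain (assumed)
SCALE_OUT_PER_GPU_G = 800      # Gb/s per GPU to the wider cluster (assumed)
PORT_SPEED_G        = 800      # 800 GbE ports
SWITCH_CAPACITY_G   = 102_400  # Gb/s per switch

up_ports_per_gpu  = SCALE_UP_PER_GPU_G // PORT_SPEED_G    # 8 ports
out_ports_per_gpu = SCALE_OUT_PER_GPU_G // PORT_SPEED_G   # 1 port
ports_per_switch  = SWITCH_CAPACITY_G // PORT_SPEED_G     # 128 ports

total_up_ports = GPUS_PER_DOMAIN * up_ports_per_gpu       # 512 ports
min_switches   = -(-total_up_ports // ports_per_switch)   # ceiling division -> 4

print(f"{up_ports_per_gpu} scale-up + {out_ports_per_gpu} scale-out "
      f"800G ports per GPU")
print(f"at least {min_switches} switches per {GPUS_PER_DOMAIN}-GPU domain")
```

The same port type, cabling, and operational tooling serve both roles; only the counts change.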
To accelerate this interface convergence, we’ve contributed the Scale-Up Ethernet (SUE) framework to the Open Compute Project, helping the industry standardize around a single AI networking fabric.
Myth 3: You need proprietary interconnects and exotic optics
This is another holdover from a different era. Proprietary interconnects and tightly coupled optics may have worked for small, fixed systems – but today’s AI networks demand flexibility and openness.
Ethernet gives you options: third-generation co-packaged optics (CPO), module-based retimed optics, linear drive optics, and the longest-reach passive copper. You’re not locked into one solution. You can tailor your interconnect to your power, performance, and economic goals – with full ecosystem support.
Myth 4: You need proprietary NIC features for AI workloads
Some AI networks rely on programmable, high-power NICs to support features like congestion control or traffic spraying. But in many cases, that’s just masking limitations in the switching fabric.
Modern Ethernet switches – like Tomahawk 5 and Tomahawk 6 – integrate load balancing, rich telemetry, and failure resiliency directly into the switch. That reduces cost and power, freeing up the power budget for what matters most: your GPUs and XPUs.
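To make "load balancing in the switch" concrete, here is a toy comparison of static flow-hash ECMP against per-packet spraying across equal-cost links. It is a deliberately simplified model; real switches (including the adaptive schemes referenced above) use far richer mechanisms such as flowlet switching and congestion-aware path selection.

```python
# Toy model: four long-lived "elephant" flows distributed across eight
# equal-cost links, comparing static flow hashing with per-packet
# spraying. Illustrative only.
import random
from collections import Counter

LINKS, FLOWS, PKTS_PER_FLOW = 8, 4, 1000
random.seed(0)

def flow_hash_ecmp() -> Counter:
    load = Counter({link: 0 for link in range(LINKS)})
    for f in range(FLOWS):
        load[hash(f"flow{f}") % LINKS] += PKTS_PER_FLOW  # whole flow pinned to one link
    return load

def per_packet_spray() -> Counter:
    load = Counter({link: 0 for link in range(LINKS)})
    for _ in range(FLOWS * PKTS_PER_FLOW):
        load[random.randrange(LINKS)] += 1               # each packet picks a link
    return load

for name, load in [("flow-hash ECMP", flow_hash_ecmp()),
                   ("per-packet spraying", per_packet_spray())]:
    idle = sum(1 for v in load.values() if v == 0)
    print(f"{name}: busiest link carries {max(load.values())} packets, "
          f"{idle} of {LINKS} links sit idle")
```

With only a handful of large flows, hashing leaves some links idle while others saturate; spraying keeps all links busy, which is the behavior the switch-integrated schemes aim for.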
Looking ahead, the trend is clear: NIC functions will increasingly be embedded into XPUs. The smarter strategy is to simplify, not over-engineer.
Myth 5: You have to match your network to your GPU vendor
There’s no good reason for this. The most advanced GPU clusters in the world – deployed at the largest hyperscalers – run on Ethernet.
Why? Because it enables flatter, more efficient network topologies. It’s vendor-neutral. And it supports innovation – from AI-optimized collective libraries to workload-specific tuning at both the scale-up and scale-out levels.
Ethernet is a standards-based, well-understood technology with a vibrant ecosystem of partners. It allows AI clusters to scale more easily, completely decoupled from the choice of GPU/XPU, delivering an open, scalable, and power-efficient system.
The bottom line
Networking used to be an afterthought. Now it’s a strategic enabler of AI performance, efficiency, and scalability.
If your architecture is still built around assumptions from five years ago, it’s time to rethink them. The future of AI is being built on Ethernet – and that future is already here.
About Ram Velaga
Broadcom
Ram Velaga is Senior Vice President and General Manager of the Core Switching Group at Broadcom, responsible for the company’s extensive Ethernet switch portfolio serving broad markets including the service provider, data center and enterprise segments. Prior to joining Broadcom in 2012, he served in a variety of product management roles at Cisco Systems, including Vice President of Product Management for the Data Center Technology Group. Mr. Velaga earned an M.S. in Industrial Engineering from Penn State University and an M.B.A. from Cornell University. Mr. Velaga holds patents in communications and virtual infrastructure.
AI Research
Ciena Powers SingAREN to Enhance AI-Driven Research with High-Speed Network
For over a decade, Singapore has consistently ranked highly on the Global Innovation Index, an annual ranking of 130 economies. In 2024 it achieved its highest position yet – 4th globally.
This strong performance is largely due to steady, long-term investment in research & development (R&D) as a key pillar of Singapore’s economic development strategy.
Supporting Singapore’s Research, Innovation and Enterprise (RIE) ecosystem is the Singapore Advanced Research and Education Network (SingAREN), established in 1997. SingAREN is the sole provider of dedicated local and international network services for the local Research and Education community.
SingAREN’s network supports the SingAREN Open Exchange (SOE) for high-speed research and education connectivity; eduroam, an international Wi-Fi roaming service for the research and education community; and FileSender SG, a platform for large file transfers, among other services running on its network.
RIE is vital to Singapore’s progress, fostering economic growth and competitiveness. It also drives scientific advancements that can potentially address societal challenges and enhance our well-being.
SingAREN has supported robotic telesurgery trials across international boundaries, which require precise, instantaneous control, and a low-latency network for real-time collaboration.
SingAREN also enables high-speed, resilient connectivity to the National Supercomputing Center (NSCC), which manages Singapore’s national high-performance computing (HPC) resources, supporting research and innovation across various fields. In particular, the NSCC’s expertise and specialized infrastructure are often leveraged to manage and analyze genomic data. Transferring genomic data is typically difficult due to its massive data size.
SingAREN provided a high-speed link to the Cancer Science Institute of Singapore for a research project, transmitting more than 2 petabytes of cancer genomics data downloaded from repositories in the United States into NSCC. The research involved harmonizing petabytes of whole genome sequencing data, so downloads needed to be extremely fast, stable, and efficient; the downloaded data was then analyzed and reprocessed with high computing power.
This is but one of the examples of collaboration with NSCC to transfer, download, analyze and process genomic data.
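To give a rough sense of the scale involved, the sketch below estimates how long a 2-petabyte transfer takes at different sustained link speeds. It is an idealized calculation that ignores protocol overhead, storage bottlenecks, and retransmissions, so real transfers take longer – but it shows why dedicated high-speed research links matter.

```python
# Idealized transfer-time estimate for a 2 PB dataset at several
# sustained link speeds. Ignores protocol overhead, storage limits,
# and retransmissions.

DATA_BYTES = 2e15  # 2 petabytes

for label, gbps in [("1 Gb/s", 1), ("10 Gb/s", 10),
                    ("100 Gb/s", 100), ("400 Gb/s", 400)]:
    seconds = DATA_BYTES * 8 / (gbps * 1e9)
    print(f"{label:>8}: ~{seconds / 86400:.1f} days")
```

At 1 Gb/s the transfer would take roughly half a year; at 100 Gb/s it drops to under two days.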
Academic research is experiencing explosive growth and requires more data than ever before, fuelled by AI, Machine Learning (ML), and cloud computing. The increasing use of generative and agentic AI will also affect SingAREN and its research partners significantly, leading to increased data volumes. This type of advanced research activity would not be possible without a robust, scalable, low-latency network.
In the coming months, SingAREN will enhance its network to further support its research institution partners. These plans include the SingAREN Lightwave Internet Exchange (SLIX) 2.5 project, to provide high-speed, secure connectivity by 2027, and the SLIX 3.0 vision to build a future-ready network that incorporates quantum-safe networking, AI research, and haptic surgery. SingAREN also aims to expand cybersecurity threat intelligence sharing and continue infrastructure upgrades, such as implementing 400G switches and enhancing Points of Presence (PoP) resilience.
SingAREN uses Ciena’s 6500, powered by Ciena’s WaveLogic programmable coherent optics. Deployed by Ciena partner Terrabit Networks, the 6500 enables SingAREN to respond to changing requirements on demand, allowing the REN to continually maximize network efficiencies and offer customizable service delivery over any distance.
Associate Professor Francis Lee, Vice President of SingAREN:
“Our backbone network, powered by Ciena’s 6500 optical solution, is built to handle the growing demands of AI, genomics, and big data applications—transmitting petabytes of data. To support the advancement of Singapore’s Research, Innovation and Enterprise agenda, our flexible, low-latency network can now seamlessly deliver 10G to 100G connections to member institutions. We continue to push the boundaries of research and innovation, ensuring connectivity is never a limiting factor.”
AI Research
SoundHound AI Stock Sank Today — Is the Artificial Intelligence Company a Buy?
SoundHound AI (SOUN -4.73%) stock saw a pullback in Thursday’s trading. The company’s share price fell 4.7% in the session and had been down as much as 8.1% earlier in trading.
While there doesn’t appear to have been any major business-specific news behind the pullback, investors may have moved to take profits after a pop for the company’s share price earlier in the week. Despite today’s pullback, the stock is still up roughly 9% over the last week of trading. Even more striking, the company’s share price is up roughly 39% over the last three months.
Is SoundHound AI stock a good buy right now?
SoundHound AI has been highly volatile over the last year of trading. While the company’s share price is still up roughly 197% across the stretch, it’s also still down approximately 49% from its peak in the period.
Even as the company’s sales base has ramped up rapidly, sales growth has continued to accelerate. Revenue increased 151% year over year in the first quarter of the company’s current fiscal year, which ended March 31. The company still only posted $29.1 million in sales in the period, but sales growth in the quarter marked a dramatic improvement over the 73% annual growth it posted in the prior-year period.
SoundHound is an early mover in the voice-based agentic artificial intelligence (AI) space, and it has huge expansion potential over the long term — but its valuation profile still comes with a risk. The company now has a market capitalization of roughly $4.9 billion and is valued at approximately 31 times this year’s expected sales.
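For readers who want to check that multiple themselves, the arithmetic below uses only the figures cited in this article; the implied revenue number is a derived estimate, not company guidance.

```python
# Valuation arithmetic from the article's own figures: a ~$4.9B market
# cap at about 31 times this year's expected sales implies roughly
# $158M of expected revenue (a derived estimate, not guidance).

market_cap  = 4.9e9   # dollars, per the article
ps_multiple = 31      # forward price-to-sales multiple, per the article

implied_expected_sales = market_cap / ps_multiple
print(f"Implied expected sales: ~${implied_expected_sales / 1e6:.0f}M")

# For context, simply annualizing the $29.1M first quarter (a naive
# run rate, not a forecast) gives:
print(f"Q1 revenue annualized:  ~${29.1e6 * 4 / 1e6:.0f}M")
```

The gap between those two numbers is the continued growth the market is already pricing in.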
For investors with a very high risk tolerance, SoundHound AI could still be a worthwhile investment. The company has been posting very impressive sales momentum, but its valuation already prices in a lot of strong growth in the future. If you’re looking to build a position in SoundHound AI stock, using a dollar-cost-averaging strategy for your purchases may be better than buying in all at once at today’s prices.
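As a simple illustration of the dollar-cost-averaging mechanics mentioned above, the sketch below buys a fixed dollar amount at a handful of hypothetical prices; the prices are made up purely to show the effect and are not related to SOUN’s actual chart.

```python
# Dollar-cost averaging mechanics with made-up prices: a fixed dollar
# amount buys more shares when the price is low, so the average cost
# per share ends up below the simple average of the purchase prices.

monthly_budget      = 1000.0                    # dollars per purchase
hypothetical_prices = [12.0, 9.0, 15.0, 10.0]   # illustrative only

shares    = sum(monthly_budget / p for p in hypothetical_prices)
avg_cost  = monthly_budget * len(hypothetical_prices) / shares
avg_price = sum(hypothetical_prices) / len(hypothetical_prices)

print(f"Average cost per share:   ${avg_cost:.2f}")
print(f"Simple average of prices: ${avg_price:.2f}")
```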
Keith Noonan has no position in any of the stocks mentioned. The Motley Fool has no position in any of the stocks mentioned. The Motley Fool has a disclosure policy.