The bill, proposed by U.S. senators earlier this week to regulate shipments of AI GPUs to adversaries and prioritize U.S. buyers, made quite a splash in America. In response, Nvidia issued a statement claiming that the U.S. was, is, and will remain its primary market, implying that no regulation is needed for the company to serve American customers.
“The U.S. has always been and will continue to be our largest market,” a statement sent to Tom’s Hardware reads. “We never deprive American customers in order to serve the rest of the world. In trying to solve a problem that does not exist, the proposed bill would restrict competition worldwide in any industry that uses mainstream computing chips. While it may have good intentions, this bill is just another variation of the AI Diffusion Rule and would have similar effects on American leadership and the U.S. economy.”
The new export rules would apply even to older AI GPUs (assuming they are still in production) such as Nvidia’s HGX H20 or L2 PCIe, which still meet the performance thresholds defined by the Biden administration. Although Nvidia has claimed that H20 shipments to China do not interfere with the domestic supply of H100, H200, or Blackwell chips, the new legislation would formalize such limits on future transactions.
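For context, those thresholds are expressed in terms of “Total Processing Performance” (TPP), which the October 2023 BIS rule defines as 2 × MacTOPS × the bit length of the operation. Below is a minimal sketch of that arithmetic under the published ECCN 3A090.a criteria; it is an illustration, not a compliance tool, and the sample accelerator numbers are invented.

```python
# Minimal sketch of the BIS "Total Processing Performance" (TPP) arithmetic
# behind ECCN 3A090.a (October 2023 rule). Illustration only; the sample
# accelerator figures below are hypothetical, not vendor specs.

def tpp(mac_tops: float, bit_length: int) -> float:
    # BIS definition: TPP = 2 x MacTOPS x bit length of the operation
    return 2.0 * mac_tops * bit_length

def meets_3a090a(tpp_value: float, die_area_mm2: float) -> bool:
    # 3A090.a: TPP >= 4800, or TPP >= 1600 with a performance
    # density (TPP / die area in mm^2) >= 5.92
    density = tpp_value / die_area_mm2
    return tpp_value >= 4800 or (tpp_value >= 1600 and density >= 5.92)

# Hypothetical accelerator: 200 dense FP16 MacTOPS on an 800 mm^2 die
example = tpp(mac_tops=200, bit_length=16)        # 6400.0
print(meets_3a090a(example, die_area_mm2=800.0))  # True
```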
OpenAI has sharply raised its projected cash burn through 2029 to $115 billion, according to The Information. This marks an $80 billion increase from previous estimates, as the company ramps up spending to fuel the AI behind its ChatGPT chatbot.
The company, which has become one of the world’s biggest renters of cloud servers, projects it will burn more than $8 billion this year, about $1.5 billion higher than its earlier forecast. The surge in spending comes as OpenAI seeks to maintain its lead in the rapidly growing artificial intelligence market.
To control these soaring costs, OpenAI plans to develop its own data center server chips and facilities to power its technology.
The company is partnering with U.S. semiconductor giant Broadcom to produce its first AI chip, which will be used internally rather than made available to customers, as reported by The Information.
In addition to this initiative, OpenAI has expanded its partnership with Oracle, committing to 4.5 gigawatts of data center capacity to support its growing operations.
This is part of OpenAI’s larger plan, the Stargate initiative, which includes a $500 billion investment and is also supported by Japan’s SoftBank Group. Google Cloud has also joined the group of suppliers supporting OpenAI’s infrastructure.
OpenAI’s projected cash burn will more than double in 2026, reaching over $17 billion, and will continue to rise, with estimates of $35 billion in 2027 and $45 billion in 2028, according to The Information.
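Taken together, the yearly figures roughly account for the $115 billion total. A quick back-of-the-envelope check, treating the “more than” figures as point estimates and assuming the $115 billion spans 2025 through 2029:

```python
# Back-of-the-envelope check of the reported burn trajectory (figures in $B,
# per The Information). "More than" amounts are treated as point estimates,
# and the $115B total is assumed to span 2025-2029.
burn = {2025: 8, 2026: 17, 2027: 35, 2028: 45}
total_through_2029 = 115
print(f"2025-2028 sum: ${sum(burn.values())}B")                             # $105B
print(f"implied 2029 burn: ~${total_through_2029 - sum(burn.values())}B")   # ~$10B
```

Under those assumptions, roughly $10 billion of burn would remain for 2029.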