AI Research

Framework Laptop 12 review: fun, flexible and repairable

Framework, the maker of modular and repairable PCs, has moved into the notoriously difficult-to-fix 2-in-1 category with its latest machine: a fun 12in laptop with a touchscreen and a 360-degree hinge.

The new machine still supports the company’s innovative expansion cards for swapping the ports on its sides, which are cross-compatible with the Framework 13 and 16, among others. And you can still open it up to replace the memory, storage and other internal components with a few simple screws.

The Framework 12 is available in either DIY form, starting at £499 (€569/$549/A$909), or more conventional prebuilt models starting at £749. It sits under the £799-and-up Laptop 13 and £1,399 Laptop 16 as the company’s most compact and affordable model.

The compact notebook is available in a range of two-tone colours, not just grey and black. Photograph: Samuel Gibbs/The Guardian

Where the Laptop 13 is a premium-looking machine, the Laptop 12 is unmistakably chunky and rugged with over-moulded plastic parts for shock protection. It is designed to meet the MIL-STD-810 standard common to rugged electronics. It looks and feels as if it could take a beating, not like a flimsy DIY kit you put together yourself.

The glossy 12.2in screen is bright and relatively sharp. But it is highly reflective, has large black bezels around it and has a relatively narrow colour gamut, which means colours look a little muted. It’s decent enough for productivity but not great for photo editing. The touchscreen rotates all the way back on to the bottom of the machine to turn it into a tablet or it can be folded like a tent or parallel to the keyboard. The screen supports the use of a wide range of first and third-party styluses for drawing or notes, which could make it handy in the classroom.

A selection of fun colours is available for the DIY version, further enhancing its college appeal. The 1080p webcam at the top is decent, although it won’t rival a Surface, and it has a physical privacy switch alongside the mics. The stereo speakers are loud and distortion-free but lack bass and a little clarity, sounding a little hollow compared with the best on the market.

The keyboard is nicely spaced, fairly quiet and pretty good to type on but lacks a backlight. Photograph: Samuel Gibbs/The Guardian

At 1.3kg the Laptop 12 isn’t featherweight but it is nice and compact, easy to fit in bags or on small desks. The generous mechanical trackpad is precise and works well. But the laptop lacks any form of biometrics, with no fingerprint or face recognition, forcing you to enter a PIN or password every time you open the laptop or use secure apps such as password managers, which gets old fast.

Specifications

  • Screen: 12.2in LCD 1920×1200 (60Hz; 186PPI)

  • Processor: Intel Core i3 or i5 (U-series, 13th gen)

  • RAM: 8 or 16GB (up to 48GB)

  • Storage: 512GB (up to 2TB)

  • Operating system: Windows 11 or Linux

  • Camera: 1080p front-facing

  • Connectivity: wifi 6E, Bluetooth 5.3, headphones + choice of 4 ports: USB-C, USB-A, HDMI, DisplayPort, ethernet, microSD, SD

  • Dimensions: 287 x 213.9 x 18.5mm

  • Weight: 1.3kg

Modular ports and performance

The expansion modules slide into sockets on the underside of the laptop and can be swapped at any time to change the available ports. Photograph: Samuel Gibbs/The Guardian

The Laptop 12 comes with a choice of two Intel 13th-generation U-series processors, which are lower-power chips from a few years ago. As tested with the mid-range i5-1334U, it won’t win any raw performance awards but was generally up to more than basic computing. It feels responsive in day-to-day tasks but struggles a bit in longer, processor-heavy jobs such as converting video.

The older chip means the battery life is a little on the short side for 2025, lasting about seven to eight hours of light office-based work using browsers, word processors, note-taking apps and email. Use more demanding apps and the battery life shrinks by a few hours. The battery takes about 100 minutes to fully charge using a 60W or greater USB-C power adaptor.

Four expansion cards can be fitted at any one time, but they can be swapped in and out without having to turn off the laptop. Photograph: Samuel Gibbs/The Guardian

The port selection is entirely customisable with a fixed headphone jack and four slots for expansion cards, which are available in a choice of USB-A and USB-C, DisplayPort and HDMI, microSD and SD card readers, or ethernet. Other cards can add up to 1TB of storage and the USB-C cards are available in a range of solid or translucent colours to make things even brighter. It is an excellent system but note the Laptop 12 supports only USB 3.2 Gen 2, not the faster USB4/Thunderbolt common on new machines.

Sustainability

The high-quality plastic body with over-moulded sides feels well built and durable. Photograph: Samuel Gibbs/The Guardian

Framework rates the battery to maintain at least 80% of its original capacity for at least 1,000 full charge cycles. It can easily be replaced along with all the rest of the components, including the RAM and SSD.

Framework sells replacement parts and upgrades through its marketplace but also supports third-party parts. The laptop contains recycled plastic in many components.

Price

The DIY edition of the Framework 12 starts at £499 (€569/$549/A$909) with pre-built systems starting at £749 (€849/$799/A$1,369) with Windows 11.

For comparison, the DIY Framework 13 costs from £799 and the DIY Framework 16 from £1,399. Similarly specced 2-in-1 Windows machines start at about £500.

Verdict

Like previous Framework machines, the Laptop 12 demonstrates that repairable, upgradable and adaptable computers are possible, work well and can be used by more than just the tech savvy. It manages to be fun in a way most mid-range PCs just aren’t.

The keyboard is solid, the trackpad good and the speakers loud. The modular ports are a killer feature that every PC should embrace, while being able to repair or upgrade it easily is still so unusual. The touchscreen is bright but unremarkable, the lack of any biometrics is irritating, and the older processor, while still decently fast for everyday tasks, means the battery life isn’t long by modern standards.

Its biggest problem is cost, as it is about £150-£200 more expensive than similarly specced but closed and locked-down machines. Unless you already have spare storage and RAM lying around, that’s the price you have to pay for the open and modular machine.

Pros: swappable ports, repairable and upgradeable, fun and durable design, compact, lots of colour choices, solid keyboard and trackpad, solid performance for everyday tasks.

Cons: battery life short of best, screen is bright but a little lacklustre, no biometrics, expensive, older processor, wait time for purchases.

The ports can be colour matched to the body or mixed and matched for fun combinations. Photograph: Samuel Gibbs/The Guardian





I asked ChatGPT to help me pack for my vacation – try this awesome AI prompt that makes planning your travel checklist stress-free

It’s that time of year again, when those of us in the northern hemisphere pack our sunscreen and get ready to venture to hotter climates in search of some much-needed Vitamin D.

Every year, I book a vacation, and every year I get stressed as the big day gets closer, usually forgetting to pack something essential, like a charger for my Nintendo Switch 2, or dare I say it, my passport.





Denodo Announces Plans to Further Support AI Innovation by Releasing Denodo DeepQuery, a Deep Research Capability

PALO ALTO, Calif., July 07, 2025 (GLOBE NEWSWIRE) — Denodo, a leader in data management, announced the Denodo DeepQuery capability, now available as a private preview and soon to be generally available, enabling generative AI (GenAI) to go beyond retrieving facts to investigating, synthesizing, and explaining its reasoning. Denodo also announced the availability of Model Context Protocol (MCP) support as part of the Denodo AI SDK.

Built to address complex, open-ended business questions, DeepQuery will leverage live access to a wide spectrum of governed enterprise data across systems, departments, and formats. Unlike traditional GenAI solutions, which rephrase existing content, DeepQuery, a deep research capability, will analyze complex, open questions and search across multiple systems and sources to deliver well-structured, explainable answers rooted in real-time information. To help users operate this new capability to better understand complex current events and situations, DeepQuery will also leverage external data sources to extend and enrich enterprise data with publicly available data, external applications, and data from trading partners.

DeepQuery, beyond what’s possible using traditional generative AI (GenAI) chat or retrieval augmented generation (RAG), will enable users to ask complex, cross-functional questions that would typically take analysts days to answer—questions like, “Why did fund outflows spike last quarter?” or “What’s driving changes in customer retention across regions?” Rather than piecing together reports and data exports, DeepQuery will connect to live, governed data across different systems, apply expert-level reasoning, and deliver answers in minutes.

Slated to be packaged with the Denodo AI SDK, which streamlines AI application development with pre-built APIs, DeepQuery is being developed as a fully extensible component of the Denodo Platform, enabling developers and AI teams to build, experiment with, and integrate deep research capabilities into their own agents, copilots, or domain-specific applications.

“With DeepQuery, Denodo is demonstrating forward-thinking in advancing the capabilities of AI,” said Stewart Bond, Research VP, Data Intelligence and Integration Software at IDC. “DeepQuery, driven by deep research advances, will deliver more accurate AI responses that will also be fully explainable.”

Large language models (LLMs), business intelligence tools, and other applications are beginning to offer deep research capabilities based on public web data; pre-indexed, data-lakehouse-specific data; or document-based retrieval. But only Denodo is developing deep research capabilities, in the form of DeepQuery, that are grounded in enterprise data across all systems: data delivered in real time, structured, and governed. These capabilities are enabled by the Denodo Platform’s logical approach to data management, supported by a strong data virtualization foundation.

Denodo DeepQuery is currently available in private preview. Denodo is inviting select organizations to join its AI Accelerator Program, which offers early access to DeepQuery capabilities, as well as the opportunity to collaborate with Denodo’s product team to shape the future of enterprise GenAI.

“As a Denodo partner, we’re always looking for ways to provide our clients with a competitive edge,” said Nagaraj Sastry, Senior Vice President, Data and Analytics at Encora. “Denodo DeepQuery gives us exactly that. Its ability to leverage real-time, governed enterprise data for deep, contextualized insights sets it apart. This means we can help our customers move beyond general AI queries to truly intelligent analysis, empowering them to make faster, more informed decisions and accelerating their AI journey.”

Denodo also announced support of Model Context Protocol (MCP), and an MCP Server implementation is now included in the latest version of the Denodo AI SDK. As a result, all AI agents and apps based on the Denodo AI SDK can be integrated with any MCP-compliant client, providing customers with a trusted data foundation for their agentic AI ecosystems based on open standards.
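Because MCP messages are JSON-RPC 2.0, the integration point described above is easy to picture. The sketch below shows the general shape of an MCP tool-call request; the tool name and arguments ("query_data", the SQL string) are hypothetical illustrations, not actual tools exposed by the Denodo AI SDK.

```python
import json

# Build an MCP-style tool-call request. MCP is layered on JSON-RPC 2.0,
# so every request carries "jsonrpc", an "id", a "method" and "params".
# The tool name and arguments below are invented for illustration.
def make_tool_call(request_id, tool_name, arguments):
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

msg = make_tool_call(1, "query_data", {"sql": "SELECT 1"})
print(json.dumps(msg))
```

Any MCP-compliant client frames requests of roughly this shape, which is what lets agents built on the Denodo AI SDK plug into other MCP ecosystems.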

“AI’s true potential in the enterprise lies not just in generating responses, but in understanding the full context behind them,” said Angel Viña, CEO and Founder of Denodo. “With DeepQuery, we’re unlocking that potential by combining generative AI with real-time, governed access to the entire corporate data ecosystem, no matter where that data resides. Unlike siloed solutions tied to a single store, DeepQuery leverages enriched, unified semantics across distributed sources, allowing AI to reason, explain, and act on data with unprecedented depth and accuracy.”

Additional Information

  • Denodo Platform: What’s New
  • Blog Post: Smarter AI Starts Here: Why DeepQuery Is the Next Step in GenAI Maturity
  • Demo: Watch a short video of this capability in action.

About Denodo

Denodo is a leader in data management. The award-winning Denodo Platform is the leading logical data management platform for transforming data into trustworthy insights and outcomes for all data-related initiatives across the enterprise, including AI and self-service. Denodo’s customers in all industries all over the world have delivered trusted AI-ready and business-ready data in a third of the time and with 10x better performance than with lakehouses and other mainstream data platforms alone. For more information, visit denodo.com.

Media Contacts

pr@denodo.com





Sakana AI: Think LLM dream teams, not single models

Enterprises may want to start thinking of large language models (LLMs) as ensemble casts that can combine knowledge and reasoning to complete tasks, according to Japanese AI lab Sakana AI.

Sakana AI, in a research paper, outlined a method called Multi-LLM AB-MCTS (Adaptive Branching Monte Carlo Tree Search) that has a collection of LLMs cooperate, perform trial and error, and leverage their individual strengths to solve complex problems.

In a post, Sakana AI said:

“Frontier AI models like ChatGPT, Gemini, Grok, and DeepSeek are evolving at a breathtaking pace amidst fierce competition. However, no matter how advanced they become, each model retains its own individuality stemming from its unique training data and methods. We see these biases and varied aptitudes not as limitations, but as precious resources for creating collective intelligence. Just as a dream team of diverse human experts tackles complex problems, AIs should also collaborate by bringing their unique strengths to the table.”

Sakana AI said AB-MCTS is a method for inference-time scaling to enable frontier AIs to cooperate and revisit problems and solutions. Sakana AI released the algorithm as an open source framework called TreeQuest, which has a flexible API that allows users to use AB-MCTS for tasks with multiple LLMs and custom scoring.

What’s interesting is that Sakana AI gets out of that zero-sum LLM argument. The companies behind LLM training would like you to think there’s one model to rule them all. And you’d do the same if you were spending so much on training models and wanted to lock in customers for scale and returns.

Sakana AI’s deceptively simple solution can only come from a company that’s not trying to play LLM leapfrog every few minutes. The power of AI is in the ability to maximize the potential of each LLM. Sakana AI said:

“We saw examples where problems that were unsolvable by any single LLM were solved by combining multiple LLMs. This went beyond simply assigning the best LLM to each problem. In (an) example, even though the solution initially generated by o4-mini was incorrect, DeepSeek-R1-0528 and Gemini-2.5-Pro were able to use it as a hint to arrive at the correct solution in the next step. This demonstrates that Multi-LLM AB-MCTS can flexibly combine frontier models to solve previously unsolvable problems, pushing the limits of what is achievable by using LLMs as a collective intelligence.”
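The dynamic in that quote, one model’s imperfect answer becoming another model’s hint, can be sketched with stub “models” standing in for real LLMs. This is a toy illustration of the trial-and-error idea, not TreeQuest’s actual API; the function names, step sizes and scoring are invented for the example.

```python
# Stub "models" standing in for frontier LLMs. Each proposes a candidate
# answer, optionally refining a hint (the best answer found so far).
def model_a(problem, hint=None):
    return (hint or 0) + 2  # takes steady steps of 2

def model_b(problem, hint=None):
    return (hint or 0) + 3  # takes larger steps of 3

def score(problem, answer):
    # Custom scoring function: closer to the target is better.
    return -abs(problem - answer)

def multi_llm_search(problem, models, rounds=10):
    """Trial-and-error loop in the spirit of Multi-LLM AB-MCTS: each
    round, every model proposes a candidate seeded with the best answer
    so far, and the best-scoring candidate survives as the next hint."""
    best, best_score = None, float("-inf")
    for _ in range(rounds):
        for model in models:
            candidate = model(problem, hint=best)
            s = score(problem, candidate)
            if s > best_score:
                best, best_score = candidate, s
    return best

print(multi_llm_search(7, [model_a, model_b]))  # → 7
```

Neither stub reaches the target of 7 on its own (each gets stuck at 6), but the combination lands exactly on it: model_b’s imperfect answer of 5 becomes the hint model_a refines into 7, mirroring how DeepSeek-R1-0528 and Gemini-2.5-Pro built on o4-mini’s incorrect attempt.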

A few thoughts:

  • Sakana AI’s research and its move to emphasize collective intelligence over one LLM and stack is critical for enterprises that need to create architectures that don’t lock them into one provider.
  • AB-MCTS could play into what agentic AI needs to become to be effective and complement emerging standards such as Model Context Protocol (MCP) and Agent2Agent.
  • If combining multiple models to solve problems becomes frictionless, the costs will plunge. Will you need to pay up for OpenAI when you can leverage LLMs like DeepSeek combined with Gemini and a few others? 
  • Enterprises may want to start thinking about how to build decision engines instead of an overall AI stack. 
  • We could see a scenario where a collective of LLMs achieves superintelligence before any one model or provider. If that scenario plays out, can LLM giants maintain valuations?
  • The value in AI may not be in the infrastructure or foundational models in the long run, but the architecture and approaches.
