TTT models could be the next frontier for generative AI

By TechBrunch · July 17, 2024 · 4 min read


After years of dominance by the transformer, the form of AI behind most of today's generative models, the search for new architectures has begun.

Transformers are the foundation of OpenAI's video generation model Sora, and are at the core of text generation models such as Anthropic's Claude, Google's Gemini, and GPT-4o. But Transformers are starting to run into technical hurdles, particularly those related to computation.

Transformers, at least when running on commodity hardware, are not particularly efficient at processing and analyzing vast amounts of data. As companies build out infrastructure to accommodate transformers' requirements, electricity demand is growing sharply, perhaps unsustainably.

One promising architecture proposed this month is Test-Time Training (TTT), which was developed over the course of a year and a half by researchers at Stanford University, UC San Diego, UC Berkeley, and Meta. The team argues that not only can the TTT model process much more data than the Transformer, it can do so without consuming as much computing power.

The transformer's hidden state

The basic component of a Transformer is the “hidden state”, which is essentially a long list of data. When a Transformer processes something, it “remembers” what it processed by adding an entry to the hidden state. For example, if the model is processing books, the values in the hidden state might be representations of words (or parts of words).
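As a toy sketch (my own illustration, not code from the paper), the hidden state can be pictured as a list that gains one entry per token processed:

```python
# Toy illustration of a transformer-style "hidden state": a lookup table
# that grows by one entry for every token the model processes.
def process(tokens):
    hidden_state = []                      # the ever-growing lookup table
    for tok in tokens:
        entry = sum(ord(c) for c in tok)   # stand-in for a learned representation
        hidden_state.append(entry)
    return hidden_state

state = process(["the", "quick", "brown", "fox"])
# len(state) == 4: the memory footprint grows linearly with the text seen.
```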

“If you think of the Transformer as an intelligent entity, the lookup table, that hidden state, is the Transformer's brain,” Yu Sun, a postdoctoral researcher at Stanford University and collaborator on the TTT study, told TechCrunch. “This specialized brain is what enables the Transformer's well-known features, such as in-context learning.”

Hidden state is one of the things that makes Transformers so powerful, but it also holds them back: for the Transformer to “say” even one word about a book it has read, the model must scan the entire lookup table, a task as computationally intensive as re-reading the entire book.
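A rough cost model (my own illustration, not the paper's analysis) shows why that scan hurts: emitting the t-th token means revisiting all t entries accumulated so far, so total work grows quadratically with sequence length:

```python
# Rough cost model: each new output scans every hidden-state entry so far,
# so generating n tokens costs 1 + 2 + ... + n scan operations in total.
def attention_ops(n_tokens):
    return sum(t for t in range(1, n_tokens + 1))

# Doubling the context roughly quadruples the total work:
# attention_ops(1000) == 500500, attention_ops(2000) == 2001000
```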

So Sun and his team came up with the idea of replacing the hidden state with a machine learning model—an AI nesting doll, if you like, a model within a model.

This gets a bit technical, but the gist is that the internal machine learning model of the TTT model, unlike a Transformer lookup table, does not get larger and larger as it processes additional data. Instead, it encodes the data it processes into representative variables called weights. This is what makes the TTT model so performant: the size of the internal model remains the same regardless of how much data the TTT model processes.
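A minimal sketch of that idea, under my own simplifying assumptions (a single scalar weight and a reconstruction loss; the paper's inner model is far richer): each incoming value is folded into fixed-size weights by a gradient step, so the state never grows:

```python
# Highly simplified TTT-style update: compress a stream into one weight
# via online gradient descent on the loss (w*x - x)^2, instead of
# appending each item to a growing table.
def ttt_compress(stream, lr=0.01):
    w = 0.0                          # the fixed-size "inner model"
    for x in stream:
        grad = 2 * (w * x - x) * x   # d/dw of (w*x - x)^2
        w -= lr * grad
    return w

# The state stays a single float no matter how long the stream is;
# with enough data, w converges toward 1.0, the loss minimizer.
w_short = ttt_compress([0.5, -1.0, 2.0])
w_long = ttt_compress([0.5, -1.0, 2.0] * 1000)
```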

Sun believes future TTT models will be able to efficiently process billions of pieces of data, from words to images, voice recordings and videos, far beyond the capabilities of current models.

“Our system can say X words about a book without the computational complexity of re-reading the book X times,” Sun said. “Large-scale video models based on Transformers such as Sora can only process 10-second videos because they only have a lookup table 'brain'. Our ultimate goal is to develop a system that can process long videos resembling the visual experience of a human life.”

Skepticism about TTT models

So, will the TTT models eventually replace Transformers? Possibly, but it's too early to say for sure.

TTT models are not a drop-in replacement for transformers, and because the researchers developed only two small models for their study, it is difficult to compare TTT against the larger Transformer implementations currently on the market.

“I think it's a really interesting innovation, and if the data supports the claim that it yields efficiency gains then that's great news, but I couldn't tell you if it's better than existing architectures,” says Mike Cook, a senior lecturer in the department of informatics at King's College London, who was not involved in the TTT research. “An old professor of mine used to tell a joke when I was an undergrad: 'How do you solve any problem in computer science? Add another layer of abstraction.' Adding a neural network inside a neural network definitely reminds me of that.”

Either way, the accelerated research into transformer alternatives signals a growing awareness of the need for innovative solutions.

This week, AI startup Mistral released Codestral Mamba, a model based on another alternative to Transformers called a state-space model (SSM). Like the TTT model, SSMs are more computationally efficient than Transformers and can scale to larger amounts of data.
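In skeletal form (a scalar sketch of my own; real SSMs use learned matrices and, in Mamba's case, input-dependent parameters), a state-space model also keeps a fixed-size state that it updates recurrently:

```python
# Scalar state-space recurrence: h_t = a*h_{t-1} + b*x_t, y_t = c*h_t.
# The state h has constant size, so cost scales linearly with length.
def ssm_scan(xs, a=0.9, b=1.0, c=1.0):
    h, ys = 0.0, []
    for x in xs:
        h = a * h + b * x        # fixed-size state, updated each step
        ys.append(c * h)
    return ys

ys = ssm_scan([1.0, 0.0, 0.0])   # ≈ [1.0, 0.9, 0.81]: an impulse decays by a
```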

AI21 Labs is also exploring SSMs, as is Cartesia, which pioneered some of the first SSMs as well as Codestral Mamba's namesakes, Mamba and Mamba-2.

If these efforts are successful, for better or worse, generative AI could become even more accessible and pervasive than it is today.


