T3 Chat is an aggregator of several Large Language Models (LLMs)—such as ChatGPT, Claude, DeepSeek, and Llama—all in one chat interface. Its standout feature? Speed. I first discovered it a couple of weeks ago, right after its release, and it’s been my sole AI subscription ever since. It saves me money, provides more answers for the price, and offers an overall better experience for just $8/month (though it’s a little less stable—more on that later).
Unlike many AI tools backed by big corporations, T3 Chat is the brainchild of Theo, a tech influencer, and his small team. Despite their size, they’re making rapid progress. In fact, my previous article about T3 Chat from just two weeks ago is already outdated. A lot of new features have been added since then:
Screenshots: The Feature I Was Waiting For
The feature I was most excited about, screenshots, has finally arrived. While ChatGPT and Claude already offered this functionality, T3 Chat didn’t include it at launch. Now you can send an image to the LLM of your choice directly in T3 Chat, and it works seamlessly.
This is especially useful when writing or debugging code, as you can screenshot both the code and the error message to save time.
New AI Models Added
T3 Chat has introduced three new models: Llama 3.3 70B, DeepSeek V3, and DeepSeek R1. This makes the subscription even more valuable, especially if you want to try a variety of free and premium models side by side without paying for multiple subscriptions. However, the new models are a bit unstable due to DeepSeek’s lagging API, likely caused by the recent surge in demand. The most reliable models remain GPT-4o mini, GPT-4o, o3-mini, and Claude 3.5 Sonnet. I personally prefer Claude as my coding assistant and GPT-4o as my writing assistant, and I switch between them all the time. On the free tier, you get only GPT-4o mini.
The limit for these models is 500 messages per week, which is significantly more than what Claude offers, for well under half the price. For me, this is more than enough. That said, I do wonder how Theo manages to keep the service afloat with all the tokens we’re burning on his dime.
Limits Indicator: A Useful Addition
One of the new features is a limits indicator, which shows how many messages you have left and what your weekly cap is. My first reaction was, “Wait, there are limits?”, because I hadn’t even come close to hitting them. But now, at least, you’ll know exactly where you stand.
T3 Chat: A Work in Progress
My main issue with T3 Chat is its stability. Even some of the “default” models occasionally throw errors, requiring you to retry your query. To be fair, this happens in Claude’s official app as well, but it can still be frustrating. The screenshot feature could also use some visual polish: you don’t see what you’ve sent right away, and the image only appears after the LLM has already responded. In the screenshot, I caught two “Pokémon” at once, both the visual bug with the screenshot and a “stream failed” error. These bugs are rare, though.
Despite its minor flaws, I still really enjoy using T3 Chat. It’s exciting to see it improve almost daily, thanks to Theo and his team’s relentless updates. For just $8/month, you get access to several cutting-edge AI models, a better user experience, faster responses (due to an improved frontend), and more generous limits compared to the original apps. As I mentioned earlier, this is my sole AI subscription at the moment, and I’d like to keep it that way.
You can try T3 Chat here: