Clarifai’s new reasoning engine makes AI models faster and less expensive

On Thursday, the AI platform Clarifai announced a new reasoning engine that it claims will make running AI models twice as fast and 40% less expensive. Designed to be adaptable to a variety of models and cloud hosts, the system employs a range of optimizations to get more inference power out of the same hardware.

“It’s a variety of different types of optimizations, all the way down to CUDA kernels to advanced speculative decoding techniques,” said CEO Matthew Zeiler. “You can get more out of the same cards, basically.”
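Speculative decoding, one of the techniques Zeiler mentions, pairs a small, fast "draft" model with the large "target" model: the draft proposes several tokens at once, and the target verifies them in a single batched pass, emitting multiple tokens per expensive forward pass. The toy sketch below illustrates the idea only; the two stand-in "models" are simple functions, not anything from Clarifai's engine.

```python
# Toy illustration of speculative decoding. Both "models" are hypothetical
# stand-ins: target_next is the slow, authoritative model; draft_next is a
# cheap model that is usually (but not always) right.

def target_next(prefix):
    # Stand-in for the large model's greedy next token.
    return sum(prefix) % 7

def draft_next(prefix):
    # Stand-in for the small draft model: wrong every 4th position.
    guess = sum(prefix) % 7
    return guess if len(prefix) % 4 else (guess + 1) % 7

def speculative_decode(prompt, n_tokens, k=4):
    out = list(prompt)
    target_passes = 0
    while len(out) - len(prompt) < n_tokens:
        # 1. Draft model proposes k tokens autoregressively (cheap).
        proposal, ctx = [], list(out)
        for _ in range(k):
            t = draft_next(ctx)
            proposal.append(t)
            ctx.append(t)
        # 2. Target model verifies all k proposals; in a real system this
        #    is one batched forward pass, counted once here.
        target_passes += 1
        accepted, ctx = [], list(out)
        for t in proposal:
            correct = target_next(ctx)
            if t == correct:
                accepted.append(t)
                ctx.append(t)
            else:
                # First mismatch: take the target's token instead and stop.
                accepted.append(correct)
                break
        out.extend(accepted)
    return out[len(prompt):][:n_tokens], target_passes
```

Because every accepted token is checked against the target model, the output is identical to decoding with the target alone; the savings come from needing fewer target passes than tokens whenever the draft's guesses hold up.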

The results were verified in a series of benchmark tests by the third-party firm Artificial Analysis, which recorded industry-best performance for both throughput and latency.

The process focuses specifically on inference, the computing demands of operating an AI model that has already been trained. That computing load has grown particularly intense with the rise of agentic and reasoning models, which require multiple steps in response to a single command.

First launched as a computer vision service, Clarifai has grown increasingly focused on compute orchestration as the AI boom has drastically increased demand for both GPUs and the data centers that house them. The company first announced its compute platform at AWS re:Invent in December, but the new reasoning engine is the first product specifically tailored for multi-step agentic models.

The product comes amid intense pressure on AI infrastructure, which has spurred a string of billion-dollar deals. OpenAI has laid out plans for as much as $1 trillion in new data center spending, projecting nearly limitless future demand for compute. But while the hardware buildout has been intense, Clarifai’s CEO believes there is more to be done in optimizing the infrastructure we already have.

“There’s software tricks that take a good model like this further, like the Clarifai reasoning engine,” Zeiler said, “but there’s also algorithm improvements that can help combat the need for gigawatt data centers. And I don’t think we’re at the end of the algorithm innovations.”
