Redis, the world's fastest data platform, recently unveiled a major expansion of its AI strategy at the Redis Released 2025 event. The keynote address by CEO Rowan Trollope highlighted several key initiatives, including the acquisition of Decodable, the introduction of LangCache, and numerous advancements to strengthen Redis' position as a critical infrastructure layer for AI applications.
"As AI enters its next phase, the challenge isn't proving what language models can do; it's giving them the context and memory to act with relevance and reliability," Trollope noted. He emphasised how Redis' strategic acquisition of Decodable will streamline data pipeline developments, enabling data conversion into actionable context swiftly and efficiently within Redis.
Decodable, founded by Eric Sammer, offers a serverless platform that simplifies the ingestion, transformation, and delivery of real-time data. By joining forces with Redis, Decodable aims to enhance AI capabilities and seamlessly connect developers with real-time data sources.
Redis also premiered LangCache, a fully managed semantic caching service that cuts latency and token usage by up to 70% in LLM-powered applications. The caching solution optimises performance and significantly reduces costs, supporting Redis' mission to improve AI agent efficiency.
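LangCache's internals aren't public, but the general semantic-caching pattern it describes can be sketched in a few lines: before calling the LLM, look up the prompt's embedding against previously answered prompts and return the stored response on a close-enough match. The toy `embed` function and `SemanticCache` class below are illustrative assumptions, not the LangCache API:

```python
import math

def embed(text):
    # Toy embedding: bag-of-words hashed into a fixed-size vector.
    # A real deployment would use a sentence-embedding model.
    vec = [0.0] * 64
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Both vectors are unit-normalised, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

class SemanticCache:
    """Return a cached response when a new prompt is semantically
    close to a previously answered one, skipping the LLM call."""
    def __init__(self, threshold=0.85):
        self.threshold = threshold
        self.entries = []  # list of (embedding, response) pairs

    def get(self, prompt):
        query = embed(prompt)
        best, best_score = None, 0.0
        for emb, response in self.entries:
            score = cosine(query, emb)
            if score > best_score:
                best, best_score = response, score
        return best if best_score >= self.threshold else None

    def put(self, prompt, response):
        self.entries.append((embed(prompt), response))

cache = SemanticCache()
cache.put("What is the capital of France?", "Paris")
hit = cache.get("what is the capital of France?")   # near-identical prompt
miss = cache.get("Explain Redis data compression")  # unrelated prompt
```

The latency and token savings come from the cache hit path: a semantically repeated prompt never reaches the model, so the only cost is an embedding lookup.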
The key advantages of LangCache include:
Redis continues to adapt to the rapid pace of AI development. Recent integrations make it easier for developers to leverage existing AI frameworks and tools. New integrations, such as AutoGen and Cognee, along with LangGraph enhancements, provide scalable memory solutions for agents and chatbots.
Developers can now:
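The agent-memory pattern these integrations target can be illustrated with a minimal sliding-window sketch: keep only the most recent conversation turns and feed them back as prompt context. The `AgentMemory` class below is a hypothetical illustration, not an API from AutoGen, Cognee, or LangGraph:

```python
from collections import deque

class AgentMemory:
    """Sliding-window conversation memory: the kind of state a
    Redis-backed agent store would persist across sessions."""
    def __init__(self, max_turns=4):
        # Oldest turns are evicted automatically once the window is full.
        self.turns = deque(maxlen=max_turns)

    def add(self, role, content):
        self.turns.append({"role": role, "content": content})

    def context(self):
        # Prompt context assembled from the retained turns.
        return list(self.turns)

memory = AgentMemory(max_turns=2)
memory.add("user", "Remind me about LangCache")
memory.add("assistant", "LangCache is Redis' semantic caching service.")
memory.add("user", "How much latency does it save?")
# Only the two most recent turns remain in the window.
```

A production store would persist these turns server-side (for example, in Redis) so that the window survives restarts and can be shared across agent instances.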
Additional Redis for AI Enhancements
Redis' evolution continues with key improvements to hybrid search and data compression for AI applications. The upgrades include:
These latest updates ensure that Redis remains a pivotal platform for developing high-quality, reliable AI agents and frameworks.