Redis Vector Search

Redis Vector Search refers to the vector similarity search capabilities added to Redis Stack and Redis Enterprise starting in 2022, leveraging the FT.SEARCH command of the RediSearch module to perform HNSW or FLAT index queries over vector fields. Redis brings sub-millisecond latency and in-memory throughput to vector search, making it especially attractive for low-latency recommendation, fraud detection, and real-time RAG applications where every millisecond counts. The platform supports cosine, Euclidean, and inner product distance metrics, hybrid queries combining vectors with structured filters, and tag fields for multi-tenant isolation. Redis Cloud offers managed deployments across AWS, GCP, and Azure with vector capabilities turned on by default. Because Redis is already deployed at most enterprises for caching and session management, adding vector search reuses existing operational expertise and AI governance controls. The trade-off is RAM cost — large vector indexes can be expensive to keep entirely in memory — which Redis mitigates with Redis Flex (RAM + flash tiering).
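To make the mechanics concrete, the sketch below builds the raw command shapes for an HNSW vector index and a hybrid KNN query, and shows how an embedding is serialized into the binary blob Redis expects. The index name `idx:docs`, the `embedding` and `category` field names, and the 4-dimensional vector are illustrative assumptions; in practice these arguments would be sent through a client library such as redis-py against a Redis Stack instance.

```python
import struct

# Hypothetical schema: 4-dimensional float32 embeddings stored on hash keys "doc:*".
DIM = 4

def pack_vector(vec):
    """Serialize a float32 vector into the little-endian byte blob
    that Redis expects for VECTOR fields and query PARAMS."""
    return struct.pack(f"<{len(vec)}f", *vec)

# FT.CREATE arguments for an HNSW index with cosine distance,
# plus a TAG field usable for multi-tenant filtering (illustrative names).
create_cmd = [
    "FT.CREATE", "idx:docs", "ON", "HASH", "PREFIX", "1", "doc:",
    "SCHEMA",
    "embedding", "VECTOR", "HNSW", "6",
    "TYPE", "FLOAT32", "DIM", str(DIM), "DISTANCE_METRIC", "COSINE",
    "category", "TAG",
]

# Hybrid query: apply a structured tag filter first, then rank the
# 3 nearest vectors. DIALECT 2 is required for the KNN query syntax.
query = "(@category:{news})=>[KNN 3 @embedding $vec AS score]"
search_cmd = [
    "FT.SEARCH", "idx:docs", query,
    "PARAMS", "2", "vec", pack_vector([0.1, 0.2, 0.3, 0.4]),
    "SORTBY", "score", "DIALECT", "2",
]
```

A client would execute these with something like `r.execute_command(*create_cmd)`; the key point is that the vector travels as a packed float32 blob (4 bytes per dimension), and the filter-plus-KNN syntax is what lets Redis combine structured predicates with similarity ranking in one query.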

Redis Vector Search + Centralpoint: Centralpoint integrates Redis Vector Search as a low-latency option in the model-agnostic stack, pairing it with any generative LLM for real-time retrieval-augmented chatbots. The platform meters tokens centrally, keeps prompts and skills local, and embeds Redis-backed chatbots across portals through one line of JavaScript.
