Open source vector database startup Qdrant raises $28M

Qdrant founding team

Image Credits: Qdrant — Qdrant founders

Qdrant, the company behind the eponymous open source vector database, has raised $28 million in a Series A round of funding led by Spark Capital.

Founded in 2021, Berlin-based Qdrant is seeking to capitalize on the burgeoning AI revolution, targeting developers with an open source vector search engine and database — an integral part of generative AI, which requires relationships be drawn between unstructured data (e.g. text, images or audio that isn’t labelled or otherwise organized), even when that data is “dynamic” within real-time applications. As per Gartner data, unstructured data makes up around 90% of all new enterprise data, and is growing three times faster than its structured counterpart.

The vector database realm is hot. In recent months we’ve seen the likes of Weaviate raise $50 million for its open source vector database, while Zilliz secured $60 million to commercialize the Milvus open source vector database. Elsewhere, Chroma landed $18 million in seed funding for a similar proposition, while Pinecone nabbed $100 million for a proprietary alternative.

Qdrant, for its part, raised $7.5 million last April, further highlighting the seemingly insatiable appetite investors have for vector databases — while also pointing to a planned growth spurt on Qdrant’s part.

“The plan was to go into the next fundraising in the second quarter this year, but we received an offer a few months earlier and decided to save some time and start scaling the company now,” Qdrant CEO and co-founder Andre Zayarni explained to TechCrunch. “Fundraising and hiring the right people always take time.”

Of note, Zayarni says that the company actually rebuffed a potential acquisition offer from a “major database market player” at the same time as receiving the follow-on investment offer. “We went with the investment,” he said, adding that the company will use the fresh cash injection to build out its business team, given that it substantively consists of engineers at the moment.

Binary logic

In the nine months since its last raise, Qdrant has launched a new compression feature called binary quantization (BQ), aimed at low-latency, high-throughput indexing, which the company says can cut memory consumption by as much as 32 times and speed up retrieval by around 40 times.

“Binary quantization is a way to ‘compress’ the vectors to simplest possible representation with just zeros and ones,” Zayarni said. “Comparing the vectors becomes the simplest CPU instruction — this makes it possible to significantly speed up the queries and save dramatically on memory usage. The theoretical concept is not new, but we implemented it the way that there is very little loss of accuracy.”
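The mechanics Zayarni describes can be sketched in a few lines of plain Python. This is a toy illustration of the general technique, not Qdrant’s implementation: each float component is collapsed to a single bit, and comparing two vectors reduces to XOR plus a bit count (a Hamming distance).

```python
import random

def binarize(vec):
    """1 bit per dimension: set the bit where the component is positive.
    Packing 32-bit floats down to single bits is where the ~32x memory
    saving comes from."""
    bits = 0
    for x in vec:
        bits = (bits << 1) | (1 if x > 0 else 0)
    return bits

def hamming(a, b):
    """XOR then popcount: comparing codes costs a handful of CPU
    instructions per machine word instead of a float dot product."""
    return bin(a ^ b).count("1")

random.seed(0)
docs = [[random.gauss(0, 1) for _ in range(128)] for _ in range(500)]
codes = [binarize(d) for d in docs]

# A slightly noisy copy of doc 42 should still be closest to doc 42:
# only a few bits flip, versus ~64 differing bits for an unrelated doc.
query = [x + random.gauss(0, 0.1) for x in docs[42]]
qcode = binarize(query)
best = min(range(len(codes)), key=lambda i: hamming(codes[i], qcode))
print(best)
```

The accuracy loss Zayarni mentions shows up when components sit near zero, where small perturbations flip the sign bit; production systems typically compensate by rescoring the top binary hits with the original float vectors.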

BQ might not work for all AI models, though, and it’s entirely up to the user to decide which compression option will work best for their use cases. Zayarni says the best results so far came with OpenAI’s models, while models from Cohere and Google’s Gemini also performed well. The company is currently benchmarking against models from the likes of Mistral and Stability AI.

It’s such endeavors that have helped attract high-profile adopters, including Deloitte, Accenture, and — arguably the highest profile of them all — X (née Twitter). Or perhaps more accurately, Elon Musk’s xAI, a company developing the ChatGPT competitor Grok and which debuted on the X platform last month.

While Zayarni didn’t disclose details of how X or xAI is using Qdrant, citing a non-disclosure agreement (NDA), it’s reasonable to assume that it involves processing real-time data. Grok is powered by a generative AI model dubbed Grok-1, trained on data from the web and feedback from humans; given its now-tight alignment with X, it can also incorporate real-time data from social media posts into its responses. This technique is known as retrieval-augmented generation (RAG), and Elon Musk has teased such use cases publicly over the past few months.
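The RAG pattern itself is straightforward to sketch. The toy posts, hand-made three-dimensional embeddings, and helper names below are all invented for illustration; a real pipeline would use an embedding model for the vectors and a vector database such as Qdrant for the nearest-neighbor search.

```python
import math

# Toy "posts" with hand-made embeddings standing in for model output.
posts = {
    "rockets launched today": [0.9, 0.1, 0.0],
    "new phone released":     [0.1, 0.9, 0.1],
    "rocket landing footage": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, k=2):
    """Retrieval step: rank stored posts by similarity to the query vector."""
    ranked = sorted(posts, key=lambda p: cosine(posts[p], query_vec), reverse=True)
    return ranked[:k]

def build_prompt(question, query_vec):
    """Augmentation step: prepend retrieved real-time context before
    handing the prompt to the language model."""
    context = "\n".join(retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What happened with the rocket?", [1.0, 0.0, 0.0]))
```

The generation step (feeding the augmented prompt to the model) is omitted; the point is that the model sees fresh retrieved context rather than relying only on its training data.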

Qdrant doesn’t reveal which of its customers are using the open source Qdrant incarnation and which are using its managed services, but it did point to a number of startups, such as GitBook, VoiceFlow, and Dust, which are “mostly” using its managed cloud service — this, effectively, saves resource-restricted companies from having to manage and deploy everything themselves as they would have to with the core open source incarnation.

However, Zayarni is adamant that the company’s open source credentials are one of the major selling points, even if a company elects to pay for add-on services.

“When using a proprietary or cloud-only solution, there is always a risk of vendor lock-in,” Zayarni said. “If the vendor decides to adjust the pricing, or change other terms, customers need to agree or consider a migration to an alternative, which isn’t easy if it’s a heavy-production use-case. With open source, there is always more control over your data, and it is possible to switch between different deployment options.”

Alongside the funding today, Qdrant is also officially releasing its managed “on-premise” edition, giving enterprises the option to host everything internally but tap the premium features and support provided by Qdrant. This follows last week’s news that Qdrant’s cloud edition was landing on Microsoft Azure, adding to the existing AWS and Google Cloud Platform support.

Aside from lead backer Spark Capital, Qdrant’s Series A round included participation from Unusual Ventures and 42cap.

Aerospike raises $109M for its real-time database platform to capitalize on the AI boom

Image Credits: Yuichiro Chino / Getty Images

NoSQL database Aerospike today announced that it has raised a $109 million Series E round led by Sumeru Equity Partners. Existing investor Alsop Louie Partners also participated in this round.

In 2009, the company started as a key-value store with a focus on the adtech industry; Aerospike has since diversified its offerings quite a bit. Today, its core offering is a NoSQL database that’s optimized for real-time use cases at scale.

In 2022, Aerospike added document support and then followed that up with graph and vector capabilities — two database features that are crucial for building real-time AI and ML applications.

“We were founded primarily as a real-time data platform that can work with data at really high scale, or, as we call it, unlimited scale,” Aerospike CEO Subbu Iyer said. “We’ve been fortunate enough that a lot of our customers have either started their journey at scale with us, or started the journey earlier and grown into the platform. So our premise has held good that real-time data and real-time access to data is going to be important pretty much across every industry. Our founding principles were really to deliver real-time performance with data at any scale, and the lowest [total cost of ownership] on the market.”

In part, Aerospike, which offers its service as a hosted platform and on-premises, is able to deliver on this promise through its hybrid memory architecture that allows it to augment the use of RAM to speed up data access with fast flash storage — or any combination of the two. Aerospike competitor Redis recently acquired Speedb to offer similar capabilities — also with an eye on helping its customers reduce costs.

Image Credits: Aerospike

Today, the company’s customers include the likes of Airtel, TransUnion, Snap and TechCrunch parent company Yahoo.

Right now, though, it’s definitely the AI boom that is driving a lot of interest in Aerospike and the company wants to be in a position to capitalize on that through this new funding round.

Unsurprisingly, that means the company plans to use the new funding to accelerate its innovations around AI, which are mostly focused on its graph and vector capabilities. Iyer told me that Aerospike is specifically looking at combining those two capabilities.

“Going forward, there are some synergistic ways in which graph and vectors can come together,” he said. “A simple use case I use for this, for example, is if you’re looking for a specific document and you have embeddings and stored them in a vector database, you want to use a vector search to get to that specific document. But if you’re looking for a set of similar documents, a vector search can get you to the neighborhood and then a graph can get you a similar corpus of documents because of relationships and stuff.”
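The pattern Iyer describes, a vector search to reach the neighborhood and then a graph hop to pull in related documents, can be sketched as follows. The document IDs, two-dimensional embeddings, and relationships are all made up for illustration; they are not Aerospike’s API.

```python
import math

# Hypothetical documents with toy embeddings.
embeddings = {
    "a": [1.0, 0.0],
    "b": [0.9, 0.1],
    "c": [0.0, 1.0],
    "d": [0.1, 0.9],
}
# Explicit relationships between documents, e.g. citations or shared authors.
graph = {"a": ["c"], "b": [], "c": ["d"], "d": []}

def nearest(query):
    """Vector step: similarity search to find the closest document."""
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        return dot / (math.hypot(*u) * math.hypot(*v))
    return max(embeddings, key=lambda k: cos(embeddings[k], query))

def expand(doc, hops=1):
    """Graph step: follow relationships outward from the vector hit."""
    seen, frontier = {doc}, [doc]
    for _ in range(hops):
        frontier = [n for f in frontier for n in graph[f] if n not in seen]
        seen.update(frontier)
    return seen

hit = nearest([1.0, 0.05])   # vector search lands on "a"
related = expand(hit)        # graph expansion adds "c", which similarity alone would miss
print(sorted(related))
```

Note that document “c” is nowhere near the query in embedding space; it surfaces only through the explicit relationship, which is the complementary value of the graph step.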

That, of course, is also what got investors interested in the company. Aerospike raised its last round in 2019 and, according to the company’s CEO, it didn’t need to raise now, but there is a large opportunity for Aerospike to capitalize on, something Sumeru co-founder and managing director George Kadifa also stressed.

“AI is transforming the economy and presents new opportunities for growth and innovation,” Kadifa said. “Aerospike, with its impressive customer base and performance advantage at scale, is uniquely positioned to become a foundational element for the next generation of real-time AI applications.”