Understand why Superlinked is the solution for better search and recommendation systems
“We saw 2x more keyword searches 6 months after the ChatGPT launch.” — Algolia CTO, 2023

Algolia has 17,000 customers accounting for 120B searches per month, and the trend isn’t isolated to them. Across industries, we’re seeing a shift towards more sophisticated search queries that blend multiple concepts, contexts, and data types. Vector search with text-only (or even multi-modal) embeddings fails on these complex queries, because complex queries are never just about text: they involve other data too. Consider a query like:
"recent news about crop yield"
A query like this mixes semantic relevance ("crop yield") with a constraint on recency ("recent") that lives in a timestamp, not in the text. With Superlinked, after collecting your data, you define your schema, ingest the data, and build an index as follows:
Define your schemas
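A minimal sketch of what the schema might look like, assuming a recent version of the superlinked Python package and an illustrative news dataset; the class and field names here are placeholders, not taken from the original:

```python
from superlinked import framework as sl

# Each record is a news article; the field names below are illustrative.
class NewsArticle(sl.Schema):
    id: sl.IdField            # unique identifier for each article
    headline: sl.String       # free text to embed semantically
    created_at: sl.Timestamp  # publication time, used for recency
    views: sl.Integer         # popularity signal, used for number embedding

news = NewsArticle()
```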
Create embedding spaces
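Each space embeds one attribute with an encoder suited to its type. A sketch continuing from the schema above; the embedding model, time periods, and value ranges are illustrative choices, not prescribed by Superlinked:

```python
from datetime import timedelta
from superlinked import framework as sl

# Continues from the `news` schema sketched above.

# Text attribute -> semantic embedding (model name is illustrative).
relevance_space = sl.TextSimilaritySpace(
    text=news.headline,
    model="sentence-transformers/all-mpnet-base-v2",
)

# Timestamp attribute -> recency embedding; relevance decays over these periods.
recency_space = sl.RecencySpace(
    timestamp=news.created_at,
    period_time_list=[
        sl.PeriodTime(timedelta(days=30)),
        sl.PeriodTime(timedelta(days=365)),
    ],
)

# Numeric attribute -> monotonic number embedding (higher view counts preferred).
popularity_space = sl.NumberSpace(
    number=news.views,
    min_value=0,
    max_value=1_000_000,
    mode=sl.Mode.MAXIMUM,
)
```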
Build your index
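The index combines the spaces so that text relevance, recency, and popularity are searched together as parts of one multi-attribute vector. A sketch continuing from the spaces above:

```python
from superlinked import framework as sl

# Continues from the spaces sketched above.
news_index = sl.Index([relevance_space, recency_space, popularity_space])
```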
Define parameterized queries
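Queries expose their inputs and per-space weights as named parameters, so the relative importance of relevance, recency, and popularity can be tuned at query time. A sketch continuing from the index above; parameter names are illustrative, and the exact query-builder clauses may differ between superlinked versions:

```python
from superlinked import framework as sl

# Continues from the index sketched above.
news_query = (
    sl.Query(
        news_index,
        weights={
            relevance_space: sl.Param("relevance_weight"),
            recency_space: sl.Param("recency_weight"),
            popularity_space: sl.Param("popularity_weight"),
        },
    )
    .find(news)
    # Free-text input compared against the headline embedding.
    .similar(relevance_space.text, sl.Param("query_text"))
    .limit(10)
)
```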
Set up the execution environment
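For experimentation, an in-memory source and executor can ingest records and serve queries from the same Python process. A sketch continuing from the definitions above; the sample record and weight values are illustrative:

```python
from superlinked import framework as sl

# Continues from the schema, index, and query sketched above.
source = sl.InMemorySource(news)
executor = sl.InMemoryExecutor(sources=[source], indices=[news_index])
app = executor.run()

# Ingest a sample record (values are illustrative).
source.put([{
    "id": "article-1",
    "headline": "Record wheat crop yield reported across the Midwest",
    "created_at": 1718000000,  # Unix timestamp in seconds
    "views": 12_500,
}])

# Run the parameterized query with explicit weights.
result = app.query(
    news_query,
    query_text="crop yield",
    relevance_weight=1.0,
    recency_weight=0.5,
    popularity_weight=0.25,
)
print(result)
```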
Handle natural language queries
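Superlinked can also translate a free-form sentence into the query parameters above using an LLM. A sketch assuming the with_natural_query clause and an OpenAI client config; the model name and the way the API key is read are illustrative:

```python
import os
from superlinked import framework as sl

# Continues from the index and app sketched above.
openai_config = sl.OpenAIClientConfig(
    api_key=os.environ["OPENAI_API_KEY"],  # illustrative way to pass the key
    model="gpt-4o",
)

nl_query = (
    sl.Query(
        news_index,
        weights={
            relevance_space: sl.Param("relevance_weight"),
            recency_space: sl.Param("recency_weight"),
            popularity_space: sl.Param("popularity_weight"),
        },
    )
    .find(news)
    .similar(relevance_space.text, sl.Param("query_text"))
    .limit(10)
    # The LLM fills in query_text and the weights from the sentence below.
    .with_natural_query(sl.Param("natural_query"), openai_config)
)

result = app.query(nl_query, natural_query="recent news about crop yield")
print(result)
```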
OpenAI text embeddings produce noisy, non-monotonic cosine similarity scores when numbers are embedded as strings. For example, CosSim(25, 50) = 0.69 while CosSim(32, 50) = 0.42, implying that 25 is more similar to 50 than 32 is, which makes no sense. Superlinked's number embeddings avoid such inconsistencies by design.
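A quick way to see the effect is to embed the numbers as strings with a text embedding model and compare cosine similarities yourself; a sketch using the openai Python client and numpy, where the exact scores depend on the embedding model and will not necessarily match the figures quoted above:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def embed(texts: list[str]) -> np.ndarray:
    # Embedding model name is illustrative.
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

def cos_sim(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

v25, v32, v50 = embed(["25", "32", "50"])

# With text embeddings, similarity need not decrease monotonically with
# numeric distance: "25" can score closer to "50" than "32" does.
print("CosSim(25, 50) =", cos_sim(v25, v50))
print("CosSim(32, 50) =", cos_sim(v32, v50))
```

By contrast, a NumberSpace like the popularity_space sketched earlier encodes the value directly, so similarity changes monotonically with numeric distance.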