
Things Not Strings: Understanding Search Intent with Better Recall

For every growing company that relies on an out-of-the-box search solution, there comes a point when the corpus and query volume grow so large that consistently showing relevant results requires a system that understands user search intent.

We ran into exactly this problem at DoorDash, where, after setting up a basic “out-of-the-box” search engine, the team had focused largely on reliability.

Building a Gigascale ML Feature Store with Redis, Binary Serialization, String Hashing, and Compression

When a company with millions of consumers, such as DoorDash, builds machine learning (ML) models, feature data can grow to billions of records, with millions of them actively retrieved during model inference under low-latency constraints.
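The title names the building blocks; purely as an illustration of how they might fit together, here is a minimal Python sketch of a Redis-backed feature store that shortens feature names with a string hash, binary-serializes values, and compresses large payloads. It is not DoorDash's implementation: the key prefix, hash choice, serializer, and compression threshold are all assumptions for the example, and the full post covers the actual design choices and their benchmarks.

```python
# Minimal feature-store sketch (illustrative only, not DoorDash's implementation):
# one Redis hash per entity, hashed field names, binary serialization, and
# compression for large payloads.
import hashlib
import pickle
import zlib

import redis  # assumes the redis-py client is installed and a Redis server is reachable

COMPRESSION_THRESHOLD = 64  # bytes; arbitrary cutoff below which compression isn't worth it


def hash_feature_name(name: str) -> str:
    """Shorten long feature names to a fixed-size field key.
    A production system would likely prefer a faster non-cryptographic hash;
    SHA-1 is used here only because it ships with the standard library."""
    return hashlib.sha1(name.encode("utf-8")).hexdigest()[:8]


def serialize(value) -> bytes:
    """Binary-serialize a feature value; compress it only if it is large.
    pickle stands in for a compact schema-based format a real system might use."""
    payload = pickle.dumps(value, protocol=pickle.HIGHEST_PROTOCOL)
    if len(payload) > COMPRESSION_THRESHOLD:
        return b"z" + zlib.compress(payload)  # "z" prefix marks compressed payloads
    return b"r" + payload  # "r" prefix marks raw payloads


def deserialize(payload: bytes):
    if payload[:1] == b"z":
        return pickle.loads(zlib.decompress(payload[1:]))
    return pickle.loads(payload[1:])


def write_feature(r: redis.Redis, entity_id: str, feature: str, value) -> None:
    r.hset(f"fs:{entity_id}", hash_feature_name(feature), serialize(value))


def read_feature(r: redis.Redis, entity_id: str, feature: str):
    payload = r.hget(f"fs:{entity_id}", hash_feature_name(feature))
    return deserialize(payload) if payload is not None else None


if __name__ == "__main__":
    client = redis.Redis()  # assumes a local Redis instance on the default port
    write_feature(client, "store_123", "avg_order_value_30d", 27.4)
    print(read_feature(client, "store_123", "avg_order_value_30d"))
```

Storing each entity's features in a single Redis hash keeps per-entity reads to one round trip, and hashing feature names trades human-readable keys for a smaller memory footprint, which is the kind of tradeoff the post explores at scale.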

Improving Online Experiment Capacity by 4X with Parallelization and Increased Sensitivity

Data-driven companies measure real customer reactions to determine the efficacy of new product features, but the inability to run these experiments simultaneously and on mutually exclusive groups significantly slows down development.
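One common way to run experiments simultaneously on mutually exclusive groups is deterministic hash-based bucketing. The sketch below is a generic illustration of that idea, not a description of DoorDash's experimentation platform; the bucket count, slice layout, and experiment names are all hypothetical.

```python
# Generic sketch of mutually exclusive experiment assignment via deterministic
# hashing (illustrative only): each concurrent experiment owns a disjoint range
# of buckets, so no user is ever exposed to two experiments at once.
import hashlib
from typing import Optional

NUM_BUCKETS = 1000  # hypothetical resolution of the user-bucketing space

# Hypothetical layout of non-overlapping bucket ranges per experiment.
EXPERIMENT_SLICES = {
    "new_ranking_model": range(0, 250),
    "checkout_redesign": range(250, 500),
    # buckets 500-999 reserved as holdout / capacity for future experiments
}


def bucket_for(user_id: str) -> int:
    """Deterministic assignment: the same user always lands in the same bucket."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS


def experiment_for(user_id: str) -> Optional[str]:
    """Return the single experiment (if any) this user is eligible for."""
    bucket = bucket_for(user_id)
    for name, buckets in EXPERIMENT_SLICES.items():
        if bucket in buckets:
            return name
    return None


if __name__ == "__main__":
    # The same user id always maps to the same group, keeping cohorts stable
    # for the lifetime of an experiment.
    print(experiment_for("consumer_42"))
```

Because assignment is a pure function of the user id, groups stay stable across sessions without any shared state, and adding a new experiment only requires carving out an unused bucket range.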