When A/B testing is not enough, use Contextual Bandits.

Online advertising is changing, and machine learning is now everywhere. Learn how to automatically adapt your creative to context and outcomes, without AB-sitting your campaigns.


Why Contextual Bandits?

Brands like Netflix, Booking, Hubspot and Expedia have all discovered the potential of contextual bandits for growing their business.
Bandit algorithms have shown a 60% reduction in cost compared to traditional A/B testing.

Adaptable Splits

Automatically learn the optimal traffic split to maximize revenue while keeping an eye out for new opportunities.

Personalized Content

Contextual bandits take user features into account (e.g. geography, time of day, history) and learn the best match between user and content.
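To make this concrete, here is a minimal sketch of one popular contextual bandit, disjoint LinUCB, which keeps a small linear model of reward per creative and shows each user the arm with the highest upper confidence bound for their context. The arm count and feature names in the usage lines are illustrative assumptions, not our production setup.

    import numpy as np

    class LinUCB:
        """Disjoint LinUCB: one linear reward model per creative (arm)."""

        def __init__(self, n_arms, n_features, alpha=1.0):
            self.alpha = alpha                                      # exploration strength
            self.A = [np.eye(n_features) for _ in range(n_arms)]    # per-arm covariance
            self.b = [np.zeros(n_features) for _ in range(n_arms)]  # per-arm reward sums

        def choose(self, x):
            """Return the arm with the highest upper confidence bound for context x."""
            scores = []
            for A, b in zip(self.A, self.b):
                A_inv = np.linalg.inv(A)
                theta = A_inv @ b                                   # estimated weights
                scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
            return int(np.argmax(scores))

        def update(self, arm, x, reward):
            """Fold the observed outcome (e.g. click = 1, no click = 0) back in."""
            self.A[arm] += np.outer(x, x)
            self.b[arm] += reward * x

    # Illustrative usage: 3 creatives, context = [is_mobile, hour_of_day / 24, is_returning]
    bandit = LinUCB(n_arms=3, n_features=3)
    context = np.array([1.0, 14 / 24, 0.0])
    arm = bandit.choose(context)
    bandit.update(arm, context, reward=1)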

No AB-sitting

The process is completely automated; there is no need to babysit (AB-sit) your experiments until convergence.

How we can help you grow your business

We specialize in implementing bandit algorithms in production, using state-of-the-art machine learning and AI.

Optimize for your objective

We have extensive experience customizing bandit algorithms to your specific objective.

Control your costs

We help you converge faster on the optimal variant and significantly reduce your advertising budget.

Improve customer experience

We help you dynamically target your audience using advanced machine learning and feature engineering.


Register for the Webinar and learn

We've gathered use-cases from our clients, and we'll demonstrate the effect bandits had on their bottom line.

  • We'll cover A/B testing and discuss when it's best suited.
  • We'll talk about recommendation systems and use-cases with a large number of variants.
  • We'll learn about multi-armed bandits and speed of convergence.
  • We'll see how to use the user funnel as context for bandits.
Register for the Webinar

Frequently Asked Questions

We've received a lot of interest in recent years; here are some of the most common questions.

I've never AB-tested my site. Is using contextual bandits premature for me?

Contextual bandits require a more advanced tech stack than A/B testing, so we recommend that clients start their journey by exploring the potential of traditional A/B testing first.

I do multivariate testing. Is it the same as bandits?

MV-testing is essentially a split across more than two variants (A-B-C-D testing). It does not account for similarity between variants (e.g. both A and C have the same background), and therefore convergence is slower.

Can bandits be used as a recommendation engine?

Recommendation is tricky. On the one hand, you'd like to suggest the items that sell well. On the other hand, you don't want to miss new opportunities and trends. Bandits offer a great trade-off between exploration and exploitation.
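As a rough illustration of that trade-off, here is an epsilon-greedy sketch: most of the time it recommends the item with the best observed click-through rate, and a small fraction of the time it explores other items so newcomers get a chance. The item names and numbers are made up for illustration.

    import random

    # Illustrative catalogue with observed clicks and views (numbers are made up)
    items = {
        "bestseller-mug": {"clicks": 120, "views": 1000},
        "classic-tee":    {"clicks": 80,  "views": 900},
        "new-tote-bag":   {"clicks": 2,   "views": 10},   # new item, barely any data yet
    }

    EPSILON = 0.1  # share of traffic reserved for exploration

    def recommend():
        """Epsilon-greedy: usually exploit the current best seller, occasionally explore."""
        if random.random() < EPSILON:
            return random.choice(list(items))  # explore: show a random item
        return max(items, key=lambda name: items[name]["clicks"] / items[name]["views"])

    def record_outcome(name, clicked):
        """Feed the observed outcome back so the estimates keep improving."""
        items[name]["views"] += 1
        items[name]["clicks"] += clicked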

How do I do dynamic personalization?

Assume you have a product page and you need to control the position of elements, the key figure, and the background. Contextual bandits test combinations and dynamically find the best one for your audience.
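One simple way to picture it: treat every combination of page elements as an arm and let a bandit allocate traffic between combinations. The sketch below uses Thompson sampling over made-up element options; real deployments would typically also share information between similar combinations via context features.

    import random
    from itertools import product

    # Illustrative element options (assumptions, not a real client page)
    layouts     = ["hero-left", "hero-right"]
    key_figures = ["price", "rating"]
    backgrounds = ["light", "dark"]

    # Every combination of elements becomes one arm of the bandit (2 * 2 * 2 = 8 arms)
    arms  = list(product(layouts, key_figures, backgrounds))
    stats = {arm: {"alpha": 1, "beta": 1} for arm in arms}  # Beta(1, 1) priors

    def choose_combination():
        """Thompson sampling: draw a plausible conversion rate per combination, pick the max."""
        draws = {arm: random.betavariate(s["alpha"], s["beta"]) for arm, s in stats.items()}
        return max(draws, key=draws.get)

    def record_conversion(arm, converted):
        """Update the chosen combination with the observed outcome (1 = converted)."""
        stats[arm]["alpha"] += converted
        stats[arm]["beta"]  += 1 - converted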

Our Podcast

Latest news, tips and best practices.

A/B Testing and AB-sitting

July 1st, 2021

In this episode we introduce the basic concepts of A/B testing and the assumptions it makes. We discuss the monitoring phase (aka "AB-sitting") and the role of the account manager.

How long should you run an experiment?

July 14th, 2021

We all want quick results, but stopping an experiment too early can lead to suboptimal decisions. In this episode we'll discuss statistical significance and stopping criteria.

Multivariate testing and Factorial design

July 28th, 2021

Have more than 2 alternatives to test? Welcome to the realm of multivariate testing. Are some of them related to one another? Maybe you should consider a factorial design.

Got Questions?

Drop us a line