Marketer's Dilemma: A/B Testing vs. Gut Feel – Can We Just Flip a Coin?
When it just doesn't make sense to experiment
Jeff Bezos once said, "All of my best decisions in business and in life have been made with heart, intuition, guts... not analysis." Coming from the leader of Amazon—a company renowned for its data-driven approach and relentless experimentation—this might sound a bit counterintuitive. It’s well established that Amazon's success is linked to the sheer number of experiments they run daily.
So, what's the right path for the rest of us?
Conventional marketing wisdom tells us that testing, and the continuous, incremental improvement that results from it, is the holy grail: the difference between achieving 10X outcomes and settling for mediocrity in the long run. While that generally holds true, it is not a dictum; your situation and unique set of challenges defy the generalities. Sometimes testing doesn't accelerate outcomes; in fact, it can get in the way.
Harmony Between Analysis and Intuition
Analysis and intuition aren't enemies; they can actually be the best of friends if we let them. Testing helps us optimize existing paths or explore new ones, but occasionally, the best way forward isn't through experimentation alone. Allowing room for heart and intuition can lead to breakthroughs that data might not immediately reveal.
When Testing May Not Be the Best Approach
Time-Sensitive Decisions
In fast-moving markets or during critical periods (think holiday seasons or crisis events), you risk missing the opportunity while waiting for A/B test results. Relying on experience and best practices becomes essential to making timely decisions.
Opportunity Cost Outweighs Benefits, a.k.a. The Juice Isn't Worth the Squeeze
Running A/B tests requires time, effort, and often specialized tools. For smaller teams or companies with limited resources, constant experimentation on low-impact or cosmetic elements (like testing 14 shades of a blue button) probably isn't worth it, since it rarely moves the needle in the long term. Time spent on trivial tests is time not spent on potentially high-impact activities.
Limited User Base or Low Traffic
For early-stage startups or niche products with a small audience, achieving statistical significance with A/B tests could take forever and is simply not practical. In these cases, qualitative feedback from user interviews or industry benchmarks can be more valuable. The road to product-market fit often runs through super users who use, love, and refer your product. Surveys and interviews should be the preferred techniques to identify what to prioritize.
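To make "could take forever" concrete, here is a rough back-of-envelope sketch using the standard two-proportion sample-size approximation. The baseline conversion rate, target lift, and weekly traffic below are hypothetical numbers chosen only for illustration.

```python
import math

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant for a two-proportion test.
    Defaults correspond to 95% confidence (two-sided) and 80% power."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical scenario: 3% baseline conversion, hoping to detect a
# 10% relative lift (3.0% -> 3.3%), with 2,000 visitors per week.
n = sample_size_per_variant(0.03, 0.033)
weeks = 2 * n / 2000  # both variants split the weekly traffic
print(f"~{n:,} visitors per variant, ~{weeks:.0f} weeks to reach significance")
```

Under these assumptions the test needs on the order of 50,000+ visitors per variant, which works out to roughly a year of traffic. That is exactly the regime where interviews, surveys, and benchmarks beat experimentation.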
Unimplementable Changes
Testing ideas that can't be realistically implemented due to technical constraints, budget limitations, or resource shortages is not practical. Even if the test shows positive results, inability to execute the changes renders the effort futile. Prioritize tests that lead to feasible and sustainable solutions.
Major Overhauls
When planning significant redesigns or launching major features, A/B testing might not cut it. Usability testing, focus groups, or beta launches can provide deeper insights.
Minimum Viable Product (MVP) Launches
When introducing a new product or feature to gather initial user feedback, you only know so much. Start from the best possible starting point based on the existing body of knowledge. Then, based on outcomes, develop hypotheses and run tests. You can't steer a parked car; get it moving and adjust course as you go.
Areas Already Performing Well - Avoid Diminishing Returns
Investing time in testing aspects of your marketing that are already yielding strong results will lead to diminishing returns. Unless there's evidence suggesting significant room for improvement, resources are generally better spent on underperforming areas.
Elements Without a Clear Hypothesis
Testing without a strategic rationale or expected outcome is like shooting arrows in the dark. It rarely hits the target and isn't an effective use of resources.
Regulatory Constraints
In industries like financial services, fair lending laws that prohibit differential treatment based on certain attributes limit the scope of experimentation around personalization and offers.
High Impact Testing Elements
While this article is primarily focused on what not to test, below are a few elements where testing tends to deliver the highest impact.
Landing Page Optimization
My personal favorite and a real game-changer. Companies see significant conversion rate improvements after optimizing landing pages. According to HubSpot, businesses have achieved up to a 55% increase in conversions through effective landing page optimization. This is the antidote to ever-increasing costs on ad platforms. You might not control competitor spend or eyeballs on those platforms, but you own your landing pages!
In one of my previous roles, focusing on landing page optimization led to a 60% improvement in customer acquisition cost (CAC) over 12 months. We transitioned from an unsustainable ROI to a highly desirable LTV/CAC ratio.
Call-to-Action (CTA) Optimization:
Optimizing CTAs can increase conversion rates by up to 200% (Source: WordStream).
Value Proposition Clarity:
Clarifying your value proposition can boost conversion rates dramatically. Marketing Experiments found that clear value propositions can increase conversions by 30% to 90%. The goal is to enhance both differentiation and resonance—one without the other won't suffice. Craft concise, customer-centric messaging that highlights who it's for and addresses key pain points. Test different headlines and subheadings to see what strikes a chord.
Example: Grammarly's "Great writing, simplified." speaks directly to writers, promising simplicity, and the brand further differentiates itself by emphasizing responsible AI capabilities.
User Onboarding Experience:
Experimenting with email subject lines, content personalization, interactive tutorials, and creating feedback loops for learning can significantly impact user retention. User onboarding is strongly correlated with churn and retention metrics. Incremental improvements in the onboarding experience can improve user retention rates by up to 30% at the end of the first year. That's the difference between a product thriving with a loyal user base and one struggling to keep the lights on.
Social Proof and Trust Signals:
A whopping 92% of consumers trust recommendations from friends and family over other forms of advertising. Incorporating social proof and trust signals can increase conversion rates by up to 34%, according to VWO case studies. It doesn’t get any better than customers doing the marketing for you.
Simplified Checkout Process:
The average cart abandonment rate hovers around 70% (Baymard Institute), with 28% of shoppers abandoning carts due to a lengthy or complicated checkout process. By focusing on form optimization and introducing a guest checkout option, you can recover a substantial portion of these lost sales. Only ask for essential information.
From Data-Driven to Data-Guided: A Personal Tale
When I joined First Republic Bank (now part of JPMorgan Chase) heading Growth Marketing, I was astonished to learn that the bank had sustained a 20% CAGR for over 35 years without a marketing experimentation roadmap or extensive data-driven strategy. Their secret sauce? Following the north star of providing extraordinary client and prospect experiences. Everything else took care of itself.
There was no growth or acquisition marketing engine until some of us were brought on board. The bank grew because customers loved it and referred it to their family and friends, a level of organic virality unheard of in the banking industry. Sometimes, in order to learn, you need to unlearn, and I had to unlearn a lot! My mental model shifted from being data-driven to data-guided. A minor tweak, but a major philosophical change. Being data-guided makes room for decisions driven by intuition and heart, leading to discoveries and serendipities you might never have imagined.
Conclusion
While A/B testing remains a powerful tool in the growth marketer's arsenal, it's not always the best solution for every situation. By understanding when to experiment and when to rely on other methodologies—or even just your gut—marketers can make more informed decisions and drive meaningful results. The key is to approach growth marketing with a flexible mindset, ready to employ the most appropriate tools and strategies for each unique challenge.
In the end, the question isn't simply "to experiment or not to experiment," but rather, "How can we best drive growth given our specific context and constraints?"