If you want to make an impact with CRO… you need to stop optimizing and start experimenting.
In this interview, Ben Labay, CEO of Speero — an agency that helps marketing and product teams build and scale CRO — shares everything he knows about CRO. He knows a lot.
He and Alina, Co-CEO and Co-Founder of Chili Piper, discuss it all in the conversation below.
Alina
Alina Vandenberghe, Co-CEO and Co-Founder of Chili Piper. I have Ben here with me from Speero, and we're gonna talk about CRO and the importance of devising experiments that are impactful.
What mistakes do you think companies make when they think about their optimization efforts?
Ben
I think that resourcing is the big area of mistakes. Right now it's quite easy to pull back and retreat, especially on the marketing side, and not be aggressive because of that uncertainty.
That's something that's hard to give a solution for, because you have to know your own burn rate and your own numbers and your own risk tolerance. So it is an opportunity to recalibrate and be a little bit more efficient. But I think the biggest mistakes are around resourcing marketing, in particular non-CRO programs.
If you can pull back on optimization programs and go more toward experimentation programs, that's what you typically want to do. So you stop tweaking and start radically introducing new concepts, like radical new landing pages or offers or small packages that kind of bridge the gap of your burn rate.
New webinars, workshops, co-marketing, those kinds of things you might experiment with, right? So instead of optimization, I think right now is the time to do more experimentation. The difference between them is tweaking stuff versus trying a bunch of stuff and seeing what works.
There's a dividing line there. So the mistake is to keep on resourcing the tweaks. You've gotta go bigger and bolder right now. In our CRO activities, we always put the highest-risk but highest-reward experiments at the top.
Alina
Actually, for me, CRO is something I didn't know existed until a few years ago.
And it's so exciting for me to start seeing results right away, because I come from product. And in product it takes months before you can see results, because you have to build something and you have to get people to use it. Whereas in CRO experiments you get to see results right away. What are some of the things that people can start doing immediately, with low budgets, even if they have a low risk tolerance, that they may not be thinking about right now?
Ben
It's really easy to radically change a campaign landing page. You have a channel that's coming into a landing page. So, for example, we're at this conference, CXL Live, and there are a lot of talks on messaging and positioning. Try some radically different approaches. And I don't mean just a different H1.
Try something incredibly simplified, so the page only exists above the fold. That's it. You're trying to make that user go in versus down, right? This is super simple and very radical. And even for B2B audiences you'll learn something; you'll measure a change in behavior.
So it's low risk, because you'll get information to pivot on; it's part of research, in a way. It's high reward, because you're gonna learn something. Those are the first steps to shake up the boat, so to speak.
I'm a big fan of intentional intent capture, right? Getting people to describe themselves. In a way, the navigation on a website is a very passive way to collect intent information and get the user to self-select. They're like, "Okay, let me drop down this thing and navigate there." Very passive. You're relying on the action of the user.
But if you put it right in front of them: Who are you? What are you interested in? Are you X, Y, or Z? That's a way to be more active in characterizing and grabbing that user intention, right? Progressive profiling, intent profiling, intent capture; there's a bunch of names for it. Instead of those passive approaches, you're getting them to go in right there.
It's almost like an app, like turning your website into an app. But you can do it low budget: a really simple landing page. You just cut it down and focus on a couple of really clear, crisp actions.
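To picture what "turning your website into an app" might look like in code, here's a minimal TypeScript sketch of active intent capture: one above-the-fold question whose answer is stored, sent to analytics, and used to route the visitor. The segment labels, event name, and `/api/events` endpoint are invented for illustration, not anything Speero or Chili Piper prescribes.

```typescript
// Hypothetical segments a visitor can self-select into.
type Segment = "marketer" | "product-manager" | "founder";

// Record the visitor's declared intent, then route them "in" (to a
// tailored page) instead of making them navigate "down" the page.
function captureIntent(segment: Segment): void {
  // Persist the declared intent so later pages can reuse it.
  localStorage.setItem("declared_intent", segment);

  // Send the same signal to an analytics endpoint (invented for this sketch).
  void fetch("/api/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event: "intent_selected", segment }),
  });

  // Route straight to a segment-specific page.
  window.location.assign(`/solutions/${segment}`);
}

// Wire each "Are you X, Y, or Z?" button to the handler.
document.querySelectorAll<HTMLButtonElement>("[data-segment]").forEach((btn) =>
  btn.addEventListener("click", () => captureIntent(btn.dataset.segment as Segment))
);
```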
Alina
What about tools? What tools do you typically use, with your clients and yourself, in your experimentation?
Ben
We kind of work with the tools that the clients have, so we're tool agnostic.
We don't have explicit partnerships or strategic partnerships with any particular tools. Optimizely is a very common one, but also Kameleoon, AB Tasty, SiteSpect, and some others. Those are the traditional ones. And we work with enterprises; they'll have tools like this.
But we're also working with ones like Interaction Studio, which is the Salesforce version of a CDP type of tool.
There's the hybrid new breed of tools like Intellimize and Evolv and Mutiny, which do a lot of personalization and adaptive-learning kinds of testing. So we're using those.
Those are a family of testing tools on their own.
And there are tools like Conductrics out there that kind of do a blend of everything. Kameleoon as well; they blend that kind of stuff.
But we're using other tools like Jebbit, for example, for building out interactive quizzes.
And Chili Piper: you're allowing off-ramps between sales-led and product-led, so you're introducing things like call-scheduling features. We play around with that quite a bit.
Alina
In our client base as well, there are a million tools with disparate data points that have to work well together for companies to operate at scale.
There's Salesforce, there's your CSM tool, then you have your marketing tool, and then you might have something separate for your account managers and something separate for your SDRs. That's where RevOps comes into play: to make sure that these tools work well together and that the data flows nicely from one tool to another.
It doesn't always happen. Do you get involved a lot with RevOps teams in your activities?
Ben
I haven't called them RevOps teams, but we do get involved with the analytics teams and the operations teams and the MarTech teams, and I guess you could call them RevOps teams. Yes, we do a lot of that type of work.
And probably not enough. I told you about the action sourcing and the data pipelines and things like that. I think we're constantly hitting our heads against the table with regard to this problem: connecting to the right team to make change, and normalizing the language. Like, these are the audiences, these are our ICPs, these are the buyer personas within those ICPs, and this is the event library of the tags and actions we care about related to those ICPs, et cetera.
Having a data dictionary for that, and having everyone be on the same page, is incredibly difficult. So we do a lot of work in creating that standardization. And then we'll help teams with data warehousing; we do a lot of work in Snowflake, BigQuery, etc.
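To make that data dictionary idea concrete, here's a minimal TypeScript sketch of a shared event library tying canonical event names to owners and ICPs. Every name in it (the events, owners, and ICP labels) is an invented placeholder, not Speero's actual taxonomy.

```typescript
// One entry in a shared data dictionary: a canonical event definition
// that every tool (CRM, marketing automation, warehouse) agrees on.
interface EventDefinition {
  name: string;          // canonical event name used across all tools
  description: string;   // what the event means, in plain language
  owner: string;         // team responsible for keeping it accurate
  relatedIcps: string[]; // which ICPs this event matters for
}

// Illustrative entries only; real libraries grow to hundreds of events.
const EVENT_LIBRARY: EventDefinition[] = [
  {
    name: "demo_requested",
    description: "Visitor submitted the demo request form",
    owner: "marketing-ops",
    relatedIcps: ["mid-market-saas"],
  },
  {
    name: "pricing_viewed",
    description: "Visitor loaded the pricing page",
    owner: "growth",
    relatedIcps: ["mid-market-saas", "enterprise"],
  },
];

// Guard against undefined events sneaking into pipelines: tags and
// tools should emit only names that exist in the library.
function isKnownEvent(name: string): boolean {
  return EVENT_LIBRARY.some((e) => e.name === name);
}
```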
Alina
It's an interesting problem to solve because it's cross-department.
And the RevOps mandate is to consolidate those definitions and consolidate the data flows. But it's not always a department that exists. And it's always hard, as a data team, to put things in place without those pipelines being built alongside you, with the exact same definition of each metric across those systems.
And it's interesting to see the intersection of CRO and RevOps and analytics all trying to solve the same challenges, to get insights into what can be done.
How do you prioritize all these possible things? What framework do you typically use to prioritize experiments?
Ben
We've got a couple of different prioritization tools. The main one, in terms of what to do, is what we call the PXL. It's a kind of meta-prioritization process. It's an ICE model on steroids, an ICE model being impact, confidence, ease. Three pillars. We separate those out into binary metrics that are relevant for testing.
Is it above the fold? Is it noticeable in five seconds? Do we have past experience, data, and research that support this, so we have more confidence in it? Is it easy to implement?
So everything that we put forward and propose to test, there are mechanisms to automatically score it, whoever is proposing it.
It then ranks out, and you have a ranked priority of what to play with and how to execute within your punch list.
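To show the scoring mechanics, here's a rough TypeScript sketch of PXL-style ranking that uses only the four binary questions Ben mentions above. Speero's actual PXL spreadsheet has more criteria and its own weighting, so treat this as an illustration of the idea, not the real template.

```typescript
// A proposed test, scored on the binary questions from the conversation.
interface TestIdea {
  name: string;
  aboveTheFold: boolean;     // impact: is the change above the fold?
  noticeableIn5s: boolean;   // impact: would a user notice it in 5 seconds?
  backedByResearch: boolean; // confidence: supported by past data/research?
  easyToImplement: boolean;  // ease: cheap to build and ship?
}

// Each "yes" adds a point, so scoring is automatic and identical
// regardless of who proposes the idea.
function pxlScore(idea: TestIdea): number {
  return [
    idea.aboveTheFold,
    idea.noticeableIn5s,
    idea.backedByResearch,
    idea.easyToImplement,
  ].filter(Boolean).length;
}

// Rank the backlog: the highest-scoring ideas become the punch list.
function rankBacklog(ideas: TestIdea[]): TestIdea[] {
  return [...ideas].sort((a, b) => pxlScore(b) - pxlScore(a));
}
```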
Alina
I'm happy that we got to discuss CRO a little bit. I think we could be here until the morning, because I'm very passionate about this subject, and I take it you are as well.
Any parting words for companies that haven't started a CRO journey yet and are considering it? What are some of the basic things they can put together right away to just get started?
Ben
So I think CRO is the blend of behavioral psychology and economics.
And what we're doing, ultimately, is creating data and measurement systems to make better decisions. It's important to understand your own biases and things like that. But the baby step for starting to understand it is standing up a tool, standing up an experiment, collecting the data, and seeing what happens.
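As one way to "see what happens" once the data comes in, here's a small TypeScript sketch of a standard two-proportion z-test comparing a variant's conversion rate against control. This is generic statistics rather than Speero's specific methodology, and the traffic numbers in the example are made up.

```typescript
// Two-proportion z-test: is the variant's conversion rate different
// from control's by more than chance would explain?
function zTest(
  convControl: number, visitorsControl: number,
  convVariant: number, visitorsVariant: number,
): number {
  const pC = convControl / visitorsControl;
  const pV = convVariant / visitorsVariant;
  // Pooled conversion rate under the null hypothesis of "no difference".
  const pooled = (convControl + convVariant) / (visitorsControl + visitorsVariant);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / visitorsControl + 1 / visitorsVariant),
  );
  return (pV - pC) / se; // |z| > 1.96 is roughly significant at the 95% level
}

// Made-up example: control converts 120/3000, variant 160/3000.
const z = zTest(120, 3000, 160, 3000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "keep collecting data");
```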
And then I think it's getting you and your team to take bets on what works, and then watching yourself be wrong. That's the first baby step: getting you and your team to be okay with being wrong and watching yourself be wrong. "Oh man, I thought it was this, and it did not work out."
Or, "This was a waste of time." Testing can get expensive. And so it's hard, and it's not natural, especially for Series A, Series B startups, to do a lot of testing; they just need to ship, right? There's power in the law of large numbers; there's safety in just shipping.
But eventually you need your operating system to start to measure, right? If you're not measuring, you can't improve. So CRO testing is a way to measure your work in order to improve your work. That's where the transition happens.
That's why we like to work with Series C companies, young growth teams. That's when they start to need to be a little bit more rigorous about measurement, making sure that multiple, siloed teams are all on the same page, pointed in the right direction. I don't know, the baby step is just to stand up some small stuff and watch yourself be wrong.
I don't know if that was a little bit of a rambly answer to your question.
Alina
No, it's a very important insight. You have to be okay with just being curious and not point fingers at the initial failures, because then you blame yourself or others and you can't move forward.
So a curiosity mindset is key. Thank you for being here today, and thank you for coming to share some of your wisdom with us.
Ben
Yeah, my pleasure. And happy to talk anytime.