Learning from the Winners with DoWhatWorks' Andres Glusman
Imagine standing in a gym, facing an overwhelming array of free weights, machines, and fitness programs. Choosing where to start is challenging, and without proper guidance, you may squander your resources on ineffective methods, yielding little progress. Now picture an experienced personal trainer, like Andres Glusman, who directs you towards exercises that are proven to work, align with your personal goals, and minimize risks. This scenario finds its echo in the B2B SaaS landscape, a complex gym of infinite growth strategies where identifying productive strategies from unproductive ones demands the expertise of a growth experimentation maestro.
Andres Glusman, the Co-Founder and CEO of DoWhatWorks, is this expert trainer in our metaphorical gym. With real-world experience, including being part of the team that launched Meetup.com, Andres possesses a unique perspective on steering SaaS businesses towards effective strategies and away from potential pitfalls. In today's episode, we're joining him on the gym floor to delve into the world of growth experimentation and uncover how smart growth experimentation can revolutionize your SaaS business. So, gear up and prepare to unravel the secrets of effective growth experimentation.
High Level Overview:
- Embrace Experimentation: Andres Glusman champions the idea of running growth experiments within your organization. The principle is simple, yet profound - test a variety of strategies, learn from the outcomes, and optimize based on what works.
- Learn From Successes: Drawing on his experience with Meetup.com, Andres teaches us to focus on techniques that have already proven successful. It's the equivalent of not wasting time on ineffective workout techniques in a gym.
- Avoid Past Failures: Andres emphasizes the importance of avoiding strategies that have already proven to be ineffective. His philosophy cuts down on wasted time and resources, akin to not spending effort on futile exercise routines.
- Leverage Real-World Experience: With his involvement in the launch of Meetup.com, Andres stands out as a seasoned 'trainer'. He offers invaluable, practical lessons gleaned from hands-on experience rather than theory.
- Guided Learning is Key: Andres underscores the importance of guided learning in growth experimentation. Much like having a personal trainer in a gym, having someone who knows the ropes can help avoid missteps and accelerate progress in the complex landscape of SaaS growth strategies.
Learning from Successful Experiments
In the competitive world of SaaS, the difference between success and stagnation often lies in one's ability to conduct and learn from successful growth experiments. As any seasoned gym-goer can tell you, finding the most effective exercise regimen takes time, patience, and a lot of trial and error. This approach is just as applicable when it comes to orchestrating growth in SaaS businesses. The most successful companies aren't those who shy away from experimentation, but those who learn from their winners, optimize, and then iterate.
- Trial and Embrace Change: Be prepared to experiment with different techniques, products, or strategies. Embrace change and understand that some experiments will work better than others.
- Data-Driven Decisions: Lean heavily on data to determine your 'winning' experiments. This ensures that your strategy is rooted in fact, not guesswork or bias.
- Continuous Learning: Even a successful experiment can be improved. Never stop learning, iterating, and experimenting. There's always room for improvement.
- Focus on the Customer: All experiments should ultimately improve the customer experience. If they're not, reassess your strategy and pivot accordingly.
- Organizational Buy-In: Everyone within the organization should be aligned on the value of experimentation and be prepared to learn from the results, whether they're positive or negative.
In conclusion, learning from successful growth experiments isn't just about finding what works and sticking to it. It's about understanding why it works, and how you can apply those learnings to other areas of your business. It's about taking calculated risks, analyzing your results, and using this data to guide your future strategies. It's a continuous journey, not a destination. With this mindset, your SaaS company will be well-positioned to stay ahead of the competition and sustain long-term growth.
Further Learnings
Follow Andres on LinkedIn.
Do us a favor?
Part of the way we measure success is by seeing if our content is shareable. If you got value from this episode and write-up, we'd appreciate a share on Twitter or LinkedIn.
00:00:01:14 - 00:00:26:11
Ben Hillman
Picture yourself in a gym eyeing an array of free weights, machines and numerous fitness programs. There's a myriad of pathways to strengthen fitness, but knowing where to start can be overwhelming. Without appropriate guidance, you may end up spending precious time and energy on ineffective techniques yielding little to no progress. Now envision having a seasoned personal trainer guiding you through the process.
00:00:26:14 - 00:00:52:08
Ben Hillman
A trainer who can discern which exercises have proven effective, align with your goals, and minimize the risk of injury. The magic here lies in leveraging hard-earned wisdom to bypass pitfalls and accelerate progress. In the B2B SaaS universe, this gym metaphor finds perfect resonance. The landscape resembles our hypothetical gym, with an infinite selection of growth strategies, or workout regimens.
00:00:52:12 - 00:01:19:00
Ben Hillman
Just as some workouts lead to a dead end of fatigue without gains, some strategies are unproductive and draining. The trick is to discern the effective strategies from the futile, a task that requires the guidance of a seasoned trainer, or in our case, a growth experimentation guru. Enter Andres Glusman, the co-founder and CEO of DoWhatWorks. Andres is the veteran personal trainer of our metaphorical gym,
00:01:19:06 - 00:01:44:04
Ben Hillman
deftly guiding companies through the maze of potential growth strategies. His background isn't theoretical; it's grounded in real-world experience. As part of the team that launched Meetup.com, Andres had an inside view into what works and what doesn't when it comes to growing a business. His firsthand experience gives him a unique ability to steer SaaS businesses away from ineffective strategies and toward the techniques that truly work.
00:01:45:19 - 00:02:11:06
Ben Hillman
In today's episode, we're lacing up our sneakers and hitting the gym floor with Andres. We're going to delve into his world of growth experimentation, unpacking how he applies the principles he learned at Meetup.com and unveiling the critical role that smart growth experimentation can play in the transformation of your SaaS business. So grab your water bottle, crank up your workout playlist, and get ready to explore the secrets of effective growth experimentation.
00:02:12:07 - 00:02:35:13
Ben Hillman
From Paddle, it's Protect the Hustle, where we explore the truth behind the strategy and tactics of B2B SaaS growth to make you an outstanding operator. I'm Ben Hillman, and on today's episode, Andres Glusman speaks with Andrew Davies about growth experimentation. They talk about embracing experimentation, learning from successes, avoiding past failures, leveraging real-world experience, and the key to guided learning.
00:02:35:17 - 00:02:45:10
Ben Hillman
After you finish listening, check out the show notes for a field guide from today's episode. Then, while you're leaving your five-star review of this podcast, tell us what resonated most about our guest's advice.
00:02:52:21 - 00:03:00:19
Andrew Davies
Andres, why don't you give me a little bit of an overview of your storied history before we dive into some of your experience around experimentation?
00:03:00:20 - 00:03:23:16
Andres Glusman
Yeah, a storied history. I am a behavioral economist by training. I've been involved in the Internet since the early days, dating back to 1998, and I ran some of the first online experiments back then. But really, prior to launching my current company, I spent 15 years helping get Meetup.com off the ground. I helped launch Meetup, made their first $14 of revenue, and led product and growth there.
00:03:23:16 - 00:03:38:10
Andres Glusman
And I just had a wild, really interesting experience there over the course of a decade and a half; we became early pioneers in the startup movement. It was my experiences there that motivated me to start my brand-new thing, which at this point is three years old, but it still feels like a new thing.
00:03:38:10 - 00:03:39:13
Andres Glusman
It's an overnight success story.
00:03:39:14 - 00:03:55:11
Andrew Davies
Maybe let's go back to some of that Meetup journey, since you mentioned 15 years at Meetup.com. That's an iconic brand in the space. And as you said, you made the first $14 of revenue. So maybe just give us a bit of a picture of what that company looked like as you were joining it.
00:03:55:11 - 00:03:58:09
Andrew Davies
And give us a few of the highlights of that journey before we dive in.
00:03:58:10 - 00:04:14:18
Andres Glusman
So it was one of those things where the CEO and co-founder of Meetup, a guy named Scott Heiferman, had led the company I worked at very early in my career. And so when he was starting Meetup, he said, Hey, I'm starting this brand-new thing. Do you want to join us? I had just gotten into Wharton Business School and I said, You know what?
00:04:14:19 - 00:04:28:14
Andres Glusman
I need to go back to school. I need to get my head straight. I was just part of a startup that did terribly. I need to go learn some stuff. And he said, Well, you can still help us get it off the ground. I said, Sure. So I jumped in and helped prove out their business model in the very early days.
00:04:28:14 - 00:04:45:15
Andres Glusman
And at that point, Meetup looked nothing like what it looks like now or what it became. The idea was that we could make money by charging businesses for the right to host meetups in their locations. And this is 2002; there was no such thing as local Google ads or any of that stuff. None of it existed.
00:04:45:15 - 00:05:03:21
Andres Glusman
So it was kind of online local advertising. I said, great, I'll try and prove it out. What I did is I got a list of the businesses and restaurants that were going to host meetups, because at the time the system worked in a way that a computer automatically generated where meetups would happen, and it would send people, randomly by and large, to certain locations.
00:05:03:21 - 00:05:19:18
Andres Glusman
So I got a list of the places where meetups were going to happen, and this is from the first few hundred meetups that were occurring in the world at that point. And I basically called them up and said, Hey, we're sending 14 people your way next Tuesday. If you want us to keep sending people your way, you can pay us for that, and it'll cost you a dollar a person.
00:05:19:18 - 00:05:30:21
Andres Glusman
And the person said, Yeah, that sounds great. It was Ben's Chili Bowl in Washington, D.C., which, if you've ever been there, actually turns out to be a very iconic place. He said, Yeah, man, that's great. And I said, Oh my goodness, this is it. We've locked it in, and this is going to be the most amazing business you've ever seen.
00:05:30:22 - 00:05:44:19
Andres Glusman
I go to the next name on the list and give them the exact same pitch, and it's just a train wreck. Terrible call. Halfway through the call, he's like, No, this is not for me at all. And I said, okay, okay, I'm not selling to you anymore. I'm not going to take your money even if you try and give it to me.
00:05:44:19 - 00:06:00:02
Andres Glusman
But can I ask you, is there anything I said that was remotely interesting to you? And he said, Yeah, getting people in the door is actually quite cool. But I've got a line going out the door on a Friday night, and I don't need that. What I really want is people coming in on a Tuesday afternoon or a Wednesday morning, or whatever the case may be: my slow times.
00:06:00:02 - 00:06:14:07
Andres Glusman
And so over the course of 20 or 30 calls like this, where I basically kept repeating the cycle over and over again, running into the wall, stopping the pitch, asking for feedback, and refining, we were able to hone in on a model that actually worked, and we were able to get that to a sales team.
00:06:14:07 - 00:06:19:04
Andres Glusman
And that actually helped us get our first round of financing in 2002 when it was impossible to get funding.
00:06:19:05 - 00:06:22:10
Andrew Davies
So 2002 was the first money in, was it?
00:06:22:10 - 00:06:25:08
Andres Glusman
Yes. Yeah. From DFJ out on the West Coast. Yeah.
00:06:25:09 - 00:06:32:16
Andrew Davies
Now let's fast forward all the way through to the end of that 15 years. How different did the business model and the scale of the business look?
00:06:32:16 - 00:06:50:04
Andres Glusman
It became a really, really cool business. There were a lot of changes we made along the way to really capitalize on the network effects of the business. The biggest change was in fact moving to a model where there are organizers and members, which becomes a two-sided marketplace: more members attract more organizers, and more organizers attract more members.
00:06:50:04 - 00:07:09:02
Andres Glusman
And it became this really wonderful, dynamic flywheel. The biggest challenge we had at Meetup was basically: how do you take this organic traffic that's coming your way, remove the friction, and make the user experience as seamless and easy as possible, to help people get into meetups and to change behavior around the world?
00:07:09:02 - 00:07:28:10
Andres Glusman
And so what we really honed in on over the course of that period, once we had this flywheel traffic, was this experimentation engine: understanding how to use experiments at Meetup to help people get through an experience and accomplish the goal we wanted to help them accomplish. And we learned a whole lot of lessons along the way.
00:07:28:10 - 00:07:44:22
Andres Glusman
But Meetup grew to 40 million people. It had a successful exit to WeWork at the time, which fortunately for us was an all-cash deal, which is a very good thing for us. All of the things that happened after the fact did not affect us personally too terribly. And that allowed me the opportunity to take a year off and go play.
00:07:45:00 - 00:07:49:08
Andres Glusman
And so at that point, that's where I started sort of pioneering or playing with new ideas.
00:07:49:08 - 00:08:04:19
Andrew Davies
You used a phrase there that I'm sure has a lot more meaning and definition behind it. You said "experimentation engine" to describe what you built. So what is an experimentation engine? What did you build there that could be scalable, that could be repeatable? Talk to me a bit more about the definition that sits behind it.
00:08:04:20 - 00:08:19:07
Andres Glusman
It started like most things do; it wasn't as glamorous as that. It really started very early on with making changes on our website. This was when we were kind of learning how the Internet worked and how to affect behavior, so there really wasn't a playbook for how you run an experiment online.
00:08:19:07 - 00:08:39:22
Andres Glusman
What we noticed is that we would change things and nothing would happen. Over the course of six months, you'd look back and ask: what did we get from all of this motion? We didn't get anything out of it, no results, and we didn't even learn anything. At some point along the journey, we honed in on this idea: well, what if we ran something as a split test, as an experiment where we systematically vary the experience
00:08:39:22 - 00:08:58:07
Andres Glusman
for some people and not for others, and understand what happens? As a result of doing that, a couple of things happened. One is we got lucky on the very first experiment we ran, because we got a lift, and that was great. It's extremely addictive to say, Wow, I just changed something and it finally worked and made a big difference.
00:08:58:07 - 00:09:19:13
Andres Glusman
We had like a 16% lift, I think in conversion rate at the time, on this thing that was really important to us, and we were able to quantify it, which was extremely important. But more importantly, we were able to start building momentum and learning from the thing that was changed and the outcome we saw. And invariably, what ends up happening is that the next several experiments we ran did not move the needle, no effect, but we were able to learn from every single one of those.
00:09:19:13 - 00:09:39:22
Andres Glusman
And so the experimentation engine at Meetup, ultimately, when I was leading product and growth, was around a mindset and a culture of running experiments in order to figure out how to drive growth, and being able to accomplish our goals by either running an experiment that created a lift or running one that created learning. There's that classic saying that, you know, experience is what you get when you don't get what you want.
00:09:39:22 - 00:09:52:07
Andres Glusman
I think learning is what you get when you don't get the result you want in some regards when it comes to running experiments. But in an ideal world you're getting both the learning and the lift every single time. But the world doesn't work that way. It doesn't cooperate.
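To make the split-test idea concrete, here is a minimal sketch of how one might be evaluated, using a standard two-proportion z-test in Python. The traffic and conversion numbers are hypothetical, chosen only to echo the roughly 16% relative lift Andres mentions; this is an illustration, not Meetup's actual tooling.

```python
# Minimal sketch: evaluating a split test with a two-proportion z-test.
# All figures below are hypothetical.
from math import sqrt
from statistics import NormalDist

def evaluate_split_test(conv_a, n_a, conv_b, n_b):
    """Return (relative lift, two-sided p-value) for variant B vs. A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided test
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical traffic: 10,000 visitors per arm, ~16% relative lift
lift, p = evaluate_split_test(conv_a=500, n_a=10_000, conv_b=580, n_b=10_000)
print(f"relative lift: {lift:.1%}, p-value: {p:.4f}")
```

With these made-up numbers the test reports about a 16% lift at p ≈ 0.01, the kind of quantified, significant win that makes experimentation "addictive"; a flat result would still be the "learning" half of the trade-off Andres describes.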
00:09:52:22 - 00:10:02:03
Andrew Davies
But I guess if you're building the right culture of experimentation, then both lift and learning can be addictive, to use your word, which is, I guess, how you build repeatability into the culture. So that's fantastic.
00:10:02:08 - 00:10:19:00
Andres Glusman
That's exactly right. It's like a straight dopamine hit to everyone in the culture, which is very interesting, because there is such a thing as too much experimentation, it being too addictive. And so addiction is actually probably the right word in this regard, because there's a time to pull back and not run tests, and a time to actually run those tests.
00:10:19:00 - 00:10:31:16
Andrew Davies
Let's go down a level deeper on that word addiction, then. When you were working at Meetup, and now when you're working with other companies and advising founders on thinking through that process, how do you get people hooked? What is that journey to get someone hooked on that process?
00:10:31:16 - 00:10:58:13
Andres Glusman
Momentum begets momentum. The reality is, the earlier you can get somebody to a positive win in whatever journey they're on, the more likely they are to stick to it. This can be true when you're lifting weights, right? It's true when you're running experiments as well. The kiss of death for any culture, any company that's trying to run experiments, is for the first few experiments to not work, because then everyone who is a naysayer is going to throw cold water on it and say, this is a waste of time.
00:10:58:13 - 00:11:11:17
Andres Glusman
And what people don't really realize is that that's sort of the natural order of things. Running experiments is a lot like being a VC, you know, and great VCs are lucky if they can get one out of ten wins or two out of ten wins with one being a home run. Same goes for experiments. This is an interesting fact.
00:11:11:17 - 00:11:31:01
Andres Glusman
Andrew, I'm sure you know this from your experience as well, but according to Optimizely, in fact, 80% of experiments that are run do not positively move the needle. Eight out of ten produce learning and not lift. And so the question becomes, well, why, right? And how do you actually overcome those odds to be more likely to get that win?
00:11:31:01 - 00:11:54:03
Andres Glusman
And so, to your point about what creates the addiction, or what doesn't: if you're lucky and one of those first few tests gets the win, great. But if you're just the average player, it's like sitting at a table in a casino with one-in-five odds. You might go four hands without a win, or you might get lucky and get the win on the fourth or fifth hand.
00:11:54:03 - 00:12:03:14
Andres Glusman
It might take six or seven. That's how statistics works. And so the thing that creates the addiction is if you can stack up those wins early on, because then people want it, and they want more.
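The casino math here follows directly from the geometric distribution: if roughly one in five experiments wins, a dry streak of four or five tests is entirely normal. A quick illustrative calculation, assuming the 20% win rate cited above rather than any universal constant:

```python
# If only ~20% of experiments win, the wait for a first win follows
# a geometric distribution. Illustrative arithmetic only.
p_win = 0.2

for n in (1, 3, 5, 10):
    p_at_least_one = 1 - (1 - p_win) ** n
    print(f"P(at least one win in {n} experiments) = {p_at_least_one:.0%}")

# Expected number of experiments until the first win: 1 / p
print(f"expected experiments to first win: {1 / p_win:.0f}")
```

Even after five experiments there is still roughly a one-in-three chance of zero wins, which is why Andres frames early losing streaks as the natural order of things rather than evidence that experimentation has failed.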
00:12:03:14 - 00:12:22:04
Andrew Davies
Part of this is about setting the expectation that it might take five or six or seven goes. But is there also an approach where you want to load the dice as you're sitting down at the casino, to make sure there's more likelihood, an easier hypothesis, or a frame of reference that helps the first few win? How do you think about that when advising companies starting this process?
00:12:22:04 - 00:12:35:16
Andres Glusman
That's exactly right. That's what motivated me to start DoWhatWorks, because that was the pain I felt. I was running those experiments at Meetup, and I saw that it takes a while to win. The way you load the dice, or I'll frame it the other way: why are the odds one in five?
00:12:35:17 - 00:12:54:16
Andres Glusman
Well, one is that it's hard to change behavior. Let's just accept that: we're trying to change behavior, and it's hard to create things that work. But the second thing is that no one learns from anyone else. Every lesson any individual company is learning, how they're laying out their pricing page, how they're conveying value, how they're conveying discounts, whether they're offering free shipping or not,
00:12:54:16 - 00:13:14:04
Andres Glusman
every single one of those is done in isolation, and no one learns from anyone else. So they're all running the same experiments, learning the same lessons, but no one benefits from any other experiment. Can you imagine what science would be like right now if no scientist ever learned from any other scientist? Our point of view, and this is my own personal experience, is that I overcame that when I was at Meetup by having meetings.
00:13:14:04 - 00:13:29:02
Andres Glusman
I used to just sit down with fellow product leaders in the New York area and talk to the folks over at Etsy and the folks over at Shutterstock, really great, amazing, wonderful people, and we'd swap notes about what was working. And that, as you say, and I love that term, loaded the dice in our favor.
00:13:29:02 - 00:13:45:18
Andres Glusman
What motivated DoWhatWorks is basically: what if we could create a way, at scale, to help people learn from every other experiment being run by anyone who's running experiments? That's what it's all about. But the reality is it's just about getting signal by any means possible,
00:13:45:21 - 00:13:50:22
Andres Glusman
as early in the process as possible, so you know whether you're hallucinating or whether you're actually onto something good.
00:13:50:23 - 00:14:08:00
Andrew Davies
Let's go to the second part of that. You said the process to addiction is important, and so is loading the dice. So how do you know when a culture is going beyond being a high-functioning addict, to the point where the experimental option is being used so frequently that it's actually a crutch or friction in the organization itself?
00:14:08:00 - 00:14:26:05
Andres Glusman
When the answer to every single question is "let's test it," it's being used as a crutch. Or when the experiments themselves get smaller and smaller without the traffic getting bigger and bigger. If you have Google-scale traffic, it's totally cool to test 50 shades of blue, super cool, because that represents gazillions of dollars, right?
00:14:26:05 - 00:14:42:19
Andres Glusman
Billions and billions of dollars are on the line with different shades of blue. If you're a startup and you're testing 50 shades of blue, good luck; it's going to take you a million years to get that result. It ultimately comes down to this: you're running too many experiments when the outcome of the test is not proportionately valuable.
00:14:43:09 - 00:15:01:08
Andres Glusman
It doesn't cover your downside risk in a way that makes it worth spending the effort. You want to run tests on the things you're unsure about that are very consequential, positively or negatively. An organization is addicted to running tests when everyone is just running tests in order to say, I checked the box, I did it.
00:15:01:08 - 00:15:09:21
Andres Glusman
"The downside? There was no downside. It's not my fault." When it's about avoiding risk as opposed to seeking gain, that's when you're running too many experiments.
00:15:10:00 - 00:15:38:13
Andrew Davies
Maybe this is where we can start hearing a bit more about DoWhatWorks. In that process, with founders I've worked with who are looking at adopting a more scientific approach, often there's lots of head-nodding when they read Lean Startup or some of the methodology. But the common frictions I hear will be things like the amount of time it takes and not having the resource in the team, or not understanding the process in their practical environment, or the stats, a question like: I just don't have enough data points for this to be statistically significant.
00:15:38:14 - 00:15:45:16
Andrew Davies
What else do you see in that list of reasons why not? And then talk to me a bit about how you encourage people to overcome them.
00:15:46:00 - 00:16:01:10
Andres Glusman
The unfortunate truth is that they're right on all those fronts, so they're not wrong. It does take forever to run an experiment, and as you get smaller and smaller, it takes longer and longer. It is often the case that you are going to be wrong, and that's okay as well, right? So you need to have those expectations upfront.
00:16:01:10 - 00:16:27:13
Andres Glusman
It does take work. It does slow you down. You're creating two variations of something. So they're right on all fronts. The reality for us, though, is that we work with some of the top B2B SaaS unicorns and major banks, and they're constrained too; they feel like it takes too long to run an experiment and get the results. So why would you be crazy enough to invest a month against all these odds?
00:16:27:13 - 00:16:46:00
Andres Glusman
And the reason is because the impact of any given experiment, when it breaks in your favor, can be pretty substantial. A 20% win in a smaller company, or even a 5% win in a very large corporation like some of the companies we work with, is a lot of money, and it makes a huge difference. So it's worth taking a few shots and trying a few different variations, because it works out.
00:16:46:00 - 00:17:17:09
Andres Glusman
And in that regard, the VC model is exactly the right model for it. It's a good analogy, because why do VCs invest in all those companies? Because only one or two are really going to generate the results and the return on the fund. You want to approach experimentation the same way, as long as the outcome can give you a really big result, which is what most conversion optimization does. You get a very positive effect by improving conversion, because all the money you spent getting people to your website, all the money you're spending on ads to get people through, suddenly becomes worth
00:17:17:09 - 00:17:32:03
Andres Glusman
10% more. And that's a pretty massive lift, a huge influx of resources into your organization. You can either bank it, take it as profit, or use it to outspend your competitors in advertising or in acquiring more companies, etc.
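The VC-style argument can be put in rough expected-value terms. This is a back-of-the-envelope sketch with entirely hypothetical figures, not DoWhatWorks' methodology:

```python
# Back-of-the-envelope EV for "why invest a month in one experiment?"
# Every number here is hypothetical, for illustration only.
p_win = 0.2                  # ~1 in 5 experiments produce a lift
monthly_revenue = 500_000    # revenue flowing through the funnel under test
relative_lift = 0.10         # a winning test improves conversion ~10%
horizon_months = 12          # how long the winning variant keeps paying out

expected_gain = p_win * monthly_revenue * relative_lift * horizon_months
cost_of_test = 20_000        # roughly a month of team time

print(f"expected gain: ${expected_gain:,.0f} vs cost: ${cost_of_test:,.0f}")
# -> expected gain: $120,000 vs cost: $20,000
```

Even at one-in-five odds the expected gain comfortably covers the cost in this sketch, which is the portfolio logic Andres is describing: most tests lose, but the occasional large win pays for the whole program.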
00:17:32:10 - 00:17:44:17
Andrew Davies
And so when I go to DoWhatWorks, I see: "You can't test everything. You might as well do what works." So talk to us a little about the proposition, and how people can learn from other people's winners and losers to load the dice.
00:17:44:17 - 00:18:11:17
Andres Glusman
What we've built is an engine that detects the growth experiments being run by any company. For any company we care to look at, we can understand what experiments are running, what's winning and losing. We then allow our clients to use that data upfront, before running experiments, before making changes, to more effectively write ad copy and to optimize their experience based on what is in fact winning for other people in their space.
00:18:11:17 - 00:18:29:21
Andres Glusman
So their direct competitors, to some degree. But much more interesting: if you're a product-led growth company, you can learn as much from the pricing page of a company you don't compete with as from one you do. You can learn as much from, say, Calendly or Airtable as you can from looking at your direct competitor.
00:18:29:21 - 00:18:52:16
Andrew Davies
We're going to have to continue this specific discussion off camera, because with the ProfitWell Metrics data sets we've got 34, 35,000 companies where we look at not just the financial metrics. If we start looking at pricing pages and other things, there might be some very interesting ways of playing around with exposing more of that for the industry's benefit.
00:18:52:16 - 00:18:54:21
Andrew Davies
So let's take that one offline.
00:18:55:23 - 00:19:14:13
Andres Glusman
You know, I didn't even bring this up, but ProfitWell and Protect the Hustle have been such a fundamental part of my journey over the last few years, especially during the pandemic: going on walks, getting out of the house, and listening to this specific show. We actually drew a lot of lessons from the show and from your experiences there.
00:19:14:13 - 00:19:20:22
Andres Glusman
We've actually modeled some of our specific strategies on lessons you all have shared. I really, really want to thank you for that.
00:19:20:23 - 00:19:41:16
Andrew Davies
Hey, that's great. It's lovely to see the rising tide of information that's based on data, and we definitely want to be continuing contributors to that. But thinking about the other contributors and educators in this space: you mentioned the Lean Startup movement, and I was quickly Googling to remind myself of when The Four Steps to the Epiphany was published.
00:19:41:19 - 00:19:59:09
Andrew Davies
That's 2005, right? And then obviously Eric Ries with The Lean Startup, which I think was probably 2011. Are there other seminal tomes you'd point to that are more recent than those? And if not, or if so, what are the learnings that have been updated for this decade, if those are from the two prior decades?
00:19:59:13 - 00:20:17:22
Andres Glusman
Those two books shaped us. The Four Steps to the Epiphany was just an epiphany moment for me in reading it, the classic journey a lot of people went through in the early days at that time, and it opened my eyes to so many things and ways of thinking. Eric Ries became a friend of mine as a result. Part of the Lean Startup movement was actually powered by meetups.
00:20:17:22 - 00:20:42:12
Andres Glusman
There were Lean Startup meetups everywhere. I ran one in New York with a couple of guys I'm really great friends with, and we had 5,000 members. He's actually an investor in DoWhatWorks now, to go full circle, which is really, really great. To your question, though, and what a wonderful question: I think the people producing the best insights right now on the sharing front are probably a lot of the folks over at OpenView, and Kyle Poyar.
00:20:42:12 - 00:21:03:00
Andres Glusman
Yes, and the product-led growth movement, and a lot of the folks I really enjoy watching on LinkedIn, Casey Hill, for example. It's all happening on LinkedIn; it's all happening on carousels at the moment. I'm personally very addicted to LinkedIn carousels, but it's amazing what you can learn from a LinkedIn carousel.
00:21:03:07 - 00:21:18:09
Andres Glusman
We're contributing our own, putting our own out there right now, just to give back. But I think it's all happening there, and to your point, it's very use-case specific and super tactical. It's less strategic at this point and much more just: here's the email I wrote, here's the thing I did.
00:21:18:10 - 00:21:35:11
Andres Glusman
Here are the very specific things. I think people at this point are almost thinking like object-oriented programming, where they're looking for building blocks and trying to stitch the building blocks together to make something really cool, build upwards, and have it be an emergent strategy, as opposed to a philosophical framework:
00:21:35:11 - 00:21:50:14
Andres Glusman
"go use this framework in order to make this, that, or the other happen." Wait, I should give props to ProfitWell, and a tip of the hat over there to Paddle, because you definitely are giving back to the industry too. How did I not think of that right away?
00:21:50:14 - 00:21:59:04
Andrew Davies
On the book front, there's definitely a space for the 2020s update of some of this. So maybe we should be marking that on our bookshelf as a space ready for it.
00:21:59:04 - 00:22:24:03
Andres Glusman
You know what's funny is DoWhatWorks actually originated in part because I was taking time off between Meetup and thinking about my next thing, and I was starting to play with ideas. One of the ideas I was playing with was a book called Lessons from the Fringe, where I wanted to analyze what you can learn from drug dealers and from different industries that are sort of on the fringe, like street meat, the vendors in New York City, and the porn industry.
00:22:24:03 - 00:22:36:21
Andres Glusman
And other sectors like that, and what you can learn from them. That was the book I was actually thinking about, and it was in trying to think about how I would understand those businesses that I actually hit upon the idea for DoWhatWorks, in terms of one of the methodologies.
00:22:36:21 - 00:22:52:12
Andrew Davies
In the spirit of learning where there isn't a lift, can you talk us through some of the abject failures on that Meetup journey? Because that was a fantastic success story from the outside in. But talk to us from the inside, and it doesn't have to be experiments. It could be strategies or cultures.
00:22:52:12 - 00:22:58:09
Andrew Davies
Talk to us a bit about what you would have done differently if you had that journey again, and what people can learn as they're listening to this.
00:22:58:09 - 00:23:16:08
Andres Glusman
It's interesting, because it is one of those things where the lessons definitely reinforce the bigger point of view that I now feel very strongly about and put out there. The biggest failures came from being big-bet driven, and the biggest successes also came from being big-bet driven. It's hard to know the difference.
00:23:16:08 - 00:23:32:00
Andres Glusman
In the early days, for example, I described a little bit of what Meetup 1.0 looked like, and then there was a massive redesign toward this organizer-member model. You can't get bigger as a bet than that. It was one of the biggest bets you could possibly make, throwing out the entire way the system worked. But why did it work?
00:23:32:00 - 00:23:56:03
Andres Glusman
It worked because we saw people using Meetup in a very specific way, creating groups around it and almost hacking our system to do things in a very specific way. They started having organizers and members, and so we changed the platform to map to how people were using it. That was great, a huge win. There were times, though, later at Meetup where we said: we want a step-function change.
00:23:56:08 - 00:24:11:07
Andres Glusman
We don't want to keep growing in this nice linear way; we want really explosive growth. And so we need to be prepared to make a bigger bet. The approach we took was to say, let's completely rethink how this should happen. And it wasn't pulled from examples in the wild of how people were using it.
00:24:11:07 - 00:24:35:10
Andres Glusman
It wasn't pulled from looking at data that gave us a signal. It was very vision-driven, and every time we tried to create behavior in a vision-driven way, as opposed to letting behavior be emergent and making it really, really simple, we would fail. There was a time, and it was a very painful time at the company, where we had the entire team working for months and months on this one big, massive redesign.
00:24:35:10 - 00:24:54:03
Andres Glusman
We relaunched the platform, and it went so poorly; it could not have gone worse. It was a terrible launch. And the biggest reason was that we didn't take the steps in between to say: well, can we find a signal that validates that this is a good idea, as opposed to being purely vision-oriented?
00:24:54:03 - 00:25:16:00
Andres Glusman
And those were the failures. When I look back at my career, the same is true at Meetup and other things I've done: it's when there's an emergent behavior, or signals that you're capitalizing on and trying to make better, that things really magically work out. That's where innovation can really happen. When it's purely a point of view that says we're going to make this brand-new thing happen in a different way?
00:25:16:03 - 00:25:18:08
Andres Glusman
No, I've yet to see that work.
00:25:18:11 - 00:25:33:10
Andrew Davies
And so if we take it back into the realm of hypotheses and experiments: one of the phrases we loved from your pre-interview chat with Ben was "don't recreate the losers." So talk to us about what that means and how you can avoid the losers up front.
00:25:33:15 - 00:25:49:23
Andres Glusman
The reason, I believe, as I was saying, that 80% of experiments fail to move the needle and only produce learning, the reason they fail, is that you're recreating a test somebody else has run that could have given you a signal. You're conveying your pricing in a certain way, and you've put your best idea forward.
00:25:49:23 - 00:26:06:14
Andres Glusman
You're using your own assumptions; you started with an assumption that is fundamentally flawed, and you're running the test and ultimately recreating something other people have done that is a loser. The best way to not recreate the losers is to learn from other people and see what worked and didn't work. Of course, I'd love everybody to use my service to do that.
00:26:06:14 - 00:26:31:08
Andres Glusman
That would be great. I would love that. You can also talk to your friends at a different company, in a different industry, who are solving the same problem, like I did. You can look for other examples. What you just don't want to do is copy blindly. I've seen people do this: our system detects these experiments being run, and I've seen an example where one company copied the experiment being run by another company in real time, but they copied the losing variant. They just went to the website,
00:26:31:13 - 00:26:47:07
Andres Glusman
saw, Oh, there's a brand-new way of presenting the offering, we're going to copy that, we're going to test it. So they ran the exact same test, spending a month or two recreating everything about it in their own style, because that's their kind of fast-follower model. And they then relearned a lesson this other company could have revealed to them.
00:26:47:07 - 00:27:10:15
Andres Glusman
So it comes down to a fundamental understanding of what signal to pay attention to: making sure you can understand what worked or didn't work for somebody else, by any signal possible, whether a system like ours, conversations, reading somebody's case studies, or whatever you can find online. That's the way to avoid spending time on stuff that doesn't work: getting the right signal upfront and saving yourself from wasting your most precious resource, which is ultimately time.
00:27:10:16 - 00:27:30:10
Andrew Davies
I'm intrigued whether you advise people to go and learn from very close competitors in the same segment or geo or price point, whatever that segmentation might be, or whether you actively encourage people to build on the assumptions and tests of companies and teams outside their segment.
00:27:30:15 - 00:27:51:00
Andres Glusman
People come to us a lot of times excited about a thing their competitor is testing, and I think that's pretty cool. I think it's way cooler, though, to see what other people in adjacent spaces are testing. There's a fabulous case study from Southwest Airlines. Southwest Airlines in the United States is famous for being very efficient and getting their planes off the ground fast, because when planes are on the ground, they're not making money.
00:27:51:00 - 00:28:05:18
Andres Glusman
When planes are in the air, they're making money. So you need a ground crew that's able to turn a plane around: get it serviced and out the door. Who did they study? They didn't study American Airlines. They didn't study United Airlines. They didn't study Virgin. They studied Formula One pit crews.
00:28:05:18 - 00:28:32:15
Andres Glusman
They were just trying to understand what worked and didn't work for a Formula One pit crew in order to figure out how to make their own ground crews more efficient and get the planes back up in the air. So for me, I'd rather see somebody learning not from their direct competitor but from other people in a similar space: companies targeting small businesses, or product-led growth oriented, or maybe consumer subscriptions. Not in the meal-kit space, but selling boxes of dog toys, or whatever the case may be.
00:28:32:19 - 00:28:56:17
Andres Glusman
I think you can learn as much, if not more, because you tend to get tunnel vision when you're focusing on your competitors, and more often than not you're learning the wrong things. The exception, of course, is messaging. If you can understand the messaging that is most likely to resonate, because your competitors are talking directly to your target audience, I think you can learn a lot from them.
00:28:56:17 - 00:29:14:20
Andres Glusman
So what I like to say is: don't learn from your competitors, learn through your competitors. See what they do, try to understand the parts that are applicable to you, and borrow what's worth borrowing. You don't have to be too proud. There are things that don't matter as differentiators in the big scheme of things, but they can reduce friction for your customers and make their lives better.
00:29:14:20 - 00:29:30:15
Andres Glusman
That allows you to focus all of your chips and energy on the things that actually do matter, that will differentiate you. And that's what I think you learn by looking in adjacent spaces. It's a bit of a "yes, and": you've got to know what to look for from your competitors, and what to look for from other companies that are more inspirations.
00:29:30:15 - 00:29:59:09
Andrew Davies
Let's jump into practical coaching mode. We'll have a bunch of founders listening to this at various stages, but let's set an imaginary context: there's a business, perhaps a product-led business, in their first few months. They're getting a bit of traffic through their user acquisition funnel and are now thinking about experimentation. What's the checklist of tasks you would walk them through in order to set up, measure, and learn from their first few experiments as a business?
00:29:59:09 - 00:30:21:13
Andres Glusman
The very first thing I would do is encourage them to understand their golden pages: what are the areas, the key experiences, that are the most fundamental in the entire experience? Most early-stage companies are fortunate because they don't have that many experiences, so there are very few of them. There's really just a handful of places where all roads lead through, and you want to devote as much of your firepower as possible to making that experience as great as possible.
00:30:21:13 - 00:30:40:13
Andres Glusman
The first question is: do you have enough traffic to actually run an experiment? If you don't, go grab from the best practices of everyone else, put your best guess forward, and move on. If you're early stage, don't bother running a ton of experiments if it's going to take you three months, five months, to get results on any one given page. Make your best guess, put your best foot forward, and move on.
00:30:40:14 - 00:30:57:03
Andres Glusman
Don't run the tests. As you start to reach a certain level of scale, where you've got money you're spending, you're trying to get people through, and small, meaningful improvements in that conversion rate make a big difference, isolate the most important variables and the most important mechanics a user is going to go through that have to break in your favor.
00:30:57:03 - 00:31:19:00
Andres Glusman
The question I always ask at this early stage is: what has to break in your favor in order for this thing to work? It might be achieving a certain cost of acquisition for an early-stage company. It might be getting a single customer, or having somebody pay you for the second or third month in a row. Devote all your resources to that thing first and foremost, and then work backwards from there.
00:31:19:02 - 00:31:37:09
Andres Glusman
The other components, as you start scaling up, running more experiments, and trying to create a more robust program (so maybe a little less founder territory and more head-of-marketing territory, where you now have scale, like some of the challenges you're facing here; you didn't ask me for coaching, but I'll give it to you if you'd like) are around prioritization.
00:31:37:09 - 00:31:55:07
Andres Glusman
The problem starts to be that you have organizations with more ideas than you have time to address them, and more opinions than you have data, which is generally the way it usually works. I like to joke that product management and marketing are, in some regards, team sports, but they're full-contact team sports.
00:31:55:07 - 00:32:16:23
Andres Glusman
You have a lot of people with a lot of strong opinions but not a lot of data. So to the degree you can use data to help you prioritize, to prioritize the sequence and get alignment, that's fundamental to actually getting a team aligned and going. It's probably one of the most important things, because in an organization of that size, it's actually about making sure things don't bump into each other
00:32:16:23 - 00:32:34:00
Andres Glusman
on the route to honing in on the experiments most likely to produce the biggest results. That is the biggest challenge. The cruel joke about experimentation is that you only get the data after you run the test, but when you need it is before you run the test. The general coaching I give is to try to get your hands on data as early in the process as possible.
00:32:34:00 - 00:32:59:08
Andres Glusman
Then, as you start moving into the execution phase, the thing I always love to see is the team that's working on it looking at other things together. You could do that by looking at experiments other people have run. Or say you're thinking about launching a brand-new feature (I'm trying to imagine one you might want to launch) and you see that Stripe has similar functionality.
00:32:59:08 - 00:33:19:10
Andres Glusman
I know you don't compete with Stripe, but imagine there's similar enough functionality there, or maybe at Venmo, whatever the case may be, and you're trying to launch a consumer-oriented feature. There's no reason you can't usability-test people using Venmo, for example, and get signal before you've written a single line of code, by watching actual users use the thing.
00:33:19:10 - 00:33:39:01
Andres Glusman
So it's really about getting data by any means necessary, qualitatively through conversations and quantitatively through harvesting other people's experiments, whatever the case may be: getting the data upfront to inform the team running the execution so they can make better assumptions. Because ultimately, better assumptions are the difference between a hit and a miss.
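Andres's first checklist question, whether you have enough traffic to run an experiment at all, can be sanity-checked with a standard sample-size approximation. This sketch uses the classic two-proportion formula at 95% confidence and 80% power; the baseline rate and target lift are hypothetical:

```python
# Rough sample-size check: "do you have enough traffic to run a test?"
# Classic two-proportion approximation; all inputs are hypothetical.
from statistics import NormalDist

def visitors_per_arm(baseline_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed in EACH arm of an A/B test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for 95%
    z_beta = NormalDist().inv_cdf(power)            # e.g. 0.84 for 80%
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    n = ((z_alpha + z_beta) ** 2
         * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
    return int(n) + 1

# 3% baseline conversion, hoping to detect a 10% relative lift
print(f"~{visitors_per_arm(0.03, 0.10):,} visitors per arm needed")
```

At a 3% baseline conversion rate, detecting a 10% relative lift needs on the order of 50,000 visitors per arm in this approximation, which is why the advice for small early-stage teams is to make their best guess and move on rather than wait months for significance.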
00:33:39:07 - 00:33:58:20
Andrew Davies
One of the things we do as part of these episodes is publish a field guide that comes out of the discussion, and some of the clarity in your framework I know will come through as part of that. So thank you. It's really interesting that you used the word opinion. It reminded me of when I was working at Optimizely. I was running Brand, Demand, and Digital, so the whole web property rolled up to me.
00:33:58:20 - 00:34:19:07
Andrew Davies
The web team rolled up to me, and it was the first example I'd come across of experimentation as a response to the HiPPO, the highest paid person's opinion. I can remember, because we were having to redesign the top-level nav for the optimizely.com website, and every department wanted their thing on the top-level nav. And I remember we had a fantastic HR
00:34:19:08 - 00:34:43:22
Andrew Davies
leader, but she came and said: well, we need Careers on the top-level nav, because we've got loads of hiring to do, and therefore we need Careers up there. Fortunately, the web team was strong enough and robust enough to say: we'll take your opinion, it will become a hypothesis and a test, and we'll run it through a process. And it was extremely obvious to everyone after a month that, with Careers up on the top-level nav as one of the options we were testing, firstly, almost no one clicked on it.
00:34:44:01 - 00:34:57:15
Andrew Davies
And those that did didn't apply for a job. And actually, even looking at the next level, the people who applied, the people who needed it on the top-level nav to find their way to apply, were probably not the applicants we wanted. So I thought that was just a really nice response to executive opinion.
00:34:57:15 - 00:35:10:09
Andrew Davies
I don't know if you've got funny examples of experiments setting culture, but as we draw to the end of our time here, maybe there's an example that comes to mind of using experimentation to help founders set culture.
00:35:10:09 - 00:35:44:16
Andres Glusman
There's a great example of a large organization trying to put a debate to bed, and there are so many debates that linger and linger and linger. It's so funny. There was an organization we were working with, and they told us there had been this lingering debate going on forever: the marketing team wanted to use these big, beautiful lifestyle photos, the product team wanted to use product screenshots and product imagery, and the design team wanted to use these kind of cute Notion-style illustrations, the cartoons, you know, these little drawings that are now very popular on the net.
00:35:44:16 - 00:36:01:09
Andres Glusman
All three teams had basically been locking horns forever about it. And what they said to us was: well, you've got a huge data trail, a vast well of data you can tap. What's the pattern across all the experiments you're seeing related to the use of imagery in our space?
00:36:01:09 - 00:36:24:04
Andres Glusman
Is it lifestyle photos? Is it product imagery? Or is it cartoons? And we were able to distill it down and analyze the likelihood of any one of those things winning in their space. We did a kind of meta-analysis of the experiments, in far-from-scientific terms, and gave them a recommendation that ultimately let them put it to bed: okay, here's what it is for your industry.
00:36:24:04 - 00:36:42:16
Andres Glusman
It's this, this, and this. They ran one more test and then they put it to bed. They said: okay, we now agree. I think it was that product imagery was superior to the drawings, so we're getting rid of the drawings, we're going with the product imagery, and we're moving forward. It was one of those things where data was able to put the debate to bed.
00:36:42:16 - 00:36:57:12
Andres Glusman
What I think is really funny about your story is, well, one, that it was Optimizely, so there couldn't be a better fit of a culture willing to accept the results of an experiment. But two, you had to spend a month, a whole month, consumed with putting that debate to bed.
00:36:57:12 - 00:37:10:00
Andres Glusman
And that's the part, when I think about our industry, where we're going, and the role I want to play in it: I want to save people from spending that month. I just feel so bad that you didn't spend that month doing something else that was so much better, right? Or so much cooler.
00:37:10:01 - 00:37:24:12
Andres Glusman
So it's one of those things where it is important, from a culture point of view, to have the debate be data-backed, to have it be backed by data, because if you don't do that, it will linger. And these things create scars on an organization.
00:37:24:12 - 00:37:42:15
Andres Glusman
To your point about founders, though: the hardest thing from a founder point of view, and the one I have to keep an eye on a lot, is that you're a founder because you have a lot of ideas. You feel really inspired by inventing and creating new things, and you need to be careful to couch the things you're saying as take-it-or-leave-it.
00:37:42:16 - 00:38:06:09
Andres Glusman
This is not set in stone; I think it's a cool idea, but we can go in any number of directions too. So often somebody will hear something and say, oh, I need to immediately lock in. The ability to create a culture where anyone feels comfortable questioning the CEO or challenging the founder, with trust and no sacred cows, is one that I really, really appreciate: anyone can bring this up.
00:38:06:09 - 00:38:27:00
Andres Glusman
And if you can test and prove your way out of it, great. In fact, proving me wrong is the best thing you could do, because I just learned something. That's the hardest thing to instill in a culture, especially as you get larger. I'll tell you one quick story, Andrew, related to this. At Meetup, one thing I used to do when I really wanted to build a culture of experimentation is we had lunch-and-learns for every new employee. They would have lunch with the head of HR,
00:38:27:00 - 00:38:45:13
Andres Glusman
have lunch with the head of finance, have lunch with different team leads over the course of their first month at Meetup. What I would always do in my lunch-and-learn is show them seven experiments I had run at Meetup over the years, show them variant A and variant B, and I would make everyone in the room guess which one won.
00:38:45:13 - 00:38:59:11
Andres Glusman
And then I'd reveal the answer. And of course, I designed it so that it was always the most counterintuitive tests. I think there's no other way you could get seven in a row right. In fact, over the course of the three years I did it, only one person ever got all seven
00:38:59:11 - 00:39:13:18
Andres Glusman
right. And you should hire her, by the way; she's a genius. But only one person ever got all seven right. And the reason I did that was because I wanted everyone to know that you're going to be wrong a lot when you're trying to change behavior, when you're running an experiment. And that's why we run these experiments.
00:39:13:18 - 00:39:33:18
Andres Glusman
Those experiments reveal the actual lesson to learn, or make you better at figuring out where to go. And I wanted everyone to fail publicly, in front of their peers and in front of a senior leader at the organization, to know that it's not a big deal. It was kind of painful for people at first; you could see them being very reluctant, and I had to be very kind and gentle and kind of jocular as I was trying to get people to do it.
00:39:33:18 - 00:39:52:12
Andres Glusman
But I do think those are the kinds of things you have to do to get people to accept it. Another trick is public betting on what will win. For every experiment your team launches, have a poll, and have the people who guessed correctly be entered to win a prize, for example. We used to call it Bet on Red, because our colors were red.
00:39:52:12 - 00:39:56:21
Andres Glusman
So there's a lot you can do to build the buy-in, but the more you can get people comfortable being wrong, the better.
00:39:56:21 - 00:40:11:13
Andrew Davies
That's really cool, and really sound advice. I scribbled down as you were talking: "proving me wrong is the best thing you can do." I'm going to take that away as a challenge for myself, back into my next meeting. That's a great thing for every leader: to make sure there are no sacred cows in their organization.
00:40:11:13 - 00:40:23:19
Andrew Davies
So thank you. I can't believe we've run through all of our time so quickly. I could talk to you for hours, but I really appreciate the wisdom and guidance you brought, and I know this will be so helpful to so many of our listeners. So thank you.
00:40:23:21 - 00:40:30:00
Andres Glusman
I'm so happy to hear that. And again, Protect the Hustle holds such a special place in my heart that I'm thrilled to be a part of it. I'm thrilled to give back.
00:40:30:00 - 00:40:36:17
Andrew Davies
Andres, thank you so much for your time, and I'd love to catch up soon when I'm next over in New York. I'd love to get a coffee. Deal?
00:40:36:17 - 00:40:37:03
Andres Glusman
Deal.
00:40:37:05 - 00:41:00:10
Ben Hillman
Shout out to Andres for being on the show. Now you have a better understanding of growth experimentation. Today we talked about embracing experimentation, learning from successes, avoiding past failures, leveraging real-world experience, and the key to guided learning. Make sure to give Protect the Hustle a five-star review and tell us which lesson from today's episode was your favorite.
00:41:00:15 - 00:41:08:13
Ben Hillman
Thanks for listening. Subscribe to and tell your friends about Protect the Hustle, a podcast from Paddle Studios dedicated to helping you build better SaaS.