Growth Leap

Create a Killer Product: Experiments, User Research, and Product-Led Growth with Hootsuite's Henning Heinrich

April 14, 2023 Stun and Awe Episode 23

In our latest podcast episode, we had a blast chatting with Henning Heinrich, a seasoned product growth expert who's driven millions in incremental revenue for companies across the globe. As the Group Product Manager at Hootsuite, Henning knows a thing or two about creating disruptive and meaningful products.

Product-led growth is all about putting the product front and center of your growth strategy. This means creating a product so damn good that customers will want to shout it from the rooftops and bring their friends along for the ride. It's all about using data, feedback, and experimentation to inform product development and drive growth.






Michel: Henning Heinrich, welcome to the show. Super happy that we finally managed to talk. You're a Group Product Manager at Hootsuite. Can you tell us a bit about your background and how you ended up there? Not many people, at least until recently, studied growth marketing or product management at university, so I'm really interested in understanding the path that led you to where you are today.

Henning: Yeah, sure.

I started my career in growth before growth was really a thing. Back in the day, it was more around digital marketing, those kinds of things.

And I naturally gravitated into conversion rate optimization.

And then testing: A/B testing, experimentation. I spent a lot of my career in experimentation, and then growth marketing became a thing. Again, it wasn't like, "Hey, I want to be a growth marketer." Growth marketing naturally became a role, and I moved into it, focused on acquiring users, customers, signups, trials, depending on which company I worked for.

And then from there I moved into product, because I got more interested in retention. Like we used to say: I filled the bucket at the top, but I saw it leaking. I got really interested in that and wanted to avoid the leaky bucket, so I went into what we call product growth today.

Henning's team at Hootsuite

Michel: As you said, growth marketing, or product growth, became a role. A lot of people use these terms as if everybody's clear on what we all mean. But I'm really interested in understanding how your team at Hootsuite is set up. Can you share a bit about what your team looks like, and also how it interacts with the other teams?

Henning: Yeah, for sure.

My team is made up of a couple of product marketers that oversee certain areas of the customer journey. We have teams very focused on activation: customers coming in, doing a first visit, unlocking some value, going through onboarding, those kinds of things.

We have a team that looks after adoption: once someone has moved into being a beginner with our tool, what can we do to make them a regular, to get them into deeper functionality? And then we have a team very focused on expansion: once they become regulars, do they want to adopt more functionality, buy more functionality?

And then we close the loop with advocacy. We have one team looking after advocacy, which is: if you have customers that are your champions, get their support to advocate for your company. That's the loop we look after, and that's how our teams are set up.

And then there are obviously the people that don't want to stay with us anymore, that are at risk and want to cancel. We have a team looking after that as well.

The difference between growth marketing and product growth

Michel: Since you've been on both sides of the fence, how do you see growth marketing versus product growth? What are the key differences? And I'm assuming that's reflected in how the teams are set up.

Henning: My short answer is growth marketing looks after acquisition and product growth looks after retention. There are three levers you can pull when you want to drive growth: acquisition, retention, and monetization. Monetization can sit on either side; right now, for us, it sits very much on the growth marketing side, but it could also sit in product.

And there's some overlap. As I said, we look after expansion, for example, which has a lot to do with pricing and packaging as well. But in a nutshell: growth marketing looks after acquisition, product growth looks after retention.

How to develop good growth habits

Michel: You said a while back that growth marketing is not a magic potion, it's a habit. And I know that you do CrossFit, which I believe requires developing a lot of good habits. Can you tell us a bit about what a good growth habit would be?

Henning: That's a great question. Yeah, it's about stacking good habits, you're right. Back when I said that, what I really meant is that at the time there was a lot of hype around growth hacking. It was like, "Hey, there's this one magic thing," because you had companies like Airbnb that had done this thing with Craigslist.

They posted on Craigslist and basically got 10x growth by leveraging another platform. That was the growth hack, and at the time everyone was looking for those growth hacks. But the reality is it wasn't one thing that they came up with that magically grew them 10x.

They probably tried a thousand things before that, and then they found the one thing that worked for them. But it doesn't mean it works for any other company; it doesn't even mean it works for any other marketplace. So my point back then was: approach it more iteratively.

You have hypotheses, you test those hypotheses, and either you find something that works or you don't. Regardless, it's not one magic thing you can just copy-paste from another company.

How Henning manages the growth process with his team

Michel: I'm interested in the process and how you actually run it with your team. There have been books written about the step-by-step process, like Hacking Growth. Conceptually it's relatively easy to understand; the challenge I feel with any framework, concept, or process is actually implementing it in your team.

So I'm really interested if you can walk me through a day or a week, just to see how you manage the process with your team.

Henning: On a very high level, we have a dual-track approach: we do discovery and delivery. Discovery for us is really diving into understanding the problem; delivery is delivering the solution. We spend a good amount of time on discovery, really understanding the problem areas first.

The way we do that is by talking to customers. There are a lot of qualitative data points we get from really understanding the customer pain points. Not just looking at data and coming up with a hypothesis, but actually talking to customers, understanding why, and getting to the root cause of their pain points.

From there we find areas we want to dive into and understand better. As a next step, my team quantifies the problem, because as product marketers we're so trained to look for customer pain points that when we find one, we're so excited we want to build a solution for it right away.

We sometimes jump a little too fast into solutioning. What I see a lot is that the problem we identified is a real problem, but only for maybe one or two customers out of a hundred thousand. If we built a solution, it wouldn't really solve anything for our entire customer base.

So as a next step, to avoid that, we look at the opportunity size, or the problem size. It is a problem, but how big is it? That goes into prioritization, understanding the impact, those kinds of things.

We work very closely with our counterparts in engineering and design to understand what the effort would look like if we tackled this. What's the scope? Will this take us half a year, or one two-week sprint? Then we prioritize those things, and then we go into delivery.

We build what we call a minimal viable test: we're not going to build the entire big thing, we just build maybe a slice of it, test it, and see if there are some signals.

See if we can validate some of the thinking we had. And if we do, we double down on it, build it out properly, and spend more effort on it.

How to quantify an opportunity and the three activation moments

Michel: You mentioned quite a few interesting things in there. I want to talk about user research, but just before that: quantifying the problem. That also sounds conceptually easy to do, but in my experience people tend to overweight one specific criterion versus the others.

I'm really interested in how you measure the impact of your ideas, whether in terms of potential revenue or retained customers. What are the usual key metrics you put into that evaluation?

Henning: I've done this in the past and totally overestimated what the impact could look like, and you learn from that. I think the key is, a), to be very conservative. If you think it'll impact 10% of the customer base, bring that down to half, 5%, or maybe even down to 1%, and work with the more conservative number.

The other thing is you need to figure out what your success metrics are first.

Let's talk about activation. There are different moments in the customer lifecycle that we're trying to hit.

First is the setup moment, where we look at the minimum requirements to get someone onto our platform. For Hootsuite, that means they need to authenticate with their social profiles; we need to build a connection between their Facebook profile and Hootsuite, for example.

That's the minimum; it's the setup moment. And that's just one example: we have three of those moments in activation. The second is the aha moment, when they unlock value and go, "Aha, this is the value here." And the habit moment is when they start to build a habit around using the platform.

Why am I saying all of this? Those moments are really important for measuring how well we do in activation. We also build correlations between those moments and early monetization, first-time payment, et cetera. So we see that someone who reaches a given moment is X times more likely to pay us.

And we use all of those correlations to really understand the top-line impact. Once you do the math and really go into it, you realize it's not a hundred percent; not everyone who gets activated is going to pay us. Those numbers are actually quite small.

And once you put all those filters, all those lenses, on top of your assumptions, you get to a much more conservative number than you may have had in mind when you started.
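
As an editor's illustration of the filtering math Henning describes: start from a gut-feel audience, then apply each conservative "lens" (reach, activation lift, pay correlation) in turn. This is a hypothetical sketch; every number below is invented for illustration, not Hootsuite data.

```python
# Hypothetical opportunity-sizing sketch: each factor is a conservative
# "filter" stacked on top of the raw assumption.

def size_opportunity(signups, reach, activation_lift, pay_rate, arpu):
    """Estimate incremental first-year revenue from an activation improvement.

    signups         -- monthly new signups exposed to the change
    reach           -- fraction of signups the change actually touches
    activation_lift -- expected absolute lift in activation rate (kept conservative)
    pay_rate        -- fraction of newly activated users who go on to pay
    arpu            -- average revenue per paying user per year
    """
    extra_activated = signups * reach * activation_lift   # users per month
    extra_payers = extra_activated * pay_rate             # payers per month
    # 12 monthly cohorts in a year; ignores churn and mid-year proration
    return extra_payers * 12 * arpu

# A "this touches 10% of the base" gut feel shrinks fast once the lenses stack:
estimate = size_opportunity(signups=50_000, reach=0.4,
                            activation_lift=0.01, pay_rate=0.05, arpu=240)
```

With these deliberately conservative inputs, the headline number lands at about $28,800 a year, far below what a naive "10% of the base" estimate would suggest, which is exactly the effect of stacking the lenses.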

Using the ICE framework to prioritize opportunities

Michel: If you use the ICE scoring method, you end up with an ICE score, which is a combination of three numbers. Are you using something like this? Is it the PIE framework, or something else? And do you still end up with that consolidated number?

Henning: Yeah, in terms of opportunity sizing, I would say that's the I in ICE. For everyone not familiar with it, ICE stands for impact, confidence, and effort. It looks through those three lenses to understand what the overall opportunity looks like. In terms of understanding the size, I would put that in the impact bucket.

Short answer is yes, we use ICE.

For confidence, I'm asking questions like: what are the data points that actually support our idea or hypothesis here? Is this just an idea someone had, with no data points backing it up? Which is fine sometimes, right?

That's what I meant earlier: if that's the case, we keep the effort low, test something, and see if we can get some signals, some data points, that would justify spending more time on it.

How to use the PIE framework to prioritize growth experiments

Henning: The PIE framework I love because, for everyone listening, PIE stands for potential, impact, and effort. Very similar: it has impact and effort in it as well, but it also has potential. Potential, for me, is about asking the right questions around how much we can actually move the needle.

I use PIE a lot when it comes to little UI changes, especially on the website, in ads, and those kinds of tests we run. I love to use PIE because sometimes we're like, "Yeah, let's do this test. It's on the homepage, so it has a big impact."

A lot of people see it, and the effort might be low, but the change is tiny; it's just a headline, or you change one word. It could be impactful, but maybe not.

So when we prioritize, we have to look through the lens of: okay, what does the change really look like?

What's the potential this could drive? The impact is high and the effort is low, but is there really a lot of potential in changing an exclamation mark to a period in a headline? That's why I love the PIE framework for tiny things: to realize, "Maybe this is too small; let's prioritize something with more potential to move the needle."
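
To make the two frameworks concrete, here is a minimal scoring sketch. The formulas are one common reading (impact and potential rewarded, effort penalizing the score), and the idea names and 1-10 scores are invented for illustration, not from the episode.

```python
# Illustrative sketch of the ICE and PIE scoring Henning describes.
# Scores are on a made-up 1-10 scale.

def ice(impact, confidence, effort):
    """ICE: impact times confidence, discounted by effort."""
    return impact * confidence / effort

def pie(potential, impact, effort):
    """PIE as Henning reads it: like ICE, but weighted by potential to move the needle."""
    return potential * impact / effort

ideas = {
    # big onboarding change: solid impact, decent confidence, some effort
    "pre-select social profiles": ice(impact=8, confidence=6, effort=2),
    # tiny headline tweak: very visible page (high impact), trivial effort,
    # but little potential -- PIE catches what ICE alone would over-rank
    "swap '!' for '.' in headline": pie(potential=1, impact=8, effort=1),
}
ranked = sorted(ideas, key=ideas.get, reverse=True)  # best trade-off first
```

The headline tweak scores high on impact and low on effort, but its low potential drops it below the onboarding change, which is exactly the over-ranking problem Henning uses PIE to catch.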

How to do user research

Michel: Frameworks are great; the devil is in the details. The assessment of each criterion can be very subjective as well, so it takes a bit of practice with the team to agree on some sort of benchmark for what is impactful.

One thing I want to go back to is user research. I've been in a lot of meetings with product people, engineers, and salespeople where everyone was thinking, "Oh, what should we build next?", as if it could just come out of a brainstorming session.

What I've experienced myself, whenever I talk to clients or customers, is that a lot of ideas come just from talking to them. Yes, you can look at the competition, you can copy, you can try to catch up, but personally I've found the real bag of gold is when you talk to customers. And what I've heard frequently was people quoting Henry Ford's famous line: "If I had asked my customers what they wanted, they would have said faster horses."

So I'm interested in how you do user research, how you approach it, and what kind of user research methods you're using.

And, yeah, tell me all your secrets.

Henning: Yeah, I love that. I'm actually very passionate about this; I feel the same way. What I love to say is: if you want to innovate, don't ask your customers. They're not going to tell you your innovation.

Because with innovation, sorry, just getting sidetracked for 30 seconds, we sometimes think of this really crazy thing. Think of the iMac, for example. The iMac was this crazy innovation, but that's actually not true.

The iMac was a personal computer, but the computer already existed. It's not that Steve Jobs created the computer; he took something and made it better, and there's an innovative aspect to that. And even for us as humans, those are the most successful innovations.

Because if you go too far, people have a problem adopting it. Google Glass is a good example: people were not ready for it. The Apple AirPods, I would say, people were like, "Ah, this is kind of weird, you have this toothbrush in your ear," but somehow they got over that, and now everyone's wearing them.

It's normal, right? But in the beginning, I remember, I wasn't an early adopter because I was like, "Nah, no." But they got enough early adopters to push it into the main market. Sorry, got sidetracked; I'm very passionate about innovation.

The key question to ask customers when doing user research

Henning: I think the key with innovation and figuring all of this out is not asking the customer, "Hey, what solution do you want?" You ask the customer: what pain points do you have? What are your problems? How do you work? Especially in B2B SaaS, it's about their workflows; our customers need our software to do their work. We love the jobs-to-be-done framework.

That framework basically asks: what jobs do you have to do, and how can we support you in that? Those are the questions you have to ask. You don't ask what solutions they need. What are the problems? We're here to come up with the solutions; that's our job.

Which tools or channels to use to talk to customers

Michel: One thing you didn't necessarily answer there is how you actually talk to the customers. Do you set up a call with some of them? Is it more survey-based? What kind of tools or methods are you using?

Henning: Yeah. We have what we call a customer advisory group, which is basically a group of a lot of different stakeholders that have a connection to the customer and can bring data points back to the product team.

This is just a process, or a program, that we set up so that we can continuously get that input.

The very classic thing is talking to our customer success managers. In B2B that's an easy path: they're talking to our customers every day, so as product managers we need to be in those conversations.

I've never talked to a CSM and not come out on the other side with so many ideas.

So that's number one. The other thing is, at Hootsuite we have an amazing user research team. They recruit our customers, they recruit the customers of our competitors.

They recruit people that have just tried our solution, and people that have never heard of Hootsuite. Depending on what we want to research, they recruit different groups, do the research for us, and come back with really succinct recommendations.

One more thing that just came to mind, which I think is super underrated for user research: talk to your inbound team. If you have a blog team or an SEO team, talk to them about what people are looking for, because the first place people look for something is Google.

At Hootsuite, for example, we saw people searching for "LinkedIn bio."

If a billion people are looking for "LinkedIn bio," well, let's build our own LinkedIn bio solution, right? There's something there. The same people who search for that click on our blog posts and are very interested in our content, so they're close to our customer.

Or to our ICP, potentially. So let's explore that. It doesn't mean we have to build it right away, but those are inputs; use them as a jumping-off point for the product you build.

Michel: I'd like to talk about experiments you've tested and run. Obviously, our audience is interested in building their own business or career, contributing to growth, and building something that matters. So I'd love it if you could share two or three experiments you've tested in the past where either you knew it would work and got great results, or it was an idea that came from left field and everybody was surprised and said, "Oh my God, this is working."

Why?

Henning: Yeah. Well, first of all, in terms of experimentation: as a rule of thumb, only 25% of your experiments will actually be successful, and that's okay. That's why we experiment. Experimentation is the tool in our toolbox that lets us de-risk really early on, to understand really early on: is this something or isn't it?

In terms of giving you something specific, let's start with one win. It's a very interesting one, and I'll tell you why in a second. In terms of activation, we know someone unlocks value at Hootsuite when they publish or post through Hootsuite to their social profiles, to multiple social profiles, ideally at once. That's our value prop.

Our hypothesis at the time was: well, if that's what they need to do, let's cut everything when they come in, all the fluff, and just drop them right into that.

"Hey, post something, so that you see the value." We did that, and it actually had a really bad impact on our metrics; we activated fewer people. It was actually worse. What the team could have done at that point is say, "Well, it didn't work, let's move on to the next thing." But what we actually did, and I think this is the biggest advice I would give anyone doing experiments: you had a hypothesis, and the first question I ask when I see how an experiment performed is, why do I think it performed this way?

What are my hypotheses for why it performed this way? What that does is kick off a cycle: I had a hypothesis, I build, measure, and learn from it, and now I have new hypotheses coming out of the results, and I keep going.

So for the example I just shared, we did that. We asked ourselves: okay, why didn't it work?

What are our hypotheses? And one smart person on my team said: maybe they're just overwhelmed. We dropped them in very close to the end of their workflow, but they have to make a lot of decisions, upload content, do this, et cetera.

One of the things we were thinking about was that they have to select which social profiles they want to post to, Twitter and LinkedIn, for example.

What if we pre-select those for them? During onboarding, we ask them to connect their LinkedIn with the Hootsuite solution.

What if we just look at the first two profiles they connected, because those are probably the ones that are most important to them, and pre-select those in that process? Basically anticipating your users' needs, which was the learning that came out of it. Not to spoil the results, but we did exactly that: pre-selected some social profiles they might want to post to, and we saw a 10% increase in our activation metrics.

So: exactly the same experiment, great success, but only after we went through another iteration of it.

When you ask about my most successful experiments, I have to tell you, the most successful experiments to me are the ones that took five iterations to get there. The four iterations that were not successful are, quote-unquote, successes, because they got us to the fifth iteration that worked.

How to know if you should iterate a failed experiment

Michel: You're basically saying you tested something, it didn't work, and then you kept pushing in that same direction, right?

So what's your guiding principle? Why did you keep pushing with a third, fourth iteration?

Henning: Yeah, I think it comes down to a couple of things. Do we think that if we find the winning formula, it could move the needle?

You have to watch that, because it's effort; you're basically increasing your effort with every iteration. Then it's about whether we can actually come up with some really good hypotheses based on the results we see.

The point is: don't just stop because it failed. Ask why, and see if something comes out of it. If nothing comes out of it, fine, stop there. Know when to pull the plug.

The last thing I'll say, and it's somewhat related: be data-informed instead of data-driven. You hear this a lot these days. It started with "data-driven" because no one was using data; that's why it was, "Let's use data," and you should use data.

It's a really great tool, but don't be blind. Don't follow it blindly.

Concrete example to test pricing

Henning: Let me give you a really interesting example. A couple of years back, we did a price test.

The way we did it, and a lot of companies do this, is we changed the prices on the website but didn't actually do all the work to change the prices in the backend yet.

What you expect when you increase prices is that your signups go down. Fewer people sign up for your product because they see a higher price and some aren't willing to pay it. That's natural. So you do the math: if you have 10% fewer signups but a 100% higher price, it's a no-brainer; it's still going to be profitable for you.

So we tested different price points, and most came back negative: some negative impact on signups, which is fine, we did the math, et cetera. But one really high price point, let's say a 100% increase, came back with more signups. If you're a data-driven company, you'd go: great, more signups, higher price, cool.

Don't follow the data blindly. And we didn't. At the time we said: this is counterintuitive, this doesn't make any sense. In our case it was either a tracking error or it wasn't statistically significant, I can't remember. But the point is, you ask: why would this make sense?
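
The signup-versus-price trade-off Henning does in his head is simple enough to write down. A sketch of that math, with hypothetical numbers:

```python
# Worked version of the "10% fewer signups at a 100% higher price" math.

def revenue_change(signups, price, signup_drop, price_increase):
    """Relative revenue change from a price increase.

    signup_drop    -- fraction of signups lost (e.g. 0.10 for -10%)
    price_increase -- fractional price increase (e.g. 1.00 for +100%)
    """
    before = signups * price
    after = signups * (1 - signup_drop) * price * (1 + price_increase)
    return after / before - 1

# Losing 10% of signups while doubling a $40 price:
delta = revenue_change(signups=1_000, price=40, signup_drop=0.10, price_increase=1.00)
```

A 100% price increase survives a 10% signup drop with roughly 80% more revenue, which is why most "negative" signup results in such a test can still be clear wins once you do the math.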

Michel: When you tested these different price points, how did you actually do it? Because you're telling me you changed the front end but not necessarily the back end. So let's say instead of paying 40 bucks, you showed 35 or 45.

I would still click on the 40 bucks, or on your new price point, and I would still end up in the checkout with the old pricing.

Henning: In the checkout we would still show the new pricing, but then we would actually just charge you the lower price. It only works this way; we call it surprise and delight, because nobody's going to complain about being charged less than they thought they would be.

The only disadvantage of doing it this way is that it's expensive: you get fewer customers in the door, and everyone is still paying the lower price point. So it's really a learning exercise, but you can de-risk your pricing changes a lot.

You don't run this for a long time, obviously, because it gets more expensive every day you run the experiment. But you run it for a week, two weeks, maybe even three, and you get a good feeling and really de-risk the decision.

Michel: You mentioned on LinkedIn that although marketing has a huge variety of sub-specialties, like CRO, SEO, email, PR, et cetera, we don't find the same range of expertise when we talk about product management.

And you were kind of saying it's time to work on that. When you think about expertise or sub-specialties for product management, what comes to mind for you?

Henning: From my experience, there are four big camps in product management. First, obviously, core functionality; this is how we categorize everyone. We talk about product management and we think of building the product, the core functionalities.

But the reality is we also have some PMs that are a little more specialized on the backend, also called platform.

The third one is growth, which is where I'm very active. The difference from core, as I said, is that core is creating the value for customers.

Growth is connecting the customer with that value. And then you have innovation, which we talked about earlier. Not everything you build is an innovation, but you should have a track for innovation, and you need some PMs that also think about innovation. It's sometimes called zero-to-one, because they're basically validating that there's something there.

What is product-led pipeline?

Michel: I have a few, , follow up questions on, uh, product-led pipeline, which is another thing that you've talked about. Uh, and my understanding is that, you know, historically companies have focused or have, uh, relied on a, on filling their pipeline with sales.

But you talk about a product-led pipeline. Can you define what you mean by this and what companies can do to go in that direction?

Henning: Yeah. I think this is a somewhat sensitive topic, a little bit, because

when we talk about product-led growth, it often has a little bit of a bad connotation, because it seems like anyone that is into product-led growth is basically suggesting that a customer doesn't need any human assistance. There doesn't need to be any sales, there doesn't need to be any customer success. No human assistance, basically, right? But nothing could be further from the truth.

Product-led growth is basically leveraging the product whenever the customer wants to self-serve.

Traditionally, what happens in a very typical B2B funnel is this: marketing drives awareness. They build mechanisms to get leads into your system. Once the leads are in the system, there might be some nurturing, et cetera, but once they become opportunities, sales really kicks in.

Opportunities are what make up this pipeline, basically, right? Opportunities are the ones that really show interest in buying your software or your tool, and that's when sales kicks in. Sales takes it from "I'm considering your solution" to the conversion stage, which is actually converting them into customers.

My point was, right now we have marketing traditionally handing it to sales. What if we put the product in between to enhance the whole thing, make it more powerful, even more profitable, right? Because what happens is: you do marketing, you drive the awareness, someone signs up for the product and shows some product usage. You still qualify them, but now you qualify them and you have product usage. You can basically create opportunities that are a lot stronger, much warmer leads, because now you don't just have someone who's merely considering your tool.

And when sales comes in at that point and picks them up, it's an easy sell. They can sell so much faster. Sales cycles go down.

There are so many advantages to doing that. My point here was: build a pipeline, and leverage the product to build that pipeline, because it will perform 10x better when it comes to selling.

3 ways to build a product-led pipeline

Michel: I agree with you. A lot of people tend to look at what we could lose, as opposed to what you can gain from AI or an idea like a product-led pipeline. But what does that mean in practice? If you have a SaaS business, does that mean you offer a free trial?

Is this like the main way to actually do this?

Henning: There are different mechanisms, I would say, right? You've got to figure out what this looks like for you. In general, what I see is that most customers, at least in B2B, want to try out the product in some way.

So here are the three things that I see. One is just a demo with dummy data. You don't really have to sign up. You just click and you see: this is our product, this is how it looks, here's some dummy data in it.

The second thing that I see is a free product, what we call a freemium motion. You can sign up for a free product, you never have to pay, but you have very limited functionality. That is very good in the beginning to drive acquisition. A free product is really just an acquisition mechanism, but it's very tricky. I was going to tell you this because I've gone through it quite a lot with companies I've worked for. It's very tricky to figure out what we call the entitlements: how much do you give the customer on a free plan, basically, right?

Like, how much do you open this up? Where do you set the limits of the free plan? If you open up too much, there's no incentive for them to pay you. If you make it too little, there's no initial value that they get out of it. So they're like, ah, this is not valuable, and they drop, right?

Canva is doing that. You know, I'm not affiliated with them in any way, but Canva is doing that, and I'm just impressed with how smoothly they've done it.

They found this one feature that people are willing to pay for. It's interesting. It's the background remover. It's just so simple and elegant that I had to mention it.

And then there's the trial, and the trial is very powerful, because you can open up all the functionality and get people to experience, oh my God, there's so much value here. But you timebox it. You say 14 days, 30 days, maybe seven days, and then you lose access to it.

Those are the three things that I see in terms of how companies can leverage the product.

Henning's take on ChatGPT

Michel: We've talked about ChatGPT, and obviously it's huge at the moment. I think there's a lot of hype, like every time there's something new coming up. But this is not just anything. I think it's huge, and it'll become very important in how we do business and how we work.

I'm interested in hearing your take on this. How do you see this changing how people build SaaS products, how they include it in their products, or maybe how you and your team use it in your daily work?

Henning: Great question, and a very timely one. To your point, I think there's a big hype right now, and rightfully so.

What we have is the foundation. We have the technology, but it's on us now to figure out how we can use it to build something that's really useful for our customers. And every company is different in that regard.

If you just copy-paste what we have, like a conversational AI, that might not be the thing your customer needs, right? This is my take on it: you have to look deeper and leverage the technology, but then build it in a way that's useful for your customers.

It's very exciting. And I've got to tell you, we are working on some stuff at Hootsuite that's going to come out in the next couple of months. So anyone that's interested in our kind of solution, brace yourself. There's a lot of exciting stuff coming that's going to help social media marketers.

But I think we're just scratching the surface.

Michel: I agree. And I think a lot of people will find twisted, weird, evil ways of using it, unfortunately, like with any new technological development. But I think it can boost your productivity, and your creativity for that matter, in a way we could never have envisioned before. So I'm really excited about what's going to happen.

Henning, if we want our audience to find you, I think LinkedIn might be the best channel.

Henning: LinkedIn is best. Reach out. Happy to chat.

Michel: Thank you so much again for your time. I really appreciate it. And I'm also looking forward to the hot stuff you're cooking at Hootsuite, so we'll keep an eye on what's coming up in the next few months.

Thank you so much.

Henning: Thank you, Michel. Appreciate it.

Henning's background
Differences between growth marketing and product growth
Developing good habits in growth marketing
Process of implementing growth frameworks
Prioritization and Delivery
Quantifying the Problem
Activation and Monetization
Frameworks for Opportunity Sizing
User Research
Google Glasses, Toothbrush in your Ear, and Innovation
Talking to Customers
Experiments and Hypotheses
Finding the Winning Formula
Being Data Informed
Specialties in Product Management
Product Led Pipeline
Different mechanisms for product acquisition
Exciting developments at Hootsuite
Conclusion