Jacob: what's the single best activation
win you've seen or, or had yourself.
Daphne Tideman: This is
definitely recency bias.
I always get excited about the most
recent ones, but this was an advisory
client that switched their static
onboarding, the kind of classic
questions walking you through and
showing you what you can do in the app,
to an AI onboarding where it was
asking you about eight questions.
They were quite skeptical about it, but
they basically saw that it took people
three times as long to complete.
So you'd think completion
would drop off, but more
people actually completed it.
And the ARPU was more
than 10x after day 14.
Jacob: Super excited to have you joining
us on the Price Power Podcast today.
Today we are talking about activation.
Daphne Tideman: Thanks for having me.
Excited to dive in.
Jacob: The first thing I wanted
to ask you about is that, you know,
you've transitioned through a lot
of different things in your career.
You started out at agencies,
moved into e-commerce.
How did you think about activation earlier
in your career compared to now?
Daphne Tideman: Yeah, so I think
I, I could almost think of it as it
being more 2D versus 3D as it is now.
Like when I started out, I was 21.
I was just getting into growth and
learning as I was going, working
with different startups at the
agency, and it was almost like
activation was just super simple.
Like, oh, it's kind of that step that,
you know, they've signed up, taken
some kind of, uh, you know, action.
It was almost just a step in the funnel.
Versus like over time seeing it as
more like behavioral and action based
and really thinking of it as being
a bit more nuanced than just a step,
but more that there's different forms
of activations and ways to activate
and also quality of activation.
And then when I shifted to Heights, it's
obviously very different than when you're
in e-commerce, because you don't actually
know if people are using your product.
So your best measure of activation
is like, okay, are they actually
renewing past that first period?
But if that period's quite long, it's kind
of like, well, what is activation then?
Are we almost equating it with acquiring
them and ignoring it as a step?
Because if they've got a year's subscription,
is it then that one-year renewal point?
But what we did see there was that
the monthly subscribers did teach us
a lot about who was and wasn't
activating and what their reasons were.
And I also learned that a
lot of people think of activation as
working on that step, but most of
what we did that impacted activation
was more upfront, around how we were
communicating and creating benefit.
It was a supplement, so it takes
a while to feel the benefit.
So having like benefit timelines,
talking about habits, stacking
it with your morning coffee and
that really started making
activation a bit more 3D for me.
That it's not just this one simple
step, it's not just one action.
It's much more impacted by
everything that happens upfront.
And I think that's also like now when I'm,
uh, advising and consulting with apps,
it's like so much more layered than in
e-comm because we can measure this data.
And that's also, to be honest, I think
why I love it so much more these days
is because I'm like, gimme the data.
Let me see the insights.
And there's, you know, there's
different layers and moments
that make up activation.
So I think it's just gone from
2D to almost 3D, 4D in
terms of my thinking of it,
and become a lot more nuanced
than just, hey, they actually
used the product or signed up.
Jacob: And, you know, that's the huge
difference from e-commerce to subscription
apps, where you really have user behavior,
kind of, what are people doing in the product?
How do they interact with things?
How do they perceive things?
Um, do you think there were,
you know, mistakes early on
that you think others can avoid?
Daphne Tideman: I think the main
mistake is oversimplifying it or just
assuming you know what action is best.
I think sometimes we have, this is
what they should do versus this is
what actually helps them activate.
And I think the other thing is,
because we want it as early
on as possible, we sometimes choose
something that has a lower correlation
or a lower predictive power.
So I think that's also something where
the easiest metric isn't always the
best metric, and obviously you want
it to be easy enough to measure.
You don't wanna overcomplicate it and make
it a combination of 10 different actions
that aren't realistic, or a subsection.
But what I would do now is much more
about understanding the nuances:
what is the right kind of volume, the right
kind of action, and when, versus just
treating it as only the action.
I also didn't consider enough
the importance of the timing of these
things. Sooner isn't always better,
for sure, but it does have an
impact in terms of how much
that actually predicts retention.
Jacob: Yeah.
Yeah.
And so I guess it's a good
segue: what made you realize that
the standard activation metrics
everyone was tracking weren't actually
the things that mattered,
where you talk about predictive power?
How did you kind of figure that out?
Daphne Tideman: Yeah, so I do a lot
of growth audits, and I absolutely
love them because I get to
go in and be nosy about all their data.
It's great.
I love diving into all the
different data and seeing everything,
and there'll be these cases where
I was like, hmm, onboarding
looks good, trial starts look good,
there's decent feature adoption,
but why are people not retaining?
The price isn't crazy.
And it made me start to realize
that this idea of, hey, if
users do this versus they didn't do
this, they seem to retain better,
was just oversimplifying.
Those metrics could still look good,
but you could still have a potential
activation issue, because it could be that
they are taking action but not enough.
It could be that they are taking the
wrong actions, or they're not getting into
the habit of actually using the product.
And I think that's what made me
realize that we sometimes correlate
behaviors and we're like,
oh, there's a correlation with this.
If they do this, they're retaining better,
so that must be the activation.
But that doesn't mean it's causation.
Any action will almost
always beat no action:
if they're taking some kind of
action, that retention curve will
always look better, unless it's a terrible
action, but most of the time.
So it's finding that sweet spot
between volume and impact
of that action, and which actions matter
more, and also understanding your users.
I realized it wasn't predicting
what mattered when the metrics that
said activation should be okay
were good, but we were still not retaining
users: they were dropping off early on,
in the first 30 days, and not later on.
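The comparison Daphne describes, whether users who took a candidate activation action retained better than those who didn't, can be sketched with toy data. The field names, numbers, and the 60/20 split below are all illustrative assumptions, not figures from the episode, and as she notes, a positive gap is correlation, not proof of causation:

```python
# Hedged sketch: compare day-30 retention for users who did vs didn't
# take a candidate activation action. All data here is made up.
from dataclasses import dataclass

@dataclass
class User:
    did_action: bool      # took the candidate action in week 1
    retained_day30: bool  # still active at day 30

def retention_rate(users, did_action):
    # Share of the chosen cohort that was still active at day 30
    cohort = [u for u in users if u.did_action == did_action]
    return sum(u.retained_day30 for u in cohort) / len(cohort)

# Toy data: action-takers retain better in this sample
users = (
    [User(True, True)] * 60 + [User(True, False)] * 40 +
    [User(False, True)] * 20 + [User(False, False)] * 80
)

lift = retention_rate(users, True) - retention_rate(users, False)
print(f"with action:    {retention_rate(users, True):.0%}")   # 60%
print(f"without action: {retention_rate(users, False):.0%}")  # 20%
print(f"lift:           {lift:+.0%}")                         # +40%
```

In practice you would run this per action, per cohort, against more than one downstream metric, which is exactly the oversimplification trap she warns about.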
Jacob: Yeah, I think from my experience,
a lot of teams maybe go, oh,
onboarding completion. They, they
Daphne Tideman: Yeah.
Jacob: they're activated.
Right.
Um, and, and that's not always the case.
I know you were talking about a case
where, you know, onboarding completion
for both iOS and Android
was over 90%, but then when you looked
at the data, you were saying
that most users were gone by day two.
Daphne Tideman: Yeah, so sometimes
you have like an onboarding that's
like very short and quick to click
through, and that was the case there.
It was like a very easy
onboarding, low cognitive load.
You know, answer a few simple
questions and like, I think
people were almost completing it.
'cause they're like, what is this?
How does it work?
I'm curious.
Like, so the desire was there,
but then they were getting to the end
and they were like, uh, now what?
And it was actually
quite a complex product.
So having this short onboarding was
actually doing them a disservice,
because yeah, they
were getting the insights as to who
these people were, but it wasn't
actually helping people to that point.
And then it was really overwhelming
for those people to try and
do their whole setup.
Like they'd go through the
onboarding and be like, okay, do
you wanna, you know, start a trial?
Some people would start a trial
just outta curiosity, but then they
still had to do so much that it was
almost like, okay, just leave it.
And I think we need to keep in mind
that completing questions
doesn't mean experiencing value.
Too many onboardings I see are
like just a few questions and
I'm just like, okay, great.
You know who they are.
Or there's one question, and
they're like, we've personalized
our experience based on you.
I literally saw an app do this
today when I was testing it.
I was like, you haven't personalized it.
Personalized is not one
question with five options.
We try to rush and get that metric up,
but the result of it is we aren't
actually helping them feel it.
So what we ended up doing is making
that onboarding a lot longer, building
some of those steps in before
getting them to the paywall so they
could start to feel the value.
Completion dropped, but
retention improved as a result.
Jacob: Yeah, it's kind of
like, um, you know, when you're
looking at and thinking about
email marketing, you
Daphne Tideman: Yeah.
Jacob: you know, click
rates or open rates.
Daphne Tideman: Yeah.
Jacob: these are all vanity metrics.
Right?
And I
Daphne Tideman: Yeah.
Jacob: know, a lot of people understand
that concept of a vanity metric, and
sometimes onboarding completion is a
similar vanity metric, where
completing onboarding isn't the goal.
Daphne Tideman: No, exactly.
It's a means to an end.
And even trial starts, in my opinion,
are also a bit of a vanity metric,
'cause sometimes you try to make
it so easy and attractive to start a trial.
Like, oh, if we make the trial longer,
trial starts go up.
Okay, let's do that.
But I've had a case where,
you know, we made it longer, trial starts
went up, but the actual number of people
who converted in the end was lower, because
people just procrastinated and were
like, oh, I've got longer to try this out.
And then they don't try it out at all.
Jacob: How do you think about like
what that right activation metric is?
Of course, it's different for every
app and, and you're trying to balance
this activation versus monetization
where so many subscription apps, uh,
you are starting a trial before they're
actually experiencing the product.
How do you think about that
activation versus monetization balance?
Is activation secondary?
Yeah, how do you think about that?
Daphne Tideman: So I would argue that
in most cases if I see a monetization
and an activation problem, I would go
for the activation problem because the
monetization will more naturally follow.
Now, there's exceptions to that where
maybe the monetization problem is a
pricing or package mismatch, but most
of the time, if I then go into the data
of, hey, why are people churning
or complaining,
it's often not enough usage,
and that's another reason
I'm like, hey, let's start
with the activation side of it.
What I would say is like, there's
a difference, and this is also more
relevant, I would say also for freemium
apps, is like there's a difference between
an activated user that's paying versus
an activated user that's not paying.
So if you wanna make sure that improving
activation is driving monetization,
you wanna really focus in on those
paying users.
If you wanna use activation to
drive more referral and drive
a bigger user base, then you're
focusing more on those free users.
And this is oversimplified,
but that, that would be the
general kind of feeling of that.
And then I would also look at speaking
to those users who have paid for a
while, who have retained for a while, and
try to understand what their journey
was and what they were doing in the
beginning, to really make sure that
we're driving the right kind of users
and that we are
seeing activation as a thing
that's going to drive monetization.
Jacob: Yeah, they're
connected in the sense that an activation
input, if users are successfully
understanding the value of the product,
Daphne Tideman: Yeah,
Jacob: usually correlates with better
monetization, better conversion rates.
Yeah, I think that
Daphne Tideman: yeah.
I've almost always seen a positive
impact on monetization as a
result of the activation, because
it's having a ripple effect
on retention, on your ability to
win people back, on willingness to pay.
And because I also
treat it as part of the onboarding,
like my job is to activate them
even before they've signed up for a
trial or a paid account,
you then end up seeing
more people actually starting
a trial or a paid account.
And so it, it usually does have a really
positive impact if you work on that.
And then if you've done that and
monetization is still on the lower end,
or there's some fundamental flaw in the
setup, that's when I would start to focus
a bit more on the monetization side.
Jacob: Yeah.
Do you think about, um, you
know, it sounds like you're
talking about activation as
pretty much the whole
Daphne Tideman: Yeah.
Jacob: user experience.
Daphne Tideman: Yeah.
Jacob: And so do you think about
incorporating some aha moments in that
onboarding flow before, you know, a
paywall, and then trying to continue that
activation journey after the paywall?
Daphne Tideman: Yeah, so I think that's
one of the biggest pushbacks: oh,
it's really hard to have an aha moment
in the onboarding, because
from a technical perspective
it can be difficult to do that.
For a workout app, for example,
trying to build that into the
onboarding, hoping that they're at
a moment where they can do a workout,
that can be quite tricky.
So what I try to do is not
treat aha moments as one big thing,
but think of mini moments:
understanding whether or not they
see the value in a smaller way through
a smaller action, or whether they
have a feeling of, okay,
I can see myself in this and this
is a good app for me.
So, for example, Ladder I think
does this really well, because in
their ads they show you
that outcome of what you wanna
see as a result of following them.
So you already have a little
aha moment of, hey, I see
myself in this person, and this is
my problem and what I wanna achieve.
And then in the onboarding, as
you're filling in, hey,
this is how often I work out,
this is what I do,
it's super simple.
It's the same screen for everyone,
'cause I've extensively
tested that onboarding.
But it's just like, oh, Ladder
was made for people like you.
And so I see those as mini
aha moments that you wanna build into
your ad and also your onboarding.
I think Opal, a focus app, also does a really
cool one of those, though I think
it's a bit too gimmicky in my opinion.
You block your
first thing, or you do this first
action, and you get this giant gem
exploding, and it's meant to make you
feel good in that onboarding.
And that's not the actual usage
of it as a focus-blocking app.
It's just already giving you a
feeling of, hey, I'm starting
to take control of my focus.
Um, so I think yes, you have to have
those moments, but you have to have
like smaller, more frequent moments
rather than getting caught up on we
have to get them to do the big thing.
Jacob: I like that, that concept of
multiple moments and it's, you don't
have to accomplish it all at once.
It builds up.
Um,
Daphne Tideman: Yeah.
Jacob: And I wanna go back to something:
you talked about
bringing the aha moment into ads.
I think that's a
super cool concept that we
don't usually think about.
We think about activation and aha
moments as product-specific.
How do you do this well?
How do you bring that into the ad?
Daphne Tideman: So I think it's
showing the job to be done in the
creative and showing the transformation.
I think that's one of the most
powerful ways of doing that.
An example I use a lot is
Mimo, the coding app.
Someone who was working there said that
one of their best-performing ads was
one where someone was just showing
that they were coding in a lift.
And so you can see then, hey,
they're coding in a lift.
They're actually finding time to do this.
This is what I've been
struggling to find time to do.
So I think that's a really
good example of showing
them what that outcome could look like.
And if that outcome is less
tangible or harder to see,
can you have someone talk about it, or
can you maybe have, for example,
a testimonial which talks
about, hey, this is what I had?
Can you show a before and after?
Can you show some kind of
form of the difference?
That makes them trust in the
process that you are gonna
give them that transformation.
Because every single app we use,
at the end of the day, we have
something we wanna achieve with them.
And our biggest hesitation with, should
I pay for this, is 'cause we aren't
sure if it's going to do it for us.
We aren't sure if we can trust
them to deliver on the value.
And when you show upfront in the ad
already we can deliver on this value
and we managed to do that for others,
that's when you start to be willing to
like take more time to get to know them.
Jacob: Yeah, I think that's a good
concept to think about: it boils
down to simple human emotions,
like trust.
I guess, is trust an emotion?
I don't know.
But it kind of is.
Daphne Tideman: Anxiety without trust.
Anxiety without it,
Jacob: The opposite.
Daphne Tideman: yeah.
Jacob: But can you
make people trust your product?
Can you make people trust you
can actually deliver on that?
Daphne Tideman: Yeah,
Jacob: Can you make
them feel something?
Daphne Tideman: yeah,
Jacob: Can they picture themselves using
the product before actually using it?
Daphne Tideman: yeah.
And I think which angle you take
and how you do that effectively
really depends on what emotions
your audience is going
through in that moment.
And I harp on about doing user
interviews because there's no way
you're gonna find this from just a survey.
I've had such emotional user
interviews where someone's opened up and
talked about what they were struggling with.
'Cause I work with wellness,
so a lot of the time,
you know, they might actually be
struggling with chronic
anxiety or insomnia, and what
caused that is quite emotional.
And then asking them about like,
Hey, what are, you know, when
you're looking for these solutions?
What, what were you feeling?
From the emotions they're feeling,
you can understand
how to then give them that reassurance.
'Cause I think sometimes we're
like, oh, we just need to give them
social proof, or, oh, we just need to
show them, you know, it works.
But actually understanding which
emotions are holding them back helps you
create more of that aha moment, because
you're speaking to
what they're struggling with.
Jacob: Yeah, I think I, I say
this a lot, but you know, humans
are emotional creatures kind
of pretending to be rational.
Daphne Tideman: Yeah.
Jacob: Uh, you know, understand
the psychology, really
understand your users.
What are you solving for them?
What are they feeling?
And me included, I love the
numbers, I love the quantitative side,
but at the end of the day, you know, it's
people on the other side of the screen.
They don't care about the conversion
rates and metrics; they have something
they're trying to figure out
Daphne Tideman: Yeah.
Jacob: that ties to a real need.
And, and I think that's such an
important thing that, that is often
forgotten when we're going, oh, did you
improve your trial to start rate by 5%?
And, and yeah, so I I, I like that a lot.
Daphne Tideman: Exactly.
It's easy to get caught up on
the quick things, but it doesn't
have to be time consuming to go
out there and talk to people.
I've often done it in a week and
it, yeah, it's always been impactful
and made major changes as a result.
Jacob: I think, to double down
on "it doesn't have to be time consuming":
people have said that after about five
user interviews, you probably have
a good spectrum of what pretty much
everybody is saying, that you don't need tons
Daphne Tideman: Yeah,
Jacob: and tons and you
can get that information,
Daphne Tideman: exactly.
Jacob: pretty quickly.
Yeah.
So, you know, ultimately,
activation isn't what we're
trying to solve; we're trying to solve
retention and usage and long-term habits.
I know that you mentioned that a lot
of times you started telling teams
that your, your retention problem
is actually an activation problem.
Uh, and you know, we think about
uh, we need better retention for
our app, but, you know, retention
starts at the very beginning.
What were their
Daphne Tideman: Yeah.
Jacob: reactions when, when
you started telling them that?
Daphne Tideman: Yeah.
I think sometimes
they're open to it, of course.
And there's other times where they
kind of feel like, oh, we still
need to get this feature in place
first. It's almost like, no,
this feature will solve it.
But a feature rarely solves activation; it
just feels more tangible than saying, hey, we
have an activation problem, or we need to optimize
what we have or optimize the journey.
And that's why a feature
rarely solves it, because it's
often more about the journey.
Then the other thing is
what we've talked about earlier:
oh, but the onboarding's good,
or we're getting a lot of people
starting a trial.
I've even seen
an app I'm looking at right now,
they have a really
strong trial start and
trial-to-paid completion rate, but
they're still losing people in that
first month, so we're still thinking,
hey, there might be something
wrong here around activation.
And I think that's the thing:
they think, oh, if these metrics
are good, it can't be that.
And then the other thing is,
it's not even that they don't
wanna focus on activation
instead of retention.
My most common one is the
shiny thing of acquisition.
That's what I'm fighting, because
everyone wants to focus on acquisition.
It's like, no, we just need more users.
And then I do the calculations:
you're acquiring thousands of users.
If we could just get 20% more
of those to actually take
action, you don't need more users.
And even better, your cost
of acquisition goes down.
How much you're earning on them goes up.
It'll make acquisition easier.
I think that's the biggest thing
that I have to like fight with, is
they just wanna focus on like the
ads and the meta side of it or like
whatever channel they're focusing on.
It's often Meta.
And I have to fight with that to say,
hey, we do need to focus on activation.
And then there's one final thing
that I think no one talks about enough,
in my opinion: why do we call it day-two
retention and day-seven retention?
It's a really weird phrase, because
you are not actually retaining anyone at
that point; you're still activating them.
For most apps I work with, activation
is probably a seven-to-30-day
thing of getting them into the habit
of using it and really truly
getting them committed to the app.
But we're caught up talking about
retention, so of course people
think it's a retention problem if
you're losing people at day two.
Jacob: Yeah, it goes back
to kind of the quantitative:
something you could understand,
something you could put on paper,
versus understanding what people
are truly experiencing, how people
are, you know, using your product.
And I think there is that gap.
And so, you know, this
connects to how you're talking about
the difference between time to first value
Daphne Tideman: Yeah.
Jacob: versus time to core value as part
Daphne Tideman: Yeah.
Jacob: of activation.
Can you tell me more about
that, that framework?
Daphne Tideman: Yeah, so that's really
inspired by the SaaS
product-led growth space.
I think Wes Bush talks about a
similar concept in his book,
but I'm not sure if he's the one
who originally came up with it, 'cause
it's been cited in different places.
But I think it's an interesting
framework. I just don't
want to claim credit for
something that I didn't invent.
The time to first value,
that's kind of that initial thing, what
we were talking about earlier, those mini
little aha moments where users
might not have gotten the full value,
but they see that the app could help them.
It's like, this is for me, this
is something that could help me.
It's what Noom
does with a personalized plan and
reassuring you throughout.
Noom is a weight loss app,
for those not familiar.
Um, they give you a personalized plan.
They reassure you, uh, throughout it.
They ask you questions
and teach you things.
And those are little
moments of perceived value.
And they could be action based, but they
could also just be learning something.
And we want that time to first
value to happen in the first session,
and ideally quite quickly, so that
we've built enough trust to keep them going.
And then the time to core
value, that does take longer.
I think for most apps it probably
does take a few days to get there.
And remember, I'm working with
wellness apps, so the time
to core value is probably
slightly longer than for some other apps.
For some apps it could be on day
one already, but usually
it's that the user is doing something
repetitively and feeling that benefit.
So they've actually committed to doing,
you know, a meditation or two already,
are starting to notice a
difference, and they're really
starting to build a habit around it.
So, for example, for Blinkist,
the first value could be, oh, you see that
you could actually read a book and feel
convinced, like, hey, I can actually read
a summary and I can make time for this.
It's really easy to use,
fits into my lifestyle,
there's a book that I wanna read, and then
you're starting to get excited about it.
But the time to core value might
be either finishing that first book
or having a three-day streak. And what
that core value is, or how many times
that is, is a little bit dependent on
what actually predicts that retention.
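The two timings she distinguishes can be computed from a per-user event log. A minimal sketch follows, where the event names, the sample Blinkist-style data, and the choice of "finish a book" as the core-value event are all illustrative assumptions:

```python
# Hedged sketch: time to first value vs time to core value, computed
# from one user's event log. Event names and data are made up.
FIRST_VALUE_EVENTS = {"finished_summary"}  # a mini aha moment
CORE_VALUE_EVENTS = {"finished_book"}      # the deeper, habit-forming value

def days_to(events, targets):
    """Days from signup to the first matching event, or None if never."""
    days = [day for day, name in events if name in targets]
    return min(days) if days else None

# (days_since_signup, event_name) for one hypothetical user
events = [(0, "signup"), (0, "finished_summary"),
          (2, "finished_summary"), (6, "finished_book")]

print("time to first value:", days_to(events, FIRST_VALUE_EVENTS))  # 0
print("time to core value:", days_to(events, CORE_VALUE_EVENTS))    # 6
```

Aggregated across users, the distribution of these two numbers is what tells you, per her point, whether first value lands in the first session and how long core value really takes.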
Jacob: Do you think, um, breaking
those different moments up
allows teams to be more focused?
Daphne Tideman: I think it makes it
easier to understand that you don't
have to have a big thing in the
first moment, and it also makes it
easier to actually measure the timing
of these things and see, hey,
if the time to core value is too long,
we probably need to get them to the
first value a bit quicker.
Because usually what's taking
too long is that they're not seeing the first
value, and we need to shorten that down.
And I think it also helps them
kind of break it up into stages versus
seeing it as one flat metric.
'Cause I think that's the
other challenge: it isn't just one metric.
In activation there's often
a first step and a second step.
And that's what I meant early
on with that 2D versus 3D
thinking, realizing it
isn't just, hey, we need them to
do one little thing and then they'll retain.
No, it's usually a few different things
and getting them into, you know, a setup
where they are actually gonna use it.
So, for example, for a time-blocking
app, the first value
might be actually seeing, hey,
I've managed to block everything,
get myself set up, this is exciting.
Like, I can choose exactly
what I've been struggling with;
I feel like it will actually help me.
But then the core value might be
having like a first full session,
or it might be having two sessions.
So I think having those different
moments helps you understand where you're
losing people and what the issue is.
Jacob: I've often heard, uh, and
kind of used this myself in terms
of like a setup moment, uh, aha
moment or activation and then
Daphne Tideman: Yeah,
Jacob: moment,
Daphne Tideman: yeah, yeah.
That's another phrase that's used often.
Jacob: And so we talked a lot about
how to think about activation
metrics and how to influence this, but
how do you actually test whether
an activation metric is real or
vanity, whether it's actually influencing anything?
Daphne Tideman: Yeah, so this is
where having good data tracking
is kind of the prerequisite.
I think one of the biggest
challenges I see with this is that
if we are not tracking the right
things and we can't backtrack
it, then it's gonna take a while
before we can get this confidence.
But what I'm looking for is,
okay, activated users are
retaining significantly better.
When we test that across different
cohorts and channels, we understand that,
hey, this is consistent across most
of the cohorts, most of the channels.
It isn't just one that's high,
because sometimes you have it, for
example, that users who come
through a strong organic source,
like an influencer that they trust,
might naturally retain
better or do a certain action
because they've been told to do that.
But it's not predictive of all users.
And then I'll look at, okay,
does this metric actually
also improve the downstream metrics?
Like what we also talked about:
if they're activated and using
but they're not paying, that's
probably not going to be helpful.
So I think those are kind
of the things that I'll look for.
And then, without going on
about user interviews again,
I would say they're also very
powerful for kind of sense checking:
is that also, in the user's mind,
what matters to them?
Is that actually what
they perceive as value?
Because I think there's a difference
between what you need someone to
do to get value out of your app
versus what someone sees as value.
Both are important, but
I would consider a good
activation metric to be also
reflective of them getting value,
not just you getting them set up.
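The consistency check she describes, whether the activated-vs-not retention gap holds across channels rather than in one strong organic cohort, might look like the sketch below. The channel names and toy numbers are assumptions for illustration:

```python
# Hedged sketch: check that a candidate activation metric's retention
# lift is consistent across channels. All data here is made up.

# (channel, activated, retained_day30) per user — toy sample
users = [
    ("influencer", True, True), ("influencer", True, True),
    ("influencer", False, True), ("influencer", False, False),
    ("paid_social", True, True), ("paid_social", True, False),
    ("paid_social", False, False), ("paid_social", False, False),
]

def retention_by(users, channel, activated):
    # Day-30 retention rate for one channel/activation group
    group = [r for c, a, r in users if c == channel and a == activated]
    return sum(group) / len(group) if group else None

for ch in sorted({c for c, _, _ in users}):
    lift = retention_by(users, ch, True) - retention_by(users, ch, False)
    # A positive lift in *every* channel is the signal to look for;
    # a lift driven by one channel alone suggests a confound.
    print(ch, f"{lift:+.0%}")
```

With real data you would also want statistically meaningful cohort sizes per channel before trusting any per-channel lift.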
Jacob: How do you, how do you start,
like where, where do you start here?
Like in terms of, is it
just kind of throwing some
different metrics at the wall?
Do you have a good formula for kind
of, uh, starting with these hypotheses?
Daphne Tideman: I think it depends
on how much data you have already,
how much you're tracking, and um,
how much confidence you already
have in understanding this.
If you don't have much data yet, I
wouldn't go and just track everything
and then hope that one of them predicts.
I would start with the user
interviews versus if you're
already tracking quite a bit.
What I would basically do
is a lot of comparisons:
if they do behavior one versus
behavior two, which retains better?
Okay, behavior one seems to be more predictive
against all the other behaviors we check.
Okay, if they do behavior one in seven days
versus 14 days, does that matter?
Trying to work out what is actually
showing a difference in those downstream
metrics will usually help you
narrow that down, and then,
if you're not sure enough or you haven't
got that insight, you can validate that
with user interviews.
Versus when you don't have the data,
it's almost like you'd have to
track everything and then wait and see.
And that's really going
to take a lot of time.
And that's not realistic, 'cause I
work with a lot of startups and
scale-ups who don't have that luxury
of data or time, um, or the length
of data history they need to see
if it's predicting retention.
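A rough sketch of the comparison Daphne describes, for anyone who wants to try it on their own data. The behavior names, toy data, and thresholds are illustrative assumptions, not anything from the episode:

```python
# Sketch of comparing candidate activation behaviors by downstream
# retention. The behavior names and toy data are illustrative
# assumptions, not real product data.
users = [
    # (user_id, behaviors done in the first 7 days, retained at day 30?)
    ("u1", {"completed_onboarding", "added_friend"}, True),
    ("u2", {"completed_onboarding"}, False),
    ("u3", {"added_friend", "created_project"}, True),
    ("u4", set(), False),
    ("u5", {"created_project"}, True),
    ("u6", {"completed_onboarding"}, False),
]

def retention_lift(behavior):
    """Day-30 retention rate for users who did vs didn't do the behavior."""
    did = [r for _, b, r in users if behavior in b]
    didnt = [r for _, b, r in users if behavior not in b]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(did), rate(didnt)

for behavior in ("completed_onboarding", "added_friend", "created_project"):
    with_b, without_b = retention_lift(behavior)
    print(f"{behavior}: {with_b:.0%} with vs {without_b:.0%} without")
```

The behavior whose with-versus-without gap is largest is the strongest candidate; the same loop can be rerun with a 7-day versus 14-day window to test whether timing matters.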
In those cases, what I would actually
do then is really double down on
those user interviews and in, based on
more qualitative data, define what we
believe the activation metric will be.
Make sure we're tracking that, and some
of our other high-confidence candidates,
so that we can start gathering
data. And then I would intentionally
try to see: can we learn faster
who is actually activating or not?
And in this case, also kind of a
controversial take, but
I actually think monthly subscriptions
are way better than annual subscriptions
because then you're forcing them to,
um, for startups, not for all, but
then you're forcing them to think
every month, whether or not they wanna
renew, which will then help you learn
the difference of like those who do
renew and those who don't renew because
it's when they put their money where
their mouth is that you're learning is
this actually, you know, someone who's
activating and willing to pay versus
saying that this is what matters to
them, but doing something different.
Jacob: 'cause they have to make
that decision every single month
Daphne Tideman: Yeah.
Jacob: is the product delivering
enough value for me to keep
Daphne Tideman: Yeah,
Jacob: it?
Yeah.
That's, that's super interesting.
And so, um, we start with maybe
user research, or kind of
throw in some different events
Daphne Tideman: yeah,
Jacob: and maybe Mixpanel or Amplitude.
Daphne Tideman: yeah.
Jacob: and see, does this have any
correlation with, uh, retention?
So we talked about, or you
mentioned this before, in terms
of correlation versus causation.
How do we get from metrics that
Daphne Tideman: Yeah.
Jacob: look like they're correlated
with retention to actual causation?
Daphne Tideman: Yeah, it's a tricky
one, because there'll always be
some kind of correlation, I think.
When we see, on a big enough
data group, that this does seem
to be predictive, it also has to
match, to be honest, a bit of
common sense. Like, hey, obviously
if they do a trial and completely
finish onboarding, they're more
likely to retain. But what is beyond
that common sense? Like, okay, we
may have not expected that, or that
is someone taking an active action
rather than a passive kind of metric.
Um, I think that's a key one.
And then I would also say trying to
understand, at different scales, how that
impacts things also helps reduce that risk.
So for example, for one app where
forming friendships, as in
connecting with people as
friends, is really important.
What we saw was, okay, one friend
increases retention. Three friends
increases it further, then five
friends, ten friends. But looking
at the differences, we don't
need them to have ten friends,
and the cohort that does
that is very small.
So we're not trying to chase
just, you know, the 2% that
does happen to get to ten.
We know that, hey, actually the sweet
spot is around three to five friends.
If they get that, they're
more likely to retain.
It's also realistic for a big enough
group, but there's room for improvement.
So that's another sense check that
I would do: balancing those two things, and
then also sense checking like is it even
realistic to expect this of someone?
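The friends sweet-spot reasoning above could be sketched roughly like this. The thresholds, reach, and retention numbers are made up to mirror the example, and the cutoffs in `sweet_spot` are assumptions, not the app's real analysis:

```python
# Sketch: pick an activation threshold by trading off retention lift
# against how many users realistically reach it. Toy numbers only.
cohorts = [
    # (friend_threshold, share_of_users_reaching_it, day-30 retention)
    (1, 0.60, 0.30),
    (3, 0.35, 0.45),
    (5, 0.20, 0.48),
    (10, 0.02, 0.50),
]

def sweet_spot(cohorts, min_gain=0.05, min_reach=0.10):
    """Smallest threshold after which extra friends add under `min_gain`
    retention, provided a realistic share of users still reaches it."""
    for (t1, reach1, r1), (_, _, r2) in zip(cohorts, cohorts[1:]):
        if r2 - r1 < min_gain and reach1 >= min_reach:
            return t1
    return cohorts[-1][0]

for threshold, reach, retention in cohorts:
    print(f">= {threshold} friends: reach {reach:.0%}, retention {retention:.0%}")
print("sweet spot candidate:", sweet_spot(cohorts))
```

With these toy numbers the marginal retention gain flattens after three friends while reach is still meaningful, which is the "three to five friends, realistic for a big enough group" sense check in code form.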
Jacob: Yeah, so a bit of
quantitative analysis, also
common sense, um, what is
really realistic for users to do.
And then you're talking about, can
we get a high enough percentage of
new users to do this
Daphne Tideman: Yeah.
Jacob: behavior.
And I would imagine also, simply,
if we move this metric
Daphne Tideman: Yeah,
Jacob: or if we launch
experiments that try to move this
Daphne Tideman: yeah,
Jacob: does retention improve?
Daphne Tideman: yeah.
Exactly.
And I think that's when like you also
learn a lot from it in terms of like
once you actually launch those tests.
Is this actually making a difference?
Like I was talking about earlier
with someone, where it was like,
hmm, are quarterly plans
actually cannibalizing annuals?
Well, let's first find out if
people actually know which duration
to go with, and if we do tests
around explaining like, Hey, this
is when you need an annual plan.
This is when you need a quarterly.
'cause my hypothesis was people don't
know which package to go with because
we aren't guiding them in terms of
how long to expect to use the app.
I was like, well, if we do one or two
experiments around that and nothing
is actually having a positive or
negative impact on which packages
they're going for, then we could
potentially test and see, like, well,
what happens when we remove quarterly?
But just removing quarterly upfront
isn't actually gonna solve it because
we don't actually know if the reason
they aren't going for annuals is
because of the quarterly or because
they aren't understanding the value of it.
And we need to get that
value across first.
And I think you learn very quickly
with, you know, almost one or two,
like initial experiments, whether or
not you seem to be on the right track.
Because even if it isn't the right
experiment, if it's causing some kind
of drop, that also tells you a lot:
maybe you haven't got the right thing,
but it is impacting them. Versus a few
non-significant results, assuming you
have enough data and you've done all
the pre-calculations, might indicate
this isn't what's wrong or what matters.
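The "precalculating" mentioned here usually means a sample-size calculation before launching the test. A minimal sketch using the standard two-proportion normal approximation; the z-values are hardcoded for a two-sided 5% significance level and 80% power, which are assumed defaults rather than anything from the episode:

```python
import math

def sample_size_per_arm(p_base, mde):
    """Approximate users needed per variant to detect an absolute lift
    of `mde` over a baseline conversion rate `p_base`, at two-sided
    alpha = 0.05 (z = 1.96) and 80% power (z = 0.84)."""
    z_alpha, z_beta = 1.96, 0.84
    p1, p2 = p_base, p_base + mde
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / mde ** 2
    return math.ceil(n)

# Detecting a 2-point lift on a 10% baseline takes a few thousand
# users per arm; halving the detectable lift roughly quadruples it.
print(sample_size_per_arm(0.10, 0.02))
```

Running a test without this check is how "non-significant" results end up uninterpretable: with too few users, a null result says nothing either way.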
Jacob: I think I often, and I think other
people often, get caught up in the, you
know, fancy quantitative analyses of,
Daphne Tideman: Yeah.
Jacob: oh, well it looks like the
data says this, that this is gonna
happen if this blah, blah, blah,
Daphne Tideman: Yeah.
Jacob: blah.
At the end of the day, you don't
really know if your hypothesis is
true or not until you actually test
it and learn and figure it out.
And that's the point: we keep in
mind that experiments are mainly
for learning, that you wanna improve
Daphne Tideman: Yeah.
Jacob: Sure.
But the most important thing is
are you learning, uh, you know,
about your users and, and your
Daphne Tideman: Yeah.
Yeah.
The only time it's like, oh, let's do
more of that upfront, is when you're
really small and you don't have the
data to run it as an experiment,
then sure, spend a bit more time
being more confident upfront.
But when you have the luxury of
data, there has to be that switch
point where you have a strong enough
hypothesis and you just go out and
test it, because most of those
experiments will fail and that's fine.
But like I said, I learn just as
much from a loss as from a win.
And if anything, even a
non-significant result, like, I know
everyone hates on them, but, um, that
teaches me a lot too, because assuming
I've set it all up correctly, it
means that apparently this is not
something people are bothered
by or that's having an impact.
Jacob: Yep.
Yep.
I think usually you learn
more from a failed experiment
than, than a successful one.
Yeah.
Daphne Tideman: From the learnings.
Jacob: So cool.
So if someone sees they have a
retention problem, but suspects it
might actually be an activation issue,
How do they, how do they test this?
How do they, how do they figure this out?
Daphne Tideman: Yeah.
So I would then start to look at the
retention curves and see, okay,
where are we losing people and at which
point are they actually, uh, dropping off?
Um, and also look at this per
subscription type, if there are different
ones, and also Android versus
iOS, to try and understand, compared
to benchmarks and experience:
We're always gonna lose more
in the beginning for sure.
Um, but is it more than what we'd expect?
If someone has, let's say, only annual
plans or lifetime subscriptions or
something like that, you can't really
say much from that, because, like we
talked about earlier, by that point,
after a year, is that still
an activation problem?
Um, what I would then do is try to
understand: are people actually
active within that subscription?
And to what point do
they seem to stay active?
So I would almost look at it as,
okay, are they active in the
beginning but then losing it?
Well, that's probably more
of a retention problem.
Or do they never take action to
begin with, haven't been active,
and have essentially always been
dormant users? Then that's
an indication that, okay, this
isn't actually a retention problem.
And then one other thing I would do,
which I think also doesn't get looked at enough,
is if it does look to be a potential
activation problem, I would still also
check the acquisition setup to see
like, hey, what are they optimizing on?
If they are running ads, like in
terms of the top of the funnel goal.
'cause sometimes I think that
can mislead you into thinking you
have an activation problem.
And also if they are focusing on
a lot of lower-quality users and
bringing in a lot of that volume,
muddying the numbers. That's where
segmentation comes in. I always try to
prove myself wrong rather than right,
if that makes sense, and try to rule
out any reason why this could not be
an activation problem but one of the others.
And there are apps where it is a
retention problem, where users were
active in the first month or
so, but they just trickled off.
They stopped using it.
Um, they lost like kind of the
interest and the motivation.
It wasn't new enough or it
wasn't helping them enough.
They weren't seeing a difference.
And that's when I would work on habit,
like keeping that habit going,
and also growing with the user
according to how they're changing
in that time period, to stay relevant.
Jacob: Do you think it's more
frequently an activation problem
versus like long-term retention?
Yeah.
Daphne Tideman: This is why I harp
on about activation because I'm
like, it's just so much more often.
Like, I think like sometimes, for
example, win backs get a lot of attention
'cause they feel really exciting.
Like we would've completely lost them.
And look, we got them back and you
know, these were users who wanted to
stop paying or stopped paying and they
came back and it, it is like, woo-hoo.
But then if you look at the amount of
people at that point, you've lost so many.
And I think, yeah, the percentage
you win back sounds maybe
interesting or fun, but
like if you compare that relative
to the amount of people at the
beginning, I think especially with
the competition these days, there's
so many apps, so many options.
Like I know myself, when I was looking
for a new recipe app, I just
started free trials at the same time,
like, let the best app win, and
tested three different ones.
And I was like, well, this is the one
I actually stuck to and used the most.
That's the one I'm gonna keep.
Jacob: Yeah.
Yeah.
And, and
Daphne Tideman: Maybe I just do that.
Jacob: Yeah, I think probably, uh, you
know, you and I are much more familiar
with all the different app options,
you know, uh, in that kind of way.
Daphne Tideman: Yeah.
Jacob: not everybody does that,
but yeah, when you're talking
about the numbers, it makes
sense, right?
Daphne Tideman: Yeah.
Jacob: If I can have a 5% win on every
single new person in my app, that's
clearly more impactful than a 5% win on
the 15% of users that are there at, you
know, day 180 or whatever it is.
And so it's, you know, very
clear which will usually be
the larger-impact effort.
Daphne Tideman: Yeah, and like I said,
there's definitely cases where it is a
retention problem, and I think where I
see that being a bigger issue is when
users almost outgrow the app or
there's a shorter-term use case.
In which case it's always like, okay,
if they are going for a subscription,
is subscription the right model,
or was it always that they were
shorter-term and we're just not
fitting the model to the use case?
Or can we somehow show that we're
actually growing with them and
adjusting what we're offering to them
based on how their needs are evolving?
Like what is almost like
their next level of growth.
And I think a good example of this
is like with meditation apps like
Headspace and Calm, they're constantly
creating new content, um, and also
speaking to new, more niche use cases.
But what they could do better is
doing more regular check-ins of like,
Hey, is the problem you came to us
with, still actually your problem?
And is this still the
right solution for you?
Or do we need to push you to different
content based on what your needs are?
Jacob: Yeah.
Yeah.
On the flip side, I see, in terms
of the shorter-term value, a lot
of the AI apps, you know, shifting
towards weekly plans, where, okay,
if we know users aren't gonna
stay around very long,
Daphne Tideman: Yeah.
Jacob: we're gonna capture as much
value as we can upfront in the
first few weeks, and then we'll go,
okay, I guess that's our business.
Daphne Tideman: Yeah.
Jacob: so it's not really
a long-term product.
Daphne Tideman: But I think that's
also okay. If it is a short-term
use case, then your retention
is probably also shorter.
Like for example, a dating app.
You don't want an annual dating
app subscription, because that's
basically saying upfront, I don't
think I'm gonna find anyone in the
next year and I'm gonna need this app.
Or, if I find someone, I'm gonna
keep dating. That's a different
use case, that's also fair,
if that's the way you do things.
But it's like you wouldn't go for an
annual with that because it's almost
like setting yourself up for the
idea that I'm not going to succeed.
And so I think retention also
looks different. And that's why I
was also thinking about, oh yeah,
what is the time frame for what
counts as activation versus,
you know, retention?
I do think it really differs per
app because for some, like an AI
tool, let's say take Lovable,
um, you know, you might on the first
day already build a really cool
project and be really proud of that.
You might be activated within day one
and be like, Hey look, I built this
website in like half an hour or like.
Even 10 minutes, and I'm really
proud of this or this prototype.
But your lifetime
might be a bit shorter because you
might not need to build a whole
new website every single day.
And so their retention
framework is very different.
And I think Elena Verna, um, who's
there, I don't know her official title,
so I'm really sorry if I get it wrong.
I think Head of Growth,
um, high up in growth.
Jacob: of marketing.
Something.
Daphne Tideman: So, smashing
it on the growth side.
Um, even talked about how they removed,
or changed, the focus on subscriptions.
'cause they were like, that's
just not matching our use case.
And it was actually annoying people
to try and force them to have a
certain amount of credits per month
because our use case goes in waves.
And in those cases I think we have to
think of activation as building enough
trust that when they need us, they'll come
back and building in little moments or
reminders or like extra use cases to keep
them coming back and wanting to use it.
Jacob: Yep.
Yep.
That makes sense.
Okay.
Last question before we move
into our, our lightning round.
Uh, this is a newly formed
section of the podcast, our
lightning round. So you'll be the
first guest to go through it.
You know, uh, you should be honored.
Um,
Daphne Tideman: Oh yeah.
Jacob: So, what's the
biggest activation mistake
you see subscription apps making
right now that they could, uh,
maybe fix in a single sprint?
Daphne Tideman: Uh, so I'm assuming
we're just talking generally about
any app, and I don't really know
exactly where they're dropping off.
I think the biggest activation mistake
I see most often is not having that
value moment before the paywall, having
that little aha moment in there.
And so that would be what I would try
to fix because like I said, it doesn't
have to be something super technical.
Um, it could just be feeding back in,
Hey, this is exactly who the app is for.
Showing more visually, even through a
video or just screenshots,
what the app looks like and how
it can help them achieve their goal.
That would be where I would probably
look first: hey, can we already be
doing that in the onboarding? Versus
getting caught up on, like, push or
email, all of that, 'cause I just see
those as plasters on the bigger problem.
Jacob: Yeah.
I love that.
I think that's great advice.
Okay.
So
Daphne Tideman: Now, the lightning round.
Jacob: Lightning round, um, get ready.
Uh,
Daphne Tideman: ready?
Yeah.
Jacob: it's the Price Power Podcast.
Uh, I have to ask, what's the
biggest or most interesting pricing
and packaging win you've seen?
Daphne Tideman: Um, I had a really
cool case where an audit client came
back and they said that their yearly
subscriptions had gone from 16% to 39%,
after they'd originally said,
no, we aren't a yearly product.
Um, and that was just a result of
that better onboarding and positioning
of annual in terms of price.
Um, I think that's also such a
quick win in terms of like relative,
like monthly to annual price.
And the best part with that increase
was that overall paid conversion had
also actually doubled from like 6.6%
to 13.7%,
despite also being a freemium app.
Jacob: Wow.
So get your monthly-to-annual
price ratios right,
And, and that could be a, a pretty big
Daphne Tideman: And yeah.
And managing the expectations upfront
of how long you'll need it for.
Jacob: Right.
Right.
Yeah, that, that makes sense.
Um, alright.
Do you have a hot take that
would get pushback, uh,
from other growth advisors?
Daphne Tideman: Uh, oh.
Other growth advisors.
Uh, I was thinking client wise, um.
Jacob: Uh, any
Daphne Tideman: Yeah, I mean, I,
I've had, I've had a few hot takes.
I feel like today, I said a few
controversial things, but I would
say revenue is a terrible, like North
star metric or metric to focus on.
Overall, I still see actually also some
advisors focusing too much on that.
It just makes you optimize for
extracting value rather than creating it.
So that would definitely be a hot take.
I think I even shared it today, because
I just see it still so often that
when I ask them for a North Star
metric, it's like, oh, revenue.
I'm like, no.
Jacob: Yep.
That, I think that's a great one.
Um, all right.
And then last question, what's
the single best activation win
you've seen or, or had yourself.
Daphne Tideman: So this is
definitely recency bias.
Um, I always get excited about the most
recent ones, but this was an advisory
client that, uh, switched their static
onboarding, like having kind of
classic questions, walking you through,
showing you what you can do in the app,
to an AI onboarding where it was
asking you about eight questions.
Um, and I was quite skeptical about
it, but they did some really cool user
testing throughout to really validate
that onboarding and the setup.
And they, uh, basically saw that it took
people three times as long to complete it.
So you'd think it completely
would drop off, but more
people actually completed it.
And the ARPU, which was very low
to begin with, was more
than 10x after day 14.
Um, which, for an audience that's hard
to monetize, Gen Z, was really good to see.
Jacob: Revenue's pretty crazy.
That's
Daphne Tideman: This is the fun of
startups and scale-ups, we get
these kinds of big numbers.
I always love working with smaller
companies 'cause it's so much
more interesting than, like, oh, it
improved 2%, um, with the larger ones.
Jacob: But still that, that's, uh,
Daphne Tideman: Yeah.
Jacob: super interesting and, and
for, you know, probably a strong
signal for others to try out, uh,
kind of AI chat, uh, onboarding
flow, which makes a lot of sense.
Um,
Daphne Tideman: More interactive
onboarding, I think is
the principle behind it.
Jacob: Yeah.
Great advice.
Um, well, this was awesome.
I really appreciate you joining Daphne.
Uh, I think there's so many great
insights, uh, uh, to unpack for everybody.
Anything you would like to promote
or tell people to go check out?
You have your newsletter, anything
else? Uh, we can include links in the
show notes to whatever you want.
But yeah, what do you want
people to go check out?
Daphne Tideman: Yeah, if you, if you
want more content from me, um, my Growth
Ways newsletter is a good place to start.
I share weekly, um, free actionable
advice for startup and scale-up apps.
And then I'm also very active on LinkedIn,
so that's also a great place to, uh,
reach out if you have a question or to get
some little bite-sized pieces of content.
Jacob: Yeah.
Yeah, your newsletter is
great and, and always, uh, uh,
tons of insights on LinkedIn.
So yeah, everybody should go.
Definitely, uh, subscribe and follow.
Daphne Tideman: Thanks
so much for having me.
Jacob: Yeah.
Thank you for coming.
This was great.
All right.
Thanks.
Bye.
Speaker: Thanks for listening.
Hope you enjoyed.
Please go to pricepowerpodcast.com
to see all the episodes.
Go to Spotify and YouTube and
give us a subscribe and follow
so you don't miss any episodes.
Alright, talk to you next time.