How to Use AI to Supercharge Product-Led Growth
Referenced during this episode…
Books
- Product-Led Onboarding by Ramli John
- Growth Hacking for Dummies by Anuj Adhiya
- The Personal MBA by Josh Kaufman
Links
- ProductLed.com (Referenced in relation to PLG strategies and resources)
- ThisWeekInJavaScript.com (Dani Grant mentioned it for weekly JavaScript updates)
SaaS Tools
- Appcues: A tool for product adoption and user onboarding.
- Jam.dev: Used for bug reporting and user feedback.
- PartyClick: Mentioned in the context of event coordination.
- Amplitude: Product analytics tool for retention and user behavior insights.
- June: Analytics tool for understanding user activation and retention.
- Mixpanel: Another product analytics platform.
- Heap Analytics: Mentioned for its diversity-focused AI training features.
- Castmagic: For podcast transcription and content repurposing.
- Opus Clip: A tool for creating video snippets and highlights from longer content.
- Gong: An AI tool for analyzing sales and customer interaction data.
—
AI is changing everything, and product-led growth is no exception. This live panel is designed to help you learn about AI's potential for growth throughout every phase of the user journey.
With AI becoming central to data-driven decision-making, many PMs find it tough to cut through the noise and discover where AI can truly support growth, without overcomplicating their strategy or getting lost in overly technical jargon.
Join us on November 19th at 9am PT / 12pm ET for an exclusive webinar featuring industry experts, as they dive into practical, actionable ways to integrate AI into your PLG strategy. This session will help you cut through the complexity and give you the tools to start leveraging AI where it matters most.
We’re thrilled to have three stellar speakers joining us for this conversation:
- Ramli John: Author of Product-Led Onboarding and renowned expert in PLG strategy, Ramli brings a deep understanding of how to turn prospects into passionate users.
- Dani Grant: Founder of Jam.dev and a former Product Lead, Dani is known for using product data to design seamless, high-impact user experiences.
- Anuj Adhiya: A growth expert and author of Growth Hacking for Dummies, Anuj brings deep experience guiding SaaS companies on implementing product-led growth strategies.
In this session, you’ll learn:
- How AI can amplify each phase of the user journey: from acquisition through retention and expansion.
- Real-world examples from panelists on integrating AI into their PLG strategies.
- Common pitfalls and misconceptions about using AI in PLG—and how to avoid them.
We’ll wrap up with a live Q&A session, giving you a unique chance to get your specific questions answered by the experts. Don’t miss this opportunity to learn directly from leaders who are pioneering AI-driven PLG in SaaS.
[00:00:00] Hannah Clark: Welcome to the latest in our community events series. We're seeing these always grow; we're always seeing more and more folks coming in and joining us for these panels, and they've become a really valuable way for members to engage with our experts and just be part of our community. So we're happy to have you if you've decided to join us today.
For those who don't know me, my name is Hannah Clark. I'm the editor for The Product Manager, and I'll be your host today. Today's session is going to focus on how to use AI to supercharge product-led growth, and we'll be speaking with some amazing voices in the space. We have a really exciting lineup today.
I'm really excited to introduce them. We've got Ramli John, who's the author of Product-Led Onboarding. He's also a renowned expert in PLG strategy, and Ramli brings a deep understanding of how to turn prospects into passionate users. We've got a little Jeopardy question for Ramli today. So Ramli, you've been called the onboarding [00:01:00] wizard by some of the biggest names in SaaS, and your book Product-Led Onboarding has been a game changer for countless product teams.
We're really just pumping your tires right now. If you could wave a magic wand and instantly fix the most common onboarding mistake you've seen, what would it be?
[00:01:14] Ramli John: Thanks so much, Hannah. I would say it's actually not related to the product. The biggest problem around onboarding is often internal friction, not product friction.
And what I mean by that is product does their product thing inside of the product, marketing does their onboarding emails, and then customer success does their customer onboarding thing. And they don't talk to each other. This is what happened while I was working at an onboarding company called Appcues, which is a product adoption software.
And we had that same issue. So if I could wave a magic wand, it would be to get the teams to talk to each other more and agree on what success looks like for the user, which is a hard problem. That's where I would start. Thanks, man.
[00:01:56] Hannah Clark: Siloing. It's just one of those perennial issues.
Well, thanks for that. [00:02:00] We also have Dani Grant joining us. She's the founder of Jam.dev and a former product lead. Dani is known for using product data to design seamless, high-impact user experiences, and she's also just such an awesome person. So Dani, thanks so much for joining us. A question to pass to you.
Jam has skyrocketed to 150,000 users at 32 different Fortune 100 companies, which is incredible. But you're also speaking at events and conferences, and you're active daily on LinkedIn. Honestly, I'm jealous of how you manage to make all this work. So what's your secret to getting all of these things done and staying sane?
If you are staying sane.
[00:02:34] Dani Grant: First, everyone here should just read Ramli's book. It is so good. My co-founder read it first in our company and then told me, you have to read it. I read it. Now it's required reading for our growth product team.
The thing that it will completely change your mind on is what onboarding is.
We all, product managers, we all think about onboarding all the time, but we all think about it as from when you sign up to when you've used the product, and Ramli's [00:03:00] book will show you that onboarding starts a lot before and ends a lot later. And focusing on onboarding in that way will change the outcomes of your product like it has for us.
Anyway, Ramli's brilliant. Read his book. As far as speaking goes, we are so lucky. All of our users are builders. They are out in the world trying to change the world by using software. So we're over here building our company, and our users are over there building their companies. And our job is to share what we're learning building Jam with everyone else building their things.
And so we end up posting learning lessons online and joining things like this. It's such a privilege and an honor. So thank you for having me.
[00:03:42] Hannah Clark: Well, I appreciate you taking the time to hype up another panelist. That's so awesome. See, I told you guys, she's a great person. We also have Anuj Adhiya joining us today.
So Anuj is a growth expert and author of Growth Hacking for Dummies. He's got writing on ProductLed.com, and he's also got deep experience [00:04:00] guiding SaaS companies on implementing product-led growth strategies. So we're really honored to have you here with us today, Anuj. During our pre-call, you shared that you're planning on setting a Guinness World Records record by gathering, and I think this is just so cool, the largest number of people wearing party hats in Boston.
I think we need some more context here. Can you tell us more about what you're doing and how people can get involved? Where do we send the party hats?
[00:04:26] Anuj Adhiya: Right. Thanks for having me. This is such a fun group to be part of. And yes, on the surface it feels like it has nothing to do with product-led growth, but it really does, because it's a large experiment.
So there's this party invite app that I'm consulting with called PartyClick, if you want to go check it out. It's the easiest way to set up an event, and we were just trying to think of ways we could put this in front of more people. A lot of ideas came up, [00:05:00] especially things like, oh, we should do our own celebrity lookalike thing, like the Timothee Chalamet thing that just happened. And I'm like, yeah, that's okay.
But what else can we do? And then somebody on the team was like, don't they have world records for largest gatherings? Yes, that's what we should do. So I went and looked up the Guinness book, and sure enough, there's a world record for people with party hats. I'm like, great, this goes with the name of the product. Party hats, PartyClick. Great. We should just do this. So literally this week, I'm in the process of getting through the application with the Guinness book and begging and pleading with the city of Boston to let me have 2,500 people in Boston Common. So let's see how close I get to achieving that goal.
And the real reason I'm saying this publicly is more to hold myself accountable, and to shame me if I don't make this happen.
[00:05:53] Hannah Clark: So if you're in the Boston area, bring your party hats. Sorry, what was the date again in December?
[00:05:58] Anuj Adhiya: We're thinking December [00:06:00] 22nd. So now you've got it: if you're in the area, you've heard about it, and, you know, if I don't make it happen, you know who to throw the brickbats at.
[00:06:10] Hannah Clark: All right. So we'll get into the discussion. Just to give everybody a little bit of an overview of what we're going to be covering today, we've got three sections to the discussion, and then we're going to be holding some time, like I said before, for questions towards the end. Our first section is going to be setting the stage, a discussion about the intersection of AI and PLG. Then we're going to move into our second section, building the foundation, where we'll walk through the different stages of PLG and talk about AI's role and how it can support each stage of product-led growth.
And the last section before we move on to questions will be from vision to execution, which is going to be all about building your AI-enhanced PLG strategy: some more tactical things you can implement into your own product-led growth strategy, as well as some other tips and tricks that our panelists have found to be helpful.
You're encouraged to [00:07:00] ask questions all the way through the session. We'll be keeping an eye on those questions as they come in and responding to them. If they're relevant to what we're talking about, we'll try and pop them in earlier. Otherwise, we'll be leaving more miscellaneous questions towards the end.
So let's get started, moving on to setting the stage: the intersection of AI and PLG. This question is for Ramli. Just as a general question, has PLG become essential for the SaaS industry in general? We kind of see it as sort of a buzzword. Where are we at with that right now?
[00:07:30] Ramli John: Yeah, I would say it depends. That sucks as an answer, but it depends on how you define product-led. If by product-led you mean having a free trial or freemium, is PLG necessary? I don't think so. I actually suggest startups start with high touch, you know, get close to your customers.
If by product-led you mean removing any unnecessary friction and creating a great experience for users, and that's the definition of PLG we're talking about, then [00:08:00] yes. The bar for what people expect from a product experience has grown exponentially compared to 10 years ago, when something annoying was like, oh, that's normal.
But now it's like, if this is annoying, I have a hundred other options I can jump into. There are a ton more products out there. And I really do believe that creating a great experience for end users, for the people who are going to be using the product, is going to be critical for retention and activation and all the other things we'll touch on later, when we talk about how AI will affect the rest of the funnel.
And if that's the definition of product-led, then yes, I do think it's essential. If you mean, let's add a free trial and freemium, then no, I don't think so.
[00:08:50] Hannah Clark: Ah, a nuanced answer. Okay. Does anyone have anything they want to add about the relevance or state of PLG today?
[00:08:57] Dani Grant: It's so powerful. [00:09:00]
[00:09:01] Anuj Adhiya: I'll just add that I see a lot of founders asking the wrong question.
Because the question they seem to ask is, how can I be more like Slack, when really the question they should be asking is, how can I use my product better to serve the end needs of my customer or my user? And that can take any and many forms. And that opens up far more powerful avenues for implementing a product-led approach than just trying to copy something that's never going to work for you.
[00:09:41] Hannah Clark: Yeah, so true. So let's move into where AI is entering the space. This one is for Dani. How would you say AI is poised to supercharge PLG strategies? I'm not sure if you've got any stories from how you've leveraged it at Jam.
[00:09:57] Dani Grant: We're about to go into all the tactics, but at a [00:10:00] broad level, in PLG, your product does the selling.
And selling is something that is more effective when it's personalized. It takes a lot of content. It takes a lot of convincing. And this is actually something that AI is really, really good at. Companies have been trying to do this forever, like machine learning for personalization, super segmentation with lots of content. It's now easier than ever to do this really well. You have to understand large swaths of unorganized data, and this just feels like something AI is really good at. So I'm really, really excited to dive into the tactics.
One last thing I'll say is that the best PLG strategy is just to have such a great product that people can't help but talk about it to others. And with AI, you can now build even more powerful features for your users. So that's also a really big part of the puzzle, I think.
[00:10:47] Hannah Clark: Does anyone want to add on to that answer before we move on to the second section?
All right. Mic drop moment. Okay, so we'll move on to building the [00:11:00] foundation: AI's role across the user journey. I think this is going to be the section that takes up the bulk of the session, so it's definitely a must-listen. Now's the time to put away the phones. Let's walk through each stage of PLG and show how AI can support each.
Let's start with acquire. Dani, did you want to chime in with how your team is using AI to power that acquire stage?
[00:11:23] Dani Grant: There are two things to mention. The first is maybe the most obvious, which is that in PLG, your customers find your product because the product has sold itself in a way, and they onboard on their own.
And one place where you have to generate a ton of content is SEO, and AI can be really helpful. Unfortunately, all of us have seen what this looks like when it's bad. It just looks like AI mumbo jumbo, and it doesn't actually rank. But AI is actually really, really helpful in SEO. So here is, tactically, how we use it.
AI writes the outline using SEO best practices. Like, you literally take an article about [00:12:00] SEO best practices, you give it to the LLM, and you say, write me an outline that would follow all these best practices. AI edits your SEO or gives you recommendations for best practices. And we use Perplexity to search for the tools you're going to mention: get me all the prices of all these tools, whatever. So it's very, very helpful in that way.
But beyond that, we're always thinking about how we build content that engages our audience online. Our audience is developers, and in web development there's so much going on; the space is moving really quickly.
And so we wanted to be able to do content around that for our audience. So we thought, okay, let's create a newsletter and a podcast about what's happening this week in JavaScript. If that's interesting to you, by the way, you can go to thisweekinjavascript.com and, every week in four minutes or less, get the news of JavaScript in the week.
And we're a tiny startup team. We can't [00:13:00] record a podcast each week on our own, and so AI is really, really helpful for us and allows us to write great content. We have over 50 percent open rates on these newsletters, and the podcast is really well listened to, but AI really helps us do that.
So those are two ways that we use AI for the acquire step of PLG.
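A minimal sketch of the outline tactic described above, assuming the OpenAI Python SDK with an OPENAI_API_KEY set; the file name, model, and post title are illustrative placeholders, not Jam's actual setup:

```python
# Sketch: generate an SEO-informed outline from a best-practices article.
# Assumes the OpenAI Python SDK (openai>=1.0); file, model, and title are
# placeholders for your own inputs.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Any article on SEO best practices, saved locally.
best_practices = open("seo_best_practices.md").read()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are an SEO content strategist."},
        {
            "role": "user",
            "content": (
                "Here is an article about SEO best practices:\n\n"
                f"{best_practices}\n\n"
                "Write me an outline for a post titled 'Best Bug Reporting "
                "Tools' that follows all of these best practices."
            ),
        },
    ],
)
print(response.choices[0].message.content)
```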
[00:13:17] Hannah Clark: Super cool. Does anyone else have any other tactics they've been using for this stage that might be relevant at this point in time?
If not, we can move on to the activate stage. So, Ramli, did you have any... oh, sorry, go ahead.
[00:13:34] Ramli John: Yeah, I was going to add, I think the other place I've been using AI for in terms of acquire is repurposing content. So, taking a podcast episode and plugging it into Castmagic, which is a tool for summaries, and it outputs blog posts and Twitter posts and LinkedIn posts that you can share, and newsletter posts.
It needs a little bit of massaging because it's not [00:14:00] perfect, but I think that's another area in content where I've seen AI work for acquisition: how do you squeeze more juice out of the lemon? Is that the saying? I'm not entirely sure. How do you get more bang out of your content, essentially?
It's one way I've seen AI be used for acquisition.
[00:14:22] Dani Grant: We actually do the exact same thing. One of our core beliefs is that founder-led sales in 2024 is happening largely online. It used to be the case that you'd have to get a warm intro, and then you'd have a first meeting where the founder introduces themselves to the customer. That's happening passively online as people are scrolling LinkedIn today.
And so you're constantly trying to share how you see the world and what changes you're trying to make in the world, passively, online. And we use AI for part of this, and it's just repurposing content. What we'll do is, if someone on the team goes on a podcast somewhere, we grab the YouTube link and put it into a tool like [00:15:00] Opus Clip, which will clip up the YouTube link. It either gives us video clips to share online, or it pulls out video clips of aha moments, and that gives us nuggets of things to talk about and write about later.
And so that's been really, really helpful in the acquire step too.
[00:15:17] Hannah Clark: That's so awesome. Yeah, I think that's a huge thing right now; I think both of you spoke to being able to repurpose content as much as possible and, yeah, get as much juice out of the lemon, as Ramli put it.
That's super awesome. Okay, I think we can move on to the activate stage then. So Ramli, you're the onboarding guy. What would you suggest as far as AI-enabled practices to support that activate stage?
[00:15:45] Ramli John: Yeah, this is actually something that I found another company doing while we were in a conversation around how they're using AI in their onboarding.
One of the challenges with product-led, or [00:16:00] opportunities as well, is that you really have to have a strong user research or customer research muscle. With a sales-led approach, you're in direct contact with the customers and you're hearing the objections. When your product is the one selling, you don't hear those objections; people just leave.
So in that case, how do you get those valuable insights, the why that the quantitative data can't show? One way they're using AI, which I thought was cool, is they take all their sales calls. They have a hybrid approach: they take their sales call recordings, and they also have a high-touch onboarding experience for enterprise companies.
They take those recordings and plug them into ChatGPT to give the large language model that context. And once it's plugged in, they've started using it to create things. One of them, which I thought was cool, is creating a sales-to-customer-success handoff document, with [00:17:00] the things you should do and talk about. It can also help craft things like, can you create a five-email onboarding sequence based on the pain points in the sales calls and customer onboarding calls? That's kind of cool. Since it's been fed the objections raised on the sales calls and the questions and confusion from the onboarding calls, it's more aware of and more trained on what your customers are actually going through.
I suggested that they also plug their most requested support tickets into that ChatGPT, and feed it information that is valuable to the activation phase, so that it can really help when you write that sequence or that tour.
So that's one way: plugging things into ChatGPT. I've also seen more advanced AI tools where you [00:18:00] plug in your data and they figure out the retention metrics. I'm seeing tools like Amplitude or June, where you plug in your data and it tells you, hey, most of your successful users, if they do X, Y, and Z, are 30 percent more likely to stick around. I'm going to take that with a grain of salt, of course, but it is valuable information, especially if you have a large enough user set. If you have 10 users and plug them into this model, it's not going to have enough data to figure that out.
So I would say that's a caveat to that kind of [00:19:00] analysis: if you do have a large enough data set over time, then you can plug it in and figure out some of those insights.
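A minimal sketch of the workflow described above; strictly speaking this stuffs the calls into the prompt as context rather than training the model. The file paths and model name are assumptions:

```python
# Sketch: draft a five-email onboarding sequence grounded in sales-call
# objections and top support tickets, per the tactic described above.
# Assumes the OpenAI Python SDK; file paths and model are placeholders.
from openai import OpenAI

client = OpenAI()

transcripts = open("sales_call_transcripts.txt").read()  # transcribed call recordings
tickets = open("top_support_tickets.txt").read()         # most-requested support tickets

prompt = (
    "Here are transcripts of our recent sales and onboarding calls:\n\n"
    f"{transcripts}\n\n"
    "Here are our most common support tickets:\n\n"
    f"{tickets}\n\n"
    "Based on the objections and points of confusion above, create a "
    "five-email onboarding sequence. Also draft a sales-to-customer-success "
    "handoff document summarizing each recurring pain point."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```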
[00:19:01] Hannah Clark: Yeah, that makes sense. Does anyone else have any tips they'd want to chime in with on supporting the activation stage of PLG?
[00:19:09] Dani Grant: This is kind of obvious, but once you figure out that for good activation the user has to take the following steps, then you have to figure out a bunch of ideas to try to get those users to do those steps, right? For us, we consider a user activated if they create four bug reports with Jam.
And so we're always trying to think, how do we push more of our users to create their third and fourth bug report? How can we encourage them and show them the value? And so there's a big brainstorm, lots-of-ideas moment, and AI is actually just fine at coming up with a lot of bad ideas that trigger you to come up with the good ideas.
And so that's just another obvious way to use AI in this step, maybe.
[00:19:55] Hannah Clark: I do like the idea of generating bad ideas that help you come to the good ideas.
[00:20:00] Dani Grant: Here's the thing about humans: it's really, really hard to have pure imagination and come up with an idea from zero.
Like, if you're ever tasked with being the first one to write the first draft of something, it's really, really hard. But humans are really, really good at reviewing and reacting, because that's just how our brains are wired. We see something, and then it triggers other ideas. And so as long as you have one bad first draft or one bad first idea, you're suddenly really, really good at the second draft. And so AI is good at giving you the first thing that triggers your brain to think the way that it does.
[00:20:28] Hannah Clark: Love that advice. Sorry, I think I might have cut you off a little bit there.
[00:20:31] Ramli John: No, I was just adding on that. I think the great thing about these large language models is that you can train them.
So it's like, oh, this is bad, and here's why I think it's bad, and you give it some kind of feedback. I heard this really great quote from a podcast episode with Nathan Barry. He's like, you have to treat AI like a very [00:21:00] early-stage intern, an intern who's a beginner.
And by giving it good feedback, it actually learns what is good and what is bad. So, as Dani mentioned, when it comes up with a bad idea, give it feedback, and it'll actually start producing better and better output and results based on that.
[00:21:18] Hannah Clark: Yeah. Or maybe at least the bad version will be a baseline better, so your ideas are even better. I don't think it works that way, but maybe.
[00:21:27] Anuj Adhiya: Ramli, I've been calling it a drunken intern, but you know, that's a better way to put it.
[00:21:33] Ramli John: Because it's very visual.
[00:21:35] Anuj Adhiya: The only other thing I will add here, and maybe this is a segue into the next thing we talk about, is that activation really isn't a separate step from a user perspective, because once they're in the product, they're in the product, right? And I think what's helped me connect the [00:22:00] dots downstream and upstream is just thinking of activation as short-term retention. Because once they're in the product, that's the game, right?
How do you keep them around longer? How do you monetize them better? All of that. And connecting what they do initially, as Ramli and Dani were talking about, to what happens later; it's all happening along the continuum of retention. There's no flip of a switch in a user's mind, like, oh, I've been activated, now I've been retained. No, it's all the same thing.
[00:22:34] Hannah Clark: Yeah, that's a good point. You kind of have to be thinking about things not just from what you see as the stages, but from where the user sees themselves.
[00:22:41] Dani Grant: It's possible, right? You're using the product, and you use it at the beginning stage and at the end.
But I do think, not as a product person but as a user, there's a point at which I'm checking something out and a point at which I'm using it, and I think activation happens in the middle. So I do think they are somewhat different from the user's [00:23:00] perspective.
[00:23:01] Hannah Clark: Sure. Yeah. Right.
Do we want to explore that a little further, or should we move on to the retention stage and get Anuj to lead the charge there?
[00:23:14] Dani Grant: Let's retain some users.
[00:23:16] Hannah Clark: All right. Let's retain. So, Anuj, what would you say are some of the possible use cases you've already seen, or some ideas you have, around using AI for that retention stage? Now that we're in that, okay, we've got our user in the product, where are we going from here?
[00:23:30] Anuj Adhiya: Yeah. And I'll say this upfront: some of this will sound like an expansion of what's already been said, in many ways, because what's most fascinating to me about retention and PLG is how it's completely transforming our ability to understand and act on user behavior patterns, right?
And I'll share an example. Many teams that I work with, like, I think, everybody, potentially even the audience, are just sitting on mountains of product [00:24:00] usage data, right? Everybody's got Amplitude and Heap and Mixpanel and whatever. But for whatever reason, maybe the stage of the company, not enough people, not enough data analysts, whatever it may be, they're struggling to extract enough of the right kind of actionable insights quickly enough.
So I think what's changing the game is combining these traditional analytics platforms, your Mixpanels, Amplitudes, Heaps, with modern AI analysis capabilities, right? And you can just see it. It's almost like people were operating in the dark, and now the lights have been turned on.
Because, and I'm sure many of us have used these product analytics tools for eons, and I thought I was reasonably competent at them, but when I started plugging some of this data into, whether it was private versions of ChatGPT or things like [00:25:00] that, it started identifying these fascinating micro-cohorts that would be almost impossible to spot manually, or with your level of expertise in any of these tools.
So, going off the examples already mentioned, you might discover that users who perform a specific sequence of actions in their first week become power users at three times the normal rate. There's one example from one customer where they found that users who both export data and share it with teammates in their first few days show much stronger engagement patterns. This kind of thing is what people are looking for, right?
But I think what's really powerful here is that combination of analysis and automation, because some of these companies are now starting to set up behavior-triggered journeys. People have always wanted to do this: when the system detects a particular user following some sort of high-value pattern, [00:26:00] it can automatically shift the content or the experience to get them down that path faster.
And in one case it's gone even beyond what's happening in product, because high-potential users got invited to a beta program, or to a customer advisory board, based on their actual usage patterns. And that's the thing people want to do, to Ramli's earlier point of getting closer to the customer; they just haven't been able to do it fast enough.
And for me, the real game changer is using all of these tools to connect the quantitative to the qualitative. With machine learning, yes, you can analyze patterns across support and community and product feedback, and then connect those insights with usage data. So when you [00:27:00] combine that with something like Gong or your product analytics, you get a much deeper, 360-degree view of not just what they're doing, but why they're doing it. And that insight, and the speed of that insight, is invaluable, because you're relying less and less on assumptions and acting more on what's actually happening with user patterns and needs.
[00:27:31] Hannah Clark: Okay, so I'm curious about this: do you or any other panelists have anecdotes or stories of seeing some of this in action? Because that's a perennial issue, combining quant and qual data and being able to tell that story more effectively.
[00:27:55] Anuj Adhiya: So I'll jump in with another example from [00:28:00] my end. For a lot of teams, these exercises are disconnected.
[00:28:04] Ramli John: Right.
[00:28:05] Anuj Adhiya: There'll be a team that runs the product-market fit survey, then a completely separate database that hosts all of the support tickets, and a completely separate database of sales conversations, right?
And sometimes it's not even clear whose job it is to put all of this together. And there's that uncertainty and indecision, not because it's somebody's fault, but just because nobody was told this is their job; they're like, oh, I'm a user researcher, I'm a salesperson.
And so I think there's power in assigning it to, let's call it your AI employee, to take that decision out of your hands and just collate [00:29:00] all of this data for you and then present it to all of the stakeholders involved. That's a trend I'm starting to see more of, because the opportunity cost of waiting or indecision is just so high.
And this is the most MVP way of doing it: let's just connect a few data sources, or even if you can't do it automatically, let's take some exports of this data, throw multiple spreadsheets into this thing, and let it elucidate these patterns for us. Even the most MVP approach starts to at least speed up your rate of insights and your ability to act on them.
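A minimal sketch of that MVP approach, assuming pandas and the OpenAI Python SDK; the export file names are placeholders:

```python
# Sketch: throw exports from disconnected systems at one model and ask it to
# surface cross-cutting patterns. Assumes pandas + the OpenAI Python SDK;
# file names are placeholders for your own exports.
import pandas as pd
from openai import OpenAI

client = OpenAI()

sources = {
    "Product-market fit survey": "pmf_survey.csv",
    "Support tickets": "support_tickets.csv",
    "Sales conversation notes": "sales_notes.csv",
}

# Concatenate each export under a labeled heading.
blob = "\n\n".join(
    f"## {label}\n{pd.read_csv(path).to_csv(index=False)}"
    for label, path in sources.items()
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": (
            "These are exports from three disconnected systems.\n\n"
            f"{blob}\n\n"
            "What patterns show up across all of them, and which stakeholders "
            "(product, sales, support) should act on each?"
        ),
    }],
)
print(response.choices[0].message.content)
```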
[00:29:40] Hannah Clark: Yeah, what I'm hearing here is using AI as a way to break down some of the silos between all these departments. So are you suggesting generating a cross-departmental, state-of-the-union report kind of thing, to give everybody in different departments [00:30:00] a bird's-eye view of what's going on across areas?
[00:30:02] Anuj Adhiya: Absolutely, right. I think it's not controversial to say that growth in general is a multiplayer sport; it's cross-functional by definition. And that's certainly true in the product-led world, because you're trying to bring together all of these pieces of content and education and community and product and sales, and all of these teams have to work together.
And if you have a growth team of any size, the entire purpose of it is to bring all of the key stakeholders together to understand the current state of growth, where the problems are, where the opportunities are. And it's always a challenge for everybody to understand, not necessarily what role they play, but what is the best, most impactful thing I could be doing right now, and communicating that. Because being a person who leads growth is as much about, [00:31:00] to your point, storytelling and communicating insights rather than just here's what the data says; it's getting people to understand why they should care about this thing.
So don't rely on your own abilities alone; get help from an AI tool to understand, how do I communicate this data point better to sales versus DevRel versus product, whatever that may be. And so I think its power in breaking down those silos and getting everybody on the same page is highly underrated right now.
[00:31:31] Hannah Clark: You know, I really like this tactic; I want to upvote it. We had a podcast episode with Michele Ronsen, who is a well-known UX researcher, and she recommended a very similar kind of process that was a little more analog; it wasn't AI-enabled.
I like this as an AI-enabled counterpart, but her recommendation was more in terms of UX research: when you've conducted all this research, that's great, but how is it usable to all of the different departments [00:32:00] that it connects to? And her recommendation was to frame it department by department: what are these learnings informing as far as the next action items for each department?
So I see this as an equivalent to that, in which you're using AI to translate the findings we're seeing into the action items we can derive to help us work together better as a team. Because, yes, you're right: this is an issue we struggle with in every startup and every company of any size, how do we work better cross-functionally? Does anyone else have any notes on using AI to work better cross-functionally? I think that's a whole other area we haven't explored.
[00:32:46] Dani Grant: This is not AI for cross-functional work, but look, why do companies hire product managers? As everyone on this call knows, it's because you want one point person who is going to get to the [00:33:00] bottom of things, who owns the whole problem and owns the success of the product, and who is going to do literally whatever it takes to make the product succeed and the project go well.
And so what Anuj is saying, where something is not someone's problem, where someone understood it at a spreadsheet level but not at a user level, is so true. But that's the beauty of PMs, right? There's no PM in the world who says, that's not my problem. They're just there to make the product succeed.
One tactic that's been very helpful for us, not AI-related, that maybe is helpful for you all: sometimes it can be kind of a leap to go from a spreadsheet level to a user level if you don't have user calls set up. The best thing you can have is a drumbeat where you're talking to customers all the time, and then you just plug in a few questions that you have.
But that's tricky. How do you do that? So we have a little bit of automation around this that I recommend for everyone, which is: every single week, the top 100 [00:34:00] users from the week before who haven't heard from us recently get an email from a Jam co-founder asking how things are going.
And if they reply, and their reply is very thoughtful, we'll say, oh, we'd love to learn more, would you be up to hop on a call? And so we have this drumbeat of calls. What that means is that if a PM is looking at something and saying, this doesn't make sense to me, or I'm curious to learn more, there is something easy to join, instead of having to invent a new motion of customer calls from scratch.
So anyway, one tactic, not AI-related, but highly recommended.
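A sketch of that weekly drumbeat; the three helper functions are hypothetical hooks into an analytics store and email provider, not anything Jam has published:

```python
# Sketch: every week, email the prior week's top 100 users who haven't heard
# from us recently. The three helpers passed in are hypothetical hooks into
# your analytics store and email provider.
def weekly_founder_outreach(get_top_users, recently_contacted, send_email):
    for user in get_top_users(period="last_week", limit=100):
        if recently_contacted(user):
            continue
        send_email(
            to=user["email"],
            sender="cofounder@example.com",  # a real founder address in practice
            subject="How's it going?",
            body=(
                f"Hi {user['first_name']}, noticed you've been getting a lot "
                "out of the product lately. How are things going? Anything "
                "we could do better?"
            ),
        )
```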
[00:34:29] Hannah Clark: Awesome. Yeah, it doesn't have to be AI; we're just looking to grow. All right, so we'll move on to the expand stage here. Anuj, if you want to take the lead on this one as well: how would you use AI to take call transcripts and generate a first draft of... oh, sorry, this is Dani's question. I totally lifted it. A little peek behind the curtain here. But Anuj, if you do have anything you wanted to lead with on expansion, then we can move into Dani's tactic as well.
[00:34:57] Anuj Adhiya: Sure. Again, like I said, [00:35:00] everything's a bit of a continuum for me. But think about the traditional expansion playbook: how does it work? You wait for usage indicators; maybe you have a customer success team doing quarterly reviews. And quite frankly, I've worked with some teams that are just hoping to catch expansion signals.
But I'll give you a practical example. There's one SaaS company that's using Gong to analyze customer interactions across every touchpoint: sales calls, support tickets, QBRs, everything. And the key is that they're trying to train the system to look for very specific expansion indicators.
So it's not just picking up on very obvious phrases like we need more licenses. It can start to identify patterns around discussions of, say, adjacent use cases, a mention of a new team member or department, or a [00:36:00] conversation about a pain point that could be solved with some additional capability.
So when you start to combine that conversational data with product usage analytics, that's what's allowing this team to create, let's call them, expansion intent scores, based on multiple signals. Then, when accounts start to hit certain usage thresholds, like in this one case they were approaching an API limit, or they were attempting to access some premium features, that correlates with very specific conversation patterns they've also analyzed. And that started to create a really clear signal for expansion opportunities.
[00:36:53] Hannah Clark: So...
[00:36:57] Anuj Adhiya: Oh, so all of this to say, it's not just to [00:37:00] identify expansion opportunities, but to, I think, make the process also feel more natural and sort of value aligned for the customer.
[00:37:08] Hannah Clark: Fair enough. Yeah. I know Dani has a process she was going to walk us through, excuse me, for using call transcripts to generate a first draft of follow-up emails, which I alluded to earlier. Did you want to walk us through that process, Dani?
[00:37:23] Dani Grant: I mean, it's pretty self-explanatory.
I'll say that as product people, at least, I really believe that products should do all of the heavy lifting. But really, to change people's behavior, you also want a face that you can trust. And so a lot of the expand stuff is very human-led, versus product-led. In a really well-running motion, you have a lot of back-to-back calls helping teams expand their usage, and you want to follow up from those calls with something thoughtful.
And again, it's hard to start from no draft. It's a lot easier to start from a first draft: taking a call transcript, giving it to Claude, and saying, what were the three [00:38:00] main points this person communicated to me? And then using that to draft, here's what we heard. It's just a lot easier than a blank slate, especially if you have eight back-to-back calls and you're doing it at the end of the day; you don't fully remember, your brain is a little fried.
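A minimal sketch of that follow-up workflow, assuming the Anthropic Python SDK with an ANTHROPIC_API_KEY set; the model name and transcript file are placeholders:

```python
# Sketch: ask Claude for the three main points from a call transcript, then a
# "here's what we heard" follow-up draft. Assumes the Anthropic Python SDK;
# the model name and file path are placeholders.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

transcript = open("call_transcript.txt").read()

message = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": (
            "Here is a transcript of a customer call:\n\n"
            f"{transcript}\n\n"
            "What were the three main points this person communicated to me? "
            "Then draft a short 'here's what we heard' follow-up email."
        ),
    }],
)
print(message.content[0].text)
```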
[00:38:16] Hannah Clark: Yeah, anything to save the old brain at the end of the week.
Okay, we'll move on to section three here, which is from vision to execution. This is all about building an AI-enhanced PLG strategy. I'm actually going to throw it to Ramli, because we haven't seen him for a couple of minutes here. I would love it if you could take us through some of the common pitfalls and misconceptions around using AI to support PLG. What are some great, or rather, not great, detrimental pitfalls that you've seen in your work?
[00:38:48] Ramli John: I would say, in terms of AI itself, the output is only as good as the input. If you feed it nothing, or garbage information, then you're going to get garbage output as [00:39:00] well.
And based on the conversation we're hearing, what Dani's saying: don't start with a blank page. You really have to feed it the right kind of information, specifically training it on the best customer experiences and the best calls that you have, and giving it the right kind of context, especially during the expansion stage we just heard about: here are the calls we've had with them, here are the documents we have around them, give me the output around that.
And give it the right kind of prompt: hey, we're trying to do this, here's what we're expecting, give me three points. This goes back to what Anuj said earlier; he called it treating it like a drunk intern. I would say treat it like an intern just starting at the job. Be a little more verbose and a little more up front at the beginning: hey, here's [00:40:00] what I'm looking for, here's the output I'm expecting, and here's the context in terms of the information.
I think that goes a long way, rather than being vague about it. I've also seen prompts like: pretend that you're the director of customer success or the director of product for this company; here's some information about the company; here's the LinkedIn profile of this person; what are three bullet points that this person might care about based on the calls we've had with them? Being very clear and upfront, and being succinct about it as well, can be very helpful in terms of using AI.
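A sketch of a prompt structured this way, with role, context, task, and expected output stated up front; the persona and inputs are illustrative:

```python
# Sketch: a prompt template with role, context, task, and expected output up
# front, per the advice above. The persona and inputs are illustrative.
PROMPT_TEMPLATE = """You are the Director of Customer Success at {company}.

Context:
- About the company: {company_notes}
- About this person: {customer_profile}
- Calls we've had with them: {call_summaries}

Task: write three bullet points this person is most likely to care about,
based on the calls above.

Expected output: exactly three bullets, one sentence each, no preamble."""

prompt = PROMPT_TEMPLATE.format(
    company="Acme SaaS",
    company_notes="B2B analytics tool for mid-market teams",
    customer_profile="VP of Product at a 200-person fintech",
    call_summaries="Asked twice about SSO; concerned about onboarding time",
)
print(prompt)
```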
[00:40:45] Dani Grant: I think one pitfall to add, especially with really junior people on your team: if you remember being a new person in the workforce or new in your role, one of the hardest skills to learn is, what does done mean?
And the more junior [00:41:00] you are, the more done just means complete, versus understanding the impact that really good work can create. And so I think one of the pitfalls with using AI in any of these processes is that for a junior person, done can just mean complete. So, oh, the AI ran some analysis, whereas a more senior person would be like, and that has surfaced some new questions for us to dive into that are the next logical things to look at. Or, this actually doesn't quite make sense to me, this must be a hallucination. Or, this draft of an email reads like mumbo jumbo and obviously we shouldn't send it.
And so I think that's a big pitfall. And the way to avoid it is to actually have a lot of conversations about what done means on this team, what done means in this company, AI or not.
[00:41:41] Hannah Clark: That's very, I think, great career building advice in general is just to be constantly asking that question. What does done mean? Um, I, I love that take. Go ahead.
[00:41:54] Anuj Adhiya: Sorry, guys, I'll add a couple of things, one to what Dani said and one to what Ramli said. So, [00:42:00] the point Ramli was making about garbage in, garbage out: a pro tip I've begun to implement myself is, I've started to realize most people don't even know how to Google properly.
And that's almost my way of getting an early indicator of whether, if I'm going to hand off a task to somebody, they will even be able to prompt properly. Because prompting can get to a point where it's super verbose, right? So I literally have people run what I think are basic Google searches, and I can just see how they respond to that. That almost gives me a signal of, okay, maybe this is a person who needs a little more training before I let them loose on this sort of system, versus, okay, this person inherently [00:43:00] gets how to query a system, so let them into the system first.
The other thing, about what Dani was talking about, where things might seem complete, or done: I think this is why I've found great value in one of the first exercises I do, which is to have everybody on a team understand what the product's North Star metric is.
Because it's critical for everybody to understand that this is how we deliver value to our users and customers; everything we do is in service of growing value. And that applies just as much if you're going to prompt a system: that is the perspective the system needs, not the perspective of the specific task or the specific analysis. This is in service of that greater thing of growing [00:44:00] value to our users.
So I think the common theme running through all of this is that there's a bit more context-setting and background that whoever is going to interact with these systems should have, in terms of that greater user perspective, before they start interacting with these systems and extracting whatever they think is an insight.
[00:44:26] Hannah Clark: Yeah, there are definitely a few common themes emerging here, as well as challenges with regard to working together as different teams. Okay, for the sake of time, we're going to cut this section a little short so that we can get into Q&A, but I think the Q&A is going to wrap up nicely into this whole idea of building AI-enhanced PLG strategies.
But before we get into that, we're just going to go through a little pre-close here. Oh, you know what, actually, normally we take this time to tell you about our next session. For those who are [00:45:00] engaged with us month over month with these sessions, just so you know, we are not going to be doing a panel in December; we're taking a bit of a break for the holidays. So if you see there's no session and wonder where we are, we're going to be back in January with more of a career-focused session. It'll be about transitioning into a career in product management; that'll be the focus of the panel. Registration for that will be starting in December, so we'll send out a link to our subscribers when registration is open, but it should be a really great session. For those who are here, it probably won't be as relevant, since you're all product managers already, but if you do know anyone who's interested in the career, who's been asking you a lot of questions and is curious about making that jump, please let them know. We'd love it if you could help us spread the word.
And we'll get right into the Q&A. By the way, if you [00:46:00] haven't voted on a question yet, you can still upvote questions in the Q&A area. You can also ask more; we'll get through as many as we can. Our most voted question is from Diem: AI can exhibit bias in interpreting nuanced data, leading to misleading conclusions. In the context of PLG, how can we strategically leverage AI to enhance user experiences and drive growth while mitigating these risks? Dani, would you want to take the lead on that one?
[00:46:24] Dani Grant: The thing about AI is it doesn't solve human problems, and you really just have to use your judgment. It's a great tool, but you need a skeptic's mind. And that's like the pitfall of the junior person thinking something is done: just having an answer doesn't actually solve your problem. When you're using AI especially, you have to be even more cognizant, really there, really present with your work, really thinking.
[00:46:51] Ramli John: Yeah, I totally agree. This goes into the whole ethics of AI. There needs to be human intervention, I really [00:47:00] do believe that; there needs to be a human to catch those kinds of biases. It's just an input, an input to your decision-making process.
And you can have multiple inputs: qualitative data, quantitative data, an AI suggestion; you can ask your CEO, you can ask somebody from customer success. But these are all inputs. Eventually, it needs to be a human making the decision, somebody who says, yes, that's what we're doing, that's where we're going, based on all of this information, including the input from AI. I would be very cautious about letting AI make the decision for us, because there are obviously biases there that it might not catch on its own.
[00:47:51] Anuj Adhiya: Um, so what I would say is, uh, I think it's really important to not forget how things were done in the past. [00:48:00] Right. So, you know, let's just call them like, you know, ground truth baselines right where. You know, how did we do things before AI, right? Just so that we can have a clear comparison point, right?
Because if we don't know how we did it, like, you know, how will you even know whether that system is hallucinating or even have the opportunity to ask that question as to should I poke at this a little bit more or not? Right? And I think what's associated with that, right? I think I think it's sort of important to have sort of this analysis.
Yeah. Triangulation approach, right? So I don't know. Let's pick a situation where, uh, I don't know. Let's say an AI system is going to flag like some users at risk or something, right? It's like, okay, you can't just take that at face value, right? You've got to sort of look at sort of the raw product analytics data.
You gotta look at feedback, look at whatever the customer success team is talking about, right? And like, see, like, are those signals manually also aligning? [00:49:00] And that gives you a little bit more confidence in the system as well, right? Okay. Uh, and so I think a recent example I came across, uh, I haven't personally used it.
A recent example I came across, though I haven't personally used it, so disclaimer: I saw that Heap Analytics has a great feature where they've explicitly designed for diversity in their training data for PLG. What that means is they've ensured that their AI models learn from users across different company sizes, industries, use cases, things like that.
What that does is help you catch when your AI system might be over-indexing on a behavior pattern from a larger or more active customer base while missing signals from smaller ones. So some tools are catching on to this and trying to account for it as well. But again, don't forget the way you've done it before, and you can always manually verify.
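As a rough illustration of that over-indexing risk (a generic check, not Heap's actual implementation, which isn't described here), you could look at how your behavior data is distributed across customer segments before trusting a model trained on it. The segment labels and the 50% threshold below are arbitrary assumptions.

```python
from collections import Counter

# Hypothetical training rows, each tagged with the customer's segment.
training_rows = [
    {"segment": "enterprise"}, {"segment": "enterprise"},
    {"segment": "enterprise"}, {"segment": "smb"},
]

counts = Counter(row["segment"] for row in training_rows)
total = sum(counts.values())
for segment, n in counts.items():
    share = n / total
    flag = "  <- possibly over-indexed" if share > 0.5 else ""
    print(f"{segment}: {share:.0%}{flag}")
```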
[00:50:04] Hannah Clark: Good tips. Before we move on to our next question, I just wanted to apologize: we do have one more session that I forgot to mention. I was so focused on the panels that I forgot about our other event coming up this month, which is Ask the Expert in Product Design with Koji Pereira. He's a lead at Sigma Computing, but he spent the bulk of his career at Google, so he's the real deal. He was on the podcast with us recently; if you're interested in product design, definitely check out that episode, and we'd love to see you at that session.
But anyway, moving on. The next most upvoted question is: would you recommend using AI in place of traditional support or customer success teams? I feel like we've already answered this one in passing, but we can maybe give it some more context here. Those roles are typically seen as responsible for cohort retention, yet they are two areas where we're seeing explosive growth in agentic AI tools.
This is true. This is interesting. Okay, who wants to get into this one?
[00:51:00] Dani Grant: I think the most important thing in 2024 is your user's experience, because there's more competition than ever and it's easier than ever to switch tools. Because most companies have some sort of PLG strategy, if someone gets frustrated with your product, it's not that much later in the day that they sign up for a free trial of another product.
So user experience in 2024 is the most important thing, and I think that's the way to make this decision for your users. If the user experience you want to enable is "we've got a lot of documentation, we want to make it really easy to search, and natural language is a simpler way to search our docs," then having some sort of agentic customer support is actually great, because it's just a nice UI for your docs.
If the user experience you want to provide is "we're here for you 24/7, there's a person on the other side who cares about your problem and is right there with you for every step of the product journey," then that's probably not the experience to build with AI. It's possible that for some of your users, like a developer audience, they don't actually want to get on a call; they just want a quick query language for information, while for your enterprise customers you want something else. So the way to think about this is not through a tools lens, but through the lens of what user experience you want.
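A minimal sketch of that "nice UI for your docs" pattern: natural-language retrieval over help articles, using TF-IDF similarity as a stand-in for a production embedding model. The article titles and the search_docs helper are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical help articles standing in for a real docs corpus.
docs = [
    "How to invite teammates to your workspace",
    "Troubleshooting failed data imports",
    "Exporting reports as CSV or PDF",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)

def search_docs(question: str, top_k: int = 2) -> list[str]:
    """Return the articles most similar to a natural-language question."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
    ranked = sorted(zip(scores, docs), reverse=True)
    return [doc for score, doc in ranked[:top_k] if score > 0]  # keep only real matches

print(search_docs("my data imports keep failing"))
# ['Troubleshooting failed data imports']
```

In practice you would swap TF-IDF for a proper embedding model and hand the top hits to a language model for a conversational answer, but the retrieval shape stays the same.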
[00:52:20] Ramli John: I totally agree. I think with the rise of AI, people crave human connection even more. There's a quote, because I'm working on a new book, from Josh Kaufman, who wrote The Personal MBA. He said there's a paradox of automation: the more efficient the automation becomes, the more crucial the human experience is.
I think that's so true, because when we talk about customer success, customer support, and user experience, like what Dani was describing, people are delighted when there's an actual human chatting with them, face to face, right now. Whenever I go to in-person events, people say, "Oh, you're real. You're not a chatbot spitting things out. You're actually somebody who cares about me," and they feel valued as a customer.
I think that's going to go a long way, especially as companies automate more and more through AI: support, marketing websites, things like that, even emails. I heard KFC is introducing AI emails to get you to buy more fried chicken, which is crazy. But it's really going to be more important to have humans involved, especially in B2B products, where face-to-face connection does matter.
[00:53:48] Dani Grant: At the same time, the bar for what counts as a quality AI experience is getting higher.
Think of your own experiences using products out in the world: if someone introduces an AI feature or AI chat support and it's a low-quality experience, you actually resent them even more than if it were just a normal feature or normal chat support. There's something about "oh my God, they're just on the buzz train, it's just hype, it's just marketing" that feels bad in your gut.
So if you're going to introduce such core functionality of your product, like customer support, using AI, and the customer knows it's AI, it had better be really darn good. The bar to ship that is actually even higher than it would be otherwise.
[00:54:29] Hannah Clark: We have only a few minutes left before we get to our very last question, so it's going to have to be a bit of a speed round. I just want to point out that Michael has posted the link to the feedback form for today's session. We would love your feedback; we're always trying to make these sessions better, and we always read everything, so please fill out that form and let us know what you think and what you'd like to see in the future. And now our last question for the day: how do you balance personalization through AI with user privacy concerns?
[00:54:58] Dani Grant: Follow your privacy policy. If you're thinking this is wrong, don't do it. Just, just...
[00:55:04] Hannah Clark: don't do it.
[00:55:05] Dani Grant: For example, if your privacy policy says we're allowed to look at top-level usage metrics in order to improve our product, then AI can help you with that. If your privacy policy doesn't say you're allowed to look at all user data, then don't look at it. Whatever you do: user trust above all else.
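One way to read Dani's point in code: scope what an AI feature can ever see to the fields your policy already permits. The allow-list and event fields below are hypothetical, a minimal sketch rather than a compliance recipe.

```python
# Only the fields the (hypothetical) privacy policy permits: top-level usage metrics.
ALLOWED_FIELDS = {"feature_name", "event_count", "week"}

def scrub_for_ai(event: dict) -> dict:
    """Drop everything outside the allow-list before it reaches an AI model."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}

raw = {"feature_name": "export", "event_count": 12, "week": "2024-W45",
       "email": "user@example.com", "ip": "203.0.113.7"}
print(scrub_for_ai(raw))
# {'feature_name': 'export', 'event_count': 12, 'week': '2024-W45'}
```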
[00:55:31] Hannah Clark: I don't think that there's much else to say about that.
[00:55:37] Ramli John: There's definitely some nuance in the response. I've gotten emails like, "Oh, Ramli, I noticed you did X, Y, and Z in our product," and I think there is some value there, where it's like, "Okay, yes, I'm stuck, I need help." But if I'm not sure, I would agree with Dani: it really comes down to the privacy concern. There's a difference between "I feel like I'm being stalked" and "I'm in your house, I'm in your product, so maybe I do expect you to see what I'm doing a little bit."
[00:56:05] Anuj Adhiya: Yeah, this one goes to a question I've been pondering. Somebody, I don't know who, said this to me: their attempt was at respectful personalization. The key for them was to be really transparent about the value exchange. I don't think people are going to be surprised that we use AI to personalize experiences moving forward, but I think it's useful to follow a simple rule: if you're going to collect a particular data point, you should internally have a really clear, demonstrable benefit to the user. Have that as your guiding light first.
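A tiny sketch of that "guiding light" rule: gate every collected data point on a documented, user-facing benefit. The registry and field names are invented for illustration.

```python
# Each data point we collect must map to a benefit we can show the user.
BENEFIT_REGISTRY = {
    "preferred_language": "Show the UI in your language",
    "team_size": "Recommend plan limits that fit your team",
}

def may_collect(data_point: str) -> bool:
    """Collect only when a clear, user-facing benefit is documented."""
    return data_point in BENEFIT_REGISTRY

for field in ("preferred_language", "browsing_history"):
    print(field, "->", may_collect(field))
# preferred_language -> True
# browsing_history -> False
```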
[00:56:50] Hannah Clark: Well, these are all very insightful. Unfortunately, we're out of time; we always do this, we always leave ourselves not quite enough time to get through all the questions. So thank you, everybody, for such an engaged session, for participating in the chat, for all of your questions, and for joining us today. It's really great of you to make time. I hope this session was helpful for everybody who attended, and I want to give a warm thank-you to our panelists: Ramli, Anuj, Dani, you've been amazing. Always a pleasure to have you with us. Thank you, everybody. Have a great day.