In today’s rapidly evolving tech landscape, Product-Led Growth (PLG) has emerged as a pivotal strategy for companies aspiring to thrive. At the heart of a successful PLG approach lies an outstanding product—one that effortlessly captivates users and inspires them to share their experience with others.
In this episode, Hannah Clark is joined by Ramli John (Founder of Delight Path), Dani Grant (CEO of Jam.dev), and Anuj Adhiya (Expert In Residence at Techstars) to delve into the transformative power of PLG, especially in conjunction with Artificial Intelligence (AI).
Interview Highlights
- Ramli John’s Insights on Onboarding [01:16]
- Ramli identifies the biggest onboarding issue as internal friction, not product friction.
- Teams like product, marketing, and customer success often work in silos without collaborating.
- Ramli shares that even at Appcues, a product adoption software company, this issue existed.
- His solution: improve communication between teams and align on what user success looks like.
- Dani Grant on Balancing Growth and Engagement [02:12]
- Jam.dev has grown to 150,000 users across 32 Fortune 100 companies.
- Dani praises Ramli’s book, Product-Led Onboarding, as transformative and required reading for her growth product team.
- The book redefines onboarding as a process starting before signup and continuing long after, improving product outcomes.
- Dani credits Jam’s growth to sharing lessons from building their product with their builder-focused user base.
- She emphasizes the privilege of contributing to events and discussions like this one.
- Anuj Adhiya’s Unique Approach to PLG [03:49]
- Anuj is planning a Guinness World Record attempt for the largest gathering of people wearing party hats in Boston.
- The event ties into his work with PartyClick, a party invite app he’s consulting for.
- The idea emerged as a creative way to promote the app and align with its brand.
- Anuj is currently applying to Guinness and coordinating logistics to host 2,500 people at Boston Common.
- He shares the plan publicly to stay accountable and motivated to follow through.
- The Intersection of AI and PLG [05:51]
- If PLG means having a free trial or freemium model, it’s not always necessary, especially for startups.
- Ramli advises startups to begin with high-touch approaches to build close customer relationships.
- If PLG means removing friction and creating great user experiences, then it is essential.
- User expectations for seamless experiences have increased significantly, impacting retention and activation.
- Ramli emphasizes that AI’s role in improving the product experience will be critical in the future.
- Ramli highlights that many founders ask the wrong question, focusing on replicating successful companies like Slack.
- He advises founders to instead ask how they can better use their product to meet their customers’ needs.
- This approach leads to more effective and tailored product-led strategies, rather than blindly copying others.
I see many founders asking the wrong question. The question they should be asking is: How can I use my product better to serve the end needs of my customers or users?
Ramli John
- AI and PLG Strategies [07:58]
- Dani explains that in PLG, personalized selling is key, and AI can greatly enhance this by handling large, unorganized data for better segmentation and content delivery.
- AI makes personalization and super segmentation easier than ever, helping companies improve their PLG strategies.
- The best PLG strategy is having an exceptional product that drives word-of-mouth, and AI can help create even more powerful product features.
- Dani is excited to explore tactics further, with AI playing a major role in improving user experience and outcomes.
- AI in the Acquire Stage of PLG [09:11]
- Dani discusses how Jam uses AI in the acquire stage of PLG.
- AI helps with SEO by generating outlines and editing content according to SEO best practices, improving rankings.
- AI assists in gathering data, like pricing, to enhance content accuracy and relevance.
- Jam uses AI to create a JavaScript newsletter and podcast for their developer audience, delivering quick, engaging content weekly.
- The use of AI has led to high engagement, with over 50% open rates for newsletters and strong podcast listenership.
- Ramli uses AI to repurpose content for acquisition, such as turning podcast episodes into blog posts, Twitter posts, LinkedIn posts, and newsletter content.
- He uses tools like Castmagic to generate summaries and content, which requires some editing.
- AI helps maximize content value, allowing for more efficient distribution and engagement.
- Dani’s team believes founder-led sales in 2024 is happening largely online, with founders sharing their vision passively via platforms like LinkedIn.
- They use AI for content repurposing, such as taking podcast YouTube links and using tools like OpusClip to generate video clips or highlight “aha moments.”
- These AI-generated clips provide valuable content for further discussion and sharing, aiding the acquisition process.
- AI in the Activate Stage of PLG [12:48]
- AI for onboarding activation: One company uses AI to analyze sales call recordings and create personalized customer success documents.
- AI helps craft email onboarding sequences based on pain points from sales and onboarding calls.
- ChatGPT is trained with objections and support ticket data to improve customer activation.
- AI tools like Amplitude and June.so analyze user data to identify retention metrics, helping predict user behaviors and improve engagement.
- Data volume matters—AI models work best with a large user base for accurate insights.
- To drive user activation, focus on encouraging key actions (e.g., creating four bug reports in Jam).
- AI is helpful for brainstorming many ideas, even if some are bad, to inspire better solutions.
- The goal is to find effective ways to push users toward specific actions that demonstrate product value.
- Humans struggle with creating ideas from scratch but excel at reviewing and reacting to existing ones.
- AI helps by providing a first draft or idea, which triggers further creative thinking.
- Once a starting point is given, humans are better at refining and improving the concept.
- Large language models can be trained through feedback to improve their output.
- Treat AI like an early-stage intern that learns from feedback on what is good or bad.
- Providing feedback helps the AI generate better ideas and results over time.
- Activation is connected to short-term retention, not a separate step.
- Once users are in the product, the focus shifts to keeping them around longer and improving monetization.
- Activation and retention happen along a continuum, not as distinct stages.
- AI in the Retention Stage of PLG [19:07]
- Retention is being transformed by combining traditional analytics with AI analysis.
- AI helps uncover micro cohorts and actionable insights from product usage data.
- AI can identify key behaviors, like exporting data and sharing with teammates, that lead to stronger user engagement.
- AI enables behavior-triggered journeys that adjust content and experiences based on user actions.
- High-potential users can be identified for beta programs or advisory boards based on usage patterns.
- Combining quantitative data with qualitative insights provides a 360-degree view of user behavior and needs.
- The combination of AI and analytics improves the speed and accuracy of user insights.
- Combining Quant and Qual Data with AI [22:52]
- Teams often have disconnected data sources, like product market fit surveys, support tickets, and sales conversations.
- The challenge lies in integrating these data sources due to unclear ownership.
- AI can act as an “employee” to collate and present this data to stakeholders, eliminating indecision.
- Even if full automation isn’t possible, combining exported spreadsheets can help speed up insights and decision-making.
- The opportunity cost of waiting for insights is high, making quick integration and action essential.
- Breaking Down Departmental Silos with AI [25:05]
- Growth is a cross-functional effort, requiring collaboration between content, education, community, product, and sales teams.
- A growth team’s goal is to bring stakeholders together to understand the state of growth, identify problems, and seize opportunities.
- Communicating insights effectively across departments is crucial, not just presenting data.
- AI can help leaders craft and communicate data points in ways that resonate with different teams, breaking down silos.
- The power of storytelling and clear communication is vital in ensuring all departments are aligned.
- Product managers (PMs) own the product’s success and are responsible for solving problems, no matter the challenge.
- Transitioning from spreadsheet-level data to user-level insights can be difficult, but PMs make it work.
- A helpful tactic is having regular customer interactions: top 100 users from the previous week are emailed by founders for feedback.
- Thoughtful responses lead to opportunities for follow-up calls, ensuring a consistent feedback loop.
- This tactic provides PMs with easy access to user insights without needing to establish a new customer call process.
- Expanding with AI: Practical Examples [29:20]
- Expansion is seen as part of a continuous process, connecting various stages.
- Traditional expansion strategies often rely on waiting for usage signals or quarterly reviews.
- Example: A SaaS company uses Gong to analyze customer interactions (sales calls, support tickets) for expansion indicators.
- The system identifies subtle signs like discussions of adjacent use cases, new team members, or pain points.
- By combining conversational data with product usage analytics, the company creates “expansion intent scores.”
- Clear expansion signals are triggered when users reach usage thresholds, such as approaching API limits or exploring premium features.
- The goal is to identify expansion opportunities while making the process feel natural and value-driven for customers.
- Product should do most of the heavy lifting, but behavior change requires a trusted face.
- The expansion step is often human-led, with many back-to-back calls to help teams expand usage.
- After calls, follow-ups should be thoughtful, but starting from scratch can be challenging.
- Using AI tools like Claude can help by summarizing key points from call transcripts to quickly draft follow-up messages.
- This approach is especially useful after multiple calls when energy is low or details are hard to recall.
- Building AI-Enhanced PLG Strategy [32:20]
- The quality of AI output depends on the quality of input; poor input leads to poor output.
- Avoid starting from scratch—provide AI with relevant and accurate data.
- Train AI with your best customer experiences and key documents.
- Give AI clear context and specific instructions on what you’re looking for.
- Treat AI like a new intern—be detailed and upfront with expectations.
- Use specific prompts, such as asking AI to generate bullet points based on customer interactions and company details.
- Junior team members may view “done” as just “complete,” lacking understanding of impact.
- Senior team members identify next steps or errors in AI-generated work.
- A pitfall of using AI is assuming completion is sufficient without deeper analysis.
- To avoid this, have discussions on what “done” means in your team and company, with or without AI.
- Proper prompting is crucial for AI systems; testing basic Google searches can indicate someone’s ability to prompt effectively.
- Team members should understand the product’s North Star metric to ensure all tasks align with growing user value.
- Context setting and understanding the broader user perspective are essential before interacting with AI systems.
Make sure everyone on the team understands the product’s North Star metric. This is critical because it helps everyone recognize how we deliver value to our users and customers. Everything we do serves the purpose of growing that value.
Anuj Adhiya
- Mitigating AI Bias in PLG Strategies [38:22]
- AI is a tool, but human judgment is crucial to avoid bias and misleading conclusions.
- A skeptic’s mindset is needed when using AI to ensure critical thinking and awareness.
- Just having an answer from AI doesn’t necessarily solve the problem; thoughtful analysis is required.
- AI should be used as one input in decision-making, alongside qualitative data and other sources.
- Human intervention is necessary to catch biases and make final decisions.
- AI can provide suggestions, but humans must ultimately decide the direction based on all inputs.
- Caution is needed to avoid letting AI make decisions independently.
- It’s important to remember past methods for comparison and detecting AI errors.
- Use a triangulation approach: verify AI findings with raw data, feedback, and customer success team insights.
- Ensure AI systems are diverse in training data, accounting for different company sizes, industries, and use cases.
- Some tools, like Heap Analytics, are designed to address biases in AI models.
- Always verify AI results manually to maintain confidence in the system.
The thing about AI is that it doesn’t solve human problems—you still need to use your judgment. It’s a great tool, but you need a skeptic’s mindset.
Dani Grant
- AI vs. Traditional Support in 2024 [41:47]
- In 2024, user experience is the top priority due to increased competition and ease of switching tools.
- AI tools can be useful for customer support if they enhance the user experience, like providing easy access to documentation.
- The decision to use AI should be based on the desired user experience: whether it’s automated support or personalized, 24/7 human interaction.
- For certain audiences, like developers, quick query tools may be sufficient, while enterprise users may require more personalized support.
- As AI automation increases, people crave more human connection.
- Customers appreciate human interactions, especially in customer support and success.
- In-person or human touch in B2B products can create stronger connections and value.
- Despite AI advancements, human involvement remains crucial in fostering customer trust and loyalty.
- The quality of AI experiences is increasingly important.
- Low-quality AI features or chat support can frustrate users more than traditional support.
- If AI is introduced, especially for customer support, it must be high quality.
- The bar for AI functionality is higher than for traditional features due to customer expectations.
- Balancing AI Personalization with User Privacy [45:17]
- Follow your privacy policy when using AI for personalization.
- Only use data permitted by your privacy policy.
- If user data isn’t covered, don’t use it.
- Prioritize user trust above all else.
- Personalized support can be valuable when users are stuck.
- Privacy concerns arise if users feel watched.
- Users may expect some level of tracking within a product.
- “Respectful personalization” involves transparency about value exchange.
- Users expect AI-driven personalization but require clear benefits.
- Ensure data collection has a demonstrable benefit for users.
Meet Our Guests
Founder of Delight Path, Ramli John is a leading expert in product-led onboarding for B2B SaaS companies. As the bestselling author of Product-Led Onboarding, which has sold over 35,000 copies, Ramli empowers product teams to design onboarding experiences that drive activation and customer retention. With a background in marketing, UX design, and software development, he has advised companies like Zapier, Appcues, and Mixpanel, creating actionable strategies that boost MRR and customer success.
If I could wave a magic wand, it would be to get teams to communicate more and agree on what success looks like for the user. It’s a challenging problem, but that’s where I would start.
Ramli John
Dani Grant is the co-founder and CEO of Jam.dev, a developer tool that allows for faster communication between product and engineering about bugs and fixes. Today, Jam has captured the hearts of huge clients like Unilever, Staples, T-Mobile, and Dell.
The best PLG strategy is to have such a great product that people can’t help but talk about it to others.
Dani Grant
Anuj Adhiya is a growth expert and author of Growth Hacking for Dummies, with deep experience guiding SaaS companies on implementing product-led growth strategies. Known for his data-driven approach, Anuj has helped startups scale by building growth frameworks that drive user acquisition, retention, and long-term engagement. He’s a frequent speaker on growth and PLG, and his insights help product teams unlock sustainable growth by aligning data, experimentation, and strategy.
Growth, in general, is a team sport. It’s cross-functional by definition, and this is especially true in the product-led world. You’re bringing together content, education, community, product, sales, and many other teams that must work in harmony.
Anuj Adhiya
Resources From This Episode:
- Subscribe to The Product Manager newsletter
- Check out this episode’s sponsor: Wix Studio
- Connect with Ramli, Dani, and Anuj on LinkedIn
- Check out Delight Path, Jam.dev, and Techstars
Related Articles And Podcasts:
- About The Product Manager Podcast
- Your Guide To The Fundamentals Of Product-Led Growth
- How To Growth Hack Even If You’re Not A Growth PM
- 6 Product-Led Growth Examples: The Companies Doing PLG Right
- How To Harness The 3 Growth Loops
- Measuring Product-Led Growth: The Metrics That Matter & How To Use Them
- The Product Led Growth Flywheel: Required Reading To Win At PLG
Read The Transcript:
We’re trying out transcribing our podcasts using a software program. Please forgive any typos as the bot isn’t correct 100% of the time.
Hannah Clark: If we were to rank the top paradigm shifts in product in the last 10 years, I'd reckon AI and product-led growth would sit at the top of that list. Both concepts really come down to amplifying efficiency, doing more with less. But the real cheat code is bringing them together and using the time saving power of LLMs to enhance your PLG strategy.
In a recent panel event, How to Use AI to Supercharge Product-Led Growth, we brought together three awesome PLG experts — Ramli John, the founder of Delight Path, Dani Grant, the CEO of Jam.dev, and Anuj Adhiya, the author of Growth Hacking for Dummies. We got the three of them talking about the ways product teams can leverage AI technology right now to boost every phase of the user journey.
What I loved about this event was how each panelist's expertise came together in such a complementary way, offering lots of practical ways to adapt and adjust your growth strategy using AI tools. And they were just super nice people. Let's jump in.
So today's session is going to be focusing on how to use AI to supercharge product-led growth. And we'll be speaking with some amazing voices in the space. We have a really exciting lineup today. I'm really excited to introduce them.
We've got Ramli John, who's the author of Product-Led Onboarding. He's also a renowned expert in PLG strategy. Ramli brings a deep understanding of how to turn prospects into passionate users. We've got a little jeopardy question for Ramli today.
So Ramli, you've been called the onboarding wizard by some of the biggest names in SaaS, and your book Product-Led Onboarding has been a game changer for countless product teams. We're really just pumping your tires right now. If you can wave a magic wand and instantly fix the most common onboarding mistake you've seen, what would it be?
Ramli John: Thanks so much, Hannah. I would say it's actually not related to the product. The biggest problem around onboarding is often internal friction, not product friction. And what I mean by that is product does their product thing inside of the product, marketing does their onboarding emails, and then customer success does their customer onboarding thing.
And they don't talk to each other. This happened even while we were working at an onboarding company called Appcues, which makes product adoption software, and we had that same issue. So if I could wave a magic wand, it would be to get the teams to talk to each other more and agree on what success looks like for the user, which is a hard problem. That's where I would start.
Hannah Clark: Man, siloing is just one of those perennial issues. Well, thanks for that.
We also have Dani Grant joining us. She's the founder of Jam.dev and a former product lead. Dani is known for using product data to design seamless, high impact user experiences, and she's also just such an awesome person. So, Dani, thanks so much for joining us.
Question to pass to you. So, Jam has skyrocketed to 150,000 users at 32 different Fortune 100 companies, which is incredible. But you're also speaking at events and conferences, and you're active daily on LinkedIn. Honestly, I'm jealous of how you manage to make all this work. So what's your secret to getting all these things done and staying sane?
Dani Grant: First, just, everyone here should read Ramli's book. It is so good. My co-founder read it first in our company and then told me, you have to read it. I read it. Now it's required reading for our growth product team. And the thing it will completely change your mind on is what onboarding is.
So we all, product managers, we all think about onboarding all the time. But we all think about it as from when you sign up to when you've used the product. And Ramli's book will show you that onboarding starts a lot before and ends a lot later and focusing on onboarding in that way will change the outcomes of your product like it has for us.
Anyway, Ramli's brilliant. Read his book. As far as speaking, we are so lucky. All of our users are builders. They are out in the world trying to change the world by using software. And so we're over here building our company. Our users are over there building their companies. And so our job is to share what we're learning, building Jam with everyone else, building their things.
And so we end up posting learning lessons online, joining things like this. It's such a privilege and an honor. So thank you for having me.
Hannah Clark: I appreciate you taking the time to pump up another panelist. That's so awesome. See, I told you guys, she's a great person.
We also have Anuj Adhiya joining us today. So Anuj is a growth expert and author of Growth Hacking for Dummies. He's written for productled.com, and he's also got deep experience guiding SaaS companies on implementing product-led growth strategies. So really honored to have you here with us today, Anuj.
During our pre-call, you shared that you're planning on setting a Guinness World Record by gathering, and I think this is just so cool, the largest number of people wearing party hats in Boston. I think we need some more context here. Can you tell us more about what you're doing and how people can get involved? Where do we send the party hats?
Anuj Adhiya: Firstly, thanks for having me. This is such a fun group to be part of. And yes, I think on the surface, it feels like it has nothing to do with product-led growth, but it really does because it's a large experiment.
So there's this party invite app that I'm consulting with called PartyClick. If you want to go check it out, it's the easiest way to set up an event. And we were just trying to think of ways we can put this in front of more people. A lot of ideas came around, especially things like, oh, we should do our own celebrity lookalike thing, like the Timothee Chalamet thing that just happened. And I'm like, yeah, that's okay.
But you know, what else can we do? And then somebody on the team was like, don't they have world records for largest gatherings? Yes. That's what we should do. So I went and looked up the Guinness Book, and sure enough, there's a world record for the largest gathering of people wearing party hats. I'm like, great, this goes with the name of the product: party hats, PartyClick.
Great. We should just do this. So literally this week, I'm in the process of getting through the application with the Guinness Book and begging and pleading with the city of Boston to let me have 2,500 people in Boston Common. Let's see how close I get to achieving that goal. And the real reason I'm saying this publicly is to hold myself accountable, and to shame me if I don't make this happen.
Hannah Clark: So if you're in the Boston area, bring your party hats. What was the date again in December?
Anuj Adhiya: We're thinking December 22nd, so.
Hannah Clark: Okay, well now you got it.
Anuj Adhiya: For anyone in the area, we thought about it, and if I don't make it happen, you know who to throw the brick at.
Hannah Clark: So moving on to setting the stage, the intersection of AI and PLG.
This question is for Ramli. As a general question, has PLG become essential for the SaaS industry? We see it as a buzzword. Where are we at with that right now?
Ramli John: I would say it depends. Yeah, it depends, which sucks as an answer. But it depends on how you define product-led. If by product-led you mean having a free trial or freemium, is PLG necessary? I don't think so. I actually suggest startups start with high touch and get close to their customers. If by product-led you mean removing any unnecessary friction and creating a great experience for users, then yes.
The standard and bar for what people expect from a product experience has grown exponentially compared to 10 years ago, when something annoying was just normal. Now it's: if this is annoying, I have a hundred other options I can jump into. There are a ton more products out there. And I really do believe that creating a great experience for end users is going to be critical in terms of retention and activation and all the other things we're going to be talking about later, when we touch on how AI will affect the rest of the funnel.
If that's the definition of product-led, then yes, I do think it's essential. If you mean a free trial and freemium, then no, I don't think so.
Hannah Clark: Wow. Nuanced answer. Okay. Does anyone have anything that they wanted to add about just like the relevance or state of PLG today?
Dani Grant: It's so powerful.
Ramli John: I'll just add that I see a lot of founders actually asking the wrong question, because the question they seem to ask is how can I be more like Slack, or how can I be more like some other company, when really the question they should be asking is how can I use my product better to serve the end needs of my customer or my user, right?
And that can take any and many forms. And that opens up far more powerful avenues for implementing a product-led approach than just trying to copy something that's never going to work for you.
Hannah Clark: So true.
So let's move into where AI is entering the space. This one is for Dani. So how would you say AI is poised to supercharge PLG strategies? I'm not sure if you've got any stories from how you guys have leveraged it at Jam.
Dani Grant: We're about to go into all the tactics, but at a broad level, in PLG your product does the selling, and selling is something that is more effective when it's personalized. It takes a lot of content, it takes a lot of convincing.
And this is actually something that AI is really good at. Companies have been trying to do this forever, like machine learning for personalization, super segmentation with lots of content. It's now easier than ever to do this really well. You have to understand large swaths of unorganized data.
So I'm really excited to dive into the tactics. This just feels like something AI is really good at. One last thing I'll say is that the best PLG strategy is just to have such a great product that people can't help but talk about it to others. And with AI, you can now build even more powerful features for your users.
And so that's also a really big part of the puzzle, I think.
Hannah Clark: Okay, so we'll move on to building the foundation. So AI's role across the user journey. I think this is going to be the section that takes up the bulk of the session. So definitely a must listen. Now is the time to put away the phones. So let's walk through each stage of PLG and show how AI can support each.
So let's start with the acquire stage. Dani, did you want to chime in with how your team is using AI to power that acquire stage?
Dani Grant: There are two things to mention. The first is maybe the most obvious, which is that in PLG, your customers find your product because the product has sold itself in a way, and they onboard on their own.
And one place where you have to generate a ton of content is SEO, and AI can be really helpful. Unfortunately, all of us have seen what this looks like when it's bad. It just looks like AI mumbo jumbo and it doesn't actually rank. But AI is actually really helpful in SEO. So here's tactically how we use it.
AI writes the outline using SEO best practices. You literally take an article about SEO best practices, you give it to the LLM, and you say, write me an outline that would follow all these best practices. AI edits your SEO or gives you recommendations based on best practices. We use Perplexity to search for the tools we're going to mention: get me all the prices of all these tools, whatever.
So it's very helpful in that way. But beyond that, we're always thinking about how we build content that engages our audience online. Our audience is developers, and in web development there's so much going on. The space is moving really quickly. And so we wanted to be able to do content around that for our audience.
So we thought, okay, let's create a newsletter and a podcast about what's happening this week in JavaScript. If that's interesting to you, by the way, you can go to thisweekinJavaScript.com and every week, in four minutes or less, get the week's JavaScript news. And we're a tiny startup team. We can't record a podcast each week on our own, and AI is really helpful for us; it allows us to write great content.
We have over 50 percent open rates on these newsletters, and the podcast is really well listened to, and AI really helps us do that. So those are two ways that we use AI for the acquire step of PLG.
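For anyone who wants to try that outline step, here's a minimal sketch assuming the OpenAI Python SDK; the model name, file path, and example keyword are illustrative placeholders rather than Jam's actual setup.

```python
# Minimal sketch of the outline step: hand an article on SEO best practices
# to an LLM and ask for an outline that follows it. Assumes the OpenAI
# Python SDK (pip install openai) with OPENAI_API_KEY set; the model name,
# file path, and target keyword below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

best_practices = open("seo_best_practices.md").read()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{
        "role": "user",
        "content": (
            f"Here is an article about SEO best practices:\n\n{best_practices}\n\n"
            "Write me an outline for a blog post targeting the keyword "
            "'how to report a bug' that follows all of these best practices. "
            "Return H2/H3 headings only."
        ),
    }],
)
print(response.choices[0].message.content)
```

The same pattern works for the editing pass Dani mentions: paste the draft back in alongside the best-practices text and ask for recommendations.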
Hannah Clark: Super cool.
Ramli John: I think the other place I've been using AI for acquisition is repurposing content.
So, taking a podcast episode and plugging it into Castmagic, which is a tool for summaries, and it outputs blog posts, Twitter posts, LinkedIn posts, and newsletter posts that you can share. It needs a little bit of massaging; it's not perfect. But I think that's another area in content where I've seen AI work in terms of acquisition.
How do you squeeze more juice out of the lemon? Is that the saying? I'm not entirely sure. How do you get more bang out of your content, essentially. That's one way I've seen AI be used for acquisition.
Dani Grant: We actually do the exact same thing. One of our core beliefs is that founder-led sales in 2024 is happening largely online.
It used to be the case that you'd have to get a warm intro, and then you'd have a first meeting where the founder introduces themselves to the customer. Today that's happening passively online as people are scrolling LinkedIn. And so you're constantly sharing how you see the world and what changes you're trying to make in the world, passively, online.
And we use AI for part of this, and it's just repurposing content. What we'll do is, if someone on the team goes on a podcast somewhere, we grab the YouTube link and put it into a tool like OpusClip, which will clip up the video, and it either gives us video clips to share online, or it pulls out video clips of things that were aha moments.
And that gives us nuggets of things to talk about and write about later. That's been really helpful in the acquire stuff too.
Hannah Clark: I think that's a huge thing right now. I think both of you spoke to this, being able to repurpose content as much as possible and yeah, get as much juice out of the lemon as Ramli put it. That's super awesome. Okay, I think we can move on to the activate stage then.
So Ramli, you're the onboarding guy. What would you suggest as far as AI-enabled practices to support that activate stage?
Ramli John: Yeah, this is actually something I found another company doing while we were in a conversation about how they're using AI in their onboarding.
One of the challenges with product-led, or opportunities as well, is that you really have to have a strong user research or customer research muscle. With a sales-led approach, you're in direct contact with the customers and you're hearing the objections. When your product is the one selling, you don't hear those objections; people just leave.
So in that case, how do you get those valuable insights, the why that the quantitative data can't show? One way they're using AI, which I thought was cool: they have a hybrid approach, with some sales, so they take all their sales call recordings, and they also have a high-touch onboarding experience for enterprise companies.
They take those recordings and plug them into ChatGPT to train the large language model. Once it's plugged in, they've started using it to create things. One of them, which I thought was cool, is a sales-to-customer-success handoff document: here are the things that you should do.
You can ask it: can you create a five-email onboarding sequence based on the pain points from the sales calls and customer onboarding calls? Since it's been trained on the objections raised in the sales calls and the kinds of questions and confusion during the onboarding calls, it's more aware of what your customers are actually going through.
I suggested that they also plug their most requested support tickets into that chatbot and feed it information that is valuable to the activation phase, so that when you write that sequence or that product tour, it can really help out.
I think that's one way: plugging things into ChatGPT. I've also seen more advanced AI tools where you plug in your data and they figure out retention metrics. I'm seeing tools like Amplitude or June.so where you plug in your data and it tells you, hey, if your most successful users do X, Y, and Z, 30 percent stick around. I'm going to take that with a grain of salt, of course, but it is valuable information from the data that you have, especially if you have a large enough user set. If you have 10 users and plug them into this model,
it's not going to have enough data to figure that out. So that's a caveat to that kind of analysis: if you do have a big enough data set over time, then you can plug it in and figure out some of that information.
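The company Ramli mentions worked inside ChatGPT itself, but the same workflow can be sketched against the API. A rough version, where the transcript folder, model name, and prompts are all assumptions:

```python
# Sketch of the workflow Ramli describes: feed sales-call and onboarding-call
# transcripts to an LLM as grounding, then ask for a five-email onboarding
# sequence tied to real objections. The calls/ folder, model name, and
# prompts are illustrative assumptions, not the company's actual setup.
import glob
from openai import OpenAI

client = OpenAI()

# Concatenate call transcripts; long histories may need chunking or
# summarizing first to fit the model's context window.
transcripts = "\n\n---\n\n".join(
    open(path).read() for path in sorted(glob.glob("calls/*.txt"))
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": (
            "You are a customer success lead. Use only pain points and "
            "objections that actually appear in the transcripts provided."
        )},
        {"role": "user", "content": (
            f"Call transcripts:\n\n{transcripts}\n\n"
            "Draft a five-email onboarding sequence. For each email, name "
            "the pain point it addresses and quote the call it came from."
        )},
    ],
)
print(response.choices[0].message.content)
```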
Hannah Clark: Yeah, makes sense. Does anyone else have any kind of tips that they would want to chime in with on supporting the activation stage of PLG?
Dani Grant: This is obvious, but once you figure out that for good activation the user has to take the following steps, then you have to figure out a bunch of ideas to try to get those users to do those steps, right?
So for us, we consider a user activated if they create four bug reports with Jam. And so we're always trying to think, how do we push more of our users to their third and fourth bug report? How can we encourage them and show them the value? There's a big brainstorm, lots-of-ideas moment there, and AI is actually just fine at coming up with a lot of bad ideas that trigger you to come up with the good ideas.
And so that's just another obvious way to use AI in this step, maybe.
Hannah Clark: I do like the idea of generating bad ideas that help you come to the conclusions of good ideas.
Dani Grant: Here's the thing about humans. It's really hard to have pure imagination to come up with an idea from zero. If you're ever tasked with being the first one to write the first draft of something, it's really hard.
But humans are really good at reviewing and reacting, because that's just how our brains are wired. We see something and then it triggers other ideas. And so as long as you have one bad first draft or one bad first idea, you're suddenly really good at the second draft, and so AI is good at giving you the first thing that triggers your brain to think the way that it does.
Hannah Clark: Love that advice. Sorry, I think I might have cut you off a little bit there.
Ramli John: No, I was just adding that the great thing about these large language models is you can train them. So it's, oh, this is bad, and here's why I think it's bad, and you give it some kind of feedback.
I heard this really great quote from a podcast episode with Nathan Barry, who's over at Kit: you have to treat AI like a very early-stage intern, an intern who's a beginner. By giving it good feedback, it actually learns what is good and what is bad. So, as Dani mentioned, when it comes up with a bad idea, give it feedback, and it'll start producing better and better output and results based on that.
Hannah Clark: Or maybe it'll, at least the bad version will be a baseline better. So your ideas are even better. I don't think it works that way, but maybe.
Anuj Adhiya: Ramli, I've been calling it like a drunk intern, but you know.
Ramli John: Oh, that's a better one, because it's very unpredictable.
Anuj Adhiya: But the only other thing I will add here, and maybe this is a segue into the next thing we talk about, is that activation isn't really a separate step from a user perspective, because once they're in the product, they're in the product, right? And I think what's helped me connect the dots downstream and upstream is just thinking of activation as short-term retention. Because once they're in the product, that's the game, right?
How do you keep them around longer, how do you monetize them better, all of that, and how do you connect what they do initially, as Ramli and Dani were talking about, to what happens later. It's all happening along a continuum of retention. It's not a switch in a user's mind, like, oh, I've been activated, now I've been retained. No, it's all the same thing.
Hannah Clark: Yeah. You have to be thinking about things, not just from what you see as the stages, but where the user sees themselves.
Dani Grant: It's possible. You're using the product, and you use it in the beginning stages and you use it at the end. But I do think, not as a product person but as a user, there's a point at which I'm checking something out and there's a point at which I'm using it.
And I think activation happens in the middle. So I do think they are somewhat different from the user's perspective.
Hannah Clark: Anuj, what are some of the use cases you've possibly seen already, or some ideas that you have, around using AI for that retention stage? Now that we've got our user in the product, where are we going from here?
Anuj Adhiya: Yeah, and some of this, I'll say this up front, it'll sound like an expansion of what's already been said in many ways, because I think what's most fascinating to me about retention and PLG is how it's completely transforming our ability to understand and act on user behavior patterns.
And I'll share an example. So many teams that I work with, Unlike I think everybody, even potentially audience just sitting on mountains of product usage data, right? Everybody's got amplitude and heap and mixed panel and whatever, right? But whatever reason, maybe the stage of company, not enough data analysts, whatever it may be, they're struggling to extract enough of the right kind of actionable insights quickly enough.
So I think what's changing the game is combining these traditional analytics platforms, like your mixed panels, amplitudes, heaps, right? With these sort of modern AI analysis capabilities, right? And you can just see it, right? It's almost like people were operating in the dark and now the lights have been turned on.
Because the thing that started to emerge to me, right? And I'm sure many of us have used these product analytics tools for eons. And I thought I was reasonably competent at them, but when I started plugging in some of this data, into whether it was private versions of ChatGPT or things like that, it started identifying these very sort of fascinating micro cohorts.
I think that would be almost impossible to spot manually or based on your level of expertise with any of these tools, right? And so going off of the examples that already be told, right? I think you might discover that users who perform a specific sequence of actions in their first week become power users at three times the normal rate, right?
Like I think there's one example from one customer where, they found that users who both export data and share it with teammates in their first few days. are showing much stronger engagement patterns, like this kind of thing is the kind of thing people are looking for, right? So I think what's really powerful here, I think, is that combination of analysis and automation, right?
Because I think what some of these companies now are starting to set up better, Behavior trigger journeys, right? Because people have always wanted to do this, right? Because that when the system detects a particular user following some sort of high value pattern, it can automatically shift the content or the experience into getting them faster to that path, right?
And so I think in one case for me has gone even beyond what's happening in product because a high potential users in one case got invited to a beta program, or they got invited to a customer advisory board based on their sort of actual usage patterns, right? And that's the thing people want to do to Ramli's earlier point of getting closer to the customer, but they just haven't been able to do it fast enough.
Also, I think for me, the real game changer is using all of these tools to connect the quantitative to the qualitative, right? Because machine learning, yes, you can analyze patterns across support and community and product feedback, but connecting those insights with usage data, right? So like when you maybe combine that with something like Gong or your product analytics, right?
You get I think much deeper 360 degree view of. So not just what they're doing, but why they're doing it, right? And I think that's that of the insight and the speed of that insight, I think is invaluable because you're relying less and less on assumptions and upping the percentage of what's actually happening because of user patterns and needs.
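That export-and-share example can also be sanity-checked with plain analytics code before (or after) handing data to an AI tool. A toy pandas version, with a hypothetical events export, invented event names, and a crude 30-days-later activity proxy for retention:

```python
# Toy micro-cohort check in pandas: do users who both export data and share
# with a teammate in their first week retain better? The events.csv export,
# column names, and event names are hypothetical.
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])  # user_id, event, timestamp

signup = events.groupby("user_id")["timestamp"].min().rename("signup_at")
events = events.join(signup, on="user_id")

first_week = events[events["timestamp"] <= events["signup_at"] + pd.Timedelta(days=7)]
did_both = first_week.groupby("user_id")["event"].apply(
    lambda e: {"export_data", "share_with_teammate"} <= set(e)
)

# Crude retention proxy: any activity more than 30 days after signup.
retained = events.groupby("user_id").apply(
    lambda g: (g["timestamp"] > g["signup_at"] + pd.Timedelta(days=30)).any()
)

# Rows: did/didn't do both actions in week one; columns: retained or not.
print(pd.crosstab(did_both, retained, normalize="index"))
```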
Hannah Clark: Okay, so I'm curious about this: do you or any other panelists have anecdotes or stories of seeing some of this in action? Because it's a perennial issue, combining the quant and qual data and being able to tell that story more effectively.
Anuj Adhiya: I'll jump in with another example from my end, right?
For a lot of teams, I think these exercises are disconnected. There'll be a team that runs the product-market fit survey. Then there'll be a completely separate database that hosts all of the support tickets, and a completely separate database of sales conversations, right? And sometimes it's not clear whose job it is to put all of this together.
And it's just that uncertainty and indecision, not because it's somebody's fault, but just because nobody was told that this is their job, because they're like, oh, I'm a user researcher, or I'm a salesperson, right? So I think there's power in assigning it to, let's call it your AI employee, to take that decision out of your hands and just collate all of this data for you and then present it to all the stakeholders involved.
That's a trend I'm starting to see more of, because the opportunity cost of waiting or indecision is just so high, right? This is the most MVP way: let's just connect a few data sources, or even if you can't do it automatically, let's just take some exports of this data, throw multiple spreadsheets into this thing, and let it elucidate these patterns for us. It's the most MVP way of starting to at least speed up your rate of insights and being able to act on them.
Hannah Clark: Web designers, this one's for you. I've got 30 seconds to tell you about Wix Studio, the web platform for agencies and enterprises. So here are four things you can do in 30 seconds or less on Studio. Adapt your designs for every device with responsive AI. Reuse assets like templates, widgets, sections, and design libraries across sites and share them with your team. Add no code animations and gradient backgrounds right in the editor. And export your designs from Sigma to Wix Studio in just a click. Time's up, but the list keeps going. Step into Wix Studio and see for yourself.
What I'm hearing here is using AI as a way to break down some of the silos between all these departments. So are you suggesting generating a cross-departmental, state-of-the-union report kind of thing, to give everybody in different departments a bird's-eye view of what's going on across areas?
Anuj Adhiya: Yeah, no, absolutely. And I think it's not controversial to say that growth in general is a multiplayer sport, right? It's cross-functional by definition, and that's certainly true in the product-led world, because you're trying to bring together all of these pieces of content and education and community and product and sales, and all of these teams have to work together.
And if you have a growth team of any size, the entire purpose of it is to bring all of the key stakeholders together to understand: what's the current state of growth? Where are the problems? Where are the opportunities? And it's always a challenge for everybody to understand, not necessarily what role they play, but what is the best, most impactful thing they could be doing right now.
And then communicating that, because I think leading growth is as much about, to your point, storytelling and communicating insights rather than just "here's what the data says," so people understand why they should care about this thing. Don't rely on your own abilities alone, right? Get help from an AI to figure out: how do I communicate this data point better to sales versus DevRel versus product, whatever that may be?
And so I think its power in breaking down those silos and getting everybody on the same page is highly underrated.
Hannah Clark: I really like this tactic. We had a podcast episode with Michele Ronsen, who is a well-known UX researcher, and she recommended a very similar kind of process that was a little bit more analog.
It wasn't necessarily AI-enabled, so I like this as an AI-enabled counterpart. But her recommendation was more in terms of UX research: you've conducted all this research, that's great, but then how is it usable to all of the different departments that it connects to? And her recommendation was to frame it department by department: what are these learnings informing as far as the next action items for each department?
So I see this as an equivalent, in which you're using the AI to translate: what are the findings that we're seeing across departments, and what are the action items we can derive to help us work together better as a team? Because yes, you're right, this is an issue we struggle with in every startup and every company of any size: how do we work better cross-functionally? I love this discussion point.
Does anyone else have any notes on using AI to work better cross-functionally? I think it's a whole other area that we haven't explored.
Dani Grant: This is not AI for cross-functional work, but look, why do companies hire product managers, as everyone on this call knows? It's because you want one point person who is going to get to the bottom of things, who owns the whole problem and owns the success of the product, and who is going to do literally whatever it takes to make the product succeed and the project go well.
And so what Anuj is saying, where something is not someone's problem, where someone understood it at a spreadsheet level but not at a user level, is so true. But that's the beauty of PMs, right? There's no PM in the world that says, that's not my problem. They're just there to make the product succeed.
So, not AI related, but one tactic that's been very helpful for us, and maybe is helpful for you all: sometimes it can be a leap to go from a spreadsheet level to a user level if you don't have user calls set up anyway. The best thing you can have is a drumbeat where you're talking to customers all the time, and then you just plug in a few questions that you have.
But that's tricky. How do you do that? So we have a little bit of automation around this that I recommend for everyone: every single week, the top 100 users from the week before who haven't heard from us recently get an email from a Jam co-founder asking how things are going. And if they reply, and their reply is very thoughtful, we'll say, oh, we'd love to learn more, would you be up to hop on a call?
And so we have this drumbeat of calls. And what that means is that if a PM is looking at something and saying, this doesn't make sense to me, or, I'm curious to learn more, there is something easy to join instead of having to invent a new motion of customer calls from scratch.
So anyway, one tactic, not AI related, but highly recommended.
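The selection half of that tactic is straightforward to automate. A sketch with hypothetical tables, columns, and a 90-day cool-off; the actual email send is left as a stub:

```python
# Sketch of the weekly pick: top 100 most active users from the past week
# who haven't been contacted recently. The CSV exports, column names, and
# 90-day cool-off are hypothetical; the send itself is stubbed out.
from datetime import datetime, timedelta

import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])        # user_id, timestamp
outreach = pd.read_csv("outreach_log.csv", parse_dates=["sent_at"])  # user_id, sent_at

now = datetime.utcnow()
last_week = events[events["timestamp"] >= now - timedelta(days=7)]
recently_contacted = set(
    outreach.loc[outreach["sent_at"] >= now - timedelta(days=90), "user_id"]
)

top_users = (
    last_week.groupby("user_id").size()
    .sort_values(ascending=False)
    .loc[lambda s: ~s.index.isin(recently_contacted)]
    .head(100)
)

for user_id in top_users.index:
    # Stand-in for the actual founder email ("How are things going?").
    print(f"queue check-in email for user {user_id}")
```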
Hannah Clark: Awesome. Yeah. Yeah. It doesn't have to be AI. We're just looking to grow.
All right. So we'll move on to the expand stage here. So Anuj, if you want to take the lead on this one as well, anything you wanted to lead with on expanding? Then we can move into Dani's tactic as well.
Anuj Adhiya: Sure. Like I said, everything's a bit of a continuum for me, right? But if you think about the traditional expansion playbook, how does it work? You wait for usage indicators, maybe you have a customer success team doing some quarterly reviews. Quite frankly, I've worked with some teams that are just hoping to catch expansion signals.
But I'll give you a practical example. There's this one SaaS company that's using Gong to analyze customer interactions across every touchpoint: sales calls, support tickets, QBRs, everything. And the key is they're trying to train the system to look for very specific expansion indicators.
So it's not just picking up on very obvious phrases like "we need more licenses." It can start to identify patterns around discussions of, say, adjacent use cases, a mention of a new team member or department, or a conversation about a pain point that could be solved with some additional capability. And when you start to combine that sort of conversational data with product usage analytics, that's what's allowing this team to create, let's just call them expansion intent scores, based on multiple signals, right?
So then, when accounts start to hit certain usage thresholds, like in this one case they were approaching an API limit, or they were attempting to access some premium features, and that correlates with the very specific conversation patterns they've also analyzed, that starts to create a really clear signal for expansion opportunities.
All of this to say, it's not just about identifying expansion opportunities, but about making the process also feel more natural and value-aligned for the customer.
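To make the scoring idea concrete, here's a toy version; the signal names, weights, and threshold are invented for illustration and aren't taken from the company Anuj describes:

```python
# Toy "expansion intent score": a weighted blend of conversational signals
# (e.g., from call analysis) and usage signals (e.g., from product analytics).
# Signal names, weights, and the routing threshold are invented.
SIGNAL_WEIGHTS = {
    "mentions_adjacent_use_case": 0.25,  # from call/ticket analysis
    "mentions_new_team_or_hire": 0.20,   # from call/ticket analysis
    "pain_point_needs_premium": 0.25,    # from call/ticket analysis
    "near_api_limit": 0.20,              # from usage analytics
    "explored_premium_features": 0.10,   # from usage analytics
}

def expansion_intent_score(signals: dict) -> float:
    """Sum the weights of the signals that fired (0.0 to 1.0)."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

account_signals = {"mentions_adjacent_use_case": True, "near_api_limit": True}
score = expansion_intent_score(account_signals)
if score >= 0.40:  # arbitrary cut-off for routing to customer success
    print(f"Flag account for expansion outreach (score={score:.2f})")
```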
Hannah Clark: Fair enough. Yeah.
Dani has a process that she was going to walk us through as far as using call transcripts to generate first-draft follow-up emails, which I alluded to earlier.
Did you want to walk us through that process, Dani?
Dani Grant: I mean, it's pretty self-explanatory. I'll say that as product people, at least I really believe the product should do all of the heavy lifting. But really, to change people's behavior, you also want a face that you can trust, and so a lot of the expand step is very human-led versus product-led.
In a really well-running motion, you have a lot of back-to-back calls helping teams expand their usage. You want to follow up from those calls with something thoughtful. And again, it's hard to start from no draft. It's a lot easier to start from a first draft: taking a call transcript, giving it to Claude, and saying, what were the three main points this person communicated to me?
And then using that to draft a "here's what we heard" is just a lot easier than a blank slate. Especially if you have eight back-to-back calls and you're doing this at the end of the day, when you don't fully remember, or your brain is a little fried.
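A minimal sketch of that flow, assuming the Anthropic Python SDK; the model name and transcript path are placeholders:

```python
# Sketch of the follow-up flow Dani describes: give Claude a call transcript,
# ask for the three main points, and draft a "here's what we heard" email.
# Assumes the Anthropic Python SDK (pip install anthropic) with
# ANTHROPIC_API_KEY set; model name and file path are placeholders.
import anthropic

client = anthropic.Anthropic()

transcript = open("call_transcript.txt").read()

message = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=600,
    messages=[{
        "role": "user",
        "content": (
            f"Call transcript:\n\n{transcript}\n\n"
            "What were the three main points this person communicated to me? "
            "Then draft a short 'here's what we heard' follow-up email based "
            "on those points."
        ),
    }],
)
print(message.content[0].text)
```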
Hannah Clark: I appreciate that. Yeah, anything to save the old brain at the end of the week.
Okay, we'll move on to section three here, which is from vision to execution. So this is all about building an AI-enhanced PLG strategy. I'm actually going to throw it to Ramli because we haven't seen him for a couple minutes here. I would love it if you could take us through some of the common pitfalls and misconceptions around PLG, around using AI to support it.
What are some of the detrimental pitfalls that you've seen in your work?
Ramli John: I would say, in terms of AI itself, the output is only as good as the input. So if you feed it nothing, or garbage information, then you're going to get garbage output as well. And I think, based on the conversation we're hearing and what Dani is saying: don't start with a blank page.
You really have to feed it the right kind of information, specifically things like training it with your best customer experiences and the best calls that you have, and giving it the right kind of context, especially in the expansion stage we just heard about: here are the calls that we've had with them, and here are the documents that we have around them; give me the output based on that.
It's about giving it the right kind of context and the right kind of prompt: hey, we're trying to do this, here's what we're expecting, give me three points. And this goes back to what Anuj said earlier. He called it treating it like a drunk intern; I would say treat it like an intern who's just starting the job. Be a little bit more verbose and a little bit more upfront at the beginning: hey, here's what I'm looking for, here's the output I'm expecting, and here's the context and the information.
I think that goes a long way, rather than being vague about it. I've also seen prompts like: pretend that you're the director of customer success, or the director of product, for this company. Here's the information about the company. Here's the LinkedIn profile of this person. What would I write?
What are the three bullet points that this person might care about, based on the calls that we've had with them? Being very clear, upfront, and succinct about that can be very helpful in terms of using AI for this.
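Put together, a prompt in the shape Ramli suggests might look like the template below; every specific in it (company, background, notes) is an invented placeholder:

```python
# Prompt template in the shape Ramli suggests: assign a role, supply context,
# and state the expected output. All specifics below are invented placeholders.
PROMPT_TEMPLATE = """You are the Director of Customer Success at {company}.

Company background:
{company_background}

Notes from our calls with this customer:
{call_notes}

Customer's LinkedIn profile summary:
{linkedin_summary}

Task: write the three bullet points this person is most likely to care about,
based only on the calls we've had with them. One sentence per bullet."""

prompt = PROMPT_TEMPLATE.format(
    company="Acme Analytics",
    company_background="B2B SaaS product analytics; ~200 customers.",
    call_notes="They struggled to get their data team onboarded last quarter.",
    linkedin_summary="VP of Product at a mid-market retailer.",
)
print(prompt)
```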
Dani Grant: I think one pitfall to add, especially with really junior people on your team.
If you remember being a new person in the workforce or a new person in your role, one of the hardest skills to learn is what "done" means. And the more junior you are, the more done just means complete, versus understanding the impact that really good work can create. And so I think one of the pitfalls with using AI in any of these processes is that for a junior person, done can just mean complete.
So, oh, the AI ran some analysis. But a more senior person would say: and that has surfaced some new questions for us to dive into, which are the next logical things to look at. Or: this actually doesn't quite make sense to me, this must be a hallucination. Or: this draft of an email reads like mumbo jumbo and obviously we shouldn't send it.
And so I think that's a big pitfall. And I think the way to avoid it is to actually just have a lot of conversations about what done means in this team, and what it means in this company, using AI or not.
Hannah Clark: Great career-building advice in general is to constantly ask that question, what does "done" mean, and to keep having that conversation.
Anuj Adhiya: Sorry, can I just add a couple of things, one to what Dani said and one to what Ramli said? First, the point Ramli was making about garbage in, garbage out. A pro tip I've now begun to implement: I've started to realize most people don't even know how to Google properly.
That's almost my way of getting an early indicator of whether, if I'm going to hand off a task to somebody, they'll even be able to prompt properly. So I literally have people run what I think are basic Google searches and just see how they respond to that. That gives me a signal: okay, maybe this is a person who needs a little more training before I let them loose on this sort of system, versus, okay, this person inherently gets how to query a system, so let them into the system first.
The other thing, on what Dani was saying about things seeming complete, or done: this is why I've found great value in one of the first exercises I do, which is to have everybody on a team understand the product's North Star metric. It's critical for everybody to understand that this is how we deliver value to our users and customers,
and everything we do is in service of growing that value. That applies just as much when you're going to prompt a system: that's the perspective the system needs, not just the perspective of the specific task or the specific analysis. Is this in service of that greater thing of growing value for our users?
I think the common theme running through all of this is that whoever is going to interact with these systems needs a bit more context-setting and background, that greater user perspective, before they start interacting with them and extracting whatever they think is an insight.
Hannah Clark: Yeah, definitely a few common themes emerging here, as well as challenges around different teams working together. We're just going to go through a little pre-close here. Normally we'd take this time to tell you about our next session. For those who are engaged with us month over month, just so you know, we are not going to be doing a panel in December.
We're taking a bit of a break for the holidays, so if you notice there's no session and wonder where we are, that's why. We'll be back in January with more of a career-focused session: transitioning into a career in product management will be the focus of the panel.
Registration for that will start in December, and we'll send out a link to our subscribers when it opens. It should be a really great session. It probably won't be as relevant for those of you here, since you're all product managers already, but if you know anyone who's interested in the career, who's been asking you a lot of questions and is curious about making that jump, please let them know. We'd love your help spreading the word.
We'll get right into the Q&A. Our most upvoted question is from DM: AI can exhibit bias in interpreting nuanced data, leading to misleading conclusions. In the context of PLG, how can we strategically leverage AI to enhance user experiences and drive growth while mitigating these risks?
Anyone want to take the lead on that one?
Dani Grant: The thing about AI is it doesn't solve human problems; you really just have to use your judgment. It's a great tool, but you need a skeptic's mind. As with the pitfall of a junior person thinking something is done, just having an answer doesn't actually solve your problem.
When you're using AI, you have to be even more cognizant, really there, really present with your work, really thinking.
Ramli John: Yeah, I totally agree. This goes into the whole ethics of AI: there needs to be human intervention. I really do believe that where there are biases, there needs to be a human to catch them. AI is just one input to your decision-making process, and you can have multiple inputs.
You can have qualitative data, quantitative data, an AI suggestion; you can ask your CEO, you can ask somebody from customer success. But these are all inputs. Eventually, a human needs to make the decision. It needs to be somebody saying, yes, that's what we're doing, that's where we're going,
based on all of this information, including the input from AI. I would be very cautious about letting AI make the decision for us, because there are obviously biases it might not catch on its own.
Anuj Adhiya: So what I would say is, I think it's really important not to forget how things were done in the past.
Let's call them ground-truth baselines: how did we do things before AI? That gives us a clear comparison point. Because if we don't know how we did it before, how will you even know whether the system is hallucinating, or have the opportunity to ask whether you should poke at it a little more?
Associated with that, I think it's important to have a triangulation approach. Let's pick a situation: say an AI system flags some users as at risk. You can't just take that at face value.
You've got to look at the raw product analytics data, look at feedback, look at whatever the customer success team is saying, and see whether those signals also align when you check manually. That gives you a little more confidence in the system as well. A recent example I came across, and I haven't personally used it, so disclaimer:
I saw Heap Analytics has a great feature where they've explicitly designed for diversity in their training data. In PLG, what that means is they've ensured their AI models learn from users across different company sizes, industries, use cases, things like that.
What that does is help you catch when your AI system might be over-indexing on a behavior pattern from a larger or more active customer base while missing signals from smaller ones. So some tools are catching on to this and trying to account for it as well.
But don't forget the way you've done things before, and remember you can always manually verify.
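A minimal sketch of that triangulation idea: treat an AI flag as one signal and require at least one independent, non-AI signal before acting. All of the data and signal names below are made up for illustration.

```python
# Sketch: never act on AI churn flags alone; require agreement from at
# least one independent, non-AI signal before taking action.

ai_flagged_at_risk = {"u1", "u2", "u3"}    # output of a hypothetical AI model
low_usage_last_30d = {"u1", "u3", "u7"}    # from raw product analytics
cs_reported_concerns = {"u3", "u9"}        # from customer success notes

for user in sorted(ai_flagged_at_risk):
    corroborating = []
    if user in low_usage_last_30d:
        corroborating.append("low recent usage")
    if user in cs_reported_concerns:
        corroborating.append("CS-reported concern")
    if corroborating:
        print(f"{user}: act (AI flag + {', '.join(corroborating)})")
    else:
        print(f"{user}: hold; AI flag has no independent signal, review manually")
```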
Hannah Clark: Good tips.
So the next most upvoted question is: would you recommend using AI in place of traditional support or customer success teams? I feel like we've already answered this one passively, but we can maybe add some more context here.
Those roles are both typically seen as responsible for cohort retention, yet they're two areas where we're seeing explosive growth in agentic AI tools. This is true. This is interesting. Okay, who wants to get into this one?
Dani Grant: I think that the most important thing in 2024 is your users' experience,
because there's more competition than ever, and it's easier than ever to switch tools. Since most companies have some sort of PLG strategy, if someone gets frustrated with your product, it's not that much later in the day that they sign up for a free trial of another product. So user experience in 2024 is the most important thing,
and I think that's the lens for making this decision. If the user experience you want to enable is, we've got a lot of documentation, we want to make it really easy to search, and natural language is a simpler way to search our docs, then having some sort of agentic customer support is actually great,
because it's just a nice UI for your docs. If the user experience you want to provide is, we're here for you 24/7, there's a person on the other side who cares about your problem and is right there with you for every step of the product journey, then that's probably not the experience an agent can provide.
It's possible you want both: your developer audience may not want to talk or get on a call, they just want a quick way to query for information, while your enterprise customers may want a person. So anyway, I think the way to think about this is not through a tools lens, but rather through what the user experience should be.
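As a sketch of the "nice UI for your docs" case, natural-language doc search can be as simple as embedding snippets and ranking them by similarity. This assumes the sentence-transformers library; the doc snippets are invented, and a fuller system might pass the top matches to an LLM to compose an answer.

```python
# Sketch: natural-language search over documentation via embeddings.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Invented documentation snippets standing in for a real docs corpus.
docs = [
    "To invite a teammate, open Settings > Members and click Invite.",
    "Free plans include up to 10 projects; upgrade for unlimited projects.",
    "Export your data anytime as CSV from the Reports page.",
]
doc_embeddings = model.encode(docs)

def search_docs(question: str, top_k: int = 2) -> list[str]:
    """Return the doc snippets most similar to the user's question."""
    scores = util.cos_sim(model.encode([question]), doc_embeddings)[0]
    ranked = sorted(zip(scores.tolist(), docs), reverse=True)
    return [doc for _, doc in ranked[:top_k]]

print(search_docs("how do I add someone to my workspace?"))
```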
Ramli John: I totally agree. I think with the rise of AI, people crave human connection more and more. There's a quote, because I'm working on a new book, from Josh Kaufman, who wrote The Personal MBA. He said there's a paradox of automation: the more efficient the automation is, the more crucial the human experience becomes.
And I think that's so true. In talking about customer success, customer support, and user experience, like what Dani was talking about, people are delighted when there's an actual human chatting with them, face to face like right now. Whenever I go to in-person events, people say, oh, you're real.
You're not a chatbot spitting out answers. You're actually somebody who cares about me, somebody who makes me feel valued as a customer. I think that's going to go an even longer way as companies automate more and more things through AI support, marketing websites, and things like that.
Even emails: I heard KFC is introducing AI emails to get you to buy more fried chicken, which is crazy. But I think it's really going to be more important to have humans involved, especially in B2B products. Being face to face with a customer really does matter.
Dani Grant: I think at the same time, the bar for a quality AI experience is getting higher.
Think of your own experiences using products out in the world. If someone introduces an AI feature or AI chat support and it's a low-quality experience, you actually resent them even more than if it were just a normal feature or normal chat support. There's something about, oh my god, they're just on this buzz train.
It's just hype, it's just marketing. That feels bad in your gut. So I think if you're going to use AI for core functionality of your product, like customer support, and the customer knows it's AI, it had better be really darn good. The bar to ship that is actually even higher than it would be otherwise.
Hannah Clark: Our last question for the day: how do you balance personalization through AI with user privacy concerns?
Dani Grant: Follow your privacy policy. If you're thinking, this is wrong, don't do it. Just don't do it. For example, if your privacy policy says you're allowed to look at top-level usage metrics in order to improve your product, then an AI can help you with that.
If your privacy policy doesn't say you're allowed to look at all user data, then don't. User trust above all else.
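One way to operationalize that: encode what the privacy policy permits as an explicit allowlist and strip everything else before any AI pipeline sees the data. This is a minimal sketch; the field names and categories are hypothetical.

```python
# Sketch: keep only fields the privacy policy explicitly permits for analysis.
# E.g., the policy permits top-level usage metrics, not content or identity.
POLICY_ALLOWED_FIELDS = {"plan", "weekly_active_days", "features_used_count"}

def redact_for_ai(user_record: dict) -> dict:
    """Drop every field not covered by the privacy policy allowlist."""
    return {k: v for k, v in user_record.items() if k in POLICY_ALLOWED_FIELDS}

record = {
    "email": "jordan@example.com",         # identity: not permitted
    "plan": "free",
    "weekly_active_days": 2,
    "features_used_count": 5,
    "last_document_text": "confidential",  # content: not permitted
}
print(redact_for_ai(record))
# {'plan': 'free', 'weekly_active_days': 2, 'features_used_count': 5}
```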
Hannah Clark: I don't think that there's much else to say about that.
Ramli John: There's definitely some nuance to it. I've gotten emails like, oh, Ramli, I noticed you did X, Y, and Z in our product.
And I think there is some value there, where it's, oh yeah, okay, I'm stuck, I need help. But I would agree with Dani. There really is a privacy concern where it can feel like, oh, I'm being stalked. Then again, I'm in your house, I'm in your product,
so maybe I do expect, a little bit, that you can see what I'm doing.
Anuj Adhiya: I think this one goes to a question I've been pondering. I don't remember who said this to me, but they called their attempt "respectful personalization," and the key for them was to be really transparent about the value exchange.
People are going to expect that we use AI to personalize experiences moving forward, but I think it'd be useful to follow simple rules: if you're going to collect a particular data point, internally you should have a really clear, demonstrable benefit to the user, with that as your guiding light first.
Hannah Clark: These are all very insightful. Unfortunately, we are out of time. We always do this; we always leave ourselves not quite enough time to get through all the questions. Thank you, everybody, for such an engaged session, for participating in the chat, for all of your questions, and for joining us today.
It was really great of you to make time. I hope the session was helpful for everybody who attended, and I want to give a warm thank-you to our panelists: Ramli, Anuj, Dani, you've been amazing. It's been a pleasure to have you with us. Thank you, everybody, and have a great day.
Thanks for listening in. For more great insights, how-to guides and tool reviews, subscribe to our newsletter at theproductmanager.com/subscribe. You can hear more conversations like this by subscribing to The Product Manager, wherever you get your podcasts.