Data-driven community management

With panellists Richard Millington, Amanda Boyle, and Josh Bricel.

With insights from Richard Millington, Amanda Boyle, and Josh Bricel, the panel discusses everything from defining metrics that matter to balancing qualitative and quantitative insights.

They explore practical ways to build data literacy within teams and prioritize metrics that resonate with product, marketing, and customer success stakeholders. Whether it’s building trust with engineering or aligning community impact with company goals, this episode is a blueprint for making data meaningful. Perfect for DevRel pros looking to back their work with smart data-driven strategies.

Watch the episode

Episode outline

01:00 – Combining qualitative and quantitative data: Rebecca and Matthew discuss the importance of balancing qualitative insights and quantitative metrics for a holistic understanding of community impact.

14:00 – Building a community data system: Richard shares steps for designing a community data system, emphasizing the importance of clearly defining questions before diving into metrics and data.

27:00 – Trust-building through data with product teams: Amanda explains how the GitHub Discussions team uses data to anticipate user needs and build trust with product teams by identifying frequently asked questions and key support content.

33:30 – Proving community value to different teams: Josh describes the challenge of proving community value to stakeholders, from showing ROI to providing actionable insights for marketing, product, and support teams.

38:30 – Caution in making data-driven claims: Richard and Josh discuss the potential pitfalls of correlational data and why clear, precise language is essential to avoid overstating community impact without causational proof.

49:45 – Improving data literacy in community management: Amanda and Richard stress the need for community professionals to build data literacy to accurately analyze and interpret metrics, ensuring credible and actionable insights.

54:30 – Practical steps for asking the right questions: Richard emphasizes the value of aligning data collection with specific business goals and prioritizing questions that directly inform actionable decisions for community growth and engagement.

Transcript

Matthew Revell: Hello and welcome to DevRel Round Table. My name is Matthew Revell and I'm joined by my co-host,

Rebecca Marshburn: Rebecca Marshburn, the Head of Community at Common Room. Hello, Matthew. Thanks for having me on again as co-host.

Matthew Revell: Yeah, of course. Thank you for joining me. So today we're going to talk about data and how data enables us to tell a story about communities and also to understand what's going on and to I guess make predictions. But we'll look at that as we go through the programme today. Rebecca, I think we've got a really solid lineup of people for this particular topic, and each of them works with data day in, day out, particularly related to community. So I'm very interested to hear what they've got to say. But obviously with you working at Common Room data's kind of your lifeblood too.

Rebecca Marshburn: Indeed, it is. You mentioned this idea of making predictions, but it's also understanding how to tie together what we're doing as an input with how that's affecting an output, and, as we would say, how that drives a business forward, how community is not just a nice-to-have but a must-have in terms of business success. And in order to actually tell that story, we also need the data behind it. So it's not just qualitative, but quantitative as well. And so I'm really excited for this lineup of panellists because they've all approached data in a very precise, scientific-methodology type of way.

Matthew Revell: Cool. So shall we bring in our guests then, because I'm really excited to put them on stage and allow them to have their say. So let's start with Richard. Richard Millington,

Rebecca Marshburn: Richard Millington. Matthew brought you in a little quick because I love to say a little something about each person.

Richard Millington: I can leave if you want.

Rebecca Marshburn: Yeah, you want to lead? You take the lead.

Richard Millington: Oh, I said I can leave if you want. I'm happy to lead. Sure. Hi everyone, my name is Richard. I'm the founder of FeverBee, and we very much help clients take a data-driven approach to building their communities with a lot of research, a lot of community intelligence, and really doing what I think is quite cutting-edge stuff to prove the return on investment of a community.

Rebecca Marshburn: And I think, Richard, you were very clear about you and what you do, but I want to add a little emphasis to it. You are the founder of FeverBee, as you said. You're also the author of three different books around community, talking about data and how to apply it to community strategies. And you're constantly speaking across the world at conferences, helping people understand how to look at data, interpret it, and then apply it so that they can build better communities. So I just want to put the little exclamation point on top of how you described yourself.

Matthew Revell: So great to have you with us, Richard. I'm going to bring in Amanda Boyle from GitHub. So Amanda, welcome to the DevRel Roundtable. Thank you for joining us. You're joining us from the Pacific Northwest in the US, and your day-to-day is looking at the data around community in GitHub. Leading up to that in your career, you were doing community things at Tableau, and you've been at GitHub for a while now. But for you, it feels as though, from what I understand of you, your day is all about looking at what the data means and how we can make it tell a better story about our community and the communities that use GitHub.

Amanda Boyle: Yes. So my team is focused just on the GitHub community discussions. So we launched it, we actually migrated into the product last year in July. And so a lot of the work that we do is incorporating product feedback, questions and answers, understanding users' needs and what's trending, and then how can that support our customer success efforts? How do we make sure that people are given the resources they need and are also helping each other? Ultimately, at the end of the day, if you can create and enable a culture where users are helping users and they don't get frustrated and give up, that's ultimately what we hope to do. But yeah, my background was four years at Tableau leading their amazing Zen Master and Ambassador programmes, now the Visionaries, and I mean, if you learn data at Tableau, you're pretty much set after that. So it was pretty great.

Rebecca Marshburn: And Amanda, I've had the great pleasure of meeting you in person at a Seattle Common Room Neva Meetup last year, and I loved hearing what you did there or hearing about what you did when we were there together. So I am super excited to see and hear what you share today on the call.

Amanda Boyle: Yeah, it's great to see you again. I was surprised. So this is pleasant.

Rebecca Marshburn: This is pleasant. Our last panellist is Josh Bricel, someone who is near and dear to my heart because he is our own data and analytics lead at Common Room. And if there's one person who has truly taught me the importance of precision in terms of turning numbers into language, it is this man. We have worked together both on our 360 community-led growth report, where we established benchmarks and insights for community leaders based on 141 different communities that use Common Room, as well as the developer relations compensation report. And we would look at the numbers together and I would write a sentence and he'd be like, that's not quite exactly what that says. And I think it's super, super important that we also look at the way we write things and the way that numbers get translated into action based on the language interpretation in the middle. That is really important. So I'm glad to have him here too, to represent that extremely precise point of view when it comes to looking at the data, seeing the data that we see across our communities in our roles, and then understanding what that means when we say it out loud to someone else. Hello, Josh.

Josh Bricel: Hi Rebecca. And thanks for the intro. I feel like you took everything I was going to say and that's great and I appreciate it. But yeah, happy to be here and happy to talk about data and as Rebecca said, I do pretty much everything on the data gauntlet at Common Room, so excited to be here and talking about data.

Rebecca Marshburn: Cool. Well then should we talk about data?

Josh Bricel: Do it? Sounds good.

Rebecca Marshburn: Alright, so I'm going to kick us off with a general question to help set the context for what we're talking about when we're talking about community data and community metrics. So this is to establish a baseline, right, this foundation of where we're all working from. Richard, I'm going to start with you and then we'll go around the horn. What are the core metrics that drive community success? So if we're someone starting off and we're like, I want to start using data to understand what I should do best in my community, granted it's probably not one size fits all, but if there are a few core metrics that we should be looking at in terms of community success, maybe that's longevity, maybe it's health, maybe it's engagement. What are those core metrics that you would consider the first place, the integral place, to start?

Richard Millington: Yeah, I don't think metrics drive success. I think obviously actions drive the success of a community. I think where metrics really play a role isn't just in looking at a community and seeing what's happened, but in looking very deeply into what members need and what they want. And so that means combining the quantitative data with the qualitative information. So on the qualitative side, I think surveys, interviews, UX research are obviously very important to understand exactly what members need. But what we're also looking to find out when we look at a community is where the drop-off points are. What can we fix, what's working, what's not working? And really trying to take it in a very, I hate the term, but a very holistic approach. So we look at the community in its entirety, figure out where the issues are and then systematically repair each of those issues.

Obviously to get support, you need very clear metrics for return on investment and what kind of impact you're having. But I think maybe that's a question for a little later on. The key things that we really look at from a data perspective are, one, does the organisation have metrics that they think are right for them? Two, does everyone agree on what these metrics are? And three, and maybe the most challenging one, are they getting and analysing this data in a valid way? It's so easy to bias the results of that. And so those are the kind of things that we think about when we are looking at what kind of metrics drive any kind of success.

Rebecca Marshburn: Thank you for that. I find that answer holistic in a very good way, in the happy way of using the term. Amanda, because you specialise in one specific area at GitHub, I'm wondering if what Richard just said resonates with you in terms of how you approach what your core metrics are in your specific area versus the larger GitHub organisation as a whole?

Amanda Boyle: No, absolutely. And Richard, it's really fun to get to actually talk with you. I've been following your work, so this is great. No, I think absolutely, totally echoing that. I think the sentiment, the qualitative versus quantitative, has to be brought in. We can look at the data, but you also have to take the time to really read and understand what the pain points are. If you're advanced enough to have a machine learning model, that's awesome, but even just taking the time to read some of the top trending issues, you're like, okay, what actually is going on here? I think it's super important for our business as a whole. GitHub is massive, a community of communities, so we're just one slice of that larger effort. And so it's really important that we work closely with the groups that are focused on maintainers or open source programme offices or all these different audiences that are kind of segmented.

Our space is one of the few that I think anybody goes to, whether you're an enterprise user, a free user, a student, everybody kind of uses our discussions right now, and we have done very little to promote them. I'm kind of terrified for the tidal wave when people actually know about it. We have a very small team, but for us, for the business, we use engaged contributors and engaged users, with very specific criteria, across a lot of the different community programmes. As a baseline we're like, okay, how many people are, one, using this discussions experience, and then how many are actually contributing? And then from there we go deeper, but that helps us just know what's happening as a pulse. And then we get into a lot more around top contributors and retention. We segment within our different categories, so product types, to say what's happening in different areas, and we really try to work closely with our counterparts on the product side to help them know: is there something happening with your product, good or bad, that you might not be aware of? And that, I think, is kind of just getting started. There's a bunch packed in there though.
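
To make the baseline Amanda describes concrete, here is a minimal sketch of splitting a discussions activity log into engaged users and engaged contributors. The table, column names and the criteria for what counts as contributing are illustrative assumptions, not GitHub's actual definitions.

```python
import pandas as pd

# Hypothetical activity log: one row per user action in a discussions forum.
# Column names and the "contributor" criteria below are illustrative assumptions.
events = pd.DataFrame({
    "user": ["ada", "ada", "grace", "linus", "linus", "linus", "margaret"],
    "action": ["view", "upvote", "view", "post", "reply", "view", "reply"],
})

# Engaged users: anyone with at least one recorded action of any kind.
engaged_users = events["user"].nunique()

# Engaged contributors: users whose actions include creating content.
contributing_actions = {"post", "reply", "answer"}
contributors = events.loc[events["action"].isin(contributing_actions), "user"].nunique()

print(f"engaged users: {engaged_users}, engaged contributors: {contributors}")
print(f"contributor share: {contributors / engaged_users:.0%}")
```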

Rebecca Marshburn: Yeah, I love that it's sort of telling the story even of engaged, what were the two, the foundation engaged

Amanda Boyle: Users,

Rebecca Marshburn: Users and then engaged contributors, right? Those were the two. Yeah, it's sort of interesting there too. I think there's almost a qualitative story you can draw the line between, which you have to then investigate: when there's a drop-off between users and contributors, what is the qualitative story in the middle where people are dropping off? Is it because there's something that they're not getting from us that they need? Or is it just a natural, common type of, well, we're going to see this much? There's a qualitative piece in that gap between the two, which I think is super interesting and I'm excited to dive into. Josh, I saw you nodding along while Amanda was speaking, and so I'm curious, from looking at the numbers and the graphs all day and then putting those together, if this is also what you're seeing in terms of large data sets across communities: what types of data people want, what they're asking for from you, and then what you're seeing when you pull data. What would be the right core pieces to look at?

Josh Bricel: Yeah, it's pretty much exactly that. What I've been focused on recently is the combination and the intersection between community and CRM and product data. And so we're getting to the point now where we can actually look at them all together for the same individual or organisation, or at the trends and that kind of stuff. And so it opens up a new kind of realm of what we can find out about the customer journey, to make sure that we're able to meet people where they're at, as well as able to see what the impact is of their engagement in community. So if you think about the last time that you engaged with a new product, there's this whole cycle of how you engage with it, where you start by maybe asking people around you questions about the product, and then you move on to asking the community about the product. And then from there you move into actually using the product, so you have product usage data that's formulated from that, and then you might get to a point where you actually want to talk to somebody in sales about the product too. And so now we're able to start looking at all of those together to really show the value of community in that whole cycle of the customer journey, to make sure that we're able to support people and advocate for more resources for community and more resources for the product. And it's pretty cool. But yeah.
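
Josh's point about looking at community, CRM, and product data together for the same individual boils down to joining the sources on a shared identity. A minimal sketch, assuming each source can be reduced to a table keyed by email; real identity resolution is far messier than a single join, and all names and columns here are made up for illustration.

```python
import pandas as pd

# Illustrative extracts from three hypothetical sources, keyed by email.
community = pd.DataFrame({"email": ["a@x.com", "b@y.com"], "forum_posts": [12, 3]})
product = pd.DataFrame({"email": ["a@x.com", "c@z.com"], "weekly_active_days": [5, 2]})
crm = pd.DataFrame({"email": ["a@x.com", "b@y.com"], "open_opportunity": [True, False]})

# Outer-join so people who appear in only one system are still visible.
journey = (
    community.merge(product, on="email", how="outer")
             .merge(crm, on="email", how="outer")
)

# Example question: do people with community activity also show product usage?
both = journey[journey["forum_posts"].notna() & journey["weekly_active_days"].notna()]
print(journey)
print(f"{len(both)} of {len(journey)} people appear in both community and product data")
```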

Rebecca Marshburn: Alright, well, we are set, I think, for our data discussion, and thank you for establishing those benchmarks in terms of what's core and then how we build out from the core. I know that Matthew has a question that he's been itching to ask, jumping off of this. And so I shall pass it to you, Matthew.

Matthew Revell: Thanks. So Richard, you've written about creating a community data system and one of the things that I think ties in nicely with what everyone was just saying is the idea of doing almost a community data audit. So can you give some advice on how people running communities today can take a measurement of where they are right now with their measurements and understand what is a good number and perhaps what's more noise rather than signal?

Richard Millington: Sure. But let me, if it's okay, zoom out a little and tackle that from a broader perspective, because one of the things that I feel is happening a lot is that most of the time, data collecting and data analysing is a vanity exercise that doesn't lead to any outcome whatsoever. And I really strongly believe this because we see it time and time again: people will collect data. Like the question you asked, what data are you collecting? We've asked this at workshops in many different parts of the world and everyone can say engaged users, or I'm measuring the number of click-throughs from this. And people can tell me very often what are the metrics that matter to them. But when we ask them, what do you do with that metric? They'll say, oh, I pull it into a report and I send it to my boss.

And then we'll be like, well, then what happens? Then it'll be like, well, then the boss looks at it, and that's kind of where it ends. This whole exercise is like a giant vanity exercise because there isn't a system in place. And the way you build a system is by figuring out the framework: if a metric you care about goes up or down by 10% next month, what will you do differently as a result of that? And what this ultimately ties back to is, what is the decision that you are going to make? Because that's the whole point of data. You use data to make better decisions. And I feel we often lose track of that, because data is often used to create nice reports, not good decisions. And ultimately, data at the core level is: what are you going to do more of?

What are you going to do less of, and what are you going to change? And you need that framework in place before you can build out everything else. Because what happens is we dive straight into the data without thinking about what questions we're trying to answer. And once we have the questions, how do we turn them into data questions? So when I talk about the system, it means beginning with what is the outcome that's going to be the result of the data that we are collecting, and what are the questions that inform the outcome? Even really simple questions are very difficult to translate into very specific data terms. Let's imagine, for example, and just cut me off if I'm talking too much, but let's imagine we want to know what the impact of community is upon customer retention.

That sounds like it's not too difficult to answer. Actually, it's incredibly difficult to even define what that question means in data terms. First you have to define what retention means. What does that mean? Is it a customer that renews after one month? And then you have to figure out what community means. If a member visited the community once over a year ago and never came back, do you include them or exclude them? And so the challenge is drilling very deep into what the specific metrics are that matter to you and all the trade-offs that go into that, and how messy that can get: figuring out exactly how to get that data, how to extract it, how to analyse that data in statistically valid ways, and then how to visualise and present that data. So when we talk about the system, we talk about that whole process: what is the decision, what does the process look like, how do you extract it?

The ideal outcome, honestly, is where you have a data pipeline that is built out and it shows you in a visual form what the decision is, colour-coded green, red, whatever. That is the ideal outcome. But that takes a lot of work; at the very least you can do this in a manual way and still get the right kind of outcome. So I can give you benchmarks if you'd like, but I think I want to give everyone else a chance to speak. So that's what we think about with the system, the end-to-end system of how it all connects together, if that makes sense.
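
One way to read Richard's point is that every fuzzy word in "what is the impact of community on customer retention?" has to become an explicit, arguable definition before any analysis starts. The sketch below makes those definitions visible as named parameters; the thresholds, table and column names are placeholders for illustration, not recommendations.

```python
import pandas as pd

# Explicit, debatable definitions -- the point is that they must be written down.
# (The 12-month lookback window is another definition, baked into the column below.)
COMMUNITY_MEMBER_MIN_VISITS = 1          # how many visits make a customer a "member"?
RETAINED_MEANS_RENEWED_WITHIN_DAYS = 30  # what does "retention" actually mean?

# Hypothetical per-customer table; column names are illustrative.
customers = pd.DataFrame({
    "customer": ["acme", "globex", "initech"],
    "community_visits_last_365d": [4, 0, 1],
    "days_to_renewal_decision": [12, 45, 20],
    "renewed": [True, False, True],
})

is_member = customers["community_visits_last_365d"] >= COMMUNITY_MEMBER_MIN_VISITS
is_retained = customers["renewed"] & (
    customers["days_to_renewal_decision"] <= RETAINED_MEANS_RENEWED_WITHIN_DAYS
)

# Descriptive comparison of retention rates for "members" vs everyone else.
summary = customers.assign(member=is_member, retained=is_retained)
print(summary.groupby("member")["retained"].mean())
```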

Rebecca Marshburn: Totally makes sense. And I kind of want to pass this to Josh for a second, because I do think that that strict definition, that defining of criteria, where it's like, well, what does this mean? What does retention mean? Josh, when you were pulling data for the community-led growth report, you had to set those definitions extremely specifically so that we could measure everything against the same exact starting point. And so I'm curious how you approach some of those strict definitions and criteria, and some of the thought process behind why you chose those starting places. And would you do it again that way, or, now that we've seen what communities are doing, and thinking about what Richard just said in terms of how different companies want to define what they actually turn into actions, how did you come to those strict definitions of how you were measuring things like retention and responsiveness and engagement?

Josh Bricel: Yeah, no, all good questions, and all of it hits close to home, because the data quality for community is kind of tricky to work with. You can only work with what you're able to get from APIs for different community sources, or whatever you're able to record. And so a perfect example there is retention, one that comes up a bunch, because you want to look at your community retention, your customer retention, your user retention. Everybody wants to know who keeps on coming back, because it's a main success criterion in many ways. Or it can be used as a vanity metric. But when you think about community retention, there are a lot of passive users in communities. So somebody goes to a community, they might log into YouTube or something and watch a bunch of videos, but never like anything or give a thumbs up or comment or anything like that.

And so it makes it really tricky to track retention unless somebody is actively engaging, which is a slightly different definition of retention. And then you have to be super clear that you're using a different metric here, so it has to be thought of and used in a different way. And I see that data definition and clarity problem come up over and over again with community data, because it really varies based on where you're getting your data from. But another thing that you said, Richard, struck close to home with me too: a lot of people just want charts. What do you do with that data? Well, I take that data and then I go build a chart and then my boss looks at the chart and then they, in theory, are going to do something with that data. I was like, great, I got a chart. And what did they do with their chart?

They tell their boss that the chart went up or the chart went down. And so that's something that we're really looking to change: people should actually be able to act upon the data. And what's cool in Common Room, at least, is that all of our charts have been wired up, so you can click on them and drill down and see the underlying members or organisations, with the thought there that it's supposed to be much closer to the action. So if you see a trend go up or down, it's like, well, great, go and do something about that. Go and set up some kind of meeting with a person to talk about what their experience looks like, or take it to that next level of actually acting upon it and driving the change in your community. And I think we're still a little ways away from that becoming the norm in how people interact with community data, but it's exciting to be at the forefront of enabling people to actually be able to do that.
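
Josh's distinction between passive visitors and actively engaged members changes what a "retained" number means. A minimal sketch, with made-up data and column names, showing how the two definitions can diverge on the same activity log:

```python
import pandas as pd

# Hypothetical monthly activity log; "actions" counts posts/replies/reactions,
# while "visits" also includes purely passive views. All names are illustrative.
activity = pd.DataFrame({
    "user":    ["ada", "ada", "grace", "grace", "linus", "linus"],
    "month":   ["2024-01", "2024-02", "2024-01", "2024-02", "2024-01", "2024-02"],
    "visits":  [3, 2, 5, 4, 1, 2],
    "actions": [2, 1, 0, 0, 1, 0],
})

jan = activity[activity["month"] == "2024-01"]
feb = activity[activity["month"] == "2024-02"]

# "Passive" retention: came back at all, even just to read.
passive = set(jan.loc[jan["visits"] > 0, "user"]) & set(feb.loc[feb["visits"] > 0, "user"])

# "Active" retention: actually engaged (posted, replied, reacted) in both months.
active = set(jan.loc[jan["actions"] > 0, "user"]) & set(feb.loc[feb["actions"] > 0, "user"])

print(f"passive retention: {len(passive)} of {jan['user'].nunique()} users")
print(f"active retention:  {len(active)} of {jan['user'].nunique()} users")
```

Same data, two very different "retention" numbers, which is why the definition has to travel with the metric.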

Amanda Boyle: What you just said reminded me of a post that I saw of yours, Richard, I think last week, and it sparked this awesome internal conversation. It was on reframing lurkers as learners. Did you post it on LinkedIn or something, right?

Richard Millington: Yeah, I did. It's from one of my books as well, but thank you.

Amanda Boyle: Yeah, well, it's such a good point. I think everything that you just brought up was actually around that: thinking about users versus contributors is really important, and so is taking the time to say that somebody finding value in this content is incredibly important. Not everybody is going to contribute, and we don't necessarily need everybody to. And I don't want to slip us into self-help success metrics or case deflection and support metrics, because I've spent a lot of time getting us away from all those numbers. I don't want to do that ever again; more power to people who do it, not my interest. But I think taking the time to know that the content we create is exponentially valuable to others when it's in a public space is really, really important, as is valuing that and highlighting the content that's of value. For me, specifically working in a forum, I think that's really important. And if you think about how our teams might align to a marketing effort or a product launch effort or anything like that, if we can say this is the content that users need and are going back to and linking to, that's really important too.

Josh Bricel: Yeah, and if you think about the learners and contributors too, one of the big things in responsiveness in forums or chat or whatever is the amount of trust that your community is able to build for answering the question the right way. So if somebody posts a question and somebody that doesn't really know what they're doing responds with an answer that they're super confident about, you might think that that would degrade trust in the community. However, oftentimes you're going to have somebody that's an expert step in and say, hey, that's kind of not a hundred percent the truth, here's the whole story, or here's where you should go and look. And so with the learners and the contributors, it's a fine balance.

Rebecca Marshburn: I'm so glad that we have a bunch of people on this call who have been able to read each other's work in different places and then bring that in as well. Just makes me really happy. I feel like I'm with a bunch of celebrities. Amanda, I wanted to dive in with you around GitHub and especially when we're talking about data points that you've tried to move away from and data points you've tried to move toward to then tell that deeper story. So GitHub as a whole, right, must be a gold mine of community data, but you're focused on GitHub discussions and I'm really curious because discussions is, you said, right, the tidal wave is going to be pretty wild when people really start discovering it, but you haven't yet done that big push and so people are finding it on their own, interacting with it a lot.

Your team is now set up, and it's almost like a new community space. I don't want to make assumptions that it's a blank canvas, because certainly there are already GitHub qualities and expectations and deliverables that you bring to it, but because it is a relatively new community forum, is there a way that you're setting up the data that you want, and pushing people toward the data that you should be looking at in order to take actions? And since the forum is a little bit newer, what data are you looking to use to feed your community strategy, specifically in this forum space?

Amanda Boyle: Yeah, thank you. That's a great question. I was taking notes around that while y'all were talking. For anybody who's not familiar with this, it's community.github.com. So anybody who's watching this and you're like, what are y'all talking about? That's where we live. And it's specifically discussions on a publicly facing repo. Discussions launched, I think, maybe less than two years ago; I've only been at GitHub for two years and I remember when it launched. And so the product itself is new, and it's a really special space to be in because we work really closely with the engineering, product and design teams on features. We're a weird use case. We've got tens of thousands of users, exponentially more than most open source projects using our discussions. And so the scale is just going to be different than what you're going to see even in a massive open source project.

So the stuff we need is different. But I think it's really important, when we think about what we're able to get out of discussions, that we actually have to work with our data and analytics teams to build into the data model to even get the data out. For example, right now I need category-level data for every different category in the discussions. Right now I have engagement for everything, but to get micro, down to category, they have to rebuild a whole new data model, and bless our analytics team, they're probably so sick of me being like, I really need this, but it's a tonne of work for them. And then I just want to pull everything into Tableau, even though, respectfully, we can't do that anymore, I don't get to use it. But for us it's thinking about what retention means for discussions as a product, not just for us. And so whatever we build with it, I want to be able to have our team say, this is how we're using this product and how you can too, because anybody can turn it on for any repo. So if you have a project that you're running, learn from the stuff that we figured out so you don't have to recreate the wheel yourself. It's just a really awesome platform that people can use and take advantage of. So that's something we think about a lot.

I think specifically it's building trust with our product teams. I can't remember who mentioned it earlier, but it's that alignment with product and understanding what the trending topics are and the things that people need. Product feedback at a company this big is coming in from countless channels, whether you're talking to your sales rep or you're talking to your customer success manager: feature requests, bugs, all of that is coming in from so many different streams. So making sure that we can analyse that information discretely from other things is really important. Product feedback isn't a question, so you have to actually build that into how it's categorised in your data model, to suss it out from everything else and not have it kind of dilute the rest of the data. Richard's laughing. I can see you just nodding.

Richard Millington: No, I agree. I completely

Amanda Boyle: Agree. It happens and people are like, what's the sentiment and things like that. I'm like, it's just kind of hard to really get to it without a machine learning model.

Rebecca Marshburn: I secretly hope that you also still get to use Tableau on the side. You're like, I'm going to go home on Friday night and secretly open up my Tableau.

Amanda Boyle: I just finished grad school, so I have a student licence. So I use it to validate that I'm thinking what I think I'm thinking. Am I measuring what I think I'm measuring? is something I ask myself all the time, and taking the time to look back at the data, it's a measure twice, cut once kind of thing. Don't report it until you actually know that's what is real.

Rebecca Marshburn: So I want to dive a little bit deeper into how you're talking about delivering value to the product team, or building trust with them, as you said. And that's certainly one team that you want to build trust with, and for a lot of people, they want to build trust with the marketing team or the growth team or the success team or the support team. And so I think there are different data pieces that can help build trust with each of those teams. It sounds like, obviously, distinguishing between product feedback and a product question is one way to do it, and then making sure that you're delivering those data points, and what's happening in aggregate and individually across members, back to the product team, so there's a quick iterative loop. I would love to hear a little bit more about what the product team is looking for in order to see value from your community work. And then I'd love to hear from others, and you as well, if you also have: this is what the marketing team is looking for, this is what the success team is looking for, this is what the growth team is looking for. I think diving into how data in the community drives those value propositions for your cross-functional teams would be a super rad piece of the convo here.

Amanda Boyle: Yeah, absolutely. I'll focus on product first, because that's where we've had the most success lately, and it's discrete based on the team that we're working with. I don't even know how many categories we have right now, at least a dozen if not more. So Copilot for Business is a product from GitHub that has just launched a new tier, I think it's Copilot for Business GA, and we're working really closely with the product manager to get ahead of frequently asked questions that we're seeing from users. We're seeing the same types of things keep coming in, whether they be support tickets, or whether we see something trending on another site, be it Reddit or Stack Overflow or something else. People are like, these are the types of problems we're having. We're like, okay, how many places can we put this in front of users so they don't get stuck and not know what to do?

I think what's really important is using that qualitative information to inform what type of resources we need to make accessible to users. And then also, leading up to a launch, we're just one channel that a user might come to for enablement, respecting that people learn in so many unique and special ways, whether it be watching a video; some people love forums, some people hate them. So we're not going to be everybody's flavour, and that's okay. And knowing that the value we provide needs to be consistent with all these other teams too is super, super important, and a lot of people just don't plan for that in a go-to-market strategy, which brings in marketing, brings in product, brings in everybody: whatever you're putting out, the messaging needs to be consistent and intentional across all of these different channels.

And don't forget about this one just because it might not be as familiar, but for product, when bringing a new thing to market: who are the top contributors already using it, if it's not a brand new thing? And then how can they help make it better? Building trust with those users, what features do they love, what do they hate? Taking the time to have real conversations with them and build that. That's something I saw done exceptionally well at Tableau. Actually working with the top contributors was such a pleasure, because they made the product what it is, and they had great relationships with the engineering teams to say, this is how I use it, in such a deep way. An engineer might not ever use a product the same way that a user is going to use it. And so building an opportunity for them to build trust together, to almost demo and walk somebody through something, is of such value to any product team, to build those kinds of relationships and take the time to have that kind of feedback. It's very time consuming, but I think if you take the effort to do it, it pays back in dividends. I've been talking a lot, so someone else jump in, please.

Josh Bricel: Just one comment I have there: this feels like it's kind of full circle, because maybe four or five years ago I was down at DataOne providing product feedback on Tableau to the data and engineering teams. And so it's fun to be thinking about that being a big part of this now, talking about community and talking about data all this time later.

Amanda Boyle: It's all intentional. I think that they did it really well and we got really lucky. The community was incredible, is incredible. It's really great.

Rebecca Marshburn: Josh, I'm curious, because I know that you get a lot of inbound requests from customers specifically looking to tie one piece of data back to a specific team, depending on which team it is, whether it's the community team or the marketing team or wherever someone rolls up into; sometimes it's the product team if it's a DevRel person. So I'm curious if there are specific inbound requests that you get most often, in terms of which data which teams want to tie from where, and what actions they're looking to take off of it. To Richard's point, now that you have that chart, what do you want to do with that chart?

Josh Bricel: Yeah, no, good question. So I would say the most requests I get are on trying to prove out the value of community, which is the main request, and which I think is removed from the action, quite frankly. People want to say, hey, this is my community, it's worth this much money, or it's been able to make us this much money, or there's some correlation there, especially now that budgets are being cut at lots of companies; we all probably know somebody that's been laid off in the recent past. And as unfortunate as that is, it also means that people are getting a lot tighter with their budgets. And when you think of community, it's a space that a lot of companies don't have any allocated budget for. They might have one or two community professionals, but leaders don't necessarily know how much they should be investing there.

And so a lot of the requests end up being like, hey, show us all of our opportunities. How many of those opportunities can we attribute to community, either because they showed up in community before they showed up in CRM, or because people in a sales cycle went to the community? And it's pretty basic stuff, it's not rocket science or anything, but it's a way for people to start to say, hey, this is the dollar amount that we're able to show for community, and that ends up being community leaders themselves trying to show that to leaders. The other piece is around, I dunno if this is categorised as marketing necessarily, but it's around events, trying to prove out the value in events. So there's a lot of money that goes into conferences or different meetups or whatever; it might be hundreds of thousands, if not millions, of dollars.

And so one thing that people haven't done historically is really test the impact of this, right? You host an event and you're like, great, we all get together, we talk about stuff, that's awesome, but are the things you're trying to push at the event actually driving more community engagement or more product usage or more leads? And so doing tests before and after for the users that go to specific events where you're trying to push for a specific agenda is a great way to start to show the effectiveness of those different events. And then you can say, okay, hey, these ones are working out, let's do more of those, or these ones aren't working out, let's try to figure out what's going wrong there, so they're able to tweak it and we can get more out of it. And it starts to bridge that gap between insights for the sake of just having some charts to show my boss, to insights that actually change the way that we're running events, or the content, or the people that we're bringing to those events to present.
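
The before-and-after event test Josh describes can start as simply as comparing each attendee's activity in a window before the event with the same-length window after it. A minimal sketch with hypothetical data; a real analysis would also want a comparison group of non-attendees to separate the event's effect from seasonal trends.

```python
import pandas as pd

# Hypothetical per-attendee counts of community activity in the 30 days
# before and after an event. Column names are illustrative.
attendees = pd.DataFrame({
    "user": ["ada", "grace", "linus", "margaret"],
    "activity_30d_before": [4, 0, 2, 1],
    "activity_30d_after":  [7, 3, 2, 0],
})

# Per-attendee change in activity around the event.
attendees["delta"] = attendees["activity_30d_after"] - attendees["activity_30d_before"]

print(attendees)
print(f"mean change per attendee: {attendees['delta'].mean():+.2f} actions")
print(f"share of attendees more active after: {(attendees['delta'] > 0).mean():.0%}")
```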

Rebecca Marshburn: Thank you. Because I've worked with you before, obviously I'm like, oh, what if we talked about..., but instead I'm just a regular co-host here and you're a regular panellist. Richard, I'm curious if what Amanda and Josh just spoke about, in terms of demonstrating the value and how that can be applied to different teams, is something you've seen across the board. You work with so many different customers and clients at FeverBee, so I'm curious if you're seeing the same desires or the same asks.

Richard Millington: The same asks, for sure. Everyone wants to know what the value of their community is. Like I said before, I think they dunno what they're going to do with the data, honestly. I think there are so many challenges that we need to overcome, and one of them, to be frank, is that we are very biased. I feel like all of the vendors, all of the community managers, all of the consultants in this space, we have a bias: we want the ROI to be as high as possible. Of course we do. Why would we not? Which means we often make statements that, I feel, if community data was audited the way that financial data is, it would be disastrous for a lot of people in this space, because the claims they're making wouldn't be supported by the data that they have. And so there's a lot of confusion between causation and correlation, and the level of data literacy in this space I feel is quite low.

And I think there's so much work we need to do to bring this up to a level where we can identify what's causational, what's correlational, what's precise. And a lot of what we do with clients, or try to do with clients at least, is to guide them through the process of what kind of ROI they want. Because saying I want to know what the ROI of my community is can mean many different things. Do they want a precise dollar value or not? That takes you down one pathway compared to looking at the broader impact of community. And if they are looking for a precise dollar value, can we run a controlled experiment or not? The answer is almost always no. Although we've done it once before and the results were interesting, the answer is almost always no, we can't do a controlled experiment.

So then fundamentally it's correlational data; we don't know for sure what would've happened. And so then we get data like call deflection, we get tracing the value of new customers that attended events, or we look at comparative data from newcomer conversion rates compared to non-member conversion rates. It's some measure like that. If they don't care about a precise dollar value, which is quite common as well, some people just want to know it's helping but they don't need to assign a dollar value to it, then it opens up a lot of opportunities. But if they just want to increase the level of engagement, fine, I'm not a big fan of that, but there are metrics to do that. There's a community-driven impact score where you can ask members what they think about things and analyse it that way. Or if it's customer support, you can look at task completion rates.

If it's marketing, you could look at net promoter score. If it's retention, there are CSAT scores. But I think fundamentally there's such a challenge right now, in that data literacy levels are so low, that we're making a lot of claims that frankly aren't supported in a statistically valid way. And I'm getting quite concerned about it. Time and time again, I think call deflection is probably the most troubling one. Some of the methods used to calculate call deflection in our industry at the moment, I dunno if I'm allowed to swear, but they're not great. I've seen these calculations used where they give you a figure greater than the entire value of the customer support team. And you're like, well, that's not possible. That's just not happening. And so then I've had clients get into this debate of, we have to lower the result we're getting. That's nuts. They're deciding what result they want and trying to develop a methodology to match it, and that's completely the wrong way to go about this. And so yeah, we do need better methods for ROI, and everyone wants to know the ROI, but we need to have an interest in understanding different techniques and different ways of showing ROI as well, because at the moment it's nowhere near the level that it needs to be, and it's becoming more and more problematic.
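
One of the correlational measures Richard mentions, comparing conversion rates for accounts that touched the community against those that did not, is straightforward to compute as long as the claim stays descriptive. A minimal sketch with made-up data; it supports a statement like "community-touched accounts converted at a higher rate than the rest", not "the community caused the difference".

```python
import pandas as pd

# Hypothetical account list; flags and names are illustrative.
accounts = pd.DataFrame({
    "account": ["a", "b", "c", "d", "e", "f"],
    "touched_community": [True, True, True, False, False, False],
    "converted": [True, True, False, True, False, False],
})

# Conversion rate and group size for each side of the comparison.
rates = accounts.groupby("touched_community")["converted"].agg(["mean", "count"])
print(rates)

# Correlational phrasing only: this describes a difference between groups,
# it does not establish that community involvement caused the conversions.
member_rate = rates.loc[True, "mean"]
non_member_rate = rates.loc[False, "mean"]
print(f"community-touched accounts converted at {member_rate:.0%} vs {non_member_rate:.0%}")
```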

Josh Bricel: I was just going to say, I have a follow-up question there. So I totally agree. There's a tonne of confusion in the space, and there's a lot more correlation than causation right now. And that's something that I deal with on a daily basis: what can we show with what's available? And what I keep telling myself to keep going forward is that we have to start somewhere, and we have to get some momentum and more attention in this space, so that we do start to have access, or start to put instrumentation in place, to get better data, to get a cleaner story on being able to prove out causation and to create that whole picture. And so I'm curious, what are either of your thoughts on where do we start and how do we get moving in the direction of getting better data and being able to show better causation in a space that feels so fresh and new with data? We've got to start somewhere, right?

Richard Millington: Yeah, I completely agree. I think we have to start somewhere. I think one of the things that might be useful is to be considerate in the claims that we're making. It's one thing to say that the community caused X number of sales. It's another thing to say that X percent of people that visited the community also purchased. And so I think correlational value is fine, I think there's a great opportunity in there, but the language we use matters. So for example, if you do a controlled experiment, you might be able to say the community causes members to spend X amount more. If you've got correlational data, as we often do, and I think Common Room does a great job of this, you can say community members spend more or do more than non-members or any comparison group. That's a valid statement. You're not saying that's driven by the community, you're just pointing out a statistical fact.

If it's surveys, you can say, members tell us that they spend or do X percent more because of the community. I think that's good. If it's a value assignment method, you can say members who ask a question spend or do X amount more. I think all those things are valid, very statistically valid things to say and do. I just feel like we've got to be careful in the claims we're making and make sure that they are supported by the data. The temptation is always to say the community caused X, and unless you're doing a controlled experiment, that's not usually an accurate claim. But members spend more than non-members, or members who do X do more than others, I think those are more valid statements to make, and I would love us to shift a little bit more in that direction. I think you're completely right, I think all these approaches are valid. Everyone has different capabilities, and sometimes you can't do a controlled test or you can't wait months for the result; you just want to do a study that will give you a value you can work with today. But as long as we're making the right statement connected with that, I think we're on pretty safe ground. Does that make sense?

Josh Bricel: Yeah. I think you'd probably be surprised at how many requests I get that are like, hey, can you prove that my community does this with our data? And it's like, well, I can look at the data and see what the data tells me, and then we can review that together and see what the insights are, but I can't necessarily prove that it's doing a certain thing. And so yeah, I a hundred percent agree with all the things you're saying. And Rebecca, removing you from your host shoes for a second, back when we were putting together the community 360 report, I just remember all the conversations we had about, no, no, no, we have to make sure to be very specific about what we're saying here, because if we misword this, it could be completely misconstrued, or somebody could think that it means something completely different than what it's actually telling us, or than the level of insight that we have. Active retention is the one that keeps coming up in my mind, because most people think of retention as somebody just coming back and starting to use a product, like a lurker, or a learner as we were saying earlier, when in reality they often have to actually be actively engaging to be considered active retention. So anyways.

Richard Millington: If you don't mind me saying one more thing quickly, I think a key part of this is teaching people about data. The kinds of questions that you are getting are the same kinds of questions that we are getting. They want to say, tell me how many people the community caused to purchase again. And you're like, okay, we've got to go through a discussion here where we can talk about how this works, how calculations are put together, all these kinds of things, but you've got to have that discussion before you can present the data. Otherwise there's no context about what it means, which also means that people can't interpret that data at all. Retention is such a common one and such a challenging one to deal with, because it's never just retention, the number of people that participate again. Once you start drilling into the data, you might end up with a really specific question such as: do organisations with at least one member who has visited the community three times in the past 12 months have a higher probability of renewing their annual subscription than those that don't? And that sounds ridiculous, but that's the level you have to get to.

You've got to define very specifically what retention means, what community means. But even then, there are so many assumptions baked into that data question that you've got to have that discussion, and it's a process, not just an outcome.

Josh Bricel: That question you just read through there, I've done that question, but with different numbers, and found where the different balancing points are. So that hits super close to home.
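
Richard's example question translates almost word for word into a grouped comparison: do organisations with at least one member who visited the community three or more times in the past 12 months renew at a higher rate than those without? A minimal sketch over hypothetical tables; as discussed above, this still only shows an association, not causation.

```python
import pandas as pd

# Hypothetical inputs: community visits per member, and renewal outcomes per org.
visits = pd.DataFrame({
    "org":    ["acme", "acme", "globex", "initech"],
    "member": ["ada", "bob", "carol", "dan"],
    "visits_last_12m": [5, 1, 2, 3],
})
renewals = pd.DataFrame({
    "org": ["acme", "globex", "initech", "umbrella"],
    "renewed": [True, False, True, False],
})

# Orgs with at least one member who visited 3+ times in the last 12 months.
qualifying_orgs = set(visits.loc[visits["visits_last_12m"] >= 3, "org"])

# Compare renewal rates between the two groups of organisations.
renewals["has_engaged_member"] = renewals["org"].isin(qualifying_orgs)
print(renewals.groupby("has_engaged_member")["renewed"].mean())
```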

Amanda Boyle: I want to jump in here. My little Tableau heart, from running data programmes, is so happy with this conversation. Pausing for anybody who's listening to this: this is pretty much a master's class, so this isn't intro-to-data stuff. If somebody new to this is listening, this would probably be kind of overwhelming, and I want to just acknowledge that, going back to what you were saying, Richard, around data literacy levels being low. I think this is a key differentiator for anybody as a professional: take the time to learn this, truly, truly take the time to learn this, and it will help you no matter what you end up doing, understanding how to see and analyse, see and understand data, and ask the questions and take the time. I think often, too, we're resource constrained, we have competing priorities with time, and we're going to do what's easy.

And sometimes that means taking shortcuts, and as soon as you lose your credibility in this space, especially with internal leadership or anything like that, it's really hard to get back. So take the time to learn and educate yourself on this, wherever you are in your career. Because I've seen this with folks, and I'm very, very fortunate: our operations manager on my team came over from Tableau too, absolutely brilliant. I'm so fortunate, and he drives a lot of this work for us, because he works with the analytics team and he's able to ask these questions and actually figures everything out. He's like, this is what we're trying to do. We take the time to write it down on paper before we try to build a dashboard: this is what we're trying to figure out, do we have the information to figure it out?

And then, even when we do, it's in that frame of, we believe, because of this, that this is the direction we think it's going, but it's never an absolute. Avoid absolutes, because you just will be wrong; you will be proven wrong so easily. And I think a lot of that, when we're thinking about how we build these trusting relationships and provide value to other teams with what we do, it comes with that caveat. It comes with: this is what we know, or, based on these indicators, we believe this and it relates to this other thing, but it's not absolute. Yeah, avoiding absolutes is a big part. Oh, go ahead.

Richard Millington: I didn't mean to interrupt. Yeah, I love what you're saying there about learning, and I think there are obviously levels to that. There's a level from just learning the basics to writing R or Python code and doing all that kind of stuff. And I think some people have this mentality of, oh, I wasn't good at maths, so I'm not going to dive deeper into that, just show me the chart. And I get that, but I don't think we are really talking about mathematics. Obviously if you go deeper you get into it, but I think it's about thinking logically about things. You don't need to be an expert in maths, you don't need to do quadratic equations most of the time. There's a level, and I think if you were to take a really simple beginner's course in data, just to understand the fallacies, the common mistakes, these kinds of things, you'll be miles ahead of where you are today and be able to understand some of the common issues with data. And that's all I think you need: just be able to think logically. It's not maths, it's just that logical thinking, I think.

Amanda Boyle: Yeah, I totally agree.

Josh Bricel: So one thing I was thinking, for anybody that's listening, you might be wondering where do you start, what are good resources that you can use? And Richard, to give you a plug, I was doing some creeping on you this morning on your website, on feverbee.com/roi, and it was good stuff, a good place to start from: hey, this is what we're trying to do, these are some of the concepts that we're wrestling with, these are some of the metrics, this is why these metrics matter, these are some things you can do about that. It's not a generic data course, it's community focused. But yeah, I think that would probably be a good recommendation for folks. I don't know if either of the two of you have any other ideas for where to start, the ground floor for data literacy, but I'm a big Stephen Few and Edward Tufte fan, though that's more on the data visualisation side.

Amanda Boyle: I think conceptually the Storytelling with Data folks do a really good job of making it not scary. I remember when I was just learning Tableau, whatever tool you may use, I was encouraged to find a data set that I found interesting. They're all over; data.world has countless things. You can be like, I'm interested in books, and you can get the New York Times bestseller list and play with data you find interesting and ask questions, and then just start to understand what the variables are, the dimensions and measures, and start to understand, when you're talking about this, what those things are. If you go right into your work, it's almost like it's not fun. And I think it can be really fun if you actually learn how to do this, and it's applicable to everything else that you do.

I'm not great at maths at all. I never even took statistics, so you kind of learn as you go. And I think the confidence that you build with anything we do, I think for my space, it's all about helping users build confidence and not quit, and helping each other, and paying it forward. And so whether it be learning data, whether it be learning how to use GitHub, whether it be any of the things that we're all working on, like learning Common Room for the first time, you can get overwhelmed with any new platform, any new tool. So I think taking the time to just learn it and share with other people is the magic of community, whatever the space: sharing it with others.

Richard Millington: I also think, just to add to that, any tool is only as good as what you plan to do with it. And I feel the danger, or not danger, but the challenge often with any tool, Common Room, Google Analytics, anything, is that if you don't know what question you're trying to answer before you use the tool, the tool is not going to help you that much. You can dive into it and spend a whole afternoon finding really interesting stuff. That's what it is: kind of addictive. You can find lots of trends and patterns, but I feel you have to have some idea of what you want before you sit down at the computer to start exploring the tool. Otherwise, you're going to get lost. If you don't know where you're going, you're just going to get lost.

Rebecca Marshburn: So spoiler alert, Matthew and I have a little chat open on the side, like, hey, I have a follow-up. Great, you have a follow-up, go. And every follow-up that we've typed in, you have then answered, and we were like, oh, never mind, they covered it, and oh, actually they covered that too. So I just want to call out the three of you as being impeccably good at talking about what you are so good at. So thank you for that. As a quick plug, one of the things I wanted to touch on, and I think it's a theme, or another way of saying what you all are saying, is that it comes back to knowing what story you're trying to tell, or what outcome you're trying to get to, or, all the way back to the beginning of the conversation, what action you think you want to take from what the data is telling you.

If you're not able to put that into a sentence, into a thing with a period, and then follow it with the next sentence, like "thus we will X", then the data is only going to be as powerful as what you want to do with it. And so Richard, to elevate again some of the things you've published recently that caught Josh's eye, can you pull out a few of those points? If you're a non-data scientist just starting to think through these questions about how to arrive at outcomes, what are steps one, two, and three, or maybe not steps but themes one, two, and three, that you should hold in your head as you're beginning to think about what outcomes data might drive for you?

Richard Millington: Yeah, I mean I don't have a specific answer beyond what I've said before. I feel like the danger when anyone creates any kind of strategy or measurement system is that they sit down with a notepad or a laptop and a blank document and they write down, I think our community should be for call deflection and product feedback. And that's great, except now you've got to spend all your time persuading everyone else how valuable that is. It's far better instead to go to your stakeholders and, rather than saying this is valuable and you should use it, ask them what is valuable to them and then work backwards to how the community can support that. Then you're beginning with something that already has value in the first place. And so that's often where we begin: not trying to persuade people that a metric we just made up should be important to them, because that's a hard sell. Not impossible, but a hard sell.

It's far easier just to work backwards from what people already find important and then design the right metrics to match. And for anything that we're doing, I don't think you need to be a data scientist for any of it. I think a lot of it is first being able to decide and define very precisely what you want. The next is to understand the process that you go through to answer any question, and there are certain parts of that process where you play a role and certain parts where you don't. I mean, community professionals should be very good at defining the question; there are so many assumptions that go into that. They have to be very good at defining the question. But in terms of extracting the data, or preparing the data, or modelling the data, that's not their skill set.

So that's where you need a data analyst to be involved. And then you do the evaluation and the deployment of that data, in terms of internalising it and making sure people understand how to use it and what to do with it. So there's a whole process behind that, and I feel like you've got to specialise in the parts that are most relevant to you. And I'm happy to share blog posts if it helps, but that process involves understanding the business context, how the data will inform the decision, the type of answer that you need (is it correlational or causational?), what data you have access to, defining the question, and assessing the data that's available. All of these are steps in that journey, and once you start going down the journey, you get hit by roadblock after roadblock after roadblock. And we haven't even talked about data privacy yet.

Data security, we haven't even talked about these things, which are some of the biggest challenges that we face every single day, and we haven't even got into that. So I feel like you've got to start with the process of just defining the question. That would be the first step. Once you've defined that question in very specific metric terms, that's where the real fun begins: that's where you have to get access to the data, clean it, and extract it. So that's where I'd start. I feel like I'm rambling a little, so I'll pass it on to someone else who wants to speak.
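
A minimal sketch of the process Richard outlines, assuming a hypothetical CSV export of forum posts; the example question, the file name, and the column names are illustrative rather than any real system:

```python
import pandas as pd

# Step 1: define the question in precise metric terms before opening any tool, e.g.
# "What share of questions posted last quarter got an accepted answer within 7 days?"

# Step 2: extract. A hypothetical export of community posts; the columns
# (post_type, created_at, answered_at) are assumptions for illustration.
posts = pd.read_csv("community_posts.csv", parse_dates=["created_at", "answered_at"])

# Step 3: clean. Drop rows the question can't use, then keep only questions.
posts = posts.dropna(subset=["created_at"])
questions = posts[posts["post_type"] == "question"].copy()

# Step 4: compute the metric exactly as defined. Unanswered questions count as False.
questions["answered_within_7d"] = (
    (questions["answered_at"] - questions["created_at"]).dt.days <= 7
)
print(f"Answered within 7 days: {questions['answered_within_7d'].mean():.1%}")
```

The code is the easy part; the work is pinning the question down precisely enough that a snippet like this can answer it.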

Amanda Boyle: I love it. You said that's where the real fun begins, when you're talking about cleaning data, data privacy, data security. For us, Tableau is very different from GitHub, and I think I'm very fortunate to have the perspective of distinctly different audiences. What we could do with the Tableau audience, who were so open and willing, was very different, but I feel like with the developer audience, if you break their trust, you're never getting it back. So we're just being very respectful of that: thinking about security, thinking about privacy, even the concept of telemetry, of what somebody is doing in your platform. We don't have cookies turned on, so it's like, how do we do this? That just presents new challenges, but again, take a step back and ask, what are you actually trying to solve?

And there was another article, I don't remember where I read it, that for me was essentially saying: I know what value my team should bring, I have a very strong opinion on it, but that needs to tie to business value, it needs to tie to what the company objectives are, or else we can't prioritise it. At the same time, I need to get support from my leadership to understand what I'm trying to do and to poke holes in it too. I'm like, please tell me what I'm missing, because I'm going to miss stuff. And then it goes all the way up to senior leaders getting what you do, especially in this space. I'm totally biased; Richard, you said this earlier, we have to approach it without ego. I think this is important, but I'm only one tool in this approach, so being very respectful of that too. Yes, community data matters, but in the web of everything else, it's one measure amongst a lot of other things.

Rebecca Marshburn: I'm curious, Josh, given your aggregate point of view and the sheer volume of inbound requests for looking at data, and you had touched on this when Richard gave that amazing example of an organisation with at least three members within the last 12 months who have asked at least two questions and participated once, and all those different dimensions, which I think we all laughed at because that is the specificity you need to get to. Richard said you need to define the question; Josh, is there a way that you help people, when they're making these requests, to define that question so you can actually work with them through the data and understand how those things correlate?

Josh Bricel: Yeah, I mean it's different in every scenario. So I guess the answer I would come to is more of a framework for the approach, which ultimately ends up being: how can you whittle down to the core problem they're trying to solve? If somebody is trying to drive adoption, or drive revenue, or drive support, those are all going to be very different. So it's figuring out, at the end of the day, what the change and the action is that they're trying to drive, and then backtracking from there for everything else, because it's going to be a different approach for each scenario. I dunno if that's the answer you want, but it's the answer you got.
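
To make the specificity Rebecca refers back to concrete, here is a rough sketch of how a definition like "organisations with at least three members active in the last 12 months, each asking at least two questions and participating at least once" might translate into code; the export and every column name are assumptions, not a real schema:

```python
import pandas as pd

# Hypothetical member-level activity export; all column names here are assumptions.
activity = pd.read_csv("member_activity.csv", parse_dates=["last_active"])

# Members active in the last 12 months...
recent = activity[activity["last_active"] >= pd.Timestamp.now() - pd.DateOffset(months=12)]

# ...who asked at least two questions and participated at least once.
qualifying = recent[(recent["questions_asked"] >= 2) & (recent["participations"] >= 1)]

# Organisations with at least three such members meet the definition.
members_per_org = qualifying.groupby("organisation")["member_id"].nunique()
qualifying_orgs = members_per_org[members_per_org >= 3]
print(f"{len(qualifying_orgs)} organisations meet the definition")
```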

Rebecca Marshburn: No, I appreciate that. And I think the last follow-up I'll have, building off of that: Amanda, you were saying that as you choose a metric, you then need to get buy-in from stakeholders that it's the right metric. How has that conversation gone? Have they poked holes at things where you realised, oh, okay, I actually wasn't asking the right question based on the other dimensions I should be thinking about? Where does that synergy come from, and how do you approach those conversations and figure out, is this the right question, and then let's go solve for it?

Amanda Boyle: I think a lot of it is internal education. I don't assume anybody understands what we do, I really just don't. We sit within our revenue org, and I sit in the customer success org; I used to be in product marketing. So it depends on where community sits in the business and how people understand your team in relation to community, whether it be a DevRel or evangelism team or a community forum team. It starts with whether your leaders even know what you do, and getting to that mutual understanding. And then whatever we're measuring, whether it be the number of questions with solutions, or time to solution, or time to reply, or anything like that, those are interesting health indicators and I think they're important, but it ultimately depends on what your leaders think is most valuable.

If I can say this ties to that and they don't understand it, or there's just a disassociation, you have to spend a lot of time actually building that trust with the leaders too. For us, honestly, this year is all about prioritisation and focus, because everybody's kind of strung out with everything going on, and being really smart about where we're using our resources and what we're asking of other teams. Richard alluded to this when he said your job isn't to do the analytics work, it's not to be the data scientist or the data engineer; but you also have to be very strategic about what you ask of those teams in terms of the work that you need. They're not going to be able to do everything I want. I have a laundry list, and they're like, no, you get two things off that list. So being really smart about why I'm prioritising the work that I'm doing, not just this is the business value, but why do I think this is the most important thing right now? And figuring out which hills you want to die on. That's a big one for everybody. I feel very passionately about things, but I know some things I'm not going to win, and you kind of just have to learn when to let it go too.

Matthew Revell: Well, listen, thank you so much all of you. This has been a discussion that frankly I've just sat back and listened to because it's been so fascinating. I want to say thank you to you for taking part, but also give you the opportunity to let people know where they can find out more about what you're doing and perhaps some other things you might want to share. So Richard, where would people find you?

Richard Millington: You can find me at www.feverbee.com. I dunno if we still need the www, right? I feel like the web's been around for a long time. You can find me at feverbee.com, and we're also putting together a community data course soon, so maybe that'd be relevant to people who attended this. Awesome.

Matthew Revell: Thanks. Amanda?

Amanda Boyle: Thank you, Richard. I'll share that with my team. Thanks. You can find us at GitHub Community on github.com; we're the community discussions platform. Myself and my team would love to see more folks there.

Matthew Revell: And Josh?

Josh Bricel: You can find me on LinkedIn, I don't have my own website, or you can go to Common Room if you want to see the stuff we've been up to and all those charts and graphs; you can thank the whole team and the work that we've all done together.

Matthew Revell: Wonderful. Thank you. Well, everyone, thank you, Rebecca. See you again. Thank you very much for the time we've spent together.

Rebecca Marshburn: Thank you all so much. And gosh, a little call-out so people can find you, Josh Bricel. Just look up Josh on the internet and he'll...

Josh Bricel: There's a few Joshes out there. Yes, Josh Bricel, the...

Rebecca Marshburn: The data guy. Josh is on the internet. Thank you all so much. This has been an amazing conversation and I consistently learn from each of you, and I appreciate it a lot. Matthew, I will see you again and I can't wait.

Matthew Revell: Sure. Thank you. Bye.

Rebecca Marshburn: Thank you.