Solving community mysteries with data

Richard Millington
Founder at FeverBee
DevRelCon London 2023
7th to 8th September 2023
CodeNode, London, UK

Join Richard Millington as he delves into the critical role of data analysis in diagnosing and resolving community engagement challenges, from declining participation to technical issues, and illustrates the transformative impact of this approach on community management strategies.

Key takeaways
  • 🔍 Uncover hidden issues
    Dive deep into data to uncover underlying causes of problems within communities. This approach led to the discovery of a hosting issue that significantly impacted community engagement.
  • 📉 Address technical pitfalls effectively
    By tracking and analyzing changes in web performance and search rankings, the community identified technical issues affecting user engagement, guiding them to effective solutions.
  • 👥 Understand the real user experience
    Engage directly with the problem as your members do. This real-world approach can uncover unexpected insights that data alone may not reveal.
  • 🔗 Connect the dots in community data
    Systematic analysis of engagement data can reveal important trends and anomalies, like the decline in new registrations and how it impacts overall community activity.
  • 📈 Use data to drive community strategy
    Harness detailed analytics to refine community management strategies, ensuring efforts are focused on interventions that yield the most impact.

Transcript

Matthew Revell: We have Richard Millington, who is probably known to most of you. He's the founder of FeverBee. They work with lots of big companies and smaller companies to help them create community programs that are effective and valuable. He's also the author of several books, which you should go and get hold of, and I'm really pleased he's here today to talk about one of the themes that has always been important at DevRelCon, which is understanding the data that we have to work with so that we can be more effective and, well, that's it. Be more effective. I'll leave it there. Okay, so please, let's give a DevRelCon welcome to Richard Millington.

Richard Millington: So it's March 2020, and like most of you in this room at that time, I am in lockdown. But hopefully unlike most of you in this room today, I've just been through a really painful divorce. And so I'm kind of trapped at home now in the graveyard of my love, just waiting for some sort of distraction, because I've baked as many loaves of bread as I can possibly bake. And then it happens. I receive an email from a friend of mine who runs a community for people affected by cancer, and she tells me that she's getting worried. Her community has suddenly had a sharp decline in the level of engagement and she can't work out why. And she asks me if I could help. I'm like, at this point in time, any distraction is a good distraction. So absolutely, I 100% can help. And so we set up a call and she goes through what she's been doing to try and increase the level of engagement. And it turns out she's done all the things. She's done gamification, she's done events and webinars. She's done more with the insiders of that community. There's a translation plugin that she's put in as well. She's done all the things, but nothing has worked.

And so what we do in this situation is really try to go to the source: what's really happening within the community? Why has engagement suddenly collapsed? And so we do a survey, and the survey showed that people love the community. This is one of the highest scores of any community we have ever analyzed on a survey. And so it raises the question: if people really love the community, why is engagement in decline? And when we think about this, we go through it as a series of clues. So we begin with one issue and then we try to dive deeper and deeper and deeper. And so the first thing we know is that members really like the community, but the level of engagement is in decline. So why is that? It could be because people that were engaging aren't engaging anymore, or it could be that fewer people are reaching the community in the first place.

And so we investigate that, and what we find is that registrations have dropped. There are fewer people reaching the community now. But why is the number of registrations in decline? It could just be that people get to the page and then don't register to join the community. Or it could be that fewer people are reaching the community in the first place. And when we investigate this, we find it's a search issue. In Google Analytics, you can see everything: it's going well, it's going well, it's going well, and then it absolutely collapses overnight. And so now we could say, well, this is a search issue. Notice we've narrowed engagement down into a more specific issue. So it's a search issue, but that doesn't explain why it's a search issue. What specifically happened here? Google Analytics doesn't actually give you much help, but if you dive deeper, Google Search Console does. The clicker is not working well.

So we know that there's a declining level of search traffic. And when we get the data from Search Console here, what we find is that overall the click-through rate hasn't really changed. If people see the result in search, they're as likely to click it as they were before. But what we do find, and what's really interesting, is that the position has dropped, and we find that as a result, fewer people are clicking on the results. And so this is useful. This is another clue that we can work with. So we are narrowing it down further and further, but we still don't know why. We still don't know what's happening here. And this is where we can start to look at the analytics of the page itself. And when we dive deeper, this is what we found: all these URLs were good according to the search engine.
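
As an aside, the chain of reasoning here (clicks depend on click-through rate, while how many people see the result at all depends on ranking position) is easy to automate once you export the numbers. A minimal sketch in Python; the field names are hypothetical placeholders for whatever your Search Console export uses, not an official schema:

```python
def diagnose_click_drop(before: dict, after: dict) -> str:
    """Compare two periods of search stats and say which factor moved.

    `before`/`after` hold an average click-through rate (0..1) and an
    average ranking position (1 = top; a LARGER number is a WORSE rank).
    """
    ctr_shift = after["ctr"] / before["ctr"] - 1          # relative CTR change
    pos_shift = after["position"] - before["position"]    # rank slots lost
    if abs(ctr_shift) < 0.05 and pos_shift > 1:
        # People who see the result still click it; they just see it less.
        return "position dropped: fewer impressions, same appeal"
    if ctr_shift < -0.05:
        return "CTR fell: titles or snippets may be less compelling"
    return "no clear search-side cause"
```

Fed the pattern from the talk (CTR roughly flat, position collapsing), it points at ranking, which is exactly the clue that led to the page-level investigation.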

They were good. And then something happened, specifically here, that completely changed what was happening in that community. All these good URLs are suddenly bad, and that's why it ranks even lower. And again, we're narrowing down even deeper now. But we don't know precisely why or what's happening. So what I began doing here is taking different URLs and putting them into the Google PageSpeed Insights tool to really try and get to the root cause of what's happening. And this is what one of the results showed. When you look at this, what occurs to you about this? What stands out on this page for you? You can just shout out the answer if you like. No preview. Sorry. No preview. No preview.

Audience member 1: Very slow.

Richard Millington: Yes. What we find is that performance, okay, it's not great, but most community platforms aren't great. What stands out to me here is this: the time it takes to load the first piece of content from that community, and the time it takes to load the largest piece, is five seconds, which means when you click a link, you have to wait 2, 3, 4, 5 seconds for that to load. That's a heck of a long time to wait for a page to load. And so now we know something really interesting. It's not about anything that's changed on the community itself. We know this is a hosting issue: you're requesting information from the host, and it's taking that long to respond. And when we actually drilled down to what was really happening here and the real cause, what we found was this: the platform vendor that was hosting all of these online communities, including our client's, had moved hosting from AWS to another host.
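
The paint-timing check described here can be reproduced against Google's public PageSpeed Insights v5 API. A sketch, assuming the usual Lighthouse report layout (audit keys can shift between Lighthouse versions, so treat the paths as an assumption to verify):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def extract_paint_metrics(report: dict) -> dict:
    """Pull first/largest contentful paint (in seconds) out of a
    PageSpeed Insights v5 report. numericValue is in milliseconds."""
    audits = report["lighthouseResult"]["audits"]
    return {
        "fcp_s": audits["first-contentful-paint"]["numericValue"] / 1000,
        "lcp_s": audits["largest-contentful-paint"]["numericValue"] / 1000,
    }

def check_url(url: str, lcp_budget_s: float = 2.5) -> dict:
    """Fetch a live report and flag pages that blow the LCP budget
    (2.5 s is the commonly cited 'good' threshold for LCP)."""
    with urlopen(f"{PSI_ENDPOINT}?{urlencode({'url': url})}") as resp:
        metrics = extract_paint_metrics(json.load(resp))
    metrics["slow"] = metrics["lcp_s"] > lcp_budget_s
    return metrics
```

A five-second LCP like the one in the story would come back flagged, and running the same check across many community URLs is how you'd spot a host-wide regression rather than a single bad page.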

And when they had done that, the speed of all their communities had plummeted, like absolutely collapsed. And they hadn't even realized it for months. They hadn't realized it. And so the lesson here, and what I really want to talk about today, is that the best insights, the ones that really change the game, the ones that are really going to help you build any kind of community, come when you dig deeper into the data that you have. Dig deeper. There are great mysteries everywhere. I feel like so often I look at any data set out there and there'll be a mystery in it. And I find that such an exciting thing. I remember years ago there was a story, I think in the New York Post, and it had this picture here. Essentially what happened is that some construction workers in New York had tunneled into an ancient (or ancient for the USA, anyway) burial chamber.

And what they found was that there are coffins and it's a burial site of some sort. And the article explained who these people might've been. The article explained how this might delay the construction work and why they might've been buried here. And then the article ends. And I don't know about you, but the thing that occurs to me most when I look at this picture here is: what's behind the door? Is that not the mystery here? Is that not the mystery that calls to you: what's behind the door? Because if there's an exit, that means somewhere in New York City there's an entrance to this, and no one seems interested. I'm absolutely obsessed with this story. And in case you think this kind of thing doesn't happen often: in 1963, a man in Cappadocia is remodeling his basement, knocks through a wall and finds a tunnel to an ancient underground city that's 12 stories deep.

And I've been here, you can visit it, it's absolutely amazing. There are places of education and worship there, and places where people used to eat food. It's absolutely incredible. But imagine if you had knocked through that wall, looked at the tunnel and been like, huh, that's interesting. Time for lunch. But that's what we do with data all the time. We have these great mysteries that show up and we ignore them every single day. A very common thing I see is I'll be on a call, I'll see in the data a line like this: engagement was up 17% in the last quarter, and next quarter we're planning to increase the number of MVP people in the community. And then I have to be like, whoa, whoa, whoa, whoa, whoa. Why is engagement up 17%? Are you not curious? Did something change? What did we do differently during this time?

Why is engagement up? Because once you start digging deeper into these kinds of insights, once you start digging deeper into this data, that's where you get really useful things that you can use to change your community. For one example, we had a graph that looked like this, and what we noticed over time is that the number of people from India and the Philippines is growing and growing, but it's only a couple of percentage points a year. But once we know this, this is fantastic. We can do really great things with this. We can host more events in that region. We can consider the translation implications of that. We can make sure these people are represented in any programs that we are doing. Once we have this data, it's fantastic. Or another example: we find that engagement is growing because more people are coming via search.

And again, it fluctuates over the years, so it's easy to miss. But once we start digging deep into that data, you can be like, huh, maybe we should spend more time on search. And this is especially interesting, actually, because when you look at the data behind this, or when you look at what organizations do, what we find is that they spend so much time on their newsletter and on social. But when 75 to 80% of your traffic comes via search, maybe that's where we want to invest. Another example, from I think around half a year ago: I was on a call and someone said, fewer members are participating in groups, so we're planning on closing most of them down. And then I have to be that guy who's like, whoa, whoa, whoa. Why are fewer people participating in groups? What's changed? Are they going somewhere else for that?

Did they have a bad experience with those groups? Are they just not interested in the topic anymore? It's when we answer these kinds of questions that we get really useful insights. We shouldn't do anything with groups until we figure out what's actually going on. And so what I recommend here is that you have to let these mysteries call to you. You have to let these mysteries call to you, because that's how you turn information into insights that you can actually use. And you can also use data to solve any problem that you have. Let me give you an example, and you might have to turn to your side screens for this. Let's imagine that in your community, the number of posts has declined recently. Well, there are only two possible reasons, in a purely data sense, why that could have happened. Option one is that people that were active are now participating less. They're still engaging, but instead of making, say, five posts a month, they might be making just one post a month.

Or option number two is that fewer people are participating in the first place. And it's one of these two options. And once you know that, you can break it down even further. So what you can see here, if it's posts per active member: is that because you've got fewer new discussions or fewer replies to those discussions? These, again, are two completely different issues. Or if fewer people are arriving in that community: is that because the number of registrations has dropped, or is it that people are returning less often? And you can use your data to narrow down precisely which one of these it is, and it'll take you to the problem that you might be able to solve. It could be that if there's a decline in new discussions, members aren't able to easily post a new discussion. It could be that members have fewer problems to solve in the first place.

It could be that if it's a decline in replies, members have less ability to reply or less motivation to reply. If there's a decline in new registrations, is it fewer people coming to the community or fewer visitors converting? Again, different kinds of challenges. But the whole point is that you can take that problem of engagement and narrow it down to the specific problem that you want to solve. Instead of guessing what might work, you can be very specific and very clear about what the problem is. And then you can identify the kind of solutions that might actually work. And notice here, there's what, 1, 2, 3, 4, 6, 7, maybe eight possible solutions there. If you are guessing what might work, you have what, a one in eight chance. But if you use your data, you're far more likely to get to exactly the right kind of solution that can have the biggest possible impact.
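
The decision tree in the last few paragraphs can be sketched directly in code. This is an illustration only: the metric names are hypothetical, and "whichever side moved more" is a crude stand-in for proper attribution:

```python
def decompose_post_decline(prev: dict, curr: dict) -> str:
    """Attribute a drop in total posts to one branch of the tree:
    fewer active members vs. fewer posts per member, then one level
    deeper on whichever side moved more (keys are hypothetical)."""
    prev_rate = prev["posts"] / prev["active_members"]
    curr_rate = curr["posts"] / curr["active_members"]
    member_change = curr["active_members"] / prev["active_members"] - 1
    rate_change = curr_rate / prev_rate - 1
    if abs(member_change) >= abs(rate_change):
        # Fewer participants: is the funnel losing new or returning people?
        reg = curr["registrations"] / prev["registrations"] - 1
        ret = curr["returning_members"] / prev["returning_members"] - 1
        branch = "registrations" if abs(reg) >= abs(ret) else "returning members"
        return f"fewer active members, driven mainly by {branch}"
    # Members post less: are new discussions or replies drying up?
    disc = curr["new_discussions"] / prev["new_discussions"] - 1
    rep = curr["replies"] / prev["replies"] - 1
    branch = "new discussions" if abs(disc) >= abs(rep) else "replies"
    return f"lower posts per member, driven mainly by {branch}"
```

Each returned string corresponds to one branch of the tree, which is what tells you which of the eight-or-so candidate fixes is even worth considering.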

And you can also use this to achieve any goal that you want, if you treat your goals as a problem to solve. You can use the same methodology to do that, and you can use a systematic and logical approach for that. So let's imagine you have a goal of, say, increasing the number of views of content by 10%. We can use a very similar approach to this. First, there are two ways of achieving that. One is that when people do visit, we can get them to view more of the content in that community. Or two, we can get more people to visit in the first place. Again, two completely different strands here. And once you start picking away at this and you figure out the different options of doing that, there are different levers that you can pull. And so pulling the right lever is the real challenge of your work, and you use data to figure that out.

If page views per visit are down, you could solve that by driving visits to existing content or creating new content for people to visit. If you want to increase the number of visitors, you can attract new visitors or get people to visit more often. And again, this tells you what kind of problem you want to solve. This tells you whether your problem is one of awareness. Maybe when people visit content, you want to have other content displayed alongside it. If it's an interest issue, you can find out what kind of content people are interested in and create more of that. But the whole point is that before we dive into solutions, which is always the very last step, we diagnose what the problem is. And so what I want to do is take you through some of the mysteries that we've been going through over the last couple of years and explain how we went about solving those mysteries.

And you could do this with me as well if you like. So mystery number two, this was a client in the technology space. And after an event in London a couple of years ago, my contact there said, the problem we have is that the satisfaction scores of people in our community are really low. The CSAT scores were lower in the community than any other support channel. They're lower than virtual agents, they're lower than chatbots, they're lower than the customer support reps. And so why is that happening? And the first thing we look at again, is what's actually happening within the community. We look at the performance of the community itself, and in this case, we look at the average time to first response. We look at the response rate, we look at the percentage accepted solution rate, all the things we typically look at. And what we found is that the community performed fine.

There was no major issue here, so it wasn't an issue with the performance of that community itself. The next thing we look at was: does it vary much by location or language, or does it vary by month? And the answer is no, there's no major difference here. So it wasn't that the scores had been getting worse; it was just the case that they had always been bad. And again, that's a really interesting clue. And then we begin segmenting the results by the products themselves. And what we find is that for most products that this brand sold, the scores were okay, but for one product, they were abysmal, absolutely abysmal. They were dragging down the average of everything else. And so this is a really interesting clue for us. This is the kind of clue that gets us quite excited. And here we were kind of stuck for a while, trying to figure out why.

And so I went through the survey results, and it was survey result after survey result after survey result. It's a whole mixture of really interesting things, but there were two responses in there that really surprised me, and they were one below the other. The first one said: so now I'm just stuck with a product I can't use. Thanks a lot. (By the way, the sentiment analysis thought this was a positive thing, which is why you should be careful about that.) The second: you shouldn't have a community if you can't help anyone. And I'm looking at this and I think there's something interesting going on here. I wonder what that is. It feels like people are asking questions, they're getting responses, but they're not getting the kind of response that actually helps. And so I contact a customer support rep, and they tell me that for this product, there's a certain type of problem that comes up again and again and again. And there isn't a solution for it. They have to buy a new version of the product. And so the community still helps. It gets them to the end of the journey, but it doesn't give them the answer that they want. And so we can think, okay, now we've gotten to the bottom of this. But that doesn't explain why the community would be worse than customer support channels.

Surely the same problem affects all the channels equally. And so we were stuck on this for a while, really trying to figure out what was going on. And eventually I decided, you know what? I'm going to take a problem that's in the community and I'm going to phone them up and see what they say on the support channels. So I take a problem in the community, I phone them up and I'm like, hey, this is the problem I'm having. They asked me for my customer ID information. I'm like, yep, didn't think that through. And so eventually, it takes a long time, but we find a customer that's willing to go through this. And he said, at first everything is normal. They tell me that they can't help. And then right at the end of the call, the support rep says, you should post a question in the community instead, because they will probably be able to help.

What these support reps had realized, and I do kind of admire it to be honest, is that they can increase the ratings, the scores that they have, by telling everyone with a problem they can't solve to post in the community instead. The community wasn't making people more upset and more angry; it's just where all these upset and angry people had to go, with nowhere else that they could turn. And things like this happen. It really helped my contact to have this kind of information. But notice, when you dig deeper into the data, these are the kind of insights that you get. But the real lesson from this, I think, is that sometimes you just have to experience the problem the way that your members do. You'll be amazed how often, when you go through the experience that your members really go through, the results are different from what you might expect.

Mystery number three, kind of similar to what we've just had: why did a community suddenly cause people to cancel their subscriptions? This was a client in the gaming space, and what they noticed recently was that the community was suddenly driving most of the people that were canceling their subscriptions. It was a very sudden change, and they couldn't work out why. And so we do what we do first: we look at the community itself, and there was no major change in the sentiment. People weren't angrier in this community. There's nothing in particular that had changed here. The performance was more or less the same. The next thing we look at: was there any change in the technology? Did the community platform change at all? Is that what's driving people to cancel their subscriptions? (They're tracking this from where people were before they canceled the subscription.) And the answer was no, there's no major change here at all. And so then we look at the change in cancellations overall. And think about this for a second: for the game overall, the number of cancellations had not changed, but from the community there's suddenly a massive increase in cancellations. So what's happening here? You can just shout out the answer if you think you know it.

Audience member 2: People waiting for community to begin with.

Richard Millington: It's not a bad guess. No, but thanks for the attempt. It's not a bad guess.

Audience member 3: It's how they find how to cancel.

Richard Millington: Yes. People have shifted where they cancel from one channel to another, and for some reason now they're canceling from the community instead. And now the question is why? Why are people canceling from the community? And we looked at the source, and it wasn't the community overall. Specifically, it was from a discussion that was 18 months old. So 18 months ago, someone had asked a question: how do I cancel my subscription? Someone had responded, and that was it. But suddenly, 18 months after that, there's a massive spike, not in engagement, but in views of this discussion. And finally, we have a hunch, we test it out, and this is what had happened. The people that make the game, no, you've given away the mystery. I might need a new device here. This is just dying on me. So: the link to the cancellation page had been removed from the FAQ and the account section.

And that means that when people were looking to cancel their account, instead of the place where they used to go, they'd go to a search engine, they'd ask there, they'd arrive at the discussion, and they'd cancel from there. And so the lesson here, I think, is that when you see a spike in the data, that's usually a technical issue. It's usually a technical issue; something has suddenly changed. When you see a slope, that usually means there's a gradual change. And generally speaking, when you want to fix an issue, it's much better to fix a spike than a gradual change that reflects the demographics, habits, and preferences of the audience. Mystery number four, my favorite one of the bunch. Out of all the mysteries I've dealt with over the years, this is the most, I dunno if I'm allowed to swear, this is the most something, something crazy mystery I've ever had.
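
The spike-versus-slope distinction can be roughed out with a rolling z-score: a spike is a single point far outside the recent distribution, while a slope drifts without ever producing one shocking value. A toy sketch; the window size and thresholds are arbitrary choices, not anything from the talk:

```python
from statistics import mean, stdev

def classify_change(series: list[float], window: int = 7, z_cut: float = 4.0) -> str:
    """Flag a sudden jump (suspect a technical change) versus a gradual
    drift (suspect changing habits or demographics)."""
    for i in range(window, len(series)):
        ref = series[i - window:i]
        sd = stdev(ref) or 1e-9  # flat windows would otherwise divide by zero
        z = (series[i] - mean(ref)) / sd
        if abs(z) > z_cut:
            return f"spike at index {i}: check for a technical change"
    first, last = mean(series[:window]), mean(series[-window:])
    if abs(last - first) > 0.1 * abs(first):
        return "gradual slope: look at habits and demographics"
    return "no notable change"
```

An overnight collapse like the cancellation-page story trips the spike branch immediately; a slow demographic shift like the India and Philippines growth never does.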

So this is from a retail client we worked with, who are very successful at what they do. And they mentioned that they were suddenly receiving a surge of traffic from Kansas in the USA. And I don't know much about Kansas in the USA, but I think there's a reason for that. Sorry if anyone's from Kansas here. And so we look at the clues, and the first thing that we notice is that it's a cliff edge. So Kansas: not much going on, not much, not much, and then suddenly there's a massive spike. And so what does that mean?

Yes, yes, we're getting there. It means that there's most likely a technical issue here. And so, okay, there's a technical thing going on in Kansas, and then we start looking at the data itself. Oh, no. Okay, so when we dig deeper, it's not Kansas itself. It's a tiny town in Kansas called Lebanon. And this was the best photo that I could find of it, so that gives you some idea of what this town's about. So this is kind of interesting. All the surge is coming from this one tiny town in one state. So I'm interested in this. I'm like, why? Why? And so let's check out the town. I mean, maybe there's something interesting about the town. I go onto the Wikipedia page and the first thing I notice is the demographics. A population of 178. Just to be clear, that's not 178,000. It's 178 people that live in the town. And this is a really interesting clue, because the clue here is that the data is wrong. What it's telling us is not possible. It's not possible for this to be happening. So we have a clue. But why Lebanon? This doesn't make any sense. And I started researching this more and more, and there really isn't anything special about the town at all, except one thing.

Can anyone guess what it is? Yeah, you see...

Audience member 4: It's in the middle of the USA. So it's got that GeoIP problem.

Richard Millington: Were you just trying to say the middle of the USA? You've given away the ending, but yes. Congratulations, by the way. Yeah, it's in the middle of the USA. And it's actually on the Wikipedia page itself. The only interesting thing about this town, again, apologies to anyone that's from this town, or all 178 of you (how amazing would that be?), is that it's in the middle of the USA. And so now we have another clue, and it's kind of interesting, although this guy just gave it away earlier. What essentially happened is that the analytics tool had changed the way that they were locating individuals. They had noted this update in the release notes: essentially, when they couldn't determine what the IP address was, they were assigning it to the middle of the USA instead of to their headquarters, as before. And so it wasn't that there was a surge of people from this one tiny town; it's just that they were assigning people to this one tiny town in the USA. And the lesson here, I think, is that when the data doesn't make sense, validate the data. You'll be absolutely amazed how often the data is just wrong, how many times the data is just wrong. So here are some mini mysteries for us all to work through together. Mystery number one, on the left, was a community in the scientific space where we suddenly noticed a huge increase in visitors from India who had Chinese names but the USA as the country in their profile. Mystery number two was a community that had a sharp decrease in the time to first response of their discussions, so how long it took for a discussion to receive a response. However, many discussions had a negative time to first response, which means the response is appearing before the discussion had been posted in the first place. And there's no overall change in the response time.
Mystery number three, visitors from Europe plummet, but the post count from people in Europe hadn't changed, and it wasn't any issue with the translation tools. So if you think you can solve any mystery, one, two, or three, just shout out now. Yes.

Audience member 5: Is the last one, GDPR, that you're not getting analytics from?

Richard Millington: Yes. Congratulations. Yes. Yeah. So it turns out if you don't accept the GDPR tracker, you're not tracked, but your posts still appear in the community itself. Nice one.

Audience member 6: First one's, spam farms, something like that.

Richard Millington: No, but it's not a bad guess. Yeah,

Audience member 6: This one's VPNs.

Richard Millington: I don't like you anymore. Yes. So what happened is that the platform had been banned in China, and they were using a VPN; the cheapest one was in India, and then the USA was the default.

Audience member 7: Is the second one a time zone issue or something? The software has changed how it records times?

Richard Millington: No, but the software did change. Why would our discussion have a negative time to first response?

Audience member 6: We posted like it was a

Richard Millington: Warmer. Warmer. Sorry. Someone over there shouted out.

Audience member 8: I said the question had been answered previously, but re-asked.

Richard Millington: Warm. Not quite there though. Nope. Sorry about that. I don't want to be rude, but no

Audience member 6: Time zone or daylight.

Richard Millington: Nope. I've got a mystery that you can't solve. Okay. So the cause is, it turns out, that if you edit a discussion after it's been posted, it updates the timestamp of the discussion. And so when authors simply changed what they had posted, it showed up as the response appearing before the discussion was posted in the first place. And yeah, European GDPR trackers, okay. The final mystery that we're going to cover today is: why did engagement decline in the community of a big tech brand after their big event? This was a very well-known brand. They would have a big annual conference every year, and then they would find that the level of engagement in their community declined significantly. And their theory was a simple one, which is that people are burned out after the event; they're really exhausted and they just take some time off. And I think that might be true for a couple of days, not a couple of months. You don't need that much time to recover from an event. And so we begin looking at the data here, and what we find is that people love the event. There was no question at all; people really loved the event. We find that, again, engagement after the event declines and remains lower for several months. And then we find that the number of visitors remains the same, but the number of posts drops. The number of visitors remains the same, but the number of posts drops. Would anyone like to guess? Yes.

Audience member 6: Companies save up a whole load of stuff to announce at their annual event, and then there's a sort of dip; new products and new features don't happen for a few months afterwards because they've shipped everything for that event. So there's a...

Richard Millington: No, but that would've been a good thing to test actually. Yeah. Yeah. Anybody else? I have to be mindful of time.

Audience member 6: Were their questions all answered at the event?

Richard Millington: Nope. Yes.

Audience member 7: Did they change the page or

Richard Millington: No, actually the answer is really interesting on this one. It's one of the ones where a spike doesn't necessarily reflect a technical change. So what had happened, when we spoke to them, is that the decline was almost entirely amongst the people that attended the event. So it kind of says, well, maybe it is something to do with the event itself. Maybe people are burned out afterwards. But then we spoke to them, we interviewed I think five or six of them, and the answer soon became really, really clear. I asked them, what's going on? Why are you participating less? Why have you stopped talking to all these people that you like? And they said, I haven't.

We connected on WhatsApp at the event, and now we're chatting on WhatsApp and Slack and other channels that you don't control. And that's interesting, because that's kind of the end of the mystery in a way. But is it? Because we have this functionality in the community: if you want a group, we'll build you a group. But they don't want that. And there's actually a lot of data that says this happens more often than you might think, where people attend an event, they meet in person, and then they connect in private groups instead. So your top members attend an event and engage more than ever, just not in a platform that you control. And so the question is why? And the way we get to the bottom of this is, again, by interviewing and doing surveys of people that attend the event, to figure out where they go and why. And so what we do is we look at the needs and desires that the audience has. We look at the level of desire, we look at how frequent the need is. By the way, if you want to predict how much engagement you're going to get in any community, this is a great way of doing it. Where would you go to solve that need? Why do you go there, and why not the community?

And what we find is that yes, they want to connect with peers, and they use the community to find peers that they can connect with. But once they do, they want to go to a private group. And we can fight against that if we want, or we can embrace that and support it. And this was the end of the mystery for us. We figured out what was going on and why people like WhatsApp: because it's easier to use and it's private. There's nothing really we could do in the platform to change that, so we can support it instead. And the lesson here, I think, is that we use data to narrow down the options, and then that qualitative research is absolutely key. And so, summarizing the lessons that we've covered: one is that you have to dig deeper into your data to find the real cause.

It's so easy when you have any issue to just start guessing what might work. There are so many things that my contact in the first story could have done to try to increase the level of engagement. But the odds that any of those things will be correct are very small. It doesn't matter what you do in your community if fewer people are reaching that community in the first place. Two, experience the problem the way your members do. You have to experience the problem as your members go through it. Three, a spike usually reflects a technical change, and a slope usually reflects a gradual change. Spikes are easier to fix. But there are some exceptions to this: if there's a pandemic, an election, or a major event, for example. Four, validate data that's strange. You'll be amazed how often the data just doesn't make sense, but you can validate it and change what you're doing as a result.
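Richard's spike-versus-slope rule of thumb can be turned into a rough heuristic: if most of a metric's total decline happens in a single step, treat it as a spike and look for a technical change around that date. The function and threshold below are illustrative choices, not something from the talk.

```python
def classify_decline(series, spike_share=0.5):
    """Label a declining series (e.g. weekly post counts) as a 'spike'
    (abrupt drop, often a technical change) or a 'slope' (gradual decline).

    If the single largest week-to-week drop accounts for at least
    `spike_share` of the total decline, call it a spike.
    """
    total_drop = series[0] - series[-1]
    if total_drop <= 0:
        return "no decline"
    biggest_step = max(a - b for a, b in zip(series, series[1:]))
    return "spike" if biggest_step / total_drop >= spike_share else "slope"

# Made-up weekly post counts:
print(classify_decline([500, 495, 490, 250, 245, 240]))  # abrupt drop in week 4
print(classify_decline([500, 450, 400, 350, 300, 250]))  # steady erosion
```

In practice you would eyeball the chart first; a heuristic like this is only useful for scanning many metrics at once.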

And five, use data to narrow down your options. You don't need to guess. Use data to narrow down the options, and then you can start selecting the ones that will have the biggest impact for you. And ultimately, don't waste resources solving the wrong problem. So many of the clients we have spend so much time and energy and resources solving the wrong issue the wrong way. And finally, if you want to get my last book for absolutely free, there's a QR code here that you can scan. Please feel free to scan it. I just want to say, DevRelCon, thank you so, so much. It's been an honor to speak to you. Thank you.

Matthew Revell: What a great way to kick off our second day of DevRelCon. Thank you very much. Richard. Does anybody have a question for Richard?

Audience member 9: Richard, I'm going to be a fanboy and say I'm a huge fan since Buzzing Communities, so thank you for coming. So, Community Everywhere, which I think is what you were talking about there, is a recent blog post of yours. Given that people are moving to all these different places, having those private engagements in those groups, what can we do to keep up with those conversations, and how do we measure them?

Richard Millington: So there are tools like Common Room, I think Josh is around here somewhere, and tools like that can definitely help you keep track of those conversations. But in terms of what we measure: for us, we try to measure what the overall level of engagement is, and there are tools that can help with that. But for me, the thing that matters most is, do you change the sentiment of the audience? Changing the attitudes: are they more likely to buy from you or not? Are they more likely to contribute to your projects or not? So what we've been using is what we call the community-driven impact score, which, if you read the blog, you might have read about, which is: to what extent has the community encouraged or helped you achieve your goal? And you can change those words however you like. But once you have that, you can track it over time. So if you need a pure ROI metric, you can do that. The formula is quite complex, but it's a way of doing it. But for me, I like the change in attitude regardless of what channel you use. The change in attitude, I think, really, really helps. I'm happy to have a bigger discussion later, and thank you for reading the blog for 12 years. You need a hobby or something!
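FeverBee's actual community-driven impact score formula isn't given in the talk; as a minimal illustration of the idea (a survey question on how much the community helped with a goal, tracked over time), you could average the answers on a 1 to 5 scale and rescale to 0 to 100. Everything below, including the response data and the rescaling, is a hypothetical sketch.

```python
from statistics import mean

# Hypothetical answers to "To what extent has the community helped you
# achieve [goal]?" on a 1 (not at all) to 5 (a great deal) scale.
responses_by_quarter = {
    "2023-Q1": [3, 4, 2, 5, 4, 3],
    "2023-Q2": [4, 4, 3, 5, 5, 4],
}

def impact_score(responses):
    # Rescale the 1-5 mean onto 0-100 so quarters are easy to compare.
    return round((mean(responses) - 1) / 4 * 100, 1)

for quarter, responses in responses_by_quarter.items():
    print(quarter, impact_score(responses))
```

The point is the trend, not the absolute number: the same question asked the same way each quarter shows whether the community's perceived impact is rising or falling, whichever channels the conversations happen in.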

Matthew Revell: Anyone else? Yep. Mic coming.

Audience member 8: Do you have any data mysteries that are still on your mind? And if not, or if so, what was the worst one that was the most mind boggling?

Richard Millington: I mean, there are so many mysteries in communities that have blown my mind. Some data related, some not. One of my favorite ones was in the first community that I hosted, which was a gaming community when I was 15 or 16 years old. And we used to do a Secret Santa thing. Everyone would participate. I think Reddit do it now. And we had this guy that wrote a program for it, and at the end of it, no one got their gift, and we couldn't work out why. And do you know what happened? The guy that wrote the program just sent out his own name to everyone. So that's what occurred to me first, but I'll answer your question. He volunteered to create this thing and just sent out his own name in the Secret Santa to everyone. I thought that was amazing. Any mysteries still on my mind? I think there's a lot about the value of communities that is really interesting. I think a lot of people are measuring it wrong, but I might write about that. I can't think of anything else. I'm kind of on the spot and I'm pretty nervous about it. Yeah, thanks anyway.

Matthew Revell: We've got time for one more question. I think if anyone would like to. Okay. Well let's thank Richard for speaking to us today. Thanks.