DevRel as a growth engine

Anna Filippova
Director of Developer Marketing at Snowflake
DevRelCon New York 2025
17th to 18th July 2025
Industry City, New York, USA

Anna draws on her experience at GitHub, dbt Labs, and Snowflake to tackle the uneasy topic of measuring DevRel impact.

She argues that meaningful metrics often start with qualitative insights—writing down who you talk to, spotting patterns, and connecting them to business outcomes.

Don’t wait for perfect telemetry. Instead, lead the story with the data you already have, and measure humans, not hits.

Watch the talk

Key takeaways

  • 🗣️ Treat conversations as data: Capture who you talk to, what you learn, and turn qualitative insights into structured information.
  • 📈 Lead your own narrative: Share wins and stories early to shape how leadership perceives DevRel’s value.
  • 🧭 Pick one north-star metric: Focus on a single, human-centred measure that aligns DevRel impact with business goals.
  • 🤝 Bridge disciplines: Borrow growth, product, and community tactics to find where DevRel can drive measurable change.

Transcript

Anna: I'm going to talk to you about growth and I'm going to talk to you about what I've learned about growth and juxtaposing growth and DevRel over the past, I don't know, 15 years or so, TLDR.

Setting the Stage: Growth and DevRel

So I'm going to walk you through a few things. Today we're going to start with kind of acknowledging the ick behind measuring some of the things that we're doing. We're going to have a little bit of a catharsis moment up here. We're going to talk about how data is more than just numbers. I promised you that in the abstract. So we're going to cover that and I'm going to talk to you a little bit about what I've seen in terms of leverage that you can get from adjacent disciplines to DevRel.

I've done a whole bunch of things in my career and I'm going to try and pull a little bit of that into this conversation. And then we're actually going to get into the meat of measuring what matters, or what I think matters. Hopefully some of it is new; if it isn't, at least it's validating. And then we'll talk about metrics scaffolding, right? I'm not going to give you a framework, I think that's a little heavy-handed, but kind of a getting-started point. So actually, before I do that, let me get to know you guys. Let's see, who am I talking to? Okay, so how many people here in the room are running a DevRel team? Okay, cool. Yes. All right. How many of you, whether you're running a team or not, are interfacing directly with someone who's breathing down your neck about metrics and value?

There we go. Yes. Okay, great. How many of you are kind of new to thinking about measuring the work that you're doing and thinking about it for the first time? Great. Okay. All right. So kind of like a decent mix, but I'd say most of you are probably somewhere along this journey; you've already kind of put some thought into this, right? All right, so I think let's do this a little bit more like a conversation then.

Why Listen to Me: Measuring the Immeasurable

But first, a little bit of an obligatory "why you should listen to me" portion of the talk. So I kind of have a history of measuring squishy things about developers. In a past life, I was a researcher, so I used to study open source software and squishy things about open source software like people and conflict. And the fun part about my PhD was trying to turn that into numbers.

How do you measure that? How do you go in front of your dissertation committee and justify that? I've been doing this for a long time. I'm kind of a dinosaur standing up here. Everyone's here talking about AI. I remember when Microsoft was still suing Linux, right? I remember that. That was a while ago, right? We've come a long way since then. All right.

From Microsoft vs. Linux to Measuring Open Source Impact

I was also at GitHub when this happened. So we've come a long way since Microsoft was suing Linux. Now they're kind of a bastion for open source after the GitHub acquisition, and kind of a model for how to do that in the industry as well. So I was really lucky to see that, and I was on a data team at the time, so I learned a lot about how to think about the impact that something with that amount of brand visibility and value has on your community. So we spent a lot of time thinking about measuring the impact of this acquisition.

Let's see, what else have I done? Oh yeah, a little bit after that, I was at dbt Labs, and there we built a developer hub. And actually this is one of the things that a previous speaker just talked about, AI search on docs. So we put docs and a bunch of other things all in one platform, all your community resources, and put AI search over that. I was also at Streamlit for a while, running open source and go-to-market generally. We started to build a growth engine there, and now I'm at Snowflake. Snowflake has a much bigger DevRel team than I think most people are used to. So we have an actual dedicated DevRel function. We have a community function that does all the events and things, and they do hundreds of events, tens of thousands of developers every year. And I have the fun job of thinking about the platform that underpins all of that. So think super big events, annual conferences, biannual conferences, all of our developer channels, content resources, things like that. I have a programme management team, and we also have a bunch of folks who are either ex-DevRel or ex-growth marketing who think about weaving all that into campaigns. So we've had the opportunity to grow and think and be a little bit more.

The Catharsis: Why Measuring DevRel Feels Wrong

Okay, so that's me. All right, this is the catharsis part of the talk. So let's agree that we all kind of hate this. We're talking about measurement of something that feels immeasurable, and that feels wrong somehow. We're in the business of vibes. Vibes are the output of the work that we do. So how do you measure vibes? How do you go out and say, yeah, I got lots of great vibes and here's a number? People feel good after anything that I did, and that intrinsic value is really hard to put a finger on.

Quantifying Vibes: Lessons from AI Research

But I think there's an interesting parallel from AI research that I kind of want to leave you all with.

There's a really great paper, and I can tweet it or share the link if folks want, but it's at bench.mark.org/blog, and it's basically a paper that quantifies vibes for AI. So there's a bunch of researchers who figured out how to use AI to check other AI for vibes. We know that when we write and we produce with AI, the thing that we do is vibe tests, vibe checks. Does it make sense? Does it say the thing that I want it to say? What's a vibe check? A vibe check is like, all right, this graphic was generated by AI, in case you can't tell. Look at this, look at this. What are these words?

It's not accurate, but it passes the vibe check, maybe, for this event. At least I hope you think it passes the vibe check for this event. It's kind of getting at the feeling of what we're trying to communicate here. So that's kind of what this paper is trying to do: it's trying to quantify vibes, and it's trying to use AI to check for, reproduce, and evaluate those vibes. And, I'm not an AI researcher, so I'm going to dumb this down to my level, but the way that I understand the work that they're doing is they're essentially using a panel of AI agents to do this. So basically, talk to lots of AI agents and see if they agree. How much do they agree, how little do they agree? That's kind of your data point. And then, what do they agree on? So I think it's an interesting metaphor, and it's kind of going to underpin a lot of what I talk about for the rest of the session, because actually a lot of measuring vibes is about talking to people, about talking to someone else, and then kind of stringing all those things together.

And the more you do that, the better the story that comes out of that. So we'll see a couple of examples in a bit, but that's kind of the idea. Am I making sense? Yeah. Okay, cool. All right.

AI did not make the slide. It's not that pretty.
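As a rough sketch of the panel-of-judges idea Anna describes above (this is not the paper's actual method or code; the call_model helper below is a placeholder you would wire up to whatever LLM provider you use), the agreement across judges becomes the data point:

```python
from collections import Counter

def call_model(judge: str, prompt: str) -> str:
    """Placeholder for a real LLM call; returns a canned vote so the sketch runs.
    Swap in your provider's client here."""
    return "yes"

JUDGES = ["judge-a", "judge-b", "judge-c", "judge-d", "judge-e"]

def vibe_check(artifact: str, question: str) -> dict:
    """Ask a small panel of model 'judges' the same yes/no question about an
    artifact and report how strongly they agree; the agreement is the signal."""
    prompt = f"{question}\n\n---\n{artifact}\n---\nAnswer with exactly 'yes' or 'no'."
    votes = [call_model(judge, prompt).strip().lower() for judge in JUDGES]
    counts = Counter(votes)
    verdict, n_votes = counts.most_common(1)[0]
    return {"verdict": verdict, "agreement": n_votes / len(JUDGES), "votes": dict(counts)}

print(vibe_check("Draft blog post text goes here.",
                 "Does this read like it was written for practitioners?"))
```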

Data Is More Than Numbers

So I told you this talk was about measurement, but I'm going to be up here and say data is more than numbers. And so there's a couple of things that I've learned in my career that I think are really useful to think about when we think about measurements. So I think a lot of times when people are like, hey, yeah, okay, my leadership team is breathing down my neck, I need to go and measure a bunch of things, the first thing we do is let's go build a dashboard. Let's go define a thing that we do. Let's go measure some things. But the thing is that sometimes that's premature. Sometimes what you actually want is harder to measure, as we just talked about. So there's kind of three things that I want y'all to think about as you're thinking about gathering more data, and it's not necessarily just numbers.

So first, you can actually make a lot of structure out of qualitative data, and that's a lot of what we do on a day-to-day basis. Did you host a user group, or maybe you spoke at a user group? Write down what you did, write down who you talked to, write down what the lessons were. Can you document the N conversations that you're having every week in a Notion database or something? Point out some of those trends if you can. Then you can make whatever structure you use discoverable. So if you have a database of a bunch of meetups that you went to, is it easy for other people to find it, right? If you're using Notion, put it into a database. If you're using Google Docs, cross-reference and cross-link to those things. Make it super easy to find those things.
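To make that concrete, here is a minimal sketch of the kind of lightweight structure she's describing: one row per conversation, appended to a shared file so it stays discoverable. The field names, file name, and example entry are illustrative, not anything prescribed in the talk.

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("devrel_conversations.csv")
FIELDS = ["date", "who", "role", "where", "what_we_discussed", "key_insight", "follow_up"]

def log_conversation(**entry):
    """Append one conversation to a shared CSV so the whole team can find and query it."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), **entry})

log_conversation(
    who="Platform engineer at a fintech",
    role="Data platform lead",
    where="User group, NYC",
    what_we_discussed="Migrating pipelines, pain with onboarding docs",
    key_insight="Got stuck at credentials setup, not at the SQL",
    follow_up="Share the quickstart; flag docs gap to the product team",
)
```

A few months of entries like this is also exactly the kind of context you can later hand to an LLM to summarise, which is where she goes next.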

And then, yes, I'm a dinosaur, so that's what I've done in my career, but now there's AI. So imagine you have all of this context, you've been writing all of this down; plug it in and ask an LLM to kind of summarise that for you. Suddenly you have some trends. Suddenly you have some interesting insights. The other thing you could do is, there's a fine line between qualitative data and quantitative data when it comes to surveys. So we often want to know a lot of information about who we're talking to, like this morning, right? I'm like, who am I talking to in this audience? But it's a lot of work to put together a survey, especially if you're on a shoestring budget, if you have a team that's really small, you've got a lot of things to do, you've got a lot of people to get out in front of.

The thing is, it's actually okay to do things that are smaller than that. You can run a Discord survey. It doesn't have to take three quarters for you to plan. One of the most insightful things that I've ever done was I asked a question in a Slack group, a really big Slack group, and I still reference that question to this day. This was maybe four years ago. At the time, I was running a data community at dbt Labs, and I really wanted to know who was in there and what people's reporting structures were. Who did data leaders report to? The answer really surprised me. In that community, people reported to engineering. I thought it would be like product. I thought it might be like a CIO or something like that, but it was actually engineering. It was like, head exploding.

But that's been relevant to this day. And so don't get stuck on big structure stuff. Just go out and try stuff. That's kind of point number one. Point number two, you can be a lot more proactive about your storytelling, if the thing that you're worried about is people breathing down your neck. So lead the story rather than reacting to someone else's narrative. As much as you can, ask for time and visibility to tell that story. You've been writing stuff down, right? Ask for time and visibility to tell that story before someone asks you for the ROI. Basically from day one, you get in there, you're like, all right, I'm going to go talk at you. You don't care about this. Great. I'm going to make you care about this. And then the last thing is to think about aligning your narrative with the business. What is the one critical thing that represents what your users do?

If you're working for a SaaS company, for instance, and maybe most of us are, is it some form of active user base? Monthly active, weekly active, it really doesn't matter. Just pick one, align yourself to that, and start watching that metric. How does it move over time? Are you looking at it at least weekly? Do you spot trends? Was there a spike after something? Can you figure out what it is? Becoming the expert on that puts you in a position to lead the conversation, rather than trying to back into how the thing that I'm doing moves that number, because you're the person who understands how that number works. So, all right, let's see. You're going to see, I think, a lot of this over the day. I snooped on other people's slides. I think you're going to see a lot of variations of this.

This is one that I like because this is what I've done.

Bridging DevRel and Growth: Marketing, Product, and Community

But I think that ideally, if you're interested in thinking more about growth and leveraging DevRel as a growth engine, then you're kind of thinking somewhere in between these three circles. So what does that mean? Growth marketing: you're thinking about funnels, you're thinking about campaign tracking, you're thinking about attribution. I think you're going to hear a bunch about these things later on. But growth marketing is very often a kind of sales-led or classic enterprise motion. And the thing that I'm learning, disclaimer, I'm at a big company named Snowflake, and we have a really big, very effective, actually excellent sales motion. The thing that I'm learning is you don't need to be scared of that or try to distance yourself from that. It can actually be extremely complementary if you work in a way that feeds into that.

So for example, some stuff we're doing at Snowflake looks like this: we've got a trial onboarding flow that drops people into a Salesforce database, and then someone reaches out to them. But we can also augment the emails that people receive with really nice developer content. We can make that really pretty for folks. We can make that really accessible. We can personalise that a lot and create more leverage from the things that we've already done on the dev team. So that's kind of one example. All right, let's see. And then the other side of that is product-led growth, right? PLG. So think about your sales stuff over there, and then the product-led growth stuff. That's the kind where the product sells itself. You're signing up and you're not really talking to a human, and you're like, great, cool, amazing. Love this thing.

Here are the metrics that you're usually thinking about: you're thinking in cohorts, you're thinking in terms of activation and adoption, you're thinking about time to value. But where DevRel and product-led growth motions intersect in really interesting ways, I think, is in helping the business refine the aha moment for developers who are adopting your product. Because you're there every day. You're seeing people kind of struggle. You're seeing the kinds of questions people are asking. So what do you know about what causes people to go, oh my god, I love this stuff, and how can you help the business think about quantifying that? How can you help the business move people along that journey a little bit more, right? That's actually where you plug in. That's where you can show a lot of value.

Another example of how we've done this. Let's see. When I was at dbt Labs, we actually updated the getting started tutorial. It was very opinionated, but it was opinionated a little bit too early in the journey. So we updated it and we kind of changed what the aha moment was, based on what we understood from the community. For folks, I don't know how much context to go into, but we went from explaining what a common table expression is and how to restructure your SQL to how to go and take your code into production, essentially. Because whatever you're productising, the aha moment is when you're collaborating with people. When are you pulling other people into the process? That's your aha moment. So we updated our materials to reference that, and surprise, it led to better onboarding.
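As an illustration of the product-led growth measures mentioned above (cohorts, activation, time to value), here is a minimal sketch. It assumes a hypothetical export with one row per developer, a signup timestamp, and a timestamp for whenever they first hit whatever you have defined as the aha moment; none of this reflects an actual Snowflake or dbt Labs dataset.

```python
import pandas as pd

# Hypothetical export: one row per developer, with signup time and the first time
# they hit your defined "aha moment" (NaT if they never did).
events = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6],
    "signed_up_at": pd.to_datetime([
        "2025-05-05", "2025-05-06", "2025-05-07",
        "2025-05-12", "2025-05-13", "2025-05-14",
    ]),
    "first_aha_at": pd.to_datetime([
        "2025-05-06", None, "2025-05-20",
        "2025-05-13", "2025-05-15", None,
    ]),
})

# Weekly signup cohorts, and the share of each cohort that reached the aha moment
# within 14 days of signing up (a stand-in for "time to value").
events["cohort_week"] = events["signed_up_at"].dt.to_period("W")
events["activated"] = (
    (events["first_aha_at"] - events["signed_up_at"]) <= pd.Timedelta(days=14)
)
activation = events.groupby("cohort_week")["activated"].mean()
print(activation)  # compare cohorts before and after a docs or tutorial change
```

Comparing the cohorts that signed up before and after a change like the dbt Labs tutorial rewrite is one way to show whether the new aha moment actually moved onboarding.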

Then finally, community-led growth, the other thing that you can pull from. Let's see. Community-led growth is things like network effects, so enabling others to talk about you on behalf of the company. Here, you're thinking about depth of engagement. You're actually thinking about how to get more folks into this process. You in DevRel are talking to people all the time who are extremely technical and maybe extremely engaged. So you're probably interfacing with the super users of the future of the company, and you are in a position to bring them closer into the orbit. So even if you're not running a community team directly, maybe you're partnering with them, you're actually seeding, or you can help seed, those engines.

Measuring What Matters: A Practical Funnel Approach

All right, all right. So let's see. We've got about five minutes left. That's good. Let's walk through the meat of the stuff: measuring what matters.

All right, so there's four buckets here. I'm going to actually click through them since we don't have a lot of time. I also used AI for this slide. Can you tell what's happening there? It's trying its best to stay in the same theme, but it's kind of devolving over time. I asked it to be pixelated, and then it's kind of getting less and less pixelated over time. But hey, it actually did a pretty decent job. So this is a funnel. This is organised as a funnel. You probably recognise this, but I'm just going to tell you a couple of things about this funnel that I've learned that might be useful. So I think when people start to measure the impact of DevRel and the work that we do, we often start here at content and channel performance. That's the easiest one. It's the one that's most accessible to us.

It's the easiest to measure, but it's also the easiest to measure poorly. It's very tempting to say, ah, I did 20 talks this year. Isn't that awesome? Yeah, great. Then okay, what happened next? How many people did something with that information? Of those 20 talks, how many people showed up? How many people were engaged? What did you talk about? We did this thing this year at Summit, which is our annual Snowflake conference. We put a QR code in our keynote to see whether people would actually click on it or scan it, and 35% of the people in the audience did. It blew my mind, right? It's like thousands and thousands of people in the room. It's a one-hour-long session. It was a developer keynote, and we're like, hey, all the demos are on this QR code if you want them. 35% of people wanted them.

That was really cool. Now I can talk about that, right? All I did was put up a QR code. So think about what people are actually doing with your content. That's really important. But also think about measuring humans, not hits, not pieces of content. What you're trying to do is you're trying to influence people. So I try to move metrics in the direction of humans. How many humans are interacting with the things that you're doing on a quarterly basis, on an annual basis? Also, can you compare channels with similar metrics? Can you design things in a way that allows you to cross-reference? And you don't have to be super accurate. It could be like, I don't know, sessions maybe is as good as you get on a website, but that's okay. It's better than page views. It starts to get at the same levels of reference for comparison. Let's see, what else? Okay, so: Anna, all right, I buy what you're saying, but what if I just don't have good telemetry? I can't really measure anything. Lots of pieces of the journey are poorly instrumented.

Can you talk to people? How often do you talk to people? Can you write down what you're talking about? That's kind of, I think, the meat of what I'm trying to get at in this talk: don't get stuck on not having enough data. When you're talking to people, what do you know about how they got there? Ask them how they found out about your product. Ask them what they love about it. Ask them where they get stuck. Just get into the habit of asking the same question over and over again. That's your data, and that's going to be really impactful data. Those references, those things that you carry around with you, actually carry a lot of weight. Again, if you're being proactive.
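Here is a tiny sketch of what "asking the same question over and over" can turn into once it's written down: free-text answers bucketed with rough keywords and tallied. The buckets and example answers are made up for illustration.

```python
from collections import Counter

# Free-text answers to the one question you ask in every conversation
# ("how did you find out about the product?"), pulled from your conversation log.
answers = [
    "a teammate showed me a demo",
    "saw a talk at a meetup",
    "found the quickstart while searching for docs",
    "a teammate was already using it",
    "meetup talk",
]

# Rough keyword buckets; the point is a repeatable tally, not perfect coding.
BUCKETS = {
    "word of mouth": ("teammate", "colleague", "friend"),
    "events": ("meetup", "talk", "conference"),
    "docs/search": ("docs", "quickstart", "search"),
}

def bucket(answer: str) -> str:
    for name, keywords in BUCKETS.items():
        if any(keyword in answer.lower() for keyword in keywords):
            return name
    return "other"

print(Counter(bucket(a) for a in answers))
# e.g. Counter({'word of mouth': 2, 'events': 2, 'docs/search': 1})
```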

And then we talked about community health already, and I think the biggest takeaway here for me is keep building the community that you have around the thing that you're doing and think about who are the people 12 months from now who are going to be taking over that community. The thing about communities and community health is that they work really well until everyone gets really burned out and goes home. So how do you make sure that doesn't happen? You actually have to start 12 months, 18 months out. You have to build that next generation of people. And so keep an eye on that. What is that number? Do you have a view of that number? Okay, alright, we talked about a lot. I promised you some scaffolding. So here it is. If you take away only three things today, let them be this, right? You can actually just start with one thing, pick a north star, run with it.

It's okay to be inefficient. You can adapt as you learn. Just pick a number and start learning about it. Start figuring out what works about it and what doesn't. Talk to people, if you haven't figured that out yet. I'm a big fan of talking to people. I'm a qualitative and a quantitative researcher, can you tell? I really like talking to people, but also it's our job, so we might as well write it down and turn that into data, and then drive the story. So share your wins early and often, ideally before people ask you for them. Consistent wins are the things that buy you capital to drive perception of the thing that you're doing. So stay in the driver's seat, and good luck. That's all I got. Thanks.