The world of developer evangelism metrics can be described as squishy at best. There are talks and videos of conference sessions that deep-dive into this very thing. Few conclusions go beyond "this is very difficult to measure" and "very few orgs do this well." There are tools that try to help and can provide some metrics, but rarely a cohesive view one can trust. Blog posts abound with strategies to measure developer engagement and awareness.
After all, how can one equate talking at a conference or meetup to revenue for a product? The path of that funnel is full of gross assumptions. Tracing online awareness through to product adoption can be very tricky. Even better: how can one point to qualitative or quantitative data that represents the health and happiness of a developer community, or the usefulness of an open-source product?
Great. Freaking. Questions.
And yet, metrics and analytics are incredibly important. They let you know where you've been successful and where you need to improve. Data helps better decisions get made. Developer Advocacy metrics might be difficult, but that doesn't make them any less important.
However, one metric should never be considered a developer success measure: GitHub stars.
You know. These things on the top right of a repo's page:
If you're lucky enough to be working on an open-source project and have your code on public GitHub, there are far better metrics to use. To me, stars are nothing more than a quaint data point to be used in conjunction with other metrics to provide a snapshot of a community and product usage. It's rather striking that people put so much weight on this metric, so I wanted to give my viewpoint on this overhyped number.
Here's why one should tread carefully with this metric.
GitHub stars can be gamified
Say, for instance, GitHub stars are part of your KPIs and OKRs. What activities do you do to meet those goals? That's right. You work hard just to get devs to hit the little star button. Not use the product, heaven forbid. Not provide feedback or (gasp!) contribute. Just, "Heyyy developer, I'll give you a t-shirt if you wouldn't mind just going to click that thing and then forgetting about us KTHXBAIIIIII."
GitHub stars do nothing except show a faint interest
Hey, I have a question for you: when was the last time YOU personally went and looked at the repos you've starred? Mmmhmmm. Probably never. Or a length of time that might as well be never. Or at intervals best measured in light years.
Most of the time devs will just search for the repo they want when there's something they need to use or reference. A GitHub star can be nothing more than "I shall add this to a list which I may or may not reference again in my lifetime." Stars don't show adoption. They don't show usage. They don't show anything.
GitHub stars only go up
If you look at the trends in repo stars over time, every single one of them goes up. Some climb at a very sharp slope. Some are slow and steady. Zero of them ever move up and down. There are no "bad weeks" in the GitHub star world. Only "slow weeks."
Being able to follow trends, both up and down, over time gives far more insight into the effects of activities and the impact of work. One more reason GitHub stars shouldn't be your main priority.
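The flip side becomes obvious once you bucket star events by week instead of staring at the cumulative total. A minimal sketch, with made-up dates: GitHub's stargazers API can return `starred_at` timestamps if you request the `star+json` media type, but any source of star timestamps works the same way.

```python
from collections import Counter
from datetime import date

# Hypothetical starred_at dates (invented for illustration).
starred_at = [
    date(2023, 1, 2), date(2023, 1, 3), date(2023, 1, 16),
    date(2023, 1, 17), date(2023, 1, 17), date(2023, 2, 6),
]

def weekly_star_counts(dates):
    """Bucket star events by ISO week; return (new_per_week, cumulative)."""
    weeks = Counter(d.isocalendar()[:2] for d in dates)  # (year, week) -> new stars
    new_per_week, cumulative, total = [], [], 0
    for wk in sorted(weeks):
        total += weeks[wk]
        new_per_week.append(weeks[wk])
        cumulative.append(total)
    return new_per_week, cumulative

new, cum = weekly_star_counts(starred_at)
# The cumulative series can only ever rise: a "bad week" just looks flat.
print(new)  # [2, 3, 1]
print(cum)  # [2, 5, 6]
```

The per-week numbers are the ones that can actually dip after a quiet month, which is what makes them worth charting.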
Comparing stars to spaceships
Say you've been working on a super cool side project that helps people learn Node.js and want to make it open-source. First: good on you! Second: why would you ever compare yourself to an open-source project created by a company that focuses on containers in Go? I have no idea. That would be like comparing apples and rhinoceroses (plural of rhinoceros?). Preposterous, yet people do it, if you can believe it.
Some repos start with lots of stars
I couldn't find any explanation for this, but some repos start out with a ton of stars. To be more explicit: their day 1 starts at something greater than 0. These are repos like Kubernetes, which came from an internal project before being externalized; maybe the stars originated internally first? That's the only explanation I could surmise for why this happens. Again, if there are variables like this influencing stars, it might be wise not to give them a lot of merit.
</end rant>
Now that we've gotten that out of the way, let's not go throwing the baby out with the bathwater (sidebar: was this ever a thing???). Here are wonderful GitHub metrics that DO provide insight:
Contributions
Ok, mosssssst of the time contributions come from the people hired by the company that supports the project. That makes, like, total sense. HOWEVER. In the instances where there's an outside contributor who contributes?! HUZZAH! Success, my friend! That's a great indicator that your developer community is well on its way.
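If you want to put a number on that, the split worth tracking is company committers versus outside contributors. A toy sketch, where the author names and the employee list are entirely invented:

```python
# Hypothetical commit authors pulled from a repo's history.
commit_authors = ["ada", "grace", "ada", "linus", "grace", "margaret"]
employees = {"ada", "grace"}  # people paid to work on the project

# Commits from anyone NOT on the payroll: the community-health signal.
outside = [a for a in commit_authors if a not in employees]
share = len(outside) / len(commit_authors)
print(f"outside contributions: {len(outside)}/{len(commit_authors)} ({share:.0%})")
```

Watching that share over time says far more about an organic community than any star count.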
Forks/Clones/Downloads
I group all of these together because to me they show that someone wants to use your product. Their intention is to either improve it, use it, contribute to it, or somehow work with it. There's an air of commitment to these actions. That's a win. Capturing this data over time will surface patterns and provide a clearer view into usage.
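Here's a tiny sketch of what "capturing data over time" can look like. The payload below mimics the shape of GitHub's repo traffic/clones response (the numbers are invented); in practice you'd fetch it with an authenticated API call and store each day's figures, since GitHub only retains about two weeks of traffic data.

```python
# Invented sample in the shape of /repos/{owner}/{repo}/traffic/clones.
sample_traffic = {
    "count": 173, "uniques": 128,
    "clones": [
        {"timestamp": "2023-05-01T00:00:00Z", "count": 40, "uniques": 31},
        {"timestamp": "2023-05-02T00:00:00Z", "count": 12, "uniques": 9},
        {"timestamp": "2023-05-03T00:00:00Z", "count": 61, "uniques": 48},
        {"timestamp": "2023-05-04T00:00:00Z", "count": 61, "uniques": 40},
    ],
}

def daily_unique_clones(payload):
    """Return (day, unique_cloners) pairs: a trend that can go down as well as up."""
    return [(d["timestamp"][:10], d["uniques"]) for d in payload["clones"]]

for day, uniques in daily_unique_clones(sample_traffic):
    print(day, uniques)
```

Unlike stars, unique cloners per day genuinely rises and falls, so a dip after a launch week is visible instead of hidden.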
Pull requests
Someone has actually gone out, used your product, and then… the clouds open and rays of sunshine pour down upon you… they submit a pull request with changes or contributions. Print this out. Put it on the fridge. Call your folks and let them know. This is kind of a big deal. Someone has contributed their time, energy, and brainpower. To YOU. Your next step is to bake them cookies and send them trophies.
Activity
GitHub provides some pretty great metrics under Insights. Couple this with Google Analytics on the page and there's a decent snapshot of where your users are coming from, who's contributing and when, and the top people in your community, all for freeeeee.
There are actually so many ways to create a picture of your community success. Any action that implies intention or sentiment (both positive and negative!) can be used to show the success of your project or areas to improve. These could be things like:
- Number of messages (both public and private) sent a week in your community's Slack. This to me could signal actual community engagement and people using your community as a hub. It could also show that some people are just super chatty.
- Number of engagements on social media (retweets, likes, etc.) along with click-throughs to content. One could suppose that developers find the content engaging and interesting. Très bien. Alternatively, it could mean that it's utterly laughable.
- HackerNews comments. Again, sentiment helps, both positive and (in this case, mostly) negative.
- A community member helping another community member. This to me shows signs of organic growth and an actual community.
- A conversation with a developer that can be used to provide insight into pricing strategies, competitors, ease of use, problems with documentation, etc. Qualitative data FTW. Being able to point to a user as an example is powerful.
- If you can gather the specific number of API calls, or even who makes them, that's great. This is definitely the Holy Grail of metrics and really shows how people actually use your product.
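And if you do have access to API logs, even a crude tally per key tells you who your real users are. A hypothetical sketch, with an invented log format of `<timestamp> <api_key> <endpoint>`:

```python
from collections import Counter

# Invented access-log lines for illustration.
log_lines = [
    "2023-06-01T10:00:00Z key_alpha /v1/search",
    "2023-06-01T10:01:12Z key_beta  /v1/search",
    "2023-06-01T10:05:40Z key_alpha /v1/index",
    "2023-06-02T09:14:03Z key_alpha /v1/search",
]

# Count calls by API key: not just how much usage, but whose.
calls_per_key = Counter(line.split()[1] for line in log_lines)
print(calls_per_key.most_common())  # [('key_alpha', 3), ('key_beta', 1)]
```

From there it's a short hop to "which customers are most active" and "who quietly stopped calling us last month."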
None of the above is, by itself, correlated with a successful product. However, creative ways to show engagement help.
Developer communities are not one single number to rule them all
Like most things, open-source products are measured across a broad spectrum of metrics, both quantitative and qualitative. Monthly or daily active users are probably your most solid metric. After that, trying to nail down the funnel while providing feedback to improve the product is the top priority.
Regardless of your developer evangelism strategy, be wary of treating GitHub stars as anything other than a vanity metric, the Monopoly money of open source, and be sure to include them only as one of a cascade of metrics providing a snapshot into the success of your product.