Source: A New Model for Social Media (and Traditional) Measurement by Don Bartholomew
I'm noodling around with a couple of ideas, so this post does a lot of musing.
A few days ago, I asked "What are the best metrics to track your blog's ROI and make improvements?" I've been trying to think through a benchmarking process that would identify which metrics to track, how to track them, and how to reflect on the data to make improvements in your blog.
Laura Lee Dooley's blog post "The Social Media Metrics Lottery" pointed to and summarized Don Bartholomew's thoughts on engagement, as well as other posts on blogging ROI from the corporate sector. Laura goes on to share how she, as social media strategist for a nonprofit, thinks about it: what happens after a nonprofit donor or stakeholder is satisfied with the engagement?
The areas of the blogging ROI analysis included: author contribution, readership growth, reader engagement, authority, cost, and value. Reader engagement consists of metrics for:
- Conversation (commenting)
- Reader Sharing (bookmarked items)
I pulled these metrics from a tool called PostRank, which uses a model called the "5 C's of Engagement." In some email back and forth with Melanie Baker, the community manager for PostRank, Sue Waters and I asked her if she could unpack the 5 C's a bit further and tell us a little about how the tool works. Here's what she shared:
Additionally, PostRanks are calculated one of two ways: either by comparing a site's content against its own past performance only (feed-based PostRank), or by comparing it against other feeds (thematic PostRank). Our website shows feed-based; folder view in Google Reader with our extension installed shows thematic, as an example -- thematic compares the posts of all feeds you've added to a specific folder against each other. So in that case you can compare TechCrunch to Mashable, for example, if you wanted to, but with feed-based you're not being ranked against any sites/posts but your own. So you'll never get really low PostRank scores just because your blog doesn't get hundreds of comments per post and millions of pageviews like TechCrunch does.
Now feed-based PostRank doesn't rank new posts against ALL past posts back to the beginning of your blog, but it analyzes back a ways based on time frame and posting frequency (among other things). Analyzing on more than one basis prevents rankings being skewed for publishers who only post once a month, for example, as opposed to those who post 10 times a day.
Once a feed is in our system, we regularly check for new posts that have been published, and start gathering the engagement metrics for the new post as they start showing up. We also check for metrics for existing posts for a set length of time. (We've done analysis that shows a fairly standard "engagement curve" for when and how engagement metrics show up for any post.) When a post is first added to our system after being published, it has a PostRank of 1.0, since it won't have any engagement yet. And as readers start to respond -- commenting, tweeting, bookmarking, etc., the PostRank score will go up over time to a maximum of 10.
Because feed-based PostRank analyzes your posts based on your own previous posts, it's possible to have a post ranked 10 on one blog with, say, 5 comments, 20 pageviews, and 6 bookmarks, if you don't usually get that much engagement, but have a post rank 10 on another blog with 200 comments, 10,000 pageviews, and 300 bookmarks if that's high for that blog's engagement, but not freakishly high.
Should also note, re. freakishly high, we do analyze for things like the digg/slashdot effect, so if one post in someone's blog gets a really unusually high amount of engagement compared to the norm, we don't allow that to completely skew the analysis afterwards, e.g. "dooming" the next bunch of posts to a PostRank of 1.0 because they're nowhere near what just one post got.
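To make the mechanics concrete, here is a minimal sketch of how feed-based scoring of this kind could work: a new post starts at 1.0, is scored from 1.0 to 10 against the feed's own recent history, and historical outliers (the digg/slashdot effect) are capped so one viral post doesn't doom the posts that follow. This is not PostRank's actual algorithm; the cap factor, the median-based baseline, and the linear 1-to-10 mapping are all illustrative assumptions.

```python
def feed_based_score(new_engagement, history, cap_factor=3.0):
    """Score a new post from 1.0 to 10 against its own feed's history.

    new_engagement: total engagement events (comments, tweets, bookmarks)
                    for the new post
    history:        engagement totals for the feed's recent posts
    cap_factor:     cap each historical value at cap_factor * median, so a
                    single viral outlier can't skew the baseline afterwards
    """
    if not history:
        return 1.0  # a brand-new post has no engagement yet

    ordered = sorted(history)
    median = ordered[len(ordered) // 2]
    # Dampen freakishly high posts before computing the feed's baseline
    capped = [min(x, cap_factor * max(median, 1)) for x in history]
    baseline = max(capped)

    # Map engagement onto 1.0..10 relative to the feed's own (damped) best
    ratio = new_engagement / baseline if baseline else 0.0
    return round(min(1.0 + 9.0 * min(ratio, 1.0), 10.0), 1)

# A quiet blog: 5 engagement events can still rate a 10 against its own norm
print(feed_based_score(5, [3, 5, 2, 4]))
# A big blog with one viral outlier (8000): the outlier is capped, so a
# typical post isn't "doomed" to a 1.0 afterwards
print(feed_based_score(200, [180, 220, 150, 8000]))
```

This matches the point in the email: the score is relative to your own feed, so a post with 5 comments on a small blog and a post with 200 comments on a big one can both earn a 10.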
Hopefully that explains the fundamentals of how the analysis works. There's also some info on the differences that appear in PostRank scores for posts depending if they show up on the website or the widget (relates to feed-based vs. thematic PostRank), but I'm working on a blog post for that, so will send you the link when it's up.
I usually don't get into the carpet fibers of how different analytics tools do the measuring, because I don't find it as interesting as the conceptual model AND, of course, reflecting on what it means for my blog. But this is the type of question people will ask, and sometimes it's because they want to challenge the validity of the data. (I love this post by Avinash Kaushik, "Data Quality Sucks, Let's Get Over It," which teaches you the art of linking data to action.)
One of the interesting points for me in learning how the tool works is the "thematic" ranking - comparing feeds from different blogs to one another.
In the original post, I asked a couple of questions:
- Is there any value or meaning to looking at traffic trends via page views?
- How do you understand the impact of using Twitter to share your blog post links, or when other people retweet or share them?
- Is there a formula or set of sharper reflection questions?
- I'm doing this as an individual; how would you use an analysis like this to help with planning or making the case for social media (blogs) to your executive director?
- What tools or techniques are there to collect data, summarize it, or reflect on it efficiently?
- How do you use qualitative information and perhaps survey data from readers effectively? Do you need it?
Kynam Doan left a comment suggesting a metric that measures your blog's performance in relation to others in your market space. Alan Benamer has indexed nonprofit web sites based on Compete rankings to look at readership. The one thing I don't like about this sort of analysis is that it encourages competition. I'd rather use this type of analysis to get a sense of the industry average for commenting on nonprofit blogs - so you could set some goals.
What do you think?
Beth:
Maybe I missed it, but it seems that the one key metric that wasn't covered has to do with the original purpose of the blog in the first place. Tracking engagement metrics such as backlinks (google), trackbacks, comments, burns, etc., are akin to tracking registrations and attendance at a conference or workshop. Sure, it tells you who showed up and how people engaged with the content, but what about the purpose -- the mission?
I think the metric needs to be tied to the purpose in order for there to be true value in the data collected. Otherwise, how is an author supposed to know what to do with the fact that they had 4 comments on a post, or that 7 trackbacks have been logged? The data itself is useless until the author links value to it.
Unfortunately, there are many bloggers who have begun blogging with no purpose other than to get out into the blogosphere and figure it out as they go. They spin their wheels and can't understand why they aren't getting any traction. They worry and fret over blog stats trying to figure out if there is any ROI in their efforts.
Maybe we should all just focus on one metric to start off with -- ROR -- return on relevancy. Let's measure that to ensure what we say is of value and importance, that we're actually contributing rather than making noise.
-- David Kinard, PCM
Posted by: David Kinard | January 08, 2009 at 03:45 PM
Measuring website traffic is VERY different from measuring social media success. I continue to come back to what Jason Falls wrote -
"The problem with trying to determine ROI for social media is you are trying to put numeric quantities around human interactions and conversations, which are not quantifiable." (http://www.socialmediaexplorer.com/2008/10/28/what-is-the-roi-for-social-media/)
I also agree with David Kinard - The purpose of social media, in my mind, is to participate with a community. But that community should have a goal/purpose/reason to be, so it matters who is part of the community, what they are saying/doing, and how they are connecting.
On twitter, on facebook, on linkedin, the numbers matter. But they don't capture the human interactions or whether you are meeting your purpose (which, by the way, may change over time or may be multi-faceted) - unless your purpose is to get a certain number of followers, friends, fans, or connections.
For example, how do you measure this: I connected with the son of someone I used to work with 20 years ago, who works for a group that offers a great free environmental news service I now use regularly to find stories I don't get through my Google Alerts or digg.com. Together we created a Facebook group - http://www.facebook.com/group.php?gid=38979624830&ref=ts - that is encouraging people to decorate beach balls with an earth theme to take to the mall on Inauguration Day, to remind people of the importance of environmental issues for/to the Obama administration.
Is this type of social media connection the norm or beyond the standard? How do you measure it? This connection is an excellent one that I could not have defined by purpose and cannot measure numerically.
So, you can throw a bunch of numbers that highlight growth in engagement and interaction - but understand they are only part of what defines social media success.
BTW - I am really loving the tool http://tweetake.com/, which exists for the sole purpose of providing you with a backup of all your Twitter data. The data is backed up as a .csv file you can download. This is another great tool for analyzing data about your Twitter connections.
Posted by: Laura Lee Dooley | January 08, 2009 at 06:14 PM
I am also of the mindset that the purpose of social media is to participate with a community. This is especially the case in the nonprofit space. Individuals who participate regularly are more likely to become evangelists for their cause, thereby recruiting new members. How might that be measured?
One other thought on this topic: what if your organization's ultimate goal is to influence offline behavior in some way? You would quite possibly need to measure metrics beyond your own web presence to determine that impact.
Posted by: Kate Talbot | January 10, 2009 at 06:54 PM