Tag Archives: Metrics

Intranet Bounce Rate

Of late I’ve been spending some time looking at the Intranet Bounce Rate on an enterprise social media project I’m working on for a large multinational. And by Bounce Rate (rather than Intranet Bounce Rate), I’ll take the definition found on Wikipedia today:

It essentially represents the percentage of initial visitors to a site who “bounce” away to a different site, rather than continue on to other pages within the same site.
The formula used to calculate bounce rate is: Bounce Rate = Total Number of Visits Viewing One Page ÷ Total Number of Visits
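The formula above is straightforward enough to sketch in a few lines of Python. The visit counts below are made-up illustrative numbers, not figures from the project in question:

```python
def bounce_rate(single_page_visits, total_visits):
    """Bounce Rate = visits viewing one page / total visits, as a percentage."""
    if total_visits == 0:
        return 0.0  # avoid dividing by zero when there is no traffic
    return 100.0 * single_page_visits / total_visits

# Example: 300 of 1,000 visits left after viewing a single page.
print(f"{bounce_rate(300, 1000):.1f}%")  # 30.0%
```

By Kaushik’s rule of thumb quoted below, a 30% figure would sit in the grey zone between “hard to achieve” and “cause for concern”.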

The metrics produced by Google Analytics look quite good to me, at least by the usual industry standards:

Bounce Rate

As the Wikipedia article cites, this is very good indeed:

Google.com analytics specialist Avinash Kaushik has stated:
“It is really hard to get a bounce rate under 20%, anything over 35% is cause for concern, 50% (above) is worrying.”

But is this good for an intranet bounce rate, or enterprise social network site? A high bounce rate on a large corporate intranet might mean that users are happiest when they bounce away quickly as they’ve found what they want. Here high Bounce Rate = Good? On an enterprise social network site, well what does intranet bounce rate really mean?

Both Bing and Google offer nothing on this that I could see. Indeed, when I search for ‘Intranet Bounce Rate’ on Google, it kindly asks, ‘Did you mean Internet?’!

p.s. One interesting point – Saturdays generate the high spikes. Why?

p.p.s. Some excellent resources from my old colleague at Derby Uni, Dr Dave Chaffey to mull on. Bounce rates in Web design articles

The sweet data of Enterprise 2.0 success?

I, for one, have thoroughly enjoyed the argy bargy over why Enterprise 2.0 projects fail and whether it’s a crock or not, plus the risks of not doing anything at all. What’s absent from all this is that interminable and incorrigible bogeyman, Mr. Hard Data, who cries, “Show me the money, or lack thereof.”

The debate is about analysis and perception, but what if (as some actually have done) someone says, well actually yes, I’d like to see the money, or at least some real hard data? Anecdotes are not enough, and the argument that there would be shrill cries if the whole thang were turned off “may be valid, but a bit cute, and begs the question of against what.”

So let’s step back and look at what we might be aiming for. Here’s Mike Gotta on the imperatives:

•  We need to connect people globally.

•  We need to address generational shifts.

•  We need to break down barriers.

•  We need to “know what we know.”

•  We need to collaborate better.

•  We need to innovate from the bottom up.

•  We need to learn differently.

How can we apply hard data to any of these? Can we apply indices or benchmarks to collaboration and innovation, such as the sociality of the networks and the new widgets leaving the factory gates as a result? Maybe, but where are the numbers, and real numbers at that, not internal anecdotal ones?

Now I recall, on a previous project, discussing the absence of decent metrics for a wide range of comms tools in an organisation – metrics that ranged from none, to completely pukka, to hideously complex. The task of getting anything meaningful from the morass seemed a complete conundrum.

I discussed this with someone who had no insight into the specifics of online comms tools but a great deal to say on marketing comms and metrics. Her point was simple: in the absence of decent metrics or data, don’t give up or try to boil the ocean, but start from somewhere, even if it doesn’t fully address the issues at first.

With that aim in mind, I’m going to start throwing some ideas into the pan in my next blog posts on Enterprise 2.0 success factors and metrics.

Dumb down, deeper and down?

Quick on the tail of the, erm, Long Tail, The Register now takes a new pot shot at Malcolm Gladwell, author of The Tipping Point: “The dumb, dumb world of Malcolm Gladwell”. Subtitled “A guru for the brain dead”, Andrew Orlowski’s article is a stiff piece of polemic, while also taking nifty side-swipes at Britain’s liberal intelligentsia (there must be a neologism lurking there) and at Corporateville, identifying what he calls the ‘Vertical Marketing Bureaucrat’. The common factor between these bureaucrats and their counterparts in the Public Sector is, for Orlowski, their fondness for monitoring and measurement.

Now maybe I’m a bit too biased, but I’m not so sure that measurement itself is a bad thang. Vertical marketing is like horizontal collaboration: it very much depends on the depends – get it wrong and it’s tar and feathers, get it right and it’s stars and stripes. In terms of measurement itself, the key factor is what’s measured. A common misery here (or at least so we’re told by those that lament as such) is that the measurements desired by New Labour bureaucracies are simply not quite right – the targets are not bull’s eyes but emanate from the other end of the bovine equation.

But getting back to Orlowski’s piece, I wonder if what we’re seeing here is the start of a trend to demand the hard facts – in God we trust and all that. The environmentalist lobby has long proclaimed the supremacy of hard data against those that doubt its prognosis (too much so, some say, to close down the chatter). Are we, though, seeing others begin to demand the same scrutiny? The Register is on a mission, and then we have Ben Goldacre’s “Bad Science” hitting the window displays for Xmas, plus “Street Science”, a new series from Radio 4 that promises/threatens to look at GM crops and human–animal hybrid research with an open eye – or should that be ear?

Good news for Science, perchance. Except, that is, that Gladwell says his work isn’t even that:

I like to think of it as an intellectual adventure story. […] I think it will appeal to anyone who wants to understand the world around them in a different way. I think it can give the reader an advantage–a new set of tools. Of course, I also think they’ll be in for a very fun ride.

Dash it, so we’ve been taken for a ride and we’re just back to storytelling after all…