On November 14, Ian Scott and I introduced the Analyst Value Index (slides here) to a select group of around a dozen of the world’s largest IT, telecoms and services firms. Scott, an expert social and market researcher who I’ve worked with on numerous projects in recent years, has combined the data from the Analyst Value Survey (expanded report here) into an index that shows which firms are delivering the most value to their clients. In a separate Trajectory study, he also shows which firms are rising fastest.
So far, those decks, and the AR Forum presentation of the results, have been read 4,858 times. That’s around twice the audience of last year’s survey, and it means that the survey is coming in for more questioning and criticism, which we really welcome since it forces us to be clearer in how we explain the research (and it helps us to understand that there are still a few people who get a bit baffled by percentages and have a little numerophobia).
The big difference this year is that my colleague Ian has been able to combine the data from the different questions to produce a single Value Index to show which of these firms are doing best on the variables that matter for clients. It’s a brilliant idea, and one I can’t claim for myself. The whole idea was plagiarised mercilessly from the Ventana Value Index, and we didn’t even ask permission. Sorry guys, but thank you.
This is what’s right about Ventana’s approach: unlike extensive prose reports on software, of the type I co-authored at Ovum, the Value Index focuses on the benefits that matter most to customers, then on the quality with which the provider delivers them, and then brings them together into a single index. It is clear and accessible. It’s based, as far as possible, on quantified data that integrate the key aspects that matter to users. It’s not paid for by the firms in the study. It is as impartial as possible.
All of those things are really useful, and we’ve tried to emulate them with the Analyst Value Survey. We’ve asked which services from analysts are valued the most, and we’ve asked which firms are best at delivering each of those services. We also know that influence and independence matter to clients, so we ask about those. By producing the Index, we’re aiming to produce something which is simple enough to be grasped quickly.
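To illustrate the general shape of this approach, here is a minimal sketch of an importance-weighted index. The service names, weights, and scores below are invented for illustration; they are not the actual survey data or our exact scoring methodology.

```python
# Illustrative sketch of an importance-weighted value index.
# All names, weights, and scores are hypothetical examples,
# not the real Analyst Value Survey data or formula.

def value_index(importance, delivery_scores):
    """Combine per-service delivery scores into one number,
    weighting each service by how much respondents value it."""
    total_weight = sum(importance.values())
    return sum(
        importance[service] * delivery_scores.get(service, 0.0)
        for service in importance
    ) / total_weight

# Hypothetical inputs: share of respondents valuing each service,
# and one firm's delivery rating per service on a 0-100 scale.
importance = {"written research": 0.9, "inquiry": 0.7, "events": 0.4}
firm_scores = {"written research": 80, "inquiry": 65, "events": 55}

print(value_index(importance, firm_scores))
```

The point of folding everything into one weighted average is exactly the accessibility argument above: a services a firm delivers badly only hurt it in proportion to how much clients actually value those services.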
This sort of index isn’t perfect for everything. If a hammer doesn’t make a good cup, that doesn’t mean it’s not a good hammer. An index doesn’t remove the need for nuanced prose, and that’s why Ventana and similar organisations produce other research and benchmarking tools.
So, a Value Index is clearly useful, but there’s more to life than an index. Almost 5,000 downloads indicate a thirst for the research. I’ve learnt that the format of an Index has a much bigger impact than the comprehensive results. I’ve been running this survey since 2000, and now I’m hearing comments about the data from industry veterans who never noticed it before. That’s great.
Of course, an Index is not the be-all and end-all of research.
- First, for example, a Value Index doesn’t tell you which firm you should buy from. Just because SAP does well in a Ventana Value Index that doesn’t mean you should buy your laptops there: SAP doesn’t make them. And it doesn’t mean you should buy your financial software there either: there are lots of other strong providers. The same is true with our Index. It’s not a shopping list.
- Second, what does it mean to compare niche providers with powerhouse firms that work across silos? This, of course, is an issue that Ventana also has in its Value Index: IBM and Longview might both have felt unhappy when they ended up on the same Value Index for financial performance management. Ventana has a graceful solution: to segment the data between different markets. We’ve done that in the survey to a degree, comparing vendors against non-vendors. This year the number of respondents wasn’t big enough to segment IT from Line of Business, but that’s something we could do next year to see how far the analyst firms have different audiences. I’d also love to compare analyst value in the US with the rest of the world.
- Third, the number of respondents itself is a weakness. It represents the opinions of 352 people. Okay, in our opinion that’s a modestly large number. It’s certainly an advance over KCG’s fun little Mystical Box, which reflects the opinions of two people and never pretended to be more than a generic ‘stalking horse’. Even with 352 people we had to make some choices about which firms to chart, and we decided that firms commented on by fewer than 40 people would not get into our study.
- Fourth, the Index creates new questions for each one it seems to answer. In the short presentation to the AR Forum the most controversial part of the deck was in the data about analyst influence on investors, where it’s clear that public and private companies are facing very different influencers.
We are hugely grateful to all the people who are asking questions about the survey and the Index. We’re not pretending it’s perfect, although after such a long time we’ve grown to accept its strengths and weaknesses. The IIAR’s LinkedIn group had a great discussion about it. Keep the comments coming.
Ian and I are now preparing custom analysis to help analyst firms understand how users see their value proposition, and to benchmark against other analyst firms. Let us know if you’d like such an analysis, and check out the Value Index slides below.