How we do our Analyst Index rankings

Showing results is one of the key challenges facing analyst relations managers. Some AR managers track only ‘inputs’ (how much activity they are doing), but most companies focus on ‘outputs’: the shift in analyst behaviour that they aim to influence.

Our Analyst Index is one of the simplest forms of measuring outputs: it’s share of voice research. As we’ve noted, it’s one of the topics that AR managers are most interested in this year.

Share of voice metrics are very simple: they show how prominently vendors are mentioned in research. They calculate each individual vendor’s prominence as a percentage of the total, and then rank the data accordingly. It’s a simple enough measure that even the busiest boss can understand it, and it’s less dangerous than the traditional approach.
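In code, that calculation is only a few lines. Here is a minimal sketch of a basic share-of-voice ranking; the vendor names and mention counts are invented for illustration, not data from our Index:

```python
# Hypothetical share-of-voice calculation: each vendor's mention count
# as a percentage of all mentions, ranked from most to least prominent.
mentions = {"VendorA": 120, "VendorB": 80, "VendorC": 50}  # invented counts

total = sum(mentions.values())
share = {vendor: 100 * count / total for vendor, count in mentions.items()}

# Rank vendors by share, highest first.
ranking = sorted(share.items(), key=lambda kv: kv[1], reverse=True)

for rank, (vendor, pct) in enumerate(ranking, start=1):
    print(f"{rank}. {vendor}: {pct:.1f}%")
```

With these invented counts, VendorA ends up with 48.0% of the voice, VendorB 32.0% and VendorC 20.0%.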

We’re not the only people to do studies like this, of course. We are very happy to see that a US competitor is following us in providing a similar service. We think our approach is better, though, for five reasons.

  1. We put more weight on research from the largest and most important analyst firms, which reduces the risk of volatile bloggers and analysts-for-hire distorting the data.
  2. We look across a wide global range of analyst firms. Many analyst firms give us access to their research and, with a large majority of them, we are the only firm of our type with access to their research.
  3. We also cover a wide range of technology providers; the Index tracks how often hundreds of technology firms are mentioned. [While our Analyst Index ‘cuts off’ after the top 75 firms in each of our four segments, we are collecting data on more].
  4. Ours also has the merit of being the oldest, most widely used and the only one that’s free in its most basic form (although we sell custom analyses).
  5. We smooth the data a little with a statistical technique that’s rather like a running weighted average. We’ve found it quite important to do a bit of work to avoid the data being distorted by peaks. The ranking actually includes some of the weighting from the previous month, to smooth the time series. That works pretty well, as one can see from Dell’s relative stability in our Index through the Sony battery problems it had. It also gets around the risk of quarterly financial announcements having a major impact. Firms generally tend to end their quarters in the same months: almost all close in months 3, 6, 9 and 12, so these peaks tend to synchronise and cancel each other out. Few firms do what Dell does and close their quarters on another cycle. Since so few analysts are driven by quarterlies, I can’t see this producing any significant shift in the results [however, using a running average would reduce even that risk: for more about this, read why you should not be scared of statistics].
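The smoothing described in point 5 can be sketched as an exponentially weighted moving average, in which each month blends the raw value with the previous month’s smoothed value. This is a minimal illustration only: the blending weight and the sample series below are assumptions, not the actual parameters or data of our Index.

```python
def smooth(series, alpha=0.7):
    """Blend each raw monthly value with the previous smoothed value.

    alpha is the weight on the current month; (1 - alpha) carries
    forward the previous month's smoothed score. Both are assumed
    values for illustration.
    """
    smoothed = []
    prev = series[0]  # seed with the first observation
    for x in series:
        prev = alpha * x + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

# A one-month spike (e.g. a burst of coverage) in otherwise flat data:
raw = [10, 10, 30, 10, 10]
print(smooth(raw))
```

With these assumed numbers, the spike month scores 24 instead of 30, and the excess decays over the following months rather than distorting a single ranking.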

Of course, there are limits to every kind of research. An important division exists between the AR performance of publicly-listed and privately-owned firms: since public firms often release more information than private firms, they are written about more throughout the year.

One way to get more out of these data is to weight the information according to how important each firm is to you. Our Analyst Mindshare service takes that a bit further by weighting each firm according to its impact on sales in the market that interests the client. I’ve also mentioned how Analyst Track, another Lighthouse service, helps to control and manage reputational risk: it takes the same data and uses a combination of human readers and data-mining approaches to show the tonality of the research.
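The idea of weighting by firm importance is simple to illustrate. In this hypothetical sketch, all firm names, mention counts and importance weights are invented; the point is only that an importance-weighted score can rank vendors differently than raw counts would:

```python
# Invented importance weights for two analyst firms.
weights = {"FirmX": 3.0, "FirmY": 1.0}

# Invented mention counts per vendor, broken down by analyst firm.
mentions = {
    "VendorA": {"FirmX": 2, "FirmY": 10},
    "VendorB": {"FirmX": 6, "FirmY": 0},
}

# Weighted score: each mention counts in proportion to the firm's weight.
scores = {
    vendor: sum(weights[firm] * n for firm, n in per_firm.items())
    for vendor, per_firm in mentions.items()
}
print(scores)  # → {'VendorA': 16.0, 'VendorB': 18.0}
```

Here VendorA has more raw mentions (12 versus 6), but VendorB leads once the more important firm is weighted up.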

A lot of analyst relations managers are very unsure whether AR can be measured at all. What we’ve found is that only AR programmes that measure their progress are able to defend and extend their resources.

P.S. One further feature of our Indexes should also be mentioned: we continue to track brands that have been retired. If a company changes its name, replaces a brand, merges or otherwise stops using a brand, we continue to track the rate at which that old brand name stops being written about. Some brands fall away quickly, like PeopleSoft. Others live for a long time, like MCI. Either way, without that tracking we would be discarding useful information about the ‘half life’ of brands.

P.P.S. I’ve added a few words [in square brackets] to the numbered points above.
