Categories
Book DevOps

Reviewed: The Phoenix Project

A while back, a friend of mine tipped me off to this book and said I should read it. Here are my thoughts.

Categories
ITIL

Understanding Incidents: Urgency, Impact and Priority

It is part of the nature of IT service and support that you will, from time to time, be called upon to handle a high-priority, high-urgency incident. In most production systems these are mercifully rare, but it is still important that you understand how to identify them.

Categories
Career Opinion

A multi-disciplinary approach to career-building in IT

In the IT world, there are a lot of different paths to take. Mine, though conventional enough in its beginnings, is increasingly becoming unconventional. I believe this is a good and important thing. Here, then, is my approach to my own career.

Categories
ITIL

ITIL adoption numbers – a critical review of one interpretation

A while back, one of my fellow students questioned to what extent ITIL has been adopted outside of the UK. He cited a source which claimed that adoption was very low. This assertion was based on a single statistic: the global TSO book sales figures. The original blog post can be found here.
 
I immediately wondered about the truth of the assertion, as I know that ITIL has been widely adopted in Norway. I decided to perform a PROMPT analysis (Presentation, Relevance, Objectivity, Method, Provenance, Timeliness):
 
Presentation:
 
The statistics are presented in a clear and concise way. However, there is no information about which metric is used, or whether any corrections have been applied to the figures.
 
Relevance:
 
There is reason to question the relevance of the statistics cited, for a few reasons:
 
It measures a single metric: total sales of books about ITIL from a single publisher.
The data appear to be presented as percentages of total sales, and do not appear to correct for the relative sizes of the different markets.
The data do not take into account that there are several publishers of literature about ITIL.
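The market-size objection above can be made concrete with a small sketch. The figures below are purely hypothetical (the original post does not publish its underlying numbers), but they show how a raw share-of-sales metric and a per-capita metric can tell opposite stories:

```python
# Hypothetical illustrative figures only -- NOT the actual TSO data,
# which the original blog post does not publish.
sales = {"UK": 10_000, "Norway": 800}            # book units sold
it_workforce = {"UK": 1_500_000, "Norway": 100_000}  # assumed market sizes

# Share of total sales: the kind of metric the blog post appears to rely on.
total = sum(sales.values())
share = {m: s / total for m, s in sales.items()}

# Sales per IT worker: the same data corrected for market size.
per_worker = {m: sales[m] / it_workforce[m] for m in sales}

print(share)       # the UK dominates on raw share of sales...
print(per_worker)  # ...yet Norway buys more books per IT worker
```

On these assumed numbers the UK accounts for over 90% of sales, yet Norway's per-worker rate is higher, which is exactly why an uncorrected share-of-sales figure says little about relative adoption.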
 
Objectivity:
 
The language is neutral and measured. I find no issue with objectivity.
 
Method:
 
There is reason to question the method. There is no information about how the data were collected, so we cannot verify whether the methodology was appropriate. Nor can we know whether the data are representative, which there is reason to doubt: the statistics cite the sales of only one publisher, and do not appear to correct for relative market size. There is also the possibility that sales listed as UK include UK vendors shipping out of the UK. Lastly, there is no information about whether the figures cover print-only, electronic-only, or combined print and electronic sales.
 
Provenance:
 
There is reason to question the provenance. The author of the article itself is clearly identified, but that is where the provenance stops. The author does cite a source (ITSMF UK), but gives no citation for the actual statistics, which means that we cannot verify the numbers or the metrics they represent. We can neither identify the authors of the underlying data nor verify how it was published.
 
Timeliness:
 
There is reason to question the timeliness. There is no information about when the data were published. The figures do state which periods they relate to, and they appear recent enough to still be relevant; even so, more up-to-date information seems likely to be available.
 
In conclusion, the blog post clearly fails five of the six PROMPT criteria. In particular, I would argue that book sales over a relatively short period are a poor proxy for how much impact any given technology has had in any market, since printed books are far from the only way to learn about a technology.