Some may remember the reaction of many IT service management professionals to the OGC’s endorsement of
Ken Turbitt’s software assessment scheme in March 2009.
One of the most hilarious parts of turning a set of non-prescriptive best practices into a prescriptive set of tool requirements was that the assessment criteria were kept secret. So the only thing such a badge would tell a prospective buyer is: this company paid money to receive this badge of honour, and it has a few clients that are not afraid of publicly stating that they use this product.
That has now changed: at least the mandatory part of the assessment scheme is publicly available at the itil-officialsite (thanks again to the itskeptic for alerting me to it). It does not state whether other criteria lists are used, so I am going to assume that this is all there is.
I have not had time to scan through all 22 listed processes yet, but just reading the first (incident management, a staggering 25 questions that need to be covered by the tool, automated and documented) gave me an urge to write this blog post. If this list is all there is to making a proper “ITIL tool”, then I wonder why the vendors charge so much for them.
Each criterion is described in the fine detail of at most 2 sentences or 3 bullet points, making the total list of requirements less than 2.5 pages long (changing the layout a bit would fit it all on a single page). Questions like
Does the Incident record contain a field or fields to relate a CI record(s) to the Incident?
Does the Incident record contain a field or field(s) to assign an initial incident priority according to pre-established or manually overridden conditions? (CI type, Business Services impacted, level of service disruption, security breach, Service Request)?
do not inspire confidence in the quality of this assessment. It omits a huge number of requirements you would list for a tool to be usable, and it also omits a huge number of questions about following proper ITIL guidance (e.g. no mention of major incidents, of urgency and impact, of recovery within incidents, of links to problem or change management, and many other things).
Sorry, but I have to cry bullshit once again. Nobody in their right mind with even a grain of experience in assessing software or implementing IT service management processes would ever use such a useless list of questions. So, as expected in the previous post of this series, it is just a simple plot to wring some more cash out of the ITIL market without producing anything more substantial than 2 sheets of paper.
You may say that there is no real harm done, just a few bucks taken from the software vendors, so what? The next time I am sitting with a customer who asks me why we need all this fuss to implement some simple processes that the tool he just purchased brings along out of the box, ask me again. With this lousy scheme my customer can even say: hey, it has the formal approval of the “owners of ITIL”, so this must be the right way.
I will now go and “develop” my Excel-based tool for accreditation, directly after a short period of proper mourning for the IT service management practice.