Moviri blog

A Performance Optimization Maturity Model

As stated by Gartner Research: “Since 2005, the average amount of unplanned downtime on mission-critical applications has increased 56%. In large organizations, those with more than 2,500 users, average unplanned downtime has increased 69%.” (Gartner, Dataquest Insight: Unplanned Downtime Rising for Mission-Critical Applications, Ron Silliman, Oct. 3, 2008, Document ID G00161629)

In our experience, the most affordable way to address this issue is to follow a maturity model based on a pragmatic, staged approach that mitigates the risks impacting process development and organizational systems.

Maturity Level 0 – Firefighting

At this stage, there is a complete lack of any process, and management has not yet recognized that there is an issue to be addressed. Day-to-day emergencies are the driver, and firefighting is the only approach to solving problems.

Maturity Level 1 – Single Customer / Application

At this stage, the most critical customer or application is selected because management has evidence that an issue needs to be addressed. However, there is no standardized process yet and ad-hoc approaches tend to be applied on a case-by-case basis.

Some of the key indicators at this level are:

  • No maturity goals are defined
  • Reliance on the knowledge of individuals
  • Testing is used just to demonstrate that the system works
  • No effort is made to track progress

Maturity Level 2 – Monitoring & Testing

At this stage, more than one core business application is monitored, with different vertical tools used by each department (Networking, DBAs, Sysadmins, etc.), and some procedures are put in place by different people undertaking the same task. Management has started initiatives aimed at defining the common foundation needed at the next level.

Some of the key indicators at this level are:

  • Several tools may be used by different departments
  • Lack of a centralized monitoring solution
  • Infrastructure and/or end-to-end monitoring are in place
  • Basic testing techniques and methods are institutionalized
  • No formal training or communication of standardized procedures

Maturity Level 3 – Performance Optimizing

At this stage, some formal procedures have been defined and communicated and a software testing group is established. Management has clearly defined maturity goals and standard tools have been selected. A centralized monitoring solution has been adopted in order to integrate the previous “silo” approach. Typically, the organization is transforming by building up new departments or teams dedicated to testing and monitoring.

Some of the key indicators at this level are:

  • Basic procedures are in place, but they are the formalization of existing practices
  • It is left to the individual to follow these procedures
  • A centralized monitoring solution has been adopted
  • A training program is established

Maturity Level 4 – Business Optimization

At this stage, it is possible to monitor and measure compliance with procedures and to take corrective actions where the process is not working effectively. A process-oriented culture is emerging and processes are under constant improvement.

Some of the key indicators at this level are:

  • Procedures are institutionalized
  • An organization-wide review program is established
  • The testing process is part of the Software Development Lifecycle

Maturity Level 5 – Process Optimization

At this stage, processes have been refined into best practices, based on the results of continuous improvement. Information Technology is able to improve quality and effectiveness, adapting quickly to business changes.

Some of the key indicators at this level are:

  • Statistical quality control methods are used
  • Processes for defect prevention are in place
  • Organization is using Information Technology to achieve business goals
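The staged logic of the model can be sketched as a simple self-assessment: an organization sits at the highest level whose key indicators, together with those of every level below it, are all in place. The following is a minimal illustration with condensed, hypothetical indicator strings drawn from the lists above, not a formal assessment tool:

```python
# Hypothetical, condensed indicators per maturity level (the article's
# key-indicator lists, abbreviated for illustration).
LEVEL_INDICATORS = {
    1: ["most critical customer/application selected"],
    2: ["infrastructure or end-to-end monitoring in place",
        "basic testing techniques institutionalized"],
    3: ["centralized monitoring solution adopted",
        "training program established"],
    4: ["procedures institutionalized",
        "testing part of the software development lifecycle"],
    5: ["statistical quality control methods used",
        "defect prevention processes in place"],
}

def assess(satisfied):
    """Return the highest level whose indicators, and those of all
    lower levels, are satisfied; level 0 means firefighting."""
    level = 0
    for lvl in sorted(LEVEL_INDICATORS):
        if all(ind in satisfied for ind in LEVEL_INDICATORS[lvl]):
            level = lvl
        else:
            break  # a gap at this level caps the maturity rating
    return level

print(assess({
    "most critical customer/application selected",
    "infrastructure or end-to-end monitoring in place",
    "basic testing techniques institutionalized",
}))  # → 2
```

The "break" on the first unsatisfied level reflects the staged nature of the model: skipping a level's foundations (e.g., adopting centralized monitoring without institutionalized testing) does not advance the overall rating.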


  1. Scott Barber:

    Ok, but what’s the point. The only “use” I can envision is as a gimmick for consultants to “get a foot in the door” by providing a “maturity assessment” as a means to get paid to qualify clients.

    I mean, I hope that’s not the point, but I’m having trouble seeing the value.

    Scott Barber
    System Performance Strategist & CTO, PerfTestPlus, Inc.
    Co-Author, Performance Testing Guidance for Web Applications

    “If you can see it in your mind…
    you will find it in your life.”

  2. David Furnari:

    Hello Scott,
thank you for your comment, which gives me the opportunity to clarify some points:

(a) The Performance Management topic is seldom detailed in frameworks like ITIL, CobiT, TMMi, eTOM, etc. The idea is to provide an “itinerary” with some KPIs so that customers can perform a self-assessment before a third-party engagement.

(b) The post should emphasize that organizational and process subjects are the most important when approaching Performance Management. Often, I see that only tools and bare technical knowledge are taken into account.

(c) Besides, our mission, as members of itSMF and CMG, is to improve clients’ awareness of Performance Management topics.

    Best regards,

  3. Performance Testing: The importance of (not) being a factory | Moviri:

    […] no performance regression is occurring from a previous baseline or expected service levels. A previous article from our colleague David Furnari showed the importance of developing a maturity model aimed to […]