

Tying It All Together




Now that you have been introduced to the components of CMMI models, you need to understand how they fit together to meet your process improvement needs. This chapter introduces the concept of levels and shows how the process areas are organized and used. It also discusses some of the key concepts that are significant for applying a CMMI model in the context of service-related work.

CMMI-SVC does not specify that a project or organization must follow a particular process flow or that a certain number of services be delivered per day or specific performance targets be achieved—only that they have processes in place for adequately addressing service-related practices. To determine whether this is so, a project or organization maps its processes to the process areas contained in this document.

The mapping of processes to process areas enables service providers to track their progress against the CMMI-SVC model as they implement processes. It is not intended that every process area of the CMMI-SVC will map one to one with a given organization’s or project’s processes.
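To make the idea of mapping concrete, here is a minimal sketch of one way an organization might record such a mapping. The internal process names are hypothetical, and nothing in CMMI-SVC prescribes this representation; it only illustrates that the mapping need not be one to one in either direction.

    # A minimal sketch of mapping an organization's processes to CMMI-SVC
    # process areas. The internal process names are hypothetical; a single
    # internal process may support several process areas, and a process area
    # may be covered by several internal processes.
    process_to_process_areas = {
        "helpdesk-intake":      ["Service Delivery (SD)",
                                 "Incident Resolution and Prevention (IRP)"],
        "release-checklist":    ["Configuration Management (CM)",
                                 "Service System Transition (SST)"],
        "weekly-status-review": ["Project Monitoring and Control (PMC)"],
    }

    def covered_process_areas(mapping):
        """Process areas touched by at least one internal process."""
        return {pa for areas in mapping.values() for pa in areas}

    print(sorted(covered_process_areas(process_to_process_areas)))

A mapping like this makes gaps visible: any process area absent from the result is one for which no internal process yet claims responsibility.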

Understanding Levels

Levels are used in CMMI to describe an evolutionary path recommended for an organization that wants to improve the processes it uses to provide services. Levels can also be the outcome of the rating activity in appraisals.[4] Appraisals can apply to entire companies or to smaller groups, such as a group of projects or a division within a company.

CMMI supports two improvement paths using levels. One path enables organizations to incrementally improve processes corresponding to an individual process area (or process areas) selected by the organization. The other path enables organizations to improve a set of related processes by incrementally addressing successive sets of process areas.

These two improvement paths are associated with the two types of levels, capability levels and maturity levels. These levels correspond to two approaches to process improvement called representations. The two representations are continuous and staged. The continuous representation has capability levels. The staged representation has maturity levels.

Regardless of which representation you select, the level concept is the same. Levels characterize improvement from an ill-defined state to a state that uses quantitative information to determine and manage improvements that are needed to meet an organization’s business objectives.

To reach a particular level, an organization must satisfy all of the appropriate goals of the process area or set of process areas that are targeted for improvement, regardless of whether it is a capability or a maturity level.

Both representations provide ways to implement process improvement to achieve business objectives, and both provide the same essential content and use the same model components.

Structures of the Continuous and Staged Representations

Figure 3.1 illustrates the structures of the continuous and staged representations. The differences jump out at you immediately when you look at these structures. The staged representation utilizes maturity levels, whereas the continuous representation utilizes capability levels.

Figure 3.1: Structure of the Continuous and Staged Representations

What may strike you as you compare these two representations is their similarity. Both have many of the same components (e.g., process areas, specific goals, and specific practices), and these components have the same hierarchy and configuration.

What is not readily apparent from the high-level view in Figure 3.1 is that the continuous representation focuses on process area capability as measured by capability levels and the staged representation focuses on organizational maturity as measured by maturity levels. These dimensions (the capability/maturity dimensions) of CMMI are used for benchmarking and appraisal activities, as well as guiding an organization’s improvement efforts.

· Capability levels apply to an organization’s process improvement achievement in individual process areas. These levels are a means for incrementally improving the processes corresponding to a given process area. There are six capability levels, which are numbered 0 through 5.

· Maturity levels apply to an organization’s process improvement achievement across multiple process areas. These levels are a means of predicting the general outcomes of the next project undertaken. There are five maturity levels, numbered 1 through 5.

Table 3.1 compares the six capability levels to the five maturity levels. Notice that the names of four of the levels are the same in both representations. The differences are that there is no maturity level 0 and at level 1, the capability level is Performed, whereas the maturity level is Initial. Therefore, the starting point is different.

Table 3.1 Comparison of Capability and Maturity Levels

Level   | Continuous Representation: Capability Levels | Staged Representation: Maturity Levels
Level 0 | Incomplete                                   | (not applicable)
Level 1 | Performed                                    | Initial
Level 2 | Managed                                      | Managed
Level 3 | Defined                                      | Defined
Level 4 | Quantitatively Managed                       | Quantitatively Managed
Level 5 | Optimizing                                   | Optimizing

The continuous representation is concerned with selecting both a particular process area to improve and the desired capability level for that process area. In this context, whether a process is performed or incomplete is important. Therefore, the name incomplete is given to the continuous representation starting point.

Because the staged representation is concerned with the overall maturity of the organization, whether individual processes are performed or incomplete is not the primary focus. Therefore, the name initial is given to the staged representation starting point.

Both capability levels and maturity levels provide a way to measure how well organizations can and do improve their processes. However, the associated approach to process improvement is different.

Understanding Capability Levels

To support those using the continuous representation, all CMMI models reflect capability levels in their design and content.

The six capability levels, designated by the numbers 0 through 5, are as follows:

0. Incomplete

1. Performed

2. Managed

3. Defined

4. Quantitatively Managed

5. Optimizing

A capability level for a process area is achieved when all of the generic goals are satisfied up to that level. The fact that capability levels 2 through 5 use the same terms as generic goals 2 through 5 is intentional: each generic goal and its associated practices express the meaning of the corresponding capability level in terms of goals and practices. (See the “Generic Goals and Generic Practices” section in Part Two for more information about generic goals and practices.) A short description of each capability level follows.
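As an illustration only (CMMI defines no algorithm for this), the sketch below encodes the rule just stated: a process area's capability level is the highest level whose generic goals, and all those below it, are satisfied, with level 1 requiring that the specific goals be met. The boolean inputs are assumed to come from an appraisal of a single process area.

    # Illustrative sketch of the capability level rule for one process area.
    def capability_level(specific_goals_met, generic_goals_met):
        """generic_goals_met maps generic goal number (2..5) to True/False."""
        if not specific_goals_met:
            return 0  # Incomplete: specific goals not satisfied
        level = 1     # Performed: specific goals satisfied
        for gg in (2, 3, 4, 5):
            if generic_goals_met.get(gg, False):
                level = gg  # goals satisfied up to this level
            else:
                break       # a gap at this level caps the rating
        return level

    print(capability_level(True, {2: True, 3: True}))  # -> 3 (Defined)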

Capability Level 0: Incomplete

An incomplete process is a process that either is not performed or is only partially performed. One or more of the specific goals of the process area are not satisfied, and no generic goals exist for this level because there is no reason to institutionalize a partially performed process.

Capability Level 1: Performed

A capability level 1 process is characterized as a performed process. A performed process is a process that satisfies the specific goals of the process area. It supports and enables the work needed to provide services.

Although capability level 1 results in important improvements, those improvements can be lost over time if they are not institutionalized. The application of institutionalization (the CMMI generic practices at capability levels 2 through 5) helps to ensure that improvements are maintained.

Capability Level 2: Managed

A capability level 2 process is characterized as a managed process. A managed process is a performed (capability level 1) process that has the basic infrastructure in place to support the process. It is planned and executed in accordance with policy; employs skilled people who have adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description. The process discipline reflected by capability level 2 helps to ensure that existing practices are retained during times of stress.

Capability Level 3: Defined

A capability level 3 process is characterized as a defined process. A defined process is a managed (capability level 2) process that is tailored from the organization’s set of standard processes according to the organization’s tailoring guidelines and contributes work products, measures, and other process improvement information to the organizational process assets.

A critical distinction between capability levels 2 and 3 is the scope of standards, process descriptions, and procedures. At capability level 2, the standards, process descriptions, and procedures may be quite different in each specific instance of the process (e.g., on a particular project). At capability level 3, the standards, process descriptions, and procedures for a project are tailored from the organization’s set of standard processes to suit a particular project or organizational unit and therefore are more consistent, except for the differences allowed by the tailoring guidelines.

Another critical distinction is that at capability level 3 processes are typically described more rigorously than at capability level 2. A defined process clearly states the purpose, inputs, entry criteria, activities, roles, measures, verification steps, outputs, and exit criteria. At capability level 3, processes are managed more proactively using an understanding of the interrelationships of the process activities and detailed measures of the process and its work products.

Capability Level 4: Quantitatively Managed

A capability level 4 process is characterized as a quantitatively managed process. A quantitatively managed process is a defined (capability level 3) process that is controlled using statistical and other quantitative techniques. Quantitative objectives for quality and process performance are established and used as criteria in managing the process. Quality and process performance is understood in statistical terms and is managed throughout the life of the process.

Capability Level 5: Optimizing

A capability level 5 process is characterized as an optimizing process. An optimizing process is a quantitatively managed (capability level 4) process that is improved based on an understanding of the common causes of variation inherent in the process. The focus of an optimizing process is on continually improving the range of process performance through both incremental and innovative improvements.

Remember that capability levels 2 through 5 use the same terms as generic goals 2 through 5, and a detailed description of these terms appears in the “Generic Goals and Generic Practices” section in Part Two.

Advancing Through Capability Levels

The capability levels of a process area are achieved through the application of generic practices or suitable alternatives to the processes associated with that process area.

Reaching capability level 1 for a process area is equivalent to saying that the processes associated with that process area are performed processes.

Reaching capability level 2 for a process area is equivalent to saying that there is a policy that indicates you will perform the process. There is a plan for performing it, resources are provided, responsibilities are assigned, training to perform it is provided, selected work products related to performing the process are controlled, and so on. In other words, a capability level 2 process can be planned and monitored just like any project or support activity.

Reaching capability level 3 for a process area assumes that an organizational standard process associated with that process area exists and can be tailored to the needs of the project. The processes in the organization are now more consistently defined and applied because they are based on organizational standard processes.

Reaching capability level 4 for a process area assumes that this process area is a key business driver that the organization wants to manage using quantitative and statistical techniques. This analysis gives the organization more visibility into the performance of selected subprocesses, which will make it more competitive in the marketplace.

Reaching capability level 5 for a process area assumes that you have stabilized the selected subprocesses and that you want to reduce the common causes of variation in that process. Remember that variation is a natural occurrence in any process, so although it is conceptually feasible to improve all processes, it is not economical to improve all processes to level 5. Again, you want to concentrate on those processes that help you to meet your business objectives.

Understanding Maturity Levels

To support those using the staged representation, all CMMI models reflect maturity levels in their design and content. A maturity level consists of related specific and generic practices for a predefined set of process areas that improve the organization’s overall performance. The maturity level of an organization provides a way to predict an organization’s performance in a given discipline or set of disciplines. Experience has shown that organizations do their best when they focus their process improvement efforts on a manageable number of process areas at a time and that those areas require increasing sophistication as the organization improves.

A maturity level is a defined evolutionary plateau for organizational process improvement. Each maturity level matures an important subset of the organization’s processes, preparing it to move to the next maturity level. The maturity levels are measured by the achievement of the specific and generic goals associated with each predefined set of process areas.

There are five maturity levels, each a layer in the foundation for ongoing process improvement, designated by the numbers 1 through 5:

1. Initial

2. Managed

3. Defined

4. Quantitatively Managed

5. Optimizing

Remember that maturity levels 2 through 5 use the same terms as capability levels 2 through 5. This was intentional because the concepts of maturity levels and capability levels are complementary. Maturity levels are used to characterize organizational improvement relative to a set of process areas, and capability levels characterize organizational improvement relative to an individual process area.

Maturity Level 1: Initial

At maturity level 1, processes are usually ad hoc and chaotic. The organization usually does not provide a stable environment to support processes. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this chaos, maturity level 1 organizations provide services that often work, but they frequently exceed the budget and schedule documented in their plans.

Maturity level 1 organizations are characterized by a tendency to overcommit, abandonment of processes in a time of crisis, and an inability to repeat their successes.

Maturity Level 2: Managed

At maturity level 2, projects establish the foundation for an organization to become an effective service provider by institutionalizing basic project management and service establishment and delivery practices. Projects define a project strategy, create project plans, and monitor and control the project to ensure the service is delivered as planned. The service provider establishes agreements with customers as well as develops and manages customer and contractual requirements. Configuration management and process and product quality assurance are institutionalized, and the service provider also develops the capability to measure and analyze process performance.

At maturity level 2, projects, processes, work products, and services are managed. The service provider ensures that processes are planned in accordance with policy. In order to execute the process, the service provider provides adequate resources, assigns responsibility for performing the process, trains people on the process, and ensures the designated work products of the process are under appropriate levels of configuration management. The service provider identifies and involves appropriate stakeholders and periodically monitors and controls the process. Process adherence is periodically evaluated and process performance is shared with senior management. The process discipline reflected by maturity level 2 helps to ensure that existing practices are retained during times of stress.

Maturity Level 3: Defined

At maturity level 3, service providers use defined processes for managing projects. They embed tenets of project management and services best practices, such as service continuity and incident resolution and prevention, into the standard process set. The service provider verifies that selected work products meet their requirements and validates services to ensure they meet the needs of the customer and end user. These processes are well characterized and understood and are described in standards, procedures, tools, and methods.

The organization’s set of standard processes, which is the basis for maturity level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization’s set of standard processes according to tailoring guidelines. (See the glossary for a definition of “organization’s set of standard processes.”)

A critical distinction between maturity levels 2 and 3 is the scope of standards, process descriptions, and procedures. At maturity level 2, the standards, process descriptions, and procedures may be quite different in each specific instance of the process (e.g., on a particular project). At maturity level 3, the standards, process descriptions, and procedures for a project are tailored from the organization’s set of standard processes to suit a particular project or organizational unit and therefore are more consistent except for the differences allowed by the tailoring guidelines.

Another critical distinction is that at maturity level 3, processes are typically described more rigorously than at maturity level 2. A defined process clearly states the purpose, inputs, entry criteria, activities, roles, measures, verification steps, outputs, and exit criteria. At maturity level 3, processes are managed more proactively using an understanding of the interrelationships of process activities and detailed measures of the process, its work products, and its services.

At maturity level 3, the organization must further mature the maturity level 2 process areas. Generic practices associated with generic goal 3 that were not addressed at maturity level 2 are applied to achieve maturity level 3.

For example, the purpose of Risk Management, a maturity level 3 process area, is to identify and assess project risks during project planning and to manage these risks throughout the project.

Maturity Level 4: Quantitatively Managed

At maturity level 4, service providers establish quantitative objectives for quality and process performance and use them as criteria in managing processes. Quantitative objectives are based on needs of the customer, end users, organization, and process implementers. Quality and process performance is understood in statistical terms and is managed throughout the life of processes.

For selected subprocesses, specific measures of process performance are collected and statistically analyzed. When selecting processes or subprocesses for analysis, it is critical to understand the relationships between different processes and subprocesses and their impact on the service provider’s performance in delivering the product specified by the customer. Such an approach helps to ensure that quantitative and statistical management is applied where it has the most overall value to the business. Performance models are used to set objectives for service provider performance and to help achieve business objectives.

A critical distinction between maturity levels 3 and 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques and is quantitatively predictable. At maturity level 3, processes are typically only qualitatively predictable.

Maturity Level 5: Optimizing

At maturity level 5, an organization continually improves its processes based on a quantitative understanding of the common causes of variation inherent in processes. (See the definition of “common cause of process variation” in the glossary.)

Maturity level 5 focuses on continually improving process performance through incremental and innovative process and technology improvements that enhance the service provider’s ability to meet its quality and process-performance objectives.

Quantitative process improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. The effects of deployed process improvements are measured and compared to quantitative process improvement objectives. Both the defined processes and the organization’s set of standard processes are targets of measurable improvement activities.

A critical distinction between maturity levels 4 and 5 is the type of process variation addressed. At maturity level 4, the organization is concerned with addressing special causes of process variation and providing statistical predictability of results. Although processes may produce predictable results, the results may be insufficient to achieve established objectives. At maturity level 5, the organization is concerned with addressing common causes of process variation and changing the process (to shift the mean of the process performance or reduce the inherent process variation experienced) to improve process performance and to achieve established quantitative process improvement objectives.

Advancing Through Maturity Levels

Organizations can achieve progressive improvements in their organizational maturity by achieving control first at the project level and continuing to the most advanced level—organization-wide continuous process improvement—using both quantitative and qualitative data to make decisions.

Since improved organizational maturity is associated with improvement in the range of expected results that can be achieved by an organization, it is one way of predicting general outcomes of the organization’s next project. For instance, at maturity level 2, the organization has been elevated from ad hoc to disciplined by establishing sound project management. As your organization achieves generic and specific goals for the set of process areas in a maturity level, you are increasing your organizational maturity and reaping the benefits of process improvement. Because each maturity level forms a necessary foundation for the next level, trying to skip maturity levels is usually counterproductive.

At the same time, you must recognize that process improvement efforts should focus on the needs of the organization in the context of its business environment and that process areas at higher maturity levels address the current needs of an organization or project. For example, organizations seeking to move from maturity level 1 to maturity level 2 are frequently encouraged to establish a process group, which is addressed by the Organizational Process Focus process area that resides at maturity level 3. Although a process group is not a necessary characteristic of a maturity level 2 organization, it can be a useful part of the organization’s approach to achieving maturity level 2.

This situation is sometimes characterized as establishing a maturity level 1 process group to bootstrap the maturity level 1 organization to maturity level 2. Maturity level 1 process improvement activities may depend primarily on the insight and competence of the process group staff until an infrastructure to support more disciplined and widespread improvement is in place.

Organizations can institute process improvements anytime they choose, even before they are prepared to advance to the maturity level at which the specific practice is recommended. In such situations, however, organizations should understand that the success of these improvements is at risk because the foundation for their successful institutionalization has not been completed. Processes without the proper foundation may fail at the point they are needed most—under stress.

A defined process that is characteristic of a maturity level 3 organization can be placed at great risk if maturity level 2 management practices are deficient. For example, management may commit to a poorly planned schedule or fail to control changes to baselined requirements. Similarly, many organizations prematurely collect the detailed data characteristic of maturity level 4, only to find the data uninterpretable because of inconsistencies in processes and measurement definitions.

Process Areas

Process areas are viewed differently in the two representations. Figure 3.2 compares views of how process areas are used in the continuous representation and the staged representation.

Figure 3.2: Process Areas in Continuous and Staged Representations

The continuous representation enables the organization to choose the focus of its process improvement efforts by choosing those process areas, or sets of interrelated process areas, that best benefit the organization and its business objectives. Although there are some limits on what an organization can choose because of the dependencies among process areas, the organization has considerable freedom in its selection.

To support those using the continuous representation, process areas are organized into four categories: Process Management, Project Management, Service Establishment and Delivery, and Support. These categories emphasize some of the key relationships that exist among the process areas.

Once you select process areas, you must also select how much you would like to mature processes associated with those process areas (i.e., select the appropriate capability level). Capability levels and generic goals and practices support the improvement of processes associated with individual process areas. For example, an organization may wish to reach capability level 2 in one process area and capability level 4 in another. As the organization reaches a capability level, it sets its sights on the next capability level for one of these same process areas or decides to widen its view and address a larger number of process areas.

This selection of a combination of process areas and capability levels is typically described in a target profile. A target profile defines all of the process areas to be addressed and the targeted capability level for each. This profile governs which goals and practices the organization will address in its process improvement efforts.

Most organizations, at minimum, target capability level 1, which requires that all specific goals of the process area be achieved. However, organizations that target capability levels higher than 1 concentrate on the institutionalization of selected processes in the organization by implementing the associated generic goals and practices.

The staged representation provides a predetermined path of improvement from maturity level 1 to maturity level 5 that involves achieving the goals of the process areas at each maturity level. To support those using the staged representation, process areas are grouped by maturity level, indicating which process areas to implement to achieve each maturity level. For example, at maturity level 2, there is a set of process areas that an organization would use to guide its process improvement until it could achieve all the goals of all these process areas. Once maturity level 2 is achieved, the organization focuses its efforts on maturity level 3 process areas, and so on. The generic goals that apply to each process area are also predetermined. Generic goal 2 applies to maturity level 2 and generic goal 3 applies to maturity levels 3 through 5.

Table 3.2 provides a list of CMMI-SVC process areas and their associated categories and maturity levels.

Table 3.2 Process Areas and Their Associated Categories and Maturity Levels

Process Area                                   | Category                           | Maturity Level
Capacity and Availability Management (CAM)     | Project Management                 | 3
Causal Analysis and Resolution (CAR)           | Support                            | 5
Configuration Management (CM)                  | Support                            | 2
Decision Analysis and Resolution (DAR)         | Support                            | 3
Integrated Project Management (IPM)            | Project Management                 | 3
Incident Resolution and Prevention (IRP)       | Service Establishment and Delivery | 3
Measurement and Analysis (MA)                  | Support                            | 2
Organizational Innovation and Deployment (OID) | Process Management                 | 5
Organizational Process Definition (OPD)        | Process Management                 | 3
Organizational Process Focus (OPF)             | Process Management                 | 3
Organizational Process Performance (OPP)       | Process Management                 | 4
Organizational Training (OT)                   | Process Management                 | 3
Project Monitoring and Control (PMC)           | Project Management                 | 2
Project Planning (PP)                          | Project Management                 | 2
Process and Product Quality Assurance (PPQA)   | Support                            | 2
Quantitative Project Management (QPM)          | Project Management                 | 4
Requirements Management (REQM)                 | Project Management                 | 2
Risk Management (RSKM)                         | Project Management                 | 3
Supplier Agreement Management (SAM)            | Project Management                 | 2
Service Continuity (SCON)                      | Project Management                 | 3
Service Delivery (SD)                          | Service Establishment and Delivery | 2
Service System Development (SSD)               | Service Establishment and Delivery | 3
Service System Transition (SST)                | Service Establishment and Delivery | 3
Strategic Service Management (STSM)            | Service Establishment and Delivery | 3

Equivalent Staging

Equivalent staging is a way to compare results from using the continuous representation to those of the staged representation. In essence, if you measured improvement relative to selected process areas using capability levels in the continuous representation, how would you compare that to maturity levels? Is this possible?

Up to this point, we have not discussed process appraisals in much detail. The SCAMPI method[5] is used to appraise organizations using CMMI, and one result of an appraisal is a rating [SEI 2006b, Ahern 2005]. If the continuous representation is used for an appraisal, the rating is a capability level profile. If the staged representation is used for an appraisal, the rating is a maturity level (e.g., maturity level 3) rating.

A capability level profile is a list of process areas and the corresponding capability level achieved for each. This profile enables an organization to track its capability level by process area. The profile is an achievement profile when it represents the organization’s actual progress for each process area. Alternatively, the profile is a target profile when it represents the organization’s planned process improvement objectives. Figure 3.3 illustrates a combined target and achievement profile. The gray portion of each bar represents what has been achieved. The unshaded portion represents what remains to be accomplished to meet the target profile.

Figure 3.3: An Example of a Target and Achievement Profile

An achievement profile, when compared with a target profile, enables an organization to plan and track its progress for each selected process area. Maintaining capability level profiles is advisable when using the continuous representation.
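A hedged sketch of this bookkeeping follows; the process areas and level values are invented for illustration, mirroring the gray (achieved) and unshaded (remaining) portions of Figure 3.3.

    # Comparing an achievement profile against a target profile.
    # All profile values here are hypothetical examples.
    target   = {"CM": 3, "MA": 3, "SD": 2, "IRP": 2}   # planned objectives
    achieved = {"CM": 3, "MA": 2, "SD": 1, "IRP": 0}   # actual progress

    for pa in sorted(target):
        gap = target[pa] - achieved.get(pa, 0)
        status = "target met" if gap <= 0 else f"{gap} level(s) remaining"
        print(f"{pa}: achieved CL{achieved.get(pa, 0)}, "
              f"target CL{target[pa]} ({status})")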

Target staging is a sequence of target profiles that describes the path of process improvement to be followed by the organization. When building target profiles, the organization should pay attention to the dependencies between generic practices and process areas. If a generic practice depends on a process area, either to carry out the generic practice or to provide a prerequisite product, the generic practice may be much less effective when the process area is not implemented.[6]

Although there are many reasons to use the continuous representation, ratings consisting of capability level profiles are limited in their ability to provide organizations with a way to generally compare themselves with other organizations. Capability level profiles could be used if each organization selected the same process areas; however, maturity levels have been used to compare organizations for years and already provide predefined sets of process areas.

Because of this situation, equivalent staging was created. Equivalent staging enables an organization using the continuous representation for an appraisal to convert a capability level profile to the associated maturity level rating.

The most effective way to depict equivalent staging is to provide a sequence of target profiles, each of which is equivalent to a maturity level rating of the staged representation. The result is a target staging that is equivalent to the maturity levels of the staged representation.

Figure 3.4 summarizes the target profiles that must be achieved when using the continuous representation to be equivalent to maturity levels 2 through 5. Each target profile groups a set of process areas with the capability level they must reach to be equivalent to a maturity level.

Target Profile 2 (all of the following process areas at capability level 2 or higher):
Configuration Management (CM), Measurement and Analysis (MA), Project Monitoring and Control (PMC), Project Planning (PP), Process and Product Quality Assurance (PPQA), Requirements Management (REQM), Service Delivery (SD), Supplier Agreement Management (SAM)

Target Profile 3 (every process area in Target Profile 2 plus all of the following, each at capability level 3 or higher):
Capacity and Availability Management (CAM), Decision Analysis and Resolution (DAR), Integrated Project Management (IPM), Incident Resolution and Prevention (IRP), Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Training (OT), Risk Management (RSKM), Service Continuity (SCON), Service System Development (SSD), Service System Transition (SST), Strategic Service Management (STSM)

Target Profile 4 (every process area in Target Profile 3 plus the following, each at capability level 3 or higher):
Organizational Process Performance (OPP), Quantitative Project Management (QPM)

Target Profile 5 (every process area in Target Profile 4 plus the following, each at capability level 3 or higher):
Causal Analysis and Resolution (CAR), Organizational Innovation and Deployment (OID)

Figure 3.4: Target Profiles and Equivalent Staging

The following rules summarize equivalent staging:

· To achieve maturity level 2, all process areas assigned to maturity level 2 must achieve capability level 2 or higher.

· To achieve maturity level 3, all process areas assigned to maturity levels 2 and 3 must achieve capability level 3 or higher.

· To achieve maturity level 4, all process areas assigned to maturity levels 2, 3, and 4 must achieve capability level 3 or higher.

· To achieve maturity level 5, all process areas must achieve capability level 3 or higher.
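Because these rules are mechanical, they can be expressed directly in code. The sketch below does so using the maturity level assignments from Table 3.2; the input is a capability level profile of the kind described earlier, mapping process area abbreviations to achieved capability levels.

    # The four equivalent staging rules applied to a capability level profile.
    PA_MATURITY_LEVEL = {
        "CM": 2, "MA": 2, "PMC": 2, "PP": 2, "PPQA": 2, "REQM": 2,
        "SAM": 2, "SD": 2,
        "CAM": 3, "DAR": 3, "IPM": 3, "IRP": 3, "OPD": 3, "OPF": 3,
        "OT": 3, "RSKM": 3, "SCON": 3, "SSD": 3, "SST": 3, "STSM": 3,
        "OPP": 4, "QPM": 4,
        "CAR": 5, "OID": 5,
    }

    def equivalent_maturity_level(profile):
        """Highest maturity level whose equivalent staging rule is met."""
        def satisfies(ml):
            # Maturity level 2 requires CL2+ on ML2 process areas; levels
            # 3 through 5 require CL3+ on all process areas at or below ml.
            required = 2 if ml == 2 else 3
            return all(profile.get(pa, 0) >= required
                       for pa, pa_ml in PA_MATURITY_LEVEL.items()
                       if pa_ml <= ml)
        result = 1
        for ml in (2, 3, 4, 5):
            if satisfies(ml):
                result = ml
            else:
                break
        return result

    # Example: every process area at CL3 is equivalent to maturity level 5.
    print(equivalent_maturity_level({pa: 3 for pa in PA_MATURITY_LEVEL}))  # -> 5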

These rules and the table for equivalent staging are complete; however, you may ask why target profiles 4 and 5 do not extend into the CL4 and CL5 columns. The reason is that maturity level 4 process areas describe a selection of the subprocesses to be stabilized based, in part, on the quality and process-performance objectives of the organization and projects. Not every process area will be addressed in the selection and CMMI does not presume in advance which process areas might be addressed in the selection.

So, the achievement of capability level 4 for process areas cannot be predetermined because the choices depend on the selections made by the organization in its implementation of the maturity level 4 process areas. Thus, Figure 3.4 does not show target profile 4 extending into the CL4 column, although some process areas will have achieved capability level 4. The situation for maturity level 5 and target profile 5 is similar.

The existence of equivalent staging should not discourage users of the continuous representation from establishing target profiles that extend above capability level 3. Such a target profile would be determined in part by the selections made by the organization to meet its business objectives.

Understanding Key Concepts in the Use of CMMI for Services

The concepts and terms explained so far, such as process areas, capability levels, and equivalent staging, are common to all CMMI models. However, some additional terms are particularly significant in the CMMI for Services model. Although all of them are defined in the Glossary, each employs words that can cover a range of possible meanings for people coming from different backgrounds, so these terms merit some additional discussion to ensure that readers do not misinterpret model material that uses them.

Service

The most important of these terms is probably the word service itself, which the Glossary defines as a product that is intangible and non-storable. While this definition accurately captures the intended scope of meaning for service, it does not highlight some of the possible subtleties or misunderstandings of this concept in the CMMI context.

The first point to highlight is that a service is a kind of product, given this definition. Many people routinely think of products and services as two mutually exclusive categories. In CMMI models, however, products and services are not disjoint categories: a service is considered to be a special variety of product. Any reference to products can be assumed to refer to services as well. If you find a need to refer to a category of products that are not services in a CMMI context, you may find it helpful to use the term goods, as in the commonly used and understood phrase “goods and services.” (For historical reasons, portions of CMMI models still use the phrase “products and services” on occasion. However, this usage is always intended to explicitly remind the reader that services are included in the discussion.)

A second possible point of confusion is between services and processes, especially because both terms refer to entities that are by nature intangible and non-storable, and because both concepts are intrinsically linked. However, in CMMI models, processes are activities, while services are a useful result of performing those activities. For example, an organization that provides training services performs training processes (activities) that are intended to leave the recipients of the training in a more knowledgeable state. This useful state of affairs (i.e., being more knowledgeable) is the service that the training provider delivers, or attempts to deliver. If the training processes are performed but the recipients fail to become more knowledgeable (perhaps because the training is poorly designed, or the recipients do not have some necessary preliminary knowledge), then the service (the useful result) has not actually been delivered. Services are the results of processes (performed as part of a collection of resources), not the processes themselves.

A final possible point of confusion over the meaning of service will be apparent to those with a background in information technology, especially those familiar with disciplines like service-oriented architecture (SOA) or software as a service (SaaS). In a software context, services are typically thought of as methods, components, or building blocks of a larger automated system, rather than as the results produced by that system. In CMMI models, services are useful intangible and non-storable results delivered through the operation of a service system, which may or may not have any automated components. To completely resolve this possible confusion, an understanding of the service system concept is necessary.

Service System

A service is delivered through the operation of a service system, which the Glossary defines as an integrated and interdependent combination of component resources that satisfies service requirements. The use of the word system in service system may suggest to some that service systems are a variety of information technology, and that they must have hardware, software, and other conventional IT components. This interpretation is too restrictive. While it is possible for some components of a service system to be implemented with information technology, it is also possible to have a service system that uses little or no information technology at all.

In this context, the word system should be interpreted in the broader sense of “a regularly interacting or interdependent group of items forming a unified whole,” a typical dictionary definition. Also, systems created by people usually have an intended unifying purpose, as well as a capability to operate or behave in intended ways. Consider a package delivery system, a health care system, or an education system as examples of service systems with a wide variety of integrated and interdependent component resources.

Some may still have trouble with this interpretation because they may feel that the way they deliver services is not systematic, does not involve identifiable “components,” or is too small or difficult to view through the lens of a systems perspective. While this may in some cases be true for service provider organizations with relatively immature practices, part of the difficulty may also be traced to an overly narrow interpretation of the word resources in the definition of service system.

The full extent of a service system encompasses everything required for service delivery, including work products, processes, tools, facilities, consumable items, and human resources. Some of these resources may belong to customers or suppliers, and some may be transient (in the sense that they are only part of the service system for a limited time). But all of these resources become part of a service system if they are needed in some way to enable service delivery.

Because of this broad range of included resource types and the relationships among them, a service system can be something very large and complex, with extensive facilities and tangible components (e.g., a service system for health care or for transportation). Alternatively, a service system could consist primarily of people and processes (e.g., for an independent verification and validation service). Since every service provider organization using the CMMI-SVC model has, at a minimum, both people and process resources, it should be able to apply the service system concept successfully.

Service providers who are not used to thinking of their methods, tools, and personnel for service delivery from a broad systems perspective may need to expend some effort to reframe their concept of service delivery to accommodate this perspective. The benefits of doing so are great, however, because critical and otherwise unnoticed resources and dependencies between resources will become visible for the first time. This insight will enable the service provider organization to effectively improve its operations over time without being caught by surprises or wasting resources on incompletely addressing a problem.

Service Agreement

A service agreement is the foundation of the joint understanding between a service provider and customer of what to expect from their mutual relationship. The Glossary defines a service agreement as a binding, written record of a promised exchange of value between a service provider and a customer. Service agreements can appear in a wide variety of forms, ranging from simple posted menus of services and their prices, to tickets or signs with fine print that refer to terms and conditions described elsewhere, to complex multi-part documents that are included as part of legal contracts. Whatever they may contain, it is essential that service agreements be recorded in a form that both the service provider and customer can access and understand so that misunderstandings are minimized.

The “promised exchange of value” implies that each party to the agreement commits to providing the other party or parties with something they need or want. A common situation is for the service provider to deliver needed services and for the customer to pay money in return, but many other types of arrangements are possible. For example, an operating level agreement (OLA) between organizations in the same enterprise may only require the customer organization to notify the service provider organization when certain services are needed. Service agreements for public services provided by governments, municipal agencies, and non-profit organizations may simply document what services are available and identify what steps end users must follow to get those services. In some cases, the only thing the service provider needs or wants from the customer or end user is specific information required to enable service delivery.

See the Glossary for additional discussion of the terms service agreement, service level agreement, customer, and end user.

Service Request

Even given a service agreement, customers and end users must be able to notify the service provider of their needs for specific instances of service delivery. In the CMMI-SVC model, these notifications are called service requests, and they can be communicated in every conceivable way, including face-to-face encounters, phone calls, all varieties of written media, and even non-verbal signals (e.g., pressing a button to call a bus to a bus stop).

However it is communicated, a service request identifies one or more desired services that the request originator expects to fall within the scope of an existing service agreement. These requests are often generated over time by customers and end users as their needs develop. In this sense, service requests are expected intentional actions that are an essential part of service delivery; they are the primary triggering events that cause service delivery to occur. (Of course, it is possible for the originator of a request to be mistaken about whether or not it is actually within the scope of agreed-on services.)

Sometimes specific service requests may be incorporated directly into service agreements themselves. This is often the case for services that are to be performed repeatedly or continuously over time (e.g., a cleaning service with a specific expected cleaning schedule, or a network management service that must provide 99.9% network availability for the life of the service agreement). Even in these situations, ad hoc service requests may also be generated when needed, and the service provider should be prepared to deliver services in response to both types of requests.
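As a small illustration (the record fields, customer, and service names are all hypothetical), a provider’s intake tooling might screen an incoming request against the scope of the governing agreement before dispatching it:

    # Hypothetical sketch: screening a service request against the scope
    # of an existing service agreement.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ServiceAgreement:
        customer: str
        covered_services: frozenset

    @dataclass(frozen=True)
    class ServiceRequest:
        customer: str
        service: str

    def in_scope(request, agreement):
        return (request.customer == agreement.customer
                and request.service in agreement.covered_services)

    sla = ServiceAgreement("Acme", frozenset({"network-support", "backup"}))
    print(in_scope(ServiceRequest("Acme", "backup"), sla))    # True
    print(in_scope(ServiceRequest("Acme", "catering"), sla))  # False

A check like this reflects the point above: the originator of a request may be mistaken about whether it actually falls within the scope of agreed-on services.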

Service Incident

Even with the best planning, monitoring, and delivery of services, unwanted and unintended events can still occur. Some instances of service delivery may have lower than expected or acceptable degrees of performance or quality, or may be completely unsuccessful. The CMMI-SVC model refers to these difficulties as service incidents, which the Glossary defines as an indication of an actual or potential interference with a service. The single word incident is used in place of service incident when the context makes the meaning clear.

Like requests, incidents require some recognition and response by the service provider; but unlike requests, incidents are unintended events, although some types of incidents may be anticipated. Whether or not they are anticipated, incidents must be resolved in some way by the service provider. In some service types and service provider organizations, service requests and incidents are both managed and resolved through common processes, personnel, and tools. The CMMI-SVC model is compatible with this kind of approach, but does not require it, as it is not appropriate for all types of services.

The use of the word “potential” in the definition of service incident is deliberate and significant; it means that incidents do not always have to involve actual interference with or failure of service delivery. Indications that a service may have been insufficient or unsuccessful are also incidents, as are indications that it may be insufficient or unsuccessful in the future. (Customer complaints are an almost universal example of this type of incident, because they are always indications that service delivery may have been inadequate.) This aspect of incidents is often overlooked, but it is important: failure to address and resolve potential interference with services is likely to lead eventually to actual interference, and possibly to a failure to satisfy service agreements.

Project

While it refers to a concept that is used across all CMMI models, the term project deserves some special clarification in the context of the CMMI-SVC model. It is likely that no other single word in the model has the same potential to raise misunderstandings, questions, and even objections.

Those with prior experience using other CMMI models, or who routinely think of their work as part of a project-style work arrangement, may wonder where the difficulty lies. The CMMI Glossary defines a project as a managed set of interrelated resources that delivers one or more products or services to a customer or end user, and continues by declaring that a project has a definite beginning (i.e., project startup) and typically operates according to a plan. These are conventional characteristics of a project according to many definitions, so why is there an issue? Why might there be a difficulty with applying terms like project planning or project management in some service provider organizations?

One simple reason is that many people work on or know of projects that have a definite end as well as a definite beginning; such projects are focused on accomplishing an objective by a certain point in time. In fact, the Glossary in prior versions of CMMI models (i.e., prior to V1.2) specifically included a definite end as part of the definition of project. This older and more restrictive definition reflected the legacy and original focus of CMMI (and the other maturity models that preceded it), which was principally on development efforts that do normally come to some expected end once an overall objective has been reached. While some services follow this same pattern, many are delivered over time without an expected definite end (e.g., services from businesses that intend to offer them indefinitely, or typical municipal services). Service providers in these contexts would naturally be reluctant to describe their service delivery work as a project under this definition.

However, for the latest (V1.2) CMMI models, the definition of project was deliberately changed to eliminate this limitation, in part to allow the term to be applied more easily to the full range of services types. Projects must be planned, but they do not need to have a planned end, and this broader definition can therefore make sense in the context of all service delivery (provided that CMMI model users are willing to suppress an expectation that all projects must come to an end).

Even given this adjustment, some people may still have difficulty thinking of the delivery of services as being a project, which often carries the connotation of trying to accomplish an overall objective by following some preset plan. Many services are delivered in response to what are effectively small independent objectives established over time—individual service requests—in ways that are not planned in advance according to predetermined milestones. In these circumstances, service providers are often not used to thinking of a single objective to be accomplished. Therefore, characterizing their work arrangements as projects may seem awkward at best.

For this reason, the CMMI-SVC model explicitly interprets the term project to encompass all of the resources required to satisfy a service agreement with a customer. Satisfaction of the terms of the service agreement becomes the overall objective under which individual service requests are handled. Planning the effort to satisfy the service agreement is required in the form of work structures, resource allocations, schedules, and other typical project planning work products and processes. If you think of a service agreement as outlining the scope of a project in this way, the use of project in a service context becomes less of a problem.

Even better, the Glossary includes notes explaining that a project can be composed of projects. These additional notes mean that interrelated sets of service agreements or service agreements covering multiple customers can be treated as projects, as can distinct subsets of work within the scope of a single service agreement. For example, the development of a new version of a service system or the transition of a new service delivery capability into operational use can be treated as projects as well.

In the end, of course, organizations will use whatever terminology is comfortable, familiar, and useful to them, and the CMMI-SVC model does not require this to change. However, all CMMI models need a convenient way to refer consistently and clearly to the fundamental groupings of resources that organize work to achieve significant objectives. Given the Glossary definition and the preceding discussion, the term project is still adequate and effective for this purpose, although its meaning has had to grow in scope over time. This adaptation is not a surprise, because CMMI models themselves have grown in scope over time, and are likely to continue to do so in the future. CMMI-SVC users are strongly encouraged to consider how they too may adapt their way of thinking to reflect greater flexibility, and thereby gain the benefits of different ways of improving services.

 


