WO2001026013A1 - Method and estimator for providing service level management


Info

Publication number
WO2001026013A1
Authority
WO
WIPO (PCT)
Prior art keywords
service level
level management
task
designing
infrastructure
Prior art date
Application number
PCT/US2000/027803
Other languages
French (fr)
Inventor
Samir Anand
William C. Bond
Original Assignee
Accenture Llp
Priority date
Filing date
Publication date
Application filed by Accenture Llp filed Critical Accenture Llp
Priority to AU11936/01A priority Critical patent/AU1193601A/en
Publication of WO2001026013A1 publication Critical patent/WO2001026013A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/80Management or planning

Definitions

  • IT: Information Technology
  • There is a need, therefore, to construct a complete yet simple IT framework that would quickly convey the entire scope of IT capability in a functional composition.
  • The IT framework has to be a single framework for describing such IT management.
  • the IT framework should be a framework of all functions; a representation of a complete checklist of all relevant activities performed in an IT enterprise.
  • Framework should represent all functions operative in an IT enterprise.
  • Service Level Management is a key function of customer service management. Therefore, to meet this competition, there is a need for improved methods for providing a service level management function, and for an estimator for doing so.
  • one embodiment of the invention is a method for providing service level management that includes planning, designing, building, testing, and deploying a service level management function for an IT enterprise.
  • the planning step preferably includes developing a business performance model.
  • the designing step preferably includes designing business processes, skills, and user interaction, and may also include designing an organization infrastructure, designing a performance enhancement infrastructure, and designing technology infrastructure and operations architecture.
  • the building step preferably includes building the technology infrastructure, building the operations architecture, business policies, procedures, performance support, and developing learning products for the service level management.
  • the testing step preferably includes testing the technology infrastructure, and testing the operations architecture for service level management.
  • the deploying step includes deploying the technology infrastructure for the IT enterprise.
  • Another aspect of the present invention is a method for providing an estimate for building a service level management function in an information technology organization. This aspect of the present invention allows an IT consultant to give on-site estimates to a client within minutes. The estimator produces a detailed breakdown of the cost and time to complete a project by displaying the cost and time corresponding to each stage of a project along with each task.
  • Another aspect of the present invention is a computer system for allocating time and computing cost for building a service level management function in an information technology organization.
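The patent describes the estimator functionally rather than as code; purely as a rough illustration, the following Python sketch shows the kind of per-stage, per-task time and cost roll-up such an estimator might produce. The task names, hour figures, and hourly rate are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical sketch of a time/cost estimator: sums estimated hours per task,
# groups them by project stage, and applies an assumed hourly rate.
# Task names, hours, and the rate are illustrative only.
from collections import defaultdict

HOURLY_RATE = 150.0  # assumed blended consulting rate (illustrative)

# (stage, task, estimated_hours)
TASKS = [
    ("Capability Analysis", "Refine Business Performance Model", 80),
    ("Capability Release Design", "Design Business Processes, Skills & User Interaction", 120),
    ("Capability Release Design", "Select & Design Operations Architecture", 100),
    ("Build & Test", "Build & Test Operations Architecture", 200),
    ("Deployment", "Deploy Technology Infrastructure", 60),
]

def estimate(tasks, rate=HOURLY_RATE):
    """Return per-stage hours/cost breakdowns plus project totals."""
    stages = defaultdict(lambda: {"hours": 0.0, "cost": 0.0, "tasks": []})
    for stage, task, hours in tasks:
        stages[stage]["hours"] += hours
        stages[stage]["cost"] += hours * rate
        stages[stage]["tasks"].append((task, hours, hours * rate))
    total_hours = sum(s["hours"] for s in stages.values())
    return stages, total_hours, total_hours * rate

if __name__ == "__main__":
    stages, total_hours, total_cost = estimate(TASKS)
    for stage, data in stages.items():
        print(f"{stage}: {data['hours']:.0f} h, ${data['cost']:,.0f}")
        for task, hours, cost in data["tasks"]:
            print(f"  - {task}: {hours:.0f} h, ${cost:,.0f}")
    print(f"TOTAL: {total_hours:.0f} h, ${total_cost:,.0f}")
```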
  • Figure 1 shows a representation of the steps in a method for providing a service level management function according to the presently preferred embodiment of the invention.
  • Figure 2 shows a representation of the tasks for defining a business performance model for the method represented in Figure 1.
  • Figure 3 shows a representation of the tasks for designing business processes, skills, and user interaction for the method represented in Figure 1.
  • Figure 4 shows a representation of the tasks for designing technology infrastructure requirements for the method represented in Figure 1.
  • Figure 5 shows a representation of the tasks for designing an organization infrastructure for the method represented in Figure 1.
  • Figure 6 shows a representation of the tasks for designing a performance enhancement infrastructure for the method represented in Figure 1.
  • Figure 7 shows a representation of the tasks for designing operations architecture for the method represented in Figure 1.
  • Figure 8 shows a representation of the task for validating a technology infrastructure for the method represented in Figure 1.
  • Figure 9 shows a representation of the tasks for acquiring a technology infrastructure for the method represented in Figure 1.
  • Figure 10 shows a representation of the tasks for building and testing operations architecture for the method represented in Figure 1.
  • Figure 11 shows a representation of the tasks for developing business policies, procedures, and performance support architecture for the method represented in Figure 1.
  • Figure 12 shows a representation of the tasks for developing learning products for the method represented in Figure 1.
  • Figure 13 shows a representation of the tasks for testing a technology infrastructure product for the method represented in Figure 1.
  • Figure 14 shows a representation of the tasks for deploying a technology infrastructure for the method represented in Figure 1.
  • Figure 15 shows a flow chart for obtaining an estimate of cost and time allocation for a project.
  • Figures 16a, 16b, and 16c show one embodiment of an estimating worksheet for an OM service level management estimating guide.
  • an information technology (“IT”) enterprise may be considered to be a business organization, charitable organization, government organization, etc. that uses an information technology system with or to support its activities.
  • An IT organization is the group, associated systems and processes within the enterprise that are responsible for the management and delivery of information technology services to users in the enterprise.
  • multiple functions may be organized and categorized to provide comprehensive service to the user.
  • the various operations management functionalities within the IT framework include a customer service management function; a service integration function; a service delivery function; a capability development function; a change administration function; a strategy, architecture and planning function; a management and administration function; a human performance management function; and a governance and strategic relationships function.
  • Service Level Management plays an important role within the customer service management function.
  • the present invention includes a method for providing a service level management function and an estimator useful for determining the times and cost to provide such a function.
  • Customer service management manages on-going relationships between users, IT services, and the IT organization by putting the appropriate service-oriented frameworks, processes, and measures in place.
  • the goal of the customer service management function category is to assist the IT organization in providing quality IT service and support, while meeting and exceeding established levels of service.
  • Customer service management links with other function categories to provide input and aims to continuously improve current IT services and offerings.
  • customer service management links with the IT developers/architects/vendors to verify that service levels are not affected by new offerings.
  • IT organizations focus on identifying and meeting the needs of the users to provide better quality customer service.
  • Customer service management includes service management, demand management, and service control functions. Service Management:
  • Service management markets and manages IT offerings and capabilities to current and potential users of IT services. Explicitly, Service management demonstrates the value of IT investments to the executive management team, and serves as the source for gathering business needs and requirements. Service management includes service level management and customer management functions.
  • Service level management identifies and sets the expectations of users of IT services and becomes a measure for customer satisfaction. In an IT community, the vendors, outsourcers, etc. may also be made aware of their service level expectations through this function. Service level management often plays a key role in vendor management.
  • This framework uses (1) service level agreements (“SLAs”) to refer to service levels established and managed between IT and internal or external customers, also known as
  • OLA: Operational Level Agreements
  • This function ensures that before creating an SLA, the agreement with the users is in line with the overall business strategy and determines what services the users want. This includes identifying service needs and business unit service requirements.
  • the business liaison function plays a key role in this base practice.
  • This function ensures that IT can deliver services included in the SLAs.
  • the service organization first verifies the level of service which is provided, and compiles metrics for current service levels to act as baselines for service levels promised, which involves linking with the quality metrics function. This function is used to ensure that the service organization is capable of providing services that it promises.
  • SLA Costing Determination This function identifies chargeback, budget, or costing structure components, which are included in SLAs. This function also determines rewards for meeting and/or exceeding the agreed upon SLAs, and also determines how individual costs for services may be reported.
  • SLA Definition This function includes the creation of SLA document(s), including draft and final versions, reviewing SLA drafts with users and key stakeholders, and getting approval of the final version. This function also identifies key performance indicators (KPIs) to report upon in an SLA, including the details of how and how often measurements will be reported. The following four base practices measure, report, control, and review the service level.
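The disclosure does not specify a data format for SLA documents; purely as an illustrative sketch, the structure below captures the elements this function identifies (service line, KPIs, measurable targets, and reporting frequency). All names and values are assumed examples.

```python
# Hypothetical sketch of an SLA definition: each service line carries KPIs
# with measurable targets and a reporting frequency. All names and values
# are illustrative, not taken from the disclosure.
from dataclasses import dataclass, field
from typing import List

@dataclass
class KPI:
    name: str
    target: float          # measurable target for the indicator
    unit: str               # e.g. "%", "seconds"
    reporting_period: str   # how often the measurement is reported

@dataclass
class SLA:
    business_unit: str
    service_line: str
    kpis: List[KPI] = field(default_factory=list)

order_entry_sla = SLA(
    business_unit="Retail Operations",
    service_line="Order Entry Application",
    kpis=[
        KPI("Availability during business hours", 99.5, "%", "monthly"),
        KPI("Average transaction response time", 2.0, "seconds", "weekly"),
    ],
)
```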
  • KPI: key performance indicator
  • This practice includes collecting, sorting, organizing, and validating measurement data and/or KPIs. It also links with the quality metrics function.
  • This practice includes creating reports and distributing them on the agreed-upon time schedule. This practice obtains detailed service reports from appropriate service providers, and collates and/or summarizes report data as necessary. This practice includes developing service reports against service measures, highlighting service issues as necessary, saving statistics for historical purposes, and comparing service statistics against key performance indicators. 7. Service Level Control:
  • This practice ensures reports are delivered as scheduled, and determines if service levels are being met. This practice compares key performance indicators against agreed upon service expectations. If necessary, it may initiate a service level review session with a user group or service provider to resolve service issues.
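As an illustrative sketch of the comparison this practice describes (measured KPIs checked against agreed service expectations, with breaches flagged for a review session), assuming hypothetical KPI names and values:

```python
# Hypothetical service level control check: compares measured KPI values
# against agreed targets and flags breaches for a review session.
# KPI names, measurements, and targets are illustrative.

def check_service_levels(measurements, targets):
    """Return the KPIs whose measured value misses the agreed target.

    measurements: {kpi_name: measured_value}
    targets: {kpi_name: (target_value, "min" or "max")}
      "min" means the measurement must be at least the target (e.g. availability),
      "max" means it must not exceed the target (e.g. response time).
    """
    breaches = []
    for kpi, (target, direction) in targets.items():
        value = measurements.get(kpi)
        if value is None:
            breaches.append((kpi, "no data reported"))
        elif direction == "min" and value < target:
            breaches.append((kpi, f"{value} below target {target}"))
        elif direction == "max" and value > target:
            breaches.append((kpi, f"{value} above target {target}"))
    return breaches

breaches = check_service_levels(
    measurements={"availability_pct": 98.7, "avg_response_s": 1.6},
    targets={"availability_pct": (99.5, "min"), "avg_response_s": (2.0, "max")},
)
for kpi, reason in breaches:
    print(f"Initiate service level review: {kpi} ({reason})")
```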
  • This practice is performed at scheduled intervals. This practice includes reviewing service results with service providers and end-users on a regular basis. This practice initiates and/or identifies actions needed to resolve service issues, escalates service disputes as necessary, and monitors actions taken to correct SLA service issues.
  • This practice identifies changes to existing SLAs as necessary, receives requests from end-users and service providers to change the SLAs for business needs, and manages short-term deviation to SLAs due to business requirements.
  • This practice ensures that a specific SLA is retired and the accompanying processes for that SLA (i.e., service reports, scheduled review sessions, etc.) are ceased when services are no longer provided for whatever business reason.
  • This practice ensures that only active SLAs are in existence, which helps to raise the credibility of all SLAs.
  • This practice ensures that historical information, including statistics, reports, various versions of SLA, regarding ceased SLAs is appropriately recorded and stored for future reference.
  • the method for providing Operations Management (“OM”) service level management is directed to the tasks involved in building a particular OM function, or delivering a system for an OM function, which for this invention is the service level management function.
  • OMPC: Operations Management Planning Chart
  • This chart provides the business integration methodology for capability delivery, which includes tasks such as planning, analysis, design, build & test, and deployment.
  • Each OM function includes process, organization, and technology elements that are addressed throughout the following description of the corresponding OM function.
  • the method for providing service level management comprises four stages, as described below in connection with Figure 1.
  • the first stage, "capability analysis stage” 102 includes the step of Refining Business Performance Model 2110.
  • the second stage, “capability release design stage” 104, includes the steps of Designing Business Processes, Skills, & User Interaction 2410, Designing Organization Infrastructure 2710, Designing Performance Enhancement Infrastructure 2750, Analyzing Technology Infrastructure Requirements 3510, Selecting & Designing Operations Architecture 3550, and Validating Technology Infrastructure 3590.
  • the third stage, "capability release build and test stage" 106, includes the steps of Acquiring Technology Infrastructure 5510, Building & Testing Operations Architecture 5550, Preparing and Executing Technology Infrastructure Product Test 5590, Developing Policies, Procedures and Performance Support 6220, and Developing Learning Products 6260.
  • the fourth stage "deployment" 108 includes the step of Deploying Technology Infrastructure 7170. In the following description, the details of the tasks within each step are discussed.
  • Step 2110 Refining Business Performance Model
  • In step 2110, the business model requirements for service level management are defined, and the scope of the delivery and deployment effort is determined.
  • Figure 2 shows a representation of the tasks for carrying out these functions according to the presently preferred embodiment of the invention. These tasks include Confirming Business Architecture 2111, Analyzing Operating Constraints 2113, Analyzing Current Service Level Management Capability 2115, Identifying Service Level Management Best Practices 2117, Refining Service Level Management Requirements 2118, and Updating Business Performance Model 2119.
  • Task 2111 includes assessing the current business architecture, confirming the goals and objectives, and refining the components of the business architecture. Preferably, this task delivers the planning stage documentation, confirming or refining the overall service level management architecture, and ensuring management commitment to the project.
  • the amount of analysis performed in this task depends on the work previously performed in the planning phase of the project. Process, technology, organization, and performance issues are included in the analysis.
  • the overall requirements for service level management may be grouped into four main categories:
  • the service level agreement format which encompasses the measurements to be tracked, and the target performance for each of the measurements.
  • Service level agreements are stated in definable and measurable terms if they are to be managed on an on-going basis; vague, unquantified phrases must be replaced with measurable targets.
  • Task 2113 includes identifying the operating constraints and limitations, and assessing their potential impact on the operations environment.
  • the task includes assessing the organization's strategy and culture and its potential impact on the project, and assessing organization, technology, process, equipment, and facilities for the constraints.
  • the task also includes assessing the organization's ability to adapt to changes as part of the constraints analysis.
  • Task 2115 Analyzing Current Service Level Management Capability
  • the goal of task 2115 is to identify the current service level management elements, their operation, and their performance.
  • the task delivers a current service level management capability assessment.
  • the task is carried out by documenting current activities and procedures to establish a performance baseline, and assessing strengths and weaknesses of the service level management capability.
  • the method also includes considering the issues noted in Task 2111, when assessing current capabilities and identifying gaps.
  • Task 2117 Identifying Service Level Management Best Practices
  • Task 2117 includes defining the relevant best practices for service level management that meet the organization's requirements, identifying the service level management areas that could benefit from application of best practices, and identifying the optimum best practices to meet the environment and objectives.
  • Task 2118 Refining Service Level Management Requirements Task 2118 includes defining service level management requirements, and allocating the requirements across changes to human performance, business process, and technology. Preferably, the task also includes using the current assessment, constraint assessment, and best practice research to generate the requirements for a change. The task also includes identifying the changes needed to upgrade the service level management capability, or to create a new SL management capability, and allocating the changes according to organization and performance improvements, process improvements, and technology improvements.
  • Service level targets provide input to business capability design, including the application architecture (mechanisms for SL report data collection), technology infrastructure (processing power, server location), and organization (service desk locations, staffing).
  • Task 2119 Updating Business Performance Model Task 2119 includes defining the metrics and measures to describe the performance of the service level management process, tools and organization. Preferably, the task includes identifying the performance and operational objectives previously defined, identifying the approaches to be used, and monitoring the overall effectiveness of the service level management process.
  • the project plan is delivered 110, and then the designing step, i.e., the "capability release design stage" 104, is commenced.
  • the designing step includes steps 2410, 2710, 2750, 3510, 3550 and 3590.
  • Step 2410 Designing Business Processes, Skills & User Interaction:
  • In step 2410, the business processes, skills, and user interaction are designed.
  • the method includes designing the new service level management processes, and developing the framework and formats for the SLA's.
  • Figure 3 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Designing Workflow for Processes, Activities and Tasks 2411, Defining Physical Environment Interaction 2412, Identifying Skill Requirements 2413, Defining Application Interaction 2415, Identifying Performance Support Requirements 2416, Developing Capability Interaction Model 2417, and Verifying and Validating Business Processes, Skills and User Interaction 2419.
  • Task 2411 Designing Workflows for Processes, Activities and Tasks
  • Task 2411 includes creating the workflow diagrams and defining the workloads for service level reporting, controlling, and reviewing.
  • the task also includes identifying service lines to be included in SLA's.
  • the task includes developing workflow diagrams for all processes and activities, defining the relationships between core and supporting processes, activities, and tasks.
  • the task also includes defining the metrics associated with the processes and activities, and identifying the service lines to be monitored for each business unit within scope.
  • the key issues to be addressed for SL reporting are what data needs to be collected to report on service level performance, how it will be collected, and how it will be reported. Data may be collected from many sources, including:
  • Application programs - mechanisms may be built into applications to report on certain events.
  • Operations architecture components - performance and event/exception/problem data come from sources, such as service desk, print management, file transfer, backup/restore/archive, security management, asset management, and change control.
  • Hardware and systems software components - downtime, recovery, and some form of event response times are collected for various hardware and software components.
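To illustrate how data from the sources listed above might be collated for service level reporting, the following sketch uses placeholder collector functions; the source names and records are assumptions, not part of the disclosure.

```python
# Hypothetical collation of service level data from the kinds of sources
# listed above (application events, operations architecture components,
# hardware and systems software). Collectors and records are placeholders.

def collect_application_events():
    # In practice this might read an application's own event log.
    return [{"source": "order_entry_app", "metric": "transaction_time_s", "value": 1.8}]

def collect_operations_data():
    # e.g. service desk tickets, backup/restore results, change control records.
    return [{"source": "service_desk", "metric": "incidents_open", "value": 4}]

def collect_infrastructure_data():
    # e.g. downtime and recovery data from hardware and systems software.
    return [{"source": "db_server", "metric": "downtime_minutes", "value": 12}]

def collate_service_level_data():
    """Merge measurement records from all sources into one list for reporting."""
    records = []
    for collector in (collect_application_events,
                      collect_operations_data,
                      collect_infrastructure_data):
        records.extend(collector())
    return records

for record in collate_service_level_data():
    print(record)
```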
  • Task 2412 Defining Physical Environment Interaction Task 2412 includes identifying the implications of the service level management processes on the physical environment, including location, layout, and equipment requirements.
  • the task preferably includes identifying the Workflow/Physical environment interfaces, designing the facilities, layout, and equipment required for service level management, and identifying distributed service level management physical requirements.
  • the task also includes defining the layout and co-location implications of the service level management workflows and the physical environment.
  • Task 2413 includes identifying the skill and behavior requirements for performing service level management tasks.
  • the task includes identifying critical tasks from the workflow designing, defining the skills needed for the critical tasks, and identifying supporting skills and appropriate behavioral characteristics.
  • the overall SL coordination function may be a full time position.
  • the people most often involved in performing the key roles may be one or more management representatives from the organization responsible for managing the systems/service delivery (presumably the IT organization), and management representatives from the business units to whom the services are delivered. These may be the same groups involved in defining and maintaining the service level agreements.
  • Task 2415 Defining Application Interaction Task 2415 includes identifying the human-computer interactions necessary to fulfill key service level management activities. Preferably, the task includes using the workflows to identify the service level management activities supported by software elements, and defining the human-computer interactions needed to meet the requirements. Application interactions may occur in data collection during all phases of service level management process, i.e., when comparing actual service metrics against service levels.
  • Task 2416 Identifying Performance Support Requirements Task 2416 includes analyzing the service level management processes, and determining how to support human performance within these processes. Preferably, the task includes analyzing the critical performance factors for each SL management task, and selecting training and support aids to maximize workforce performance. Performance support is useful in the data-gathering effort if there are manual processes or application interactions involved. Job aids are also used if business unit users are responsible for triggering the SL reporting process.
  • Task 2417 Developing Capability Interaction Model Task 2417 identifies the relationships between the tasks in the workflow diagrams, the physical location, skills required, human-computer interactions and performance support needs.
  • the task also includes developing capability interaction models by identifying the interactions within each process for physical environment, skills, application and performance support, and unifying these models.
  • the task includes integrating workflows, the physical environment models, role and skill definitions, the application interactions, and support requirements to develop the capability interaction models. This model illustrates how the process is performed, what roles fulfill the activities involved, and how the roles are supported to maintain the service level management capability.
  • Task 2419 Verifying and Validating Business Processes, Skills and User Interaction
  • Task 2419 includes verifying and validating that the process designs and the capability interaction models meet the service level management requirements and are internally consistent.
  • the task also includes verifying that the capability model fulfills the original requirements.
  • the task may include the use of stakeholders and outside experts, as well as the design teams, to do the validation.
  • In step 3510, technology infrastructure requirements are analyzed.
  • the method of the present invention includes selecting and designing the technology infrastructure, and establishing preliminary plans for technology infrastructure product testing.
  • Figure 4 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • the tasks include Preparing Technology Infrastructure Performance Model 3511, Analyzing Technology Infrastructure Component Requirements, and Planning Technology Infrastructure Product Test 3517.
  • Task 3511 Preparing Technology Infrastructure Performance Model This task includes analyzing the functional, technical, and performance requirements for the service level management infrastructure. Preferably, the task includes identifying the key performance indicators for service level management, establishing baseline estimates, setting measurable targets for the performance indicators, and developing the functional model, and the performance model.
  • This task includes analyzing and documenting the requirements for service level management components, and defining their needs.
  • the task includes identifying the constraints imposed by the environment, refining functional, physical, and performance requirements developed in the models previously built, and assessing the interfaces to other OM components to avoid redundancy and ensure consistency/compatibility.
  • the technology requirements are determined from the business capability requirements, and more specifically the service level agreement scope, metrics, and measurement parameters.
  • This information identifies data collection and reporting requirements to support SL management. All components of the application and supporting infrastructure (e.g., network management software, database management software, etc.) are providers of the information. Depending on the requirements, other aspects of the operations architecture are also sources of data.
  • the requirements for historical SL reporting to identify trends in service levels over time may indicate a requirement for long-term storage of SL data in some type of database or repository.
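As a minimal sketch of such a repository, assuming a simple relational schema (the table layout and figures below are illustrative only):

```python
# Minimal sketch of a service level history store for trend reporting,
# using SQLite from the Python standard library. The schema and values
# are assumptions, not taken from the patent.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sl_history (
        period       TEXT,   -- e.g. '2000-09'
        service_line TEXT,
        kpi          TEXT,
        target       REAL,
        actual       REAL
    )
""")
conn.executemany(
    "INSERT INTO sl_history VALUES (?, ?, ?, ?, ?)",
    [("2000-07", "Order Entry", "availability_pct", 99.5, 99.1),
     ("2000-08", "Order Entry", "availability_pct", 99.5, 99.4),
     ("2000-09", "Order Entry", "availability_pct", 99.5, 99.7)],
)

# Long-term trend: average, minimum, and maximum actual performance
# per service line and KPI.
for row in conn.execute("""
        SELECT service_line, kpi, AVG(actual), MIN(actual), MAX(actual)
        FROM sl_history GROUP BY service_line, kpi"""):
    print(row)
```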
  • This task includes assessing the ability of the current service level management infrastructure to support the new component requirements, and identifying current standards for technology infrastructure.
  • the task includes documenting and analyzing the current service level management technology environment, and noting any standards/policies regarding service level management technology.
  • Task 3517 Planning Technology Infrastructure Product Test This task includes planning the product test for the service level management infrastructure. The results of this task provide the basis on which the product test is performed. Preferably, the task includes defining the test objectives, scope, environment, testing conditions and expected results, and developing the deployment plan. The product test is a test of the infrastructure. Therefore, the organization and process elements are not within the scope of the test. Step 2710 - Designing Organization Infrastructure:
  • the method of the present invention includes defining the structures for managing human performance, and defining what is expected of people who participate in the service level management function, the required competencies, and how performance is managed.
  • Figure 5 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • the tasks include Designing Roles, Jobs and Teams 2711, Designing Competency Model 2713, Designing Performance Management Infrastructure 2715, Determining Organization Infrastructure Mobilization Approach 2717, and Verifying and Validating Organization Infrastructure 2719.
  • Task 2711 Designing Roles, Jobs and Teams This task includes determining competencies and the roles required to operate the new capability, defining how roles are grouped to fit into teams and jobs, and designing the metrics for the roles. Preferably, the task includes confirming the service level management competency requirements, designing the roles, jobs and teams, determining the reporting relationships, and identifying the performance measurement factors.
  • the key issues include the scope of activities to be performed, geographic distribution of the function, the software tools which are involved in supporting the SL management process, and the complexity of the environment. These key issues also include allocating responsibilities for the SL definition, control, reporting, and review activities. This step preferably develops a responsibility matrix of activities, and assigns responsibility for all activities as well as overall responsibility for SL Management. Since there may in fact be several agreements (for example, one per business unit), several people may have overall responsibility for specific areas. The functions and responsibilities involved in SL management normally do not require full-time positions, meaning a new organizational infrastructure may not be required. Task 2713: Designing Competency Model
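A responsibility matrix of this kind can be as simple as a mapping from SL management activity to an accountable role; the sketch below is a hypothetical illustration with assumed role names, not the patent's own allocation.

```python
# Hypothetical responsibility matrix for the SL management activities named
# above (definition, control, reporting, review). Role names are illustrative.
responsibility_matrix = {
    "SLA definition": "IT service manager",
    "Service level control": "Operations team lead",
    "Service level reporting": "Service desk analyst (part-time)",
    "Service level review": "Business unit liaison",
}
overall_responsibility = "IT service manager"

def responsible_for(activity):
    """Look up who is responsible for an activity, defaulting to the overall owner."""
    return responsibility_matrix.get(activity, overall_responsibility)

print(responsible_for("Service level review"))
```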
  • This task includes defining the skills, knowledge, and behavior that people require to accomplish their roles in the service level management process.
  • the task includes determining the characteristics required of the individuals/teams, defining the individual capabilities necessary for success, organizing the capabilities along a proficiency scale, and relating them to the jobs and teams.
  • Task 2715 Designing Performance Management Infrastructure This task includes defining how individual performance is measured, developed, and rewarded, and determining a performance management approach and appraisal criteria.
  • the task includes developing standards for individuals and teams involved in the service level management process, and using the standards, identifying a system to monitor the individuals' and teams' abilities to perform up to the standards.
  • the task also includes ensuring the performance measures are based upon actions within the individual's control.
  • This task includes determining and mobilizing the resources required to staff the new service level management capability.
  • the task includes identifying profiles of the ideal candidates for each position, identifying the sourcing approaches and timing requirements, and determining the selection and recruiting approaches.
  • Task 2719 Verifying and Validating Organization Infrastructure
  • the method of the present invention includes verifying and validating that the service level management organization meets the needs of the service level management capability and is internally consistent.
  • the task includes determining the approach to be used and participants to be involved, and verifying that the organization structure satisfies service level management capability requirements.
  • In step 2750, a performance enhancement infrastructure is designed.
  • the step includes determining the training needed for new service level management functions, and determining the on-line help text, procedures, job aids, and other information to be used.
  • Figure 6 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Assessing Employee Competency and Performance 2751, Determining Performance Enhancement Needs 2753, Designing Performance Enhancement Product 2755, Defining Learning Test Approach 2757, and Verifying and Validating Performance Enhancement Infrastructure.
  • Task 2751 Assessing Employee Competency and Performance. This task includes refining the information about the current service level management staff's competency, proficiency, and performance levels in specific areas, and assessing the gaps in competencies and performance levels that drive the design of the performance enhancement infrastructure. Preferably, the task includes assessing the competency of the current service level management staff based on the competency model previously developed.
  • This task includes assessing the performance support and training requirements necessary to close the competency and performance gaps in the workforce.
  • the task includes using the employee assessment to determine the type of performance enhancement required to close the gaps and reach the desired competency levels.
  • Task 2755 Designing Performance Enhancement Products This task includes defining the number and structure of performance support and learning products. Preferably, the task includes determining the delivery approaches for training and performance support, designing the learning and performance support products, and defining the support systems for delivering training and performance support.
  • Data collection is a combination of automated and manual processes, which may require some knowledge of computer applications. Reporting may be via customized applications, or via spreadsheets in less complex environments. Due to part-time service level management roles, job aids or desktop reminder systems are frequently the preferred approaches for performance support in this type of situation. When these situations arise, the performance enhancement infrastructure focuses on training and support for the part-time function.
  • This task includes developing a comprehensive approach for testing the learning products with respect to achieving each product's learning objectives.
  • the task includes identifying which learning objectives to be tested, and identifying the data capture methods to be used to test those objectives.
  • the task of the present invention includes verifying the performance enhancement infrastructure and the learning test deliverables to determine how well they fit together to support the new service level management capability.
  • the task includes simulating the processes and activities performed by the members of the service level management team in order to identify performance enhancement weaknesses.
  • the task includes identifying the problems and repeating the appropriate tasks necessary to address the problems.
  • This step includes selecting and designing the components required to support a high-level service level management architecture, including reuse, package, and custom components.
  • Figure 7 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Identifying Operations Architecture Component Options 3551, Selecting Reuse Operations Architecture Components 3552, Selecting Packaged Operations Architecture Components 3553, Designing Custom Operations Architecture Components 3555, Designing and Validating Operations Architecture 3557, and Developing Operations Architecture Component and Assembly Test Approach and Plan 3559.
  • Task 3551 Identifying Operations Architecture Component Options This task includes identifying specific component options that are needed to support the production environment. Preferably, the task includes identifying all risks and gaps that exist in the current service level management environment, selecting components that should support the service level management architecture, considering current software resources, packaged software and custom software alternatives during the selection process, and if new packaged software is part of the solution, submitting RFPs to vendors for software products that meet basic requirements.
  • Data collection gaps may exist for several reasons.
  • the collection mechanisms are the software packages that collect performance and event data as a by-product of their primary function, such as service desk support, production scheduling, software distribution, or other OM functions.
  • determining response-time for a transaction may include collecting data from the network management system, the database management system, and the application itself.
  • some of the currently installed operations software may not be designed to collect the data required, or may be incompatible with other elements of the infrastructure. Also, some operation functions may be handled manually, requiring either manual data collection, or installation of a new piece of software that can generate the required data.
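To make the response-time example above concrete, the sketch below assembles an end-to-end figure from component timings reported by separate systems and flags a data collection gap when a source is missing; the figures and source names are illustrative assumptions.

```python
# Illustration of the response-time example above: the end-to-end time for a
# transaction is assembled from data reported by separate systems (network
# management, database management, and the application). Figures are
# illustrative placeholders.

component_timings_ms = {
    "network":     35.0,   # from the network management system
    "database":    120.0,  # from the database management system
    "application": 310.0,  # instrumented in the application itself
}

def end_to_end_response_ms(timings):
    """Sum the component timings; report a gap if any expected source is missing."""
    expected = {"network", "database", "application"}
    missing = expected - timings.keys()
    if missing:
        raise ValueError(f"data collection gap: no data from {sorted(missing)}")
    return sum(timings.values())

print(f"End-to-end response time: {end_to_end_response_ms(component_timings_ms):.0f} ms")
```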
  • Task 3552 Selecting Reuse Operations Architecture Components This task identifies whether there is any opportunity to reuse existing architecture components. Preferably, the task includes evaluating the reuse component options, determining possible gaps where the software does not satisfy requirements, and selecting the appropriate reuse components. Use of existing components to collect SL management data, and also to generate reports, is the primary alternative for satisfying technology requirements.
  • This task includes evaluating the packaged component options against the selection criteria in order to determine the best fit. Preferably, the task includes evaluating the packaged component options, determining gaps where the software does not satisfy requirements, and selecting the appropriate packaged components.
  • Packaged software is generally not the primary option for SL management. Reuse of existing components is far more desirable. If there are gaps in data collection that may be remedied by the use of an additional piece of operations management software, then a supplemental project to install that software could be recommended to the sponsoring organization.
  • Task 3555 Designing Custom Operations Architecture Components This task includes designing the custom components that are needed, and customizing a reuse or packaged component. Preferably, the task includes designing and validating the custom components, evaluating time, cost, and risk associated with custom development, and selecting the custom components.
  • Task 3557 Designing and Validating Operations Architecture This task includes developing a high-level design of the service level management architecture. Preferably, the task includes combining the reuse, package, and custom components into an integrated design, ensuring that the architecture meets the service level management requirements, and defining the standards and procedures for component build and test.
  • Task 3559 Developing Operations Architecture Component and Assembly Test Approach, and Plan This task includes defining the approach and test conditions for the service level management assembly, component, and component acceptance testing. Preferably, the task includes defining objectives, scope, metrics, regression test approaches, and risks associated with each test, defining component testing for custom and customized (reuse or package) components, and defining assembly testing for all components and all interfaces.
  • Service level management testing includes the steps in the various formal testing approaches (i.e., component test, assembly test, and component acceptance test), but may actually occur in one phase and take a rapid testing approach.
  • the test plans may indicate that data may be collected and stored not only from operations management software but also from application software and perhaps from manual entry.
  • the method of the present invention includes verifying that the service level management design is integrated, compatible, and consistent with the other components of the technology infrastructure design, and meets the business performance model and business capability requirements.
  • Figure 8 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Reviewing and Refining Technology Infrastructure Design 3591, Validating Technology Infrastructure Design 3595, Analyzing Impact and Revising Plans for Technology Infrastructure 3597, and establishing the technology infrastructure validation environment.
  • Task 3591 Reviewing and Refining Technology Infrastructure Design
  • This task includes ensuring that the service level management infrastructure design is compatible with other elements of the technology infrastructure.
  • the task includes testing that service level management is integrated and consistent with the other components of the technology infrastructure, and developing the issue list for design items that conflict with the infrastructure or items that do not meet performance goals or requirements. If any data must be collected and input manually, the method includes confirming that this can be accomplished accurately and on a timely basis. Based on the requirements for Ad Hoc reporting, this may be done on a daily or even more frequent basis.
  • This task includes designing, building, and implementing the validation environment for the technology infrastructure.
  • the task includes establishing the validation environment, selecting and training participants, and scheduling the validation.
  • the designers/architects of OM components that interface with service level management may be included in the validation.
  • Task 3595 Validating Technology Infrastructure Design This task includes identifying gaps between the service level management infrastructure design and the technology infrastructure requirements defined earlier. Preferably, the task includes validating the design, recording issues as they arise, identifying and resolving critical gaps, iterating through the validation until the critical issues have been resolved, and developing action plans for less critical issues. If service level management is being installed as part of a larger business capability, it is used as a checkpoint to verify that the most current requirements from the business capability release are being considered. Task 3597: Analyzing Impact and Revising Plans for Technology Infrastructure
  • This task includes updating the appropriate technology infrastructure delivery plans based on the outcome of the validation process.
  • the task includes analyzing the associated scope of work required for modifications and enhancements, analyzing the impact of validation outcomes on costs and benefits, and refining plans for deployment testing.
  • authorization for build and test 112 is sought from the client. After obtaining authorization, the method proceeds to the building and testing steps, or stage 106, of the project. As noted above, this stage includes steps 5510, 5550, 5590, 6220 and 6260.
  • Step 5510 - Acquiring Technology Infrastructure This step includes planning and executing the procurement of the software components that must be acquired, and if choices are available, deciding who supplies the components and services and how they are supplied.
  • This task package is required if new packaged software is to be procured and installed as part of the project.
  • Figure 9 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Initiating Acquisition of Technology Infrastructure Components 5511, Selecting and Appointing Vendors 5513, Evaluating Development Implications of Vendor Appointments 5515, and Preparing and Executing Acceptance Test of Technology Architecture Components 5517.
  • This task includes initiating the process for selecting and obtaining packaged software components.
  • the task includes defining vendor selection criteria, selecting potential vendors, preparing RFP/RFQ documents, and issuing request documents to selected vendors.
  • This task includes selecting the vendor(s) who provide the components and negotiating the terms of the procurement.
  • the task includes evaluating responses to RFP/RFQ documents, determining the selected component(s), and identifying the desired vendor(s).
  • the task also includes negotiating procurement terms, and managing the placement of contracts/orders through component delivery. Software training may be negotiated as part of the contractual agreement.
  • the task also includes ensuring that software is consistent with software standards, and reviewing the contract against procurement standards.
  • Task 5515 Evaluating Deployment Implications of Vendor Appointments This task includes determining the impact and deployment implications of the software and vendor selection on the project economics and the business case. Preferably, the task includes comparing procurement costs with project estimates, assessing impact on business case and business performance model, and making revisions and obtaining approvals as necessary. The task also includes ensuring that the economics of the transaction(s) are consistent with plans documented in the business case, or modifying the business case as appropriate to reflect changes.
  • Task 5517 Preparing and Executing Acceptance Test of Technology Architecture Components This task includes ensuring that the packaged components meet the technology infrastructure requirements. Preferably, the task includes building the test scripts, the test drivers, the input data, and the output data to complete the technology architecture component acceptance test model, executing the test, and documenting any fixes/changes required of the component vendor(s). Software component training may be scheduled and conducted as soon as the new components are installed.
  • Step 5550 Building and Testing Operations Architecture:
  • This step includes designing and programming the service level management components, including extensions to reused and packaged items, and performing component and assembly testing.
  • Figure 10 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • the tasks include Performing Operations Architecture Detailed Design 5551, Revising Operations Architecture Component and Assembly Test Approach and Plan, Building Operations Architecture Components 5553, Preparing and Executing Component Test of Custom Operations Components 5555, and Preparing and Executing Operations Assembly Test 5557.
  • Task 5551 Performing Operations Architecture Detailed Design
  • This task includes defining the requirements and program specifications related to each service level management component.
  • the task includes preparing program specifications for custom and customized components, designing the packaged software configuration, and conducting detailed design reviews.
  • Specifications for custom components are required for the design of the database for collection of historical service level performance data, the design of modules to collect and store SL data in the history database, and the design of the performance reports, usually including current period analysis of planned versus actual performance and long-term trend analysis.
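As an illustrative sketch of the planned-versus-actual reporting such custom components might produce (service lines, targets, and actuals below are placeholders):

```python
# Hypothetical planned-versus-actual service level report for the current
# period, of the kind the custom reporting components might produce.
# Service lines, KPIs, targets, and actuals are illustrative.

current_period = [
    # (service line, kpi, planned, actual)
    ("Order Entry", "availability_pct", 99.5, 99.2),
    ("Order Entry", "avg_response_s",   2.0,  1.7),
    ("Warehouse",   "availability_pct", 99.0, 99.4),
]

print(f"{'Service line':<12} {'KPI':<18} {'Planned':>8} {'Actual':>8} {'Variance':>9}")
for service_line, kpi, planned, actual in current_period:
    variance = actual - planned
    print(f"{service_line:<12} {kpi:<18} {planned:>8.1f} {actual:>8.1f} {variance:>+9.1f}")
```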
  • This task includes updating the service level management test plans to reflect the components' detailed design, and defining revised considerations or changes to the requirements.
  • the task includes reviewing the test approaches and plans, and revising as needed for new or updated requirements.
  • Task 5553 Building Operations Architecture Components This task delivers all custom service level management components and extensions to packaged or reuse components. Preferably, the task includes building the custom components, building the customized extensions to package or reuse components, and configuring the packaged components.
  • Task 5555 Preparing and Executing Component Test of Custom Operations Components This task includes ensuring that each custom service level management component and each customized component meets its requirements. Preferably, the task includes verifying the component test model, setting up the test environment, executing the test, making component fixes and retests as required, and updating service level management detailed design document with changes. The task also includes confirming component performance and functionality. System performance should not be compromised by the amount of customization. This may be tested here or in subsequent testing tasks.
  • Task 5557 Preparing and Executing Operations Assembly Test This task includes performing a full test of all interactions between service level management components. Preferably, the task includes verifying the assembly test model, setting up the test environment, executing the test, and making fixes and retests as required.
  • Step 6220 Developing Policies, Procedures and Performance Support:
  • This step includes producing a finalized, detailed set of new service level management policies, procedures, and reference materials; creating the new service level agreements for the business unit(s); and conducting a usability test and review to verify ease of use.
  • Figure 11 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Performing Policies, Procedures and Performance Support Detailed Design 6221, Developing Business Policies and Procedures 6223, Developing User Procedures 6225, Developing Reference Materials and Job Aids 6227, and Validating and Testing Policies, Procedures and Performance Support 6229.
  • Task 6221 Performing Policies, Procedures and Performance Support Detailed Design
  • This task includes providing the standard structure for all the new service level management policies, procedures, reference materials, and job aids, and providing prototype templates for each product.
  • the task includes designing the structure of the new policies, procedures, and support materials, defining standards for policy and performance support development, designing templates for product development, and creating prototype products.
  • the structure for developing the SLA's includes: business unit, service lines within the business unit, and service items within each service line.
  • Task 6223 Developing Business Policies and Procedures This task includes developing a complete set of business policies and procedures for service level management.
  • Business policies describe the business rules governing workflows.
  • Business procedures describe the sequential sets of tasks to follow based on the policies.
  • the task also includes developing the service level agreements.
  • the task includes collecting and reviewing content information, drafting policies and procedures, drafting service level agreements, and planning for the production of the materials.
  • Service level policies and procedures cover all aspects of the control and reporting process including responsibilities for process steps and identification of communications channels.
  • Each SLA may require its own procedures, but the sponsoring organization may wish to establish a single standardized policy statement regarding SLA's and SL management.
  • Task 6225 Developing User Procedures
  • This task includes drafting a detailed set of service level management user procedures.
  • User procedures provide the details necessary to enable smooth execution of new tasks within a given business procedure.
  • the task includes collecting and reviewing content information, drafting the procedures, verifying consistency with business policies and procedures, and planning for the production of the materials.
  • Each business unit, and each SLA, may require its own unique user procedures. Since the SL management function is likely to be distributed widely across an organization, with a number of clerical and management personnel performing part-time tasks to achieve the objectives, job aids and quick references are desirable performance support tools to supplement the user procedures.
  • Task 6227 Developing Reference Materials and Job Aids This task includes drafting the reference materials and job aids that make a task easier or more efficient. The information provided in the reference materials and job aids is used on the job. Preferably, the task includes collecting and reviewing content information, drafting the performance support products, verifying consistency with policies and procedures, and planning for the production of the materials. Since the SL management function is distributed widely across an organization, with a number of clerical and management personnel performing part-time tasks to achieve the objectives, job aids and quick references are desirable performance support tools to supplement the user procedures.
  • Task 6229 Validating and Testing Policies, Procedures and Performance Support This task includes confirming that the products meet the requirements of the service level management capability and the needs of the personnel who will use them. Preferably, the task includes preparing validation scenarios, validating content and ease of use of materials, testing on-line support products, and resolving open issues. Once service level targets are documented, they are validated by both end users and service delivery providers for reasonability. There is no sense in creating agreements that are unacceptable or unrealistic for one party or the other. In some cases, a prototype of a process may be required, such as to illustrate to an end user exactly what a two-minute transaction turnaround really means in an actual work setting.
  • Step 6260 Developing Learning Products:
  • This step includes creating a complete, finalized set of learning products.
  • Figure 12 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Developing Learning Product Standards and Development Environment 6261, Performing Learning Program Detailed Design 6263, Creating Learning Products 6267, and Testing Learning Products 6269.
  • Task 6261 Developing Learning Product Standards and Development Environment
  • This task includes creating the environment for developing the service level management learning products.
  • the task includes selecting authoring and development tools, defining standards, and designing templates and procedures for product development.
  • Technical training in service level management software components may come from the package vendor or a third party training organization. Procedural training may be custom built.
  • Task 6263 Performing Learning Program Detailed Design This task includes specifying how each learning product identified in the learning product design is developed. Preferably, the task includes defining learning objectives and context, designing the learning activities, and preparing the test plan.
  • If SL data collection and reporting are supported by other components of the operations architecture or the application itself, training in those components may be necessary.
  • the available learning products for those components are used when possible to cover the SL interfaces, since custom training for what are often part-time responsibilities may not be cost effective. Because of the part-time aspect of the work, job aids and other performance support means are used in lieu of formal training. If SLA's are defined for individual business units, as is normally recommended, the training may be unique for each SLA because different data collection and reporting components may be involved.
  • This task includes completing prototypes and conducts ease-of-use sessions on classroom-based learning components (i.e., activities, support system, instructor guide).
  • the task includes creating the prototype components, and conducting and evaluating the prototype.
  • Task 6267 Creating Learning Products
  • This task includes developing the learning materials proposed and prototyped during the design activities.
  • Preferably, the task includes developing the activities, content, and evaluation and support materials required, developing a maintenance plan, training instructors/facilitators, and arranging for production.
  • Task 6269 Testing Learning Products
  • This task includes testing each product with the intended audience to ensure that the product meets the stated learning objectives, that the instructors are effective, and that the learning product meets the overall learning objectives for service level management.
  • Preferably, the task includes confirming the test plan, executing the learning test, and reviewing and making required modifications.
  • Preferably, this test serves as the formal training session for the group. Multiple sessions may be appropriate if responsibilities are split and all personnel are not responsible for knowing all activities.
  • Step 5590 Preparing and Executing Technology Infrastructure Product Test:
  • This step includes ensuring that the technology infrastructure design, including service level management, has been properly implemented, and that the infrastructure can support the development, execution, and operations architectures.
  • The method also includes testing the deployment of the new technology infrastructure and its integration with the current technology infrastructure.
  • Figure 13 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • These tasks include Preparing Technology Infrastructure Test Model 5591 and executing the production, deployment, and configuration tests.
  • Task 5591 Preparing Technology Infrastructure Test Model
  • This task includes creating the service level management infrastructure test model.
  • Preferably, the task includes creating the test data and expected results, and creating the testing scripts for production, deployment, and configuration tests.
  • The task also includes conducting the service level management training not yet completed, and reviewing and approving the test model. If a complete business capability is being deployed, this is a comprehensive test, with service level management being one piece. In general, however, service level management is frequently implemented as an independent capability designed to monitor other applications.
  • The product test should occur in a production-ready environment and should include the hardware and software to be used in production.
  • This task includes verifying that the technology infrastructure successfully supports the requirements outlined in the business capability design stage.
  • Preferably, the task includes executing the test scripts, verifying the results, and making changes as required.
  • This task includes ensuring that the new service level management infrastructure is correctly deployed within the organization.
  • Preferably, the task includes executing the test scripts, verifying the results, and making changes as required.
  • This task includes ensuring that the performance of the technology infrastructure, including service level management, is consistent with the technology infrastructure performance model after the infrastructure has been deployed.
  • Preferably, the task includes executing the test scripts, verifying the results, making changes as required, and updating the risk assessment.
  • The deployment stage 108 includes the step of Deploying Technology Infrastructure 7170.
  • This step includes bringing the deployment unit up to the technology infrastructure baseline required, including service level management.
  • Figure 14 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Configuring Technology Infrastructure 7171, Installing Technology Infrastructure 7173, and Verifying Technology Infrastructure 7179.
  • Task 7171 Configuring Technology Infrastructure This task includes customizing the deployment unit's technology infrastructure to prepare for the new business capability components. Preferably, the task includes reviewing the customization requirements, performing the customization, and verifying the infrastructure configuration. Customizing the infrastructure is normally completed in task package 5550, building and testing operations architecture.
  • Task 7173 Installing Technology Infrastructure This task includes installing the technology infrastructure for service level management.
  • Preferably, the task includes preparing the installation environment, installing the service level management infrastructure, and verifying the installation.
  • Preferably, the documentation, performance support, and training tools are completed and put in place prior to the deployment.
  • Task 7179 Verifying Technology Infrastructure This task includes verifying the new technology infrastructure environment and addressing the issues raised as a result of the testing.
  • Preferably, the task includes performing the infrastructure verification, making changes as required, and notifying stakeholders.
  • A follow-up audit is recommended after some period of production operations to confirm the validity and accuracy of service level reports, and the adequacy of the actual provisions of the SLAs in support of the business capability releases covered.
  • The present invention also includes a method and apparatus for providing an estimate for building a service level management function in an information technology organization.
  • The method and apparatus generate a preliminary work estimate (time by task) and a financial estimate (dollars by classification) based on input of a set of estimating factors that identify the scope and difficulty of key aspects of the function.
  • Fig. 15 is a flow chart of one embodiment of a method for providing an estimate of the time and cost to build a service level management function in an information technology organization.
  • A provider of a service level management function, such as an IT consultant (for example, Andersen Consulting), obtains estimating factors from the client 202. This is a combined effort, with the provider adding expertise and knowledge to help in determining the quantity and difficulty of each factor.
  • Estimating factors represent key business drivers for a given operations management ("OM") function. Table 1 lists and defines the factors to be considered, along with examples of a quantity and difficulty rating for each factor.
  • The provider, with the help of the client, will determine an estimating factor for the number of service level agreements ("SLAs") 202, and will then determine the difficulty rating 204.
  • The number and difficulty rating are input into a computer program.
  • Preferably, the computer program is a spreadsheet, such as EXCEL by Microsoft Corp. of Redmond, Washington, USA.
  • The consultant and the client will continue determining the number and difficulty rating for each of the remaining estimating factors 206.
  • This information is transferred to an assumption sheet 208, and the assumptions for each factor are defined.
  • The assumption sheet 208 allows the consultant to enter comments relating to each estimating factor, and to document the underlying reasoning for a specific estimating factor.
  • An estimating worksheet is then generated and reviewed 210 by the consultant, the client, or both.
  • An example of a worksheet is shown in Figs. 16a, b, and c.
  • The default estimates of the time required for each task will populate the worksheet, with time estimates based on the number and difficulty rating previously assigned to the estimating factors that correspond to each task.
  • The amount of time per task is based on a predetermined time per unit required for the estimating factor, multiplied by a factor corresponding to the level of difficulty (an illustrative calculation of this kind is sketched at the end of this section).
  • Each task listed on the worksheet is described above in connection with details of the method for providing the service level management function.
  • The same numbers in the description of the method above correspond to the same steps, tasks, and task packages of activities shown on the worksheet of Figs. 16a, 16b, and 16c.
  • The worksheet is reviewed 210 by the provider and the client for accuracy. Adjustments can be made to task-level estimates either by returning to the factors sheet and adjusting the units 212, or by entering an override estimate in the 'Used' column 214 on the worksheet. This override may be used when the estimating factor produces a task estimate that is not appropriate for the task, for example, when a task is not required on a particular project.
  • The work plan contains the total time required, in days, per stage and per task to complete the project. Tasks may be aggregated into a "task package" of subtasks or activities for convenience.
  • A worksheet, as shown in Figs. 16a, 16b, and 16c, may also be used for convenience. This worksheet may be used to adjust tasks or times as desired, based on the experience of the provider, the customer, or both.
  • The total estimated payroll cost for the project will then be computed and displayed, generating the final estimates.
  • A determination of out-of-pocket expenses 222 may be applied to the final estimates to determine a final project cost 224.
  • The provider will then review the final estimates with an internal functional expert 226.
  • Project management costs for managing the provider's work are included in the estimator. These are task dependent and usually run between 10 and 15% of the tasks being managed, depending on the level of difficulty. These management allocations may appear on the worksheet and work plan.
  • The time allocations for planning and managing a project are typically broken down for each of a plurality of task packages, where the task packages are planning project execution 920, organizing project resources 940, controlling project work 960, and completing project 990, as shown in FIG. 16a.
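The following is a minimal, illustrative Python sketch of the estimating calculation described above: units of an estimating factor, a predetermined time per unit, and a difficulty multiplier yield a task estimate in days; an optional 'Used' override replaces the computed value; payroll cost is derived from rates by classification; and a project management allocation of roughly 10 to 15% is added. The factor names, multipliers, rates, and the 12% management fraction are hypothetical assumptions for illustration only, not values prescribed by the estimator.

    # Illustrative sketch only; all names, multipliers, rates, and percentages are assumed.
    DIFFICULTY_MULTIPLIER = {"low": 0.8, "medium": 1.0, "high": 1.3}

    def task_estimate(units, time_per_unit_days, difficulty, override_days=None):
        """Days for one task: units x predetermined time per unit x difficulty factor.
        An explicit override (the 'Used' column) replaces the computed value."""
        if override_days is not None:
            return override_days
        return units * time_per_unit_days * DIFFICULTY_MULTIPLIER[difficulty]

    def project_estimate(tasks, rates_per_day, pm_fraction=0.12, out_of_pocket=0.0):
        """Roll task estimates up into a work plan (days) and a total cost (dollars)."""
        days_by_task = {}
        payroll = 0.0
        for name, t in tasks.items():
            days = task_estimate(t["units"], t["time_per_unit"], t["difficulty"], t.get("override"))
            days_by_task[name] = days
            payroll += days * rates_per_day[t["classification"]]
        pm_days = pm_fraction * sum(days_by_task.values())   # 10-15% of the tasks being managed
        days_by_task["project management"] = pm_days
        payroll += pm_days * rates_per_day["manager"]
        return days_by_task, payroll + out_of_pocket

    # Hypothetical usage: two tasks driven by a "number of SLAs" estimating factor of 4.
    tasks = {
        "2410 Designing processes":   {"units": 4, "time_per_unit": 1.5, "difficulty": "medium",
                                       "classification": "consultant"},
        "6220 Developing procedures": {"units": 4, "time_per_unit": 1.0, "difficulty": "high",
                                       "classification": "analyst", "override": 3.0},
    }
    rates = {"consultant": 1200.0, "analyst": 900.0, "manager": 1500.0}
    work_plan, total_cost = project_estimate(tasks, rates, out_of_pocket=5000.0)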

Abstract

A method for providing a service level management function in an information technology enterprise includes conducting the tasks involved in building the service level management function (100). The tasks include the planning (110), analyzing (3510), designing (2110, 2410, 2710, 2750, 3550), building (112), testing (5550), and deploying (7170) the service level management function. Each task includes process, organization, and technology infrastructure elements.

Description

METHOD AND ESTIMATOR FOR PROVIDING SERVICE LEVEL
MANAGEMENT
RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application 60/158,259, filed October 6, 1999. This application is related to Application Serial No. _______, entitled "Organization of Information Technology Functions," by Dove et al. (Atty. docket No. 10022/45), filed herewith. The above applications are incorporated herein by reference in their entirety.
BACKGROUND OF THE INVENTION
The biggest challenges in Information Technology ("IT") development today are actually not in the technologies, but in the management of those technologies in a complex business environment. From idea conception to capability delivery, and to operation, all IT activities, including strategy development, planning, administration, coordination of project requests, change administration, and managing demand for discretionary and non- discretionary activities and operations, must be collectively managed. A shared understanding and representation of IT management is needed because today's technological and business environment demands it. The new technological management orientation should include ways for planning, assessing, and deploying technology within and across enterprises. Business need to balance technological capability with enterprise capability in order to become, or stay, a modern organization that has a chance of survival. Within that IT framework, there is a need, therefore, to construct a complete yet simple IT framework that would quickly convey the entire scope of IT capability in a functional composition. Such IT framework has to be a single framework for describing such IT management. The IT framework should be a framework of all functions; a representation of a complete checklist of all relevant activities performed in an IT enterprise. A single IT
Framework should represent all functions operative in an IT enterprise.
There is also a need for customer service management that manages the ongoing relationships between users, IT services, and the IT enterprise by putting the appropriate service-oriented frameworks, processes, and measures in place. By marketing current IT service offerings, increasing customer satisfaction, and building stronger customer relationships, the IT enterprise can better service their business customer. A customer service management becomes critical to the IT organization as competition to provide
IT services is beginning to increase from outsourcers. Service Level Management is a key function of customer service management. Therefore, to meet this competition there are needs for improved methods for providing a service level management function, and an estimator for doing so.
BRIEF SUMMARY OF THE INVENTION
The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. By way of introduction, one embodiment of the invention is a method for providing service level management that includes planning, designing, building, testing, and deploying a service level management function for an IT enterprise.
The planning step preferably includes developing a business performance model. The designing step preferably includes designing business processes, skills, and user interaction, and may also include designing an organization infrastructure, designing a performance enhancement infrastructure, and designing technology infrastructure and operations architecture.
In yet another aspect of the preferred embodiment, the building step preferably includes building the technology infrastructure, building the operations architecture, business policies, procedures, performance support, and developing learning products for the service level management.
In another aspect of the preferred embodiment, the testing step preferably includes testing the technology infrastructure, and testing the operations architecture for service level management.
In still another aspect of the preferred embodiment, the deploying step includes deploying the technology infrastructure for the IT enterprise. Another aspect of the present invention is a method for providing an estimate for building a service level management function in an information technology organization. This aspect of the present invention allows an IT consultant to give on-site estimates to a client within minutes. The estimator produces a detailed breakdown of the cost and time to complete a project by displaying the costs and time corresponding to each stage of a project, along with each task. Another aspect of the present invention is a computer system for allocating time and computing cost for building a service level management function in an information technology organization. These and other features and advantages of the invention will become apparent upon review of the following detailed description of the presently preferred embodiments of the invention, taken in conjunction with the appended drawings.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS The present invention is illustrated by way of example and not limitation in the accompanying figures. In the figures, like reference numbers indicate identical or functionally similar elements.
Figure 1 shows a representation of the steps in a method for providing a service level management function according to the presently preferred embodiment of the invention.
Figure 2 shows a representation of the tasks for defining a business performance model for the method represented in Figure 1.
Figure 3 shows a representation of the tasks for designing business processes, skills, and user interaction for the method represented in Figure 1.
Figure 4 shows a representation of the tasks for designing technology infrastructure requirements for the method represented in Figure 1.
Figure 5 shows a representation of the tasks for designing an organization infrastructure for the method represented in Figure 1.
Figure 6 shows a representation of the tasks for designing a performance enhancement infrastructure for the method represented in Figure 1.
Figure 7 shows a representation of the tasks for designing operations architecture for the method represented in Figure 1.
Figure 8 shows a representation of the task for validating a technology infrastructure for the method represented in Figure 1.
Figure 9 shows a representation of the tasks for acquiring a technology infrastructure for the method represented in Figure 1.
Figure 10 shows a representation of the tasks for building and testing operations architecture for the method represented in Figure 1.
Figure 11 shows a representation of the tasks for developing business policies, procedures, and performance support architecture for the method represented in Figure 1.
Figure 12 shows a representation of the tasks for developing learning products for the method represented in Figure 1.
Figure 13 shows a representation of the tasks for testing a technology infrastructure product for the method represented in Figure 1.
Figure 14 shows a representation of the tasks for deploying a technology infrastructure for the method represented in Figure 1.
Figure 15 shows a flow chart for obtaining an estimate of cost and time allocation for a project. Figures 16a, 16b, and 16c show one embodiment of an estimating worksheet for an OM service level management estimating guide.
DETAILED DESCRIPTION OF THE INVENTION
For the purposes of this invention, an information technology ("IT") enterprise may be considered to be a business organization, charitable organization, government organization, etc. that uses an information technology system with or to support its activities. An IT organization is the group, associated systems and processes within the enterprise that are responsible for the management and delivery of information technology services to users in the enterprise. In a modern IT enterprise, multiple functions may be organized and categorized to provide comprehensive service to the user. Thereby, an information technology framework for understanding the interrelationships of the various functionalities, and for managing the complex IT organization is provided.
The various operations management functionalities within the IT framework include a customer service management function; a service integration function; a service delivery function; a capability development function; a change administration function; a strategy, architecture and planning function; a management and administration function; a human performance management function; and a governance and strategic relationships function. Service Level Management plays an important role within the customer service management function. The present invention includes a method for providing a service level management function and an estimator useful for determining the times and cost to provide such a function.
Before describing the method for providing service level management, some related terms are first described as follows:
Customer Service Management:
Customer service management manages on-going relationships between users, IT services, and the IT organization by putting the appropriate service-oriented frameworks, processes, and measures in place. The goal of the customer service management function category is to assist the IT organization in providing quality IT service and support, while meeting and exceeding established levels of service. Customer service management links with other function categories to provide input and aims to continuously improve current IT services and offerings. To continue delivering quality IT service, customer service management links with the IT developers/architects/vendors to verify that service levels are not affected by new offerings. Following the industry's best practices, IT organizations focus on identifying and meeting the needs of the users to provide better quality customer service. Customer service management includes service management, demand management, and service control functions.
Service Management:
Service management markets and manages IT offerings and capabilities to current and potential users of IT services. Explicitly, Service management demonstrates the value of IT investments to the executive management team, and serves as the source for gathering business needs and requirements. Service management includes service level management and customer management functions.
Service Level Management:
Service level management identifies and sets the expectations of users of IT services and becomes a measure for customer satisfaction. In an IT community, the vendors, outsourcers, etc. may also be made aware of their service level expectations through this function. Service level management often plays a key role in vendor management. This framework uses (1) service level agreements ("SLAs") to refer to service levels established and managed between IT and internal or external customers, also known as "end users", and (2) operational level agreements ("OLAs") to refer to operational levels established and managed between IT and internal or external service providers. Service level management includes 10 functions, as described below:
1. SLA Requirements Identification:
This function ensures that before creating an SLA, the agreement with the users is in line with the overall business strategy and determines what services the users want. This includes identifying service needs and business unit service requirements. The business liaison function plays a key role in this base practice.
2. SLA Support Verification:
This function ensures that IT can deliver services included in the SLAs. In order to ensure that the service expectations of the user group can be met, the service organization first verifies the level of service which is provided, and compiles metrics for current service levels to act as baselines for service levels promised, which involves linking with the quality metrics function. This function is used to ensure that the service organization is capable of providing services that it promises.
3. SLA Costing Determination:
This function identifies chargeback, budget, or costing structure components, which are included in SLAs. This function also determines rewards for meeting and/or exceeding the agreed upon SLAs, and also determines how individual costs for services may be reported.
4. SLA Definition:
This function includes the creation of SLA document(s), including draft and final versions, reviewing SLA drafts with users and key stakeholders, and getting approval of the final version. This function also identifies key performance indicators (KPIs) to report upon in an SLA, including the details of how and how often measurements will be reported. The following four base practices measure, report, control, and review the service level.
5. Service Level Measurement:
This practice includes collecting, sorting, organizing, and validating measurement data and/or KPIs. It also links with the quality metrics function.
6. Service Level Reporting:
This practice includes creating reports and distributing them on the agreed-upon time schedule. This practice obtains detailed service reports from appropriate service providers, and collates and/or summarizes report data as necessary. This practice includes developing service reports against service measures, highlighting service issues as necessary, saving statistics for historical purposes, and comparing service statistics against key performance indicators.
7. Service Level Control:
This practice ensures reports are delivered as scheduled, and determines if service levels are being met. This practice compares key performance indicators against agreed upon service expectations (an illustrative sketch of such a comparison appears following this list of functions). If necessary, it may initiate a service level review session with a user group or service provider to resolve service issues.
8. Service Level Review:
This practice is performed at scheduled intervals. This practice includes reviewing service results with service providers and end-users on a regular basis. This practice initiates and/or identifies actions needed to resolve service issues, escalates service disputes as necessary, and monitors actions taken to correct SLA service issues.
9. SLA Maintenance:
This practice identifies changes to existing SLAs as necessary, receives requests from end-users and service providers to change the SLAs for business needs, and manages short-term deviation to SLAs due to business requirements.
10. SLA Retirement:
This practice ensures that a specific SLA is retired and the accompanying processes for that SLA (i.e., service reports, scheduled review sessions, etc.) are ceased when services are no longer provided for whatever business reason. This practice ensures that only active SLAs are in existence, which helps to raise the credibility of all SLAs. This practice ensures that historical information, including statistics, reports, various versions of SLA, regarding ceased SLAs is appropriately recorded and stored for future reference.
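Purely as an illustration of the comparison performed by the service level measurement, control, and review practices above, the short Python sketch below flags measured KPIs that miss their agreed targets. The metric names, target values, and measured values are hypothetical assumptions, not part of any particular SLA.

    # Hypothetical sketch: compare measured KPIs against agreed SLA targets
    # and flag service issues for review or escalation. Names and numbers are illustrative.
    sla_targets = {
        "availability_pct":      {"target": 99.5, "higher_is_better": True},
        "avg_response_time_sec": {"target": 2.0,  "higher_is_better": False},
    }

    def control_check(measurements):
        """Return the KPIs that do not meet their agreed service levels."""
        issues = []
        for kpi, measured in measurements.items():
            t = sla_targets[kpi]
            met = measured >= t["target"] if t["higher_is_better"] else measured <= t["target"]
            if not met:
                issues.append((kpi, measured, t["target"]))
        return issues

    # e.g. control_check({"availability_pct": 99.1, "avg_response_time_sec": 1.7})
    # returns [("availability_pct", 99.1, 99.5)], indicating a service issue to review.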
A METHOD FOR PROVIDING SERVICE LEVEL MANAGEMENT
According to a preferred embodiment of the present invention, the method for providing Operations Management ("OM") service level management is directed to the tasks involved in building a particular OM function, or delivering a system for an OM function, which for this invention is the service level management function. These specific tasks are described in reference to the Operations Management Planning Chart ("OMPC") that is shown in Figure 1. This chart provides the business integration methodology for capability delivery, which includes tasks such as planning, analysis, design, build and test, and deployment. Each OM function includes process, organization, and technology elements that are addressed throughout the following description of the corresponding OM function. The method for providing service level management comprises four stages, as described below in connection with Figure 1. The first stage, the "capability analysis stage" 102, includes the step of Refining Business Performance Model 2110. The second stage, the "capability release design stage" 104, includes the steps of Designing Business Processes, Skills, & User Interaction 2410, Designing Organization Infrastructure 2710, Designing Performance Enhancement Infrastructure 2750, Analyzing Technology Infrastructure Requirements 3510, Selecting & Designing Operations Architecture 3550, and Validating Technology Infrastructure 3590. The third stage, the "capability release build and test stage" 106, includes the steps of Acquiring Technology Infrastructure 5510, Building & Testing Operations Architecture 5550, Developing Policies, Procedures, & Performance Support 6220, Developing Learning Products 6260, and Preparing & Executing Tech Infrastructure Product Test 5590. The fourth stage, "deployment" 108, includes the step of Deploying Technology Infrastructure 7170. In the following description, the details of the tasks within each step are discussed.
Step 2110 - Refining Business Performance Model
In step 2110, the business model requirements for service level management are defined, and the scope of the delivery and deployment effort is determined. Figure 2 shows a representation of the tasks for carrying out these functions according to the presently preferred embodiment of the invention. These tasks include Confirming Business Architecture 2111, Analyzing Operating Constraints 2113, Analyzing Current Service Level Management Capability 2115, Identifying Service Level Management Best Practices 2117, Refining Service Level Management Requirements 2118, and Updating Business Performance Model 2119.
Task 2111 : Confirming Business Architecture
Task 2111 includes assessing the current business architecture, confirming the goals and objectives, and refining the components of the business architecture. Preferably, this task delivers the planning stage documentation, confirming or refining the overall service level management architecture, and ensuring management commitment to the project.
The amount of analysis performed in this task depends on the work previously performed in the planning phase of the project. Process, technology, organization, and performance issues are included in the analysis. The overall requirements for service level management may be grouped into four main categories:
1. The processes and procedures for service level definition, control, reporting, and review.
2. The service level agreement format, which encompasses the measurements to be tracked, and the target performance for each of the measurements.
3. The organizational requirements for Service Level ("SL") management, which may be documented in a "responsibility matrix," a communication flow, and/or a set of job descriptions.
4. Reporting requirements for reporting of actual performance versus planned or targeted performance for the defined measurement components. It is ensured that the data exists to report actual performance against plan, and that collecting the data is not cost prohibitive.
Service level agreements are stated in definable and measurable terms if they are to be managed on an on-going basis. For example, phrases like "as fast as possible," "best efforts," or "immediately upon request" are unacceptable language, and create more problems than benefits because expectations based on them are usually difficult to achieve.
Task 2113: Analyzing Operating Constraints
Task 2113 includes identifying the operating constraints and limitations, and assessing their potential impact on the operations environment.
Preferably, the task includes assessing the organization's strategy and culture and its potential impact on the project, and assessing organization, technology, process, equipment, and facilities for the constraints. The task also includes assessing the organization's ability to adapt to changes as part of the constraints analysis.
Task 2115: Analyzing Current Service Level Management Capability
The goal of task 2115 is to identify the current service level management elements, their operation, and their performance. Preferably, the task delivers a current service level management capability assessment. The task is carried out by documenting current activities and procedures to establish a performance baseline, and assessing strengths and weaknesses of the service level management capability. The method also includes considering the issues noted in Task 2111, when assessing current capabilities and identifying gaps.
Task 2117: Identifying Service Level Management Best Practices
Task 2117 includes defining the relevant best practices for service level management that meet the organization's requirements, identifying the service level management areas that could benefit from application of best practices, and identifying the optimum best practices to meet the environment and objectives.
Task 2118: Refining Service Level Management Requirements
Task 2118 includes defining service level management requirements, and allocating the requirements across changes to human performance, business process, and technology. Preferably, the task also includes using the current assessment, the constraints assessment, and best practice research to generate the requirements for a change. The task also includes identifying the changes needed to upgrade the service level management capability, or to create a new SL management capability, and allocating the changes according to organization and performance improvements, process improvements, and technology improvements.
When defining service level management requirements for service levels, the process, technology, and organizational requirements need to be identified and documented. Service level targets provide input to business capability design, including the application architecture (mechanisms for SL report data collection), technology infrastructure (processing power, server location), and organization (service desk locations, staffing).
Task 2119: Updating Business Performance Model
Task 2119 includes defining the metrics and measures to describe the performance of the service level management process, tools and organization. Preferably, the task includes identifying the performance and operational objectives previously defined, identifying the approaches to be used, and monitoring the overall effectiveness of the service level management process.
After completing the planning step, i.e., the capability analysis stage 102, the project plan is delivered 110, and then the designing step, i.e., the "capability release design stage" 104, is commenced. As noted above, the designing step includes steps 2410, 2710, 2750, 3510, 3550 and 3590.
Step 2410 - Designing Business Processes, Skills & User Interaction:
In step 2410, the business processes, skills, and user interaction are designed. The method includes designing the new service level management processes, and developing the framework and formats for the SLA's. Figure 3 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include
Designing Workflow for Processes, Activities and Tasks 2411, Defining Physical Environment Interaction 2412, Identifying Skill Requirements 2413, Defining Application Interaction 2415, Identifying Performance Support Requirements 2416, Developing Capability Interaction Model 2417, and Verifying and Validating Business Processes, Skills and User Interaction 2419.
Task 2411 : Designing Workflows for Processes, Activities and Tasks
Task 2411 includes creating the workflow diagrams and defining the workloads for service level reporting, controlling, and reviewing. The task also includes identifying service lines to be included in SLA's. Preferably, the task includes developing workflow diagrams for all processes and activities, defining the relationships between core and supporting processes, activities, and tasks. The task also includes defining the metrics associated with the processes and activities, and identifying the service lines to be monitored for each business unit within scope.
The key issues to be addressed for SL reporting are what data needs to be collected to report on service level performance, how it will be collected, and how it will be reported. Data may be collected from many sources, including:
Application programs - mechanisms may be built into applications to report on certain events.
Operations architecture components - performance and event/exception/problem data come from sources, such as service desk, print management, file transfer, backup/restore/archive, security management, asset management, and change control.
Hardware and systems software components - downtime, recovery, and some form of event response times are collected for various hardware and software components.
Workforce reporting and observation - some data may not be available from automated sources and is collected manually. Some form of confirmation or verification is desirable for manual data. The design recognizes the need for Ad Hoc control and review, as well as scheduled control and review. Ad Hoc control recognizes the fact that problems may arise which require immediate attention and resolution.
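To illustrate the idea of drawing service level data from several of the sources listed above (application programs, operations architecture components, system software, and manual workforce reporting), the hypothetical Python sketch below merges per-source measurements into a single reporting record and flags manually collected values for verification; the source names and fields are assumptions only.

    # Hypothetical sketch: consolidate SL measurement data collected from multiple sources.
    def collect_sl_data(sources):
        """Each source is (name, fetch, is_manual); fetch() returns {kpi: value}.
        Manually reported values are flagged so they can be confirmed before reporting."""
        report = {}
        for name, fetch, is_manual in sources:
            for kpi, value in fetch().items():
                report[kpi] = {"value": value, "source": name, "needs_verification": is_manual}
        return report

    sources = [
        ("application log",       lambda: {"avg_response_time_sec": 1.8}, False),
        ("service desk tool",     lambda: {"incidents_closed_in_sla_pct": 97.2}, False),
        ("system monitor",        lambda: {"server_downtime_min": 42}, False),
        ("workforce observation", lambda: {"report_delivery_on_time": 1.0}, True),
    ]
    monthly_report = collect_sl_data(sources)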
Task 2412: Defining Physical Environment Interaction Task 2412 includes identifying the implications of the service level management processes on the physical environment, including location, layout, and equipment requirements. The task preferably includes identifying the Workflow/Physical environment interfaces, designing the facilities, layout, and equipment required for service level management, and identifying distributed service level management physical requirements. The task also includes defining the layout and co-location implications of the service level management workflows and the physical environment.
Task 2413: Identifying Skill Requirements
Task 2413 includes identifying the skill and behavior requirements for performing service level management tasks. Preferably, the task includes identifying critical tasks from the workflow design, defining the skills needed for the critical tasks, and identifying supporting skills and appropriate behavioral characteristics.
In very large and complex applications, the overall SL coordination function may be a full time position. The people most often involved in performing the key roles may be one or more management representatives from the organization responsible for managing the systems/service delivery (presumably the IT organization), and management representatives from the business units to whom the services are delivered. These may be the same groups involved in defining and maintaining the service level agreements.
Task 2415: Defining Application Interaction
Task 2415 includes identifying the human-computer interactions necessary to fulfill key service level management activities. Preferably, the task includes using the workflows to identify the service level management activities supported by software elements, and defining the human-computer interactions needed to meet the requirements. Application interactions may occur in data collection during all phases of the service level management process, i.e., when comparing actual service metrics against service levels.
Task 2416: Identifying Performance Support Requirements
Task 2416 includes analyzing the service level management processes, and determining how to support human performance within these processes. Preferably, the task includes analyzing the critical performance factors for each SL management task, and selecting training and support aids to maximize workforce performance. Performance support is useful in the data gathering effort if there are manual processes or application interactions involved. Job aids are also used if business unit users are responsible for triggering the SL reporting process.
Task 2417: Developing Capability Interaction Model Task 2417 identifies the relationships between the tasks in the workflow diagrams, the physical location, skills required, human-computer interactions and performance support needs. The task also includes developing capability interaction models by identifying the interactions within each process for physical environment, skills, application and performance support, and unifying these models. Preferably, the task includes integrating workflows, the physical environment models, role and skill definitions, the application interactions, and support requirements to develop the capability interaction models. This model illustrates how the process is performed, what roles fulfill the activities involved, and how the roles are supported to maintain the service level management capability.
Task 2419: Verifying and Validating Business Processes, Skills and User Interaction
Task 2419 includes verifying and validating that the process designs and the capability interaction models meet the service level management requirements and are internally consistent. The task also includes verifying that the capability model fulfills the original requirements. The task may include the use of stakeholders and outside experts, as well as the design teams, to do the validation.
Step 3510 - Analyzing Technology Infrastructure Requirements:
In step 3510, technology infrastructure requirements are analyzed. For this step, the method of the present invention includes selecting and designing the technology infrastructure, and establishing preliminary plans for technology infrastructure product testing. Figure 4 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Preparing Technology Infrastructure Performance Model 3511, Analyzing Technology Infrastructure Component Requirements 3513, Assessing Technology Infrastructure Current Environment 3515, and Planning Technology Infrastructure Product Test 3517.
Task 3511: Preparing Technology Infrastructure Performance Model
This task includes analyzing the functional, technical, and performance requirements for the service level management infrastructure. Preferably, the task includes identifying the key performance indicators for service level management, establishing baseline estimates, setting measurable targets for the performance indicators, and developing the functional model and the performance model.
Task 3513: Analyzing Technology Infrastructure Component Requirements
This task includes analyzing and documenting the requirements for service level management components, and defining their needs. Preferably, the task includes identifying the constraints imposed by the environment, refining functional, physical, and performance requirements developed in the models previously built, and assessing the interfaces to other OM components to avoid redundancy and ensure consistency/compatibility. The technology requirements are determined from the business capability requirements, and more specifically the service level agreement scope, metrics, and measurement parameters. This information identifies data collection and reporting requirements to support SL management. All components of the application and supporting infrastructure (e.g., network management software, database management software, etc.) are providers of the information. Depending on the requirements, other aspects of the operations architecture are also sources of data. This includes the software packages now in use or contemplated for use in support of OM service management, service delivery, and change management functions. These efforts are coordinated to ensure there is an appropriate source for all SL data, and to identify the data that is manually input to the SL reporting process.
The requirements for historical SL reporting to identify trends in service levels over time may indicate a requirement for long-term storage of SL data in some type of database or repository.
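Where such historical reporting is required, one hypothetical way to keep long-term SL data is a small relational table, as in the Python/SQLite sketch below; the schema, table name, and values are illustrative assumptions rather than part of the described architecture.

    # Hypothetical sketch: long-term storage of service level measurements
    # so that trends over time can be reported. Schema and values are illustrative.
    import sqlite3

    conn = sqlite3.connect("sl_history.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS sl_measurement (
            period        TEXT,   -- e.g. '2000-09'
            business_unit TEXT,
            kpi           TEXT,
            measured      REAL,
            target        REAL,
            source        TEXT
        )""")
    conn.execute("INSERT INTO sl_measurement VALUES (?, ?, ?, ?, ?, ?)",
                 ("2000-09", "Order Processing", "availability_pct", 99.1, 99.5, "system monitor"))
    conn.commit()

    # Trend query: average measured value per period for one KPI.
    trend = conn.execute("""SELECT period, AVG(measured) FROM sl_measurement
                            WHERE kpi = 'availability_pct' GROUP BY period""").fetchall()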
Task 3515: Assessing Technology Infrastructure Current Environment
This task includes assessing the ability of the current service level management infrastructure to support the new component requirements, and identifying current standards for technology infrastructure. Preferably, the task includes documenting and analyzing the current service level management technology environment, and noting any standards/policies regarding service level management technology.
Task 3517: Planning Technology Infrastructure Product Test
This task includes planning the product test for the service level management infrastructure. The results of this task provide the basis on which the product test is performed. Preferably, the task includes defining the test objectives, scope, environment, testing conditions and expected results, and developing the deployment plan. The product test is a test of the infrastructure. Therefore, the organization and process elements are within the scope of the test.
Step 2710 - Designing Organization Infrastructure:
For step 2710, the method of the present invention includes defining the structures for managing human performance, and defining what is expected of people who participate in the service level management function, the required competencies, and how performance is managed. Figure 5 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Designing Roles, Jobs and Teams 2711 , Designing Competency Model 2713, Designing Performance Management Infrastructure 2715, Determining Organization Infrastructure Mobilization Approach 2717 and Verifying and
Validating Organization Infrastructure 2719.
Task 2711: Designing Roles, Jobs and Teams
This task includes determining competencies and the roles required to operate the new capability, defining how roles are grouped to fit into teams and jobs, and designing the metrics for the roles. Preferably, the task includes confirming the service level management competency requirements, designing the roles, jobs and teams, determining the reporting relationships, and identifying the performance measurement factors.
The key issues include the scope of activities to be performed, geographic distribution of the function, the software tools which are involved in supporting the SL management process, and the complexity of the environment. These key issues also include allocating responsibilities for the SL definition, control, reporting, and review activities. This step preferably develops a responsibility matrix of activities, and assigns responsibility for all activities as well as overall responsibility for SL Management. Since there may in fact be several agreements (for example, one per business unit), several people may have overall responsibility for specific areas. The functions and responsibilities involved in SL management normally do not require full-time positions, meaning a new organizational infrastructure may not be required. Task 2713: Designing Competency Model
This task includes defining the skills, knowledge, and behavior that people require to accomplish their roles in the service level management process. Preferably, the task includes determining the characteristics required of the individuals/teams, defining the individual capabilities necessary for success, organizing the capabilities along a proficiency scale, and relating them to the jobs and teams.
Task 2715: Designing Performance Management Infrastructure
This task includes defining how individual performance is measured, developed, and rewarded, and determining a performance management approach and appraisal criteria. Preferably, the task includes developing standards for individuals and teams involved in the service level management process, and using the standards, identifying a system to monitor the individuals' and teams' abilities to perform up to the standards. The task also includes ensuring the performance measures are based upon actions within the individual's control.
Task 2717: Determining Organization Infrastructure Mobilization Approach
This task includes determining and mobilizing the resources required to staff the new service level management capability. Preferably, the task includes identifying profiles of the ideal candidates for each position, identifying the sourcing approaches and timing requirements, and determining the selection and recruiting approaches.
Task 2719: Verifying and Validating Organization Infrastructure
In this task, the method of the present invention includes verifying and validating that the service level management organization meets the needs of the service level management capability and is internally consistent. Preferably, the task includes determining the approach to be used and participants to be involved, and verifying that the organization structure satisfies service level management capability requirements.
Step 2750 - Designing Performance Enhancement Infrastructure:
In this step, a performance enhancement infrastructure is designed. The step includes determining the training needed for new service level management functions, and determining the on-line help text, procedures, job aids, and other information to be used. Figure 6 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Assessing Employee Competency and Performance 2751 , Determining Performance Enhancement Needs 2753, Designing Performance Enhancement Product 2755, Defining Learning Test Approach 2757, and Verifying and Validating Performance
Enhancement Infrastructure 2759.
Task 2751: Assessing Employee Competency and Performance
This task includes refining the information about the current service level management staff's competency, proficiency, and performance levels in specific areas, and assessing the gaps in competencies and performance levels that drive the design of the performance enhancement infrastructure. Preferably, the task includes assessing the competency of the current service level management staff based on the competency model previously developed.
Task 2753: Determining Performance Enhancement Needs
This task includes assessing the performance support and training requirements necessary to close the competency and performance gaps in the workforce. Preferably, the task includes using the employee assessment to determine the type of performance enhancement required to close the gaps and reach the desired competency levels.
The service level management function requires communication, negotiation, and other management skills that do not lend themselves to one-time training and job aids. Performance enhancement in this case focuses on training needed by clerical and operations personnel to collect appropriate data and prepare service level reports as identified in the capability requirements analysis.
Task 2755: Designing Performance Enhancement Products
This task includes defining the number and structure of performance support and learning products. Preferably, the task includes determining the delivery approaches for training and performance support, designing the learning and performance support products, and defining the support systems for delivering training and performance support.
Data collection is a combination of automated and manual processes, which may require some knowledge of computer applications. Reporting may be via customized applications, or via spreadsheets in less complex environments. Due to part-time service level management roles, job aids or desktop reminder systems are frequently the preferred approaches for performance support in this type of situation. When these situations arise, the performance enhancement infrastructure focuses on training and support for the part-time function.
Task 2757: Defining Learning Test Approach
This task includes developing a comprehensive approach for testing the learning products with respect to achieving each product's learning objectives. Preferably, the task includes identifying which learning objectives are to be tested, and identifying the data capture methods to be used to test those objectives.
Task 2759: Verifying and Validating Performance Enhancement Infrastructure
In this task, the performance enhancement infrastructure is validated. The task of the present invention includes verifying the performance enhancement infrastructure and the learning test deliverables to determine how well they fit together to support the new service level management capability. Preferably, the task includes simulating the processes and activities performed by the members of the service level management team in order to identify performance enhancement weaknesses. The task includes identifying the problems and repeating the appropriate tasks necessary to address the problems.
Step 3550 - Selecting and Designing Operations Architecture:
This step includes selecting and designing the components required to support a high-level service level management architecture, including reuse, package, and custom components. Figure 7 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Identifying Operation Architecture Component Options 3551 , Selecting Reuse Operations Architecture Components 3552, Selecting Packaged Operations Architecture Components 3553, Designing Custom Operations Architecture Components 3555, Designing and Validating Operations Architecture 3557 and Developing
Operations Architecture Component and Assembly Test Approach, and Plan 3559.
Task 3551 : Identifying Operations Architecture Component Options This task includes identifying specific component options that are needed to support the production environment. Preferably, the task includes identifying all risks and gaps that exist in the current service level management environment, selecting components that should support the service level management architecture, considering current software resources, packaged software and custom software alternatives during the selection process, and if new packaged software is part of the solution, submitting RFPs to vendors for software products that meet basic requirements.
Data collection gaps may exist for several reasons. First, the collection mechanisms are the software packages that collect performance and event data as a by-product of their primary function, such as service desk support, production scheduling, software distribution, or other OM functions. Second, determining response-time for a transaction may include collecting data from the network management system, the database management system, and the application itself. Third, some of the currently installed operations software may not be designed to collect the data required, or may be incompatible with other elements of the infrastructure. Also, some operation functions may be handled manually, requiring either manual data collection, or installation of a new piece of software that can generate the required data.
Task 3552: Selecting Reuse Operations Architecture Components
This task identifies whether there is any opportunity to reuse existing architecture components. Preferably, the task includes evaluating the reuse component options, determining possible gaps where the software does not satisfy requirements, and selecting the appropriate reuse components. Use of existing components to collect SL management data, and also to generate reports, is the primary alternative for satisfying technology requirements.
Task 3553: Selecting Packaged Operations Architecture Components
This task includes evaluating the packaged component options against the selection criteria in order to determine the best fit. Preferably, the task includes evaluating the packaged component options, determining gaps where the software does not satisfy requirements, and selecting the appropriate packaged components. Packaged software is generally not the primary option for SL management; reuse of existing components is far more desirable. If there are gaps in data collection that may be remedied by the use of an additional piece of operations management software, then a supplemental project to install that software could be recommended to the sponsoring organization.
Task 3555: Designing Custom Operations Architecture Components
This task includes designing the custom components that are needed, and customizing a reuse or packaged component. Preferably, the task includes designing and validating the custom components, evaluating time, cost, and risk associated with custom development, and selecting the custom components.
Task 3557: Designing and Validating Operations Architecture
This task includes developing a high-level design of the service level management architecture. Preferably, the task includes combining the reuse, package, and custom components into an integrated design, ensuring that the architecture meets the service level management requirements, and defining the standards and procedures for component build and test.
Task 3559: Developing Operations Architecture Component and Assembly Test Approach, and Plan
This task includes defining the approach and test conditions for the service level management assembly, component, and component acceptance testing. Preferably, the task includes defining objectives, scope, metrics, regression test approaches, and risks associated with each test, defining component testing for custom and customized (reuse or package) components, and defining assembly testing for all components and all interfaces.
Service level management testing includes the steps in the various formal testing approaches (i.e., component test, assembly test, and component acceptance test), but may actually occur in one phase and take a rapid testing approach. The test plans may indicate that data may be collected and stored not only from operations management software but also from application software and perhaps from manual entry.
Step 3590 - Validating Technology Infrastructure:
For this step, the method of the present invention includes verifying that the service level management design is integrated, compatible, and consistent with the other components of the technology infrastructure design, and that it meets the business performance model and business capability requirements. Figure 8 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Reviewing and Refining Technology Infrastructure Design 3591, Establishing Technology Infrastructure Validation Environment 3593, Validating Technology Infrastructure Design 3595, and Analyzing Impact and Revising Plans for Technology Infrastructure 3597.
Task 3591: Reviewing and Refining Technology Infrastructure Design This task includes ensuring that the service level management infrastructure design is compatible with other elements of the technology infrastructure. Preferably, the task includes testing that service level management is integrated and consistent with the other components of the technology infrastructure, and developing an issue list for design items that conflict with the infrastructure or that do not meet performance goals or requirements. If any data must be collected and input manually, the method includes confirming that this can be accomplished accurately and on a timely basis. Based on the requirements for ad hoc reporting, this may need to be done on a daily or even more frequent basis.
Task 3593: Establishing Technology Infrastructure Validation Environment
This task includes designing, building, and implementing the validation environment for the technology infrastructure. Preferably, the task includes establishing the validation environment, selecting and training participants, and scheduling the validation. The designers/architects of OM components that interface with service level management may be included in the validation.
Task 3595: Validating Technology Infrastructure Design This task includes identifying gaps between the service level management infrastructure design and the technology infrastructure requirements defined earlier. Preferably, the task includes validating the design, recording issues as they arise, identifying and resolving critical gaps, iterating through the validation until the critical issues have been resolved, and developing action plans for less critical issues. If service level management is being installed as part of a larger business capability, this task also serves as a checkpoint to verify that the most current requirements from the business capability release are being considered. Task 3597: Analyzing Impact and Revising Plans for Technology Infrastructure
This task includes updating the appropriate technology infrastructure delivery plans based on the outcome of the validation process. Preferably, the task includes analyzing the associated scope of work required for modifications and enhancements, analyzing the impact of validation outcomes on costs and benefits, and refining plans for deployment testing.
After completing the design stage 104 of the method, authorization for build and test 112 is sought from the client. After obtaining authorization, the method proceeds to the building and testing steps, or stage 106, of the project. As noted above, this stage includes steps 5510, 5550, 5590, 6220 and 6260.
Step 5510 - Acquiring Technology Infrastructure: This step includes planning and executing the procurement of the software components that must be acquired, and, if choices are available, deciding who supplies the components and services and how they are supplied. This task package is required if new packaged software is to be procured and installed as part of the project. Figure 9 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Initiating Acquisition of Technology Infrastructure Components 5511, Selecting and Appointing Vendors 5513, Evaluating Deployment Implications of Vendor Appointments 5515, and Preparing and Executing Acceptance Test of Technology Architecture Components 5517.
Task 5511: Initiating Acquisition of Technology Infrastructure Components
This task includes initiating the process for selecting and obtaining packaged software components. Preferably, the task includes defining vendor selection criteria, selecting potential vendors, preparing RFP/RFQ documents, and issuing request documents to selected vendors.
Task 5513: Selecting and Appointing Vendors
This task includes selecting the vendor(s) who provide the components and negotiating the terms of the procurement. Preferably, the task includes evaluating responses to RFP/RFQ documents, determining the selected component(s), and identifying the desired vendor(s). The task also includes negotiating procurement terms and managing the placement of contracts/orders through component delivery. Software training may be negotiated as part of the contractual agreement. The task also includes ensuring that the software is consistent with software standards, and reviewing the contract for compliance with procurement standards.
Task 5515: Evaluating Deployment Implications of Vendor Appointments This task includes determining the impact and deployment implications of the software and vendor selection on the project economics and the business case. Preferably, the task includes comparing procurement costs with project estimates, assessing the impact on the business case and business performance model, and making revisions and obtaining approvals as necessary. The task also includes ensuring that the economics of the transaction(s) are consistent with plans documented in the business case, or modifying the business case as appropriate to reflect changes.
Task 5517: Preparing and Executing Acceptance Test of Technology Architecture Components This task includes ensuring that the packaged components meet the technology infrastructure requirements. Preferably, the task includes building the test scripts, the test drivers, the input data, and the output data to complete the technology architecture component acceptance test model, executing the test, and documenting any fixes/changes required of the component vendor(s). Software component training may be scheduled and conducted as soon as the new components are installed.
Step 5550 - Building and Testing Operations Architecture:
This step includes designing and programming the service level management components, including extensions to reused and packaged items, and performing component and assembly testing. Figure 10 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Performing Operations Architecture Detailed Design 5551, Revising Operations Architecture Component and Assembly Test Approach and Plan 5552, Building Operations Architecture Components 5553, Preparing and Executing Component Test of Custom Operations Components 5555, and Preparing and Executing Operations Assembly Test 5557.
Task 5551: Performing Operations Architecture Detailed Design This task includes defining the requirements and program specifications related to each service level management component. Preferably, the task includes preparing program specifications for custom and customized components, designing the packaged software configuration, and conducting detailed design reviews. Specifications for custom components are required for the design of the database for collection of historical service level performance data; the design of modules to collect and store SL data in the history database; and the design of the performance reports, usually including current-period analysis of planned versus actual performance and long-term trend analysis.
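A minimal sketch of the kind of custom reporting component described here is given below, assuming a toy in-memory history store; the record layout, the service item, and the figures are hypothetical and serve only to illustrate current-period planned-versus-actual analysis and long-term trend analysis.

from statistics import mean

# Hypothetical history "database" of service level measurements:
# (period, service item, planned target, actual measurement).
history = [
    ("2000-07", "online response time (s)", 2.0, 1.8),
    ("2000-08", "online response time (s)", 2.0, 2.3),
    ("2000-09", "online response time (s)", 2.0, 1.9),
]

def current_period_report(records, period):
    """Planned versus actual (and the variance) for one reporting period."""
    return [(item, planned, actual, round(actual - planned, 2))
            for (p, item, planned, actual) in records if p == period]

def long_term_trend(records, item):
    """Average actual value for one service item across all stored periods."""
    return mean(actual for (_, i, _, actual) in records if i == item)

print(current_period_report(history, "2000-09"))
print(round(long_term_trend(history, "online response time (s)"), 2))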
Task 5552: Revising Operations Architecture Component and Assembly Test Approach and Plan
This task includes updating the service level management test plans to reflect the components' detailed design, and defining revised considerations or changes to the requirements. Preferably, the task includes reviewing the test approaches and plans, and revising them as needed for new or updated requirements.
Task 5553: Building Operations Architecture Components This task delivers all custom service level management components and extensions to packaged or reuse components. Preferably, the task includes building the custom components, building the customized extensions to packaged or reuse components, and configuring the packaged components.
Task 5555: Preparing and Executing Component Test of Custom Operations Components This task includes ensuring that each custom service level management component and each customized component meets its requirements. Preferably, the task includes verifying the component test model, setting up the test environment, executing the test, making component fixes and retests as required, and updating service level management detailed design document with changes. The task also includes confirming component performance and functionality. System performance should not be compromised by the amount of customization. This may be tested here or in subsequent testing tasks.
Task 5557: Preparing and Executing Operations Assembly Test This task includes performing a full test of all interactions between service level management components. Preferably, the task includes verifying the assembly test model, setting up the test environment, executing the test, and making fixes and retests as required.
Assembly testing is a critical point for the SL components because the numerous interfaces required may be brought together for the first time for interaction testing. In addition to the operations architecture assembly testing, the manual data input interfaces and the application architecture component interfaces are tested. The latter may have to be simulated. Several test cycles may be needed to fully test historical trend reporting capabilities. Step 6220 - Developing Policies, Procedures and Performance Support:
This step includes producing a finalized, detailed set of new service level management policies, procedures, and reference materials; creating the new service level agreements for the business unit(s); and conducting a usability test and review to verify ease of use. Figure 11 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Performing Policies, Procedures and Performance Support Detailed Design 6221, Developing Business Policies and Procedures 6223, Developing User Procedures 6225, Developing Reference Materials and Job Aids 6227, and
Validating and Testing Policies, Procedures and Performance Support 6229.
Task 6221: Performing Policies, Procedures and Performance Support Detailed Design
This task includes providing the standard structure for all the new service level management policies, procedures, reference materials, and job aids, and providing prototype templates for each product. Preferably, the task includes designing the structure of the new policies, procedures, and support materials, defining standards for policy and performance support development, designing templates for product development, and creating prototype products.
This task package covers detailed design, drafting, review, and approval of the actual service level agreements as well as the supporting materials. The structure for developing the SLA's includes: business unit, service lines within the business unit, and service items within each service line.
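The following Python sketch illustrates this structure only by way of example; the class names, the business unit, and the target values are hypothetical assumptions and are not drawn from any actual agreement.

# Hypothetical sketch of the SLA structure described above: a business unit
# contains service lines, and each service line contains service items with
# agreed targets. All names and targets are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ServiceItem:
    name: str
    target: str  # e.g. "95% of transactions under 2 seconds"

@dataclass
class ServiceLine:
    name: str
    items: List[ServiceItem] = field(default_factory=list)

@dataclass
class ServiceLevelAgreement:
    business_unit: str
    service_lines: List[ServiceLine] = field(default_factory=list)

sla = ServiceLevelAgreement(
    business_unit="Order Processing",
    service_lines=[
        ServiceLine("Online Services",
                    [ServiceItem("Order entry response",
                                 "95% of transactions under 2 seconds")]),
    ],
)
print(sla.business_unit, len(sla.service_lines))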
Task 6223: Developing Business Policies and Procedures This task includes developing a complete set of business policies and procedures for service level management. Business policies describe the business rules governing workflows. Business procedures describe the sequential sets of tasks to follow based on the policies. The task also includes developing the service level agreements. Preferably, the task includes collecting and reviewing content information, drafting policies and procedures, drafting service level agreements, and planning for the production of the materials. Service level policies and procedures cover all aspects of the control and reporting process including responsibilities for process steps and identification of communications channels. Each SLA may require its own procedures, but the sponsoring organization may wish to establish a single standardized policy statement regarding SLA's and SL management.
Task 6225: Developing User Procedures
This task includes drafting a detailed set of service level management user procedures. User procedures provide the details necessary to enable smooth execution of new tasks within a given business procedure.
Preferably, the task includes collecting and reviewing content information, drafting the procedures, verifying consistency with business policies and procedures, and planning for the production of the materials.
Each business unit, and each SLA, may require its own unique user procedures. Since the SL management function is likely to be distributed widely across an organization, with a number of clerical and management personnel performing part-time tasks to achieve the objectives, job aids and quick references are desirable performance support tools to supplement the user procedures.
Task 6227: Developing Reference Materials and Job Aids This task includes drafting the reference materials and job aids that make a task easier or more efficient. The information provided in the reference materials and job aids is used on the job. Preferably, the task includes collecting and reviewing content information, drafting the performance support products, verifying consistency with policies and procedures, and planning for the production of the materials.
Task 6229: Validating and Testing Policies, Procedures and Performance Support This task includes confirming that the products meet the requirements of the service level management capability and the needs of the personnel who will use them. Preferably, the task includes preparing validation scenarios, validating content and ease of use of materials, testing on-line support products, and resolving open issues. Once service level targets are documented, they are validated by both end users and service delivery providers for reasonability. There is no sense in creating agreements that are unacceptable or unrealistic for one party or the other. In some cases, a prototype of a process may be required, such as to illustrate to an end user exactly what a two-minute transaction turnaround really means in an actual work setting.
Step 6260 - Developing Learning Products:
This step includes creating a complete, finalized set of learning products. Figure 12 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Developing Learning Product Standards and
Development Environment 6261 , Performing Learning Program Detailed Design 6263, Prototyping Learning Products 6265, Creating Learning Products 6267 and Testing Learning Products 6269.
Task 6261: Developing Learning Product Standards and Development Environment
This task includes creating the environment for developing the service level management learning products. Preferably, the task includes selecting authoring and development tools, defining standards, and designing templates and procedures for product development. Technical training in the service level management software components may come from the package vendor or a third-party training organization. Procedural training may be custom built.
Task 6263: Performing Learning Program Detailed Design This task includes specifying how each learning product identified in the learning product design is developed. Preferably, the task includes defining learning objectives and context, designing the learning activities, and preparing the test plan.
Where SL data collection and reporting are supported by other components of the operations architecture or by the application itself, training in those components may be necessary. The available learning products for those components are used when possible to cover the SL interfaces, since custom training for what are often part-time responsibilities may not be cost effective. Because of the part-time aspect of the work, job aids and other performance support means are used in lieu of formal training. If SLA's are defined for individual business units, which is normally recommended, the training may be unique for each SLA because different data collection and reporting components may be involved.
Task 6265: Prototyping Learning Products
This task includes completing prototypes and conducting ease-of-use sessions on classroom-based learning components (i.e., activities, support system, instructor guide). Preferably, the task includes creating the prototype components, and conducting and evaluating the prototype.
Task 6267: Creating Learning Products
This task includes developing the learning materials proposed and prototyped during the design activities. Preferably, the task includes developing the required activities, content, evaluation, and support materials; developing a maintenance plan; training instructors/facilitators; and arranging for production. Task 6269: Testing Learning Products
This task includes testing each product with the intended audience to ensure that the product meets the stated learning objectives, that the instructors are effective, and that the learning product meets the overall learning objectives for service level management. Preferably, the task includes confirming the test plan, executing the learning test, and reviewing and making required modifications.
If the target audience is small, this test serves as the formal training session for the group. Multiple sessions may be appropriate if responsibilities are split and all personnel are not responsible for knowing all activities.
Step 5590 - Preparing and Executing Technology Infrastructure Product Test:
This step includes ensuring that the technology infrastructure design, including service level management, has been properly implemented, and that the infrastructure can support the development, execution, and operations architectures. The method also includes testing the deployment of the new technology infrastructure and its integration with the current technology infrastructure. Figure 13 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Preparing Technology Infrastructure Test Model
5591 , Executing Technology Infrastructure Product Test 5593, Executing Technology Infrastructure Deployment Test 5595 and Executing Technology Infrastructure Configuration Test 5597.
Task 5591 : Preparing Technology Infrastructure Test Model This task includes creating the service level management infrastructure test model. Preferably, the task includes creating the test data and expected results, and creating the testing scripts for production, deployment, and configuration tests. The task also includes conducting the service level management training not yet completed, and reviewing and approving the test model. If a complete business capability is being deployed, this is a comprehensive test with service level management being one piece. In general, however, service level management is frequently implemented as an independent capability designed to monitor other applications. The product test should occur in a production-ready environment and should include the hardware and software to be used in production.
Task 5593: Executing Technology Infrastructure Product Test
This task includes verifying that the technology infrastructure successfully supports the requirements outlined in the business capability design stage. Preferably, the task includes executing the test scripts, verifying the results, and making changes as required.
Task 5595: Executing Technology Infrastructure Deployment Test
This task includes ensuring that the new service level management infrastructure is correctly deployed within the organization. Preferably, the task includes executing the test scripts, verifying the results, and making changes as required.
Task 5597: Executing Technology Infrastructure Configuration Test
This task includes ensuring that the performance of the technology infrastructure, including service level management, is consistent with the technology infrastructure performance model after the infrastructure has been deployed. Preferably, the task includes executing the test scripts, verifying the results, making changes as required, and updating the risk assessment.
After completing the building and testing step 106, authorization for deployment 114 is sought from the client. After obtaining authorization, the deploying step 108 may commence. The deployment 108 includes the step of Deploying Technology Infrastructure 7170.
Step 7170 - Deploying Technology Infrastructure:
This step includes bringing the deployment unit up to the technology infrastructure baseline required, including service level management. Figure 14 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. These tasks include Configuring Technology Infrastructure 7171 , Installing Technology Infrastructure 7173, and Verifying Technology Infrastructure 7179.
Task 7171 : Configuring Technology Infrastructure This task includes customizing the deployment unit's technology infrastructure to prepare for the new business capability components. Preferably, the task includes reviewing the customization requirements, performing the customization, and verifying the infrastructure configuration. Customizing the infrastructure is normally completed in task package 5550, building and testing operations architecture.
Task 7173: Installing Technology Infrastructure
This task includes installing the technology infrastructure for service level management. Preferably, the task includes preparing the installation environment, installing the service level management infrastructure, and verifying the installation. In addition to the service level management software, the documentation, performance support, and training tools are completed and put in place prior to deployment.
Task 7179: Verifying Technology Infrastructure This task includes verifying the new technology infrastructure environment and addressing the issues raised as a result of the testing.
Preferably, the task includes performing the infrastructure verification, making changes as required, and notifying stakeholders.
A follow-up audit is recommended after some period of production operations to confirm the validity and accuracy of service level reports and the adequacy of the actual provisions of the SLA's in support of the business capability releases covered.
In addition to the method for providing the service level management function, as described above, the present invention also includes a method and apparatus for providing an estimate for building a service level management function in an information technology organization. The method and apparatus generate a preliminary work estimate (time by task) and financial estimate (dollars by classification) based on input of a set of estimating factors that identify the scope and difficulty of key aspects to the function.
Previous estimators gave only bottom-line cost figures and were directed to business rather than OM functions. It would take days or weeks before the IT consultant could produce these figures for the client. If the project came in either above or below cost, there was no way of telling who or what was responsible. Therefore, a need exists for an improved estimator.
Fig. 15 is a flow chart of one embodiment of a method for providing an estimate of the time and cost to build a service level management function in an information technology organization. In Fig. 15, a provider of a service level management function, such as an IT consultant, for example, Andersen Consulting, obtains estimating factors from the client 202. This is a combined effort, with the provider adding expertise and knowledge to help in determining the quantity and difficulty of each factor. Estimating factors represent key business drivers for a given operations management (OM) function. Table 1 lists and defines the factors to be considered along with examples of a quantity and difficulty rating for each factor.
For example, as an illustration of the method of the invention, the provider, with the help of the client, will determine an estimating factor for the number of service level agreements ("SLA") 202. Next comes the determination of the difficulty rating 204. Each of these determinations depends on the previous experience of the consultant. The provider or consultant with a high level of experience will have a greater opportunity to determine the correct number and difficulty. The number and difficulty rating are input into a computer program. In the preferred embodiment, the computer program is a spreadsheet, such as EXCEL, by Microsoft Corp. of Redmond, Washington, USA. The consultant and the client will continue determining the number and difficulty rating for each of the remaining estimating factors 206. After the difficulty rating has been determined for all of the estimating factors, this information is transferred to an assumption sheet 208, and the assumptions for each factor are defined. The assumption sheet 208 allows the consultant to enter in comments relating to each estimating factor, and to document the underlying reasoning for a specific estimating factor.
Table 1
[Table 1 appears in the original publication as images imgf000039_0001 and imgf000040_0001, listing each estimating factor with its definition and an example quantity and difficulty rating.]
Next, an estimating worksheet is generated and reviewed 210 by the consultant, the client, or both. An example of a worksheet is shown in Figs. 16a, b, and c. The default estimates of the time required for each task populate the worksheet, with time estimates based on the quantities and difficulty ratings previously assigned to the estimating factors that correspond to each task. The amount of time per task is based on a predetermined time per unit for the estimating factor multiplied by a factor corresponding to the level of difficulty. Each task listed on the worksheet is described above in connection with the details of the method for providing the service level management function. The same numbers in the description of the method above correspond to the same steps, tasks, and task packages of activities shown on the worksheet of Figs. 16a, b, and c. The worksheet is reviewed 210 by the provider and the client for accuracy. Adjustments can be made to task-level estimates either by returning to the factors sheet and adjusting the units 212 or by entering an override estimate in the 'Used' column 214 on the worksheet. This override may be used when the estimating factor produces a task estimate that is not appropriate for the task, for example, when a task is not required on a particular project.
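The worksheet arithmetic described above can be illustrated with the following sketch; the base days per unit and the difficulty multipliers shown are hypothetical assumptions, since the actual predetermined values are not specified here.

# Hypothetical sketch of the worksheet calculation: a task's default estimate
# is (units of the estimating factor) x (base days per unit) x (a difficulty
# multiplier), and a manual 'Used' override replaces the computed figure.
DIFFICULTY_MULTIPLIER = {"simple": 1.0, "moderate": 1.5, "complex": 2.0}

def task_estimate(units, base_days_per_unit, difficulty, used_override=None):
    """Days planned for one task; a manual 'Used' override wins if supplied."""
    if used_override is not None:
        return used_override
    return units * base_days_per_unit * DIFFICULTY_MULTIPLIER[difficulty]

# Example: 4 SLAs at an assumed 0.5 base days per SLA, rated "moderate".
print(task_estimate(4, 0.5, "moderate"))       # 3.0 days
print(task_estimate(4, 0.5, "moderate", 0.0))  # task not required: 0.0 days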
Next, the provider and the client review and adjust, if necessary, the personnel staffing factors 216, which allocate the estimated time among the seniority levels of personnel needed for the project. Referring to Figs. 16a, b, and c, these columns are designated as Partner - "Ptnr", Manager - "Mgr", Consultant - "Cnslt", and Analyst - "Anlst", respectively. These allocations are adjusted to meet project requirements and are typically based on experience with delivering various stages of a project. It should be noted that the staffing factors should add up to 1.
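The staffing-factor allocation can be sketched as follows, using illustrative factor values; the only constraint taken from the description is that the factors for a task add up to 1.

# Hypothetical sketch: splitting a task's estimated days across the
# seniority levels (Ptnr, Mgr, Cnslt, Anlst). The factor values shown
# are assumptions, not taken from the description.
def allocate_days(task_days, staffing_factors):
    """Distribute a task's days by role; the factors must add up to 1."""
    if abs(sum(staffing_factors.values()) - 1.0) > 1e-9:
        raise ValueError("staffing factors must add up to 1")
    return {role: task_days * share for role, share in staffing_factors.items()}

factors = {"Ptnr": 0.05, "Mgr": 0.15, "Cnslt": 0.40, "Anlst": 0.40}
print(allocate_days(10.0, factors))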
The consultant or provider and the client now review the workplan 218, and may optionally include labor to be provided by the client. In one embodiment, the workplan contains the total time required in days per stage and per task required to complete the project. Tasks may be aggregated into a "task package" of subtasks or activities for convenience. A worksheet, as shown in Figs. 16a, 16b, and 16c, may be used, also for convenience. This worksheet may be used to adjust tasks or times as desired, from the experience of the provider, the customer, or both.
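As a further illustration, the sketch below aggregates hypothetical task-level estimates into the per-stage totals a workplan of the kind described might contain; the day counts are assumptions.

# Hypothetical aggregation of task estimates (days) into per-stage totals.
tasks = [
    ("capability release design", "3552 selecting reuse components", 2.0),
    ("capability release design", "3557 designing and validating architecture", 3.5),
    ("capability release build and test", "5553 building components", 8.0),
]

def days_per_stage(task_list):
    """Total the estimated days for each project stage."""
    totals = {}
    for stage, _task, days in task_list:
        totals[stage] = totals.get(stage, 0.0) + days
    return totals

print(days_per_stage(tasks))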
Finally, a financial estimate is generated in which the provider and client enter the agreed upon billing rates for Ptnr, Mgr, Cnslt, and Anlst 220.
The total estimated payroll cost for the project will then be computed and displayed, generating final estimates. At this point, a determination of out-of-pocket expenses 222 may be applied to the final estimates to determine a final project cost 224. Preferably, the provider will then review the final estimates with an internal functional expert 226.
Other costs may also be added to the project, such as hardware and software purchase costs, project management costs, and the like. Typically, project management costs for managing the provider's work are included in the estimator. These are task dependent and usually run between 10 and 15% of the tasks being managed, depending on the level of difficulty. These management allocations may appear on the worksheet and work plan. The time allocations for planning and managing a project are typically broken down for each of a plurality of task packages, where the task packages are planning project execution 920, organizing project resources 940, controlling project work 960, and completing project 990, as shown in FIG. 16a.
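The financial roll-up described in this and the preceding paragraphs can be sketched as follows; the billing rates, day counts, and the 12% project management allowance used here are hypothetical (the description gives only a 10-15% range).

# Hypothetical sketch: payroll cost from days-by-role and billing rates,
# plus a project management allowance, plus out-of-pocket expenses.
def payroll_cost(days_by_role, rates_by_role):
    """Sum days times daily billing rate over all roles."""
    return sum(days_by_role[r] * rates_by_role[r] for r in days_by_role)

def project_cost(days_by_role, rates_by_role, out_of_pocket, pm_fraction=0.12):
    """Payroll plus a project management allowance plus out-of-pocket expenses."""
    payroll = payroll_cost(days_by_role, rates_by_role)
    return payroll * (1 + pm_fraction) + out_of_pocket

days = {"Ptnr": 2.0, "Mgr": 8.0, "Cnslt": 30.0, "Anlst": 40.0}
rates = {"Ptnr": 3000, "Mgr": 2000, "Cnslt": 1200, "Anlst": 800}
print(project_cost(days, rates, out_of_pocket=5000))  # roughly 105,800 with these assumptions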
It will be appreciated that a wide range of changes and modifications to the method as described are contemplated. Accordingly, while preferred embodiments have been shown and described in detail by way of examples, further modifications and embodiments are possible without departing from the scope of the invention as defined by the examples set forth. It is therefore intended that the invention be defined by the appended claims and all legal equivalents.

Claims

1. A method for providing a service level management function for an IT enterprise, the method comprising:
(a) planning for said service level management function; (b) designing said service level management function;
(c) building said service level management function;
(d) testing said service level management function; and
(e) deploying said service level management function.
2. The method of claim 1 wherein said planning includes: (f) developing a business performance model for said service level management.
3. The method of claim 2 wherein said developing includes at least one of the following:
(g) confirming business architecture; (h) analyzing a plurality of operating constraints;
(i) analyzing a current service level management capability;
(j) identifying a plurality of best practices for said service level management;
(k) defining a plurality of requirements for said service level management; and
(l) developing said business performance model.
4. The method of claim 1 wherein said designing includes:
(f) designing business processes, skills, and user interaction for said service level management.
5. The method of claim 4 wherein said step (f) includes at least one of the following:
(g) designing a plurality of workflows for processes, activities, and tasks for said service level management;
(h) identifying physical environment interactions; (i) identifying skill requirements for performing said service level management;
(j) defining application interactions;
(k) identifying performance support requirements; (l) developing a capability interaction model; and
(m) developing said business processes, skills, and user interaction.
6. The method of claim 1 wherein said designing includes:
(f) designing an organization infrastructure for said service level management.
7. The method of claim 6 wherein said step (f) includes at least one of the following:
(g) designing a plurality of roles, jobs, and teams; (h) designing a competency model; (i) designing a performance management infrastructure;
(j) determining an organization infrastructure mobilization approach; and
(k) developing said organization infrastructure.
8. The method of claim 1 wherein said designing includes: (f) designing a performance enhancement infrastructure for said service level management.
9. The method of claim 8 wherein said step (f) includes at least one of the following:
(g) assessing employee competency and performance for a current service level management;
(h) determining performance enhancement needs;
(i) designing performance enhancement products;
(j) defining a learning test approach; and
(k) developing said performance enhancement infrastructure.
10. The method of claim 1 wherein said designing includes:
(f) designing a technology infrastructure for said service level management.
11. The method of claim 10 wherein said step (f) includes at least one of the following:
(g) preparing a technology infrastructure performance model; (h) analyzing a plurality of technology infrastructure component requirements;
(i) assessing a current technology infrastructure; (j) developing a technology infrastructure design; and
(k) planning a technology infrastructure product test.
12. The method of claim 1 wherein said designing includes:
(f) designing operations architecture for said service level management.
13. The method of claim 12 wherein said step (f) includes at least one of the following:
(g) identifying operations architecture components;
(h) selecting reuse operations architecture components; (i) selecting packaged operations architecture components; (j) designing custom operations architecture components; and
(k) designing the operations architecture.
14. The method of claim 10 wherein said testing includes:
(g) validating said technology infrastructure for said service level management.
15. The method of claim 14 wherein said validating includes at least one of the following:
(h) reviewing said technology infrastructure; (i) establishing an environment for validating said technology infrastructure;
(j) validating said technology infrastructure; and
(k) analyzing an impact of said technology infrastructure.
16. The method of claim 14 wherein said building includes:
(h) acquiring a plurality of technology infrastructure components for said technology infrastructures.
17. The method of claim 16 wherein said acquiring includes at least one of the following: (i) defining acquisition criteria;
(j) selecting vendors for said technology infrastructure components;
(k) appointing said vendors;
(l) evaluating deployment implications of said selecting and appointing; and
(m) testing said technology infrastructure components for acceptance.
18. The method of claim 13 wherein said building includes: (l) building said operations architecture components.
19. The method of claim 18 wherein said testing includes at least one of the following:
(m) testing said operations architecture components; and (n) testing said operations architecture.
20. The method of claim 1 wherein said building includes: (f) developing policies, procedures, and performance support for said service level management.
21. The method of claim 20 wherein said developing includes at least one of the following: (g) developing business policies and procedures;
(h) developing user procedures;
(i) developing reference materials and job aids; and
(j) validating said policies, procedures, and reference materials.
22. The method of claim 1 wherein said building includes:
(f) developing learning products for said service level management.
23. The method of claim 22 wherein said developing includes at least one of the following:
(g) developing learning products standards; (h) prototyping said learning products;
(i) building said learning products; and (j) testing said learning products.
24. The method of claim 16 wherein said testing includes:
(i) testing said technology infrastructure for said service level management.
25. The method of claim 24 wherein said step (i) includes at least one of the following: (j) preparing a plurality of test models for said technology infrastructure;
(k) executing a technology infrastructure product test; (l) executing a technology infrastructure deployment test; and (m) executing a technology infrastructure configuration test.
26. The method of claim 24 wherein said deploying includes: (j) deploying said technology infrastructure for said production services.
27. The method of claim 26 wherein said step (j) includes at least one of the following:
(k) configuring said technology infrastructure; (l) installing said technology infrastructure; and (m) verifying said technology infrastructure.
28. A method for providing an estimate for building a service level management function in an information technology organization, the method comprising: (a) obtaining a plurality of estimating factors;
(b) determining a difficulty rating for each of said estimating factors;
(c) generating a time allocation for building said service level management based on said estimating factor and said difficulty rating; and (d) generating a cost for building said service level management based on said time allocation.
29. The method as recited in claim 28, wherein obtaining said estimating factor further includes receiving said estimating factors from a client.
30. The method as recited in claim 28, wherein said estimating factors include the number of at least one of acquired components, current service desks, current roles, current services, future service desks, future roles, new personnel, new services, personnel, services, software components, and vendors.
31. The method as recited in claim 28, wherein said difficulty rating is selected from the group of simple, moderate, or complex.
32. The method as recited in claim 28, wherein said time allocation includes time allocated for a plurality of individual team members where said individual team members include at least one of partner, manager, consultant, and analyst.
33. The method as recited in claim 28, wherein said cost depends on said time allocation and a billing rate for said individual team member.
34. The method as recited in claim 28, wherein said cost is broken down for each of a plurality of stages for building said service level management where said stages include at least one of plan and manage, capability analysis, capability release design, capability release build and test, and deployment stages.
35. The method as recited in claim 28, wherein said time allocation is used to generate a project work plan.
36. The method as recited in claim 28, wherein said billing rate is used to generate a financial summary of said cost.
37. The method as recited in claim 35, wherein said work plan is broken down for each of a plurality of stages for building said service level management where said stages are plan and manage, capability analysis, capability release design, capability release build and test, and deployment.
38. The method as recited in claim 37, wherein said plan and manage stage is broken down for each of a plurality of task packages where said task packages are plan project execution, organize project resources, control project work, and project complete.
39. A computer system for allocating time and computing cost for building a service level management function in an information technology organization, comprising: a processor; a software program for receiving a plurality of estimating factors and difficulty rating for each of said estimating factors and generating a time allocation and cost for building said service level management; and a memory that stores said time allocation and cost under control of said processor.
PCT/US2000/027803 1999-10-06 2000-10-06 Method and estimator for providing service level management WO2001026013A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU11936/01A AU1193601A (en) 1999-10-06 2000-10-06 Method and estimator for providing service level management

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15825999P 1999-10-06 1999-10-06
US60/158,259 1999-10-06

Publications (1)

Publication Number Publication Date
WO2001026013A1 true WO2001026013A1 (en) 2001-04-12

Family

ID=22567316

Family Applications (12)

Application Number Title Priority Date Filing Date
PCT/US2000/027804 WO2001026014A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing service control
PCT/US2000/027856 WO2001025970A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing operations maturity model assessment
PCT/US2000/027796 WO2001026010A1 (en) 1999-10-06 2000-10-06 Method and estimator for production scheduling
PCT/US2000/027518 WO2001026005A1 (en) 1999-10-06 2000-10-06 Method for determining total cost of ownership
PCT/US2000/027795 WO2001025876A2 (en) 1999-10-06 2000-10-06 Method and estimator for providing capacity modeling and planning
PCT/US2000/027803 WO2001026013A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing service level management
PCT/US2000/027629 WO2001026008A1 (en) 1999-10-06 2000-10-06 Method and estimator for event/fault monitoring
PCT/US2000/027857 WO2001025877A2 (en) 1999-10-06 2000-10-06 Organization of information technology functions
PCT/US2000/027802 WO2001026012A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing storage management
PCT/US2000/027593 WO2001026028A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing change control
PCT/US2000/027801 WO2001026011A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing operation management strategic planning
PCT/US2000/027592 WO2001026007A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing business recovery planning

Family Applications Before (5)

Application Number Title Priority Date Filing Date
PCT/US2000/027804 WO2001026014A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing service control
PCT/US2000/027856 WO2001025970A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing operations maturity model assessment
PCT/US2000/027796 WO2001026010A1 (en) 1999-10-06 2000-10-06 Method and estimator for production scheduling
PCT/US2000/027518 WO2001026005A1 (en) 1999-10-06 2000-10-06 Method for determining total cost of ownership
PCT/US2000/027795 WO2001025876A2 (en) 1999-10-06 2000-10-06 Method and estimator for providing capacity modeling and planning

Family Applications After (6)

Application Number Title Priority Date Filing Date
PCT/US2000/027629 WO2001026008A1 (en) 1999-10-06 2000-10-06 Method and estimator for event/fault monitoring
PCT/US2000/027857 WO2001025877A2 (en) 1999-10-06 2000-10-06 Organization of information technology functions
PCT/US2000/027802 WO2001026012A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing storage management
PCT/US2000/027593 WO2001026028A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing change control
PCT/US2000/027801 WO2001026011A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing operation management strategic planning
PCT/US2000/027592 WO2001026007A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing business recovery planning

Country Status (4)

Country Link
EP (2) EP1222510A4 (en)
AU (12) AU1653901A (en)
CA (1) CA2386788A1 (en)
WO (12) WO2001026014A1 (en)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2002256550A1 (en) * 2000-12-11 2002-06-24 Skill Development Associates Ltd Integrated business management system
US7937281B2 (en) 2001-12-07 2011-05-03 Accenture Global Services Limited Accelerated process improvement framework
US7035809B2 (en) * 2001-12-07 2006-04-25 Accenture Global Services Gmbh Accelerated process improvement framework
AU2003282996A1 (en) 2002-10-25 2004-05-25 Science Applications International Corporation Determining performance level capabilities using predetermined model criteria
DE10331207A1 (en) 2003-07-10 2005-01-27 Daimlerchrysler Ag Method and apparatus for predicting failure frequency
US8572003B2 (en) * 2003-07-18 2013-10-29 Sap Ag Standardized computer system total cost of ownership assessments and benchmarking
US8566147B2 (en) * 2005-10-25 2013-10-22 International Business Machines Corporation Determining the progress of adoption and alignment of information technology capabilities and on-demand capabilities by an organization
EP1808803A1 (en) * 2005-12-15 2007-07-18 International Business Machines Corporation System and method for automatically selecting one or more metrics for performing a CMMI evaluation
US8457297B2 (en) 2005-12-30 2013-06-04 Aspect Software, Inc. Distributing transactions among transaction processing systems
US8355938B2 (en) 2006-01-05 2013-01-15 Wells Fargo Bank, N.A. Capacity management index system and method
US7523082B2 (en) * 2006-05-08 2009-04-21 Aspect Software Inc Escalating online expert help
US20080208667A1 (en) * 2007-02-26 2008-08-28 Gregg Lymbery Method for multi-sourcing technology based services
EP2210227A2 (en) * 2007-10-25 2010-07-28 Markport Limited Modification of service delivery infrastructure in communication networks
US8326660B2 (en) 2008-01-07 2012-12-04 International Business Machines Corporation Automated derivation of response time service level objectives
US8320246B2 (en) * 2009-02-19 2012-11-27 Bridgewater Systems Corp. Adaptive window size for network fair usage controls
US8200188B2 (en) 2009-02-20 2012-06-12 Bridgewater Systems Corp. System and method for adaptive fair usage controls in wireless networks
US9203629B2 (en) 2009-05-04 2015-12-01 Bridgewater Systems Corp. System and methods for user-centric mobile device-based data communications cost monitoring and control
US8577329B2 (en) 2009-05-04 2013-11-05 Bridgewater Systems Corp. System and methods for carrier-centric mobile device data communications cost monitoring and control
US20110066476A1 (en) * 2009-09-15 2011-03-17 Joseph Fernard Lewis Business management assessment and consulting assistance system and associated method
WO2012057747A1 (en) 2010-10-27 2012-05-03 Hewlett-Packard Development Company, L.P. Systems and methods for scheduling changes
WO2015126409A1 (en) 2014-02-21 2015-08-27 Hewlett-Packard Development Company, L.P. Migrating cloud resources
US10148757B2 (en) 2014-02-21 2018-12-04 Hewlett Packard Enterprise Development Lp Migrating cloud resources
US20170032297A1 (en) * 2014-04-03 2017-02-02 Dale Chalfant Systems and Methods for Increasing Capability of Systems of Business Through Maturity Evolution
US10044786B2 (en) 2014-11-16 2018-08-07 International Business Machines Corporation Predicting performance by analytically solving a queueing network model
US9984044B2 (en) 2014-11-16 2018-05-29 International Business Machines Corporation Predicting performance regression of a computer system with a complex queuing network model
US10460272B2 (en) * 2016-02-25 2019-10-29 Accenture Global Solutions Limited Client services reporting
CN106682385B (en) * 2016-09-30 2020-02-11 广州英康唯尔互联网服务有限公司 Health information interaction system
JP7246407B2 (en) * 2018-04-16 2023-03-27 クラウドブルー エルエルシー Systems and methods for aligning revenue streams in a cloud service broker platform
US11481711B2 (en) 2018-06-01 2022-10-25 Walmart Apollo, Llc System and method for modifying capacity for new facilities
CA3101836A1 (en) 2018-06-01 2019-12-05 Walmart Apollo, Llc Automated slot adjustment tool
US11483350B2 (en) 2019-03-29 2022-10-25 Amazon Technologies, Inc. Intent-based governance service
CN110096423A (en) * 2019-05-14 2019-08-06 深圳供电局有限公司 A kind of server memory capacity analyzing and predicting method based on big data analysis
US11119877B2 (en) 2019-09-16 2021-09-14 Dell Products L.P. Component life cycle test categorization and optimization
WO2021096893A1 (en) * 2019-11-11 2021-05-20 Snapit Solutions Llc System for producing and delivering information technology products and services
US11288150B2 (en) 2019-11-18 2022-03-29 Sungard Availability Services, Lp Recovery maturity index (RMI)-based control of disaster recovery
US20210160143A1 (en) 2019-11-27 2021-05-27 Vmware, Inc. Information technology (it) toplogy solutions according to operational goals
US11501237B2 (en) 2020-08-04 2022-11-15 International Business Machines Corporation Optimized estimates for support characteristics for operational systems
US11329896B1 (en) 2021-02-11 2022-05-10 Kyndryl, Inc. Cognitive data protection and disaster recovery policy management

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4827423A (en) * 1987-01-20 1989-05-02 R. J. Reynolds Tobacco Company Computer integrated manufacturing system
JPH03111969A (en) * 1989-09-27 1991-05-13 Hitachi Ltd Method for supporting plan formation
US5233513A (en) * 1989-12-28 1993-08-03 Doyle William P Business modeling, software engineering and prototyping method and apparatus
WO1993012488A1 (en) * 1991-12-13 1993-06-24 White Leonard R Measurement analysis software system and method
US5701419A (en) * 1992-03-06 1997-12-23 Bell Atlantic Network Services, Inc. Telecommunications service creation apparatus and method
US5586021A (en) * 1992-03-24 1996-12-17 Texas Instruments Incorporated Method and system for production planning
US5646049A (en) * 1992-03-27 1997-07-08 Abbott Laboratories Scheduling operation of an automated analytical system
US5978811A (en) * 1992-07-29 1999-11-02 Texas Instruments Incorporated Information repository system and method for modeling data
US5630069A (en) * 1993-01-15 1997-05-13 Action Technologies, Inc. Method and apparatus for creating workflow maps of business processes
US5819270A (en) * 1993-02-25 1998-10-06 Massachusetts Institute Of Technology Computer system for displaying representations of processes
CA2118885C (en) * 1993-04-29 2005-05-24 Conrad K. Teran Process control system
AU7207194A (en) * 1993-06-16 1995-01-03 Electronic Data Systems Corporation Process management system
US5485574A (en) * 1993-11-04 1996-01-16 Microsoft Corporation Operating system based performance monitoring of programs
US5724262A (en) * 1994-05-31 1998-03-03 Paradyne Corporation Method for measuring the usability of a system and for task analysis and re-engineering
US5563951A (en) * 1994-07-25 1996-10-08 Interval Research Corporation Audio interface garment and communication system for use therewith
US5745880A (en) * 1994-10-03 1998-04-28 The Sabre Group, Inc. System to predict optimum computer platform
JP3315844B2 (en) * 1994-12-09 2002-08-19 株式会社東芝 Scheduling device and scheduling method
JPH08320855A (en) * 1995-05-24 1996-12-03 Hitachi Ltd Method and system for evaluating system introduction effect
EP0770967A3 (en) * 1995-10-26 1998-12-30 Koninklijke Philips Electronics N.V. Decision support system for the management of an agile supply chain
US5875431A (en) * 1996-03-15 1999-02-23 Heckman; Frank Legal strategic analysis planning and evaluation control system and method
US5960417A (en) * 1996-03-19 1999-09-28 Vanguard International Semiconductor Corporation IC manufacturing costing control system and process
US5960200A (en) * 1996-05-03 1999-09-28 I-Cube System to transition an enterprise to a distributed infrastructure
US5673382A (en) * 1996-05-30 1997-09-30 International Business Machines Corporation Automated management of off-site storage volumes for disaster recovery
US5864483A (en) * 1996-08-01 1999-01-26 Electronic Data Systems Corporation Monitoring of service delivery or product manufacturing
US5974395A (en) * 1996-08-21 1999-10-26 I2 Technologies, Inc. System and method for extended enterprise planning across a supply chain
US5930762A (en) * 1996-09-24 1999-07-27 Rco Software Limited Computer aided risk management in multiple-parameter physical systems
US6044354A (en) * 1996-12-19 2000-03-28 Sprint Communications Company, L.P. Computer-based product planning system
US5903478A (en) * 1997-03-10 1999-05-11 Ncr Corporation Method for displaying an IT (Information Technology) architecture visual model in a symbol-based decision rational table
US6028602A (en) * 1997-05-30 2000-02-22 Telefonaktiebolaget Lm Ericsson Method for managing contents of a hierarchical data model
US6106569A (en) * 1997-08-14 2000-08-22 International Business Machines Corporation Method of developing a software system using object oriented technology
US6092047A (en) * 1997-10-07 2000-07-18 Benefits Technologies, Inc. Apparatus and method of composing a plan of flexible benefits
US6131099A (en) * 1997-11-03 2000-10-10 Moore U.S.A. Inc. Print and mail business recovery configuration method and system
US6119097A (en) * 1997-11-26 2000-09-12 Executing The Numbers, Inc. System and method for quantification of human performance factors
US6157916A (en) * 1998-06-17 2000-12-05 The Hoffman Group Method and apparatus to control the operating speed of a papermaking facility

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793632A (en) * 1996-03-26 1998-08-11 Lockheed Martin Corporation Cost estimating system using parametric estimating and providing a split of labor and material costs

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BRIMSON J.A.: "Activity accounting: An activity-based costing approach", 1991, JOHN WILEY & SONS, INC., XP002937533 *
DAVIS W.S. ET AL.: "The information system consultant's handbook: Systems analysis and design", 1 December 1998, CRC PRESS, XP002937537 *
KERZNER H. PHD: "Project management: A systems approach to planning, scheduling and controlling", 1995, XP002937534 *
OSTERLE H. ET AL.: "Total information system management: A European approach", 1993, JOHN WILEY & SONS, LTD., XP002937536 *
WARD J. ET AL.: "Strategic planning for information systems", 1996, JOHN WILEY & SONS, LTD., XP002937535 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110231229A1 (en) * 2010-03-22 2011-09-22 Computer Associates Think, Inc. Hybrid Software Component and Service Catalog

Also Published As

Publication number Publication date
WO2001025876A3 (en) 2001-08-30
AU8001700A (en) 2001-05-10
WO2001026011A1 (en) 2001-04-12
AU7756600A (en) 2001-05-10
EP1222510A2 (en) 2002-07-17
WO2001026008A1 (en) 2001-04-12
AU1193801A (en) 2001-05-10
WO2001026012A1 (en) 2001-04-12
AU7996100A (en) 2001-05-10
WO2001026010A1 (en) 2001-04-12
WO2001025970A1 (en) 2001-04-12
WO2001026005A1 (en) 2001-04-12
WO2001025876A2 (en) 2001-04-12
WO2001026028A1 (en) 2001-04-12
AU1653901A (en) 2001-05-10
EP1226523A1 (en) 2002-07-31
AU7866600A (en) 2001-05-10
WO2001025877A3 (en) 2001-09-07
WO2001026028A8 (en) 2001-07-26
EP1226523A4 (en) 2003-02-19
WO2001025970A8 (en) 2001-09-27
AU1431801A (en) 2001-05-10
EP1222510A4 (en) 2007-10-31
WO2001026014A1 (en) 2001-04-12
WO2001026007A1 (en) 2001-04-12
AU1431701A (en) 2001-05-10
AU1193601A (en) 2001-05-10
WO2001025877A2 (en) 2001-04-12
AU7996000A (en) 2001-05-10
CA2386788A1 (en) 2001-04-12
AU7861800A (en) 2001-05-10
AU8001800A (en) 2001-05-10

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP