Typically, these processes and methods are left to the outsourcing company, which must ensure that they can support the SLA. It is nevertheless recommended that the client and the outsourcing company collaborate during SLA negotiation in order to eliminate misunderstandings about the support processes and methods, as well as about management and reporting.

The underlying advantage of cloud computing is resource sharing, supported by a common infrastructure environment. SLAs therefore span the entire cloud and are offered by service providers as a standard service agreement rather than a customer-specific agreement. Measuring, monitoring, and reporting on cloud performance is based on the end-user experience or on the ability to consume resources. A disadvantage of cloud computing with respect to SLAs is the difficulty of determining the cause of service interruptions, owing to the complexity of the environment.

The purpose of an SLA for a SaaS service is to clarify the service requirements, in particular by ensuring that metrics reflect factors that are under the control of the service provider. To motivate good behavior, SLA metrics must reflect factors that lie within the provider's control. A typical mistake is to sanction the service provider for delays caused by the customer's own lack of performance. For example, when the customer delivers application code change specifications several weeks late, it is unfair and demotivating to hold the service provider to the originally agreed delivery date. Designing the SLA from both sides, by also measuring the customer's performance in interdependent actions, is a good way to keep the focus on the expected results. Service level agreements can contain many service performance metrics with corresponding service level objectives.
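The two-sided principle described above can be sketched in code. The following is a minimal illustrative example, not part of any real SLA tooling: it assumes a hypothetical rule in which the provider's delivery deadline is credited with any delay the customer caused, so the provider is only in breach relative to the adjusted date.

```python
from datetime import date, timedelta

def adjusted_due_date(agreed_due: date, customer_delay_days: int) -> date:
    """Shift the provider's delivery deadline by the delay the
    customer caused (e.g. late change specifications)."""
    return agreed_due + timedelta(days=customer_delay_days)

def provider_in_breach(delivered: date, agreed_due: date,
                       customer_delay_days: int = 0) -> bool:
    """Two-sided view: the provider is in breach only if delivery
    misses the deadline *after* customer-caused delays are credited."""
    return delivered > adjusted_due_date(agreed_due, customer_delay_days)

# The customer delivered specifications 14 days late; the provider
# ships 10 days after the original date but within the adjusted one.
print(provider_in_breach(date(2024, 3, 11), date(2024, 3, 1),
                         customer_delay_days=14))  # False
```

The function names and the simple day-based crediting rule are assumptions made for illustration; a real contract would define the crediting mechanism precisely.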
A common case in IT service management is a call center or service desk, for which there are generally agreed metrics.

[Figure 5: Trend diagram of the availability of information system IS X over the IT audit period, with regression line (availability plotted against time).]

To determine the extent to which services meet the agreed performance requirements, the corresponding performance indicators must be measured and evaluated. In principle this should always be done by the client, since otherwise a conflict of interest arises. However, the customer does not always have the means and know-how to carry out the measurement and evaluation, whereas the service provider, as a rule, does. The following arrangement can resolve this problem: the supplier measures and evaluates the relevant indicators, and the customer commissions a third party, an IT auditor, to verify whether the supplier uses sound methods for measuring and evaluating the indicators and to what extent the supplier actually follows these procedures. The IT auditor works with spot checks, known as IT audits.

Benchmarking is the comparison of values measured for the same performance indicators under similar circumstances but in different situations or organizations. This makes it possible, for example, to determine the performance levels that different suppliers can achieve. However, several aspects can make benchmarking harder to use. Each organization is unique, which makes it difficult to create comparable conditions across different organizations.
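A trend line such as the one in Figure 5 can be computed with an ordinary least-squares fit over periodic availability measurements. The sketch below uses only the standard library; the monthly availability figures are invented sample data, not measurements from IS X.

```python
def trend(values):
    """Return slope and intercept of the least-squares line
    y = slope * x + intercept for x = 0, 1, 2, ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Illustrative monthly availability readings (%) over an audit period.
monthly_availability = [99.1, 99.3, 99.0, 99.4, 99.5, 99.6]
slope, intercept = trend(monthly_availability)
print(f"trend: {slope:+.3f} %/month")  # a positive slope means improving
```

An auditor verifying the supplier's evaluation could rerun exactly such a computation on the raw measurement data as a spot check.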
Performance measurements also show some statistical dispersion; a sound comparative result therefore requires frequent measurements, which entails considerable cost and effort. Moreover, performance data may remain confidential, in which case a comparison is not possible because the necessary data are not available.
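The role of statistical dispersion in benchmarking can be made concrete with a small sketch. The provider names and availability samples below are invented for illustration: two providers with similar mean availability can still differ markedly in spread, which is why single readings are not a sound basis for comparison.

```python
import statistics

# Illustrative benchmark: repeated availability measurements (%) from
# two hypothetical providers. Because individual readings scatter, we
# compare means and report the standard deviation as a caveat.
provider_a = [99.2, 99.5, 99.1, 99.4, 99.3]
provider_b = [99.6, 98.8, 99.9, 98.9, 99.8]

for name, sample in [("A", provider_a), ("B", provider_b)]:
    print(f"{name}: mean={statistics.mean(sample):.2f} "
          f"stdev={statistics.stdev(sample):.2f}")
```

Here provider B has the slightly higher mean but roughly three times the dispersion, so more measurements would be needed before preferring one over the other.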