Merlin White paper - VTT project pages server

Academic year: 2023


Merlin White paper

Adaptability evaluation of software architectures

The Merlin project addresses the increasing demand to find and discover new, efficient ways to support collaborative embedded systems development. Embedded systems are increasingly developed globally in collaboration with partners such as subcontractors, third-party developers and in-house developers.

The Merlin project established working practices that build upon the benefits of collaboration while neutralising its negative effects. Central to this approach is the industrial application of technologies: no technology can be considered for industrial usage without observed and proven experience of its benefit in practice. All Merlin results have been validated for applicability in specified industrial environments.

This white paper summarises one of the Merlin results. If you are interested in more information on the project, or in the full deliverable, please get in touch with us via the website: http://www.merlinproject.org/

We are also interested in your opinion and feedback. Our contact details can be found on the Merlin website.

Author(s):

Pentti Tarvainen, VTT

Merlin

- Embedded Systems Engineering in Collaboration

The Merlin Consortium consists of:

Incode, Ericsson, LogicaCMG, Lund University, Nokia, Philips, Solid, Sony Ericsson, TU Delft, University of Oulu, VTT Technical Research Centre of Finland

© Copyright MERLIN Handbook


TABLE OF CONTENTS

TERMINOLOGY

1. INTRODUCTION

2. RELATED WORK

3. DEFINITION OF ADAPTABILITY

4. DESCRIPTION OF ADAPTABILITY EVALUATION METHOD (AEM)

4.1 Phase 1: Defining Adaptability Goals

4.1.1 Identifying Stakeholders and Their Concerns

4.1.2 Refining of Adaptability Requirements

4.1.3 Mapping of Adaptability Requirements to Functionality

4.1.4 Selecting of Architectural Styles and Patterns and Performing of a Trade-Off Analysis

4.1.5 Defining of Criteria for Adaptability Evaluation

4.2 Phase 2: Representing Adaptability in Architectural Models

4.2.1 Mapping of Required Adaptability to Conceptual Architectural Elements

4.2.2 Mapping from Conceptual Architecture to Concrete Architecture

4.2.3 Mapping of Provided Adaptability to Concrete Architectural Elements

4.3 Phase 3: Adaptability Evaluation

4.3.1 Quantitative Analysis

4.3.2 Qualitative Analysis

4.3.3 Decision Making Based on the Analysis

5. VALIDATION OF AEM

6. SUMMARY AND CONCLUSIONS

REFERENCES


Merlin white paper: Adaptability evaluation of software architectures

TERMINOLOGY

Adaptability scenario A description of system behavior driven by a change requirement, covering the use of the system, its reaction to the change requirement, and potential future changes, all of which relate to adaptability. An adaptability scenario includes the actions and events in the system and the change requirement that triggers them.

Adaptability scenario profile A set of related Adaptability Scenarios.

Architectural Model Represents a system in terms of its principal run time parts (components) and their pathways of communication (connectors).

Architectural Pattern An idiom describing how to build a software system; architectural patterns express fundamental structural schemas for software systems.

Architectural Style Provides a specialised language for a specific class of systems that are related by shared structural and semantic properties.

Architectural View Architectural views form the basis of design methods. The architectural description is organised by views, which conform to the viewpoints selected. A viewpoint covers one or more concerns of one or more stakeholders who are interested in the software architecture.

External Adaptation System adaptation is handled outside of the application. Systems are monitored for various attributes, such as resource utilization, reliability and delivered quality of service.

Internal Adaptation System adaptation is handled inside of the application. Applications typically use generic mechanisms such as exception handling, or heartbeat mechanisms to trigger application-specific responses to an observed fault.

Middleware The software layer between the operating system – including the basic communication protocols – and the distributed applications that interact via the network.

Model-Driven Development Model-Driven Development (MDD) treats models as first-class design entities. Modelling provides a view of a complex problem and its solutions that is less risky, cheaper and easier to understand than an implementation of the real target.

Model-driven Software Architecture

Model-Driven Architecture (MDA) is a framework for standards that will enable model-driven development (MDD). Model-driven Software Architecture development focuses on models as primary software products.

Model-Driven Architecture is defined as “an OMG initiative that proposes to define a set of non-proprietary standards that will specify interoperable technologies with which to realize model-driven development with automated transformations”.

The concept of Model-Driven Architecture rests on three types of models at different abstraction levels: the computation independent model (CIM), the platform independent model (PIM) and the platform specific model (PSM). The computation independent model shows the system in the environment where it will operate. The platform independent model concentrates on the operation of the system while hiding the details of the underlying platform; the PIM is computationally complete, meaning that it is possible to execute the system defined by this model. The platform specific model is a realization of the PIM with all the details of the chosen platform.
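The CIM/PIM/PSM idea above can be illustrated with a small sketch. This is a hypothetical example, not part of the MDA standard or the Merlin deliverable: the rule table, model layout and connector names are all assumptions made for illustration only.

```python
# Hypothetical sketch of the MDA transformation idea: a platform-independent
# model (PIM) is realized as a platform-specific model (PSM) by an automated
# mapping. All names and structures here are illustrative.

def pim_to_psm(pim: dict, platform: str) -> dict:
    """Realize a PIM element on a concrete platform."""
    # Platform mapping rules: how abstract connector types are realized.
    rules = {
        "corba": {"remote_call": "CORBA IIOP invocation"},
        "web":   {"remote_call": "HTTP/REST request"},
    }
    mapping = rules[platform]
    return {
        "component": pim["component"],
        # Each platform-independent connector is replaced by its
        # platform-specific realization.
        "connectors": [mapping[c] for c in pim["connectors"]],
        "platform": platform,
    }

pim = {"component": "SensorGateway", "connectors": ["remote_call"]}
psm = pim_to_psm(pim, "web")
print(psm["connectors"])  # ['HTTP/REST request']
```

The same PIM can be fed through different rule sets, which is the point of keeping the platform details out of the model.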

Product Family Architecture Product Family Architecture (PFA) is a software structure that is common to all products of a product family; it is derived from the products of the family.

Because the product family members share the same architecture, and because the architecture constrains the quality attributes of a system, choosing the right architecture is essential. This is especially true for a PFA, as its quality attributes affect the whole product family.

Product Line Architecture A software product line is a set of products sharing common features and architecture, but which also have product specific features. The system family concept is equivalent to a product line, signifying a family of software specific systems.


Quality-Aware Middleware

A middleware solution that adopts a control architecture to monitor and improve the quality-of-service parameters of middleware systems.

Reference Architecture Reference Architecture is an informal or formal architectural model that is accepted and used community-wide.

CASE tools and application generators are typical examples that utilize informal reference architecture without an exact specification of it.

Open systems are based on formal reference architectures with interface specifications that are fully defined, freely available, distributed in the form of standards, and controlled by a group of vendors and users. ODP (Open Distributed Processing) is a typical example of standard-based reference architecture used in heterogeneous distributed systems. The established OMA (Open Mobile Alliance) tries to harmonize and consolidate the standardization efforts of mobile services.

Service Architecture Service Architecture (SA) is a set of concepts and principles for the specification, design, implementation, and management of software services. The service architecture refers mostly to the software architecture of applications and middleware.

Service architecture defines concepts and principles for developing and maintaining services that achieve the required quality with minimum cost and faster time-to-market.

Social Pattern Idiom inspired by social and intentional characteristics used to design the details of system architecture.

Software Product Family

Software Product Family (SPF) is a family of products sharing a set of common properties and architecture - Product Family Architecture (PFA).

Software Architecture Software Architecture (SA) is the fundamental organization of a software system embodied in its components, their relationships to each other and to the environment. SA also includes the principles guiding its design and evolution, and therefore has a strong influence over the life cycle of a system.

The intention of software architecture is communicative. That is why software architecture has to be described in several ways, i.e., to present views of the architecture in a certain light to the defined stakeholders, e.g., customers, marketing and production staff, technical and administrative managers, in addition to the software and hardware developers. Different stakeholders need information at different levels of abstraction and aggregation. Therefore, one kind of architectural description is not enough and multiple views are needed.

The notion of software architecture has emerged as the appropriate level for dealing with software quality. Efforts towards the systematic use of architectural styles and design patterns contribute, in an informal way, to guaranteeing the quality of a design.

Software Family Engineering

The core idea in Software Family Engineering (SFE) is to use as much of the same software assets as possible, i.e. requirements, architecture, components, test cases, etc., in all family members.

The adoption of the SFE approach involves not only remarkable investments, but also changes in the work organisation and in the use of special development methods and techniques. Thus, a company intending to apply SFE in software production should investigate the benefits, preconditions and constraints entailed in the application of SFE, evaluate the maturity of existing architectures and estimate which kind of software family architecture best suits a company.

Structural Adaptability System feature where the system structure remains the same while the architectural elements of the system can be modified or replaced.

System Architecture System Architecture is the fundamental organization of a system embodied in its components, their relationships to each other and to the environment. System Architecture also includes the principles guiding its design and evolution and therefore it has a strong influence over the life cycle of a system.



1. INTRODUCTION

Technology, environment, and user requirements change rapidly in the domain of software engineering. As a result, adaptability has become one of the key features and an increasingly important factor for the survival and success of software systems, which must accommodate variable resources, system errors, and changing stakeholder objectives. Adaptive software systems should be able to adapt their functionality, even at runtime, to behavioral and structural changes that occur either internally or externally in their operating environment. Software architectures for such systems should be flexible enough to allow components to change their pattern depending on environmental changes, the goals of the system and the stakeholders' objectives, without changing the actual components themselves [1, 2]. Software architecture is defined as "the fundamental organization and behavior of a system in terms of components and connectors" [3].

The adaptability evaluation of today's software systems is challenging owing to their complexity, large-scale requirements and, often, their distribution. Due to the complicated nature of today's software systems and the shortcomings of the existing evaluation methods, a new method is required for evaluating the adaptability of software systems from architectural models. Adaptability evaluation is not just about analysis; it also requires that the entire system development approach be refined, starting from the gathering of the requirements. All of the sources of adaptability requirements should be identified, and the requirements should be negotiated so that the best possible requirement set can be identified. The evaluation method should help to validate whether or not the adaptability requirements are met in the architecture. This evaluation should be performed for each candidate architectural solution, and the candidate that best meets the requirements can subsequently be selected. The adaptability of today's software systems should be analyzable before system implementation, i.e. when corrections and modifications are easier and cheaper to perform and the design decisions can still be influenced.

The contribution of this white paper is the Adaptability Evaluation Method (AEM) [1, 2], which supports architecture improvement and decision making when choosing among candidate architectures. AEM defines (1) how the adaptability requirements should be negotiated and mapped to the architecture, (2) how adaptability requirements should be represented in the architectural models, and (3) how the architecture should be analyzed in order to validate whether or not the adaptability requirements are met. AEM fills the gap from requirements engineering to evaluation and provides a systematic framework, notation extensions, techniques and guidelines for adaptability evaluation at the architecture level. AEM has been validated with a real-world wireless environmental control system [2].

The rest of this white paper is organized as follows. Section 2 briefly discusses related work. Section 3 discusses the diversity of adaptability definitions and proposes a definition for system- and architecture-level adaptability. Section 4 provides a detailed description of AEM. Section 5 discusses the validation process of AEM, and finally, Section 6 summarizes and concludes the white paper.


2. RELATED WORK

In the literature, only a few evaluation or analysis methods have been published on software architecture adaptability [1, 2]. Methods such as ATAM [4], ALRRA [5] and ALMA [6] focus on quality attributes of software systems such as performance, modifiability, flexibility, maintainability, portability and variability, and on trade-offs between them, but none of them focuses directly on the adaptability characteristic of software architectures [1, 2]. Paper [7] proposes a descriptive method for analyzing self-adaptive software. The method posits that self-adaptive software should include at least two components: (1) a deliberative component and (2) a reactive component. The method is based on feedback control and feed-forward control theory. It does not define adaptability and does not analyze adaptability in depth; it is only a coarse qualitative analysis method. Paper [8] presents a process-oriented metric for software architecture adaptability. The method analyzes the degree of adaptability through an intuitive decomposition of goals and intuitive scoring of how well the software architecture satisfies each goal. The method can find some defects in the architecture, but it depends too heavily on intuition and expert expertise, which leads to considerable uncertainty. Paper [9] proposes a quantitative evaluation approach based on an adaptability scenario profile, impact analysis on the scenarios and calculation of their adaptability degree. However, the method does not consider evaluation of the qualitative aspects of software architectures [1, 2].



3. DEFINITION OF ADAPTABILITY

In the literature, adaptability related to software engineering is defined in various ways. For example, in [10] adaptability is defined as "the system which can adjust its behavior according to changing of the environments" and in [11] as "the ability of software to adapt its functionality according to the current environment or user". Despite these different notions, there is no explicit, extensive, and concrete definition of adaptability for a software system or software architecture [1, 2].

The quality of software is one of the major issues in software systems, and it is important to evaluate it as early as possible [3, 11]. ISO/IEC 9126-1 [12] defines a Software Quality Model with six main categories of quality attributes: functionality, reliability, usability, efficiency, maintainability and portability. Each of these quality categories is divided into several sub-characteristics. In ISO/IEC 9126-1, adaptability is categorized as a sub-characteristic of portability and is defined as "the capability of the software product to be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered".

Quality evaluation of software can be performed from the descriptions of the software architecture with the help of quality characteristics, i.e. quality attributes, or qualities for short [3, 11]. Furthermore, it is widely accepted that software requirements consist of functional and Non-Functional Requirements (NFRs), the latter also referred to as qualities or system properties [13]. NFRs can be categorized into development and operational qualities [11, 14]. Development NFRs are qualities of the system that are relevant from a software engineering perspective, e.g. maintainability, reusability, and flexibility [11, 14]. Operational NFRs are qualities of the system in operation, e.g. performance, reliability, robustness and fault-tolerance [11, 14]. Unlike functional requirements, non-functional requirements generally cannot be pinpointed to a particular part of the application but are a property of the application as a whole.

Maintainability is a development quality and is defined as "the ease with which a software system or component can be modified or adapted to a changed environment" [11, 14]. Modifications may include extensions, porting to different computing systems or improvements. Maintainability is also affected by other development qualities. Modifiability is defined as "an ability of software system to make changes quickly and cost-effectively" [11, 14]. Modifiability includes adding, deleting and changing software structures, and therefore extensibility and portability can be considered special forms of modifiability [11, 14]. In addition, flexibility, reusability, testability and integrability contribute to modifiability, and they can therefore be defined as sub-qualities of maintainability [11, 14].

In conclusion, adaptability has many facets, including both functional requirements and NFRs. The latter qualities (i.e. operational and development qualities) can be seen as architectural in nature [2].

Consequently, adaptability can be considered a qualitative property of the maintainability of a software architecture [2]. Furthermore, adaptability covers runtime requirements of the software system as well as changes in the requirements of stakeholders' objectives [2]. We define software system or software architecture adaptability as follows [2]:

Adaptability of software system or software architecture is (1) a qualitative property of its maintainability and (2) an ability of its components to adapt their functionality, even at runtime, to behavioral and structural changes that occur either internally or externally in their operating environment and in requirements of stakeholders’ objectives.

In the definition above, adaptability is related to the objectives of the system's stakeholders and to the qualities of the software architecture. Typical stakeholders are customers, architects and the architectural evolution strategists of an organization. Different stakeholders have different viewpoints and demand different adaptable content. For example, from the end-user's point of view, adaptability may mean that new functions can be added and deployed to the system; for architects, it may mean that the system can adapt to different operating systems. Adaptability of a software architecture is meaningful in a specified context, i.e. a software architecture is adaptable with respect to specified adaptability requirements. In an adaptable software architecture, the elements of the architecture, for example components and connectors, need to react in order to satisfy external and internal adaptability requirements, which may include adding new components and modifying or deleting existing ones.



4. DESCRIPTION OF ADAPTABILITY EVALUATION METHOD (AEM)

AEM [1, 2] is an integral part of the QADA®1 (Quality-driven Architecture Design and quality Analysis) methodology [11, 15-18], specializing its activities in adaptability-related aspects. QADA® is based on the principles of (1) software product line engineering, (2) quality-driven architecture design, (3) quality evaluation based on architectural models, and (4) reuse of existing artifacts and knowledge.

QADA® (Figure 1) uses quality requirements as a driving force when selecting software structures, and it describes the architecture at two abstraction levels: conceptual and concrete. The conceptual level captures design decisions concerning, for example, functionality; the concrete level refines the conceptual designs into more detailed descriptions. Both levels consist of four viewpoints: structural, behavioral, deployment and development. The structural viewpoint describes the compositional structure of the system, whereas the behavioral viewpoint concerns the behavioral aspects of the architecture. The deployment viewpoint allocates the components to various computing environments. Finally, the development viewpoint presents the components, their relationships to each other and the actors responsible for their development.

AEM captures and maps the adaptability requirements to the software architecture. It defines (1) how the adaptability requirements can be negotiated into the architecture, (2) how adaptability can be represented in the architectural models, and (3) how the architecture can be analyzed in order to validate whether or not the requirements are met. Quality-driven architecture design is about mapping adaptability requirements to the architectural views and representing the adaptability properties in the architectural models. Quality evaluation consists of the adaptability analysis of the architecture. AEM also exploits existing design knowledge, such as documentation patterns and architectural styles and patterns. The abstraction levels of QADA® enable the separation of the concepts of required and provided adaptability. Required adaptability corresponds to the adaptability requirements, i.e. what the system has to support. The required adaptability is described at the conceptual abstraction level, by mapping the adaptability requirements to the conceptual architecture. Provided adaptability, however, stands for the adaptability that the system implements or offers; this, in turn, is described at the concrete abstraction level when describing the adaptability that the concrete architectural elements provide.

AEM consists of three main phases, which can be applied separately to individual systems [1, 2]. Each phase includes several steps, which in turn consist of sets of activities. The phases and steps of AEM are depicted in Table 1.

1 QADA® - Registered trademark of VTT, Technical Research Centre of Finland


Figure 1. Abstraction levels of the QADA® Methodology.

Table 1. The phases and steps of AEM [1, 2].

Phase 1: Defining adaptability goals includes the following five steps:

1. Identifying of stakeholders and their concerns,
2. Refining of adaptability requirements,
3. Mapping of adaptability requirements to functionality, i.e. mapping of common requirements to common functionality, and mapping of system-specific requirements to system-specific functionality,
4. Selecting of architectural styles and patterns and performing of a trade-off analysis, and
5. Defining of criteria for adaptability evaluation.

Phase 2: Representing adaptability in architectural models includes the following three steps:

1. Mapping of required adaptability to conceptual architectural elements,
2. Mapping from conceptual architecture to concrete architecture, and
3. Mapping of provided adaptability to concrete architectural elements.

Phase 3: Adaptability evaluation includes the following three main steps with different activities:

1. Quantitative analysis:
   • Estimate the adaptability of the components,
   • Estimate the adaptability of the software system, and
   • Estimate the adaptability of the system in its deployment environment,
2. Qualitative analysis:
   • Implement the (bidirectional) requirements tracking and analyze how the adaptability requirements are met in the architecture, and
   • Identify potential problems caused by the unfulfilled requirements, and
3. Decision making based on the analysis.
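To give an intuition for the quantitative-analysis step above, the sketch below computes a weighted adaptability degree over an adaptability scenario profile. This formula is an assumption made purely for illustration: it is not the metric AEM actually defines, and the weights and scenario structure are hypothetical.

```python
# Illustrative sketch only: one simple way a quantitative adaptability
# degree could be computed from an adaptability scenario profile.
# This weighted-satisfaction formula is NOT the actual AEM metric.

def adaptability_degree(scenarios):
    """scenarios: list of (weight, supported) pairs, where weight reflects
    the importance of the adaptability scenario and supported is True if
    the architecture handles the scenario."""
    total = sum(w for w, _ in scenarios)
    supported = sum(w for w, ok in scenarios if ok)
    return supported / total if total else 0.0

# Hypothetical profile: weights 3/2/1 for high/medium/low importance.
profile = [(3, True), (2, True), (1, False)]
print(round(adaptability_degree(profile), 2))  # 0.83
```

A degree like this could be computed per component, per system and per deployment environment, matching the three activities listed for the quantitative analysis.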

[Figure 1 elements (diagram not reproduced): Stakeholders — Project Manager, Product Architect, Component Acquirer, Component Designer, Testing Engineer, Integrator/Maintainer, Asset Manager; Conceptual and Concrete Levels of Abstraction; Architecture Design; Style and Pattern Repository; Architecture of a High Quality; Structural, Behavior, Deployment and Development views; Functional and Quality Requirements; Qualities; Quality Analysis; Analysis Scenarios; Architect.]


4.1 PHASE 1: DEFINING ADAPTABILITY GOALS

The purpose of the first phase of AEM [1, 2] is to define the adaptability goals. This means identifying and negotiating the adaptability requirements to find a satisfactory set of adaptability requirements that is subsequently carried forward into the architecture design. The first phase of AEM extends the requirements engineering activity to specifically support adaptability concerns, and helps to identify and refine common and system-specific adaptability requirements, perform trade-off analysis, map the adaptability requirements to the functional requirements, select architectural styles and define criteria for adaptability evaluation.

4.1.1 Identifying Stakeholders and Their Concerns

Every system has several stakeholders, each with their own interests regarding the system. For example, the stakeholders related to the creation and use of architectural descriptions include the clients, users, architect(s), developers, evaluators, marketers, customers, and managers who have a direct view of the product's core assets group and product production group.

In order to arrive at the final adaptability requirements for the system, the adaptability requirements of all the stakeholders can be identified and negotiated by exploiting the i* ("distributed intentionality") framework [19]. The i* framework helps to detect where the quality requirements originate and what kind of negotiations should take place, and can thereby be used to depict the relationships among different types of stakeholders. In the i* framework, stakeholders are represented as (social) actors who depend on each other for goals to be achieved, tasks to be performed, and resources to be furnished. The i* framework includes the Strategic Dependency Model (SDM) for describing the network of relationships among the actors. The SDM is a graph involving the actors who have strategic dependencies among each other. A dependency describes an "agreement" between two actors, and its type describes the nature of the agreement: (1) goal dependencies represent delegation of responsibility for fulfilling a goal, (2) softgoal dependencies are similar to goal dependencies, but their fulfillment cannot be defined precisely, (3) task dependencies are used in situations where the actor is required to perform a given activity, and (4) resource dependencies require the actor to provide a resource to the other actor. Normally, in the i* framework actors are represented as circles, while goals, softgoals, tasks and resources are represented as ovals, clouds, hexagons and rectangles, respectively. In AEM, however, UML 2.0 [20] notation is exploited to depict the i* SDMs by using the stereotypes <<i* actor>> and <<dependency>> and tagged values.
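The SDM described above is essentially a typed graph of actors and dependencies, which can be sketched as a small data structure. The class and field names below are illustrative assumptions; AEM itself draws these models in UML 2.0 with the <<i* actor>> and <<dependency>> stereotypes.

```python
# Minimal sketch of an i* Strategic Dependency Model as a typed graph.
# Names are illustrative only.

from dataclasses import dataclass

# The four dependency types of the i* framework.
DEPENDENCY_TYPES = {"goal", "softgoal", "task", "resource"}

@dataclass(frozen=True)
class Dependency:
    depender: str    # actor who depends on the other
    dependee: str    # actor who is depended upon
    kind: str        # one of DEPENDENCY_TYPES
    description: str

    def __post_init__(self):
        if self.kind not in DEPENDENCY_TYPES:
            raise ValueError(f"unknown dependency type: {self.kind}")

# Hypothetical SDM fragment for an adaptability negotiation.
sdm = [
    Dependency("Customer", "Architect", "softgoal",
               "system adapts to new operating environments"),
    Dependency("Architect", "Component Designer", "task",
               "implement runtime component replacement"),
]
print(sorted({d.kind for d in sdm}))  # ['softgoal', 'task']
```

Walking such a graph makes it easy to see where a quality requirement originates and which negotiations between actors it implies.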

4.1.2 Refining of Adaptability Requirements

After the adaptability requirements have been identified and negotiated, they must be refined into the final requirements of the system that are considered further in the architecture design. In AEM [1, 2], the specification of the final adaptability requirements is performed system-specifically. All of the requirements must be given identification numbers. It is not always possible to implement all of the requirements, for example due to time or budget constraints. Therefore, the importance of each adaptability requirement must be defined. In AEM, importance is expressed using three categories: high, medium and low.
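The refinement step can be sketched as a simple record per requirement. The record layout, identifier scheme and example requirements below are assumptions for illustration; AEM prescribes only that each requirement carries an identification number and a high/medium/low importance.

```python
# Hedged sketch of requirement refinement in AEM: each adaptability
# requirement gets an identification number and an importance category.
# The layout and example data are hypothetical.

from dataclasses import dataclass

@dataclass
class AdaptabilityRequirement:
    req_id: str        # identification number, e.g. "AR-01"
    description: str
    importance: str    # "high", "medium" or "low"

requirements = [
    AdaptabilityRequirement("AR-01", "support runtime component update", "high"),
    AdaptabilityRequirement("AR-02", "port to a new RTOS", "medium"),
    AdaptabilityRequirement("AR-03", "user-configurable UI themes", "low"),
]

# When not everything can be implemented, prioritize by importance.
order = {"high": 0, "medium": 1, "low": 2}
for r in sorted(requirements, key=lambda r: order[r.importance]):
    print(r.req_id, r.importance)
```

Sorting by the importance category gives the order in which requirements would be dropped under time or budget pressure.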

4.1.3 Mapping of Adaptability Requirements to Functionality

According to QADA®, the architecture of the system is first described at the conceptual level. The main functionality, i.e. "what the system does", can be considered a main driver of the conceptual design. The main functionality of the system is divided into functional blocks: the entire system is first decomposed into domains, which are then decomposed into subsystems and leaf components, the smallest blocks used in the conceptual architecture. UML 2.0 [20] notation can be exploited in the graphical presentation. In AEM, the mapping of the system-specific adaptability requirements to functionality is performed case-specifically. One adaptability requirement may be mapped to several functional blocks. Additionally, the adaptability requirements themselves may give rise to certain functionality. The requirements mapping is the specific work of software architects and requires extensive knowledge of the system. In this phase, the architect only has to decide which services are responsible for implementing each requirement; the means for achieving the requirements, i.e. the detailed design, need not be defined yet.
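The decomposition and mapping described above can be sketched as nested tables. All domain, subsystem, component and requirement names below are hypothetical placeholders; only the structure (domains → subsystems → leaf components, with requirements mapped to one or more leaves) reflects the text.

```python
# Illustrative sketch of the mapping step: one adaptability requirement may
# map to several functional blocks. Names are hypothetical.

# Conceptual decomposition: domains -> subsystems -> leaf components.
decomposition = {
    "Control Domain": {
        "Communication Subsystem": ["ProtocolAdapter", "MessageRouter"],
        "Device Subsystem": ["SensorDriver", "ActuatorDriver"],
    },
}

# Requirement id -> the leaf components responsible for realizing it.
requirement_mapping = {
    "AR-01": ["ProtocolAdapter", "MessageRouter"],
    "AR-02": ["SensorDriver"],
}

# Sanity check: every mapped component must exist in the decomposition.
leaves = {c for subs in decomposition.values()
          for comps in subs.values() for c in comps}
assert all(c in leaves
           for comps in requirement_mapping.values() for c in comps)
print(sorted(requirement_mapping["AR-01"]))  # ['MessageRouter', 'ProtocolAdapter']
```

At this stage only the responsibility assignment is recorded; how each component actually achieves the requirement is left to the detailed design.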

4.1.4 Selecting of Architectural Styles and Patterns and Performing of a Trade-Off Analysis

Software systems can be built from one or several architectural styles. For example, even if the main style is layered, the blackboard style can still appear in one of the architectural layers. At the beginning of architecture modeling, at least the dominant architectural style must be selected.

Once the dominant style has been decided, other architectural styles and patterns can be selected for the smaller parts of the architecture where they may be beneficial. According to QADA®, architectural modeling begins from the conceptual architecture. In the conceptual structural view, the functionality, i.e. services or utilities, is organized according to the selected architectural style. The style should be selected carefully by examining how each candidate style can help achieve the requirements.

The different adaptability requirements should be transformed into design decisions or architectural styles and patterns in a predefined way. The different design alternatives can be searched for, for example, from a style base [15] that represents the mapping between quality attributes and design decisions. In addition, architectural patterns [21-24] and social patterns [25] can be transformed into design decisions. The style base [15] provides guidance for architects on what kinds of design alternatives there are. It is the responsibility of the architect to choose the most suitable styles and patterns. The choice of these styles and patterns is not necessarily final but rather iterative, as more exact designs are made later. Furthermore, the style base can provide detailed design patterns; these are not, however, used until the concrete architecture level. The adaptability level of a system defines how important the achievement of the adaptability requirements is to that specific system. For example, at the high adaptability level, the adaptability of the system must be guaranteed by using the best possible design techniques. The cost and effort of the design are normally higher in the case of high adaptability level systems, whereas in the case of normal and low adaptability level systems simpler and less expensive design techniques can be used.

There is always a risk that the adaptability requirements will conflict with other quality requirements. This might even result in not all of the important requirements being met in the architecture. The purpose of the trade-off analysis is to guarantee the best set of requirements considering all of the quality requirements. The NFR framework [26] is one method for negotiating various conflicting quality attributes and evaluating the criticality of quality requirements. The NFR framework is a process-oriented approach that treats quality requirements as soft goals, i.e. quality goals to be achieved [26]. By using the NFR framework, the requirements can be renegotiated with the affected stakeholders and a solution can be found that makes acceptable trade-offs for all of the stakeholders. It is the duty of the software architect to specialize the correlation rules by using domain information. As a consequence of the trade-off analysis, the resulting problems must be identified and solved.


Merlin white paper: Adaptability evaluation of software architectures

4.1.5 Defining Criteria for Adaptability Evaluation

In AEM, the adaptability evaluation criteria are categorized into four evaluation levels as follows [1, 2]: Level 1: product line adaptability requirements; Level 2: system-specific adaptability requirements of high importance; Level 3: system-specific adaptability requirements of medium importance; and Level 4: system-specific adaptability requirements of low importance.
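The four evaluation levels above form a fixed ordinal scale, so they can be captured directly as an enumeration. This is only a convenience sketch; AEM itself does not prescribe any particular encoding:

```python
from enum import IntEnum

class EvaluationLevel(IntEnum):
    """The four adaptability evaluation levels of AEM [1, 2]."""
    PRODUCT_LINE = 1    # product line adaptability requirements
    SYSTEM_HIGH = 2     # system-specific requirements of high importance
    SYSTEM_MEDIUM = 3   # system-specific requirements of medium importance
    SYSTEM_LOW = 4      # system-specific requirements of low importance

print(EvaluationLevel.SYSTEM_HIGH.value)  # 2
```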

4.2 PHASE 2: REPRESENTING ADAPTABILITY IN ARCHITECTURAL MODELS

The second phase of AEM [1, 2] provides guidelines on how to model adaptability in software architecture in such a way that the adaptability analysis can be performed directly from the architecture.

In adaptability modeling, the abstraction levels of QADA® are used in two ways. First, the required adaptability of the system is described at the conceptual level, and second, the provided adaptability of the system is described at the concrete level. Adaptability appears in architectural models in two ways: (1) adaptability aspects are attached to architectural elements, and (2) adaptability requirements result in certain design decisions and functionality. When the adaptability requirements result in certain design decisions, such as structures or particular components, the design decision should be documented. Especially the qualitative analysis relies on documented design rationale.

4.2.1 Mapping of Required Adaptability to Conceptual Architectural Elements

After the architectural style is selected at the conceptual architectural level, the adaptability requirements are brought to the architectural models. In AEM [1, 2], the requirements are attached to architectural elements in the conceptual structural and deployment views of QADA®. This means that the requirements are transformed to the required responsibilities of the architectural elements, i.e. the components and connectors. In architecture, the required adaptability guides the design of concrete architecture and helps to make the design decisions. By mapping the requirements to the system behavior in conceptual behavioral view, the requirements have an influence on the dynamic aspects of the system. In AEM, the fourth view of QADA®, i.e. the conceptual development view, is used to organize the design work.

The conceptual structural view represents the static relationships of the architecture components.

The mapping of each adaptability requirement to functionality was performed when defining the quality goals. This enabled the tracing of requirements to architecture. Now, vice versa, all of the adaptability-related requirements are defined for each architectural element. This enables bidirectional requirements tracing from architecture to requirements. By using UML 2.0 [20] notation, the static structure of the system can be represented, for example, with a component diagram or a composite structure diagram. Typically, the exact means and techniques to implement the requirements are not yet defined, but the definition helps one to define what is required from the system and its elements. The adaptability requirements and design rationale are written inside the architectural elements, i.e. components or services and connectors.

The conceptual behavioral view helps one to understand the dynamic aspects of the system. The view represents the dynamic relationships of architecture components. According to QADA®, the behavior of the system is described at the conceptual level as abstract descriptions of collaboration that describe the interactions between architecture components. The state diagrams or message sequence diagrams of UML 2.0 [20] notation can be used to describe the interactions between components.

The conceptual deployment view allocates units of deployment to physical computing units. In the deployment diagram of UML 2.0 [20], components are described as deployment nodes or typed units of deployment, and relationships as "is allocated to" relationships. The required adaptability is denoted by attaching requirements to nodes and relationships.


The conceptual development view does not itself assist in the adaptability representation. However, the view helps one to detect which components and services have to be developed, which can be found in the asset repository, and which have to be bought.

4.2.2 Mapping from Conceptual Architecture to Concrete Architecture

When the adaptability requirements have been mapped to the conceptual architecture, the results of this mapping must be reflected in the concrete architecture. The traceability of the adaptability requirements to the conceptual architecture and to the concrete architecture must be ensured. Conceptual components, i.e. services, are more logical modeling elements than concrete implementation components. Thus, one conceptual service may result in several concrete components, or one concrete component may contribute to the implementation of one or more conceptual services. The mapping between the conceptual and concrete architecture must be documented to trace the adaptability requirements to the concrete architectural level. The different design alternatives can again be searched for from a style base [15] that represents the mapping between the qualities and design decisions. In addition, architectural patterns [21-24] and social patterns [25] can be transformed to design decisions.

4.2.3 Mapping of Provided Adaptability to Concrete Architectural Elements

The provided adaptability means the adaptability that the system offers, i.e. the means and techniques that the system elements provide for implementing the adaptability requirements. In AEM [1, 2], the concrete view of QADA® is used in the adaptability analysis.

The requirements are attached to architectural elements in the structural and deployment views of QADA® and are represented in the concrete architecture by using the concrete structural and deployment views. In the architecture, the provided adaptability guides the design of concrete components or represents the properties of the existing components, i.e. components in the asset repository or COTS components, that can be used. The behavioral view of QADA® assists in the modeling of the behavior of the components and the system. The development view refines the allocation that is defined in the conceptual development view to concrete components. When more detailed descriptions of the architecture components and their interfaces are needed, they can be developed and depicted as class diagrams.

The concrete structural view is used to describe the concrete components and interfaces needed for the corresponding conceptual architecture. Therefore, the view decomposes the conceptual architecture into lower aggregation levels. The component diagram or composite structure diagram of UML 2.0 [20] notation can be used to describe the structure of the system. The provided adaptability is attached to the architectural elements by using the same criteria as at the conceptual level. The concrete structural view also reveals the interfaces of the components. Interfaces must be described in a way that enables the estimation of the interoperability of components. Interoperability is the capability of a service to use the information exchanged with other services and to provide something new that originates from it; therefore, the adaptability of the interfaces can be estimated by examining the component interoperability. An example of an architectural-level interface description is given in [27].

In the concrete behavioral view, the state diagrams or message sequence diagrams of UML 2.0 [20] notation can be used to describe the interactions between components. For each new component, a state diagram must be defined to describe the internal states and state transitions. The message sequence diagram is used to derive input messages for the adaptability analysis. Also, an activity diagram is required to derive a model for the adaptability analysis; an activity diagram typically represents the operational workflows of a system.


The concrete deployment view describes the relationships between the software components, the relationships between the hardware components, and the relationships between the software and hardware components. However, AEM concentrates on software systems; therefore this portion of the view is limited.

The concrete development view links the architectural views to the repository of common assets. Thus, the components that already exist can be linked to the concrete components that they realize.

4.3 PHASE 3: ADAPTABILITY EVALUATION

The third phase of AEM [1, 2] is about analyzing the architecture to validate whether or not the adaptability requirements are met. The adaptability evaluation is performed by using quantitative and qualitative analyses. The quantitative adaptability analysis evaluates the adaptability of the architecture based on an adaptability scenario profile (ASP) and an impact analysis (IA) of the system's structure in terms of composition, i.e. components and their interactions. This analysis requires that the structure of the system is known: both the static aspects, represented by its components, and the dynamic aspects, represented by the execution frequency of each component and of each interaction between components. The quantitative approach also assumes that the behavior of the components and the component interactions is known. Qualitative adaptability analysis is complementary to the quantitative one and can be applied without knowing the behavior of components. The qualitative analysis consists of reasoning about the design decisions, e.g. architectural styles and patterns, and their support for the adaptability requirements.

4.3.1 Quantitative Analysis

The purpose of the quantitative analysis is to provide a systematic adaptability evaluation approach to support architecture improvement and decision making when choosing among candidate architectures. In AEM [1, 2], the quantitative analysis is driven by the stakeholders' adaptability goals and includes the following four steps [9]: (1) developing an ASP for each candidate architecture based on the adaptability goals of the system, (2) performing the IA under the ASP, (3) applying the metric and calculating the value of the adaptability degree, and (4) analyzing the results of the adaptability evaluation.

Scenarios are one of the effective techniques in architecture analysis [9, 28]. An adaptability scenario is “a description of the system behavior, driven by the change requirement, including the usage for the system, the reaction to the change requirement and potential future changes, which are all related to adaptability” [9]. The ASP is a set of related adaptability scenarios.

Once the ASP is available, the IA for each of the adaptability scenarios can be performed by exploiting the Class Point (CP) method [29] and the class point calculation worksheet [29]. The Class Point method is based on the Function Point Analysis (FPA) [30] approach and is conceived to estimate the size of object-oriented products based on design documentation. The idea underlying the Class Point method is the quantification of classes in a program, in analogy to the function counting performed by the FPA measure. This idea derives from the observation that in the procedural paradigm the basic programming units are functions or procedures, whereas in the object-oriented paradigm the logical building blocks are classes, which correspond to real-world objects and are related to each other [29].

In the CP size estimation, the design specifications are first analyzed in order to identify and classify the classes based on their types [29]: (1) the Problem Domain Type (PDT) component contains classes representing real-world entities in the application domain of the system, (2) the classes of the Human Interaction Type (HIT) are designed to satisfy the need for information visualization and human interaction, (3) the Data Management Type (DMT) component encompasses the classes that offer functionality for data storage and retrieval, and (4) the Task Management Type (TMT) classes are designed for purposes of task management and communication between subsystems and with external systems. Such class typologies can be detected in any object-oriented (OO) system, independently of the application domain and of the design methodology adopted [29].

Second, the behavior of each class is taken into account in order to evaluate its complexity level (low, average, or high). The Number of External Methods (NEM) measures the size of the interface of a class and is determined by the number of locally defined public methods [29]. The Number of Services Requested (NSR) provides a measure of the interconnection of system components and is determined by the number of different services requested from other classes [29]. The Number of Attributes (NOA) is also taken into account in order to evaluate the complexity level of a class [29]. For example, if a class has more than nine NEM, an NSR value of at least 2, and a NOA greater than or equal to ten, it is assigned a high complexity level.
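The classification step can be sketched as a small function. Only the "high" rule is quoted in the text above; the fallback thresholds below are illustrative placeholders, not the full complexity table from [29]:

```python
def complexity_level(nem: int, nsr: int, noa: int) -> str:
    """Assign a complexity level to a class from its NEM, NSR, and NOA counts.

    The 'high' rule follows the example in the text; the 'average' band
    is a hypothetical placeholder, not the CP table from [29].
    """
    if nem > 9 and nsr >= 2 and noa >= 10:
        return "high"
    if nem > 4 or noa >= 5:  # hypothetical middle band
        return "average"
    return "low"

print(complexity_level(nem=12, nsr=3, noa=10))  # high
```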

Once the complexity level of each identified class has been established, the Total Unadjusted Class Point value (TUCP) can be computed as the weighted total of the four components of the application [29]:

TUCP = \sum_{i=1}^{4} \sum_{j=1}^{3} w_{ij} \times x_{ij}

where x_{ij} is the number of classes of component type i (PDT, HIT, DMT, TMT) with complexity level j (low, average, or high), and w_{ij} is the empirical weighting value for type i and complexity level j.

Next, the Technical Complexity Factor (TCF) is determined by assigning the degree of influence (ranging from 0 to 5) that 18 general system characteristics have on the application, from the designer's point of view [29]. The estimates given for the degrees of influence are recorded in the Processing Complexity table. The sum of the influence degrees related to such general system characteristics forms the Total Degree of Influence (TDI), which is used to determine the TCF according to the following formula [29]:

TCF = 0.55 + 0.01 \times TDI

The final value of the Adjusted Class Point (CP) is obtained by multiplying the Total Unadjusted Class Point value by the TCF [29]:

CP = TUCP \times TCF
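The Class Point computation (TUCP, then TCF, then CP) can be sketched numerically. The weight matrix and the TDI value below are illustrative placeholders, not the empirical values from [29]:

```python
# x[t][j]: number of classes of type t (PDT, HIT, DMT, TMT) at level j
# (low, average, high); w[t][j]: the corresponding weights (placeholders here).
x = {"PDT": (4, 2, 1), "HIT": (3, 1, 0), "DMT": (2, 0, 0), "TMT": (1, 1, 0)}
w = {"PDT": (3, 6, 10), "HIT": (4, 7, 12), "DMT": (5, 8, 13), "TMT": (4, 6, 9)}

# TUCP: weighted total over the four types and three complexity levels
tucp = sum(w[t][j] * x[t][j] for t in x for j in range(3))

tdi = 30                  # sum of 18 influence degrees, each 0..5 (example value)
tcf = 0.55 + 0.01 * tdi   # TCF = 0.55 + 0.01 * TDI
cp = tucp * tcf           # Adjusted Class Point

print(tucp, round(tcf, 2), round(cp, 2))  # 73 0.85 62.05
```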

For architecture adaptability, two metrics can be used to measure the impact on the architectures [9]: IOSA (Impact On the Software Architecture) and ADSA (Adaptability Degree of the Software Architecture). The IOSA is calculated by summing up the CP values in each adaptability scenario [9]:

IOSA = \sum_{k=1}^{S} P_{S_k} \times (CP(C_{s_k}) + CP(T_{s_k}))     (1)

where S is the number of adaptability scenarios, P_{S_k} is the estimated probability of adaptability scenario S_k based on the adaptability evaluation criteria, CP is the impact analysis result of S_k, and C_{s_k} and T_{s_k} are the sets of impacted components and connectors, respectively.

The degree of adaptability has an inverse relation to the value of IOSA, so the ADSA is defined as [9]:

ADSA = N^{-IOSA}

The IOSA is measured in terms of the size of the impacted components and connectors [9]. From equation (1), the range of IOSA is [0, \infty), so the range of ADSA is (0, 1], where 1 means that the architecture is totally adaptable in all dimensions and 0 means that the architecture is not adaptable in any dimension. In order to make the values of ADSA spread equally over this range, the value of N must be close to 1; after some experiments, the authors of [9] define N = 1.01. Based on the value of ADSA, the architect can decide which candidate architecture is more adaptable to the stakeholders' adaptability goals and in which dimensions the architecture is adaptable. In addition, the architect can identify the weaknesses of the architectures to support architecture improvement. However, the architectures must be designed for the same system, or the value of ADSA is meaningless.
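The two metrics can be sketched as follows. The scenario probabilities and CP impact values are illustrative, and the exponential form of ADSA used here (N raised to the power of minus IOSA, with N = 1.01 as in [9]) is an assumption consistent with the stated inverse relation and value ranges:

```python
N = 1.01  # value defined after experiments in [9]

scenarios = [
    # (probability P_Sk, CP of impacted components, CP of impacted connectors)
    (0.5, 20.0, 5.0),
    (0.3, 10.0, 2.0),
    (0.2, 40.0, 8.0),
]

# IOSA: probability-weighted sum of the CP impact over all scenarios (eq. 1)
iosa = sum(p * (cp_components + cp_connectors)
           for p, cp_components, cp_connectors in scenarios)

# ADSA: inverse relation, so a larger impact yields a lower adaptability degree
adsa = N ** (-iosa)

print(round(iosa, 2), round(adsa, 3))
```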

4.3.2 Qualitative Analysis

In AEM [1, 2], the qualitative analysis relies on documented design rationale that must be included in or accompany the architectural models. If this is not the case, the analysis relies heavily on the architects' tacit knowledge. By analyzing and reasoning about one architectural solution, the qualitative analysis provides assurance to the architect that the requirements have been addressed in the design. By analyzing different architectural solutions for the same requirements, the analysis provides an evaluation of the degree to which they address the requirements; it also allows one to compare different architecture candidates and recommend one as the solution.

The process of qualitative analysis can be partly automated, for example by automating the report generation; the main parts of the analysis still require a human analyzer. The qualitative analysis is about tracking the adaptability requirements. Bidirectional requirement tracking means tracking the requirements to the architecture and the properties of the architecture back to the requirements. The tracking is performed based on the requirement numbers that are associated with architectural elements by using the required adaptability and provided adaptability profiles. The required adaptability profile maps the requirements to the architecture at the conceptual level, and the provided adaptability profile describes how these requirements are taken into account at the concrete level. Therefore, the qualitative analysis verifies that each requirement has been taken into account in the architecture design. When analyzing the architecture and its components, the tracking is performed in the opposite direction: from the concrete architecture to the conceptual architecture and further to the requirements.
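The automatable part of this tracking, checking that every conceptually mapped requirement is provided by at least one concrete element, can be sketched as a coverage check. All identifiers are hypothetical:

```python
# Required adaptability profile: requirement -> conceptual services (hypothetical IDs)
required = {"AR-1": {"ServiceA"}, "AR-2": {"ServiceB"}}

# Provided adaptability profile: concrete component -> requirements it addresses
provided = {"CompA1": {"AR-1"}, "CompB1": {"AR-1", "AR-2"}}

def uncovered_requirements() -> set[str]:
    """Requirements mapped at the conceptual level but provided by no concrete component."""
    covered = set().union(*provided.values())
    return set(required) - covered

print(uncovered_requirements())  # set() -> every requirement is addressed
```

The remaining judgment, whether each requirement is met sufficiently well, still falls to the human analyzer.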

Design rationale can be associated with individual components, with individual connections, and with a set of components and their connections. The analyzer compares the design decisions with the adaptability requirements and analyzes how those requirements are met in the architecture. The analyzer also has to decide whether the requirements are met sufficiently well, examine how to meet the requirements better, and assess how well all of these decisions work together. For comparing two different architectures, the qualitative analysis must be performed for each of the designs; thereafter, a numerical indicator for the coverage of requirements is used, but human judgment regarding the proposed solutions also has to be applied.

The objective of the qualitative analysis is to identify problems that may occur when certain adaptability requirements are not met in the architecture. Thus, the architect must pay attention to the parts of the architecture that require an enhancement to meet the adaptability requirements in this particular architecture, without changing the architectural style.

4.3.3 Decision Making Based on the Analysis

If the result of the qualitative and/or quantitative adaptability analysis reveals that the particular architecture is not sufficient for the adaptability requirements, the architect has two choices: (1) keep the architecture and increase the adaptability of the components and their interactions, which can be performed by choosing components with higher adaptability, by implementing more adaptable components by eliminating software defects in their implementation, and by deploying software on more adaptable hardware; or (2) change the architecture by using different architectural styles and patterns and by introducing new adaptability mechanisms.

AEM [1, 2] enables the adaptability analysis to be performed in a systematic way and repeatedly for each architectural choice. The results of the analyses of the different architectural choices must be evaluated against the adaptability evaluation criteria and against each other. Human analysis is required to decide which architectural alternative meets the requirements best.
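The quantitative side of this comparison reduces to ranking candidates by their adaptability degree; the names and ADSA values below are illustrative only, and the final decision still requires human analysis:

```python
# Candidates must be designs for the same system, or the ADSA values
# are not comparable. Values here are illustrative.
candidates = {"architecture_A": 0.71, "architecture_B": 0.84}

# Higher ADSA -> better support for the stakeholders' adaptability goals
best = max(candidates, key=candidates.get)
print(best)  # architecture_B
```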


5. VALIDATION OF AEM

AEM [1, 2] was validated in a practical case [2] with a wireless environmental controlling system, which is used to manage the electrical control appliances of doors, windows, lights, security systems, elevators, etc. located in the user's close surroundings. The target system consists of the client software installed on the user's mobile phone and an unlimited number of receivers, each with its own software and hardware. The receivers are connected to the electrical appliances with cables and receive control messages from the mobile phone via a Bluetooth interface. The validation was performed by following the phases of AEM exactly. The required adaptability for the target system was achieved. Two new usable and adaptable software architectures for the target system, for multiple platforms, were developed and analyzed. As a result of the case, the case partner received architectural design documents of the target system at the component level, and of the networking components even at the class level, both depicted with UML 2.0. The case partner also received a number of proposals for resolving architectural problems, especially at the network level. The validation results reveal that AEM can be used (1) in requirements engineering, (2) in designing, negotiating, and mapping adaptability requirements to software architectures, and (3) in the adaptability evaluation of software architectures [2].


6. SUMMARY AND CONCLUSIONS

Adaptability has become one of the key features and an increasingly important factor for the survival and success of software systems. Software architectures for such systems should be flexible enough to allow components to change their pattern depending on environmental changes, the goals of the system, and the stakeholders' objectives, without changing the actual components themselves. Due to the shortcomings of the existing adaptability evaluation methods, a new method is required to evaluate the adaptability of software systems from architectural models.

This white paper described and discussed the Adaptability Evaluation Method (AEM). AEM defines how the adaptability requirements should be negotiated and mapped to the software architecture, how they should be represented in the architectural models, and how the architecture should be analyzed in order to validate whether or not the requirements are met. The method assists in requirements engineering, architecture modeling, and adaptability evaluation from the architectural models. Furthermore, the method provides the capability to ensure, before system implementation, that the requirements are met. AEM fills the gap from requirements engineering to evaluation and provides the required tool and notation extensions, techniques, and guidelines for adaptability evaluation at the architecture level.

AEM was validated with a wireless environmental controlling system. The validation results reveal that AEM can be used in the depicted systematic way in the definition, negotiation, and mapping of adaptability requirements to software architecture. Furthermore, AEM can be used in the adaptability evaluation of software architectures.


REFERENCES

[1] P. Tarvainen, "Adaptability Evaluation of Software Architectures," Tech. Rep. D.2.5.3 of MERLIN Project (ITEA no. 03010), 2006, 83 p.

[2] P. Tarvainen, "Adaptability Evaluation of Software Architectures; A Case Study," in Proceedings of the First IEEE International Workshop on Software Engineering for Adaptive Software Systems (SEASS 2007) Held in Conjunction with the 31st Annual IEEE International Computer Software and Applications Conference (COMPSAC 2007), 2007, 6 p.

[3] L. Bass, P. Clements and R. Kazman, Software Architecture in Practice. Massachusetts: Addison- Wesley, 2003, 512 p.

[4] R. Kazman, M. Klein and P. Clements, "ATAM: Method for Architecture Evaluation," Carnegie Mellon University, Software Engineering Institute, Tech. Rep. CMU/SEI-2000-TR-004, ESC-TR-2000-004, 2000, 83 p.

[5] S. M. Yacoub and H. H. Ammar, "A Methodology for Architecture-Level Reliability Risk Analysis,” IEEE Transactions on Software Engineering, vol. 28, pp. 529-547, 2002.

[6] P. Bengtsson, N. Lassing, J. Bosch and H. Van Vliet, "Architecture-Level Modifiability Analysis (ALMA)," Journal of Systems and Software, vol. 69, pp. 129-147, 2004.

[7] A. C. Meng, "On Evaluating Self-adaptive Software," in Proceedings of the 1st International Workshop on Self-Adaptive Software (IWSAS 2000), 2000, pp. 65-74.

[8] L. Chung and N. Subramanian, "Process-Oriented Metrics for Software Architecture Adaptability," in Proceedings of the IEEE International Conference on Requirements Engineering, 2001, pp. 310-311.

[9] Xia Liu and Qing Wang, "Study on Application of a Quantitative Evaluation Approach for Software Architecture Adaptability," in Proceedings of the 5th International Conference on Quality Software (QSIC 2005), 2005, pp. 265-272.

[10] P. Oreizy, M. M. Gorlick, R. N. Taylor, D. Heimbigner, G. Johnson, N. Medvidovic, A. Quilici, D. S. Rosenblum and A. L. Wolf, "An Architecture-Based Approach to Self-adaptive Software," IEEE Intelligent Systems and their Applications, vol. 14, pp. 54-62, 1999.

[11] M. Matinlassi, "Quality-Driven Software Architecture Model Transformation. Towards Automation," Ph.D Thesis, VTT Publications 608, VTT Technical Research Centre of Finland, pp. 101 + app. 95, 2006.

[12] ISO/IEC Std. 9126-1, "Software Engineering - Product Quality - Part 1: Quality Model," International Organization for Standardization / International Electrotechnical Commission, Tech. Rep. TR 9126- 1:2001(E), 2001, 32 p.

[13] J. Bosch and P. Molin, "Software Architecture Design: Evaluation and Transformation," in Proceedings of the IEEE Engineering of Computer Based Systems Symposium (ECBS'99), 1999, pp. 4-10.

[14] M. Matinlassi and E. Niemelä, "The Impact of Maintainability on Component-based Software Systems," in Proceedings of the 29th EUROMICRO Conference (EUROMICRO'03), "New Waves in System Architecture", 2003, pp. 25-32.

[15] E. Niemelä, J. Kalaoja and P. Lago, "Toward an Architectural Knowledge Base for Wireless Service Engineering," IEEE Transactions on Software Engineering, vol. 31, pp. 361-379, 2005.

[16] E. Niemelä and A. Immonen, "Capturing Quality Requirements of Product Family Architecture," Information and Software Technology, doi:10.1016/j.infsof.2006.11.003, 2006.

[17] E. Niemelä, M. Matinlassi and A. Immonen, "Quality-driven Development of Software Family Architectures, in Embedded Software Research and Development Activities 2004," VTT Technical Research Centre of Finland, Oulu, Finland, 2005, 7 p.

[18] M. Matinlassi, E. Niemelä and L. Dobrica, Quality-Driven Architecture Design and Quality Analysis Method, A Revolutionary Initiation Approach to a Product Line Architecture. Espoo: VTT Publication 456, VTT Technical Research Centre of Finland, 2002, 138 p.

[19] L. Chung, D. Gross and E. Yu, "Architectural Design to Meet Stakeholders Requirements," in Proceedings of the TC2 First Working IFIP Conference on Software Architecture (WICSA1), 1999, pp. 545-564.

[20] OMG, Unified Modeling Language: Superstructure Version 2.0. Needham, MA, U.S.A.: Object Management Group, 2005, 710 p.

[21] J. Bosch, Design and Use of Software Architectures: Adopting and Evolving a Product Line Approach. Harlow: Addison-Wesley, 2000, 368 p.

[22] M. Shaw and D. Garlan, Software Architecture: Perspectives on an Emerging Discipline, 1st Ed. New Jersey: Prentice Hall, 1996, 242 p.


[23] B. P. Douglass, Doing Hard Time: Developing Real-Time Systems with UML, Objects, Frameworks, and Patterns. Boston, MA: Addison-Wesley Professional, 1999, 800 p.

[24] F. Buschmann, R. Meunier, H. Rohnert, P. Sommerland and M. Stal, Pattern Oriented Software Architecture. A System of Patterns, vol. 1, Chichester: John Wiley & Sons, 1996, 476 p.

[25] T. T. Do, M. Kolp and A. Pirotte, "Social Patterns for Designing Multi-Agent Systems," in Proceedings of the 15th International Conference on Software Engineering and Knowledge Engineering (SEKE'03), 2003, 8 p.

[26] L. Chung, B. Nixon, E. Yu and J. Mylopoulos, Non-Functional Requirements in Software Engineering. Boston, Dordrecht: Kluwer Academic Publishers, 1999, 476 p.

[27] A. Immonen, J. Holappa, P. Kallio and J. Kalaoja, "Towards Interoperability of Wireless Services - A Description Model of Service Interfaces," in Proceedings of the International Conference on International Association for Development of the Information Society (IADIS), 2004, pp. 983-988.

[28] M. A. Babar and I. Gorton, "Comparison of Scenario-Based Software Architecture Evaluation Methods," in Proceedings of the 11th Asia-Pacific Software Engineering Conference (APSEC), 2004, pp. 600-607.

[29] G. Costagliola, F. Ferrucci, G. Tortora and G. Vitiello, "Class Point: An Approach for the Size Estimation of Object-Oriented Systems,” IEEE Transactions on Software Engineering, vol. 31, pp. 52-74, 2005.

[30] J. B. Dreger, Function Point Analysis. Englewood Cliffs, New Jersey: Prentice Hall, 1989, 185 p.
