

8.2.3. Changes in the RE Practices

The Lightweight REAIMS Top Ten assessment was conducted in the meetings at the beginning of, in the middle of, and after the project (Table 35). Table 35 highlights the differences in RE practices between the phases within each company and, on the other hand, between the companies. A clear change in the practices is visible in the phase in which the BaRE method was adopted – for Company A in Phase 1 and for Company B in Phase 2. The change was most evident in the infrastructure part of the guidelines (cf. p. 118, the first five guidelines), since after adopting the method the companies started reporting that infrastructure was available for the named tasks. For example, after adopting the method, the companies had a standard document structure defined, the BaRE RD template, which included a unique identifier for each requirement in the standard requirements description templates. In the working practices (the last five guidelines) the change was less evident, since the actual implementation of these guidelines depends on the people doing the work and, therefore, it only makes sense to report these practices at most as applied at the discretion of the project manager (Table 8, p. 48). In Company B, two of the guidelines seemed unnecessary to apply in any way even after the BaRE method adoption. Company C refused to make statements about adopting the method and consistently reported only the actual changes in their practices. Thus, of the Lightweight REAIMS Top Ten guidelines, only the ones concerned with the RD template changed during the project in Company C.
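To make the role of the unique identifiers concrete, the following Python sketch shows one possible shape for such a requirements description record. The field names and the identifier format are illustrative assumptions for this sketch, not the actual fields of the BaRE template.

    from dataclasses import dataclass

    @dataclass
    class RequirementDescription:
        # Minimal sketch of a templated requirements description; the
        # field names are illustrative, not the actual BaRE fields.
        req_id: str        # unique identifier, e.g. "FR-012"
        description: str   # the requirement statement itself
        rationale: str = ""
        source: str = ""

    def next_identifier(existing, prefix="FR"):
        # Generate the next unused identifier so that every new
        # requirement gets a unique, stable reference.
        numbers = [int(r.req_id.split("-")[1])
                   for r in existing if r.req_id.startswith(prefix + "-")]
        return f"{prefix}-{(max(numbers) + 1 if numbers else 1):03d}"

    reqs = [RequirementDescription("FR-001", "The system shall store customer data.")]
    print(next_identifier(reqs))   # -> FR-002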

Overall, the Lightweight REAIMS Top Ten assessment provided a quick estimate of the state-of-the-practice in the companies, and in these three cases the estimate reflected the actual situation fairly well.

In Company C, the total point gain did not change much (from 7 to 11) during the project, which is consistent with the fact that they had a requirements document template to start with and focused only on developing a new and better one. However, the changes during the project indicate a move to a more systematic way of working. Companies A and B, on the other hand, had very little to start with and adopted the BaRE method as the method to follow in RE. Since both companies introduced many of the suggested guidelines, starting with the infrastructure guidelines, their total point gains increased considerably. It is clear, though, that such a quick assessment is not very comprehensive or reliable, and therefore a number of other, more detailed investigations into the actual practices were performed as part of the present study.
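The point-gain totals can be illustrated with a small sketch. It assumes the conventional REAIMS weighting of three points for standard use, two for normal use, one for discretionary use, and zero for a practice that is never applied; the weighting is an assumption here, not something stated above.

    # Sketch of the point-gain arithmetic behind Table 35, assuming the
    # conventional REAIMS weighting: standard use in company = 3 points,
    # normal use = 2, applied at the discretion of the project manager = 1,
    # and never applied = 0 (an assumption, not stated in the text above).
    RATING_POINTS = {"standard": 3, "normal": 2, "discretionary": 1, "never": 0}

    def total_point_gain(ratings):
        # ratings maps each of the ten guidelines to one of the four levels.
        return sum(RATING_POINTS[level] for level in ratings.values())

    # A hypothetical combination that sums to 21 points, the maximum
    # reported for Company A: seven guidelines in standard use, three never applied.
    example = {f"guideline {i}": "standard" for i in range(1, 8)}
    example.update({f"guideline {i}": "never" for i in range(8, 11)})
    print(total_point_gain(example))   # -> 21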

Since the literature-based evaluation brought up the difference between infrastructure and working practices as described earlier (p. 118), a more comprehensive list of basic infrastructure was developed and used to supplement the conducted Lightweight REAIMS Top Ten assessments. This list does not focus on RE only, but addresses other areas of software engineering that are inherently intertwined with RE – e.g. change management, reviews, and software engineering in general. The infrastructure list is shown in Table 36. Companies A and B addressed topics that are directly covered by the BaRE method documentation but not much more: neither of them adopted software engineering methods or acquired training material in the application domain during the project. Company A arranged some project management training in addition to RE training during the project. Consequently, they also received training material in software engineering from the course; no Company B representative participated in such training. In Company C, the changes were limited to the requirements document and description templates and the adoption of the BaRE method as common training material. Notice that in Table 36, for example, Company C is reported not to have change request templates or a requirements development process even though these are included in the BaRE method documentation; this is caused by unawareness of or lack of interest in such artifacts.

Considering company-level working practices, no equally clear changes can be claimed, even though important improvements were identified. In Company A, the system analyst stated after Phase 1 that they were quite familiar with the techniques suggested in the BaRE Guide.

However, after Phase 2 she elaborated on the requirements development work from a wider perspective:

My approach to requirements development has changed to a more structured one. How do you actually go about writing requirements? Earlier it was just starting to write the requirements document, but now I really think how things should be and iterate through the requirements to complete the document.

Table 35. Lightweight REAIMS Top Ten assessment point gains in different phases. Each guideline was rated on a four-level scale: never applied ('-'), applied at the discretion of the project manager, normal use, and standard use in company. The ratings were recorded for Companies A, B and C in Phases P0, P1 and P2 for the following guidelines: define practices for RM; define a standard document structure; define unique identifier for each requirement; define standard templates for requirements description; define validation checklists; organize informal requirements reviews; use language simply, consistently and concisely; make the document easy to change; use checklists for requirements analysis; and plan for conflicts and conflict resolution.

Total Point Gain (P0 / P1 / P2): Company A 4 / 21 / 21; Company B 0 / 3 / 18; Company C 7 / 7 / 11

Company B used the requirements development process in only one project, but this was the first software development project in which a written requirements document was produced. Company B also established new change management practices and processes within a couple of months from the beginning of the evaluation project; they had tried to do this before without success.

Company C did not indicate any big changes in their practices, but having only one requirements document template in the company had made it easier to communicate about requirements. The project manager C3 elaborated on this topic, and on the changes in the company more generally, as follows:

Table 36. Infrastructure in different phases. Each element was rated on a four-level scale: not available ('-'), discrepant artifacts exist, common artifact available in company, and standard artifact available in company. The ratings were recorded for Companies A, B and C in Phases P0, P1 and P2 for the following elements: templates (requirements description, requirements document, change requests); processes (requirements development, change management, review); techniques (requirements development, change management); checklists (requirements engineering, change management); training material (requirements engineering, software engineering, application domain); methods (requirements engineering, software engineering); and tools (requirements management, change management).

At least the co-operation has clearly gotten tighter. Namely, now we are thinking together whether things should be done this way or the other while earlier everyone was working alone with the document. Further, everyone was using the template he was accustomed to from previous projects and companies but now we have the advantage that you can ask specific questions and the others understand easier what you mean. … Of course we had communication also before with the previous templates but not so much and not at the same time as now. Usually the discussions took place after the document was already completed. This made making changes hard while now the discussions take place when the document is still under construction.

Further, a study of the Company C requirements documents in Phase 2 revealed that both of them were supported by three appendices. In these projects, two common appendices were a glossary and a use case description document; as a novelty, the use case descriptions defined the create, read, update and delete (CRUD) operations for each use case.
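A CRUD matrix of this kind can be sketched, for illustration only, as follows; the use case and entity names are invented and the representation is not taken from the Company C documents.

    # Purely illustrative sketch of a CRUD matrix: for each use case, the
    # create/read/update/delete operations it performs on the data entities
    # it touches. The use case and entity names are invented and do not
    # come from the Company C documents.
    from enum import Flag, auto

    class Crud(Flag):
        CREATE = auto()
        READ = auto()
        UPDATE = auto()
        DELETE = auto()

    use_case_crud = {
        "Register customer":     {"Customer": Crud.CREATE | Crud.READ},
        "Edit customer details": {"Customer": Crud.READ | Crud.UPDATE},
        "Remove customer":       {"Customer": Crud.READ | Crud.DELETE},
    }

    def entities_never_deleted(matrix):
        # A typical completeness check that such a matrix makes easy:
        # which entities are created or read but never deleted?
        touched = {e for ops in matrix.values() for e in ops}
        deleted = {e for ops in matrix.values()
                   for e, op in ops.items() if Crud.DELETE in op}
        return sorted(touched - deleted)

    print(entities_never_deleted(use_case_crud))   # -> []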

Table 37 summarizes the properties of the RDs produced in companies B and C during the requirements development phase. Since the Company A project focused on a change of a database implementation, the project differed so much from a normal software development project that a further study of the developed RD is not justified here. Company C translated the RD template into Finnish to be able to use it with Finnish customers and, consequently, the author of this thesis developed a translation of the BaRE RD template to have a similar baseline for both Finnish and English language documents. The General Properties in Table 37 refer to the existence and contents of a cover page, document information table, change history, table of contents, document template (headers and footers), appendices, requirements tables, other tables (Stakeholders, Product Position Statement, Document Overview, Users, Important Domain Properties, Database, Documentation Requirements, and Likely Changes), and diagrams. The Document Headings refer to the headings of the RD template and the Contents for Headings refer to the actual text or requirements under each heading; the Group Header and Missing rows indicate the number of headings without any data. None of the produced documents had General Properties or Document Headings that were not suggested in the BaRE template or were not comparable with some other item in it.

Table 37. Properties of the produced requirements documents. The counts in parentheses indicate the respective counts in the BaRE RD template. The last five columns refer to the projects that produced the summarized RDs; the first letter indicates the company (B and C) and the suffix thereafter the project number (P1 through P4)

Requirements Documents                  B-P1    C-P1    C-P2    C-P3    C-P4
Page count, contents/total              12/17   10/14   16/20   23/28   14/19
Appendix count (5)                      3       0       0       3       3
Appendix page count, contents/total     58/71   0       0       21/21   34/41
General Properties (9)
    Identical                           3       0       0       1       3
    Similar                             4       0       1       3       1
    Partly similar                      2       1       0       1       2
    Different                           0       5       5       2       1
    Omitted                             0       3       3       2       2
Document Headings (51)
    Identical                           48      19      12      15      14
    Similar                             0       15      24      22      21
    Different                           0       13      7       2       2
    Use case name                       0       0       6       4       0
    Omitted                             3       4       8       12      14
    Count                               48      47      49      43      37
Contents for Headings
    Group header (8)                    8       6       9       8       9
    Exists (43)                         34      32      21      35      28
    Missing                             6       9       19      0       0

Table 37 also reports the differences between the produced requirements documents and the BaRE requirements document template. The counts in parentheses in the table indicate the respective item counts in the BaRE template. Identical means that no differences could be identified between the requirements document under study and the BaRE RD template, while Similar refers to practically comparable meanings. Partly similar refers to an identifiable difference between the items, and Different refers to a clear semantic difference between them. Omitted records the count of items removed from the BaRE template, and Use case name reports the count of use case names included in the table of contents, since one project manager was accustomed to doing so.
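The tallying behind these counts can be illustrated with a short sketch; the per-item judgements were made by reading the documents, and the judgements used in the example below are hypothetical.

    # Sketch of the bookkeeping behind the counts in Table 37: each template
    # item is judged against the produced document and the judgements are
    # then tallied per category. The judgements listed below are hypothetical.
    from collections import Counter

    CATEGORIES = ("Identical", "Similar", "Partly similar", "Different",
                  "Omitted", "Use case name")

    def tally(judgements):
        unknown = set(judgements) - set(CATEGORIES)
        if unknown:
            raise ValueError(f"unknown categories: {unknown}")
        return Counter(judgements)

    # Nine hypothetical judgements, one per General Property of a document.
    judgements = ["Identical"] * 3 + ["Similar"] * 4 + ["Partly similar"] * 2
    print(tally(judgements))
    # -> Counter({'Similar': 4, 'Identical': 3, 'Partly similar': 2})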

Looking at Table 37, it is striking that the requirements documents are fairly short while the appendices tend to include a wealth of detail. The three projects reporting the use of appendices had all produced a Use Case Description document, and two of them had also prepared a glossary; the third project reported a need and an intent to produce a glossary as well but had not yet managed to do so. The third appendix was different in all the projects – one project had a Detailed Requirements appendix, another had a Report Examples appendix, and the third project had attached a customer-supplied interface specification for an adjacent system as an appendix to the RD. Another general observation from Table 37 is that the Company C documents produced in Phase 2 (C-P3 and C-P4) resemble the BaRE RD template much more closely than the documents produced in Phase 1 (C-P1 and C-P2). The C-P3 and C-P4 documents appear fairly similar to each other because the project managers of these projects made a conscious effort to develop a common RD template for the company during these projects; some of the differences, on the other hand, followed from the fact that the C-P3 project used Microsoft Word for text processing while C-P4 used LaTeX for this purpose. Thus, even though a serious effort was made to establish a common RD template in the company, some infrastructure issues still remained to be solved.

The requirements documents summarized in Table 37 followed the BaRE RD template development ideas by and large. Company B adopted the English RD template and made two kinds of changes to it. First, some company specific adaptations to administrative data and its representation were made, and second, three sections were dropped from the contents: document structure, users, and business process description.

Company C, on the other hand, made changes to all three areas reported in Table 37 – General Properties, Document Headings, and Contents for Headings. The general changes included dropping the use of tables for requirements and other information, since plain text was seen as a better way to manage large amounts of data (i.e., Omitted in Table 37). However, the textual information had a standard structure, and in most cases the text was preceded by a tag such as “Description” or “Format” to identify the purpose of the data. The Different items included a change to a company specific look for the cover page and a different kind of diagram use in the document, while the Partly Similar items covered appendices and further deviations in the diagram use. The project C-P4 omitted a total of 14 of the BaRE template headings from its own document; C-P3 also omitted 12 headings, but it had both one similar and one identical heading more than the project C-P4. The headings omitted by both projects included the ones describing product, document structure, logical user interface characteristics, business process description, performance, reliability, scalability, and documentation requirements. C-P3 had come up with the same translation for the Miscellaneous Requirements heading as in the BaRE template and used a similar term for the Design Constraints heading, while C-P4 omitted them both entirely. The different approaches to the Miscellaneous Requirements heading could well be connected with the decision to use the phrase “additional requirements” as the translation for Non-Functional Requirements, as this phrase is quite close to the phrase “miscellaneous requirements.” The biggest differences in the headings, however, occurred with the translations of the headings Requirements Allocation and Availability, as the Company C translations for these terms were the phrases “implementation baseline” and “level of use,” respectively. As a last point, the differences in the group header counts deserve two comments. First, the differences were caused by some descriptive text that was added in the project C-P3 document with little practical relevance for the document itself. Second, neither of the Company C projects had headings without content, which made the documents look like completed ones.

As the last point in this section, the results of the SPICE process assessment are summarized. This assessment was conducted independently of the present study during its final month.

The purpose of the software requirements analysis process is “to establish the requirements of the software components of the system” (ISO/IEC TR 15504-5 1998, p. 20) and the assessment results were the following:

Company A: “Process assessment was conducted on levels 1 and 2 and level 2 was achieved. Level 1 assessment covered performance of 8 base practices, and as most of these practices were fully achieved, the overall rating was that the level 1 performance attribute was fully achieved. Level 2 assessment covered 8 management practices both in performance and work product management attributes, and the level 2 was fully achieved” (Saastamoinen and Tukiainen 2003a, p. 18).

Company B: “Process assessment was conducted on levels 1 and 2 and level 2 was achieved. The company had clearly defined a requirements management process and the level 1 performance attribute was fully achieved. Also level 2 management practices were achieved largely both in performance and work product management attributes. The company used the BaRE method to elicit and specify requirements and managed them with the sfrm software. These tools made it possible to achieve an established and predictable way of working and maintain requirements documents” (Saastamoinen and Tukiainen 2003b, p. 16).

Company C: “A software requirements management process assessment was performed and level 1 was achieved. The assessment covered 8 base practices and most practices were fully or largely achieved. The most significant omission on level 1 base practices was lack of measurable quality requirements and lack of procedures to handle requirements change management and traceability; thus these two practices were achieved only partly. Level 2 assessment covered 8 management practices both in performance and work product management attributes. The performance attributes were achieved in general largely, while the management practices appeared to be less well addressed and the management attributes were achieved only partly” (Saastamoinen and Tukiainen 2003c, p. 18).
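For reference, the achievement wording used in the quotations (“fully”, “largely”, “partly”) can be read against the four-point SPICE rating scale; the sketch below assumes the usual ISO/IEC TR 15504 percentage bands, which are not restated in the quoted reports.

    # Minimal sketch of the four-point achievement scale behind the quoted
    # SPICE ratings, assuming the usual ISO/IEC TR 15504 bands of 0-15 %
    # (not achieved), >15-50 % (partially), >50-85 % (largely) and
    # >85-100 % (fully achieved).
    def achievement_rating(percent):
        if not 0 <= percent <= 100:
            raise ValueError("achievement must be between 0 and 100 percent")
        if percent <= 15:
            return "N (not achieved)"
        if percent <= 50:
            return "P (partially achieved)"
        if percent <= 85:
            return "L (largely achieved)"
        return "F (fully achieved)"

    print(achievement_rating(60))   # -> L (largely achieved)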

As a summary of the SPICE assessment, the overall requirements management process capabilities in companies A and B were rated at level 2, while Company C was rated at level 1. This result is comparable with the outcome of the Lightweight REAIMS Top Ten evaluation (p. 138), where companies A and B achieved 21 and 18 points, respectively, and Company C got only 11 points.