
AIRDoc-An approach to improve the quality of requirements documents: dealing with use case models


(1) Pós-Graduação em Ciência da Computação. AIRDoc – An Approach to Improve the Quality of Requirements Documents: Dealing with Use Case Models. Ricardo Argenton Ramos. Doctoral Thesis. Universidade Federal de Pernambuco. posgraduacao@cin.ufpe.br. www.cin.ufpe.br/~posgraduacao. RECIFE, NOVEMBER 2009.

(2) UNIVERSIDADE FEDERAL DE PERNAMBUCO. CENTRO DE INFORMÁTICA. PÓS-GRADUAÇÃO EM CIÊNCIA DA COMPUTAÇÃO. Ricardo Argenton Ramos. AIRDoc – An Approach to Improve the Quality of Requirements Documents: Dealing with Use Case Models. THESIS PRESENTED TO THE GRADUATE PROGRAM IN COMPUTER SCIENCE OF THE UNIVERSIDADE FEDERAL DE PERNAMBUCO IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF SCIENCE. Advisor: Jaelson Freire Brelaz de Castro. Co-Advisor: João Baptista da Silva Araújo Júnior. RECIFE, NOVEMBER 2009.

(3) Ramos, Ricardo Argenton AIRDoc-An approach to improve the quality of requirements documents: dealing with use case models / Ricardo Argenton Ramos. - Recife: O Autor, 2009. xii, 149 folhas : il., fig., tab. Tese (doutorado) – Universidade Federal de Pernambuco. CIn. Ciência da Computação, 2009. Inclui bibliografia. 1. Engenharia de software. 2. Engenharia de requisitos. 3. Qualidade de documentos de requisitos. I. Título. 005.1. CDD (22. ed.). MEI2010 – 064.

(4)

(5) To my mother, for all her dedication and love.

(6) Acknowledgments

Here I would like to thank everyone who in some way helped me develop this doctoral thesis, some directly and others with moral support, or even with conversations at a bar table about life, the universe and everything else. I thank God for the path He helps me to walk. I thank my advisor, Jaelson Freire Brelaz de Castro, for all the patience and dedication that contributed so much to my growth as a professional and as a human being. I learned a great deal from his teachings during these doctoral years. Thank you for having accepted me as your doctoral student. I thank my co-advisor, João Araújo of the Universidade Nova de Lisboa, who with great dedication supervised me during my sandwich doctorate in Lisbon and who still assists me in my research today. I thank my wife Paula for her love and for encouraging my research and my life. I thank everyone in my family, especially my mother Izilda, my father Paulo, my brother Lucas, my aunt Vilma and my uncle Pérsio. I also thank my mother-in-law Maria. I thank my research friends who taught me so much and from whom I know I will keep learning: Rosângela Penteado, Fernanda Alencar and Ana Moreira. I thank the friends with whom I shared a home in Recife, for the good, funny and happy moments: Ricardo Afonso, Alexandre Alvaro, Eduardo Almeida, Vinicius Garcia, Joel Silva and Cinthya Roberta. I thank my friends at the Requirements Engineering Laboratory: Márcia, Maria, Emanuel, Carla, Laís, Clarissa, João, Rosa and others who passed through our research group. I thank SERPRO Recife, especially manager Helena Bastos, for making the case studies available. I thank my friends at UNIVASF in Petrolina-PE and Juazeiro-BA, and my friends from Ibitinga-SP, my hometown. I thank my friends who have always been by my side: Valter Camargo, Marcos (Quinhão), Maristela Favero, Joel Agostini, André Madaro, Jéssica Zorzatto, Pablo Dalbem and Eduardo Escovar.

(7) Abstract

Requirements documents tend to be swamped by requirements that are no longer meaningful, descriptions that are unnecessarily long and convoluted, and duplicated information, among other shortcomings. These syntactical problems hinder the overall understandability of requirements documents throughout the whole development process. Quality evaluation strategies are not very helpful if they only include lists of expected product qualities. They should also incorporate guidelines to help practitioners effectively measure and improve the quality of requirements documents. AIRDoc is a step toward filling this gap, proposing a process that supports the evaluation and improvement of requirements documents specified with use cases. This process is founded on the elaboration of goals and the definition of questions that are answered by assertions and metrics. The quality of the requirements documents is improved by the use of refactorings.

Key Words: Requirements Documents, Use Case Model, Goal Question Metrics, Potential Problems and Refactorings.

(8) Resumo

Requirements documents tend to be flooded with problematic requirements: descriptions without clear meaning, unnecessarily long and confusing text, and duplicated information, among other problems. These syntactic problems reduce the understandability of the whole requirements document and may harm later phases of the software development process. Quality evaluations are of little help if they are based only on lists of desired qualities; they should also incorporate guidelines to assist in the evaluation and improvement of requirements documents. The AIRDoc process emerges as a solution to evaluate and improve requirements documents that were described using use cases. The proposed process is based on the definition of goals and questions that are evaluated by metrics. The quality of use case models is improved through the use of refactorings.

Key Words: Requirements Document, Use Case Model, Goal Question Metrics, Use Case Problems and Refactorings.

(9) Table of Contents

Chapter 1 - Introduction .... 1
  1.1. Context .... 2
  1.2. The Problem .... 2
  1.3. The Proposed Solution .... 3
  1.4. Methodology .... 4
  1.5. Contributions .... 5
  1.6. Structure of the Work .... 6
Chapter 2 - Background .... 8
  2.1. Introduction .... 9
  2.2. The Goal Question Metrics Approach .... 9
  2.3. Quality Models .... 13
    2.3.1. A Conceptual Software Quality Metrics Framework .... 14
  2.4. Refactoring .... 16
    2.4.1. Identifying where to Apply which Refactoring .... 17
    2.4.2. Determine which Refactoring(s) should be Applied .... 18
    2.4.3. Guaranteeing that the Refactoring Preserves Software Behavior .... 18
    2.4.3. Assessing the Effect of Refactoring on Quality .... 18
    2.4.4. Apply the Refactoring .... 19
    2.4.5. Requirement Refactoring .... 19
  2.5. Requirements Refactoring .... 20
  2.6. The Business Process Modeling Notation .... 22
    2.6.1. Elements .... 22
    2.6.2. Flow Objects .... 23
    2.6.3. Connecting Objects .... 24
    2.6.4. Swimlane .... 25
    2.6.5. Artifacts .... 27
  2.7. The Practical Software Measurement .... 29
    2.7.1. The Requests of Information from Managers .... 29
    2.7.2. The Measure Model .... 29
    2.7.3. The Measuring Process .... 30
  2.8. Summary .... 31
Chapter 3 - The AIRDoc Process .... 33
  3.1. Introduction .... 34
  Activity E.1 - Elaboration of Evaluation Plan .... 36
  Activity E.2 - Definition of GQM Activities .... 45
  Activity E.3 - Collection of the Metrics Values .... 52
  Activity E.4 - Interpretation of GQM Activities .... 55
  Activity I.1 - Elaboration of Improvement Plan .... 60
  Activity I.2 - Perform Improvement .... 63
  3.2. Summary .... 64
Chapter 4 - Potential Problems and Refactorings .... 65
  4.1. Introduction .... 66
  4.2. Potential Problems Catalog .... 70
    4.2.1. Duplicated Requirement .... 71
    4.2.2. Large Use Case .... 72
    4.2.3. Complex Conditional Structures .... 73
    4.2.4. Lazy Use Case .... 74
    4.2.5. Naming Problems .... 75
    4.2.6. Tangled Requirements .... 76
    4.2.7. Scattered Requirements .... 77
    4.2.8. Large Use Case Model .... 78
    4.2.9. Ambiguous Activity .... 79
    4.2.10. Inconsistent Requirement .... 80
    4.2.11. Lack of Rank .... 81
  4.3. The Catalog of Solutions for Improvement .... 82
    4.3.1. Extract Use Case .... 84
    4.3.2. Rename Use Case .... 88
    4.3.3. Move Activity .... 89
    4.3.4. Inline Use Case .... 92
    4.3.5. Extract Alternative Flow .... 94
    4.3.6. Extract Early Aspectual Use Case .... 96
    4.3.7. Use Cases Package .... 102
    4.3.8. Rank Use Cases .... 107
  4.4. Summary .... 108
Chapter 5 - An Experimental Validation of the AIRDoc Process .... 109
  5.1. Introduction .... 110
  5.2. An Outline of Three Quantitative Studies .... 111
  5.3. A Qualitative Evaluation of the AIRDoc .... 113
    5.3.1. The Case Study Definition .... 114
      5.3.1.1. Define the Hypotheses .... 114
      5.3.1.2. Select the Pilot Projects .... 115
      5.3.1.3. Identify the Method of Comparison .... 115
      5.3.1.4. Plan the Case Study .... 119
      5.3.1.5. Analyze and Report the Results .... 122
      5.3.1.6. Considerations about the Case Study .... 125
  5.4. A Comparative Study Case .... 128
    5.4.1. Plan the Case Study .... 129
    5.4.2. The Case Study .... 131
    5.4.3. Conclusions about the Case Study .... 132
  5.5. Summary .... 134
Chapter 6 - Conclusions and Future Works .... 135
  6.1. Introduction .... 136
  6.2. Contributions .... 136
  6.3. Considerations .... 137
  6.4. Future Works .... 138
  6.5. Related Publications .... 138
References .... 142

(12) List of Figures

Figure 2.1. A GQM model is a hierarchical structure. .... 11
Figure 2.2. A GQM model generated by the AIRDoc. .... 11
Figure 2.3. Questions and metrics based on GQM. .... 12
Figure 2.4. A conceptual software quality metrics framework. .... 15
Figure 2.5. Events. .... 23
Figure 2.6. Activities. .... 24
Figure 2.7. Gateways. .... 24
Figure 2.8. Sequence Flow. .... 25
Figure 2.9. Message Flow. .... 25
Figure 2.10. Association. .... 25
Figure 2.11. Pool. .... 26
Figure 2.12. Lane. .... 26
Figure 2.13. Data objects. .... 28
Figure 2.14. Group. .... 28
Figure 2.15. Annotation. .... 28
Figure 2.16. The PSM model measurement. .... 30
Figure 3.1. AIRDoc main activities. .... 35
Figure 3.2. Elaboration of evaluation plan. .... 36
Figure 3.3. Definition of a quality team. .... 38
Figure 3.4. Selection of tools and/or other resources. .... 39
Figure 3.5. Definition of software quality requirements. .... 39
Figure 3.6. Establish the quality evaluation scope. .... 40
Figure 3.7. Generation of project plan. .... 43
Figure 3.8. Definition of GQM activities. .... 45
Figure 3.9. Framework to generate a quality model. .... 46
Figure 3.10. Elaboration of assertions. .... 49
Figure 3.11. Collection of metrics values. .... 53
Figure 3.12. Hold trial period. .... 53
Figure 3.13. Interpretation of GQM activities. .... 56
Figure 3.14. Preparation of feedback material. .... 56
Figure 3.15. Conclusions about measurement results. .... 59
Figure 3.16. Elaboration of improvement plan. .... 60
Figure 3.17. Perform improvement. .... 63
Figure 4.1. Complaint use case specification. .... 86
Figure 4.2. Complaint use case specification (after the refactoring). .... 87
Figure 4.3. Register animal complaint use case. .... 87
Figure 4.4. Login use case. .... 91
Figure 4.5. Change logged employee use case. .... 93
Figure 4.6. Fill and ship order use case. .... 95
Figure 4.7. Fill and ship order use case alternative flow (after the refactoring). .... 96
Figure 4.8. Register new employee. .... 100
Figure 4.9. Register new product. .... 100
Figure 4.10. Login early aspect. .... 101
Figure 4.11. Composition rules. .... 101
Figure 4.12. Partial use cases diagram for the Adjustment Taxes. .... 104
Figure 4.13. Package "Display Spreadsheet Control". .... 105
Figure 4.14. Use case model after the application of the refactoring. .... 106
Figure 5.1. Illustration of the five studies realized. .... 110
Figure 5.2. The AIRDoc perspectives evaluated. .... 116
Figure 5.3. Part of evaluation form. .... 117
Figure 5.4. Quality model to evaluate the AIRDoc feasibility. .... 118
Figure 5.5. Total time spent (in minutes) by all groups in each main AIRDoc activity. .... 128
Figure 6.1. Timeline about publications that helped the AIRDoc development. .... 139

(14) List of Tables

Table 2.1. Software quality models. .... 13
Table 3.1. Quality team members. .... 37
Table 3.2. Tools and/or resources. .... 38
Table 3.3. Template to describe the requirement in focus. .... 40
Table 3.4. Template to describe the source of the requirement in focus. .... 41
Table 3.5. Template to describe the scope. .... 42
Table 3.6. Template of the evaluation goal. .... 42
Table 3.7. Template for the schedule of tasks. .... 43
Table 3.8. Template of decisions about training. .... 44
Table 3.9. Document to define the questions. .... 47
Table 3.10. Template for selected metrics. .... 48
Table 3.11. Scales for transformation of numerical values. .... 50
Table 3.12. Template to define the premises. .... 51
Table 3.13. Template to define assertions. .... 52
Table 3.14. Document with the tested values. .... 54
Table 3.15. Data collection form. .... 55
Table 3.16. Document with the questions answered by the assertions. .... 57
Table 3.17. Document with the feedback material. .... 58
Table 3.18. Document to indicate potential problems. .... 60
Table 3.19. Document to show the conclusion about the problems. .... 61
Table 3.20. Document to indicate potential problems. .... 62
Table 4.1. Description of the duplicated requirements problem. .... 71
Table 4.2. Description of the large use case problem. .... 72
Table 4.3. Description of the complex conditional structures problem. .... 73
Table 4.4. Description of the lazy use case problem. .... 74
Table 4.5. Description of the naming problems. .... 75
Table 4.6. Description of the tangled requirements problem. .... 76
Table 4.7. Description of the scattered requirements problem. .... 77
Table 4.8. Description of the large use case model problem. .... 78
Table 4.9. Description of the ambiguous activity problem. .... 79
Table 4.10. Description of the inconsistent requirement problem. .... 80
Table 4.11. Description of the lack of rank problem. .... 81
Table 5.1. Main differences between the second and third versions of the AIRDoc. .... 113
Table 5.2. The subjects' roles. .... 120
Table 5.3. Training and placement conducted by the subjects. .... 121
Table 5.4. Summary of findings. .... 122
Table 5.5. Number of groups that answered "yes" by AIRDoc activity/step. .... 123
Table 5.6. The percentage of "yes" answers for each question. .... 125
Table 5.7. Analysis of the solution used by the groups. .... 126
Table 5.8. Time spent (minutes) by each group in the AIRDoc application. .... 127
Table 5.9. Roles of the case study groups. .... 130
Table 5.10. List of training and placement realized by the groups. .... 131
Table 5.11. Strategies used in the case study. .... 132
Table 5.12. Total time spent by the groups in the evaluation and improvement stage. .... 133
Table 5.13. Analysis and conclusions about the study case final result. .... 133
Table 6.1. When to use the AIRDoc. .... 137

(16) List of Acronyms

AIRDoc - Approach to Improve Requirements Documents
AIDS - Acquired Immune Deficiency Syndrome
B2B - Business to Business
BPD - Business Process Diagram
BPMI - Business Process Management Initiative
BPMN - Business Process Modeling Notation
CIDHA - Centro de Informações de DST, HIV e AIDS
CIN - Centro de Informática
DSOA - Desenvolvimento de Software Orientado a Aspectos
DST - Doença Sexualmente Transmitida
ESD - Electronic System Division
GE - General Electric
GQM - Goal Question Metrics
HIV - Human Immunodeficiency Virus
IDEAS - Workshop Iberoamericano de Engenharia de Requisitos e Ambientes de Software
IEEE - Institute of Electrical and Electronics Engineers
ISO - International Organization for Standardization
LER - Laboratório de Engenharia de Requisitos
OMG - Object Management Group
PRODESP - Companhia de Processamento de Dados de São Paulo
RADC - Rome Air Development Centre
SBES - Simpósio Brasileiro de Engenharia de Software
SBQS - Simpósio Brasileiro de Qualidade de Software
SERPRO - Serviço Federal de Processamento de Dados
SRS - Software Requirement Specification
UFPE - Universidade Federal de Pernambuco
UML - Unified Modeling Language
UNIVASF - Universidade Federal do Vale do São Francisco
WASP - Workshop de Desenvolvimento de Software Orientado a Aspectos
WER - Workshop de Engenharia de Requisitos

(18) Chapter 1 - Introduction

This chapter presents the main motivations of the thesis. The subsequent sections present the problem, the proposed process and the contributions, as well as the structure of the document.

(19) 1.1 Context

The definition of requirements is critical for the success of software system development [Nuseibeh and Easterbrook 2000]. However, current practices often fail to meet this need, leading to inaccurate and faulty descriptions. Requirements documents may come in different styles: some advocate the use of goal-oriented modeling languages such as i* [Yu 1995] and KAOS [Dardenne et al. 1993], while others, developing safety-critical applications, may prefer a more formal approach. Nevertheless, the great majority of requirements documents are written either in natural language or in some semi-structured notation [Mich et al. 2002]. The Unified Modeling Language (UML) [UML, 2009] is a popular standard used in industry to describe requirements documents [SERPRO, 2009], [PRODESP, 2009]. Accordingly, this work focuses on requirements documents specified with use case models. A use case should, at least, provide the main and secondary flows of actions, postconditions and business rules [Cockburn 2001]. Within these flows of actions, a set of problems may compromise the quality of the use case model.

1.2. The Problem

Some attention has been dedicated to discovering typical shortcomings that compromise the quality of a requirements document specified by use cases, such as use cases that have been abandoned and are no longer meaningful, use case descriptions that are too long and difficult to read, and duplicated information, among others [Lilly 1999]. These shortcomings hinder the overall understandability and reusability of requirements documents throughout the development process [Boehm and Sullivan 2000]. Fortunately, they can be minimized by the identification of their symptoms and the removal of their causes. The removal of these symptoms in early stages of the software development process reduces the costs associated with software changes. These cost

(20) reductions can be three to six times greater in later stages than during requirements activities [Pressman 2005]. Unfortunately, the early identification of the symptoms during the initial development stages is unusual. Work towards the identification of early problems is described in [Firesmith, 2007], and work on inspection [Fagan, 1986], such as reading techniques (ad hoc, checklist, Perspective-Based Reading), is proposed in [Travassos et al., 1999], [Basili et al., 1996], [Basili et al., 1997], [Laitenberger, 2000]. Unfortunately, these approaches do not provide well-defined guidelines on how to identify the potential problems in requirements documents and models. These techniques are strongly dependent on the requirements engineer and do not provide quality assurance based on concrete results such as metric values. The adoption of metrics can help identify some of the problems that occur in requirements document artifacts, but their adoption is a difficult endeavor. Collecting, interpreting and analyzing metrics have proved to be a major challenge [Boehm and Sullivan, 2000]. Moreover, their adoption has a high cost. Hence, in this thesis we address the lack of a method to support the application of metrics in use case models and the lack of guidelines to lead the software engineer in solving the potential problems found in use case models. We consider that a use case model is composed of use case descriptions and a use case diagram.

1.3. The Proposed Solution

Some of the symptoms described before may indicate potential problems with the software system and can be removed using appropriate refactoring transformations [Elssamadisy and Schalliol 2002]. For any software system the software engineer has to specify its external quality attributes (such as correctness, robustness, extensibility, reusability, compatibility, efficiency, ease of use, portability and functionality, among others) [Meyer 1997].
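To make the idea of metric-based detection concrete, the sketch below counts the steps in a use case's flows of actions and flags candidates for the "Large Use Case" problem cataloged in Chapter 4. It is only an illustration: the data layout, function names and the threshold value are assumptions for this example, not part of AIRDoc itself.

```python
# Hypothetical sketch: flag a potential "Large Use Case" by counting
# the steps in its flows of actions. The data layout and the threshold
# are illustrative assumptions, not values prescribed by AIRDoc.

LARGE_USE_CASE_THRESHOLD = 10  # assumed limit on steps per use case

def count_steps(use_case: dict) -> int:
    """Total number of steps across the main and alternative flows."""
    return len(use_case["main_flow"]) + sum(
        len(flow) for flow in use_case.get("alternative_flows", [])
    )

def flag_large_use_cases(use_cases: list[dict]) -> list[str]:
    """Return the names of use cases whose step count exceeds the threshold."""
    return [
        uc["name"]
        for uc in use_cases
        if count_steps(uc) > LARGE_USE_CASE_THRESHOLD
    ]
```

A metric like this only points at a symptom; deciding whether the flagged use case really needs a refactoring remains a judgment of the quality team.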

(21) Refactorings can be classified according to which of these quality attributes they affect [Mens and Tourwé, 2004]. This allows the software engineer to improve the quality of a software system by applying the relevant refactorings at the right places. In this thesis, we adopt the Goal Question Metrics (GQM) approach [Basili et al., 1994] and tailor it to the context of requirements documents. Hence, we help the quality assurance team to elaborate goals, to define questions and to choose a set of metrics appropriate for requirements documents based on use cases.

1.4. Methodology

The methodology used to develop this thesis has four phases, described as follows:

• Phase 1 - Brainstorming (sources/techniques) - Initially, a literature survey was carried out to discover existing metrics for requirements documents and approaches that promote the improvement of requirements documents. An initial work was drawn from this research [Ramos et al., 2005]. The GQM approach was chosen to assist the application and interpretation of metrics at the requirements level. To adapt the GQM, several case studies at the requirements level were conducted.

• Phase 2 - Selection of ideas (screening) - Several case studies were conducted on real systems, applying GQM together with requirements document metrics [Ramos et al. 2007a], [Ramos et al. 2007c], [Ramos et al. 2008a], [Ramos et al., 2008b], [Ramos et al. 2008c], [Ramos et al., 2009]. These studies, qualitative and quantitative, helped to refine the AIRDoc. Chapter 5 presents the AIRDoc evolution according to these case studies.

• Phase 3 - AIRDoc Development - In each case study new problems were found and refactorings were developed. The work presented in [Alexander and Stevens 2002], [Cockburn 2001], [Fowler, 1999], [Fowler et al., 2000], [Lilly 1999], [Mens and Tourwé, 2004], [Opdyke, 1992], [Rui et al.,

(22) 2003], [Russo et al., 1998] helped to guide the definition of refactorings and potential problems. The elaboration of the refactoring and problem catalogs was motivated by Martin Fowler's work [Fowler et al., 2000], in which he defines a catalog of problems that occur in source code, called "bad smells", and associates each problem with solutions described as refactorings.

• Phase 4 - AIRDoc validation - With the quantitative experiments [Ramos et al. 2008c], [Ramos et al., 2009] and the qualitative experiment presented in Chapter 5, AIRDoc was validated and refined to achieve the final version presented in this thesis. Initially, AIRDoc did not have the characteristics of a process; however, given its activities and well-defined steps, it was necessary to choose a modeling technique to better express the AIRDoc in a process format. The choice was the Business Process Modeling Notation (BPMN) [White, 2009].

1.5. Contributions

In this work we propose an original process named AIRDoc that is based on well-known techniques and good practices of software engineering and that assists the adoption of metrics to evaluate requirements documents described by use case models. The measures collected help the quality assurance team identify places that could be improved with the use of a refactorings catalog. AIRDoc is an original process because other approaches do not provide a full process to guide a quality team through evaluation based on metrics collection and improvement based on the application of refactorings. In AIRDoc the following benefits are envisaged:

• The adaptation of the Goal Question Metrics approach to the use case model context. To this end, we created the following:

- Templates to guide the Goal definition;

(23) - Templates to guide the elaboration of Questions.

- Templates to facilitate the mapping between Metrics and Questions.

- Templates for Premise definition. Premises support the answers to the questions; these templates were created for the AIRDoc context and are not contained in the original GQM.

- Templates for Assertion definition. Assertions are the possible answers to the questions; these templates were created for the AIRDoc context and are not contained in the original GQM.

- Graphic templates to help the quality team convert the quantitative values obtained from the metrics into qualitative values; these templates were created for the AIRDoc context and are not contained in the original GQM.

• A catalog of potential problems, which helps to characterize some well-known problems.

• A catalog of refactorings to assist the software engineer in solving the problems identified by measures or other means.

1.6. Structure of the Work

Besides this introduction, this work is organized as follows:

Chapter 2 - Background: this chapter presents background information related to the AIRDoc proposal.

Chapter 3 - AIRDoc - Process to Improve Requirements Documents: our process and the detailed description of its steps are shown in this chapter.

(24) Chapter 4 - Potential Problems and Refactorings: in this chapter two catalogs are described: i) Potential Problems, which presents possible problems that can appear in use case models; and ii) Solutions for Improvement, which defines refactorings that describe the solutions to the problems presented in the first catalog.

Chapter 5 - An Experimental Validation of the AIRDoc Process: in this chapter a set of experimental validations performed on the AIRDoc is presented. The experiments were performed by different subjects and in different use case model domains.

Chapter 6 - Conclusions and Future Work: last but not least, some conclusions are drawn and directions for future work are pointed out. A summary of all published papers that contributed directly to the AIRDoc development is presented.

(25) Chapter 2 - Background

This chapter presents background information that is related to the approach presented in this thesis. In general, these works were adapted to the use case model context.

(26) 2.1. Introduction

This chapter presents the background information related to the AIRDoc approach. It is important to note that some approaches presented here are not specific to requirements documents. Our approach is based on the definition of goals, questions and metrics. Hence, we rely on the GQM (Goal Question Metrics) work [Basili et al., 1994]. It will help the quality assurance team to define "what is the artifact that will be measured?" and "what are the metrics that will be used?" The use of GQM needs to be guided by the definition of quality attributes. Quality attributes and quality models are discussed in Section 2.3. Once we have identified potential problems in the requirements documents, we will need to fix them. In this thesis we rely on refactoring techniques for the improvement of use case models. This is the subject of Section 2.4. Last but not least, our AIRDoc process also needs to be represented in some notation. In Section 2.6 we review the language used in this thesis to describe the AIRDoc.

2.2. The Goal Question Metrics Approach

The initial AIRDoc activities provide guidelines and templates for the Goal Question Metric (GQM) definition in the use case model context. The GQM approach is based on the assumption that for an organization to measure in a purposeful way it must first specify the goals for itself and its projects, then it must trace those goals to the data that are intended to define those goals operationally, and finally provide a framework for interpreting the data with respect to the stated goals [Basili et al., 1994]. The result of applying the GQM approach is the specification of a measurement system targeting a particular set of issues, together with a set of rules for interpreting the measurement data. The resulting measurement model has three levels:

(27) Level 1. Conceptual level (GOAL): A goal is defined for an object, for a variety of reasons, with respect to various quality models, from various viewpoints, and relative to a particular environment. Measurable objects are:

• Products: artifacts, deliverables and documents that are produced during the system life cycle, e.g., requirements documents, specifications, designs, programs and test suites. The measurement objects used in the AIRDoc process are the use case diagram and descriptions.

• Processes: software-related activities usually associated with time, e.g., specifying, designing, testing and interviewing.

• Resources: items used by processes in order to produce their outputs, e.g., personnel, hardware, software and office space.

Level 2. Operational level (QUESTION): A set of questions is used to characterize the way the assessment/achievement of a specific goal is going to be performed, based on some characterizing model. Questions try to characterize the measurable object (product, process, resource) with respect to a selected quality issue and to determine its quality from the selected viewpoint. In this thesis the questions are formulated based on templates that realize the Goal defined in Level 1.

Level 3. Quantitative level (METRIC): A set of data is associated with every question in order to answer it in a quantitative way. The data can be:

• Objective: if they depend only on the object that is being measured and not on the viewpoint from which the data are taken, e.g., number of document versions, staff hours spent on a task and use case size.

• Subjective: if they depend on both the object that is being measured and the viewpoint from which they are taken, e.g., text readability and user satisfaction level.

In the AIRDoc, the selection of the metrics depends on the data required by the questions. The AIRDoc provides support to convert the numerical metric values into the qualitative values "good", "medium" and "bad". This conversion helps the software engineer interpret the metric values.
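As an illustration of such a conversion, the sketch below maps a numerical metric value onto the qualitative scale. The function name and the threshold values are invented for this example; in AIRDoc the scales are defined per evaluation through its templates (see Table 3.11).

```python
# Illustrative sketch of converting a numerical metric value into the
# qualitative values "good", "medium" and "bad". The thresholds are
# hypothetical; AIRDoc defines them per evaluation via its scale templates.

def to_qualitative(value: float, good_max: float, medium_max: float) -> str:
    """Map a metric value to 'good', 'medium' or 'bad'.

    Assumes a "lower is better" metric, such as the number of steps
    required to specify a requirement.
    """
    if value <= good_max:
        return "good"
    if value <= medium_max:
        return "medium"
    return "bad"

# Example: a use case specified in 4 steps rates "good" when up to
# 5 steps count as good and up to 10 as medium.
rating = to_qualitative(4, good_max=5, medium_max=10)
```

For a "higher is better" metric the comparisons would simply be reversed; the point is that the quality team fixes the thresholds before collection, so interpretation is not left to ad hoc judgment.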

(28) A GQM model is a hierarchical structure (Figure 2.1), starting with a goal (specifying the purpose of measurement, the object to be measured, the issue to be measured, and the viewpoint from which the measure is taken). The goal is refined into several questions, such as the ones shown in Figure 2.1, each of which usually breaks the issue down into its major components. Each question is then refined into metrics that can be objective or subjective. The same metric can be used to answer different questions under the same goal. Several GQM models can also have questions and metrics in common, making sure that, when the measure is actually taken, the different viewpoints are taken into account correctly (i.e., the metric might have different values when taken from different viewpoints).

Figure 2.1. A GQM model is a hierarchical structure [Basili et al., 1994].

For example, in the AIRDoc we may define the GQM model presented in Figure 2.2. This model gives the basis to define the goal, questions and metrics.

Figure 2.2. A GQM model generated by the AIRDoc [Ramos et al., 2008b].

Figure 2.3 presents some questions and metrics derived from the model presented in Figure 2.2. The goal to be achieved with the questions and metrics presented in Figure 2.2 is "Assess in the Adjustment Taxes requirements model the Display

(29) Requirement with a view to predict its maintainability". This example shows an evaluation of a use case model named "Adjustment Taxes" from SERPRO [SERPRO, 2009]; more detail about this evaluation may be seen in Chapters 4 and 5. The figure refines the root question "Has the display requirement a favorable maintainability?" into questions on understandability (Q1) and flexibility (Q2), which are in turn refined into sub-questions on size, separation of requirements and coupling, each answered by metrics such as M2a ("How many use cases are required to specify the display requirement?").

Figure 2.3. Questions and metrics based on GQM [Ramos et al., 2008b].

The GQM approach needs a quality model to support the goal, question and metric definitions. Thus, requirements engineers need to choose the best quality model for their evaluation.

2.3. Quality Models

According to [Wallmüller, 1994], "one of the oldest and most frequently applied software quality models is that of [McCall and Walters, 1977]". Other models, such as those of [Murine and Carpenter, 1984] and [Azuma, 1987], are derived from it. McCall's model is used in the United States for very large projects in the military, space and public domains. It was developed in 1976 by the US Air Force Electronic Systems Division (ESD), the Rome Air Development Centre (RADC) and General Electric (GE) with the aim of improving the quality of software products. One explicit aim was to make quality measurable. McCall started with a set of 55 quality characteristics which have an important influence on quality, and called them "factors". For reasons of simplicity, McCall reduced the number of characteristics to eleven (see Table 2.1). A second set of quality factors was defined by [Boëhm, 1978]. A full list of both sets of factors is given in Table 2.1.

Table 2.1. Software quality models.
[McCall and Walters, 1977]: Efficiency, Integrity, Reliability, Usability, Accuracy, Maintainability, Testability, Flexibility, Interface facility, Re-usability, Transferability.
[Boëhm, 1978]: Usability, Clarity, Efficiency, Reliability, Modifiability, Re-usability, Modularity, Documentation, Resilience, Correctness, Maintainability, Portability, Interoperability, Understandability, Integrity, Validity, Flexibility, Generality, Economy.

Many authors have redefined some of the eleven factors, while others have added even more to better reflect recent advances in the technology. In the next subsection we present the quality model adopted in this thesis, which is based on the IEEE 1061 standard [IEEE 1061, 1998].

2.3.1. A Conceptual Software Quality Metrics Framework

Software quality is the degree to which software possesses a desired combination of quality attributes. The use of software metrics reduces subjectivity in the assessment and control of software quality by providing a quantitative basis for making decisions about software quality [IEEE 1061, 1998]. However, the use of software metrics does not eliminate the need for human judgment in software assessments. The use of software metrics within an organization or project is expected to have a beneficial effect by making software quality more visible. According to IEEE 1061, the use of a standard for measuring quality enables an organization to:
• Assess achievement of quality goals;
• Establish quality requirements for a system at its outset;
• Establish acceptance criteria and standards;
• Evaluate the level of quality achieved against the established requirements;
• Detect anomalies or point to potential problems in the system;
• Predict the level of quality that will be achieved in the future;
• Monitor changes in quality when software is modified;
• Assess the ease of change to the system during product evolution;
• Validate a metrics set.

The conceptual software quality metrics framework shown in Figure 2.4 is designed to be flexible. It allows additions, deletions, and modifications of quality factors, quality sub-factors, and metrics. Each level may be expanded to several sub-levels. The framework can thus be applied to all systems and can be adapted as appropriate without changing the basic concept. Therefore, the quality models mentioned in the previous subsection could be adapted to the format of this framework.

Figure 2.4. A conceptual software quality metrics framework [IEEE 1061, 1998].

The first level of the software quality metrics framework hierarchy begins with the establishment of quality requirements by the assignment of various quality attributes, which are used to describe the quality of the entity "system X". All attributes defining the quality requirements are agreed upon by the project team, and then the definitions are established. Quality factors that represent management- and user-oriented views are then assigned to the attributes. If necessary, quality sub-factors are then assigned to each quality factor. Associated with each quality factor is a direct metric that serves as a quantitative representation of that quality factor.
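The three-level hierarchy can be sketched as a small data structure. The following Python fragment is an illustrative sketch only: the factor, sub-factor and metric names echo the AIRDoc example of Figure 2.2, but the normalization of metric values to the range 0..1 and the simple averaging rule are our own assumptions; the standard leaves the aggregation method open.

```python
# Illustrative sketch of the IEEE 1061 quality metrics hierarchy.
# Names, the 0..1 normalization, and the averaging rule are assumptions.

def average(values):
    """Aggregate a list of normalized metric values (0..1)."""
    return sum(values) / len(values)

# Level 3: metric values, normalized to the range 0..1.
metrics = {
    "M1a": 0.8,  # e.g., normalized count of included use cases
    "M2a": 0.6,  # e.g., normalized number of use cases
    "M2b": 0.7,  # e.g., normalized number of steps
}

# Level 2: quality sub-factors decomposed into metrics.
sub_factors = {
    "separation_of_requirements": ["M1a"],
    "size": ["M2a", "M2b"],
}

# Level 1: quality factors decomposed into sub-factors.
factors = {"maintainability": ["separation_of_requirements", "size"]}

def sub_factor_value(name):
    return average([metrics[m] for m in sub_factors[name]])

def factor_value(name):
    return average([sub_factor_value(s) for s in factors[name]])

print(round(factor_value("maintainability"), 3))  # → 0.725
```

Bottom-up evaluation as in the framework: metric values feed sub-factor values, which feed the factor value.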

At the second level of the hierarchy there are quality sub-factors, which represent software-oriented attributes that indicate quality. These can be obtained by decomposing each quality factor into measurable software attributes. Quality sub-factors are independent attributes of software, and therefore may correspond to more than one quality factor. The quality sub-factors are concrete attributes of software that are more meaningful than quality factors to technical personnel, such as analysts, designers, programmers, testers, and maintainers. The decomposition of quality factors into quality sub-factors facilitates objective communication between the manager and the technical personnel regarding the quality objectives.

At the third level of the hierarchy the quality sub-factors are decomposed into metrics used to measure system products and processes during the development life cycle. Direct metric values, or quality factor values, are typically unavailable or expensive to collect early in the software life cycle. From bottom to top, the framework enables the managerial and technical personnel to obtain feedback by:
• Evaluating the software products and processes at the metrics level;
• Analyzing the metric values to estimate and assess the quality factors.

An example of the application of this framework by AIRDoc is shown in Figure 2.3. Once we have measured the appropriate quality factors, the AIRDoc process will possibly be able to detect some potential problems. The solution to a problem can be defined in terms of refactorings to be performed.

2.4. Refactoring

In AIRDoc, a catalog is created to collect possible solutions to potential problems, expressed as refactorings. Thus, when an AIRDoc user finds a potential problem, he or she can be guided towards a solution by means of these refactorings. As described in [Mens and Tourwé, 2004], the refactoring process consists of a number of distinct activities:

1. Identify where the software should be refactored (the AIRDoc evaluation stage is used in this activity);
2. Determine which refactoring(s) should be applied to the identified places (the potential problems catalog provided by AIRDoc assists in this activity);
3. Guarantee that the applied refactoring preserves behavior (at the requirements level this guarantee is the responsibility of the requirements engineer; AIRDoc does not have a mechanism to guarantee the preservation of behavior after the refactorings are applied, but each refactoring has a step that reminds the requirements engineer to verify it);
4. Apply the refactoring;
5. Assess the effect of the refactoring on quality characteristics of the software (e.g., complexity, understandability and maintainability) or of the process (e.g., productivity, cost and effort).

Each of these activities can be supported by different approaches and tools. Some of them are discussed in the following subsections.

2.4.1. Identifying where to Apply the Refactorings

A first decision that needs to be taken is to determine the appropriate level of abstraction at which to apply the refactoring. Refactorings can be applied to the program itself (i.e., the source code) or to more abstract software artifacts such as design models or requirements documents. In this work we focus on requirements documents specified by use cases.

Martin Fowler informally links "bad smells" to refactorings. Bad smells are "structures in the code that suggest (sometimes scream for) the possibility of refactoring" [Fowler, 1999]. Tourwé and Mens use a semi-automated approach based on logic meta-programming to formally specify and detect these bad smells and to propose refactoring opportunities that remove them [Tourwé and Mens, 2003].
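The same idea, using metrics to flag where to refactor, can be sketched for use case models in a few lines. Everything below (the toy model, the "steps" metric, and the threshold) is invented for illustration; it is not part of AIRDoc or of Fowler's catalog.

```python
# Illustrative sketch: flagging refactoring opportunities in use case
# descriptions via a simple metric. Use cases, counts and the threshold
# are assumptions for illustration only.

use_cases = {
    "Adjust Taxes": {"steps": 14, "includes": ["Display Report", "Log In"]},
    "Display Report": {"steps": 5, "includes": []},
    "Log In": {"steps": 3, "includes": []},
}

MAX_STEPS = 10  # assumed threshold above which a use case "smells" too long

def long_use_cases(model, max_steps=MAX_STEPS):
    """Return names of use cases whose step count suggests refactoring."""
    return [name for name, uc in model.items() if uc["steps"] > max_steps]

print(long_use_cases(use_cases))  # → ['Adjust Taxes']
```

A flagged use case is only an opportunity, not an order: the engineer still decides whether a refactoring is worthwhile.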
In this thesis, AIRDoc detects the right place to apply a refactoring by means of metrics, and relies on a catalog that links potential problems to refactorings, much as Fowler links "bad smells" to refactorings [Fowler, 1999]. Another way to identify where to apply the refactorings is through the indication of the requirements engineer who developed the use case model.

2.4.2. Determine which Refactoring(s) should be Applied

In Fowler's work [Fowler, 1999], mentioned before, the "bad smells" help the software engineer determine which refactoring should be applied to each kind of situation. In this thesis, we also indicate how a refactoring should be applied to each "refactoring opportunity". However, refactoring opportunities should not be seen as exact rules allowing the automatic application of refactorings. The requirements engineer needs to decide about the trade-offs in changing the system requirements and to choose which refactoring is more adequate for each opportunity.

2.4.3. Guaranteeing that the Refactoring Preserves Software Behavior

By definition, a refactoring should not alter the behavior of the software [Opdyke, 1992], [Fowler, 1999]. The original definition of behavior preservation, as suggested by Opdyke [Opdyke, 1992], establishes that, for the same set of input values, the resulting set of output values should be the same before and after the refactoring. Opdyke suggests ensuring this particular notion of behavior preservation by specifying refactoring preconditions.

In many application domains, requiring the preservation of input-output behavior is insufficient, since many other aspects of the behavior may be relevant as well [Mens and Tourwé, 2004]. This implies that a wider range of definitions of behavior that may or may not be preserved by a refactoring is needed, depending on domain-specific or even user-specific concerns. Mens and Tourwé (2004) list the following:
• For real-time software, an essential aspect of the behavior is the execution time of certain (sequences of) operations. In other words, refactorings should preserve all kinds of temporal constraints.
• For embedded software, memory constraints and power consumption are also important aspects of the behavior that may need to be preserved by a refactoring.

• For safety-critical software, there are concrete notions of safety (e.g., liveness) that need to be preserved by a refactoring.

There are some pragmatic ways to deal with behavior preservation, for example, using a rigorous testing discipline. If there is an extensive set of test cases and all these tests still pass after the refactoring, there is good evidence that the refactoring preserves the program behavior [Pipka, 2002]. At the requirements level, the preservation of behavior means that the semantics of the requirements remain unchanged. In AIRDoc, the preservation of behavior depends on how careful the software engineer is in the use of the refactorings. All the guidelines that describe the AIRDoc refactorings include a step reminding the engineer to preserve the behavior.

2.4.4. Apply the Refactoring

At the code level there are some tools that assist the implementation of refactorings, such as [Kataoka et al., 2001], [Balazinska et al., 2000] and [Ducasse et al., 1999]. However, at the requirements document level, as in AIRDoc, the requirements engineer must follow the refactoring steps correctly for their precise application.

2.4.5. Assessing the Effect of Refactoring on Quality

For any software artifact, its external quality attributes (such as robustness, extensibility, reusability and performance) can be specified [Sommerville, 1997]. Refactorings can be classified according to which of these quality attributes they affect. This allows the software engineer to improve the quality of software by applying the relevant refactorings at the right places. To achieve this, each refactoring has to be analyzed according to its particular purpose and effect. Some refactorings remove redundancy, some raise the level of abstraction, some enhance reusability, and so on [Fowler, 1999].
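One concrete way to carry out such an analysis is to collect internal metric values before and after the refactoring and compare them. The metric names and numbers below are invented for illustration only; they do not come from any of the cited works.

```python
# Illustrative sketch: estimating the effect of a refactoring by comparing
# internal metric values collected before and after it is applied.
# The metric names and values below are invented for illustration only.

before = {"use_cases": 1, "steps_per_use_case": 14, "includes": 0}
after = {"use_cases": 3, "steps_per_use_case": 6, "includes": 2}

def metric_deltas(before, after):
    """Signed change of each internal metric caused by the refactoring."""
    return {name: after[name] - before[name] for name in before}

deltas = metric_deltas(before, after)
# A drop in steps per use case hints at better understandability; the
# engineer still judges whether the extra use cases are worth it.
print(deltas)  # → {'use_cases': 2, 'steps_per_use_case': -8, 'includes': 2}
```

The deltas are only an estimate of the effect on quality; interpreting them against the chosen quality model remains a human decision.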
This effect can be estimated to a certain extent by expressing the refactorings in terms of the internal quality attributes they affect (such as size, complexity, coupling, and cohesion). In order to measure or estimate the impact of a refactoring on quality characteristics, many different techniques can be used. Examples include, but are not limited to, software metrics, empirical measurements, controlled experiments, and statistical techniques. Coupling metrics have been proposed as an evaluation method to determine the effect of refactoring on the maintainability of a program [Kataoka et al., 2002]. Design decisions have been encoded as softgoal graphs to guide the application of the transformation process [Tahvildari and Kontogiannis, 2002]. These softgoal graphs describe correlations between quality attributes. The association of refactorings with a possible effect on softgoals addresses maintainability enhancements through primitive and composite refactorings. A catalogue of object-oriented metrics has been used as an indicator to automatically detect where a particular refactoring can be applied to improve the software quality [Tahvildari and Kontogiannis, 2003]. This is achieved by analyzing the impact of each refactoring on these object-oriented metrics. As described above, there is no tool that assists in analyzing the application of refactorings at the requirements level. However, AIRDoc includes some activities (see "I.2.2 - Evaluation of the New Use Case Model" in the next chapter) that assess the use case model before and after the refactorings are applied.

2.5. Requirements Refactoring

Refactorings are most commonly applied to source code; however, they can be applied to any type of software artifact. For example, it is possible and useful to apply refactorings to design models, database schemas, software architectures, and requirements documents [Mens and Tourwé, 2004]. Refactoring these kinds of software artifacts rids the developer of many implementation-specific details and raises the expressive power of the changes that are made. On the other hand, applying refactorings to different types of software artifacts introduces the need to keep them all in synchronization [Mens and Tourwé, 2004].
Some related approaches that address the refactoring or restructuring of requirements specifications are discussed next:
• Russo et al. [Russo et al., 1998] suggest restructuring natural language requirements specifications by decomposing them into a structure of viewpoints. Each viewpoint encapsulates partial requirements of some system components, and interactions between these viewpoints are made explicit. According to the authors, this restructuring approach increases requirements understanding and facilitates detecting inconsistencies and managing requirements evolution. The work of Russo et al. does not indicate the potential problems that will be solved with the restructuring; thus, the requirements engineer does not know where and when to apply it. AIRDoc contains guidelines to evaluate and localize the potential problems and helps the requirements engineer to apply the correct refactorings to each specific potential problem.
• Rui et al. [Rui et al., 2003] describe a meta-model for use case modeling. The authors categorize several use cases that share a common behavior (more precisely, a course of action) and create a list of use case refactorings. Their meta-model describes recommendations on how to extract the common behavior into a new use case and use the include relation to relate it to the parents. AIRDoc extends their refactorings with a detailed catalog of potential problems and refactorings, including, for each refactoring: the context for its application, a possible solution, the motivation to apply the transformations, and an example of its practical use.
• Yu et al. [Yu et al., 2004] explain how refactoring can be applied in order to improve the organization of use case models. They focus on the decomposition of a use case and the reorganization of relationships between use cases. They also describe ten refactorings that can be used to improve the overall organization of use case models, such as the introduction of inclusion or extension mechanisms, use case deletion, or refactorings that manipulate the inheritance tree. While Yu et al. focus on refactoring use case models, AIRDoc focuses on refactoring use case descriptions; thus, the AIRDoc refactorings are finer grained than theirs. The refactoring catalog provided by AIRDoc describes in detail the mechanics of each refactoring and possible refactoring opportunities in the context of use case descriptions. Moreover, the strategy of Yu et al. does not provide mechanisms to find where the refactoring should be applied.
• El-Attar and Miller [El-Attar and Miller, 2006] proposed anti-patterns to detect potentially defective areas in use case models. The anti-patterns include some suggestions for the improvement of running examples. Compared with AIRDoc, their work fails to provide guidelines to assist practitioners to effectively measure, detect and improve those qualities.
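As a toy illustration of the kind of transformation these catalogs describe (extracting behavior shared by several use cases into a new use case related via include), consider the following sketch. The model representation, names and helper function are ours, not taken from any of the cited works, and real catalogs state preconditions that we skip here.

```python
# Toy sketch of a classic use case refactoring: extracting steps shared by
# several use cases into a new use case, related to them via <<include>>.
# Model representation, names and helper are assumptions for illustration.

model = {
    "Withdraw Cash": ["insert card", "enter PIN", "choose amount", "take cash"],
    "Check Balance": ["insert card", "enter PIN", "show balance"],
}

def extract_common(model, new_name, common):
    """Move a shared step sequence into a new, included use case."""
    refactored = {new_name: list(common)}
    for name, steps in model.items():
        # Drop the shared steps and reference the new use case instead.
        kept = [s for s in steps if s not in common]
        refactored[name] = [f"include {new_name}"] + kept
    return refactored

new_model = extract_common(model, "Authenticate", ["insert card", "enter PIN"])
# The requirements engineer must still check that behavior is preserved.
print(new_model["Withdraw Cash"])  # → ['include Authenticate', 'choose amount', 'take cash']
```

This naive sketch assumes the shared steps form one coherent sequence in every use case; a real catalog entry would spell out that precondition.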

As discussed before, we also need a notation to present our process. In this thesis, the Business Process Modeling Notation (BPMN) was chosen. BPMN meets the needs of modeling the activities and steps of AIRDoc, besides being simple and easy to understand.

2.6. The Business Process Modeling Notation

The AIRDoc process is modeled using the standard Business Process Modeling Notation (BPMN). BPMN is a standard used to model business process flows and web services. It was developed by the Business Process Management Initiative (BPMI) and has been maintained by the Object Management Group (OMG) since the two organizations merged in 2005. Its current adopted version is 1.0 and a new version (2.0) is under development [OMG, 2009].

The BPMN goal is to provide a notation that is readily understandable by all business users, from the business analysts that create the initial drafts of the processes to the technical developers responsible for implementing the technology that will perform those processes [Owen and Raj, 2003]. In addition, BPMN defines a Business Process Diagram (BPD), which is based on a flowcharting technique tailored for creating graphical models of business process operations. A Business Process Model, then, is a network of graphical objects, which are activities (i.e., work) and the flow controls that define their order of performance [White, 2009]. BPMN has tried to find the best possible trade-off between an intuitive notation, using familiar constructs, and a complete set of business rules common to business processes [Dubray, 2009].

2.6.1. Elements

A BPD is made up of a set of graphical elements. These elements enable the easy development of simple diagrams that will look familiar to most business analysts (e.g., a flowchart diagram). The elements were chosen to be distinguishable from each other and to utilize shapes that are familiar to most modelers. For example, activities are rectangles and decisions are diamonds.

It should be emphasized that one of the drivers for the development of BPMN is to create a simple mechanism for creating business process models, while at the same time being able to handle the complexity inherent to business processes [White, 2009]. The approach taken to handle these two conflicting requirements was to organize the graphical aspects of the notation into specific categories. This provides a small set of notation categories, so that the reader of a BPD can easily recognize the basic types of elements and understand the diagram. Within the basic categories of elements, additional variation and information can be added to support the requirements for complexity without dramatically changing the basic look-and-feel of the diagram. The four basic categories of elements are:
• Flow Objects
• Connecting Objects
• Swimlanes
• Artifacts

These four categories of elements make it possible to build a simple diagram (a BPD). A BPD also allows modelers to define their own types of Flow Objects or Artifacts to make the diagram more understandable [Dubray, 2009].

2.6.2. Flow Objects

Flow Objects consist of only three core elements:
• Events - An Event is represented by a circle and is something that "happens" during the course of a business process. Events affect the flow of the process and usually have a cause (trigger) or an impact (result). There are three types of Events, based on when they affect the flow: Start, Intermediate and End (see Figure 2.5) [White, 2009].

Figure 2.5. Events.

• Activities - An Activity is a generic term for work being performed [Dubray, 2009]. An Activity is represented by a rounded-corner rectangle and is a generic term for work that a company performs. An Activity can be atomic or non-atomic (compound) [White, 2009]. Atomic activities can be of type: service, send, receive, user task (workflow), script task or manual task. A user task may have a performer. The types of Activities are: Task and Sub-Process. The Sub-Process is distinguished by a small plus sign in the bottom center of the shape [White, 2009]. In this thesis, we consider a process composed of activities and steps: an activity is performed by a set of steps, or by other sub-activities, to achieve a specific goal. Therefore, in the AIRDoc context, a Sub-Process is represented by an Activity and a Task by a Step. Thus, an Activity is composed of Steps or other Activities. Figure 2.6 shows the Activities.

Figure 2.6. Activities.

• Gateways - A Gateway is represented by a diamond shape and is used to control the divergence and convergence of Sequence Flows [White, 2009]. Thus, it determines traditional decisions, as well as the forking, merging, and joining of paths (Figure 2.7). Internal markers indicate the type of behavior control [White, 2009]. A gateway can be thought of as a question that is asked at a point in the process flow. The question has a defined set of alternative answers, which are in effect gates [Owen and Raj, 2003].

Figure 2.7. Gateways.

2.6.3. Connecting Objects

The Flow Objects are connected together in a diagram to create the basic skeletal structure of a business process. There are three Connecting Objects that provide this function [White, 2009]. These connectors are:

• Sequence Flow - A Sequence Flow is represented by a solid line with a solid arrowhead and is used to show the order (the sequence) in which activities will be performed in a Process [White, 2009]. To show the order of execution of processes, you connect them with a Sequence Flow. A Sequence Flow is used to show the sequence of processes in an organization or department (Figure 2.8) [Owen and Raj, 2003].

Figure 2.8. Sequence Flow.

• Message Flow - A Message Flow is represented by a dashed line with an open arrowhead and is used to show the flow of messages between two separate Process Participants (business entities or business roles) that send and receive them (Figure 2.9) [White, 2009]. The Message Flow is available to model the ordering of processes between organizations or departments (in other words, between pools) [Owen and Raj, 2003].

Figure 2.9. Message Flow.

• Association - An Association is represented by a dotted line with a line arrowhead and is used to associate data, text, and other Artifacts with flow objects (Figure 2.10). Associations are used to show the inputs and outputs of activities [White, 2009].

Figure 2.10. Association.

2.6.4. Swimlane

Many process modeling methodologies use the concept of swimlanes as a mechanism to organize activities into separate visual categories in order to illustrate different functional capabilities or responsibilities [White, 2009]. Swimlanes represent participants [Dubray, 2009]. BPMN supports swimlanes with two main constructs:
• Pool - A Pool represents a Participant in a Process [White, 2009]. It also acts as a graphical container which holds many Flow Objects, Connecting Objects and Artifacts [White, 2009], partitioning a set of activities from other Pools, usually in the context of B2B (Business to Business) situations. A pool can represent other things besides an organization, such as a function (something that the organization performs, like Marketing, Sales or Training), an application (or computer software program), a location (a physical location in the company), a class (a software module in an object-oriented computer software program), or an entity (representing a logical table in a database). It can only represent one thing, but that thing comes from this 'heterogeneous list' of different types of things [Owen and Raj, 2003].

Figure 2.11. Pool.

• Lane - A Lane is a sub-partition within a Pool that extends the entire length of the Pool, either vertically or horizontally. Lanes are used to organize and categorize activities [White, 2009]. They allow us to group activities which are logically related to each other (e.g., when they are performed by the same department) [Dubray, 2009]. The lanes organize the Flow Objects, Connecting Objects and Artifacts more precisely [White, 2009].

Figure 2.12. Lane.

Typically, a pool represents an organization, and a lane represents a department within that organization. By taking processes and placing them in pools or lanes, we specify who does what; events specify where things occur, and gateways specify where decisions are made, or who makes them [Owen and Raj, 2003]. The analogy between this representation and swimming pools is a useful one. Imagine a process swimming down a lane, and changing lanes as needed to perform an activity, within a pool. The pool can be considered a 'pool' of resources. There are occasions when the process needs to jump to another pool, because that pool has different resources needed to complete the activity [Owen and Raj, 2003]. The AIRDoc process is formed by one pool divided into two lanes, called Evaluation Stage and Improvement Stage.

2.6.5. Artifacts

BPMN was designed to allow modelers and modeling tools some flexibility in extending the basic notation and in providing the ability for additional context appropriate to a specific modeling situation, such as for a vertical market (e.g., insurance or banking). Any number of Artifacts can be added to a diagram as appropriate for the context of the business processes being modeled [White, 2009]. Artifacts allow developers to bring more information into the model/diagram; in this way the model/diagram becomes more readable [White, 2009]. There are three predefined Artifacts:
• Data Objects - represent the input and output of activities [Dubray, 2009]. They are a mechanism to show how data is required or produced by activities, and they are connected to activities through Associations [White, 2009]. Data Objects are artifacts that may represent many different types of electronic or physical items. Since they represent data, they are defined by a combination of one or more entities (corresponding to database tables) or classes (corresponding to object-oriented software modules that contain data) [Owen and Raj, 2003]. In the AIRDoc context, Data Objects represent the inputs and outputs of the activities (Figure 2.13).
All AIRDoc inputs and outputs are numbered to aid visualization and the organization of the approach.

Figure 2.13. Data Objects.

• Group - A Group is represented by a rounded-corner rectangle drawn with a dashed line (Figure 2.14). Grouping can be used for documentation or analysis purposes, but it does not affect the Sequence Flow [White, 2009].

Figure 2.14. Group.

• Annotation - An Annotation is used to give the reader of the model/diagram additional understandable information [White, 2009]. BPMN has a textual annotation that can be affixed to any model element, in order to describe extra details about the element in good old-fashioned words [Owen and Raj, 2003].

Figure 2.15. Annotation.

The AIRDoc process is aligned with Practical Software Measurement (PSM) [PSM, 2003]. PSM defines some activities to perform measurements on software and systems; some of these activities are used and/or adapted in the AIRDoc context.

2.7. The Practical Software Measurement

In the following, we present an overview of PSM and indicate which of its activities served as the basis for some of the AIRDoc activities. Practical Software and Systems Measurement: A Foundation for Objective Project Management [PSM, 2003] started within the USA Department of Defense and presents an approach to define an effective software and systems measurement process. The PSM objective is to provide quantitative information to support decisions that impact project cost, schedule and technical performance objectives. There are three key points that define the basis of PSM [Card, 2003]:
1 - The requests for information from managers;
2 - The measure model; and
3 - The process model.

2.7.1. The Requests for Information from Managers

The requests come from the manager's effort to steer the results of projects and processes, and are based on two points:
1 - The goals to be achieved by the manager;
2 - The obstacles to achieving these goals.

2.7.2. The Measure Model

The measure model defines the relationship between the requests for information from managers and the measures to be collected. It defines three measurement levels: 1) basic measures, 2) derived measures, and 3) indicators (see Figure 2.16).
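The three levels can be sketched with a small example. The measures (steps, use cases) and the decision threshold below are our own assumptions, not prescribed by PSM.

```python
# Illustrative sketch of the three PSM measurement levels. The example
# measures and the threshold are our own assumptions for illustration.

# 1) Basic measures: collected directly from the measured attribute.
basic = {"steps": 120, "use_cases": 10}

# 2) Derived measure: a function of two or more basic measures.
steps_per_use_case = basic["steps"] / basic["use_cases"]

# 3) Indicator: a derived measure compared against decision criteria,
#    yielding the information a manager asked for.
THRESHOLD = 9  # assumed acceptable average use case size
indicator = "needs attention" if steps_per_use_case > THRESHOLD else "ok"

print(steps_per_use_case, indicator)  # → 12.0 needs attention
```

Each level adds interpretation: basic measures describe the attribute, the derived measure relates them, and the indicator turns the number into information for a decision.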

Figure 2.16. The PSM measurement model [Card, 2003].

The measurement model used in the AIRDoc process is based on the GQM approach, presented in Figure 2.2, which is similar to the PSM model shown in Figure 2.16.

2.7.3. The Measuring Process

The measuring process describes a set of four measurement activities:
1. Process implementation: three tasks support this activity: i) obtain organizational support, ii) define responsibilities, and iii) provide resources. The AIRDoc approach implements task "ii) define responsibilities" inside the activity (E.1) Elaboration of Evaluation Plan, in the sub-activity (E.1.1) Definition of Quality Team. This sub-activity defines the roles needed to perform the whole evaluation and the members assigned to assume those roles.
2. Customize measurements: in PSM, this activity defines the set of software metrics that gives the greatest knowledge about the project at low cost. In the AIRDoc process the customization of measurements is achieved through the GQM approach, which guides the measurement in agreement with the selected goals.

(48) Chapter 2 – Background. 3. Perform measurement: In PSM this activity is responsible to collect and analyze the metrics. This activity is inside activities (E.4) Collection of the Metrics Values and (E.4) Interpretation of GQM Activities from the AIRDoc process. 4. Evaluate measurement: this activity is used to create a base line about the measurement results. In AIRDoc process the step (I.2.2) Evaluation of the New Use Case Model implements this PSM activity.. 2.8. Summary. This chapter presented the background information of the GQM (Goal Question Metrics), Quality models, refactorings, the BPMN and the PSM. The GQM is an approach to guide the software engineer to apply software metrics. In the AIRDoc context, the GQM is used to help the requirement engineer in the selection, application and interpretation of metrics for the use case models. In order to indicate if there exist potential problems and what are their localization in the use case model. To evaluate a use case model using GQM, we need to know what are the qualities that we want to measure. Thus, the use of GQM is based on Quality Models that need to be defined by the quality assurance team. In this thesis, we use a conceptual software quality metrics framework proposed by [IEEE 1061, 1998]. This framework needs to be instantiated by the quality assurance team in each AIRDoc evaluation. After the AIRDoc evaluation it is possible that some potential problems are found. In this thesis, we address the use of refactorings to solve these potential problems. Refactorings can be classified according to which quality attributes they affect. Thus, this allows the software engineer to improve the quality of a use case model by applying the relevant refactorings at the right places. Chapter 4 presents the refactorings catalog to be used when a potential problems was found. The AIRDoc process is divided into two stages: Evaluation and Improvement. 
Each stage is composed of activities, and each activity is composed of other activities or of steps that must be performed to achieve the activity's goal. In this thesis, we adopted

BPMN to model the AIRDoc process because it supplies resources to represent all AIRDoc stages and their activities, steps, inputs, and outputs.

The PSM is a general process that defines activities to perform measurements on software and systems. AIRDoc instantiates some of these activities in the context of use case model measurement.

In the next chapter, we will present the AIRDoc process and a detailed description of its activities and steps.

Chapter 3 - The AIRDoc Process

In this chapter we present the activities and steps of the AIRDoc process. Each activity is expanded and its steps are detailed. Templates that assist the quality team in documenting the evaluation and the improvement are also presented.

3.1. Introduction

AIRDoc is an acronym for "Approach to Improve Requirements Documents"¹. The AIRDoc process is based on GQM [Basili et al., 1994], complies with the IEEE Standard for a Software Quality Methodology [IEEE 1061, 1998] and with the IEEE Recommended Practice for Software Requirements Specifications [IEEE 830, 1998], and is aligned with the PSM [PSM, 2003]. It is divided into two stages:

(1) Evaluation, which consists of four activities²: (E.1) Elaboration of Evaluation Plan; (E.2) Definition of GQM Activities; (E.3) Collection of the Metrics Values; and (E.4) Interpretation of GQM Activities.

(2) Improvement, which consists of two activities: (I.1) Elaboration of Improvement Plan; and (I.2) Perform Improvement.

Figure 3.1 illustrates the process, showing the activities modeled with BPMN (Business Process Modeling Notation) [White, 2009]. Each activity encloses a sub-process that helps the quality team to conduct the process.

¹ In another context, AirDocs were used during the First World War to assist pilots of combat aircraft. They consisted of a series of aircraft documents. More details at http://www.airdoc.eu
² It is considered, in this thesis, that an activity is composed of steps or of other activities (sub-processes).
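The two stages and their six activities enumerated above can be captured as ordered data, which makes the E/I numbering explicit. This is only an illustrative representation of the enumeration in the text, not tooling defined by AIRDoc itself.

```python
# The AIRDoc process as ordered data: two stages, each a sequence of
# activities, following the numbering used in the text (E.1-E.4, I.1-I.2).

AIRDOC = {
    "Evaluation": [
        ("E.1", "Elaboration of Evaluation Plan"),
        ("E.2", "Definition of GQM Activities"),
        ("E.3", "Collection of the Metrics Values"),
        ("E.4", "Interpretation of GQM Activities"),
    ],
    "Improvement": [
        ("I.1", "Elaboration of Improvement Plan"),
        ("I.2", "Perform Improvement"),
    ],
}

def activity_ids(stage):
    """Return the activity codes of one stage, in execution order."""
    return [code for code, _ in AIRDOC[stage]]

print(activity_ids("Evaluation"))
```

A structure like this also makes it easy to check, for example, that every activity code is unique across the whole process.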

Figure 3.1. AIRDoc main activities.

The AIRDoc process is conducted by a group of people that we call the "quality assurance team"; this team is formed by a manager, who selects the other team members (more details in the sub-activity "Definition of Quality Team"). The quality assurance team decides where the process should start. Hence, the first stage, the "evaluation phase", is optional. This is the case, for example, when some potential requirements problems have already been identified in the use case model by other means³. However, if the existence of problems is known but their localization cannot be determined, it is advisable to run the evaluation phase (which requires more careful planning and an appropriate budget and timeframe). In doing so, other problems that could have gone undetected may also be identified.

In the sequel we describe the six activities of the two stages of the AIRDoc process. To illustrate the filling of the templates, we use the evaluation performed in [Ramos et al., 2008b]. This work shows the assessment of the use case model called

³ Potential problems, such as duplicated steps or large use cases, among others, can be identified empirically by requirements engineers.

"Adjustment Tax" provided by SERPRO⁴. The goal of the evaluation was to assess the maintainability of the display requirements. More details can be found in [Ramos et al., 2008b] and in Chapter 5.

Activity E.1 - Elaboration of Evaluation Plan

Figure 3.2. Elaboration of evaluation plan.

The first stage of our approach is the evaluation of the use case model, which consists of four activities (see Figure 3.1). It starts with the elaboration of an evaluation plan (see Figure 3.2). This activity complies with the IEEE Standard for a Software Quality Methodology [IEEE 1061, 1998], which recommends starting with the definition of the evaluation strategy. The inputs of this activity⁵, shown in Figure 3.2, are: (1) the use case model to be evaluated, and (2) the quality model. The outputs of this activity are: (3) the schedule of tasks, (4) decisions about training, (5) the selected quality team members, (6) tools and/or resources, (7) the selected scope (this output indicates which part of the use case model will be under evaluation), (8) the quality requirements (this defines, in agreement with the selected quality model, which quality attributes will be measured), and (9) the evaluation plan. The four atomic activities of the Elaboration of Evaluation Plan are expanded, and their inputs/outputs as well as further details are shown in the sequel.

⁴ SERPRO is a large company with development units broadly spread throughout 10 Brazilian cities. The company employs more than 2500 software engineers and has a history of successful and awarded solutions built over 40 years as a partner of the Brazilian Federal Revenue Service (more information at http://www.serpro.gov.br).
⁵ Note that, for the sake of clarity, we have numbered all the data objects. Hence, data object (1) refers to the use case model, which is an input of activity E.1.3.
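The numbered data objects of activity E.1 lend themselves to a simple completeness check: the evaluation can only proceed when the two inputs and the seven outputs, numbered (1)–(9) in the text, are all available. The check below is our own illustration under that assumption; the names paraphrase the data objects and are not AIRDoc terminology.

```python
# Sketch of a completeness check for activity E.1: the two inputs and
# seven outputs are the data objects (1)-(9) enumerated in the text.
# The check itself is an illustration, not part of the AIRDoc definition.

E1_INPUTS = {1: "use case model", 2: "quality model"}
E1_OUTPUTS = {
    3: "schedule of tasks",
    4: "decisions about training",
    5: "selected quality team members",
    6: "tools and/or resources",
    7: "selected scope",
    8: "quality requirements",
    9: "evaluation plan",
}

def missing_data_objects(produced):
    """Return the numbers of the E.1 data objects not yet produced."""
    required = set(E1_INPUTS) | set(E1_OUTPUTS)
    return sorted(required - set(produced))

# Example: the scope (7) and the evaluation plan itself (9) are still missing.
print(missing_data_objects({1, 2, 3, 4, 5, 6, 8}))
```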

E.1.1 - Definition of Quality Team

The first sub-activity of the Elaboration of Evaluation Plan consists of three steps. In the first step (E.1.1.1), the quality team defines which roles are needed to perform the whole evaluation. The output (5.1)⁶ contains a list of all roles defined by the quality team. The next step (E.1.1.2) defines the responsibilities of each role (5.2); this helps to eliminate conflicts. One team member may assume one or more roles. The last step (E.1.1.3) assigns team members to specific roles, producing a list of the team members and their roles (5.3). Table 3.1 shows a template that could be used to define the quality team members and generate the output of this activity (5). Note that examples are used to illustrate the filling of this template. Figure 3.3 shows the three steps and the inputs/outputs of the activity that defines the quality assurance team.

Table 3.1. Quality team members.

Project Team Roles (5.1) | Responsibilities (5.2) | Assigned Members (5.3)
<Project team role> | <Specific quality-related responsibility> | <Members fulfilling the role>
Quality Assurance Manager | Coordinate the path of evaluation and improvement according to the steps of the AIRDoc approach | Ricardo Ramos
Requirements Engineer | Development and management of the requirements document that is being evaluated | SERPRO manager of Recife unit⁷
Reviser/Reader | Revise the metric applications and other generated data | Jaelson Castro / João Araújo / Rosangela Penteado
Writer | Write the documentation of each AIRDoc step | Ricardo Ramos
Metrics Collector | Apply the metrics in the requirements document | Ricardo Ramos and Emanuel Santos⁸

⁶ The reference of the data object (input/output) will be listed between parentheses.
⁷ To preserve the privacy of the SERPRO employee, we omit their name.
⁸ Master's student and member of the LER group (in Portuguese: "Laboratório de Engenharia de Requisitos"), UFPE-CIn, http://www.cin.ufpe.br/~ler
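The three steps of E.1.1 (list the roles, attach a responsibility to each, assign members) can be sketched as successive refinements of one mapping, with a check that flags roles still lacking a member. This is a minimal sketch in plain Python; the Requirements Engineer is deliberately left out of the assignment map to show the check firing (in Table 3.1 that role is filled by a SERPRO manager whose name is omitted).

```python
# Sketch of sub-activity E.1.1: step E.1.1.1 lists the roles (5.1),
# step E.1.1.2 attaches a responsibility to each role (5.2), and
# step E.1.1.3 assigns members to roles (5.3). The data mirrors
# Table 3.1; the check below is our own illustration.

roles = ["Quality Assurance Manager", "Requirements Engineer",
         "Reviser/Reader", "Writer", "Metrics Collector"]            # (5.1)

responsibilities = {                                                 # (5.2)
    "Quality Assurance Manager": "coordinate the evaluation and improvement",
    "Requirements Engineer": "manage the requirements document under evaluation",
    "Reviser/Reader": "revise metric applications and generated data",
    "Writer": "document each AIRDoc step",
    "Metrics Collector": "apply the metrics in the requirements document",
}

assigned = {                                                         # (5.3)
    "Quality Assurance Manager": ["Ricardo Ramos"],
    "Reviser/Reader": ["Jaelson Castro", "João Araújo", "Rosangela Penteado"],
    "Writer": ["Ricardo Ramos"],
    "Metrics Collector": ["Ricardo Ramos", "Emanuel Santos"],
}

def unassigned_roles(roles, assigned):
    """Roles for which step E.1.1.3 has not yet named a member."""
    return [r for r in roles if not assigned.get(r)]

print(unassigned_roles(roles, assigned))
```

Since one member may assume several roles (as Ricardo Ramos does here), the mapping is role-to-members rather than member-to-role.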
