
We intend to extend the results presented in this work through a broader analysis of the statistical aggregation methods found in the literature and through further replications of empirical studies in Software Engineering. We observed that the existing parametric and non-parametric aggregation methods have some deficiencies and limitations. One limitation we noticed during our research is that the methods in the literature do not allow aggregating data from studies that used different research types; examples are the families REP053, ORI032, ORI085, and ORI092 (see Appendix A), found in the work of SILVA et al. (2014).

6.2 FUTURE WORK

Below, we list suggestions for more in-depth studies on the observations made in this work:

• Propose extensions of the literature methods used in this research, exploring diverse contexts and situations, in order to solve the problem of data coming from different research types and to provide an aggregation measure for such data.

• Propose solutions for handling atypical (outlier) data.

• Perform meta-analyses on other families of replications, taking other aspects or scenarios into account.

• Propose versions of the RRP and RRNP methods that account for between-study variability.
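As an illustrative sketch of what accounting for between-study variability can look like (not the thesis's RRP/RRNP methods themselves), the standard DerSimonian-Laird random-effects model can be applied to the log response ratios of Hedges, Gurevitch and Curtis (1999): a between-study variance estimate tau² is added to each study's sampling variance before weighting. The data in the example are hypothetical.

```python
import math

def random_effects_log_rr(means_t, means_c, sds_t, sds_c, ns_t, ns_c):
    """Pool log response ratios from k studies with a DerSimonian-Laird
    random-effects model. All arguments are per-study lists
    (treatment/control means, standard deviations, sample sizes)."""
    yi, vi = [], []
    for mt, mc, st, sc, nt, nc in zip(means_t, means_c, sds_t, sds_c, ns_t, ns_c):
        yi.append(math.log(mt / mc))                            # log response ratio
        vi.append(st**2 / (nt * mt**2) + sc**2 / (nc * mc**2))  # sampling variance
    wi = [1.0 / v for v in vi]                                  # fixed-effect weights
    sw = sum(wi)
    fe = sum(w * y for w, y in zip(wi, yi)) / sw                # fixed-effect mean
    q = sum(w * (y - fe) ** 2 for w, y in zip(wi, yi))          # Cochran's Q
    k = len(yi)
    c = sw - sum(w * w for w in wi) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                          # between-study variance
    wri = [1.0 / (v + tau2) for v in vi]                        # random-effects weights
    re = sum(w * y for w, y in zip(wri, yi)) / sum(wri)
    return re, tau2
```

With hypothetical data for three studies, `random_effects_log_rr([12, 11, 13], [10, 10, 10], [2, 2, 2], [2, 2, 2], [20, 20, 20], [20, 20, 20])` returns a pooled log response ratio lying between the per-study ratios, with tau² ≥ 0 capturing how much the studies disagree beyond sampling error.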


References

BASILI, V. R. Quantitative evaluation of software engineering methodology. In: FIRST PAN PACIFIC COMPUTER CONFERENCE - AUSTRALIAN COMPUTER SOCIETY. Anais... [S.l.: s.n.], 1985. p.379–398.

BEZERRA, R. M. M. et al. Replication of Empirical Studies in Software Engineering: an update of a systematic mapping study. In: ACM/IEEE INTERNATIONAL SYMPOSIUM ON EMPIRICAL SOFTWARE ENGINEERING AND MEASUREMENT, ESEM 2015, BEIJING, CHINA, OCTOBER 22-23, 2015. Anais... [S.l.: s.n.], 2015. p.132–135.

BORENSTEIN, M. et al. Introduction to Meta-Analysis. [S.l.]: John Wiley & Sons, 2009.

BREI, V. A.; VIEIRA, V. A.; MATOS, C. A. D. Meta-análise em Marketing. Revista Brasileira de Marketing, [S.l.], n.2, p.84–97, 2014.

HENDRICK, C. Replication, strict replications, and conceptual replications: are they important? Journal of Social Behavior and Personality, [S.l.], n.4, p.41–49, 1991.

COLLINS, H. M. Changing Order: replication and induction in scientific practice. Beverly Hills & London: Sage, 1985.

CRANDALL, B. W. A Comparative Study of Think-aloud and Critical Decision Knowledge Elicitation Methods. SIGART Bull., New York, NY, USA, n.108, p.144–146, Apr. 1989.

DIESTE, O. et al. Hidden evidence behind useless replications. In: INTERNATIONAL WORKSHOP ON REPLICATION IN EMPIRICAL SOFTWARE ENGINEERING RESEARCH (RESER 2010), 1. Proceedings... [S.l.: s.n.], 2010.

DIESTE, O. et al. Comparative analysis of meta-analysis methods: when to use which? In: ANNUAL CONFERENCE ON EVALUATION & ASSESSMENT IN SOFTWARE ENGINEERING (EASE 2011), 15. Anais. . . [S.l.: s.n.], 2011. p.36–45.

DYBA, T. et al. Are Two Heads Better than One? On the Effectiveness of Pair Programming. IEEE Software, Los Alamitos, CA, USA, v.24, n.6, p.12–15, 2007.

EASTERBROOK, S. et al. Selecting empirical methods for software engineering research. In: SHULL, F.; SINGER, J.; SJØBERG, D. I. K. (Ed.). Guide to Advanced Empirical Software Engineering. London: Springer London, 2008. p.285–311.

FAGARD, R.; STAESSEN, J.; THIJS, L. Advantages and disadvantages of the meta-analysis approach. Journal of Hypertension, [S.l.], n.14, p.9–13, 1996.

GLASS, G. V. Primary, Secondary, and Meta-Analysis of Research. Educational Researcher, [S.l.], v.5, n.10, p.3–8, 1976.

GOMEZ, O. S.; JURISTO, N.; VEGAS, S. Understanding replication of experiments in software engineering: a classification. Information and Software Technology, [S.l.], v.56, n.8, p.1033–1048, 2014.

HEDGES, L. V.; GUREVITCH, J.; CURTIS, P. S. The meta-analysis of response ratios in experimental ecology. Ecology, [S.l.], v.80, n.4, p.1150–1156, 1999.

HEDGES, L.; OLKIN, I. Statistical Methods for Meta-Analysis. [S.l.]: Academic Press, 1985.

HEDGES, L. V.; PIGOTT, T. D. The power of statistical tests for moderators in meta-analysis. Psychological Methods, [S.l.], v.9, n.4, p.426–445, 2004.

HIGGINS, J. P. et al. Measuring inconsistency in meta-analyses. BMJ, [S.l.], v.327, n.7414, p.557–560, 2003.

NEYMAN, J.; PEARSON, E. S. On the Problem of the Most Efficient Tests of Statistical Hypotheses. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, [S.l.], v.231, n.694-706, p.289–337, Jan. 1933.

JURISTO, N.; GOMEZ, O. S. Replication of software engineering experiments. In: MEYER, B.; NORDIO, M. (Ed.). Empirical Software Engineering and Verification. Berlin, Heidelberg: Springer-Verlag, 2012. p.60–88.

KITCHENHAM, B. Procedures for performing systematic reviews. [S.l.]: Keele University and NICTA, 2004.

LAJEUNESSE, M. J.; FORBES, M. R. Variable reporting and quantitative reviews: a comparison of three meta-analytical techniques. Ecology Letters, [S.l.], v.6, n.5, p.448–454, 2003.

SCHMIDT, F. L.; HUNTER, J. E. Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, [S.l.], 1977.

MIGUEZ, F. E.; BOLLERO, G. A. Review of corn yield response under winter cover cropping systems using meta-analytic methods. Crop Science, [S.l.], v.45, n.6, p.2318–2329, 2005.

PICKARD, L. M.; KITCHENHAM, B. A.; JONES, P. W. Combining empirical results in software engineering. Information and Software Technology, [S.l.], 1998.

RESSING, M.; BLETTNER, M.; KLUG, S. J. Systematic reviews and meta-analyses. Deutsches Ärzteblatt International, [S.l.], p.456–463, 2009.

RIED, K. Interpreting and understanding meta-analysis graphs - A practical guide. [S.l.]: Australian College of General Practitioners, 2008. v.35, n.8.

RODRIGUES, C. L.; ZIEGELMANN, P. K. Metanálise: um guia prático. Revista HCPA, [S.l.], v.30, n.4, p.436–447, 2010.

ROSENTHAL, R. Meta-Analytic Procedures for Social Research. [S.l.]: SAGE Publications, Inc., 1991.

SCHMIDT, S. Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Review of General Psychology, [S.l.], v.13, n.2, p.90–100, 2009.

SILVA, F. Q. B. da et al. Replication of Empirical Studies in Software Engineering Research: a systematic mapping study. Empirical Software Engineering, [S.l.], v.19, n.3, p.501–557, June 2014.

SUTTON, A. J. et al. Methods for Meta-Analysis in Medical Research. [S.l.]: John Wiley, 2003. v.22.

WOHLIN, C. et al. Experimentation in Software Engineering: an introduction. [S.l.]: Kluver Academic Publishers, 2000.

WOLF, F. M. Meta-Analysis: quantitative methods for research synthesis. [S.l.]: SAGE Publications, Inc., 1986.


Annex

Set of Replications

Experiment Code | Reference

[ORI002FE] CRUZ-LEMUS, J. A. et al. Assessing the influence of stereotypes on the comprehension of UML sequence diagrams: a controlled experiment. p.280–294, 2008.

[REP002FE] CRUZ-LEMUS, J. A. et al. Assessing the influence of stereotypes on the comprehension of UML sequence diagrams: a family of experiments. v.53, p.1391–1403, 2011.

[ORI025] JEFFERY, R.; RUHE, M.; WIECZOREK, I. Using public domain metrics to estimate software development effort. In: 7th International Software Metrics Symposium, 2001. p.16–27.

[REP025] LOKAN, C.; MENDES, E. Cross-company and single-company effort models using the ISBSG database: a further replicated study. In: ISESE '06: Proceedings of the 2006 ACM/IEEE International Symposium on Empirical Software Engineering, 2006. p.75–84.

[REP028] MENDES, E.; LOKAN, C. Replicating studies on cross- vs single-company effort models using the ISBSG database. Empirical Software Engineering, v.13, n.1, p.3, 2008.

[ORI032] PRECHELT, L.; UNGER, B.; TICHY, W. F.; BRÖSSLER, P.; VOTTA, L. G. A controlled experiment in maintenance: comparing design patterns to simpler solutions. IEEE Transactions on Software Engineering, v.27, n.12, p.1134–1144, 2001.

[REP037] MENDES, E.; LOKAN, C.; HARRISON, R.; TRIGGS, C. A replicated comparison of cross-company and within-company effort estimation models using the ISBSG database. In: Proceedings - International Software Metrics Symposium, 2005. p.331–340.

[REP053] PORTER, A. A.; VOTTA, L. G.; BASILI, V. R. Comparing detection methods for software requirements inspections: a replicated experiment. IEEE Transactions on Software Engineering, v.21, n.6, p.563, 1995.

[ORI085] BASILI, V. R.; SELBY, R. W. Comparing the effectiveness of software testing strategies. IEEE Transactions on Software Engineering, v.SE-13, n.12, p.1278–1296, 1987.

[ORI092] PORTER, A. A.; VOTTA, L. G.; BASILI, V. R. Comparing detection methods for software requirements inspections: a replicated experiment. IEEE Transactions on Software Engineering, v.21, n.6, p.563, 1995.

[REP094] RICCA, F.; PENTA, M. D.; TORCHIANO, M.; TONELLA, P.; CECCATO, M. How developers' experience and ability influence web application comprehension tasks supported by UML stereotypes: a series of four experiments. IEEE Transactions on Software Engineering, v.36, n.1, p.96–118, 2010.

[REP129] CRUZ-LEMUS, J. A.; GENERO, M.; MANSO, M. E.; PIATTINI, M. Evaluating the effect of composite states on the understandability of UML statechart diagrams. Lecture Notes in Computer Science, v.3713, p.113–125, 2005.

[REP130] CRUZ-LEMUS, J. A.; GENERO, M.; PIATTINI, M.; MORASCA, S. Improving the experimentation for evaluating the effect of composite states on the understandability of UML statechart diagrams. In: Proceedings of the 5th ACM-IEEE International Symposium on Empirical Software Engineering, Rio de Janeiro, Brazil, 2006. p.9–11.

[REP131] CRUZ-LEMUS, J. A.; GENERO, M.; MANSO, M. E.; MORASCA, S.; PIATTINI, M. Assessing the understandability of UML statechart diagrams with composite states: a family of empirical studies. Empirical Software Engineering, v.14, n.6, p.685–719, 2009. doi:10.1007/s10664-009-9106-z.
