
Part III: Robusta

8.6 Conclusion

Throughout this dissertation we have argued that managing dynamism is a difficult, cross-cutting, and error-prone task. Improper handling of dynamism leads to unexpected and undesirable behavior, such as inconsistencies, state corruption, and memory leaks. This chapter has provided an overview of our approach, from design to runtime and back. We have described the types of analysis that can be performed at both the architectural level and the component implementation level, to assist architects and developers, respectively, in building dynamic applications.

Given how difficult, invasive, and cross-cutting dynamism is in software, tooling and support for managing it are required for both architects and developers.

Developing dynamic applications requires assistance, such as our analyses, and guarantees, such as our approach to decoupling and resilience, which ensures the application remains consistent under both expected and unexpected dynamism.

Chapter 9

Implementation and Validation

Where you give software developers a choice of doing the simple thing or the more complicated thing, they go for the more complicated thing, because there's more reward for doing it.

Hasso Plattner, Chairman of SAP, interview with the Wall Street Journal, May 15th, 2007

You cannot control who you do not understand.

Mao

This chapter presents the implementation prototype for Robusta and its validation. We have focused our implementation on a proof-of-concept prototype that demonstrates the feasibility of using and implementing our approach in the large and complex software used in industry. We have particularly focused on the runtime aspects needed for Robusta to be a feasible and useful approach. Design-time aspects and tooling to assist in the development of dynamic architectures have not been implemented, but we know from experience that they can be. It should be mentioned that we have made an effort to follow an open-world assumption in our prototype, to account for the fact that we cannot anticipate the dynamic changes an application will undergo, nor which components, classes or modules will be used or changed in the future. Following this assumption, our prototype performs its analyses as late as possible so that they assess the current state of the application at any given moment. We have verified that such an approach can be used in industrial software to assess large-scale systems.

Our prototype primarily focuses on the detection of component and module coupling, as described in Chapter 6, Dynamic-Decoupling, which is the basis for permitting unexpected dynamism in an application. Ensuring a component is properly decoupled and resilient to change is essential to assessing whether the application will behave as the architect expects. To achieve decoupling, our analyses focus on analyzing classes, detecting coupling, calculating the Service Contract and determining the Contract Extensions.

9.1 Requirements for coupling detection

As described in Chapter 6, in order to calculate the full extent of coupling in an application we must have complete knowledge of all the relationships between classes and interfaces. These relationships are stored in a directed graph called the Class Dependency Graph, which contains all loaded types and their relationships in order to properly calculate the extent of coupling among types. This graph is computed at runtime.
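As an illustration, a Class Dependency Graph can be modeled as an adjacency map over type names. The sketch below is a minimal, hypothetical model (the names and structure are illustrative, not Robusta's actual implementation): it records "uses" relationships and computes the transitive closure of coupling from a given type.

```java
import java.util.*;

// Minimal sketch of a Class Dependency Graph: nodes are type names,
// directed edges record "uses" relationships (field types, method
// signatures, superclasses, implemented interfaces). Illustrative only.
class ClassDependencyGraph {
    private final Map<String, Set<String>> edges = new HashMap<>();

    // Record that type 'from' depends on type 'to'.
    void addDependency(String from, String to) {
        edges.computeIfAbsent(from, k -> new HashSet<>()).add(to);
    }

    // All types transitively reachable from 'root': the extent of
    // coupling that 'root' drags along with it.
    Set<String> reachableFrom(String root) {
        Set<String> seen = new LinkedHashSet<>();
        Deque<String> work = new ArrayDeque<>();
        work.push(root);
        while (!work.isEmpty()) {
            String t = work.pop();
            if (seen.add(t)) {
                work.addAll(edges.getOrDefault(t, Collections.emptySet()));
            }
        }
        seen.remove(root); // a type is not considered coupled to itself
        return seen;
    }
}
```

In this model, computing the graph "at runtime" simply means calling addDependency as types are observed being loaded, rather than from a static build-time scan.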

Coupling occurs at the class level, but deployment involves modules that contain sets of classes; we must therefore be able to detect the relationships between classes and modules in order to determine which classes belong to which module. Classes are contained in modules, and modules are the unit of dynamism for adding and removing classes (and hence functionality) at runtime. However, as will be seen later in this chapter, this information is not always straightforward to obtain. Furthermore, to remain compatible with current development practices, such an approach should account for the loading of multiple versions of a class, and for name clashes between classes contained in different modules. Indeed, it is common for large applications to contain multiple versions of the same library that are used independently in different areas of the application.
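The consequence of multiple versions is that a class name alone does not identify a class; it must be qualified by its owning module. A minimal sketch of this idea, with hypothetical names (this is not Robusta's actual registry), keys every type by the pair (module, class name):

```java
import java.util.*;

// Sketch: classes are identified by (module, class name), not by name
// alone, so two modules may each provide their own version of the same
// class, e.g. "org.lib.Util". Names are hypothetical, for illustration.
final class TypeId {
    final String module;    // owning module (e.g., an OSGi bundle id)
    final String className;
    TypeId(String module, String className) {
        this.module = module;
        this.className = className;
    }
    @Override public boolean equals(Object o) {
        return o instanceof TypeId
            && ((TypeId) o).module.equals(module)
            && ((TypeId) o).className.equals(className);
    }
    @Override public int hashCode() { return Objects.hash(module, className); }
    @Override public String toString() { return module + "/" + className; }
}

class ModuleRegistry {
    private final Map<TypeId, byte[]> types = new HashMap<>();

    void register(String module, String className, byte[] bytecode) {
        types.put(new TypeId(module, className), bytecode);
    }

    // Which modules currently provide a class of this name?
    List<String> providersOf(String className) {
        List<String> out = new ArrayList<>();
        for (TypeId id : types.keySet())
            if (id.className.equals(className)) out.add(id.module);
        Collections.sort(out);
        return out;
    }
}
```

In a Java runtime the module qualifier corresponds, in practice, to the defining class loader: two classes with the same name loaded by different loaders are distinct types.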

In order to detect coupling we must calculate the Service Contract. Given the Class Dependency Graph, it is straightforward to calculate the Service Contract using simple reachability heuristics. The Contract Extensions must also be determined. Service Contract Extensions can be particularly problematic at runtime because a single newly loaded class can extend another, causing hidden coupling that is difficult to track and detect. Such hidden couplings, as described in Section 6.1.2, can cause undesirable and unexpected behavior because they contaminate the Service Contract. This shows that, if we are to keep components well decoupled and minimize the impact of dynamism on the running application, attention must be paid not only to what is removed from the application but also to what is added to it.
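The two calculations can be sketched together. In the hypothetical model below (names are illustrative, not Robusta's implementation), the base contract is the reachability closure from the service interface, and the extended contract repeatedly folds in any subtype of a contract type together with everything that subtype uses, which is exactly how a later-loaded class contaminates the contract:

```java
import java.util.*;

// Sketch: the Service Contract is the set of types reachable from a
// service interface in the Class Dependency Graph; a Contract Extension
// is a later-loaded type that subtypes a contract type and thereby pulls
// its own dependencies into the (extended) contract. Illustrative only.
class ContractCalculator {
    final Map<String, Set<String>> uses = new HashMap<>();   // dependency edges
    final Map<String, String> extendsType = new HashMap<>(); // subtype -> supertype

    void dependsOn(String from, String to) {
        uses.computeIfAbsent(from, k -> new HashSet<>()).add(to);
    }
    void subtypes(String sub, String sup) { extendsType.put(sub, sup); }

    // Base contract: closure of 'uses' from the service interface.
    Set<String> contract(String serviceInterface) {
        Set<String> seen = new LinkedHashSet<>();
        Deque<String> work = new ArrayDeque<>(List.of(serviceInterface));
        while (!work.isEmpty()) {
            String t = work.pop();
            if (seen.add(t)) work.addAll(uses.getOrDefault(t, Set.of()));
        }
        return seen;
    }

    // Extended contract: add any subtype of a contract type, plus what it
    // uses, and iterate until no new extension appears (hidden coupling).
    Set<String> extendedContract(String serviceInterface) {
        Set<String> result = new LinkedHashSet<>(contract(serviceInterface));
        boolean changed = true;
        while (changed) {
            changed = false;
            for (Map.Entry<String, String> e : extendsType.entrySet()) {
                if (result.contains(e.getValue()) && !result.contains(e.getKey())) {
                    result.addAll(contract(e.getKey()));
                    changed = true;
                }
            }
        }
        return result;
    }
}
```

The fixed-point loop is what makes the calculation open-world friendly: it can simply be re-run whenever a new class is loaded.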

The prototype must calculate the impact on the application of performing a specific dynamic change. In particular, prior to removing a module, it is necessary to calculate which modules would be impacted by the removal and would also require removal, in a domino effect. We should note that, at the architectural level, the Service Contract is expected to be independent of the component implementations, in such a way that component implementations may evolve independently. These are, of course, design decisions, but the prototype must allow for their verification.
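The domino effect amounts to a reverse-reachability question over the module dependency graph: which modules transitively depend on the one being removed? A minimal sketch, with hypothetical names (not Robusta's actual implementation):

```java
import java.util.*;

// Sketch of change-impact calculation: before removing a module, compute
// the set of modules that transitively depend on it and must therefore be
// removed or refreshed along with it (the "domino effect"). Illustrative.
class ImpactCalculator {
    private final Map<String, Set<String>> dependsOn = new HashMap<>();

    void moduleDependsOn(String module, String dependency) {
        dependsOn.computeIfAbsent(module, k -> new HashSet<>()).add(dependency);
    }

    // All modules that would be impacted by removing 'removed'.
    Set<String> impactOfRemoving(String removed) {
        Set<String> impacted = new LinkedHashSet<>();
        boolean changed = true;
        while (changed) {             // iterate to a fixed point
            changed = false;
            for (Map.Entry<String, Set<String>> e : dependsOn.entrySet()) {
                String m = e.getKey();
                if (impacted.contains(m)) continue;
                for (String dep : e.getValue()) {
                    if (dep.equals(removed) || impacted.contains(dep)) {
                        impacted.add(m); // m loses a dependency: domino effect
                        changed = true;
                        break;
                    }
                }
            }
        }
        return impacted;
    }
}
```

Presenting this set to the architect before the change is applied is what allows the impact to be understood, and possibly refused, in advance.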

Finally, an important requirement that has influenced much of our prototype and validation is the open-world assumption we have decided to follow, in order to approximate as closely as possible the real-life concerns of modern, complex, long-running modular applications.

We use the open-world assumption to indicate that no single or central entity has the wisdom to foresee the dynamic changes that will occur in the future. This is essential to allowing programmers and architects the freedom to adapt their applications without having to predict each and every adaptation in advance. Furthermore, in a more practical sense, it is unwise to expect the runtime to anticipate everything that is going to be loaded and thus to perform all coupling calculations beforehand. Indeed, the open-world assumption aligns with the use of current industry technologies for building large and complex software, such as Java enterprise applications.

In summary, our prototype must achieve the following:

• Build a Class Dependency Graph containing all classes loaded and accessed by an application.

• Calculate the Service Contract and the Extended Service Contract.

• Calculate change impact in order to, firstly, understand beforehand the interactions and impact of a change and, secondly, properly refresh all dependencies.

• Follow our open-world assumption to allow for unanticipated and unexpected dynamism.

9.2 Solution Comparison and Tradeoffs

The assumptions we have made regarding how dynamic applications are developed and executed have influenced the technical decisions we have made both regarding our implementation and our validation. In this section we will analyze our choices regarding the implementation of our prototype.

9.2.1 Design-time versus runtime analysis

When an analysis is performed is important in determining the applicability of the solution.

Given our open-world assumption, we can quickly see that it is necessary to provide the analyses and calculations at runtime, because we cannot anticipate at design time which classes will run or which dependencies will exist. Performing such calculations at design time would reduce the scope of Robusta's usability.

This is not to say that the analyses are not useful at design time. Quite the contrary, they can be very useful in assisting developers to properly decouple their components at an early stage, avoiding the cost and energy spent refactoring code late in the development process.

Nevertheless, only at runtime can we have a complete picture of the target application and all the necessary data about coupling among classes and modules as it exists at any given point in time.

Performing the calculations at runtime is widely applicable to different use-cases.

9.2.2 Bytecode versus source code analysis

The approach requires reading classes and calculating the relationships among them. There are two ways of doing this: by reading human-readable source code, or by analyzing compiled bytecode. Given current programming practices and technologies (e.g., Maven, Gradle, OSGi, Java EE, Grails), the execution framework rarely has access to all the source code used to compile or run a program. Indeed, the proliferation of libraries spread across an organization, or obtained from third parties over the internet, makes it increasingly common to simply retrieve existing compiled packages, often open source, and directly integrate them into an application.
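One standard way to observe every class as it is loaded, without needing its source code, is the java.lang.instrument agent API. The sketch below (not Robusta's actual implementation) only records which classes were seen; a real analysis would parse classfileBuffer with a bytecode library such as ASM to extract the dependency edges.

```java
import java.lang.instrument.ClassFileTransformer;
import java.security.ProtectionDomain;
import java.util.ArrayList;
import java.util.List;

// Sketch: a ClassFileTransformer is invoked by the JVM for every class
// as it is loaded, receiving its raw bytecode. Here we only record the
// class name; a real implementation would analyze 'classfileBuffer'
// (e.g., with a bytecode library such as ASM) to extract dependencies.
class LoadObserver implements ClassFileTransformer {
    final List<String> seen = new ArrayList<>();

    @Override
    public byte[] transform(ClassLoader loader, String className,
                            Class<?> classBeingRedefined,
                            ProtectionDomain protectionDomain,
                            byte[] classfileBuffer) {
        seen.add(className); // internal name, e.g. "java/util/ArrayList"
        return null;         // null = leave the bytecode unmodified
    }
}
// In a Java agent, premain(String args, Instrumentation inst) would
// register it with inst.addTransformer(new LoadObserver());
```

This interception point is what makes an as-late-as-possible, open-world analysis practical: nothing needs to be known about a class before the moment it is actually loaded.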