The software development process and software life cycle course usually introduces students to the waterfall model, iterative models (e.g., the spiral model), the agile model, the Extreme Programming model, and the Rational Unified Process. The waterfall model is well defined, and the differences between it and the other models are clear. However, the differences among the other models are less clear and can be confusing. For example, it is difficult to explain and highlight rigid differences between the spiral and agile models. Both are incremental and iterative. Both prioritize work by risk. The difference may lie in scope: while the spiral model focuses on a big design from the beginning and is recommended for large projects, agile focuses on one increment at a time and may work for small projects. That difference is not sharp enough to require two names for almost the same model. It would be simpler if agile were considered a special case of the spiral model. Moreover, it is not clear what is meant by big and small projects; the distinction is relative. If the course delves into a discussion of the Extreme Programming (XP) model/technique, then more confusion is added, as follows. PC Magazine says of XP that “it is based on a formal set of rules about how one develops functionality such as defining a test before writing the code and never designing more than is needed to support the code that is written” and that “XP is designed to steer the project correctly rather than concentrating on meeting target dates, which are often unrealistic in this business”. But is that not exactly what software developers need: to design only what is needed for coding and to steer the project correctly? If so, then why the need for other models? Furthermore, TechTarget claims that “Kent Beck, author of Extreme Programming Explained: Embrace Change, developed the XP concept. According to Beck, code comes first in XP”.
But this contradicts what we have been teaching students: that software engineering is concerned with careful analysis and design so that the coding phase goes smoothly. Now we teach them that code comes first. Furthermore, according to Don Wells, XP “has already been proven to be very successful at many companies of all different sizes and industries worldwide”. Again, if XP is the perfect model for all sizes and industries, then why try other models? On the other hand, some suggest that XP is waning. While most of the literature suggests that XP is a special case of agile, with Extreme Programming (XP) being the best known of the agile methodologies, others suggest that agile itself is only an implementation of the spiral model. The point here is that there is no consensus on the relationship between the different models, and there is no sharp boundary separating them.
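The test-first rule quoted from PC Magazine can be made concrete with a minimal sketch. The function, its behavior, and the names below are invented purely for illustration: the test is written before the code it exercises, and the implementation is only as large as the test demands.

```python
# Minimal sketch of XP's test-first rule: the test below was written first
# and defines the required behavior; the implementation came second and does
# only what the test demands. All names here are illustrative.

def test_price_with_discount():
    # Written first: this test defines the required behavior.
    assert price_with_discount(200.0, 0.5) == 100.0

def price_with_discount(amount, discount):
    # Written second, deliberately minimal: no validation, no features
    # the test does not need ("never designing more than is needed").
    return amount * (1.0 - discount)

test_price_with_discount()  # passes once the minimal implementation exists
```

The discipline is the point, not the arithmetic: any further design (validation, currency handling) would wait for a further test.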
According to recent research on education, the use of interactive visualizations helps students deeply understand abstract and highly abstract subjects in engineering courses. A computer tutoring framework was developed and implemented in a number of engineering courses at California State University, Long Beach. The proposed tutoring framework incorporated visualization learning objects, including graphics, animation, video, and illustrative images/photos, which were found to be very effective in learning and teaching engineering courses. These learning objects consist of modules that help students achieve deeper understanding (learn), apply learning to unfamiliar problems (practice), and optimize achievement of predefined learning outcomes through a diagnostic feedback loop (assess). Learning objects were designed to address basic, intermediate, and advanced knowledge to provide spiraled learning. The visualizations provide dynamic representations of knowledge and improve the accessibility of instructional materials because the learning objects offer an alternative to text. The interactive approach enables students with different learning styles to comprehend theoretical constructs and apply them in practice.
A number of educational optimization software packages also exist for the MCNFP. Solution algorithms for the MCNFP are sometimes difficult for students to grasp because they need to generate a sequence of rooted trees. The scope of such tools is not the solution of large-scale instances but rather a step-by-step visualization of solution algorithms for the MCNFP, enabling OR instructors to explain each iteration of an algorithm visually and with minimal effort. Vanderbei developed a network simplex pivot tool that can be used for solving the MCNFP. Recently, Baloukas et al. presented an animated demonstration of the classic NPSA for the uncapacitated MCNFP. Andreou et al. also developed visualization software for the NEPSA. These educational optimization software packages, implemented as Java applets, are freely available, highly interactive, and can be accessed through the Web. Moreover, they have a number of helpful features, such as coloring eligible arcs, showing the solution process through textual information, and depicting the relevant steps in pseudocode using multiple views.
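The problem these tools visualize can be stated compactly in code. The sketch below solves a tiny min-cost network flow instance with the successive-shortest-path method, a simpler relative of the network simplex algorithms discussed above; it is not an implementation of any of the cited tools, and the graph, capacities, and costs are invented for illustration.

```python
from collections import defaultdict

def min_cost_flow(n, edges, source, sink, target):
    """Successive shortest paths for min-cost flow on nodes 0..n-1.
    edges: list of (u, v, capacity, cost). Returns (flow_sent, total_cost)."""
    # Residual network: cap[u][v] is remaining capacity, cost[u][v] the arc cost.
    cap = defaultdict(lambda: defaultdict(int))
    cost = defaultdict(lambda: defaultdict(int))
    for u, v, c, w in edges:
        cap[u][v] += c
        cost[u][v] = w
        cost[v][u] = -w  # pushing flow back undoes the cost
    flow, total = 0, 0
    while flow < target:
        # Bellman-Ford: cheapest augmenting path in the residual network
        # (handles the negative-cost residual arcs).
        dist = {v: float("inf") for v in range(n)}
        dist[source] = 0
        parent = {}
        for _ in range(n - 1):
            for u in range(n):
                for v in cap[u]:
                    if cap[u][v] > 0 and dist[u] + cost[u][v] < dist[v]:
                        dist[v] = dist[u] + cost[u][v]
                        parent[v] = u
        if dist[sink] == float("inf"):
            break  # no augmenting path left; target is not fully reachable
        # Bottleneck capacity along the path, then push flow along it.
        push, v = target - flow, sink
        while v != source:
            push = min(push, cap[parent[v]][v])
            v = parent[v]
        v = sink
        while v != source:
            u = parent[v]
            cap[u][v] -= push
            cap[v][u] += push
            total += push * cost[u][v]
            v = u
        flow += push
    return flow, total
```

On a 4-node example, `min_cost_flow(4, [(0, 1, 2, 1), (0, 2, 2, 2), (1, 3, 2, 3), (2, 3, 2, 1), (1, 2, 1, 1)], 0, 3, 3)` sends 3 units at total cost 10. Each iteration corresponds to what the cited applets animate: find an eligible augmenting structure, push flow, update the residual network.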
Software documentation has long been one of the important elements of software development. It may be described as any artifact intended to communicate information about the software system. Documentation is well established on the list of recommended practices to improve development and help maintenance; indeed, it is one of the oldest recommended practices. There have been many problems with documentation, such as documentation that is nonexistent or of poor quality [8, 9], documentation that is overabundant and without a definite objective, and the difficulty of tracking the process or flow between various documents that depend on one another.
This term is used for a new generation of tools that apply engineering principles to the development and analysis of software specifications. Simply put, computers develop software for other computers quickly by using specific tools. When embedded in a concurrent engineering environment, this process takes place while some parts of the software under development are already running. It is a sort of on-line software engineering. There are many problems with these kinds of systems because they are very complex, not easily maintainable, and fragile.
Software engineering is the systematic and scientific approach to developing, operating, maintaining, and retiring a software product. Although software engineering is a very disciplined and systematic approach, it has some limitations and problems. First, it is very difficult to simulate the human mind or behavior with the help of software engineering. Second, computer consciousness is not possible in software engineering. Third, it is not possible to solve NP-complete problems quickly, i.e., in polynomial time. Fourth, most process models in software engineering use a sequential approach with fixed phases, so the software product is not flexible in nature. Furthermore, real-time software is very difficult to engineer with the help of software engineering. Lastly, since software is cheap to build, formal engineering validation methods are not of much use in real-world software development. Software development is still more a craft than an engineering discipline because of the lack of rigor in the critical processes of validating and improving a design.
As this research aimed to show, there are similar aspects between the experiences of UBI and UO, such as the fact that computer-assisted learning using ICT has penetrated both universities equally as a crucial feature of their teaching methodologies. In terms of mathematics at UBI, the students are very keen on using ICT because they belong to the so-called technological generation. However, experience has shown that ICT should be used alongside traditional teaching methodologies. The most important change is to bring students to think about mathematical resolution strategies and to discuss them during classes. ICT should be used after the explanation and understanding of the mathematical content. The students' main goal is to master resolution routines, given that the software generates the same kind of exercise in a random way. In both universities, the scholars stimulate students to use their visual memory and to ask questions. The Civil Engineering Department at UO hosts mostly Romanian students but also a few Erasmus students (especially from Turkey). Stimulating students' auditory and visual memory enables them to acquire more easily the know-how necessary for the graphical representation of structures. At UBI the situation is similar, but there is a wider range of backgrounds and student origins. As a final remark, it can be said from the experience of the Civil Engineering Department at UBI that the strategy includes a student-driven and teacher-facilitated approach to learning (Project-Based Learning), small classes, support material and classes taught in English, visual teaching tools, and ICT use. This results in a friendly learning environment for both host and international students. At UBI, in transportation engineering and GIS subjects, an oriented but autonomous, independent, critical-thinking learning approach is applied as the teaching methodology.
Finally, in terms of language skills, Romanian students generally have a good grasp of English. The situation is similar at UBI, but some students, particularly the Portuguese speakers coming from African countries or Brazil, need to improve their English, not only for particular units but also for better performance throughout their academic experience in general.
The problems with the Waterfall Model created demand for a new method of developing systems that could provide faster results, require less up-front information, and offer greater flexibility. With Iterative Development, the project is divided into small parts. This allows the development team to demonstrate results earlier in the process and obtain valuable feedback from system users. Often, each iteration is actually a mini-Waterfall process, with the feedback from one phase providing vital information for the design of the next. In a variation of this model, the software products produced at the end of each step (or series of steps) can go into production immediately as incremental releases.
Computer organization and architecture is a common course offered at universities throughout the world. Traditionally, teaching such a course to computer engineering and computer science students can be insufficient if the teaching focus is solely on textbook materials [2, 3]. Students often have to rely on their imaginations to understand the underlying hardware-related concepts. In most universities, students learn computer design concepts by implementing individual pieces of a computer in software. This approach has several limitations: while students can simulate their design in software, they do not have the chance to realize or run their design in hardware. Also, it is not feasible to build a laboratory that can provide various computer architectures for teaching computer organization and architecture. Hence, keeping computer education up to date requires keeping in touch with the rapid evolution of computer technology and industry. Searching for an efficient way of teaching computer organization and architecture is an ongoing task. This paper considers an active tool for teaching computer organization and architecture that takes advantage of simulation and Field-Programmable Gate Array (FPGA) technology.
This contribution focuses on open-source computational tools that are particularly interesting for teaching Operations Research applied to the engineering sciences (Tavares et al. 1996), specifically some educational experiences in the areas of forecasting, simulation, graphs and networks, decision theory, and linear programming.
problems. There are at least two issues that motivate the use of toy problems: the resources available for the experiment and the risks associated with its outcome. The former results from the often very limited time subjects can devote to the experiment. The latter relates to the potential harm caused by the outcome of the experiment (e.g., while experimenting with different testing techniques on a real problem, a less effective technique being tested could lead to a lower final product quality being delivered to a customer). The question here is whether the results obtained with a toy problem will scale up to real problems or not. Toy problems are often used in early experiments, as their use is less expensive. If the results of experiments conducted with toy examples are satisfactory, the risk of scaling the problem up to a real one may be mitigated to a certain extent, although it will not be completely eradicated. Experiments can also range from specific to general, in the sense that their results are applicable to a niche or to a wider population. For instance, when experimenting with the maintainability of object-oriented software, one can design experiments that are language-specific, or experiments that yield results applicable to object-oriented software in general.
On the other hand, DSLs have flaws: the restriction of the domain leads to limited applicability. This limited applicability is reflected in the lack of web communities for a DSL, making it harder to find code examples for it. With no DSL standard defined, the problem is even more serious, since more than one DSL may exist on the market for exactly the same domain and application, producing unnecessary costs (several DSLs with the same scope being developed in parallel) and restricting the evolution of these DSLs. The maintenance of a DSL should also be taken into consideration: due to its restricted applicability, it is harder to find users who can maintain the tool, which can therefore lead to higher costs [Mer+05].
Figure 2 shows the model in its basic form [McCarthy, 1982], as seen from the perspective of a business entrepreneur. The REA model is a pattern for an arm's-length collaboration (or an internal transformation) between the entrepreneur and a trading partner, wherein he or she gives up control of some resource of value (the give part of the exchange, above the dotted line) in exchange for another resource of greater perceived value (the take part of the exchange, below the dotted line). The entity types of Figure 2 (the economic Resource, the economic Event, and the economic Agents) are very important, but the structuring effects of the relationships are nearly as important. Stock-flow relationships associate the flows into and out of a resource category, while the duality links keep the economic rationale for the exchange boldly in the forefront.
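The entity types and structuring relationships described above can be sketched as data structures. This is a minimal illustration of the pattern, not McCarthy's formal model: all class names, field names, and the example values are invented for the sketch.

```python
from dataclasses import dataclass

# A minimal sketch of the REA pattern: Resources, Events, and Agents as
# entity types, with stock-flow captured as an event's link to a resource
# and duality as the pairing of a give event with a take event.
# All names and values here are illustrative, not part of the formal model.

@dataclass
class Agent:
    name: str  # e.g., the entrepreneur or a trading partner

@dataclass
class Resource:
    name: str  # something of economic value, e.g., inventory or cash

@dataclass
class EconomicEvent:
    resource: Resource  # stock-flow: which resource flows in or out
    provider: Agent     # agent giving up control of the resource
    receiver: Agent     # agent receiving control
    quantity: float

@dataclass
class Exchange:
    """Duality: the give event is economically justified by the take event."""
    give: EconomicEvent
    take: EconomicEvent

entrepreneur = Agent("entrepreneur")
customer = Agent("customer")
sale = Exchange(
    give=EconomicEvent(Resource("inventory"), entrepreneur, customer, 10),
    take=EconomicEvent(Resource("cash"), customer, entrepreneur, 500.0),
)
```

The `Exchange` pairing makes the duality link explicit in code: neither event stands alone, which mirrors the "economic rationale kept in the forefront" noted above.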
In the present paper we present a distribution problem using the Network Modeling module of the WinQSB software, in which 5 athletes must be assigned to events optimally, as a function of their recorded times, so as to obtain the athletes' maximum output. We also analyze the case where 2 athletes are injured, coupling the remaining 3 athletes with the 5 athletic events to obtain the maximum matching, using the Hungarian algorithm.
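The assignment objective can be illustrated with a small sketch. The time matrix below is invented, and a brute-force search over all permutations stands in for both WinQSB and the Hungarian algorithm (which solves the same problem far more efficiently, in O(n³)); the point is only to make the objective function concrete.

```python
from itertools import permutations

# Illustrative time matrix (invented): times[i][j] is the recorded time, in
# seconds, of athlete i in event j. Lower total time = better overall output.
times = [
    [12.1, 54.0, 11.8, 60.2, 13.0],
    [11.9, 52.3, 12.5, 61.0, 12.7],
    [12.4, 53.1, 12.0, 59.5, 13.3],
    [12.0, 55.2, 11.7, 60.8, 12.9],
    [12.6, 52.8, 12.2, 60.0, 12.5],
]

def best_assignment(times):
    """Brute-force assignment: each permutation p sends athlete i to event p[i].
    A stand-in for the Hungarian algorithm, feasible only for small n."""
    n = len(times)
    return min(permutations(range(n)),
               key=lambda p: sum(times[i][p[i]] for i in range(n)))

assignment = best_assignment(times)
total = sum(times[i][assignment[i]] for i in range(len(times)))
```

For this matrix the optimal total is 148.1 seconds. The Hungarian algorithm reaches the same optimum without enumerating all n! permutations, which is what makes it the practical choice in WinQSB-style tools.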
A high proportion of the study sample had mild sensory symptoms before the operation, and these symptoms decreased after it. Most muscle problems were severe before the operation but decreased afterwards; the vascular symptoms were severe before the operation and decreased significantly afterwards; and the respiratory problems were severe before the operation and decreased afterwards. The researchers recommend establishing a rehabilitation service in the ward to provide a training program that helps patients apply the relevant instructions before discharge, in order to prevent the very serious complication of dislocation and to reduce fear, thereby reducing psychosocial symptoms.
Abstract. Many domain-specific languages that try to provide feasible alternatives to existing solutions while simplifying programming work have emerged in recent years. Although these little languages seem easy to use, it is an open question whether they bring advantages in comparison with application libraries, which are the most commonly used implementation approach. In this work, we present an experiment carried out to compare such a domain-specific language with a comparable application library. The experiment was conducted with 36 programmers, who answered a questionnaire on both implementation approaches. The questionnaire is more than 100 pages long. The same problem domain, the construction of graphical user interfaces, was used for both the domain-specific language and the application library: XAML was used as the domain-specific language and C# Forms as the application library. A cognitive dimensions framework was used to compare XAML and C# Forms.
CONTEXT: Despite a possible lack of validity when compared with other science areas, Simulation-Based Studies (SBS) in Software Engineering (SE) have supported the achievement of some results in the field. However, as with any other sort of experimental study, it is important to identify and deal with threats to validity, aiming to increase their strength and reinforce confidence in the results. OBJECTIVE: To identify potential threats to SBS validity in SE and suggest ways to mitigate them. METHOD: To apply qualitative analysis to a dataset resulting from the aggregation of data from a quasi-systematic literature review combined with ad hoc surveyed information regarding other science areas. RESULTS: The analysis of data extracted from 15 technical papers allowed the identification and classification of 28 different threats to validity concerning SBS in SE according to Cook and Campbell's categories. In addition, 12 verification and validation procedures applicable to SBS were analyzed and organized according to their ability to detect these threats to validity. These results were used to produce an improved set of guidelines for the planning and reporting of SBS in SE. CONCLUSIONS: Simulation-based studies add different threats to validity compared with traditional studies. These threats are not well observed, and it is therefore not easy to identify and mitigate all of them without explicit guidance, such as that presented in this paper.
A Use Case describes the interaction between a system and its environment. A Use Case defines a goal-oriented set of interactions between external actors and the system under consideration. The term actor describes a person or system that has a goal with respect to the system under discussion. A primary actor triggers the system behavior in order to achieve a certain goal, while a secondary actor interacts with the system but does not trigger the Use Case. A Use Case is completed successfully when its goal is satisfied. Use Case descriptions also include possible extensions to this sequence, e.g., alternative sequences that may also satisfy the goal, as well as sequences that may lead to failure to complete the service because of exceptional behavior, error handling, etc. A complete set of Use Cases specifies all the different ways to use the system and therefore defines the whole required behavior of the system. Generally, Use Case steps are written in an easy-to-understand, structured narrative using the vocabulary of the domain. A scenario is an instance of a Use Case and represents a single path through it. Thus, there is a scenario for the main flow through the Use Case, and as many other scenarios as there are possible variations of flow through it. Scenarios may also be depicted graphically using UML Sequence Diagrams.
Class cohesion, the degree to which the members of a class are related, is considered one of the crucial quality criteria. A class with high cohesion improves understandability, maintainability, and reusability. Class cohesion metrics can be measured quantitatively and can therefore be used as a basis for assessing the quality of a design. The main objective of this paper is to identify important research directions in the area of class cohesion metrics that require further attention in order to develop more effective and efficient class cohesion metrics for software engineering. In this paper, we discuss the class cohesion metrics (thirty-two metrics) that have received the most attention in the research community and compare them from different aspects. We also present desirable properties of cohesion metrics for validating class cohesion metrics.
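To make the idea of measuring cohesion quantitatively concrete, here is a sketch of one classic metric, LCOM1 (Chidamber and Kemerer's Lack of Cohesion in Methods): count the method pairs that share no instance attribute, subtract the pairs that share at least one, and floor at zero. The method-to-attribute maps below are invented examples, and this is only one of the thirty-two metrics surveyed.

```python
from itertools import combinations

def lcom1(uses):
    """LCOM1 (Chidamber & Kemerer). uses maps each method name to the set of
    instance attributes it touches. Higher values suggest lower cohesion."""
    p = q = 0
    for m1, m2 in combinations(uses, 2):
        if uses[m1] & uses[m2]:
            q += 1  # the pair shares at least one attribute
        else:
            p += 1  # the pair is disjoint
    return max(p - q, 0)  # defined as 0 when q >= p

# A cohesive class: every method touches the shared attribute 'balance'.
cohesive = {"deposit": {"balance"}, "withdraw": {"balance"},
            "report": {"balance", "owner"}}
# A less cohesive class: two unrelated clusters of attributes.
scattered = {"a": {"x"}, "b": {"x"}, "c": {"y"}, "d": {"y"}}
```

Here `lcom1(cohesive)` is 0 while `lcom1(scattered)` is 2, illustrating how such a metric turns a design judgment ("these methods belong in one class") into a number that can be compared and validated, which is precisely what the desirable-properties discussion addresses.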