Over time, due to increased product functionality, software projects have become more and more complex; along with increasing work-completion pressures, software projects are required to be accomplished in less time and with fewer people. Designing and building high-quality industrial software is a difficult task. Indeed, it has been verified that industrial development projects are amongst the most complex construction tasks undertaken by humans. To handle this level of criticality, a wide range of software engineering paradigms have been devised (e.g., procedural programming, structured programming, declarative programming, object-oriented programming, design patterns, application frameworks and componentware). Each successive development either claims to make the software development process easier or to extend the complexity of applications that can feasibly be built. But recently, with the rapid increase in the complexity of software engineering projects, agent concepts have been considered as a new paradigm for handling complex systems. Agile methods are built on the assumption that the world is unpredictable and therefore aim at being adaptive, flexible and responsive.
Computing is becoming more diffuse and distributed. Decentralization and cooperation between software modules are needed to improve the quality of the services a system provides. In addition, with the growing size and complexity of new applications, the centralized vision seems to have reached its limits. We are thus naturally led to seek a new way to give more autonomy and initiative to the various software entities. The concept of multi-agent systems (MAS) offers a response to these challenges. Currently, multi-agent systems are being applied in several areas, notably the field of education. They can contribute greatly to the improvement of the learning process.
The work by Bidarra et al. describes a course on game projects, taught in the second year at Delft University of Technology, with a focus on game development in large teams of students from different disciplines. The proposed course's learning outcomes include demonstrating proficiency in applying media and programming techniques within the context of computer games, striving for a balance between the effectiveness of a programming technique and the desired quality of a game effect; describing the main modules of a game engine and purposefully using their functionality; deepening object-oriented programming skills while building a complex and large software system; and developing and contrasting teamwork skills within the context of a realistic interdisciplinary team. The survey results reported in the paper indicate that the students were highly motivated upon completing the course and were largely happy with the projects.
This chapter starts with a brief clarification of concepts recurrently used throughout the document in Section 2.1. Being the main focus of this thesis, complex systems are reviewed in Section 2.2. Throughout this document, Cyber-Physical Systems are used as an example of a complex system; as such, a proper overview of them is given in Section 2.3. Moving to strategies by which this thesis's hypothesis may be approached, the usefulness of multi-agent-based computing, both in the task of engineering the software tool and in modeling the entities in the simulation environment, is discussed in Section 2.4. Going deeper into the simulation environment, a review of Behavioral Cloning in Section 2.5 provides inspiration on how to tackle the problem. Moving forward, Section 2.6 introduces the Big Data era and how the latest developments boost CPS potential. This sets the tone for a careful analysis of real-time data mining, as well as some key algorithms, in Section 2.7. The chapter ends with some considerations on maintenance and usability in Section 2.8 that guided the development of this project.
During the last decade, researchers have produced a considerable number of interesting works in which Emotion-based concepts were applied in Agent Architectures. We will review some of these works in Chapter 3. Still, it is worth mentioning some authors whose work has greatly influenced our ideas, either by their functional and practical perspective or, on the other hand, by their high-level and integrating approach. For example, Velásquez [Vel98] has applied Emotion-based mechanisms to control behavior selection of a physical robot named Yuppi. Cañamero [Cañ97][Cañ00] has also developed some very interesting work in modeling and applying Emotion-based concepts to action selection in software Agents. Gadanho & Hallam [Gad98] and Gadanho [Gad02] built a simulation where they applied Emotional Mechanisms to perform reinforcement learning. From a more theoretical point of view, Aaron Sloman and his students in the Cognition and Affect Group, University of Birmingham [Wri97][Bea94][Slo00], followed a design-based approach to develop Agent Architectures. These Architectures were based on the study of the requirements needed for building systems with Human-like intelligence. In such Architectures, Emotion-like manifestations would arise in the system as a natural result of interactions established between architectural components.
Abstract. The demand for large-scale systems running in complex and even chaotic environments requires the consideration of new paradigms and technologies that provide flexibility, robustness, agility and responsiveness. Multi-agent systems are pointed out as a suitable approach to address this challenge by offering an alternative way to design control systems, based on the decentralization of control functions over distributed, autonomous and cooperative entities. However, in spite of their enormous potential, they usually lack some aspects related to interoperability, optimization in decentralized structures and true self-adaptation. This paper discusses a new perspective to engineer adaptive complex systems considering a 3-layer framework integrating several complementary paradigms and technologies. In a first step, it suggests the integration of multi-agent systems with service-oriented architectures to overcome the limitations of interoperability and smooth migration, followed by the use of technology enablers, such as cloud computing and wireless sensor networks, to provide a ubiquitous and reconfigurable environment. Finally, the resulting service-oriented multi-agent system should be enhanced with biologically inspired techniques, namely self-organization, to reach a truly robust, agile and adaptive system.
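As a rough illustration of what a service-oriented multi-agent system might look like at the code level, the following minimal Python sketch shows autonomous agents publishing their local control functions as discoverable services. All names and the registry mechanism are hypothetical, not taken from the paper.

```python
# Minimal sketch of a service-oriented multi-agent layer (hypothetical names).
# Each agent wraps a local control function and publishes it as a service,
# so control is decentralized yet interoperable through a shared registry.

from typing import Callable, Dict

class ServiceRegistry:
    """Shared directory mapping service names to provider callables."""
    def __init__(self) -> None:
        self._services: Dict[str, Callable[[float], float]] = {}

    def publish(self, name: str, handler: Callable[[float], float]) -> None:
        self._services[name] = handler

    def invoke(self, name: str, payload: float) -> float:
        return self._services[name](payload)

class ControlAgent:
    """Autonomous entity owning one control function, exposed as a service."""
    def __init__(self, name: str, gain: float, registry: ServiceRegistry) -> None:
        self.name, self.gain = name, gain
        registry.publish(f"{name}/control", self.control)

    def control(self, error: float) -> float:
        # Local decision logic; real agents would also self-organize and adapt.
        return self.gain * error

registry = ServiceRegistry()
ControlAgent("cell-A", gain=0.8, registry=registry)
ControlAgent("cell-B", gain=1.2, registry=registry)
print(registry.invoke("cell-A/control", 2.0))  # -> 1.6
```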
Fig. 1 shows our vision for the problem. The sensors installed on the campus provide data to a central cloud server, where information is manipulated towards the desirable goal. A service-based approach is used to provide flexibility and allow the reuse of algorithms for knowledge extraction. The proposed architecture is composed of four layers: 1) Data layer, which comprises data collection from installed sensors; 2) Information layer, where data is manipulated towards achieving desirable information, based on data mining algorithms (out of the scope of this paper); 3) Knowledge layer, where this information can be used for campus management, and specific functional roles act on infrastructure and systems to optimise operating conditions; and 4) Services layer, which feeds the main applications in a service-based approach, where information can be incorporated in the related service. For example, information about the number of empty spaces at the parking facilities can be used to increase the number of people using them.
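To make the layering concrete, here is a minimal Python sketch of how data might flow through the four layers. The functions, threshold and parking values are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch of the four-layer flow described above:
# raw sensor readings -> information -> knowledge/actions -> services.

from statistics import mean

def data_layer() -> list[float]:
    """Data layer: collect raw readings from installed sensors (stubbed here)."""
    return [3.0, 5.0, 4.0]  # e.g., free parking spaces seen by three sensors

def information_layer(readings: list[float]) -> float:
    """Information layer: derive useful information (here, a simple mean)."""
    return mean(readings)

def knowledge_layer(free_spaces: float) -> str:
    """Knowledge layer: decide a management action for the campus."""
    return "open overflow lot" if free_spaces < 5 else "normal operation"

def services_layer(free_spaces: float) -> dict:
    """Services layer: expose the information to end-user applications."""
    return {"service": "parking-info", "free_spaces": free_spaces}

readings = data_layer()
info = information_layer(readings)
print(knowledge_layer(info), services_layer(info))
```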
With the intention of implementing the task of information security risk management, URMIS needs to collect data about the status of information assets, recognize kinds of risk, and perform the risk management task based on a well-defined risk management process. This means that the working environment of URMIS consists of knowledge, data, processes and strategies. However, knowledge, data, processes and strategies are resources with different formalizations, and designing an interface for each resource is complex work. This work is based on the multi-agent systems approach because of its benefits: it encompasses cooperation, resolution of complex problems, modularity, efficiency, reliability and reusability. All these advantages provided by MAS fit these needs.
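As a hedged sketch of how MAS modularity could serve this need, the following Python fragment (hypothetical class and resource names, not URMIS code) hides heterogeneous resources behind a single agent interface so the risk management process can query them uniformly:

```python
# Hypothetical sketch: one agent type per resource kind, behind a common
# interface, so a risk management process can query all resources uniformly.

from abc import ABC, abstractmethod

class ResourceAgent(ABC):
    """Common interface shared by agents wrapping heterogeneous resources."""
    @abstractmethod
    def report(self) -> dict: ...

class AssetDataAgent(ResourceAgent):
    def report(self) -> dict:
        return {"asset": "db-server", "status": "patched"}  # invented data

class RiskKnowledgeAgent(ResourceAgent):
    def report(self) -> dict:
        return {"risk": "unauthorized access", "likelihood": "medium"}

def risk_management_step(agents: list[ResourceAgent]) -> list[dict]:
    # Cooperation: aggregate reports from all resource agents.
    return [agent.report() for agent in agents]

print(risk_management_step([AssetDataAgent(), RiskKnowledgeAgent()]))
```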
Agent technology is one of the most rapidly growing fields of information technology and possesses huge scope for research both at industry and at academic level. Software agents can be simply defined as an abstraction to describe computer programs that act on behalf of another program or user, either directly or indirectly. A software agent is endowed with intelligence in such a way that it adapts and learns in order to solve complex problems and achieve its goals. Software agents are widely employed for the realization of various complex application systems such as electronic commerce, information retrieval and virtual corporations. For example, in an online shopping system the software agent helps internet users to find services related to the one they just used. Though agent-oriented systems have grown progressively, their uptake lags because there is no proper mechanism for testing an agent-based system.
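A minimal sketch of the online-shopping example, assuming a precomputed table of related services; all names and data are invented for illustration:

```python
# Toy sketch: an agent acting on behalf of a user, recommending services
# related to the last one used (similarity table assumed precomputed).

from typing import Optional

RELATED = {  # hypothetical co-usage-based similarity
    "book-store": ["e-reader-shop", "stationery-store"],
    "flight-booking": ["hotel-booking", "car-rental"],
}

class ShoppingAgent:
    """Observes the user's last used service and suggests related ones."""
    def __init__(self) -> None:
        self.last_service: Optional[str] = None

    def observe(self, service: str) -> None:
        self.last_service = service

    def recommend(self) -> list[str]:
        return RELATED.get(self.last_service, [])

agent = ShoppingAgent()
agent.observe("flight-booking")
print(agent.recommend())  # -> ['hotel-booking', 'car-rental']
```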
The general purpose of conventional generating unit maintenance scheduling is to determine the time and sequence of generating unit outages that represent the best assignment of this equipment to maintenance periods in a given time horizon, in such a way that the reliability of the system is maximized. Regardless of the scenario, deregulated or not, the main objective of generating unit maintenance scheduling remains the same, and perhaps this problem has been one of the most complex in power systems due to the astronomical number of possible schedules, as well as to the complexity of the constraints and objectives involved. During the last years, different approaches have been used to solve this combinatorial problem. They have shown that, in general, the most attractive and promising optimization approach applied to this problem was integer programming, even though heuristic procedures could achieve good results within an acceptable CPU time. One major line of research has been proposed in order to ensure reliability. The ISO could consider several heuristic procedures, as well as some reliability indices such as LOLP (Loss of Load Probability) or EENS (Expected Energy Not Supplied), to assess each schedule. This work also sought to answer the question of how a certain chronological sequence of schedules could impact system reliability, considering economic implications. In this sense, it is interesting to search for the best assignment of units. This requires visiting a large number of unit combinations and assessing the reliability impact on the system for each of them. The great contribution of this work was to build a suitable composite reliability tool considering several heuristic procedures in order to maximize reliability and revenue, covering the lack of research in this area.
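As a toy illustration of why the schedule space explodes and how a reliability index can rank schedules, the following Python sketch exhaustively scores invented schedules with an EENS-like proxy. The capacities, loads and the proxy itself are assumptions, not the cited composite reliability tool.

```python
# Toy sketch: exhaustively score maintenance schedules with a crude
# EENS-like reliability proxy. All capacities and loads are invented.

from itertools import product

CAPACITY = {"U1": 100.0, "U2": 80.0, "U3": 60.0}   # MW, hypothetical units
LOAD = [180.0, 150.0, 170.0, 160.0]                # MW demand per period

def eens_like(schedule: dict[str, int]) -> float:
    """EENS-like proxy: total unserved load over the horizon."""
    unserved = 0.0
    for period, load in enumerate(LOAD):
        available = sum(cap for unit, cap in CAPACITY.items()
                        if schedule[unit] != period)  # unit offline in its period
        unserved += max(0.0, load - available)
    return unserved

# Enumerate one maintenance period per unit. Feasible only at toy scale;
# this combinatorial explosion is exactly why integer programming is used.
schedules = ({u: p for u, p in zip(CAPACITY, ps)}
             for ps in product(range(len(LOAD)), repeat=len(CAPACITY)))
best = min(schedules, key=eens_like)
print(best, eens_like(best))
```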
In this paper, the PPHPC model is completely specified, and an exhaustive analysis of the respective simulation outputs is performed. Regarding the latter, after determining the mean and variance of the several FMs, we opted to study their distributional properties instead of proceeding with the classical analysis suggested by simulation output analysis literature (i.e., the establishment of CIs). This approach has a number of practical uses. For example, if we were to estimate CIs for FMs drawn from the steady-state mean, we could use t-distribution CIs with some confidence, as these FMs display an approximately normal distribution. If we did the same for FMs drawn from the steady-state sample standard deviation, the Willink (2005) CI would be preferable, as it accounts for the skewness displayed by these FMs. Estimating CIs without a good understanding of the underlying distribution can be misleading, especially if the distribution is multimodal. The approach taken here is also useful for comparing different PPHPC implementations. If we were to compare max- or min-based FMs, which seem to follow approximately normal distributions, parametric tests such as the t-test would most likely produce valid conclusions. On the other hand, if we compare arg max- or arg min-based FMs, non-parametric tests, such as the Mann-Whitney U test (Gibbons & Chakraborti, 2011), would be more adequate, as these FMs do not usually follow a normal distribution.
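The two analysis paths can be illustrated with standard statistical library calls. The sketch below uses synthetic FM samples (not PPHPC output) to compute a t-distribution CI for an approximately normal FM and a Mann-Whitney U test comparing two hypothetical implementations:

```python
# Illustrative sketch of the two analysis paths, on synthetic focal-measure
# (FM) samples: a t-distribution CI for approximately normal FMs, and a
# Mann-Whitney U test for FMs that do not follow a normal distribution.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
fm_impl_a = rng.normal(loc=100.0, scale=5.0, size=30)  # e.g., max-based FM, impl. A
fm_impl_b = rng.normal(loc=101.0, scale=5.0, size=30)  # same FM, implementation B

# t-distribution CI for the mean of an approximately normal FM.
m, se = fm_impl_a.mean(), stats.sem(fm_impl_a)
lo, hi = stats.t.interval(0.95, df=len(fm_impl_a) - 1, loc=m, scale=se)
print(f"95% t CI for the mean: ({lo:.2f}, {hi:.2f})")

# Non-parametric comparison for non-normal FMs (e.g., arg max-based).
u_stat, p_value = stats.mannwhitneyu(fm_impl_a, fm_impl_b, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```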
As indicated by Saraiva Nogueira and Armando Valladares (2012), "... our knowledge about the seas and oceans is rather limited. Future challenges, like climate change, cannot be assessed, nor can reasonable mitigation measures be taken without oceanographic data. Ocean services are expected to deliver on the promise of continuous forecasts of the state of the ocean environment. The knowledge needed for such forecasting capability cannot be obtained from research activities which are restricted in duration and project oriented. Scientific research has to be based on, and supplemented by, operational long-term ocean observations to provide analyses, predictions and other information products". From this perspective, the RADMED monitoring programme is a key tool for confronting these challenges. Right now, it is a programme not as exhaustive as the CalCOFI programme, but it provides regular and structured information that needs to be complemented by other programmes running in the Mediterranean Sea (including, but not only, MOOSE, Med-SVP, and ARGO floats). The purpose of the RADMED team is to coordinate with those programmes and make use of the new monitoring technologies to implement a monitoring service that is able to meet the MSFD requirements.
With all the details presented in the map, it is possible to evaluate the categories and attributions made by the community of Wikipedia as a connected whole. This allows many different entry points into the main theme and builds a new layer of appreciation. For example, by coloring the different philosophical traditions, one can discuss the exchanges of influence between them, and even elaborate on the adequacy of such philosophical divisions. On the other hand, it is also possible to reassess the links of influence in Wikipedia themselves, which were built mostly from a top-down perspective, that is, based on critical, disciplinary appreciation, and can now be evaluated from a different point of view. For example, Adam Hogan (2015), in his blog Design and Analytics (2012), brings up an interesting discussion about this network: judging by the centrality and the size of the node of Hegel, he would probably be the most influential philosopher in history, which seems curious, considering the fundamental place ancient Greek philosophers like Plato have in Western philosophy. So, is he really king? Moreover, did the editors of Wikipedia themselves have any previous consideration of what this aggregated result would look like when they went, point by point, defining the influences of each philosopher? Probably not. It was only visible… well, by visualization, by assembling a visual context for these scattered bits of information.
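A small sketch of how such centrality judgments can be computed and cross-checked; the edge list below is a toy stand-in for the actual Wikipedia influence data:

```python
# Toy influence network ("X influenced Y" edges, invented for illustration).
# Node size in maps like this typically encodes degree centrality; other
# measures (e.g., PageRank) can rank the same philosophers differently.

import networkx as nx

edges = [("Plato", "Aristotle"), ("Aristotle", "Hegel"), ("Kant", "Hegel"),
         ("Hegel", "Marx"), ("Hegel", "Sartre"), ("Plato", "Kant")]
g = nx.DiGraph(edges)

dc = nx.degree_centrality(g)  # what node size usually reflects
pr = nx.pagerank(g)           # an alternative notion of importance

for name in sorted(g, key=dc.get, reverse=True):
    print(f"{name}: degree={dc[name]:.2f}, pagerank={pr[name]:.2f}")
```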
The global software development industry has now become more mature and complex. The industry is making use of newer tools and approaches to software development. The challenge then lies in accurately modeling and predicting the software development effort, and then creating the project development schedule. This work employs a neural network (NN) approach and a multiple regression modeling approach to model and predict the software development effort based on an available real-life dataset prepared by Lopez-Martin et al. [1, 2]. A comparison between the results obtained by the two approaches is presented. It is concluded that the NN is able to successfully model the complex, non-linear relationship between a large number of effort drivers and the software maintenance effort, with results closely matching the effort estimated by experts.
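A hedged sketch of this kind of comparison on synthetic data (the actual study uses the Lopez-Martin et al. dataset, which is not reproduced here):

```python
# Sketch: neural network vs. multiple linear regression for effort
# prediction, on synthetic data with a deliberately non-linear target.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(200, 3))                      # hypothetical effort drivers
y = 2 * X[:, 0] + X[:, 1] ** 1.5 + rng.normal(0, 1, 200)   # non-linear effort target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                  random_state=0).fit(X_tr, y_tr)

print("regression MAE:", mean_absolute_error(y_te, lin.predict(X_te)))
print("neural net MAE:", mean_absolute_error(y_te, nn.predict(X_te)))
```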
Functionalized tri-calcium phosphate (TCP) is a smart calcium phosphate system that controls the delivery of calcium and phosphate ions to the teeth and works synergistically with fluoride to improve performance. Since the structure of TCP is similar to that of HA, once the functionalized calcium ions are released, they readily interact with the tooth surface and subsurface. While other calcium phosphate additives may require an acidic pH, TCP can offer optimal benefits when delivered in a neutral pH environment. This TCP ingredient can enhance mineralization and help build a high-quality, acid-resistant mineral without the need for high levels of calcium.
A core algorithm was created for people's actions, accepting multiple models for each life component. For now, the fertility part is not based on the decisions of a person but is more of a random effect with some control. It has three different models to choose from, ranging from a random model to a fully controlled model. The mortality part is also not modeled as a personal decision, but neither should it fully be, as the time of death of a person is partly decision-based (how healthily a person lives, for example) but is also random (a healthy person may have a car accident and die, for example). This component also has three different models to choose from, ranging from a random model to a fully controlled model. The migration model, however, is based on a person's decision capacity. There are many approaches to modeling the decision of a person; here, a gain maximization algorithm was used. This algorithm uses the person's preferences (for health, safety, ...), their life history (employment status and remuneration) and exogenous factors (age, economic scenario, ...).
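A minimal sketch of such a gain-maximization decision, with invented regions, weights and adjustment rules standing in for the model's actual parameters:

```python
# Sketch of a gain-maximization migration decision: score each candidate
# region by preference-weighted attributes plus personal and exogenous
# adjustments, then pick the argmax. All numbers are invented.

REGIONS = {
    "north": {"health": 0.8, "safety": 0.6, "wage": 0.5},
    "south": {"health": 0.5, "safety": 0.7, "wage": 0.9},
}

def expected_gain(region: dict[str, float], prefs: dict[str, float],
                  employed: bool, age: int) -> float:
    gain = sum(prefs[k] * region[k] for k in prefs)    # preference-weighted score
    gain += 0.0 if employed else 0.3 * region["wage"]  # unemployed weigh wages more
    gain -= 0.01 * max(0, age - 40)                    # older people move less
    return gain

def choose_region(prefs: dict[str, float], employed: bool, age: int) -> str:
    return max(REGIONS, key=lambda r: expected_gain(REGIONS[r], prefs, employed, age))

print(choose_region({"health": 0.5, "safety": 0.3, "wage": 0.2},
                    employed=False, age=30))  # -> 'south'
```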
The building consists of 5 floors, and 3 a-Si (triple-junction amorphous PV) modules are installed on each array on each floor. When the energy rating (kWh/kWp) was measured, an annual difference of 16% and monthly differences of 10%–24% were revealed. The annual energy rating of the first array is 1072 kWh/kWp and that of the second array is 885 kWh/kWp. Minimum electricity output was measured in November, when radiation is at its lowest level of the whole year. If the declination angle decreases, the energy rating reduces. Shading has as significant an effect as ambient temperature, the direction of the building and the PV tilt angle. Analyses of an open-channel BIPV system were also carried out, with system specifications of 1.5 m in height, 0.7 m in depth and 0.1 m in length. 3D calculation and natural convection analysis were provided between two walls for 3 different configurations (uniform, staggered, non-uniform), given in Fig. 7. The results show that an alternative inlet should be opened between the hot and cold zones; in this way, convective heat transfer increases, as does the chimney effect through a higher mass flow rate.
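For clarity, the energy-rating comparison can be reproduced in a few lines. The array yields come from the text, while the installed peak power is a placeholder; note that with these figures the relative difference computes to roughly 17%, close to the reported annual 16%, whose exact basis is not stated.

```python
# Worked sketch of the energy-rating comparison. The energy rating
# normalizes annual output by installed peak power (kWh/kWp).

def energy_rating(annual_output_kwh: float, peak_power_kwp: float) -> float:
    """Energy rating = annual energy yield per installed kWp."""
    return annual_output_kwh / peak_power_kwp

peak_kwp = 1.0  # placeholder installed capacity per array
rating_1 = energy_rating(1072 * peak_kwp, peak_kwp)  # first array: 1072 kWh/kWp
rating_2 = energy_rating(885 * peak_kwp, peak_kwp)   # second array: 885 kWh/kWp

# Relative annual difference between the two arrays.
diff = (rating_1 - rating_2) / rating_1
print(f"annual difference: {diff:.1%}")  # ~17% with these figures
```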
In the current organisational environments, where downsizing, reengineering, restructuring and high rates of organisational turnover are common, enterprises are beginning to find that it is easy to lose a vital element of their intellectual property: shared corporate knowledge. Knowledge gained during the normal execution of daily tasks is often lost in the dynamic business environment. Those who stay in the company are often unaware of key resources that are ‘hidden’ in the heterogeneous knowledge repositories [Dzbor et al. 2000], but that depend on a particular expertise to be found and used; technical knowledge is often an example of how dependent information systems are on the expertise of the people using them.
Ramchurn et al. introduce the smart grid's key components: demand-side management, electric vehicles, virtual power plants, the emergence of prosumers, and self-healing networks. Demand-side management is directly related to this dissertation's research; in Section 5 we describe a mechanism to manage power consumption on the demand side. In the paper Putting the 'smarts' into the smart grid: a grand challenge for artificial intelligence, the authors argue that the smart grid poses new challenges for artificial intelligence, and that smart grid technologies require algorithms and mechanisms to solve problems involving a large number of heterogeneous actors. There is a tendency in the developed world to decrease the use of fossil fuels and move to a low-carbon economy to guarantee energy security and mitigate the impact of energy use on the environment. This transition requires a fundamental re-thinking and re-engineering of the smart grid, which must be able to make efficient use of renewable energy sources and support the additional electricity required by new actors like electric vehicles. Many of the issues within the smart grid can be found in other domains such as water distribution, transportation, and telecommunication networks. So, there is potential to transfer technologies across these domains and also address smart grid issues that affect the sustainability of such systems.
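As a toy illustration of demand-side management (not the mechanism described in Section 5 of the dissertation), the following sketch shifts a deferrable load to the cheapest hours of an assumed price profile:

```python
# Toy demand-side management: schedule a deferrable load (e.g., EV charging)
# into the cheapest hours of a hypothetical price profile.

PRICES = [0.30, 0.28, 0.12, 0.10, 0.15, 0.32]  # assumed price per hour (EUR/kWh)

def cheapest_hours(hours_needed: int) -> list[int]:
    """Pick the cheapest hours for a deferrable load."""
    by_price = sorted(range(len(PRICES)), key=PRICES.__getitem__)
    return sorted(by_price[:hours_needed])

hours = cheapest_hours(3)
energy_per_hour = 2.0  # kWh drawn in each scheduled hour (assumed)
cost = sum(PRICES[h] * energy_per_hour for h in hours)
print(hours, f"{cost:.2f} EUR")  # -> [2, 3, 4], the cheapest slots
```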
Abstract - This work addresses an architecture that not only allows mining data sources of multiple formats, but also allows extensibility for future formats. It also presents a methodology for the unification of association rules prior to their incorporation in a BKB. It will use an existing agent development tool to establish a multi-agent-based framework and define the communications between those agents. The framework will be designed to accept a request from PESKI for one of three possible data mining operations. These operations are discussed in more detail in Chapter 4. Once the system accepts the request, it will determine which data sources can fulfill the request and task the agents responsible for those sources to begin data mining. Once results have been obtained, they will be unified to eliminate redundant or conflicting results and returned to PESKI. Two new data source formats will be introduced into the PESKI schema; they will be mined for association rules, the results unified into a unique list of results, and then passed back to PESKI for incorporation into the BKB. The process will be automated and will use existing message passing formats to communicate with PESKI.
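A rough sketch of what the unification step might look like; the rule representation and the keep-the-higher-confidence conflict policy are assumptions, not PESKI's actual scheme:

```python
# Sketch of rule unification: merge association rules from multiple sources,
# dropping duplicates and resolving conflicting consequents by confidence.

from typing import NamedTuple

class Rule(NamedTuple):
    antecedent: frozenset  # items on the "if" side
    consequent: str        # predicted item
    confidence: float

def unify(rules: list[Rule]) -> list[Rule]:
    best: dict[frozenset, Rule] = {}
    for r in rules:
        prev = best.get(r.antecedent)
        # Duplicate or conflicting antecedent: keep the higher-confidence rule.
        if prev is None or r.confidence > prev.confidence:
            best[r.antecedent] = r
    return list(best.values())

mined = [
    Rule(frozenset({"fever", "cough"}), "flu", 0.90),   # source 1
    Rule(frozenset({"fever", "cough"}), "flu", 0.90),   # source 2, duplicate
    Rule(frozenset({"fever", "cough"}), "cold", 0.60),  # conflicting consequent
]
print(unify(mined))  # one rule survives: {fever, cough} -> flu (0.90)
```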