The most widely used biological processes can be divided into aerobic and anaerobic. The most common anaerobic system is the upflow anaerobic sludge blanket (UASB). Among aerobic processes, the most used are activated sludge, aerated lagoons, sequencing batch reactors and rotating biological contactors. According to Tchobanoglous et al. (1993), high COD concentrations favor anaerobic treatment, because aerobic treatment becomes expensive, whereas high sulfate concentrations may limit it, owing to the production of odors from the reduction of sulfate to sulfide. Great advantages of anaerobic over aerobic treatment are the energy surplus associated with methane production, the absence of aeration equipment and the limited sludge production. Advantages of aerobic biological systems over anaerobic systems are low construction cost, flexibility in use, the ability to adapt rapidly to varying components within the leachate, quick start-up times, low maintenance and ease of automation (Mehmood et al., 2009).
Part of this process is largely facilitated by the program STRUCTURE HARVESTER (Earl & vonHoldt, 2012), which automates the parsing of STRUCTURE runs and uses that information to perform an "Evanno test", a heuristic for predicting which value of 'K' makes the most biological sense for the analysed data. Although this is a very convenient automation, it still relies on manual user intervention to input the data from STRUCTURE, provides no assistance with the plotting of the "meanQ" values, and only works for the software STRUCTURE. Other programs, such as FASTSTRUCTURE, include the necessary software to perform these tests, and even to plot the "meanQ" values, but still require manual intervention between these steps. MavericK goes further and presents the full posterior distribution for 'K' using the "Thermodynamic Integration" test as an automatic last step of the analysis, and even recommends some scripts for drawing "meanQ" plots, but this last step also requires human intervention.
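The ΔK statistic at the heart of the Evanno test is simple enough to sketch: for each K it divides the absolute second-order difference of the mean log-likelihood by the standard deviation across replicate runs. A minimal Python illustration follows, assuming the per-replicate lnP(D) values have already been parsed from the STRUCTURE output (the step STRUCTURE HARVESTER automates); the function name and input values are invented for illustration.

```python
from statistics import mean, stdev

def evanno_delta_k(loglik):
    """Evanno et al. (2005) delta-K from replicate log-likelihoods,
    given as {K: [lnP(D) for each replicate run]}."""
    ks = sorted(loglik)
    mean_l = {k: mean(loglik[k]) for k in ks}
    delta = {}
    for k in ks[1:-1]:  # delta-K is undefined at the smallest and largest K
        # |L''(K)|: absolute second-order difference of the mean log-likelihood,
        # divided by the standard deviation across replicates at K
        second_diff = abs(mean_l[k + 1] - 2 * mean_l[k] + mean_l[k - 1])
        delta[k] = second_diff / stdev(loglik[k])
    return delta

# Hypothetical replicate likelihoods for K = 1..4; the sharp jump between
# K=1 and K=2 makes K=2 the value the heuristic picks.
runs = {1: [-5200.0, -5210.0], 2: [-4300.0, -4310.0],
        3: [-4250.0, -4260.0], 4: [-4240.0, -4230.0]}
dk = evanno_delta_k(runs)
best_k = max(dk, key=dk.get)  # -> 2
```

Note that ΔK only ranks the candidate values of K; it does not give a posterior distribution over K, which is what MavericK's thermodynamic integration adds.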
Multi-valued network models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and, in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage λ.
The evolution of manufacturing systems and the emergence of decentralised control require flexibility at various levels of their lifecycle. New emerging methods, such as multi-agent and service-oriented systems, are major research topics aimed at revitalizing traditional production procedures. This paper presents an overview of the service-oriented approach in terms of platforms and engineering tools, from the perspective of automation and production systems. From the basic foundations to the more complex interactions, service-oriented architectures and their implementation in the form of web services provide diverse, quality-proven features that are welcome at different stages of a production system's life-cycle. Key elements are the concepts of modelling and collaboration, which enhance the automatic binding and synchronisation of individual low-value services into more complex and meaningful structures. Such interactions can be specified by Petri nets, a mathematically well-founded tool with features suited to the modelling of systems. The right combination of these different methodologies should motivate the development of service-oriented manufacturing systems that embrace the vision of collaborative automation.
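The Petri net view of service binding can be made concrete with a token-game sketch. The minimal Python fragment below is illustrative only (the service names are hypothetical, and the net is far simpler than those used in practice); it shows how firing a transition models the composition of two low-value services into a more complex structure.

```python
def enabled(marking, t):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in t["in"].items())

def fire(marking, t):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    assert enabled(marking, t), "transition not enabled"
    m = dict(marking)
    for p, n in t["in"].items():
        m[p] -= n
    for p, n in t["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical net: a single 'bind' transition composes two atomic
# services into a composite, mirroring the automatic binding of
# low-value services into a more meaningful structure.
bind = {"in": {"serviceA_ready": 1, "serviceB_ready": 1},
        "out": {"composite_ready": 1}}
m0 = {"serviceA_ready": 1, "serviceB_ready": 1}
m1 = fire(m0, bind)  # consumes both 'ready' tokens, produces 'composite_ready'
```

Because markings and transitions are explicit data, properties such as reachability or deadlock can then be checked mechanically, which is what motivates the use of a mathematically well-founded formalism here.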
Lipolytic activity was tested using the culture medium proposed by Kouker and Jaeger (1987), with modifications (0.5% peptone, 0.1% yeast extract, 0.4% NaCl, 1.5% agar and 0.1% rhodamine B solution). After sterilization, 2.5% sterile olive oil was added as a carbon source. All isolates were inoculated by the spot method, and the plates were incubated at 25 °C and 30 °C for up to 72 h. The growth of the isolates was checked daily, and lipid hydrolysis was confirmed by the orange fluorescence of the colonies when exposed to UV light at 350 nm. Isolates with faster growth were reinoculated in the culture medium deprived of yeast extract, to induce olive oil uptake and thus select the best enzyme-producing strains. Incubation took place under the same conditions as described before. The same assay was performed with canola oil and with grape seed oil as the carbon source.
In recent years, requirements and overall complexity in the areas of utilization of CPS have increased dramatically. The latter is correlated with the pursuit of flexibility, customization, interaction and the provision of new functionalities in industrial settings. Currently, there is a technology push into complexity, with everything getting smart, e.g., phones, houses, cars, aircraft, factories, cities, etc. As an example, the functionalities, and consequently the complexity, associated with a system can be seen by comparing, e.g., the early-past-century plane controls in Charles Lindbergh's "Spirit of St. Louis" with a modern Airbus A380 aircraft. Although both have the common goal of flying, in the "Spirit of St. Louis" this could be realized by monitoring a couple of sensors, which nowadays translates to thousands of sensors in the A380, impossible for a human to assess. However, with the automation and creation of high-level key performance indicators from the sensor data, this can still stay manageable at a high level, although not all interworkings are directly seen or understood by the operators. Although complexity may have its advantages, hiding it from the end-users and managing it pose grand challenges. As an example, in our cars the complexity is hidden from the driver, who just needs to handle a limited number of controls to operate the system, without being exposed to its complex networks of sensors and actuators distributed throughout the mechanical infrastructure.
Membrane computing provides a hierarchical structure for molecular computation in which membranes play an essential role, allowing objects to pass in a regulated fashion within and across them. Since its inception in 1998, much research on theoretical aspects has been done to establish the computational power of membrane computing (Paun, 1998; 2000). More recently, research interest has concentrated on applying the membrane computing formalism to solve real-world problems. One of these attempts uses membrane computing capabilities in modeling biological systems. Studies of biological systems such as cells in silico have greatly reduced the need for expensive and prolonged lab experiments. Construction of a biological system as a membrane computing model provides a better understanding of the dynamics and functionality of the system. The biological description of the membrane computing formalism has been utilized to characterize and preserve the elements in biological systems. The research in this line shows that biological systems can be modeled better using
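The core mechanism of the formalism can be illustrated with a much-simplified sketch: the objects inside a membrane form a multiset, evolution rules rewrite it, and products only become visible in the next step. The Python fragment below is a sketch under stated assumptions (the rules and objects are invented, and it omits membrane nesting, communication across membranes, and the full maximally parallel semantics of P systems).

```python
from collections import Counter

def step(contents, rules):
    """One simplified evolution step of a single membrane: each rule
    (lhs -> rhs) fires as long as its left-hand side is still available,
    and products only join the membrane in the next step, echoing
    P-system semantics. Nesting and rule priorities are omitted."""
    current = Counter(contents)
    produced = Counter()
    for lhs, rhs in rules:
        need = Counter(lhs)
        while all(current[o] >= n for o, n in need.items()):
            current -= need
            produced += Counter(rhs)
    return current + produced

# Invented rules for illustration: 'a' rewrites to 'b'; two 'b' fuse into 'c'.
rules = [("a", "b"), ("bb", "c")]
m = step(Counter("aab"), rules)   # the new 'b' objects cannot react yet
m2 = step(m, rules)               # in the next step, two 'b' fuse into 'c'
```

Even this toy version shows why the formalism suits cell modelling: state, locality and rule application are all explicit, so the dynamics of the system can be inspected step by step.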
drum sequencers, cam timers, and closed-loop controllers. The process of updating such facilities for the yearly model change-over was very expensive and time-consuming, as electricians had to individually rewire each and every relay. Digital computers, being general-purpose programmable devices, were then applied to the control of industrial processes. Early computers required specialist programmers and strict operating environmental control of temperature, cleanliness, and power quality, so a general-purpose computer used for process control had to be protected from plant-floor conditions. An industrial control computer would possess several attributes: it would tolerate the shop-floor environment, it would not require years of training to use, it would permit its operation to be monitored, and it would support discrete (bit-form) input and output in an easily extensible manner. The response time of any computer system must be fast enough to be useful for control, the required speed varying according to the nature of the process.
In the case of DPWS, besides the typical request and response mechanism, the specification also covers eventing, namely through WS-Eventing. This can be used by the supervisory system to make subscriptions to different services in the system and to receive events, e.g., about data values whenever these are produced. Filtering of events is also possible, which suits the Petri net formalism applied in this work, since it associates events with the enabling-firing conditions of transitions. Moreover, dynamic discovery is used whenever devices are connected to the service bus: they announce themselves with "Hello" events, and with "Bye" events when they are removed. However, infrastructure support can also be present for dynamic discovery and message exchange.
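The eventing pattern described above, subscription plus filtered notification, can be sketched independently of the actual WS-Eventing/DPWS stack. The following minimal Python broker is illustrative only (the class, topic and threshold are invented, and this is not a DPWS API); it shows how a supervisory system would receive only the events that satisfy a filter tied to a transition's enabling condition.

```python
class EventBus:
    """Minimal publish-subscribe sketch of the eventing pattern described
    above; illustrative only, not the actual WS-Eventing/DPWS interface."""

    def __init__(self):
        self.subscriptions = []

    def subscribe(self, topic, callback, event_filter=None):
        """A subscriber (e.g. the supervisory system) registers interest
        in a topic, optionally with a filter predicate."""
        self.subscriptions.append((topic, callback, event_filter))

    def publish(self, topic, event):
        """Deliver the event to every matching subscription whose filter
        passes, analogous to filtered event notifications."""
        for t, callback, event_filter in self.subscriptions:
            if t == topic and (event_filter is None or event_filter(event)):
                callback(event)

received = []
bus = EventBus()
# Hypothetical filter: only temperatures above the threshold that enables
# a transition in the Petri net model are of interest.
bus.subscribe("temperature", received.append, event_filter=lambda v: v > 80)
bus.publish("temperature", 75)   # filtered out
bus.publish("temperature", 95)   # delivered
```

Dynamic discovery fits the same shape: "Hello" and "Bye" announcements are simply events on a well-known topic to which the supervisory system subscribes.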
culture was able to achieve a high phosphorus removal efficiency (>99%), storing poly-P while consuming amino acids anaerobically. MAR-FISH confirmed that Tetrasphaera were responsible for amino acid consumption, while Ca. Accumulibacter likely survived on fermentation products. Tetrasphaera performed the majority of the P removal in this culture, and batch tests showed that the metabolism of some carbon sources could actually lead to anaerobic P uptake through energy generated by fermentation of glucose and amino acids. This anaerobic P uptake may lead to lower net P release to C uptake ratios and reduce the P that needs to be removed aerobically in WWTPs. Intracellular metabolites such as amino acids, sugars, VFAs and small amines were observed as storage products, which may serve as energy sources in the aerobic phase. The culture showed a preference for the uptake of certain amino acids, and the intracellular amino acids accumulated during the anaerobic phase accounted for 20% of the total amino acids consumed. Evidence of the urea cycle was found, which could be involved in reducing the intracellular nitrogen content. This study improves our understanding of how phosphorus is removed in EBPR systems and can enable novel process optimisation strategies.
Chitooligosaccharides (COS) are derivatives of chitosan, typically differing from native chitosan by having a molecular weight of 10 kDa or less. They have recently attracted attention as potential therapeutic agents, owing to a variety of reported positive biological activities (Eaton, Fernandes, Pereira, Pintado, & Malcata, 2008; Kim & Rajapakse, 2005), and have shown potential as scavenging agents, due to their ability to abstract hydrogen atoms from free radicals (Huang, Mendis, & Kim, 2005; Vårum, Ottøy, & Smidsrød, 1994). This ability has been reported as directly correlated with their structural properties, namely that the amino and hydroxyl groups can react with unstable free radicals to form stable macromolecule radicals (Je, Park, & Kim, 2004; Kim & Rajapakse, 2005). Furthermore, their ready uptake by cells and the intestine, together with their claimed low toxicity, makes chitooligosaccharides very promising compounds for use as natural antioxidants (Chae, Jang, & Nah, 2005; Fernandes et al., 2008). Although several studies on chitosans and COS have already reported antioxidant activity, so far most reports are based on methods which measure the capacity of a molecule to reduce a stable artificial free radical: scavenging of DPPH or ABTS radicals, or of carbon-centred radicals.
STUDYING THE INTERACTIONS BETWEEN NANOPARTICLES AND BIOLOGICAL SYSTEMS. Although in recent years there has been an increasing amount of literature on nanotechnology and its clinical applications, a deep understanding of the interactions between nanoparticles and cells at the molecular level is still scarce. Studies demonstrating the underlying mechanisms of nanoparticle endocytosis, intracellular trafficking, and cellular processing are imperative to better understand how cells interact with those materials and their possible undesired effects, e.g., nanotoxicity. The rising awareness concerning nanoparticle applications and their interactions with the cellular environment is part of the new research field called Nanotoxicology. The cumulative knowledge in nanotoxicology will allow us to foresee toxic effects and to establish regulations and limits for nanoparticle applications. In this work, we discuss the theoretical concepts involved in studying the endocytosis and intracellular trafficking of nanoparticles. Nanoparticle-cell interaction is a multi-step process, which can be divided into the internalization, intracellular processing and triggering effects of nanomaterials on eukaryotic cells. Finally, we discuss the main techniques used to study this process: flow cytometry, the use of endocytosis inhibitors and confocal microscopy.
The research done by the IDF estimated the prevalence of diabetes mellitus for each country for the years 2007 and 2025, using data provided by 215 countries and territories, gathered on different geographical bases across the seven IDF regions: Africa (AFR), Eastern Mediterranean and Middle East (EMME), Europe (EUR), North America (NA), South and Central America (SACA), South-East Asia (SEA), and the Western Pacific (WP). Figure 1.2 shows one of the results of the survey: the WP Region will have the highest number of people with diabetes, approximately 100 million, representing an increase of approximately 50% when compared with 2007. This demonstrates that diabetes is not a problem just for developing countries, but a health problem for all humanity. On the other hand, AFR has the smallest number of people with diabetes, but the expected growth is slightly more than 70%. Figure 1.3 shows a best estimate for each country in the world in 2025 (IDF, 2009).
biological properties; properties such as total organic carbon, microbial carbon, and microbial quotient may gradually increase, which is important for a sustainable system, while the metabolic quotient may decrease with pasture age (Muniz et al., 2011). In reforested areas, biological properties have proven to be more sensitive than chemical and physical properties in detecting the impact of different forest plantations (Silva et al., 2009). Increased C and organic matter contents are beneficial and a consequence of plant residue accumulation on the soil surface, and this may contribute to soil aggregation (Salton et al., 2005). These iCLFs are designed to have a microbial structure different from that of degraded pastures, owing to the greater diversity of their microbial community. However, the structure of this community is rarely investigated in integrated systems such as the iCLF, especially in the tropics (Lisboa et al., 2014). Thus, little information is available on the effects of increased species diversity on soil biological quality.
Basal respiration is a widely used parameter for quantifying microbial activity and, indirectly, the quality of the soil. It represents the oxidation of organic matter into CO2 (ANDERSON; DOMSCH, 1978), and has a close relationship with the abiotic conditions of the soil, including humidity, temperature and aeration, which are influenced by the rate of decomposition of the organic matter (SEVERINO et al., 2004). Organic matter, besides helping to reduce the negative impacts that may arise from the intensive and successive management of cultivated areas (CARDOSO et al., 2009; CUNHA et al., 2011), also increases the biodiversity of the soil (FREITAS et al., 2011).
After the application source code is completed, an application's iteration has been concluded; what is left is a potentially deployable product, ready and awaiting testing. The Quality Assurance (QA) Department of Primavera's Software Factory has the objective of ensuring that all Primavera products comply with the standards of Primavera's quality patterns and that they meet the customers' expectations. To ensure the functional quality of the software, two different but complementary types of testing are conducted: firstly, manual testing, which accounts for the larger part of the time and effort invested in testing; secondly, a more limited set of automated tests, amounting to about 1,000,000 (one million) test executions each month.
ABSTRACT - The aim of this work was to evaluate biological indicators of soils used under systems of organic farming, agroforestry and pasture in the south-western part of the Amazon region of Brazil. The experiment was carried out at the Seridó Ecological Site, located in Rio Branco, in the state of Acre, Brazil. The experimental design was completely randomised, with five treatments (land-use systems) and six replications, with each replication consisting of four single samples. The systems of land use evaluated were: 1) native forest (control); 2) agroforestry (AFS); 3) pasture; 4) intercropped passion fruit, maize, cassava, pineapple and forage peanut; and 5) intercropped passion fruit, maize, cassava, pineapple and tropical kudzu. It was found that organic farming systems intercropped with kudzu resulted in smaller losses of C-CO2 through edaphic respiration, and a greater accumulation of microbial biomass carbon. The intercropped organic farming system which included the forage peanut resulted in a greater loss than retention of carbon in the soil at a depth of 5-10 cm. Soil under the agroforestry system was equivalent to the soil of the control (native forest) in relation to the release and retention of carbon through biological activity. At a depth of 5-10 cm, soils under pasture presented similar microbial biomass to those under organic cultivation intercropped with tropical kudzu. However, at that depth, soils under pasture presented greater microbial biomass than those under natural forest, agroforestry or organic cultivation intercropped with forage peanut. Key words: Organic production. Carbon dynamics. Ground cover.
This dissertation concludes a "work in progress" that has lasted for the past three years, my master's degree; finally! I began my studies in the field of Computer Science in 2000, and I was fortunate enough to get a very nice job that introduced me to, and allowed me to start studying, the field of home automation. Home automation is a relatively new field of study and, as such, it has several approaches to address the same problems. Two of the major problems found in the home automation world are the diversity of protocols (it is difficult to develop software applications that support multiple protocols) and the lack of user-friendly control points. In this research work, we address these problems by abstracting the home automation network into concepts which allow the scalability of the system and are well known to the final user, but we went a little bit further. The models defined in this research work were designed to be used as a framework, allowing others to use it as a basis for developing new concepts and applications. The concept of context-awareness is introduced to give a little bit more intelligence to the system. Basically, we present the kernel of a future framework that, hopefully, will change the way people develop home automation software and, consequently, the way people interact with their own houses and the way houses interact with human beings.
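The abstraction idea can be sketched briefly: applications are written against protocol-independent device concepts that the end user understands, and each home automation protocol contributes an adapter behind that interface. The minimal Python illustration below is a sketch under stated assumptions (the class names are hypothetical and no real protocol frames are sent), not the dissertation's actual framework.

```python
from abc import ABC, abstractmethod

class Light(ABC):
    """Protocol-independent concept the end user understands, regardless
    of the underlying home automation network."""

    @abstractmethod
    def turn_on(self): ...

    @abstractmethod
    def turn_off(self): ...

class X10Light(Light):
    """Hypothetical adapter; real code would emit X10 frames on the bus."""

    def __init__(self):
        self.is_on = False

    def turn_on(self):
        self.is_on = True    # placeholder for sending the protocol command

    def turn_off(self):
        self.is_on = False

def goodnight_scene(lights):
    """Applications are written against the abstraction, so supporting a
    new protocol only requires adding another adapter class."""
    for light in lights:
        light.turn_off()

lamp = X10Light()
lamp.turn_on()
goodnight_scene([lamp])   # the scene works for any adapter implementing Light
```

Context-awareness then slots in above this layer: a context rule decides when to invoke a scene, while the adapters keep the decision logic independent of any particular protocol.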
The Robots Revolution is on the rise. After the revolution that Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP) created, a new term is set to revolutionize the workplace: Robotic Process Automation (RPA) (Anagnoste, 2017). This type of automation aims to automate business processes with the goal of improving efficiency while cutting costs (Cewe, Koch, & Mertens, 2017), by reducing the time humans spend dealing with Information Systems (IS) on repetitive tasks such as typing, extracting, copying and moving huge amounts of data from one system to another; these structured, manual tasks can be done by a robot, so that the workers can dedicate their time and effort to tasks that add more value (Aguirre & Rodriguez, 2017). Robots execute repetitive tasks by using Graphical User Interface (GUI) automation adaptors instead of the Application Programming Interfaces (APIs) used in traditional automation (Cewe et al., 2017), without changing the Information Technology (IT) infrastructure (Mindfields, 2015). This means that the robot performs repetitive tasks that used to be done by humans, faster and more cost-efficiently.
Electromechanical relays were designed to use the mechanical forces produced by the electromagnetic interaction between currents and fluxes, as in a motor. These early relays were, until recently, the primary means of protection; however, owing to their replacement cost and complexity, in new installations or at major upgrades they have been replaced by solid-state relays. The development of solid-state relays, with a higher level of performance and more sophisticated characteristics, became possible only with the introduction of semiconductors. These relays are more power-efficient but have a lower tolerance to adverse conditions than electromechanical relays. Their characteristics can be easily tuned, and their settings are more repeatable and precise than those of earlier relay types. Computer relays, based on rugged high-performance microprocessors, have started to replace existing electromechanical and solid-state relays. These relays, generally referred to as intelligent electronic devices, present great advantages over previous designs, since they are provided with communication capabilities and have the ability both to self-diagnose and to self-adapt in real time to variable system conditions.