Multiple theoretical approaches, such as perceptual control theory, the theory of event coding, and the free energy principle, have emphasized and discussed the role of control in explaining perception, cognition, and action. Perceptual control theory argues that the notion of control is central to explaining how we interact with our environment [15,17]. It suggests that the aspect of an action controlled by the agent is its perceived outcome rather than the action itself. This control is achieved by matching the actual perceptual outcome with the expected perceptual consequence, which can be evaluated at different levels. The theory of event coding argues that control is achieved in two stages: first, at the action selection stage and, second, when the action outcome is evaluated. Both the theory of event coding and perceptual control theory suggest that exhibited control influences the nature of perceptual input and action planning for any particular event, and that the interplay between perceptual events and action events is mediated by the exhibited control. The match between the sensory prediction and the actual outcome also forms the basis of a family of models explaining the mechanism behind the sense of agency. Such comparator models differ in terms of the type of comparisons made (e.g., the two-step comparator model) or in terms of the nature of the control mechanism (feedback versus feed-forward). Several studies have indicated that manipulating control influences participants' subjective experience and executive processes [15,24–26]. Moore et al. manipulated the statistical contingency between a key press and the occurrence of a subsequent tone and found a subjective expansion of the perceived elapsed time between the key press and the tone as control increased (higher contingency). Studies on the perception of time between an action and its consequent effect indicate a close link between experienced control and the SoA.
Disrupting the activity in the pre-motor cortex using TMS decreased the experience of control and resulted in a depletion of SoA. Desantis et al. have
No matter what method is used, research in this area has paid more attention to the relationship between the sense of agency and the sense of ownership than to how the sense of agency and the sense of ownership can affect our higher cognition, such as emotional experiences (Guterstam, Abdulkarim, & Ehrsson, 2015; Christensen, Yoshie, Di, & Haggard, 2016). There have, however, already been some studies using the virtual hand illusion paradigm to investigate the relationship between the sense of ownership and affective resonance when facing different kinds of emotional events. Yuan and Steed designed an experiment to measure skin conductance responses (SCR) to what they considered threats to a virtual hand and found elevations similar to those observed with rubber hands. Participants were asked to play games in a virtual environment by operating the hand of an avatar. During the game, a virtual lamp would fall at some point on the virtual hand operated by the participants, which induced a reliable increase in SCR. When the hand was replaced with an arrow as the control condition, the increase in SCR was significantly smaller. Taken together, they suggested that people emotionally "care" about what they perceive as being a part of their body but not, or not so much, about what they perceive as belonging to the body of someone else (Yuan & Steed, 2010). However, Ma and Hommel thought that two aspects of Yuan and Steed's study might help explain this seeming discrepancy. For one, they did not use the standard synchronization technique to induce different degrees of body ownership. For another, the threatening event merely consisted of a virtual lamp falling on the virtual hand. Even though the contact between the lamp and the hand was clearly visible to the participant, it is difficult to judge from the visual display alone how much pain such an event would cause. Ma and Hommel adopted the standard synchronization technique to induce the illusion of ownership and replaced the falling virtual lamp with a knife.
Their findings suggested that ownership was stronger if the virtual hand moved synchronously with the participant's own hand, but this effect was independent of whether the hand was impacted or threatened. In other words, in the face of threats, affective resonance was independent of synchronicity (Ma & Hommel, 2013).
However, recent studies have demonstrated cases in which consistency between internal motor signals and actual sensory feedback is unnecessary for the development of a sense of agency. For example, Wegner and colleagues developed an interesting paradigm that induced a false sense of agency. In their experiment, the participants watched themselves in a mirror without moving, while a paired participant, the "helper," stood immediately behind the participant, with his or her arms extended outward, and performed a series of movements. In the condition in which participants heard instructions previewing each movement, they reported a (false) sense of control over the helper's hands. This phenomenon of vicarious agency emphasized the role of external cues in the sense of agency. To account for external factors in the sense of agency, Synofzik and colleagues proposed a two-step model of the sense of agency, which included a perceptual level, involving the feeling of agency, and an explicit conceptual level, involving the judgment of agency. Synofzik et al. suggested that the sense of agency is a combination of these two types of process. In ambiguous situations in particular, external cues could play a more important role in the judgment of agency. In a recent electrophysiology study, sensory attenuation of an early potential (N1) was observed for a learned action-feedback association, whereas attenuation of a later potential (P3a) was observed for agency judgment. The authors suggested that the detection of unpredicted information was reflected in early sensory attenuation processes, but the judgment of agency was drawn from more cognitive mechanisms. A similar study found that associations between different components of event-related potentials and participants' agency judgments differed according to the reliability of the association between action and feedback.
These findings are also consistent with the cue integration theory of agency, which suggests that the reliability of both internal and external cues determines the extent to which they contribute to the sense of agency. In the present study, we investigated the dominance of different cues involved in the judgment of agency in conditions that differed with respect to the reliability of the action-feedback association. We hypothesized that if the congruence between predicted and actual sensory information was less reliable, the experience of a sense of agency would be greatly influenced by task performance, which is an external cue.
This article examines the effect of prolonged holding time at a temperature of 620 °C on the processes of secondary phase precipitation and on the mechanical properties of low-alloy cast steel with an addition of vanadium, subjected to two variants of heat treatment, i.e. U: 1150 °C + H: 950 °C + O: 620 °C and H: 950 °C + O: 620 °C. To determine the impact of the applied heat treatment operations, mechanical property tests and microstructural examinations of the cast steel with 0.21 and 0.27% C were carried out.
This paper discusses the case in which an organizational field is challenged by the institutionalization of practices by illegal competitors such as pirates. It aims to assess the effectiveness of public policies meant to counter these unfair practices, based on an enhanced institutional model. The model is conceived in two parts, both meant to counter the pirate challenge: one is directed at deinstitutionalizing the pirates' operational practices, while the other targets the support for their legitimization. We conducted a case study with interviews with executives of different governmental agencies, who described their perception of the situation and the actions undertaken in response. The analysis was performed with a model prepared to respond to exogenous challenges, dropping the isomorphic-passive institutional perspective. This analysis shows that the activities have repeated much of the usual procedures the state had already undertaken previously, such as repression and judicial suits, which by themselves are incapable of effacing or dismantling piracy. This in a way confirms organizations' tendency to repeat established practices in most situations. The paper also shows that society is prepared to absorb paradoxical practices coexisting side by side.
According to Cai and Sun (2004), proportional reasoning is an important route for the development of algebraic reasoning and for students' understanding of the meaning of functions. However, proportional reasoning cannot be taken as a synonym for proportionality; rather, it is a necessary condition for comprehending contexts and mathematical applications that involve proportion/proportionality (LESH; POST; BEHR, 1988). The mobilization of proportional reasoning may happen during the development of freer strategies and procedures, without being attached to formulas or algebraic resources such as the rule of three: when identifying multiplicative relations among the numerical quantities in a problem, and when selecting, organizing, and making explicit the quantities that covary with one another as well as those that remain constant, among others. Work with proportionality should not be restricted to the mechanical use of algebraic resources that depends on the memorization of rules and formulas.
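The contrast between the mechanical rule of three and the multiplicative relations described above can be illustrated with a small numeric sketch (the notebook prices and quantities are invented for illustration):

```python
import math

# Problem: 3 identical notebooks cost 12; what do 7 cost?

# Scalar (within-quantity) relation: 7 notebooks are 7/3 times as many,
# so the cost scales by the same factor.
cost_scalar = 12 * (7 / 3)

# Functional (between-quantity) relation: the unit price is the constant
# linking the two covarying quantities.
unit_price = 12 / 3              # the invariant quantity: 4 per notebook
cost_functional = unit_price * 7

assert math.isclose(cost_scalar, 28.0) and math.isclose(cost_functional, 28.0)
```

Both paths make the multiplicative relation explicit, whereas the rule of three ("12 × 7 ÷ 3") can be executed without ever noticing which quantity stays constant.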
This work addresses human communicative agency. The competence to instantiate a set of communicative procedures is taken as a component of human rationality that plays a key role in regulating our cognitive environment (a set of mental states, centrally assumptions and emotions), in order to maximize practical goals and sociability. The linguistic-inferential approach offered here for this scope of rationality covers two levels, cognitive and practical, treated hierarchically according to the assumed regulations. We consider that the cognitive apparatus (the inferential, representational, and metarepresentational basis), along with the linguistic apparatus (computation plus interpretable expressions), allows us to operate from the most basic levels of linguistic processing to higher levels (where agents consider assumptions about other minds). In the practical domain, we consider that linguistic and communicative behavior is used by agents to affect mental states and others' courses of action, thus lying at the basis of our social cognition. In this scenario, we not only interact with agents, but we also create a social agency via language. We therefore consider a communicative agency framework in which acts are performed within a dialogical structure. The general thesis is that communication requires the use of skills that incorporate parameters of practical rationality. This regulation would be dependent on a cognitive and practical structure of agency in which human cognition represents three types of agents: individuals, group members, and groups (collectives or representatives). Each of these levels presents characteristic features of communicative agency. In all of them, however, there is the possibility of disagreement among agents, cognitive or practical, in dialogue situations. We illustrate this aspect with a scenario of conflict between agents that are supposed to reach a peace agreement.
The illustrative analysis focuses on real negotiation dialogues between group members and representatives of the State of Israel and of Palestine. We observe how the practical goals of agents of these types regulate their cognitive and dialogical goals. As a result, we present an alternative proposal to the standard scenario of negotiation, or conflict mediation. As a theoretical benefit, ad hoc pragmatic issues (relevance to the individual qua agent, conflicts between agents) are given prominence and effective treatment. As a practical benefit, the model can be applied to the area of conflict mediation, given the downsizing of a biosocial disposition: our cognitive states are particularly affected by stimuli from a class of agents (artists), with potential effect on individual and collective agencies.
The fact that the 1969 elections took place, once more, under conditions that fell far short of the promises made by the leaders of the regime, and were followed by a crackdown directed against opposition figures who had been vocal in them, reinforced the idea that the extinction of the Estado Novo could only occur through violent means. The idea was not alien to the political culture of the Portuguese opposition. Since its Sixth Congress in 1965 the PCP had forged a strategy ('Road to Victory') for the overthrow of the dictatorship around the concept of 'popular national uprising', which involved a combination of legal and illegal methods to 'radicalise the masses', to cause the breakdown of the regime's security apparatus and make room for a general uprising, based on a coalition of workers, peasants, intellectuals and the more 'enlightened' sectors of the petty bourgeoisie and middle classes. In many ways, this approach did not differ from previous party strategies (with the exception of the period known in its official history as the 'rightist deviation', situated between the Congress in 1957 and the recapture of the leadership by Cunhal in 1960-61). The new factor was that the PCP now faced a potentially embarrassing challenge from sectors to its left, which emerged in the context of the difficulties experienced by the regime in 1961-62 and of various international developments (revolution in Cuba, Sino-Soviet split, anti-imperialist liberation struggles) that had made a strong impression on the younger fringes of the opposition. One of the characteristic
Considering the hypothesis of cardiac tamponade, an echocardiogram was requested and revealed pericardial effusion with signs of tamponade. Cardiac puncture under echocardiographic visualization extracted 25 mL of yellowish fluid, similar to parenteral solution. Immediate clinical improvement was noted. The tip of the catheter was repositioned in the superior vena cava and confirmed by radiography. The catheter was maintained in use for another 8 days.
The work was aimed at determining the influence of aluminium, in amounts from about 0.6% to about 2.8%, on the structure of cast iron treated with cerium mischmetal and subjected to graphitizing modification with 75% ferrosilicon. Four experimental melts were performed during the investigation. The charge was composed of specially prepared grey iron, containing the basic elements within the presumed limits. While determining the desirable quantity of carbon in the charge cast iron, two contradicting conditions were taken into account: that the purpose is to achieve nodular cast iron (which means that a relatively large carbon amount would be demanded) and that introducing aluminium into the melt results in decreased solubility of carbon in the cast iron. Taking this into account, it was decided that the quantity of carbon in the charge cast iron should be maintained within the range of 3.2÷3.4%. It was assumed that the silicon content in the charge material should fall within 0.7÷1.0%, as in the former investigations. Manganese content was restricted to 0.1% at most in order to achieve the desired structure with a ferrite fraction as high as possible. It was also assumed that the content of both sulphur and phosphorus should be kept at the lowest possible level.
between the development of increasingly better materials and their clinical employment has greatly increased. Currently, it is no longer enough to demonstrate proof of efficacy and safety, or consequently to obtain the respected seals of approval, such as the CE Mark, to allow access to these new materials.
Secretariat: Afghanistan, South Africa, Germany, Saudi Arabia, Algeria, Antigua and Barbuda, Australia, Austria, Azerbaijan, Bahamas, Bahrain, Belgium, Belize, Benin, Bhutan, Bosnia and Herzegovina, Brazil, Burkina Faso, Burundi, Cameroon, Canada, Chile, China, Colombia, Congo, Cook Islands, Costa Rica, Ivory Coast, Croatia, Cyprus, Denmark, Djibouti, Dominica, Ecuador, Egypt, El Salvador, United Arab Emirates, Slovenia, Spain, Estonia, Philippines, Finland, France, Gabon, Gambia, Georgia, Ghana, Greece, Grenada, Guatemala, Guinea, Guyana, Honduras, Hungary, Iceland, India, Iran, Iraq, Ireland, Italy, Jamaica, Japan, Jordan, Kuwait, Latvia, Lebanon, Libya, Lithuania, Luxembourg, Macedonia, Madagascar, Malaysia, Maldives, Mali, Malta, Mauritania, Mauritius, Mexico, Micronesia, Moldova, Montenegro, Myanmar, New Zealand, Nicaragua, Niger, Nigeria, Norway, Oman, Netherlands, Pakistan, Palau, Panama, Papua New Guinea, Paraguay, Poland, Portugal, Kenya, Kiribati, Kyrgyzstan, United Kingdom of Great Britain and Northern Ireland, Republic of Korea, Czech Republic, Democratic Republic of Congo, Russia, St. Lucia, Samoa, San Marino, Senegal, Serbia, Seychelles, Sierra Leone, Singapore, Sri Lanka, Saint Christopher and Nevis, Syria, Suriname, Swaziland, Sweden, Tanzania, Thailand, Togo, Tonga, Trinidad & Tobago, Tunisia, Turkey, Turkmenistan, Uganda, Ukraine, European Union, Vanuatu, Vietnam, Yemen and Zimbabwe. b Includes the 23 Member States in the Americas that sent reports on the progress of
practice of the home visit; however, it has been shown that nurses carry out care activities in greater proportion within the Basic Health Units. In this way, they have abandoned the home-visit setting as a means of building bonds with families, leaving them without quality care. In this setting, however, the nurse practitioner works with the objective of attending to all the health-related difficulties that may arise for his or her patients, taking into account home infrastructure, family relationships, problems that may affect the community, and the diagnosis of pathologies, among other issues, always with a focus on the care of his or her clients.
In this paper we introduce a new approach to approximately solve nonlinear non-smooth programming problems that impose no limitation on the convexity or smoothness of the nonlinear functions. In this approach, any given nonlinear function is approximated by a piecewise linear function with controlled error. In this manner, the difference between the global solution of the approximated problem and that of the original problem is less than or equal to a desired upper bound ε > 0. We also present an efficient algorithm to find the global solution of the approximated problem. One of the main advantages of our approach is that it can be extended to problems with non-smooth functions by introducing a novel definition of Global Weak Differentiation in the sense of the L¹-norm. The
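The idea of replacing a nonlinear function by a piecewise-linear surrogate with a controlled error bound can be sketched as follows. This is only an illustrative construction (adaptive interval bisection with a grid-based error check; the function, tolerance, and probe density are arbitrary choices), not the specific algorithm proposed in the paper:

```python
import bisect
import math

def piecewise_linear_approx(f, a, b, eps, probe=101):
    """Adaptively bisect [a, b] until the linear interpolant of f on each
    segment deviates from f by at most eps (checked on a probe grid).
    Returns the sorted list of breakpoints.  Illustrative sketch only."""
    def max_dev(lo, hi):
        flo, fhi = f(lo), f(hi)
        worst = 0.0
        for i in range(probe):
            t = i / (probe - 1)
            x = lo + t * (hi - lo)
            worst = max(worst, abs(f(x) - (flo + t * (fhi - flo))))
        return worst

    pts = [a, b]
    stack = [(a, b)]
    while stack:
        lo, hi = stack.pop()
        if max_dev(lo, hi) > eps:
            mid = 0.5 * (lo + hi)
            bisect.insort(pts, mid)
            stack.append((lo, mid))
            stack.append((mid, hi))
    return pts

def eval_pl(pts, f, x):
    """Evaluate the piecewise-linear interpolant at x in [pts[0], pts[-1]]."""
    i = max(1, bisect.bisect_left(pts, x))
    lo, hi = pts[i - 1], pts[i]
    t = (x - lo) / (hi - lo)
    return f(lo) + t * (f(hi) - f(lo))
```

Because the error is only checked on a finite probe grid, the guarantee here is approximate; the paper's point is precisely to make the bound ε rigorous for the global solution of the surrogate problem.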
This paper proposes a discretization procedure, based on an extension of the Taylor series expansion of an arbitrary degree ℓ, which converts a continuous-time LPV model with piecewise constant parameters and a time-varying network-induced delay into an equivalent discrete-time LPV system. The accuracy of the discrete-time representation is strongly related to the increase of the degree ℓ. An event-based sampling of the output associated with the changes of the time-varying parameters is assumed. Thus, by considering the hypothesis of a time-varying sampling interval that depends on the system parameter measurements, it is possible to treat a broad class of problems, such as engines, manufacturing systems and telerobotic systems. For instance, one can cite an internal combustion engine whose sampling interval is variable and depends on the engine speed. Differently from the discretization procedure previously proposed for uncertain time-invariant systems, the new method presented in this paper considers that the network-induced delay in the continuous-time LPV system can be time-varying. The obtained discretized model, with bounds on the rate of variation of the parameters, is described by homogeneous polynomial matrices of degree ℓ on the time-varying parameters, which belong to the Cartesian product of simplexes (called a multi-simplex), plus a norm-bounded term related to the approximation error. The norm-bounded term depends on the degree of the Taylor series expansion, the sampling time, the network-induced delay, and the original continuous-time uncertainty domain. Estimates for the bounds of the discretization residual error terms are computed through a grid in the uncertainty domain. To establish a valid discrete-time LPV representation, the time-varying parameters considered in the continuous-time model are supposed piecewise constant and, therefore, the parameters do not change between two consecutive samples.
Considering that the parameters are continuously monitored and have known bounds on their rate of variation, a new transmission is triggered to sample the output and the scheduled parameters whenever a significant change occurs. Otherwise, a new sample is acquired when a prescribed upper bound on the transmission interval is reached. In this scheme, the assumption that, during the sampling interval, the parameter variations are insignificant and can be neglected is valid, as considered in
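The degree-ℓ Taylor truncation underlying such a discretization can be sketched, for parameter values frozen over one sampling interval, as follows. This is a generic illustration of the truncated series A_d ≈ Σ_{k=0}^{ℓ} (A·T)^k / k! and the matching input map, ignoring the network-induced delay and the residual-error bounding that the paper actually handles:

```python
def mat_mul(X, Y):
    """Plain-list matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def mat_add(X, Y):
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_scale(X, c):
    return [[c * a for a in row] for row in X]

def taylor_discretize(A, B, T, ell):
    """Degree-ell Taylor truncation of the exact zero-order-hold maps:
        Ad ~ sum_{k=0}^{ell} (A*T)^k / k!
        Bd ~ (sum_{k=1}^{ell} A^(k-1) * T^k / k!) @ B
    Generic sketch: frozen parameters over one interval, no delay, and
    no bound on the truncation residual."""
    n = len(A)
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    Ad = [row[:] for row in I]
    int_term = mat_scale(I, T)          # A^(k-1) T^k / k!, starting at k = 1
    S = [row[:] for row in int_term]    # running integral sum for Bd
    for k in range(1, ell + 1):
        Ad = mat_add(Ad, mat_mul(int_term, A))   # adds A^k T^k / k!
        if k < ell:
            int_term = mat_scale(mat_mul(int_term, A), T / (k + 1))
            S = mat_add(S, int_term)
    Bd = mat_mul(S, B)
    return Ad, Bd
```

For a nilpotent A (e.g. a double integrator, where A² = 0) the truncation is exact once ℓ ≥ 2, which makes the degree/accuracy trade-off easy to verify by hand.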
Hierarchical constellations and MIMO (spatial multiplexing [12, 13]) are methods to offer multiresolution. The authors of this paper have previously analyzed and evaluated these two forms of multiresolution considering the WCDMA technology in [14–16]. In OFDMA-based networks, the transmission of different fractions of the total set of subcarriers (chunks), depending on the position of the mobiles, is another way to offer multiresolution. Any of these methods is able to provide unequal bit error protection. In any case there are two or more classes of bits with different error protection, to which different streams of information can be mapped. Regardless of the channel conditions, a given user always attempts to demodulate both the more protected bits and the other bits that carry the additional resolution. Depending on its position inside the cell, more or fewer blocks with additional resolution will be correctly received by the mobile user. However, the basic quality will always be correctly received independently of the position of any user, within the 95% coverage target.
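The unequal-protection idea behind hierarchical constellations can be sketched with a toy hierarchical 16-QAM mapper. The bit-to-quadrant assignment and the priority parameter `alpha` are invented for illustration; real systems (e.g. DVB-T hierarchical modulation) define their own mappings and parameters:

```python
def hier16qam(bits, alpha=2.0):
    """Map 4 bits to one hierarchical 16-QAM symbol.
    The first two (high-priority) bits pick the quadrant, whose points are
    pushed apart by alpha; the last two (low-priority) bits pick the point
    inside the quadrant.  With alpha > 1 the quadrant decision survives
    more noise than the inner decision: unequal error protection."""
    b0, b1, b2, b3 = bits
    i = alpha * (1 - 2 * b0) + (1 - 2 * b2)   # in-phase component
    q = alpha * (1 - 2 * b1) + (1 - 2 * b3)   # quadrature component
    return complex(i, q)

def demod_high_priority(symbol):
    """Recover only the two protected bits: the quadrant signs."""
    return (int(symbol.real < 0), int(symbol.imag < 0))
```

A far user would only run `demod_high_priority`, recovering the base stream; a near user additionally slices the inner point for the refinement bits.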
Since a menu game is a game with incomplete information, we focus on sequential equilibria. As we allow for a Polish space of types and a compact metric space of contracts, its extensive form will, typically, be infinite. Despite this technical difficulty, it is easy to define the notion of sequential equilibrium of a menu game. In fact, each principal has a unique information set and all of the agent's information sets are singletons. Therefore, an assessment, i.e. a pair of beliefs and strategies, is consistent if beliefs are defined using the strategy and Bayes' rule. Moreover, a consistent assessment is sequentially rational if: (1) the agent optimizes at every possible type and vector of menus and (2) each principal optimizes given the strategy of the other principals and the strategy of the agent.
This paper has presented a systematic technique, the hierarchical, tree-structured approach, for creating dynamic VEs and managing VOs in a VE. Through this approach, several crucial VEs for the amino acids group have been created. Users are able to visualize, interact with VOs, and walk through the VE. To develop dynamic VEs in the VR system, four modules have been explained in detail. Quantitative and qualitative data have been collected, respectively, to examine the capacity usage of each projected VE and to evaluate the usability of the VR system built with the proposed approach. The findings have shown that the VR system utilizing the hierarchical, tree-structured approach is useful for teaching and easy and fun to use. The respondents have also indicated that they could understand the subject better using the system. Therefore, this approach may foster better understanding among students in their learning. We believe that many applications in various fields can be produced using this study, including gaming, manufacturing, architecture, medicine, and entertainment.
Researchers in both communities have used Dynamic Networks of Hybrid Automata (DNHA) to model systems with evolving structure (Deshpande, et al., 1997). Informally, DNHA allow for interacting automata to create and destroy links among themselves and for the creation and destruction of automata. At the level of software implementation, this model has to incorporate the mechanisms by which software modules interact, which are called models of computation, or semantic frameworks. The choice of a model of computation for a specific implementation depends on the properties of the underlying problem domain (Edwards, et al., 1997). The problem of modelling systems with evolving structure is also discussed in (Milner, 1996). He argues that a rich conceptual development that gives a distinct character to the principles and concepts underlying computing, is in progress. In his claim, the distinct and unifying theme encompassing the new developments is what he calls "Information flow - not only the volume and quantity of flow, but the structure of the items which flow and the structure and the control of the flow itself". This is why Milner developed the Pi-calculus, an idealized modelling programming language and a mathematical model of processes whose interactions change with time (Milner, 1999). The Pi-calculus has been used to model service networks and other systems with evolving structure.
The improvement of productivity in the field of manufacturing requires the automation of tasks and integration of techniques such as CAD and CAM. However, events such as start-up, maintenance, and faults cannot be treated completely automatically. In general, supervision by human operators is necessary, because the knowledge, experience, and skills for working with unexpected situations are very difficult to structure or reproduce. In addition, several reports confirm that totally automated machines are very expensive, and that an appropriate combination of automated machines and human supervision is more effective for the operation of manufacturing systems when considering features such as fault-tolerance (Riascos and Miyagi, 2001; Miyagi and Riascos, 2002). Thus, in the automated manufacturing systems context, the balanced automation approach considering an appropriate level of automation is more effective than either totally automated machines or anthropocentric systems (Camarinha-Matos, 1996).