Synthesizing even one of these solutions in hardware is still an extremely costly task (DOPPA; ROSCA; BOGDAN, 2019). Along these lines, synthesizing all of these possibilities while meeting time-to-market is infeasible. Among the techniques used to reduce the time spent on Design Space Exploration (DSE) for computing systems is the prediction of the area, power, and performance costs of the candidate solutions in the design space. Prediction models include machine-learning techniques and reduce the number of syntheses and simulations required, since they make it possible to estimate or classify new values from a training data set (O'NEAL; BRISK, 2018). This well-established technique appears in several works in the literature (ÏPEK et al., 2006; OZISIKYILMAZ; MEMIK; CHOUDHARY, 2008) and makes it possible to quickly discover near-optimal solutions that satisfy the system's multiple objectives (KIM; DOPPA; PANDE, 2018).
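
As a concrete illustration of the idea, the sketch below estimates the latency of an unsynthesized configuration from a handful of already-synthesized ones. The configuration features, the measurements and the nearest-neighbour regressor are all illustrative assumptions, not the models used in the cited works.

```python
import math

# Hypothetical training set: (cache_kb, cores, freq_ghz) -> measured latency
# (cycles) obtained from a few actual syntheses. Names and numbers are
# invented for illustration only.
train = [
    ((32, 2, 1.0), 900.0),
    ((64, 2, 1.5), 700.0),
    ((32, 4, 1.5), 650.0),
    ((64, 4, 2.0), 500.0),
]

def predict_latency(cfg, k=2):
    """Estimate the latency of an unsynthesized configuration by averaging
    the k nearest already-synthesized neighbours (Euclidean distance)."""
    ranked = sorted(train, key=lambda t: math.dist(t[0], cfg))
    return sum(y for _, y in ranked[:k]) / k

# Score a candidate without running synthesis or simulation:
estimate = predict_latency((64, 4, 1.5))
```

Replacing a multi-hour synthesis run with a prediction like this is what lets a DSE loop rank many candidates and reserve actual synthesis for the most promising few.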

Current design flow. At present, the design of many RTLSs is based solely on prototyping. Developers typically select a communication technology and several algorithms for location estimation. By performing a series of real-world tests (e.g., by trying different movement patterns or changing transmit power) they aim to calibrate the prototype and arrive at an optimal configuration. The choice of technology and localization method is therefore made empirically at the very beginning of development, and afterwards designers stick to that decision. If it turns out to be incorrect, it may entail the development of a new prototype. Designers do not have sufficient tools to try out different alternatives without violating time-to-market or running out of budget, because prototyping can be both expensive and time-consuming. At the same time, real-world experiments are of crucial importance, because it is impossible to consider and predict all relevant factors and events in a model. What we are trying to emphasize is that an intermediate stage between specification and prototyping is required, which would greatly increase design space exploration quality and designers' confidence in their decisions. Such a stage could include building models of different system configurations, simulating them and analyzing the results using verification techniques. The latter could be used in a feedback loop, adjusting model parameters for further simulations.

NoC-based MPSoCs can provide massive computing power on a single chip, achieving hundreds of billions of operations per second by employing dozens of processing cores that communicate over a packet-switched network at rates exceeding 100 Tbps. Such devices can support the convergence of several appliances (e.g. HDTV, multiple wireless communication standards, media players, gaming) due to their comparatively high performance, flexibility and power efficiency. Because the space of design alternatives is vast, evaluating NoC-based MPSoCs at lower abstraction levels does not provide the support required to find the most efficient NoC architecture under the performance constraints (e.g. latency, power) of a given application at early stages of the design process. Thus, NoC-based MPSoC design requires simple and accurate high-level models in order to obtain precise performance results for each design alternative in an acceptable design time. In this context, the present Thesis has two main contributions: (i) the development of abstract NoC models providing accurate performance evaluation; and (ii) the integration of the proposed models into a model-based design flow, allowing design space exploration of NoC-based MPSoCs at early stages of the design flow.

These results, provided by Realpaver in less than one second on a Core 2 Duo 3 GHz with 8 GB of RAM, can be considered deductions made from the initial data. Indeed, the difficulty in finding a valid parameter set is that the steady-state values must lie within their intervals. These two approaches facilitate the manual search of an initial parameter space. The analytical solution of the equations indicates, for each parameter, which variables are impacted by a change in the parameter's value. We use this information to define an order for checking the variables, such that there are parameters allowing us to adjust a variable's value without changing the values of the variables that are already correct. Consequently, to find an initial valid parameter set, one only has to tune some parameters to fix the value of the first variable, then tune some other parameters to fix the value of the second variable, and so on. If the value of one variable cannot be set within its interval, one has to step back and change the values of the parameters used to fix the value of the previous variable. The variable order we use is: IRP, TfR1 (through the adjustment of k IRP→TfR1 ,
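
The tune-and-backtrack procedure described above can be sketched as follows. The variable order, candidate values, target intervals and the stand-in steady-state model are all invented for illustration; the real model is the set of steady-state equations handled by Realpaver.

```python
# Each variable in `order` is fixed by tuning its own parameter; thanks to
# the ordering, adjusting a later parameter never perturbs an earlier,
# already-correct variable.
order = ["IRP", "TfR1"]
candidates = {"IRP": [0.1, 0.5, 1.0], "TfR1": [0.2, 0.8]}
intervals = {"IRP": (0.4, 0.6), "TfR1": (0.7, 0.9)}

def steady_state(var, params):
    # Stand-in for the analytical steady-state expression of `var`.
    return params[var]

def search(params, i=0):
    if i == len(order):
        return params                      # every variable is in its interval
    var = order[i]
    lo, hi = intervals[var]
    for value in candidates[var]:          # tune the parameter fixing `var`
        params[var] = value
        if lo <= steady_state(var, params) <= hi and (r := search(params, i + 1)):
            return r
    params.pop(var, None)                  # step back: undo and let caller retry
    return None

solution = search({})
```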

However, the specific FFM requirements vary over the application range: high Galois field dimensions are used in cryptography, whereas for FFMs in channel coding the focus lies on hi[r]

The performance and costs of these NoCs strongly depend on NoC-specific parameters such as topology (Neuenhahn et al., 2006), data word length, routing algorithm and many more (Bjerregaard and Mahadevan, 2006). These NoC parameters span a huge design space for NoCs. As each application or application class has different communication requirements, the appropriate NoC parameter combination that fulfills these requirements at minimal cost has to be found (Ahonen et al., 2005; Benini, 2006).
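
For a small parameter space, the search described above can in principle be done by brute-force enumeration, as in the sketch below. The parameter names, the cost model and the requirement check are assumptions for illustration, not taken from the cited works.

```python
from itertools import product

# Illustrative NoC parameter space.
topologies   = ["mesh", "torus", "ring"]
word_lengths = [32, 64]
routings     = ["xy", "adaptive"]

def cost(topo, width, routing):
    # Toy area/power proxy: wider links and richer topologies cost more.
    base = {"ring": 1.0, "mesh": 1.5, "torus": 2.0}[topo]
    return base * width / 32 + (0.5 if routing == "adaptive" else 0.0)

def meets_requirements(topo, width, routing):
    # Toy communication requirement: the ring does not offer enough bandwidth.
    return topo != "ring"

# Keep only feasible combinations, then pick the cheapest one.
feasible = [c for c in product(topologies, word_lengths, routings)
            if meets_requirements(*c)]
best = min(feasible, key=lambda c: cost(*c))
```

With realistically sized parameter spaces the product of all options explodes, which is exactly why the prediction and abstract-model techniques discussed elsewhere in this collection are needed.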

increases, new commercial opportunities in space arise, which may influence the military use of space, and space power gains added value. In fact, history tells us that frontiers explored and occupied by human beings have, at some point, been (and still are) subject to conflict. The question here is whether space will escape this trend. The research behind the present article started from the presupposition that space is a challenging field for Europe as it aims to grow from a political and economic viewpoint. However, as Director General António Rodotá of ESA stated in November 2001, “We are still at the dawn of the space age”. Space exploration – although it was part of the collective imagination long before – is effectively just over 50 years old, and is a field that is still unknown in many ways.

Works in [14-20] discuss the estimation of state space parameters using different sampling techniques, but none of them warns about this inadequacy. Although any state space exploration algorithm (say, depth-first, breadth-first, or random walks) might seem inadequate because it depends on the system's progression, in fact these algorithms provide systematic sampling, a type of probability sampling; however, no details on the sampling procedure are given.
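
The dependence on the system's progression is easy to see in a random-walk sampler: each drawn state is a successor of the previous one, so states are not sampled independently. The state graph below is invented for illustration; in a real model checker successors would be generated on the fly from the transition relation.

```python
import random

# Minimal random-walk sampler over an explicitly given state graph.
graph = {"s0": ["s1", "s2"], "s1": ["s0", "s3"], "s2": ["s3"], "s3": ["s0"]}

def random_walk_sample(start, steps, rng):
    """Return the sequence of states visited by one walk of `steps` moves.
    Because each move depends on the current state, the sample follows the
    system's progression rather than drawing states independently."""
    state, visited = start, [start]
    for _ in range(steps):
        state = rng.choice(graph[state])
        visited.append(state)
    return visited

walk = random_walk_sample("s0", 10, random.Random(0))
```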

This payload was flown on the CRS-3 launch of the Space Exploration Technologies (SpaceX) Dragon spacecraft, on a Falcon 9 v1.1 rocket which successfully launched on April 18, 2014. After six days, the space plates were removed from the MELFI (Minus Eighty Lab Freezer For ISS) and partially thawed. However, technical problems arose and the space plates were placed back into the MELFI until December 8, 2014. At that time, all three plates were thawed and the OD600 of each well (3×3 grid) was measured at time 0 (60 min after removal from the freezer) and then every 24 h for four days. Measurements were performed in a Molecular Devices SpectraMax M5e plate reader which had been modified for integration onto the ISS. On these same days, equivalent measurements of the ground plates were taken in a Molecular Devices SpectraMax M5e plate reader at UC Davis. The exception was the initial partial thawing, which was not replicated with the ground plates since the amount of thaw was not reported by the astronauts. After the experiment, the ground plates were placed back at −80 °C and the space plates were placed back into

When trying to make sense of this logic and reinterpret it as a resource for the ‘being-space’ relationship, the connection between Nishida’s arguments and an ontological account of space can be smoothly accomplished. To that end, I will divide his discourse into two arguments: the dialectical and the grammatical argument. In the dialectical argument, what is at stake is the idea that things exist in relation to one another, and that something that is exists against what is not: we are humans only by opposition to what is not-human; or, I am myself only in opposition to others that are not myself. But, in Nishida’s core logic, there must be a basho where this opposition is reflected and that sets it up. So, I am human against what is not-human, and this relation is sustained and made possible through a basho that is space: I can only be through the medium of space. Once again, the core logic tells us that ‘everything that exists, exists in something else’: meaning that beings that exist, exist in space. Here, we are already forming a judgment.

Keywords: prime fields, p-adic number fields, adele ring, p-adic Banach spaces, adelic Banach space, p-adic operators, adelic operators. Mathematics Subject Classification: 05E15, 11[r]

According to Proposition 6.1, we can compute the dimension of the tangent space to the character variety Hom(Γ, G)//G at an equivalence class of good representations by computing the dimension of the corresponding first cohomology group. In order to obtain the latter, it will be important to recall concepts and examples related to group cohomology given in section 1.3, in particular example 1.27, where we analysed the first cohomology groups of a finitely generated group Γ with coefficients in the Γ-module g_{Ad ρ}, H
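
The relationship invoked above can be stated compactly; this is the standard formulation (going back to Weil), reproduced here for context rather than taken from the excerpt:

```latex
% Tangent space to the character variety at the class of a good
% representation \rho:
\dim T_{[\rho]}\bigl(\operatorname{Hom}(\Gamma, G)/\!/G\bigr)
  \;=\; \dim H^{1}\bigl(\Gamma, \mathfrak{g}_{\operatorname{Ad}\rho}\bigr),
```

where 𝔤_{Ad ρ} denotes the Lie algebra of G regarded as a Γ-module via Ad ∘ ρ.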

dimensionality of vocational exploration, allowing a deeper theoretical and empirical understanding of the nature of vocational exploration (Taveira, 2000). The internal consistency analysis of each factor in this study yielded satisfactory values ranging from 0.63 to 0.82 for 9th-grade students and from 0.57 to 0.89 for 12th-grade students. The overall mean alpha coefficient is 0.71 for the 9th grade and 0.88 for the 12th grade (Taveira, 2000, p. 262). Still with regard to the school population, Rowold and Staufenbiel (2010) carried out two studies to test the dimensionality of the German version of the Career Exploration Survey (CES-G; Rowold & Staufenbiel, 2010) with students from German secondary schools and universities. In the first study they used a translated version of the questionnaire, and for the second study they added 7 items better suited to the population in question. Confirmatory factor analysis of the data supported the hypothesized 16-factor model, leading the authors to suggest that this instrument is a good measure of vocational exploration (Rowold & Staufenbiel, 2010). The internal consistency of the scales in the first study ranges from 0.55 to 0.88, and in the second study from 0.72 to 0.84, corroborating the suitability of the second version of the questionnaire for the German population (CES-G; Rowold & Staufenbiel, 2010). Although the emergence of Stumpf et al.'s (1983) model of vocational exploration marked the first development of specific and systematic research on the vocational exploration process, few studies have aimed at developing the scale itself.
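
The internal-consistency values reported above are Cronbach's alpha coefficients. A minimal computation of the coefficient is sketched below; the response matrix is invented purely to exercise the formula.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha.
    items: list of item-score lists, one inner list per item,
    all over the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-respondent totals
    item_var = sum(pvariance(scores) for scores in items)     # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three items answered by four respondents (hypothetical data):
alpha = cronbach_alpha([[3, 4, 3, 5], [2, 4, 3, 5], [3, 5, 4, 5]])
```

Values in the 0.7 to 0.9 range, like most of those quoted above, are conventionally read as acceptable to good internal consistency.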

Furthermore, our results provide a non-trivial and unified approach to the existence problem of large games. In fact, they are designed to meet the following two criteria. First, the conditions that we show to be equivalent are stated in such a way that they can be falsified. This goal is obtained by requiring the action space to be merely a separable metric space, rather than compact. Second, by particularizing the action space to be compact, we obtain as a corollary to our results the classical existence theorems of Schmeidler (1973), Mas-Colell (1984), Khan and Sun (1995b) and Khan and Sun (1999).

By way of conclusion, it is interesting to note that, in terms of exploitation, ITV companies attach the same importance to their product as to the markets in which they operate, and apply the knowledge acquired to maintaining them. With regard to exploration, ITV companies give more importance to seeking new markets and new customers, and gather information on those same points; by contrast, they invest less in strengthening innovation capabilities in areas where they lack experience, and are reluctant to implement new production processes or to acquire skills and processes for developing new products. This suggests these are premises closely tied to costs: considering that information on new markets and potential customers can be obtained from public bodies such as AICEP, the costs in this respect become lower; likewise, when a company wants to invest in its presence at international trade fairs, there are QREN programmes that finance up to 45% of that presence. Product innovation is a different matter, because it requires immediate financial availability, research time for innovation, and an understanding of what is to be done; once the new product has been conceived, it must be introduced into the markets and promoted, with the expected return always coming in the medium to long term.

Euler introduced curves of constant breadth in 1778 [7]. He considered these special curves in the plane. Later, many geometers showed increasing interest in the properties of plane convex curves. Struik published a brief review of the most important publications on this subject [20]. Ball [1], Barbier [2], Blaschke [3, 4] and Mellish [14] also investigated the properties of plane curves of constant breadth. A space curve of constant breadth was obtained by Fujiwara by taking a closed curve whose normal plane at a point P has only one more point Q in common with the curve, and for which the distance d(P, Q) is constant [8].
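
For the plane convex curves mentioned above, the constant-breadth condition has a compact classical form in terms of the support function h(θ), stated here for context:

```latex
% The breadth in direction \theta is the distance between the two parallel
% support lines, h(\theta) + h(\theta + \pi); constant breadth b therefore
% means
h(\theta) + h(\theta + \pi) = b \qquad \text{for all } \theta .
```

Fujiwara's space-curve definition quoted in the text generalizes this by replacing the pair of parallel support lines with the normal plane at P meeting the curve again at a single point Q.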

As I mentioned before, I put the most stress on the entrance from the public space, as it will, in my opinion, be the most used entrance. After analyzing the composition of the square, I came to appreciate the high wall surrounding it. The wall, besides shaping the tectonics of the leisure space, has a very big impact on how the space is sensed. The wall is the base for the triangular form of the coffee house and restaurant, it supports the steel stairs going up from the square, and it is the element one passes through when entering the triangular Soda Coffee House. The composition of surrounding walls is further emphasized by constructing an urban wall on the Art Gallery, the wall which closes the composition to make a finished masterpiece.

The ranking method has the advantage that its results can be exploited in analyses involving correlations between variables by nonparametric methods. The method has the disadvantage of losing some information through the two levelings of the different characteristics: once when ranks are assigned for each feature, and once when ranks are assigned to the total scores. Through leveling, the distances between the levels of the characteristics in different space units are replaced by an arithmetic progression with ratio one.
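
The double leveling described above can be sketched as follows: rank each characteristic, sum the ranks per space unit, then rank the totals. The data matrix is invented for illustration, and ties are deliberately ignored to keep the sketch short.

```python
def ranks(values):
    """Rank positions 1..n; the highest value gets rank 1 (no tie handling)."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

# Rows: space units; columns: characteristics (arbitrary units).
data = [[10.0, 3.2], [7.5, 4.8], [12.0, 4.0]]
per_feature = [ranks(col) for col in zip(*data)]      # first leveling
totals = [sum(rs) for rs in zip(*per_feature)]        # rank sums per unit
final = ranks([-t for t in totals])                   # second leveling: lowest total ranks first
```

Note how the original distances between values (e.g. 12.0 vs 10.0 vs 7.5) collapse into the equally spaced ranks 1, 2, 3, which is precisely the information loss the text describes.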

Tridimensional space can be represented in two dimensions in many ways. One of them, perspective, has been accepted as the best technique for creating realistic images since the Renaissance. This led to the hegemony of the model of vision on which it is based: the camera obscura. Crary (1992) discusses the differences between this type of representation and others that were more popular before the 1500s. He relates the hegemony of perspective to that of a specific subjectivity. On the one hand, the camera obscura performs an act of “individuation” in which the observer, isolated and autonomous, is detached from the world he sees. “At the same time, another related and equally decisive function of the camera obscura was to sunder the act of seeing from the physical body of the observer, to decorporealise vision” (Crary, 1992: 245). Thus, the Cartesian subjectivity that is at the core of the idea of immersion is strong in images in perspective and in other technical images such as photography, cinema and television.

In recent years Arvind's group at MIT has shown the usefulness of term rewriting theory for the specification of processor architectures. In their approach, processors specified by term rewriting systems are translated into a standard hardware description language for simulation purposes. In this work we present our current investigation into the use of rewriting logic, a more powerful theoretical framework than pure rewriting, for the specification, exploration and verification of processor architectures at a higher abstraction level. We adopt the rewriting-logic environment ELAN to specify, explore and verify architectures without resorting to the details of hardware description languages for simulation purposes. Our investigation shows that simulation at the rewriting-logic level can provide useful insights to guide architectural design.
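
To convey the flavour of this style of specification, the sketch below treats the architectural state as a term and applies one rewrite rule per step until the program reaches its normal form. The two-instruction ISA and the state shape are invented for illustration; the cited work expresses such rules in ELAN, not Python.

```python
def step(state):
    """Apply one rewrite rule to the architectural state term."""
    pc, regs, prog = state["pc"], dict(state["regs"]), state["prog"]
    op, *args = prog[pc]
    if op == "li":                       # rule: load immediate
        rd, imm = args
        regs[rd] = imm
    elif op == "add":                    # rule: register add
        rd, rs, rt = args
        regs[rd] = regs[rs] + regs[rt]
    return {"pc": pc + 1, "regs": regs, "prog": prog}

prog = [("li", "r1", 2), ("li", "r2", 3), ("add", "r3", "r1", "r2")]
state = {"pc": 0, "regs": {}, "prog": prog}
while state["pc"] < len(prog):           # simulate by rewriting to normal form
    state = step(state)
```

Simulation in this style amounts to repeatedly matching and applying rules, which is what makes the rewriting-logic level convenient for early architectural exploration before committing to an HDL model.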
