Interface - User

No More Keyword Search or FAQ: Innovative Ontology and Agent Based Dynamic User Interface

Figure 1 shows the basic architecture of the dynamic user interface. There are four basic components within the architecture: the user's browser, the interface agent, the ontology, and the knowledge base that stores solutions for users' problems. A series of interrelated vocabularies, which allows users to identify and describe their problems on the user interface, is stored and organized in a structural hierarchy within the ontology. Modern web technology is used to deliver the interface through the Internet so that it can appear in the browser, facilitate interaction with the user, and deliver user requests for resolution. Software agent technology, in turn, is used to facilitate user communication. A software agent is a computer program that behaves like a human and is capable of autonomous actions in pursuit of a specific goal [8,11]. To obtain the dynamic user interface, the user simply clicks on the target Uniform Resource Locator (URL). The interface agent, which possesses communication capability, then delivers a dynamic user interface to the browser based on the information stored in the ontology. The dynamic and interactive communication capabilities of the interface agent help users identify and present their problems. First, the interface agent asks the user to select a problem type on the user interface. Based on this input, the interface agent generates the next category of possible problem scenarios from the ontology. This interaction continues until the agent has gathered sufficient information. When the problem is completely described on the interface, the knowledge base delivers a related solution to the user. Since each set of problem descriptions is linked to a particular solution in the knowledge base, the return of the most appropriate solution is guaranteed.
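The interaction loop described above can be sketched in a few lines. The ontology hierarchy, problem vocabulary, and solutions below are hypothetical toy data, not taken from the paper; the sketch only illustrates how the agent narrows the problem by offering the sub-categories of the user's last selection until a leaf is reached, at which point the knowledge base returns the linked solution.

```python
# Toy ontology: each key is a problem category, each value its sub-categories.
ONTOLOGY = {
    "network": {"no-connection": {}, "slow": {}},
    "printing": {"paper-jam": {}, "no-toner": {}},
}

# Each complete problem description (a path to a leaf) is linked to one solution.
KNOWLEDGE_BASE = {
    ("network", "no-connection"): "Check the cable and restart the router.",
    ("network", "slow"): "Run a speed test; contact your ISP if below plan.",
    ("printing", "paper-jam"): "Open tray 2 and remove the jammed sheet.",
    ("printing", "no-toner"): "Replace the toner cartridge.",
}

def interface_agent(selections):
    """Walk the ontology following the user's selections so far.
    Returns ("ask", options) while more information is needed,
    or ("solution", text) once the problem is fully described."""
    node, path = ONTOLOGY, []
    for choice in selections:
        if choice not in node:
            raise ValueError(f"unknown option: {choice}")
        path.append(choice)
        node = node[choice]
    if node:  # still has sub-categories: present the next question
        return ("ask", sorted(node))
    return ("solution", KNOWLEDGE_BASE[tuple(path)])
```

For example, `interface_agent(["network"])` yields the next set of options to display, while `interface_agent(["printing", "paper-jam"])` returns the stored solution, mirroring the guarantee that every complete description maps to exactly one answer.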

Graphical User Interface for educational content programming with social robots activities and how teachers may perceive it

This paper is an extension of the published work entitled "Graphical User Interface for Adaptive Human-Robot Interaction Design in Educational Activities Creation", presented at the XXIX Simpósio Brasileiro de Informática na Educação (SBIE). The first version presented the Graphical User Interface (GUI) components and the information that would be most helpful to teachers, such as dialogue, content, and student and evaluation databases. This extended version also presents some of the system's implementation decisions and interface functionalities that are more relevant to programmers, such as the vision subsystem and its methods, the adaptation algorithm, and the interaction interface. It also delivers an unprecedented study with regular elementary school teachers, who attended a presentation about the proposed system and were then asked how helpful this type of solution could be to them in after-class exercises.

Avatar modeling: a telepresence study with natural user interface

Nowadays, virtual environments are emerging in multimedia applications and video games. In systems involving virtual environments, there is usually a virtual representation of the user: his/her avatar. The avatar is the link between the user and the virtual environment. Nevertheless, this bond between user and avatar lacks empirical grounding. Some characteristics of the avatar might enhance that relationship, such as the interface between the user and the virtual environment, or the avatar's visual aspect, morphology, and dynamics. All of these characteristics may play an important role in making the user feel more comfortable when he/she is represented by the avatar in the virtual world. In other words, if these characteristics were exhaustively studied in order to obtain an almost "perfect" avatar, it would be easier for the user to experience the avatar as him/herself. This would lead to a specific avatar for each person, but all avatars would share identical characteristics, such as being scaled to the user, visually resembling him/her, dynamic, and responsive in real time. When users can be completely immersed in the virtual environment, the experience would seem as real as the real world.

Object Modeling for User-Centered Development and User Interface Design: The Wisdom Approach

As consistently mentioned throughout the previous chapter, there is common agreement on the clear advantage of separating the user interface from the functional core in conceptual and implementation architectural models for interactive systems. In section III.2.3 we discussed how such a conceptual (not necessarily physical) separation leverages user-interface knowledge in providing design assistance (evaluation, exploration) and run-time services (user interface management, help systems, etc.). This approach should support the implementation of internal functionality independent of the user interface, fostering reuse, maintainability, and multiple-user-interface development, an increasingly important requirement with the advent of multiple information appliances [Norman, 1998]. The Wisdom interaction model is an external view of the system from the user-interface perspective. This model structures the user interface, identifying the different elements that compose the dialogue and presentation structure of the system (look and feel) and how they relate to the domain-specific information in the functional core. The different elements that compose the Wisdom user-interface architecture are described in detail in section IV.3.2. To specify the interaction model in Wisdom we use the corresponding UML notational extensions described in section IV.4.3. The information in the interaction model corresponds to a technical activity carried out by the user-interface development team. The interaction model is mainly built during the analysis workflow of the elaboration phase, and it influences the presentation and dialogue models devised in the construction workflows.

A graphical user interface for policy composition in CIM-SPL

CIM-SPL is a declarative policy specification language proposed within the DMTF. SPL policies allow the specification of rules to govern the behavior of a system using a PBM approach. However, SPL requires thorough knowledge of the language syntax as well as full understanding of the management scenario and its available management features. This paper describes a graphical CIM-SPL editor application and its supporting policy-editing metaphor. A graphical composition process for SPL policies is proposed, based on drag-and-drop operations on policy component items in a graphical interface. The editor includes policy-creation wizards that guide the user through the policy specification process, in order to relieve network administrators of the difficulties associated with the intricacies of the SPL language. Additionally, a text-based SPL editing tool is provided as a complement for experienced SPL operators.

Extending a User Interface Prototyping Tool with Automatic MISRA C Code Generation

The aim of programming code generation in the PVSio-web framework is to produce a module that implements the user interface of a device, which can be compiled and linked into the device software without any particular assumptions about its architecture. In this way, the user interface module can be used without forcing design choices on the rest of the software. In our approach, the generated module contains a set of C functions. The main ones are, for each Emucharts trigger: (i) a permission function, which checks whether the trigger event is permitted, i.e., whether it is associated with any transition from the current state, and (ii) a transition function that, according to the current state, updates it, provided that the guard condition of an outgoing transition holds. The code includes logically redundant tests (assert macros) to improve robustness.
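The per-trigger permission/transition pattern described above can be illustrated with a small sketch. The generated code is MISRA C, so this Python version only mirrors the structure; the states ("off", "on"), the "click" trigger, and the transition table are hypothetical, not taken from any PVSio-web output.

```python
# Transition table: (state, trigger) -> (guard, next_state).
# Guards here are trivially true; in general they inspect the current state.
TRANSITIONS = {
    ("off", "click"): (lambda state: True, "on"),
    ("on", "click"): (lambda state: True, "off"),
}

def permission(state, trigger):
    """Permission function: is the trigger associated with any
    transition leaving the current state?"""
    return (state, trigger) in TRANSITIONS

def transition(state, trigger):
    """Transition function: update the state, provided the guard of the
    outgoing transition holds. The assert is a logically redundant check,
    mirroring the assert macros in the generated C code."""
    assert permission(state, trigger)
    guard, next_state = TRANSITIONS[(state, trigger)]
    return next_state if guard(state) else state
```

A caller would first query `permission(state, "click")` and, only if permitted, apply `transition`, exactly as the generated module expects the surrounding device software to do.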

Physiologically attentive user interface for robot teleoperation: real time emotional state estimation and interface modification using physiology, facial expressions and eye movements

Abstract: We developed a framework for Physiologically Attentive User Interfaces to reduce the interaction gap between humans and machines in life-critical robot teleoperation. Our system exploits the emotional-state-awareness capabilities of psychophysiology and classifies three emotional states (Resting, Stress, and Workload) by analysing physiological data together with facial expressions and eye movements. This emotional state estimate is then used to create a dynamic interface that updates in real time with respect to the user's emotional state. The results of a preliminary evaluation of the emotional state classifier for robot teleoperation are presented, and its future possibilities are discussed.

Extending User Interface Design Patterns with accessibility recommendations to guide mobile developers

With the dissemination of mobile devices and the migration of activities that were once executed only on desktop computers to smartphones and tablets, concerns related to accessibility in these environments have increased. Accessibility barriers can directly affect access to information: a user who has difficulty accessing important information can become frustrated, absorbing the content with difficulty or even being unable to assimilate any information. The difficulties encountered by a great variety of users on mobile devices add new challenges to the task of creating applications accessible to everyone. However, the accessibility impact of mobile interface design patterns on the lives of disabled people has not been widely addressed in academic work. At the same time, the community of mobile designers and developers has made significant advances in identifying accessibility issues with design patterns on mobile interfaces, reporting these findings in virtual discussion spaces such as forums and blogs. Against this backdrop, this project proposes recommendations that help mitigate or eliminate the accessibility barriers created by interface design patterns in mobile applications. These recommendations were created based on two main studies. The first study was an accessibility evaluation based on interaction design patterns in an e-learning application, with 21 participants without disabilities; it collected the emotional response to seven design patterns and video analysis using communicability metrics. The second study aimed to explore the experiences and knowledge of professionals through an ethnographic study of 18 virtual communities of mobile design and development, with the goal of identifying issues in the accessibility of Android mobile interface design patterns. This work presents two main contributions. It presents an approach to support the employment of virtual ethnography studies in software engineering as a means of observing software development practice based on the information available in online communities. It also proposes 22 recommendations covering 11 interface design patterns and 11 cross-section elements of mobile applications, with the goal of improving the overall accessibility of mobile devices.

Graphical user interface (GUI) for design of passenger car suspension system using random road profile

The study and design of car suspension systems are an important part of automotive engineering. Passive suspension systems are widely used in passenger vehicles, but they cannot effectively sustain vehicle comfort under different road profiles without the help of a computer design tool for analysing these systems. Systems with more than two degrees of freedom require greater analytical and computational effort. To solve these kinds of systems numerically, engineers have to master a suitable programming environment. The essential design and analysis of suspension systems can be handled using software such as Matlab. MATLAB (Matrix Laboratory) is a commercial software package developed in the seventies by Cleve Moler [1] for numerical computations, especially matrix manipulations. Nowadays it is a high-level technical computing language and interactive environment for algorithm building, result visualization and analysis, and numerical computation, widely used in academia and industry [2]. Its analysis capabilities, flexibility, reliability, and powerful graphics make Matlab the software package most used by engineers and scientists [3]. Matlab is an interactive tool with many accurate built-in mathematical functions, which provide good solutions to many mathematical and engineering problems such as matrix manipulation and static and dynamic systems. In this work, Matlab is used as a programming tool to design and build a graphical user interface (GUI) for the analysis of a car suspension system, in the MATLAB/GUIDE environment [4].

Evaluating the Representation of User Interface Elements in Feature Models: an Empirical Study

This extension provides a feature element to model features that represent aspects of the user interface [11]. In UI-Odyssey-Fex, all elements needed to represent features in the original notation were implemented and, in addition, a new feature category was included, called User Interface. An example of an FM designed in accordance with the extension is presented in Figure 2. The figure shows the same SPL represented in Section II-A, but including the User Interface feature category. In this case, the SPL engineer can represent the UI elements that are shared among the Financial products derived from the SPL. For instance, the Transfer functional feature has a communication link with the User Interface Transfer System feature.

Redesign of the user interface of the Us'Em mobile application

Us’em devices (bracelets, one orange and one blue) measure the movements of the arm-hand of the user (the patient). This information is then sent to a central server where it is stored and processed. From there it can be sent to the Us’em mobile app, to a therapist’s host (for instance, an online management platform used in the clinic) and, according to the mobile app’s settings, to a carer’s host (which may be a computer or mobile app). The system thus requires an internet connection for data transfer. This connectivity enables more detailed review of the patient’s progress and access to and monitoring of the patient’s data. Given this definition of the Us’em system, an example scenario could be as follows. A post-stroke patient with upper-limb disorders is discharged home. His/her rehabilitation plan involves training his/her impaired arm-hand when eating. So, at home, he/she wears the Us’em devices and tracks, through the mobile app, the ratio of his/her movements. This data is then provided, through another interface, to his/her rehabilitation therapist, who analyses the information, becomes more aware of the patient’s activity and progress, and may adjust the rehabilitation plan or activities. In the next clinical meeting or therapy session, they can discuss it, and the patient may be given feedback on his/her performance as tracked by Us’em.

LIANA Model Integration System - architecture, user interface design and application in MOIRA DSS

Developers of each model affect the appearance of the GUI of the EDSS when they decide how to split the model data into different files. For example, the input or output data for a particular model can be presented in the GUI as one long table (containing all of the parameters) or as a number of short tables, each related to one of the parameters. The developers of the model make this decision. Such participation helps to construct the user interface quickly and to exploit the developers' broad experience in the scientific subject and their experience communicating with users of the particular model.

User-Centered Design of the Interface Prototype of a Business Intelligence Mobile Application

Acknowledging the primary similarities and differences between the two operating systems was a very important step towards bypassing the differences. Regarding screen sizes, the choice fell on the iPhone's 320x480 and the iPad's 1024x768. The "Back navigation" was not really a problem, because the iOS "Back" and Android "Up" buttons have the same function. Concerning tab navigation, the choice fell on the iOS tab bar, mainly because the user needs that were going to be the subject of interface design could be grouped into four different sections (KPIs, Notifications, Item file, and Shortcuts for other applications), so the iOS approach seemed better. Regarding the view-changing functionality, both versions were used: iOS segmented controls were used in some sections because they gave greater visibility to the available options, while the Android drop-down menu was used in sections where the number of available options exceeded three and they did not need much visibility. Figure 3.6 illustrates this difference.

Dynamic graphical user interface generation for web-based public display applications

In iCrafter (Ponnekanti et al., 2001), services register in an Interface Manager and send it a service description in an XML-based language called Service Description Language (SDL) – which lists the operations supported by the service, in a similar way to ISL. Clients obtain a list of available services from the Interface Manager and can ask for the user interface of a specific service (or a combination of services). When asked for a user interface, the Interface Manager will search for a suitable interface generator: it first searches for a generator for that specific service, then for a generator for that service interface, and finally for the service-independent generator. This allows the creation of custom user interfaces for a service, if the developer chooses to, but guarantees that a suitable user interface can always be presented to the user. The interface generator uses a template to generate code in a user interface language supported by the controller device (iCrafter supports HTML, VoiceXML, SUIML and MoDAL), so controller devices are assumed to be capable of running a user interface interpreter that can then render the received user interface code.
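The generator lookup order described above (specific service, then service interface, then a service-independent fallback) is a simple chain that can be sketched as follows. The registry contents and key scheme below are hypothetical illustrations, not iCrafter's actual API.

```python
# Hypothetical generator registry: most specific keys first.
# A generator here is just a callable returning UI code for the controller.
GENERATORS = {
    "service:Projector": lambda: "<custom projector UI>",
    "interface:PowerSwitch": lambda: "<power-switch UI>",
    "*": lambda: "<generic auto-generated UI>",  # service-independent fallback
}

def find_generator(service, interface):
    """Search for a UI generator: first one written for this specific
    service, then one for its service interface, and finally the
    service-independent generator, so a UI can always be produced."""
    for key in (f"service:{service}", f"interface:{interface}", "*"):
        if key in GENERATORS:
            return GENERATORS[key]
    raise LookupError("no generator registered")  # unreachable while '*' exists
```

This ordering is what lets a developer supply a hand-crafted interface for one service without losing the guarantee that every other service still gets some usable, automatically generated interface.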

HFBUIT: Design Aid Tools For A Human Factor Based User Interface Design

Exploring user interface design and development problems is at the core of current HCI research. Although there have been considerable advancements in computer technology, human-factors considerations are still lacking. This results in the user frequently becoming confused or frustrated when trying to interact with software. Designers should utilize knowledge about the user to configure the interface for different users, since each user may have different skills, levels of experience, or cognitive and physical abilities. This paper offers a five-phase framework for a tool that might help designers design a human-factors-based user interface.

Hacker design and graphic user interface customizations

3 Original quotation: “In previous versions of the Macintosh OS, the look of user interface items was built into the definition functions for the individual items. These were stored in resources, and the standard ones were in the System file. But programmers could write their own to customize the user interface. Window appearances were defined by WDEF resources, menus by MDEFs and controls by CDEFs. Giving the desktop a new look required rewriting all of those separate pieces of code”. 4 Original quotation: “I have no first-hand information about why Apple backed away from its original plans in this area, but numerous sources say Steve Jobs axed the alternative themes after Mac OS 8.5 had already reached "final candidate." Some people contend that he was afraid Apple would be blamed for compatibility and stability problems that may occur when certain applications are run under the non-Platinum appearances. The most common interpretation, however, is that he simply didn't like the High-Tech and Gizmo looks”.

GC4S: A bioinformatics-oriented Java software library of reusable graphical user interface components

The development of AIBench applications always relies on the straightforward implementation of three different but complementary types of programmable components: operations, data types, and views. However, our long-term experience in the field has shown that the GUI (a key feature in any deployment) should be improved further. As previously stated, AIBench automatically constructs the user interface by generating (i) application menus based on the declared operations and (ii) input dialogs for obtaining the required operation parameters. However, the view components, which must be developed in Java Swing, are the responsibility of the programmer. In this regard, we noticed that these kinds of components were being copied and pasted between code bases, including non-AIBench-based applications. In doing so, developers were missing an opportunity to reuse GUI elements by sharing them between code bases and to save time in future projects.

Formalizing markup languages for user interface

EAI [Lin00] is a scientific group concerned with application integration which alerts to the importance of user interfaces in the overall integration process. However, due to the non-existence of a standard for describing and manipulating interaction data, a great opportunity is being lost. It is well known that user-interface “engineering” was relegated to second place in previous computing models [PE02b]. So there is a great opportunity for better interface design support, at both the operation and evaluation levels. In dealing with standard methods and technologies, with some concern for sharing and reusing, caution, rules, and concepts are required. If we wish to share information (of any type) across different information systems, it is important to keep these assumptions [BCMS02] in mind and, in particular, to have a standard ontology to rely on.

Pattern based user interface generation

Most of today’s software interfaces are designed with the assumption that they are going to be used by an able-bodied person who is using a typical set of input and output devices, who has typical perceptual and cognitive abilities, and who is sitting in a stable, warm environment. Any deviation from this pattern requires a new design. In [7] it is argued that automatic personalized generation of interfaces is a feasible and scalable solution to this challenge. Supple can automatically generate interfaces adapted to a person’s devices, tasks, preferences, and abilities at run time. It is not intended to replace user interface designers; instead, it offers an alternative user interface for those people whose devices, tasks, preferences, and abilities are not sufficiently addressed by the original designs. Support for users with special needs is often forgotten by interface designers. When this problem is addressed, three popular patterns are usually followed: manual redesign of the interface, limited customization support, or supplying an external assistive technology. The first approach is clearly not scalable: new devices constantly enter the market, and people’s abilities and preferences both differ greatly and often cannot be anticipated in advance. Second, today’s customization approaches typically support only changes to the organization of toolbars and menus and cosmetic changes to other parts of the interface; furthermore, even when given the opportunity, most people do not customize their applications. Finally, assistive technologies, while they often enable computer access for people who would otherwise not have it, also have limitations: they are impractical for users with temporary impairments, they do not adapt to people whose abilities change over time, and they are often abandoned by the people who need them because of factors like cost, complexity, configuration, and the need for ongoing maintenance.
In contrast, Supple generates personalized interfaces to suit the particular contexts of individual users. To be able to generate these personalized interfaces, Supple makes three important contributions:

INTELLIGENT USER INTERFACE IN FUZZY ENVIRONMENT

Human-computer interaction with a traditional User Interface is performed through a dialogue script ("menu") specified in advance, relying mainly on human intellect and unproductive navigation. This approach does not support qualitative decision-making in control systems where situations and processes cannot be structured in advance. Any dynamic change in the controlled business process (for example, in an organizational unit of an information fuzzy control system) makes it necessary to modify the dialogue script in the User Interface. This circumstance leads to a redesign of the components of the User Interface and of the entire control system. In an Intelligent User Interface, where the dialogue situations are unknown in advance and fuzzily structured and artificial intelligence is crucial, the redesign described above is impossible. To solve this and other problems, we propose a data-, information-, and knowledge-based technology for Smart/Intelligent User Interface (IUI) design, which interacts with users and systems in natural and other languages, utilizing the principles of Situational Control and Fuzzy Logic theories, Artificial Intelligence, Linguistics, Knowledge Base technologies, and others. The proposed IUI design technology is defined by multi-agents for a) Situational Control of data, information, and knowledge, b) modelling of Fuzzy Logic Inference, c) Generalization, Representation, and Explanation of knowledge, d) Planning and Decision-making, e) Dialogue Control, f) Reasoning and Systems Thinking, g) Fuzzy Control of an organizational unit in real time, under fuzzy conditions and in heterogeneous domains, and h) multi-lingual communication under uncertainty and in a Fuzzy Environment.