Companies could not provide any justification of the quality of their products to users, and users are left uncertain about the standard and quality of the software. There are other issues related to quality attributes, among them priorities and the differing views of quality held by users, developers and managers. It is recognized that the views of users, developers and managers differ: a manager is more interested in the overall quality than in a specific quality characteristic, and thus requires weights to be assigned to reflect the business requirements.
Supply chain management (SCM) is an area that has received great attention in the business community. Implementing SCM requires the coordination and integration of activities within the organization and across the supply chain. Many firms identify and qualify adequate suppliers to provide the materials and services they need. Effectively selecting and evaluating these qualified suppliers and managing their involvement in critical supply chain activities enable manufacturers to achieve the four dimensions of customer satisfaction: competitive pricing, product quality, product variety and delivery service [2,3]. This paper coordinates supplier selection, pricing and inventory decisions and proposes a cooperative game theory approach to evaluate the suppliers for an integrated multi-level supply chain.
Our experience with users from various organisations and sectors indicates that a software certification approach, a higher level of quality assessment, is beneficial to ensure software quality. The certification approach can be applied at any time during the operation of the software, so continuous quality monitoring is guaranteed. In addition, certification results provide valuable recognition of the quality of the software organization, which can support the credibility and trustworthiness of the organization. Fig. 3 demonstrates the processes involved in software certification that relate to the industry. Currently, the software engineering syllabus does not integrate these requirements. Therefore, it is important to deliver these skills and this knowledge to students, to ensure that they have sufficient knowledge to produce and manage good-quality software in the real industry.
Over the past two decades, this approach has been widely applied with great success by companies worldwide (e.g. Philips, Boeing or Nokia) to embedded application development. In the embedded field, the figures are encouraging, encompassing 10x productivity and quality increases and 60% cuts in costs (Pohl, Böckle, & Linden, 2005). The following product line definition is adopted in this paper: a software product line is a set of software-intensive systems that share a common, managed set of features satisfying the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way (Clements & Northrop, 2001).
and producing higher success rates (Cooper, 2006); as “spiral development”, a series of build, test, obtain feedback and revise iterations or loops; as “a holistic approach”, where the number one key to reducing cycle time and promptly getting to market focuses on the core team, an effective cross-functional group that remains involved from start to finish; as “metrics, accountability, and continuous improvement”, because, as Cooper (2006) argued: “you can’t manage what you don’t measure”. The point is that continuous learning and improvement become an integral, routine facet of the development process: every project is executed better than the one before. Top-performing companies measure how well individual projects perform by building post-launch and gate reviews into their idea-to-launch processes, and hold teams accountable for delivering promised results against these metrics. “Focus and effective portfolio management” and “a lean, scalable, and adaptable process” are also among the principles of the next generation of idea-to-launch process (NexGen Stage-Gate). As Cooper (2006) claimed, by moving toward NexGen processes, companies can make Stage-Gate even more effective. The process must be lean, scalable, and adaptable, ensuring that each principle becomes ingrained in the process’ language and method of operation. Success in product innovation requires many behavioral changes, such as discipline; deliberate, fact-based, and transparent decision making; responsible, accountable, effective, and true cross-functional teams; continuous improvement and learning from mistakes; and risk taking and risk awareness. The structure and content of Stage-Gate is a vehicle for change: altering how people think, act, decide, and work together (Cooper, 2006).
Representing a core dimension in organizational learning, learning orientation is conceptualized as the degree to which the firm stresses the value of learning for the long-term benefits of the firm (Hult and Ferrell, 1997). It reflects the attitude toward learning in an organization—a predilection toward the ability, commitment, necessity, and value of learning (Sujan et al., 1994). Parkhe (1991) argues that interorganizational learning mitigates the impact of diversity in partnering, thus increasing the possibility for cooperation among members. This phenomenon is more apparent in ISAs where learning occurs among partners who are not competitors (Tseng, 1999), as is the case in distribution channels. Conceivably, learning can foster cooperation among partners. Specifically, cooperation was observed to be partially a function of a partner's learning orientation. This linkage was observed in all four samples, but was especially strong in China. As such, to the extent that ISA partners in the U.S., Finland, Poland, and P.R.C. possess a genuine ability and commitment to learn, perceive the necessity of learning, and value learning, the greater will be their efforts at engaging in reciprocity and comparable or complementary coordinated activities in order to accomplish mutual results. Through joint learning, manufacturer/distribution partners are likely to share their respective companies' expertise with each other (a manifestation of a learning orientation). Such collaborative endeavors seemingly conduce to cooperation among the partners. Perhaps the nature of a particular nation's economic ideology (Ralston et al., 1997) contributed to the finding that the learning orientation/cooperation association was especially strong in China. A possible explanation for this finding could well be attributed to the official policy of the P.R.C. government.
It emphasizes learning ‘from the West’ in order to assist the Chinese economy to progress slowly towards more of a capitalistic approach. Chinese firms may well have adopted this approach of learning ‘from the West,’ as was reported in a study of Sino–Singaporean joint ventures by Tseng (1999). All Chinese partners expressed their intention of learning from their joint venture partnering experience. Thus, Chinese manufacturers, relative to their other three national counterparts, are especially keen to learn about business approaches with which they are not particularly conversant.
models for interpreting the physics of damage phenomena. On the other side, Failure Reporting, Analysis, and Corrective Action System (FRACAS) and Failure Mode and Effect Analysis (FMEA) are irreplaceable for highlighting which criticalities the experimental research should focus on, while Fault Tree Analysis (FTA) and Reliability Analysis of in-service failure Data (RDA) foresee the theoretical and actual reliability behaviour. Only by strictly joining Design of Experiments techniques (DOE, ANOVA, etc.) to Design for Quality methodologies (FRACAS, FTA, FMEA, RDA) is it possible to manage an integrated Total Quality (TQ) approach and move reliability beyond its current limits. Reliability improvements were focused on a widely distributed family of air intake manifolds, already installed, or about to be installed, in millions of specimens across different car brands. This system has to be progressively improved in reliability. Only by joining theoretical knowledge, simulation analysis and experimental results is it possible to obtain the complete, fundamental information needed for further research and manufacturing. TQM tools, if properly utilized, can be extremely useful for interpreting product reliability, allowing shorter testing procedures and a fast way to remove mistakes noticed in basic experimental diagnostics. Acknowledgment: The research presented in this paper was supported by the European Union inside the Tempus Program by the Joint European Project named “DIAUSS - Development and improvement of automotive and urban engineering studies in Serbia” [JP 516729-2011]
• Does the hardware remain unchanged between implementations?
• Can the hardware specifics be moved to another component?
• Is the design optimized enough for the next implementation?
• Can we parameterize a non-reusable component so that it becomes reusable?
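The last question can be illustrated with a small sketch. The component and register names below are hypothetical, invented purely to show the idea of injecting hardware specifics instead of hard-coding them:

```python
# Hypothetical sketch: parameterizing a hardware-specific component for reuse.
# SensorReader and the register addresses are illustrative assumptions,
# not taken from the text above.

class SensorReader:
    """Reusable reader: hardware specifics are injected, not hard-coded."""

    def __init__(self, read_register, status_register=0x00, data_register=0x04):
        # read_register is a callable supplied by the hardware-specific layer
        self._read = read_register
        self._status = status_register
        self._data = data_register

    def sample(self):
        # assumed register layout: bit 0 of the status register = data ready
        if self._read(self._status) & 0x01:
            return self._read(self._data)
        return None

# The hardware-specific part lives in a separate component; here a plain
# dictionary stands in for a real bus, only for demonstration.
fake_bus = {0x00: 0x01, 0x04: 42}
reader = SensorReader(read_register=fake_bus.__getitem__)
print(reader.sample())  # → 42
```

Because the bus access is passed in as a parameter, the same `SensorReader` can be reused across implementations whose hardware details differ.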
In developing countries it is common practice to focus on projects that are strong from a business perspective and to compromise on local projects. In some highly ranked software houses, some SQA teams work on foreign projects while others work on local ones. SQA teams working on foreign projects receive more benefits, and their salaries are usually higher than those of their colleagues; the reason for the difference is the profit margin. This adds pressure on team leads working on local projects and results in compromises on quality, because they want to spend less time on each project so that they can work on more projects and appear efficient. To make good-quality products, team leads should make sure to give proper time to every project regardless of its profit margin. It is the responsibility of higher management to ensure that quality is maintained on every project and that all teams receive equal benefits regardless of the foreign-project factor. A very highly ranked software company in Pakistan recently decided not to work on local projects at all because of this issue, which is the better option if the quality problems created by profit pressures cannot be handled.
In this paper we propose a PC middleware that shares the features of that class but goes a step further, presenting a few novel features that are particularly adapted to the coordination of teams of mobile robots. In particular, it implements the distributed shared memory model, giving each node seamless access to remote variables as if they were local and abstracting away both distribution and communication; it also includes a specific communications protocol based on a reconfigurable and adaptive TDMA approach that minimizes collisions among team members and further contributes to network stability in a medium shared with other sources of traffic. It also includes a task manager that provides enhanced synchronization services to tasks executing on a general-purpose operating system within each node. Overall, the proposed middleware is affordable, since it is based on COTS hardware technologies and open-source software; it is dependable, in the sense that it is robust to transmission errors and spurious transmissions; and it meets most of the objectives referred to in , namely simplification of the development process, reusability, integration and QoS.
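The distributed shared memory idea described above can be sketched in a few lines. This is not the paper's actual middleware; it is a minimal toy model, with invented names, of a shared variable that is read from a local replica while writes are propagated to peers, so callers never see the distribution:

```python
# Minimal sketch (illustrative, not the paper's implementation) of a
# distributed-shared-memory cell: reads hit the local replica, writes are
# handed to a broadcast function that the comms layer would provide.

class SharedVar:
    def __init__(self, name, broadcast):
        self.name = name
        self._value = None
        self._broadcast = broadcast   # callable that sends updates to peers

    def get(self):
        return self._value            # always served locally, no network wait

    def set(self, value):
        self._value = value
        self._broadcast(self.name, value)   # propagate to other team members

    def on_remote_update(self, value):
        self._value = value           # called by the comms layer on reception

# Two "nodes" wired together directly, standing in for the TDMA transport.
updates = []
node_a = SharedVar("pose", broadcast=lambda n, v: updates.append((n, v)))
node_b = SharedVar("pose", broadcast=lambda n, v: None)

node_a.set((1.0, 2.0))                # local write on node A...
node_b.on_remote_update(updates[-1][1])   # ...delivered to node B
print(node_b.get())                   # → (1.0, 2.0)
```

In a real deployment, the broadcast callable would hand the update to the TDMA slot scheduler rather than a local list.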
quality of alternatives with regard to price within a category (Jin & Suh, 2005). Organic vegetable products have advantages and technologies related to environmental friendliness. Perceived quality is not the actual quality of the brands or products; rather, it is the consumers’ judgment about an entity’s or a service’s overall excellence or superiority (Aaker, 1991). It is sometimes directly related to the reputation of the firm that manufactures the product (Davis et al., 2003), and it can be viewed as the degree and direction of the discrepancy between consumers’ perceptions and expectations (Chen & Chang, 2005). Perceived quality and perception of quality are theoretically close; perception is defined as the mental process that persons go through in selecting, organizing and interpreting information into meaningful patterns (Truong & Yap, 2010:532). Perception of quality can thus be interpreted as the overall judgment of the superior quality of organic products resulting from selecting, organizing and interpreting information about the alternative products. Measurement of customer perception of quality for organic products is divided into several aspects, including guarantees (origin, brand, label, variety), organoleptic characteristics (firmness, color, flavor, aroma), and external factors (damage, size, price) (Carrasco et al., 2012:1422). On the other hand, for organic products it is also measured via environmental concern, environmental consideration, environmental performance, environmental image, and environmental reputation (Chen & Chang, 2013:71).
A number of IT professionals started to work individually on new approaches to developing software. The result of their research was a set of new development methodologies with many common features. When they met at a conference in Utah in 2001, they created the so-called Agile Manifesto. These approaches were built on the same rule: the best way to verify a system is to deliver working versions to the customer and then update it according to their notes. The Agile authors built their methodologies on four principles. First, the main objective is to develop software that satisfies the customers through continuous delivery of working software and by getting customer feedback on it. The second principle is accepting changes in requirements at any development stage, so that customers feel more comfortable with the development process. The third principle is daily cooperation between the developers and the customers (business people) throughout the project. The last principle is developing on a test-driven basis, that is, writing tests prior to writing code; a test suite is run on the application after any code change.
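The test-first idea in the last principle can be shown with a deliberately tiny example. The `slugify` function and its tests are invented for illustration; the point is only the ordering, where the test exists before the code it exercises:

```python
# Minimal illustration of test-driven development: the test is written
# first, then the implementation is written to make it pass.

def test_slugify():
    # These expectations are fixed before slugify() exists.
    assert slugify("Agile Manifesto") == "agile-manifesto"
    assert slugify("  Hello  World ") == "hello-world"

# Only after the test is in place is the implementation written.
def slugify(text):
    return "-".join(text.lower().split())

test_slugify()   # the suite is re-run after every code change
print("all tests passed")
```

Running the whole suite after each change is what gives the fast feedback loop the principle relies on.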
Data quality and its representativeness are the first and foremost points to guarantee the successful building of forecasting models. The data preprocessing step often impacts the generalization ability of a machine learning algorithm. Data preprocessing usually encompasses missing data imputation, removing or modifying outlier observations, data transformation (often normalization and standardization), and feature engineering. While the first two steps are useful to obtain more accurate and complete sets of data, the third is typically used to obtain more uniformly distributed data and to minimize data variability. Finally, the fourth step is used to obtain a new, typically smaller, and more informative dataset. This last step is typically composed of feature extraction and feature selection. In the remainder of this section, we describe how these steps were accomplished in this work.
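The four preprocessing steps listed above can be sketched schematically. This is a toy illustration in plain Python with invented data, not the pipeline actually used in the work (which a real project would more likely build on numpy or scikit-learn):

```python
# Schematic sketch of the four preprocessing steps: imputation, outlier
# handling, standardization, and feature engineering. Data are invented.
from statistics import mean, stdev

data = [1.2, 1.5, None, 1.4, 9.9, 1.3]   # toy series with a gap and an outlier

# 1) Missing-data imputation: fill gaps with the mean of observed values.
observed = [x for x in data if x is not None]
filled = [x if x is not None else mean(observed) for x in data]

# 2) Outlier handling: clip values beyond 2 standard deviations.
m, s = mean(filled), stdev(filled)
clipped = [min(max(x, m - 2 * s), m + 2 * s) for x in filled]

# 3) Transformation: z-score standardization.
m2, s2 = mean(clipped), stdev(clipped)
standardized = [(x - m2) / s2 for x in clipped]

# 4) Feature engineering (toy example): derive a smaller, hopefully more
#    informative representation, here just first-order differences.
features = [b - a for a, b in zip(standardized, standardized[1:])]
print(len(features))   # one fewer than the input length
```

Each step consumes the previous step's output, which mirrors the sequential description in the text.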
To ensure flexibility and modularity, instead of modifying the RTOS scheduler to extend it with the partitioning concept, the approach followed in the AIR architecture uses one instance of the native RTOS scheduler (as provided by the RTEMS kernel in the example illustrated in Figure 1) for priority-based preemptive process scheduling inside each partition. This conforms to the ARINC 653 specification. No fundamental modification to the functionality of the RTOS process scheduler is needed for its integration in the AIR system. In fact, this two-level hierarchical scheduler approach secures partition and process scheduler decoupling, thus allowing the use of different operating systems in different partitions (e.g. RTEMS, eCos, ...).
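The two-level idea can be sketched as follows. This is a loose illustrative model, not AIR code: the outer level dispatches partitions cyclically in fixed time slots, while each partition keeps its own untouched priority-based scheduler, matching the decoupling described above:

```python
# Schematic two-level hierarchical scheduler in the spirit of the ARINC 653
# approach described above. Partition names, processes, and the slot policy
# are illustrative assumptions, not taken from AIR.

class Partition:
    def __init__(self, name, processes):
        # processes: list of (priority, process_name); each partition keeps
        # its own native priority-based scheduler, untouched by the top level
        self.name = name
        self.processes = processes

    def pick_process(self):
        # inner level: fixed-priority preemptive choice within the partition
        return max(self.processes, key=lambda p: p[0])[1]

def partition_schedule(partitions, slots):
    # outer level: cyclic, time-partitioned dispatching across partitions;
    # it never looks inside a partition's own scheduling decisions
    trace = []
    for i in range(slots):
        part = partitions[i % len(partitions)]
        trace.append((part.name, part.pick_process()))
    return trace

p1 = Partition("RTEMS", [(5, "nav"), (2, "log")])
p2 = Partition("eCos", [(7, "comms")])
print(partition_schedule([p1, p2], 4))
```

Because the outer level only selects which partition runs, each partition could in principle host a different operating system, as the text notes.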
In the studies we have conducted so far using the developed software, we mainly applied an approach that could be termed ‘parametric’ (e.g. Schrader and Hammerschmidt 1997, Naguib et al. 2001). It consists of measuring the acoustic properties of contours in the time and frequency domains. The specific parameters implemented depend on the particular shape of the contours under study and partly also on the question investigated. Using contours that more or less resemble an inverted ‘U’ as an example (Figs. 1 and 2), some of the parameters we implemented are described below.
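A hypothetical sketch may clarify what such parametric measurements look like. The contour values below are invented, and the parameters shown (duration, start/end/peak frequency, time of the peak) are generic examples rather than the specific set implemented in the software:

```python
# Illustrative sketch of the 'parametric' approach: extracting simple
# time- and frequency-domain parameters from a frequency contour.
# The contour data are invented for demonstration.

contour = [(0.00, 2.0), (0.05, 3.5), (0.10, 4.0), (0.15, 3.4), (0.20, 2.1)]
# (time in s, frequency in kHz) for an inverted-'U'-shaped contour

times = [t for t, _ in contour]
freqs = [f for _, f in contour]

duration = times[-1] - times[0]          # time-domain parameter
f_start, f_end = freqs[0], freqs[-1]     # frequency-domain parameters
f_max = max(freqs)
t_at_max = times[freqs.index(f_max)]     # where the peak lies in time

print(duration, f_start, f_end, f_max, t_at_max)
```

For an inverted-'U' contour, the relative position of `t_at_max` within `duration` is one simple way to quantify the shape's asymmetry.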
Software Product Line (SPL) is becoming widely adopted in industry due to its capability of minimizing costs and improving the quality of software systems through systematic reuse of software artifacts. An SPL is a set of software systems sharing a common, managed set of features that satisfies the specific needs of a particular market segment. There are several tools to support variability management by modeling features in SPL; however, it is hard for a developer to choose the most appropriate tool among the several options available. In order to support this research, we developed the ViSPLatform, a visual platform built using Data Driven Documents (D3) to present and to favor the understanding of empirical data about SPL tools. We used ViSPLatform in two research studies. First, we present and discuss the findings from a Systematic Literature Review (SLR) of SPL management tools. Based on the results of the SLR, we later designed and executed an empirical study. This empirical study compares and analyzes three SPL management tools, namely SPLOT, FeatureIDE, and pure::variants, based on data from 124 participants who used the analyzed tools. In this study, we performed a four-dimension quantitative and qualitative analysis with respect to common functionalities provided by SPL tools: (i) Feature Model Edition, (ii) Automated Feature Model Analysis, (iii) Product Configuration, and (iv) Feature Model Import/Export. Our aim with ViSPLatform is to explore different data types of our results and to provide visualization support for empirical data on SPL tools.
The reality of university students is in transition. New rules and regulations govern their education. Expectations from industry and society and their own self-image change while emerging digital tools uproot time-tested methods of studying. In Europe, the Bologna process fostering comparability in educational standards and ensuring quality of qualifications is only one visible cornerstone of substantial changes driven by trends such as globalization, mobile digitalization, and the knowledge economy. All stakeholders are being affected: Far from their old image of ivory towers, universities struggle to cope with the mass inrush of students. Still holding on to the Humboldtian model of unity between research and teaching, teachers are torn between their own scientific curiosity within an overwhelming body of knowledge and the demand to deliver innovative approaches to teaching and learning. Students are often overstrained by requirements resembling those of corporate managers but without having the resources and tools that professionals use. The scope and multitude of these transformations explain why educational technologies have struggled to keep up with providing the best potential support to students and professors. All this demonstrates the need for innovative tools and services outlining a new field for innovation in the higher education domain. But how can we support learners in dealing with the transformation in the educational systems and media landscapes? How can we grasp and specify opportunities for innovation in such a transitory field?