
    Modelling manufacturing systems flexibility.

    The flexibility to change products and processes quickly and economically represents a significant competitive advantage for manufacturing organisations. The rapid rise in global sourcing has forced manufacturers to offer greater levels of customisation, so a wider product range is essential to an organisation's competitiveness. The rate at which new products are introduced to the market has also increased, with greatly reduced development times being essential to a new product's market success. There is therefore a strong need for flexible manufacturing systems that allow new products to be introduced rapidly. These drivers have made flexibility within manufacturing systems of great importance. However, there are many types of flexibility, and to ensure that organisations target the right ones there is a need to measure flexibility, because measurement allows manufacturers to identify the systems that will improve their performance. This research has therefore focused on developing measures for two types of flexibility: mix flexibility, the ability to change between the manufacture of current products, and product flexibility, the ability to introduce new products. To develop effective measures for these types of flexibility, a conceptual model has been developed that represents the current and potential future product range of manufacturing systems. The methodology developed for measuring mix and product flexibility has been successfully applied in two companies representing diverse manufacturing environments: one operates in high-volume chemical manufacture and the other in low-to-medium-volume furniture manufacture. Applying the methodology in these two companies has demonstrated that it is generic and can be used in a wide range of companies.
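
    For illustration, one common style of mix-flexibility measure in the flexibility-measurement literature is an entropy-based index over the product mix; this is only an assumed example and not necessarily the measure developed in this thesis:

        F_mix = -\frac{1}{\ln n} \sum_{i=1}^{n} p_i \ln p_i

    where p_i is the share of output volume accounted for by product i and n is the number of products in the current range; F_mix is 0 when only one product is made and 1 when output is spread evenly over all n products. A product-flexibility measure would instead weigh how readily products outside the current range could be introduced.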

    The role of information in therapeutic decision-making for adults living with Multiple Sclerosis (MS)

    Background: People with MS (pwMS) are confronted with 16 therapies. These come with risks that have led to drug withdrawals and changes to prescribing regulations. Patient autonomy is seen as desirable and has challenged the role of the health care professional (HCP). Greater scrutiny of the decisional process is necessary to determine whether complex decision-making can be influenced. Methods: i) Attendees at an MS conference (n=105) and a cohort of patients on treatment (n=76) were contacted about their current treatment status and whether they had decisional conflict (DC). ii) A prospective study (n=73) of pwMS offered treatment used instruments to map pwMS through their decision post-consultation. iii) The results informed a film aimed at pwMS (n=1001) and a comparator group without MS (n=148); participants reviewed the film with the primary aim of measuring understanding of the concepts portrayed. Results: i) Data from the cohorts in methods i-ii (n=254) were compared. The treatment status ‘not satisfied’ was present in 113/254 (44%), and 135/254 (53%) had DC. ii) DC was significantly increased in a treatment-naïve subgroup: 75% (27/36), p=0.013. iii) In the ‘offered treatment’ study, making a treatment decision took a mean of 29 days (range 0-308). Multivariate regression analysis found that those with less confidence in their healthcare decision-making were more likely to have DC (n=72, SURE scale; adjusted R2 0.11, p=0.02; SURE-subscale adjusted R2 0.04, p=0.04; DCG adjusted R2 0.04, p=0.04). iv) The neurologist perceived significantly more consensus during the consultation (39.24±6.54) than pwMS (31.22±10.64; p<0.001). A multivariate regression analysis found that shared decision making (SDM) was associated with lower DC alongside patient engagement (n=67, adjusted R2 0.382; p<0.001). v) There was a high level of film understanding in the total population (85%). vi) A multivariate regression analysis found that ‘education’ was associated with film ‘understanding’ (n=892, adjusted R2 0.023, p=0.000): having less education was associated with increased understanding, with a one-point increase in education associated with a 0.170-point reduction in understanding. Conclusions: i) PwMS have high levels of DC when making treatment decisions. ii) Low engagement is associated with increased DC, but an HCP consultation with good SDM is associated with lower DC. iii) A film produced a high level of understanding in both MS and non-MS populations; those with less education had the highest understanding overall.
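
    The associations above come from multivariate regressions reporting adjusted R2 values. As a minimal sketch of that kind of analysis, on hypothetical data with assumed variable names (this is not the study's dataset or exact model), an ordinary least squares fit with statsmodels would look like:

        # Sketch: OLS regression of decisional conflict on a confidence score,
        # reporting the adjusted R^2 quoted in the abstract (synthetic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 72                                                        # cohort size reported above
        sure_score = rng.integers(0, 5, size=n).astype(float)         # assumed 0-4 SURE screening score
        dc_score = 25 - 3 * sure_score + rng.normal(0, 10, size=n)    # synthetic DC outcome

        X = sm.add_constant(sure_score)    # intercept + predictor
        model = sm.OLS(dc_score, X).fit()
        print(model.params)                # intercept and slope
        print(model.rsquared_adj)          # adjusted R^2
        print(model.pvalues)               # coefficient p-values

    Additional predictors (e.g. engagement measures) would enter as further columns of X in a multivariate model.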

    Evidence of System: A Network Model Case-Study of Seventh Grade Science Assessment Practices from Classrooms to the State Test.

    With science education in the United States entering a period of greater accountability, this study investigated how student learning in science was assessed by educators within one state, asking what systemic assessment approaches existed and how the information from them was used. Conducted during the 2006-2007 school year, this research developed and piloted a network-model case study design that included teachers, principals, administrators, and the state test development process, as well as several state-level professional associations. The data analyzed included observations, interviews, surveys, and both public and private documents. Some data were secondary. This design produced an empirical depiction of practice with a web of related cases. The network model expands on the hierarchical (nested) models often assumed in the growing literature on how information is used in educational contexts by showing multiple ways in which individuals are related through organizational structures. Seven case study teachers, each employing assessment methods largely unique and invisible to others in their schools, illustrate one set of assessment practices. The only alternative to classroom assessments that could be documented was the annual state accountability test. These two assessment species were neither tightly coupled nor distinct. Some teachers were partners in developing state test instruments, and in some cases the annual test could be seen as a school management resource. Boundary practices -- activities where these two systems connected -- were opportunities to identify challenges to policy implementation in science education. The challenges include standards, cognition, vocabulary, and classroom equipment. The boundary practices, along with the web of connections, provide the outlines of potential (and often unrealized) synergistic relationships. This model shows diverse indigenous practices and adaptations by actors responding to pressures of change and persistent historical tensions of diversity and control. It provided evidence of a broadening instructional agenda and rapid deployment of information infrastructures for collection, dissemination, and analysis of student information. The model became a lens to view these changes and paths that policy for science education may take for implementation. It also became a lens to evaluate accountability policies to see how models embedded within policies may fit with current practice. Ph.D. Education, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/60826/1/ppiety_1.pd

    Modeling and simulation of magnetic components in electric circuits

    This thesis demonstrates how, by using a variety of model constructions and parameter extraction techniques, a range of magnetic component models can be developed for a wide range of application areas, with different levels of accuracy appropriate to the simulation required. Novel parameter extraction and model optimization methods are developed, including the innovative use of Genetic Algorithms and metrics, to ensure the accuracy of the material models used. Multi-domain modeling, including the magnetic and thermal aspects, is applied in integrated simulations to ensure correct and complete dynamic behaviour under a range of environmental conditions. Improvements to the original Jiles-Atherton theory, to more accurately model loop closure and dynamic thermal behaviour, are proposed, developed and tested against measured results. Magnetic component modeling techniques are reviewed and applied in practical examples to evaluate the effectiveness of lumped models, 1D and 2D Finite Element Analysis models, and the coupling of Finite Element Analysis with circuit simulation. An original approach linking SPICE with a Finite Element Analysis solver is presented and evaluated. Practical test cases illustrate the effectiveness of the models in a variety of contexts. A passive Fault Current Limiter (FCL) was investigated using a saturable inductor with a magnet offset, and the comparison between measured and simulated results allows accurate prediction of the behaviour of the device. A series of broadband hybrid transformers for ADSL were built, tested, modeled and simulated; the results show clearly how the Total Harmonic Distortion (THD), Inter-Modulation Distortion (IMD) and Insertion Loss (IL) can be accurately predicted using simulation. A new implementation of ADSL transformers using a planar magnetic structure is presented, with results that compare favourably with current wire-wound techniques. The inclusion of transformer models in complete ADSL hybrid simulations demonstrates the effectiveness of the models, in the context of a complete electrical system, in predicting the overall circuit performance.
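
    For reference, the original Jiles-Atherton theory that the thesis improves upon relates the magnetisation M to the applied field H through a commonly quoted set of equations (standard notation, which may differ from the thesis):

        H_e = H + \alpha M
        M_{an}(H_e) = M_s\left(\coth\frac{H_e}{a} - \frac{a}{H_e}\right)
        \frac{dM_{irr}}{dH} = \frac{M_{an} - M_{irr}}{k\,\delta - \alpha\,(M_{an} - M_{irr})}
        M = M_{irr} + c\,(M_{an} - M_{irr})

    where M_s, a, k, c and \alpha are the material parameters and \delta = sign(dH/dt). Parameter-extraction schemes such as the Genetic Algorithm approach mentioned above fit these parameters to a measured B-H loop; the thesis's modifications address loop closure and thermal dependence, which this original formulation does not capture.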

    iOS Technologies & Frameworks

    Apple’s mobile platform, iOS, currently generates the largest amount of revenue of all mobile app stores. The majority of iDevices run the latest major iOS version (iOS 10) due to Apple users’ tendency to update their devices, and iOS developers are consequently pressured into keeping their apps up to date. The advantages of updating apps include new features and adapting apps to the platform’s hardware and software evolution. However, this does not always happen: there are apps, some popular (with many users), which are updated slowly or not at all. The main consequence of developers not keeping up with the latest tendencies (i.e. user interface or API changes) is the degradation of their apps’ user experience. This subpar user experience leads to a decrease in the number of installs (and sales) and a search for alternatives that fully support the latest firmware iteration. We identified a common pattern amongst ten apps with subpar reviews on the App Store: excessive battery consumption and lack of user onboarding were just a few of the issues. Moreover, almost all of those apps belong to the top 1% of apps (which generate 94% of the App Store’s revenue), so the lack of focus on user experience is unfortunate considering their massive user bases. We listed the available resources for those wanting to develop or improve iOS apps. Given these requirements, we studied the possibility of developing a mobile app that adopted good engineering practices and, above all, focused on delivering an excellent user experience within a timeframe of six months. The idea consisted of a wish-list management app called Snapwish that allows the user to take photos of objects they want, create wish lists, and share them with family and friends. The app allows offline usage, with data syncing automatically (in real time) without user intervention when an Internet connection is present. We tested Snapwish thoroughly to measure the quality of its implementation. Profiling helped verify that core metrics such as CPU and memory usage, network data requests and energy consumption were within acceptable values, while unit and user interface tests served to validate our code functionally. Furthermore, our team of five beta testers provided valuable feedback and suggestions. Ultimately, the six-month timeframe proved insufficient for a release on the App Store, as Snapwish remains in the later beta stages at the time of writing. This delay is mostly attributable to a lengthy testing process; we therefore plan to release the app in the first quarter of 2017.

    Enhancement of Power System Dynamic Performance by Coordinated Design of PSS and FACTS Damping Controllers

    Due to environmental and economic constraints, it is difficult to build new power lines and to reinforce existing ones. The continued growth in demand for electric power must therefore largely be met by increased loading of the available lines. A consequence of this is reduced power system damping, leading to a risk of poorly damped power oscillations between generators. To suppress these oscillations and maintain power system dynamic performance, one conventional, economical and effective solution is to install a power system stabilizer (PSS). However, in some cases a PSS may not provide sufficient damping for inter-area oscillations in a multi-machine power system, and other solutions need to be explored. With the evolution of power electronics, flexible AC transmission system (FACTS) controllers have become a possible means of alleviating such critical situations by controlling the power flow over the AC transmission line and improving power oscillation damping. However, the coordination of conventional PSSs with FACTS controllers for damping power system oscillations is still an open problem, so it is essential to study their coordinated design in a multi-machine power system. This thesis gives an overview of the modelling and operation of a power system with a conventional PSS. It introduces emerging FACTS controllers, with emphasis on the TCSC, SVC and STATCOM, and explains their basic modelling and operating principles along with power oscillation damping (POD) stabilizers. The coordinated design of PSS and FACTS damping controllers over a wide range of operating conditions is formulated as an optimization problem whose objective function is framed using the system eigenvalues; it is solved using the AAPSO and IWO algorithms, which yield the optimal parameters of the coordinated controllers. A comprehensive approach to the hybrid coordinated design of a PSS with series and shunt FACTS damping controllers is proposed to enhance the overall system dynamic performance. The robustness and effectiveness of the proposed hybrid coordinated designs are demonstrated through eigenvalue analysis and time-domain simulations. The proposed hybrid designs provide robust dynamic performance over a wide range of load conditions and significantly improve the damping of power system oscillations under severe disturbances. The developed hybrid coordinated designs are tested on different multi-machine power systems using the AAPSO and IWO algorithms, and both the IWO-based and AAPSO-based hybrid designs are more effective than the other control designs. In addition, the proposed designs are implemented and validated in real time using an Opal-RT hardware simulator: real-time simulations of the different test power systems with the proposed designs are carried out for a severe fault disturbance, and the controller simulation results are validated against the real-time results.
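
    Since the objective function is framed using system eigenvalues, one common choice (an assumed illustration, not necessarily the exact fitness used in this thesis) is to maximise the minimum damping ratio of the oscillatory modes of the linearised closed-loop system. A minimal sketch:

        # Sketch: eigenvalue-based damping objective for a candidate PSS/FACTS design.
        import numpy as np

        def min_damping_ratio(A):
            """Smallest damping ratio zeta = -Re(lambda)/|lambda| over oscillatory modes of A."""
            eig = np.linalg.eigvals(A)
            osc = eig[np.abs(eig.imag) > 1e-6]     # keep only complex (oscillatory) modes
            if osc.size == 0:
                return 1.0
            return float(np.min(-osc.real / np.abs(osc)))

        # Toy 2-state example: a lightly damped 1 Hz mode (sigma = -0.1, omega = 2*pi rad/s).
        sigma, omega = -0.1, 2 * np.pi
        A = np.array([[sigma, omega],
                      [-omega, sigma]])
        print(min_damping_ratio(A))                # ~0.016: a poorly damped, inter-area-like mode

    In a coordinated design, an optimiser such as AAPSO or IWO would search the PSS and FACTS controller parameters to maximise this quantity (or an equivalent eigenvalue-based fitness) across the considered operating conditions.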

    BENCHMARKING CLASSIFIERS - HOW WELL DOES A GOWA-VARIANT OF THE SIMILARITY CLASSIFIER DO IN COMPARISON WITH SELECTED CLASSIFIERS?

    Digital data is ubiquitous in nearly all modern businesses. Organizations have more data available, in more formats, than ever before. Machine learning algorithms and predictive analytics utilize the knowledge contained in that data in order to support business-related decision-making. This study explores predictive analytics by comparing different classification methods, the main interest being the Generalized Ordered Weighted Average (GOWA)-variant of the similarity classifier. The aim of this research is to find out what the GOWA-variant of the similarity classifier is and how well it performs compared to other selected classifiers. The study also investigates whether the GOWA-variant of the similarity classifier is a suitable method for business-related decision-making. Four classical classifiers were selected as reference classifiers on the basis of their common use in machine learning research and their availability in the Statistics and Machine Learning Toolbox in MATLAB. Three data sets from the UCI Machine Learning Repository were used for benchmarking the classifiers. The benchmarking process uses a fitness function, rather than pure classification accuracy, to determine the performance of the classifiers; the fitness function combines several measurement criteria into one common value. With one data set, the GOWA-variant of the similarity classifier performed best. One of the data sets contains credit card client data; it was more complex than the other two and contains clearly business-related data, and the GOWA-variant also performed well with it. It can therefore be claimed that the GOWA-variant of the similarity classifier is a viable option for solving business-related problems.
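
    As a rough sketch of the idea behind the classifier (simplified; the exact similarity measure, scaling and weight generation used in the thesis are not reproduced here), a GOWA operator aggregates per-feature similarities between a sample and each class's ideal vector, and the sample is assigned to the class with the highest aggregated similarity:

        # Sketch: GOWA-aggregated similarity classifier on [0,1]-scaled features.
        import numpy as np

        def gowa(values, weights, lam=1.0):
            """Generalized OWA: sort values in descending order, then take a weighted power mean."""
            b = np.sort(values)[::-1]              # j-th largest value is paired with weight w_j
            return np.sum(weights * b**lam) ** (1.0 / lam)

        def classify(x, ideal_vectors, weights, lam=1.0):
            """Assign x to the class whose ideal vector it is most similar to."""
            sims_per_class = [1.0 - np.abs(x - v) for v in ideal_vectors]   # simple per-feature similarity (assumed)
            scores = [gowa(s, weights, lam) for s in sims_per_class]
            return int(np.argmax(scores))

        # Tiny synthetic example: two classes, three features, OWA weights summing to 1.
        ideals = [np.array([0.2, 0.3, 0.1]), np.array([0.8, 0.7, 0.9])]
        w = np.array([0.5, 0.3, 0.2])
        print(classify(np.array([0.75, 0.8, 0.85]), ideals, w, lam=2.0))    # -> class 1

    With lam = 1 the operator reduces to an ordinary OWA aggregation; the generalisation parameter lam and the weight vector are what distinguish the GOWA-variant.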

    Emerging Informatics

    The book on emerging informatics brings together new concepts and applications that help define and outline problem-solving methods and features in the design of business and human systems. It covers international aspects of information systems design, in which many relevant technologies are introduced for the welfare of human and business systems. This initiative can be viewed as an emergent area of informatics that helps to better conceptualise and design new world-class solutions. The book is organised into four flexible sections accommodating a total of fourteen chapters, with each section specifying a learning context in an emerging field. Each chapter presents a clear basis through its problem conception and the applicable technological solutions. I hope this will help further exploration of knowledge in the informatics discipline.

    A contribution to supply chain design under uncertainty

    In the current context of supply chains, with complex business processes and extended networks of partners, several factors can increase the likelihood of supply chain disruptions, such as the loss of customers due to intensified competition, supply shortages due to supply uncertainty, the management of a large number of partners, and unpredictable failures and breakdowns. Anticipating and responding to the changes that affect supply chains sometimes requires dealing with uncertainty and incomplete information. Each entity in the chain must be chosen effectively in order to reduce the sources of disruption as much as possible. Configuring efficient supply chains can guarantee the continuity of the chain's activities despite the presence of disruptive events. The main objective of this thesis is the design of supply chains that withstand disruptions, through models for selecting reliable actors. The proposed models reduce vulnerability to disruptions that may affect the continuity of operations of the chain's entities, namely the suppliers, production sites and distribution sites. The thesis is organised around three main chapters: 1) construction of a multi-objective model for selecting reliable actors in order to design supply chains able to withstand disruptions; 2) a review of the concepts and types of risk related to supply chains, together with an approach for quantifying risk; 3) development of a reliability optimization model to reduce the vulnerability of supply chains to disruptions under demand and supply uncertainty.
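
    As a purely illustrative formulation (an assumed single-echelon version, not the thesis's exact models), selecting reliable suppliers can be written as maximising the joint reliability of the selected set subject to meeting demand:

        \max_{x}\; \sum_{i \in S} x_i \ln r_i \qquad \Big(\text{equivalently } \max \prod_{i\,:\,x_i = 1} r_i\Big)
        \text{s.t.}\quad \sum_{i \in S} q_i\, x_i \ge D, \qquad x_i \in \{0, 1\} \;\; \forall i \in S,

    where r_i is supplier i's reliability (probability of operating without disruption), q_i its capacity and D the demand to be covered. Adding a cost term \sum_i c_i x_i as a second objective gives the kind of multi-objective reliable-selection model described above, and the same idea extends to production and distribution sites.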