
    A Conceptual Architecture for a Quantum-HPC Middleware

    Quantum computing promises potential for science and industry by solving certain computationally complex problems faster than classical computers. Quantum computing systems have evolved from monolithic systems towards modular architectures comprising multiple quantum processing units (QPUs) coupled to classical high-performance computing (HPC) nodes. With increasing scale, middleware systems that facilitate the efficient coupling of quantum and classical computing are becoming critical. Through an in-depth analysis of quantum applications, integration patterns, and systems, we identified a gap in the understanding of Quantum-HPC middleware systems. We present a conceptual middleware to facilitate reasoning about quantum-classical integration and to serve as the basis for a future middleware system. An essential contribution of this paper lies in leveraging well-established high-performance computing abstractions for managing workloads, tasks, and resources to integrate quantum computing into HPC systems seamlessly. Comment: 12 pages, 3 figures
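    The workload/task/resource abstractions the abstract refers to can be sketched in a few lines. This is an illustrative sketch only; the `Task` and `Workload` names and the in-order executor are our assumptions, not the paper's actual middleware design, and the "QPU" step is a classical stand-in.

    ```python
    from dataclasses import dataclass, field
    from typing import Any, Callable

    @dataclass
    class Task:
        """A unit of work targeting either a classical node or a QPU."""
        name: str
        target: str  # "cpu" or "qpu" (hypothetical resource labels)
        run: Callable[[Any], Any]

    @dataclass
    class Workload:
        """An ordered pipeline of classical and quantum tasks."""
        tasks: list = field(default_factory=list)

        def submit(self, task: Task) -> None:
            self.tasks.append(task)

        def execute(self, data: Any) -> Any:
            # A real middleware would schedule tasks onto pooled QPU/HPC
            # resources; here we simply run them in submission order.
            for task in self.tasks:
                data = task.run(data)
            return data

    # Example: classical pre-processing -> quantum kernel -> classical post-processing
    wl = Workload()
    wl.submit(Task("prepare", "cpu", lambda xs: [v / 2 for v in xs]))
    wl.submit(Task("q_kernel", "qpu", lambda xs: sum(xs)))  # stand-in for a QPU call
    wl.submit(Task("reduce", "cpu", lambda s: round(s, 3)))
    print(wl.execute([1.0, 2.0, 3.0]))  # → 3.0
    ```

    The point of the sketch is the separation of concerns: applications describe *what* runs (tasks in a workload), while the middleware decides *where* (classical nodes vs. QPUs).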

    Cognitive consequences of clumsy automation on high workload, high consequence human performance

    The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation require detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.

    The Role of Modeling in Monarch Butterfly Research and Conservation

    Models are an integral part of the scientific endeavor, whether they be conceptual, mathematical, statistical, or simulation models. Models of appropriate complexity facilitate comprehension and improve understanding of the variables driving system processes. In the context of conservation planning decision-making or research efforts, a useful model can aid interpretation and avoid overfitting by including only essential elements. Models can serve two related but different purposes: understanding and prediction of future system behavior. Predictive models can require several iterations of refinement and empirical data gathering to be useful for conservation planning. Models with less predictive ability can be used to enhance understanding of system function and generate hypotheses for empirical evaluation. Modeling monarch butterfly systems, whether it be landscape-scale movement in breeding habitats, migratory behavior, or population dynamics at monthly or yearly timeframes, is challenging because the systems encompass complex spatial and temporal interactions across nested scales that are difficult, if not impossible, to empirically observe or comprehend without simplification. We review mathematical, statistical, and simulation models that have provided insights into monarch butterfly systems. Mathematical models have provided understanding of underlying processes that may be driving monarch systems. Statistical models have provided understanding of patterns in empirical data, which may represent underlying mechanisms. Simulation models have provided understanding of mechanisms driving systems and provide the potential to link mechanisms with data to build more predictive models.
As an example, recently published agent-based models of non-migratory eastern North American monarch butterfly movement and egg-laying may provide the means to explore how different spatial patterns of habitat, habitat quality, and the interaction of stressors can influence future adult recruitment. The migratory process, however, has not been addressed with agent-based modeling. Using western monarch migration as an example, we describe how modeling could be used to provide insights into migratory dynamics. Future integration of migratory models with non-migratory and population dynamics models may provide better understanding and ultimately prediction of monarch butterfly movement and population dynamics at a continental scale.
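The core loop of an agent-based movement and egg-laying model of the kind described above can be sketched minimally. Everything here is an illustrative assumption (a random-walk agent on a grid, oviposition on milkweed cells); it is not the published model, which couples movement to habitat quality and stressors.

```python
import random

# Hypothetical minimal agent-based sketch: one female monarch performs a
# random walk on a grid; cells marked 1 contain milkweed, and she lays an
# egg whenever she lands on a milkweed cell.

def simulate(grid, steps, seed=0):
    """grid: 2D list of 0/1 (1 = milkweed). Returns eggs laid per cell."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    r, c = rows // 2, cols // 2  # start at the grid centre
    eggs = [[0] * cols for _ in range(rows)]
    for _ in range(steps):
        dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        r = min(max(r + dr, 0), rows - 1)  # reflecting boundaries
        c = min(max(c + dc, 0), cols - 1)
        if grid[r][c] == 1:
            eggs[r][c] += 1
    return eggs

habitat = [[0, 1, 0],
           [1, 0, 1],
           [0, 1, 0]]
total = sum(map(sum, simulate(habitat, steps=100)))
```

Varying the spatial pattern in `habitat` and re-running is exactly the kind of experiment such models enable: the same behavioral rules produce different recruitment under different habitat layouts.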

    An Adaptive Integration Architecture for Software Reuse

    The problem of building large, reliable software systems in a controlled, cost-effective way, the so-called software crisis problem, is one of computer science's great challenges. From the very outset of computing as a science, software reuse has been touted as a means to overcome the software crisis issue. Over three decades later, the software community is still grappling with the problem of building large, reliable software systems in a controlled, cost-effective way; the software crisis problem is alive and well. Today, many computer scientists still regard software reuse as a very powerful vehicle to improve the practice of software engineering. The advantage of amortizing software development cost through reuse continues to be a major objective in the art of building software, even though the tools, methods, languages, and overall understanding of software engineering have changed significantly over the years. Our work is primarily focused on the development of an Adaptive Application Integration Architecture Framework. Without good integration tools and techniques, reuse is difficult and will probably not happen to any significant degree. In the development of the adaptive integration architecture framework, the primary enabling concept is object-oriented design supported by the unified modeling language. The concepts of software architecture, design patterns, and abstract data views are used in a structured and disciplined manner to establish a generic framework. This framework is applied to solve the Enterprise Application Integration (EAI) problem in the telecommunications operations support system (OSS) enterprise marketplace. The proposed adaptive application integration architecture framework facilitates application reusability and flexible business process re-engineering. The architecture addresses the need for modern businesses to continuously redefine themselves to address changing market conditions in an increasingly competitive environment.
We have developed a number of Enterprise Application Integration design patterns to enable the implementation of an EAI framework in a definite and repeatable manner. The design patterns allow for integration of commercial off-the-shelf applications into a unified enterprise framework facilitating true application portfolio interoperability. The notion of treating application services as infrastructure services and using business processes to combine them arbitrarily provides a natural way of thinking about adaptable and reusable software systems. We present a mathematical formalism for the specification of design patterns. This specification constitutes an extension of the basic concepts from many-sorted algebra. In particular, the notion of signature is extended to that of a vector, consisting of a set of linearly independent signatures. The approach can be used to reason about various properties including efforts for component reuse and to facilitate complex large-scale software development by providing the developer with design alternatives and support for automatic program verification.

    Analysing the characteristics of VoIP traffic

    In this study, the characteristics of VoIP traffic in a deployed Cisco VoIP phone system and a SIP-based soft phone system are analysed. Traffic was first captured in a soft phone system, through which elementary understanding of a VoIP system was obtained and the experimental setup was validated. A more advanced experiment was performed on a deployed Cisco VoIP system in the Department of Computer Science at the University of Saskatchewan. Three months of traffic traces were collected beginning in October 2006, recording address and protocol information for every packet sent and received on the Cisco VoIP network. The trace was analysed to identify the features of the Cisco VoIP system, and the findings were presented. This work appears to be one of the first real deployment studies of VoIP that does not rely on artificial traffic. The experimental data provided in this study is useful for design and modeling of such systems, from which more useful predictive models can be generated. The analysis method used in this research can be used for developing synthetic workload models. A clear understanding of usage patterns in a real VoIP network is important for network deployment and potential network activities such as integration, optimization, or expansion. The major factors affecting VoIP quality, such as delay, jitter, and loss, were also measured and simulated in this study, which will be helpful in an advanced VoIP quality study. A traffic generator was developed to generate various simulated VoIP traffic. The data used to provide the traffic model parameters was chosen from peak traffic periods in the captured data from the University of Saskatchewan deployment. By utilizing the Traffic Trace function in ns2, the simulated VoIP traffic was fed into ns2, and delay, jitter, and packet loss were calculated for different scenarios. Two simulation experiments were performed. The first experiment simulated the traffic of multiple calls running on a backbone link.
The second experiment simulated a real network environment with different traffic load patterns. These results are significant for network expansion and integration.
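The three quality metrics named above (delay, jitter, loss) can be computed from per-packet traces in a few lines. This is a generic sketch under our own assumptions about the record format, not the study's analysis code; the jitter estimator follows the smoothed interarrival-jitter formula of RFC 3550.

```python
# Sketch: compute mean one-way delay, smoothed jitter, and loss rate
# from per-packet (seq, send_time, recv_time) records, where a
# recv_time of None marks a lost packet. Times are in seconds.

def voip_metrics(packets):
    """Return (mean_delay, jitter, loss_rate) for a list of packet records."""
    received = [p for p in packets if p[2] is not None]
    loss_rate = 1 - len(received) / len(packets)
    delays = [recv - send for _, send, recv in received]
    mean_delay = sum(delays) / len(delays)
    jitter = 0.0
    for prev, cur in zip(delays, delays[1:]):
        # RFC 3550: J += (|D| - J) / 16, D = change in transit time
        jitter += (abs(cur - prev) - jitter) / 16
    return mean_delay, jitter, loss_rate

pkts = [(1, 0.00, 0.020), (2, 0.02, 0.045), (3, 0.04, None), (4, 0.06, 0.081)]
delay, jitter, loss = voip_metrics(pkts)  # loss = 0.25 (1 of 4 lost)
```

The same function applied over sliding windows of a long trace yields the time series of quality metrics that such deployment studies report.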

    Movement in workplace environments – configurational or programmed?

    In countless case studies space syntax research has found that the configuration of a spatial system offers a powerful explanation of movement flows. However, this relationship is restricted for complex buildings, where movement cannot be assumed to be random since there may also be a programme that requires specific actions and interactions. A distinction has to be made here according to the nature of the organisation occupying a building: a strong programme building, where the interaction and co-presence of people is highly controlled, may not allow movement flows to follow configuration. In contrast, a weak programme building with an all-play-all interface might be expected to experience more randomised movement patterns, increasing the significance of configuration as a determining factor. Though useful, these assumptions lack the power to fully explain real-life movement flows in workplace environments for two reasons: firstly, most workplace environments follow neither purely strong nor simply weak programmes; they lie in between the two poles and comprise aspects of both systems. Secondly, configuration considered as the crucial cause of movement in an office may even be limited for weak programmes due to the effects exerted by everyday attractors such as the coffee machine, the watercooler, the photocopier, toilets or the building entrances. This paper explores different strategies for explaining observed movement patterns, among them axial and segment analysis. It aims at an in-depth analysis of strong and weak programme aspects in order to find ways of understanding office movement patterns. The data used stems from two case studies representing those 'in-between' settings: a university school and a research organisation hosting theoretical physicists. The results suggest that movement in these workplaces may be reflected best by a metric analysis, as opposed to urban movement that follows angularity patterns.
Distances seem to matter most in small and well-known spaces. Moreover, it can be shown that flows of people can only be explained through configuration whenever it is possible to exclude attractor-driven movement. On this basis a new approach is suggested that combines configuration-based integration measures with attractor-based ones in order to predict actual movement flows in offices.
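The combined approach suggested above amounts to scoring each space on two axes and weighting them. The sketch below is purely illustrative; the weights, field names, and example spaces are our assumptions, not values from the study, which would fit such weights against observed flows.

```python
# Hypothetical combined predictor: movement flow on an office segment
# modeled as a weighted sum of a configurational integration score and
# an attractor-proximity score (both assumed pre-normalised to [0, 1]).

def predict_flow(segments, w_config, w_attract):
    """segments: list of dicts with 'integration' and 'attractor' scores."""
    return [w_config * s["integration"] + w_attract * s["attractor"]
            for s in segments]

office = [
    {"name": "main corridor", "integration": 0.9, "attractor": 0.2},
    {"name": "coffee point",  "integration": 0.4, "attractor": 0.9},
    {"name": "quiet wing",    "integration": 0.2, "attractor": 0.1},
]
flows = predict_flow(office, w_config=0.6, w_attract=0.4)
```

Note how the coffee point, weakly integrated but a strong attractor, ends up with nearly the same predicted flow as the well-integrated main corridor, which is precisely the effect a pure configurational model would miss.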

    Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions, and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data for a deeper understanding of societal behaviors, and it is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
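The basic idea of mining patterns that precede events of interest can be illustrated in a few lines. This is a simplified sketch under our own assumptions (a single symbolic factor series, fixed-length windows, exact matching); the actual methodology handles multivariate factor data, noise, and missing observations.

```python
from collections import Counter

# Illustrative sketch: given a symbolic time series of observed factors
# and the indices at which events of interest occurred, count the
# length-k subsequences that immediately precede each event. Frequent
# subsequences are candidate precursor patterns.

def preceding_patterns(series, event_indices, k=2):
    """Return counts of the k-symbol windows ending just before each event."""
    counts = Counter()
    for i in event_indices:
        if i >= k:  # need a full window before the event
            counts[tuple(series[i - k:i])] += 1
    return counts

obs = ["calm", "protest", "strike", "calm", "protest", "strike", "calm"]
events = [3, 6]  # hypothetical indices at which violence broke out
patterns = preceding_patterns(obs, events, k=2)
# ("protest", "strike") precedes both events, so it surfaces as a pattern
```

For forecasting, the same patterns are run forward: when the recent window of observations matches a frequently event-preceding pattern, an elevated risk is flagged.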

    HEALTH SYSTEMS INTEGRATION AND TRANSFORMATION THROUGH CROSS-SECTORAL COLLABORATION

    Statement of the problem: Health system integration has been a challenge world-wide, and there is no one best model to ensure successful integration. The aim of this research is to better understand "how to" build cross-sectoral collaboration for health systems integration and transformation. This study sheds light on the patterns of communication and collaboration among the participants of six newly established teams or "Tables" in one of the Local Health Integration Networks in Ontario (Canada). This naturalistic inquiry study uses a combination of Complex Adaptive Systems and Relational Coordination theories as a theoretical lens to interpret the findings. Methods: A mixed-methods approach was used with methodological triangulation, which includes quantitative surveys (at baseline and follow-up), qualitative interviews, and member checking. Results: The survey response rate was 62% at baseline (n=45) and 25% at follow-up (n=22). Relational Coordination Index scores were "moderate", with no significant differences between baseline and follow-up and no differences between the stakeholders or "Tables." From the twelve interviews, it was revealed that context matters at the local levels: "Rural Tables" with "moderate" Relational Coordination reported "inter-dependency", and the "Suburban Tables" with "weak" Relational Coordination reported "inter-organizational challenges." Discussion: There is no one-size-fits-all model for health systems integration, and there is no formula for determining whether policy directives should be "bottom-up," "top-down" or "both." Based on this conundrum, it is recommended that leaders view health care as a Complex Adaptive System in order to allow the system to transform, change, and develop inter-dependencies, inter-organizational relationships, and self-organizing capacities. Policymakers should take this into consideration in policy development and evaluation.
New strategies are proposed and further research is needed to inform health systems change. Conclusion: The findings characterized the process of intentional cross-sectoral collaboration, using Complex Adaptive Systems and Relational Coordination theories to understand the patterns of communication and collaboration among the stakeholders and "Tables". A policy framework on "how to" build cross-sectoral collaboration for health systems integration and transformation has been developed, which adds a much-needed understanding of cross-sectoral collaboration.

    Voices of Hope: Substance Use Peer Support in a System of Care

    Peer support in substance use recovery assists individuals who seek long-term recovery by establishing supportive and reciprocal relationships that support the initiation and maintenance of recovery. Prior research has found that peer support workers provide essential services to individuals in recovery, while the experience of the peer and their integration into a system of care has yet to be fully explored. This qualitative study explored the peer worker's experience as a provider of recovery support services in a system of care. Semi-structured interviews were conducted with 10 peer support workers. The interviews were transcribed and analyzed using qualitative data analysis software. Thematic analysis was used to identify themes and patterns inductively from the data. Peer support worker experiences included challenges establishing credibility, frustrations in managing systemic barriers, a lack of understanding by stakeholders of what the role of peer worker entails, and skepticism from other providers about the value of the position. Positive experiences included a decrease in the perception of stigma about substance use and feeling valued. Supervision played a key role in the success of the peer worker role, with concerns related to supervisors who are not in recovery. This study highlighted improvements in the integration of peer support workers in systems of care and regard for the role by professionals. A widespread understanding of the role and scope of practice is lacking, and there is a need for better support for the role through avenues such as training and supervision.