
    Multidisciplinary digital competencies of pre-service vocational teachers

    Developments of Industry 4.0 require a set of multidisciplinary digital competencies for future vocational teachers, consisting of specific knowledge, motivational aspects, cognitive abilities and skills to fulfill the demands of digitally interconnected work situations. The competence model, adapted from future work scenarios of vocational apprentices in Industry 4.0, includes attitudes towards digitization and the handling of digital devices, information literacy, application of digital security standards, virtual collaboration, digital problem solving, as well as a demonstration of reflective judgment of one’s actions in an interconnected and digital environment. Structural equation modeling was used to assess N = 205 pre-service vocational teachers between 18 and 35 years of age. The findings indicate that the relationships among the proposed dimensions, measured through external and self-assessments, validate the proposed structure of the multidisciplinary digital competencies. However, attitude towards digitization predicts self-efficacy in the relevant multidisciplinary digital competencies but not actual achievement in an externally assessed scenario. Nevertheless, this study confirms that self-assessed multidisciplinary digital competencies can predict achievement in an externally and qualitatively assessed competence test. Fit indices show an acceptable model conception, and the reliability and construct validity of the model were confirmed. Findings suggest that the attitude towards digitization and the application of digital security standards are important, whereas the ability to solve digital problems seems to have only a weak relation to the general multidisciplinary digital competencies of pre-service vocational teachers.
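The key predictive claim here, that self-assessed competencies predict achievement in an externally assessed test, corresponds to a regression path in the structural model. A minimal illustrative sketch of that single path, using invented synthetic scores rather than the study's data (all variable names and values are assumptions):

```python
# Illustrative only: synthetic self-assessment scores and external test
# scores stand in for the study's data.

def ols_fit(x, y):
    """Ordinary least squares for one predictor; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    slope = cov / var
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Share of variance in y explained by the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Synthetic data: self-assessed digital competence (1-5 scale) versus
# externally assessed scenario score (0-100).
self_assessed = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
external_score = [48, 55, 57, 66, 70, 78, 81]

slope, intercept = ols_fit(self_assessed, external_score)
r2 = r_squared(self_assessed, external_score, slope, intercept)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, R^2={r2:.3f}")
```

A full replication would instead fit the complete latent-variable model with an SEM package and inspect the same path coefficient alongside the reported fit indices.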

    Healthcare leaders’ and elected politicians’ approach to support-systems and requirements for complying with quality and safety regulation in nursing homes – a case study

    Background Healthcare leaders play an important and complex role in managing and handling the dual responsibility of both Health, Safety and Environment (HSE) for workers and quality and patient safety (QPS). There is a need for better understanding of how healthcare leaders and decision makers organize and create support structures to handle these combined responsibilities in practice. The aim of this study was to explore how healthcare leaders and elected politicians organize, control, and follow up the work of HSE and QPS in a Norwegian nursing home context. Moreover, we explore how they interpret, negotiate, and manage the dual responsibility and possible tensions between employee health and safety, and patient safety and quality of service delivery. Methods The study was conducted in 2022 as a case study exploring the experience of healthcare leaders and elected politicians in five municipalities responsible for providing nursing homes services in Norway. Elected politicians (18) and healthcare leaders (11) participated in focus group interviews (5) and individual interviews (11). Data were analyzed using inductive thematic analysis. Results The analysis identified five main themes explaining how the healthcare leaders and elected politicians organize, control, and follow up the work of HSE and QPS: 1. Establish frameworks and room for maneuver in the work with HSE and QPS. 2. Create good routines and channels for communication and collaboration. 3. Build a culture for a health-promoting work environment and patient safety. 4. Create systems to handle the possible tensions in the dual responsibility between caring for employees and quality and safety in service delivery. 5. Define clear boundaries in responsibility between politics and administration. 
Conclusions The study showed that healthcare leaders and elected politicians, who are responsible for ensuring sound systems for quality and safety for both patients and staff, do experience tensions in handling this dual responsibility. They acknowledge the need to create systems and awareness of this responsibility and argue that the roles and boundaries between elected politicians and the healthcare administration in the execution of HSE and QPS need to be more clearly separated.

    Lines-of-inquiry and sources of evidence in work-based research

    There is synergy between the investigative practices of police detectives and social scientists, including work-based researchers. Both develop lines-of-inquiry and draw on multiple sources of evidence in order to make inferences about people, trends and phenomena. However, the principles associated with lines-of-inquiry and sources of evidence have not so far been examined in relation to work-based research methods, which are often unexplored or ill-defined in the published literature. We explore this gap by examining the various direct and indirect lines-of-inquiry and the main sources of primary and secondary evidence used in work-based research, which is especially relevant because some work-based researchers are also police detectives. A clearer understanding of these intersections will be useful in emerging professional contexts where the work-based researcher, the detective, and the social scientist cohere in one person and their research project. The case we examined was a Professional Studies programme at a university in Australia, which has many police detectives doing work-based research, and from their experience we conclude there is synergy between work-based research and lines-of-inquiry. Specifically, in the context of research methods, we identify seven sources of evidence: 1) creative, unstructured, and semi-structured interviews; 2) structured interviews; 3) consensus group methods; 4) surveys; 5) documentation and archives; 6) direct observations and participant observations; and 7) physical or cultural artefacts, and show their methodological features related to data and method type, reliability, validity, and types of analysis, along with their respective advantages and disadvantages. 
This study thereby unpacks and isolates those characteristics of work-based research which are relevant to a growing body of literature related to the messy, co-produced and wicked problems of private companies, government agencies, and non-government organisations, and the research methods used to investigate them.

    Modelling and solution methods for portfolio optimisation

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University, 16/01/2004. In this thesis, modelling and solution methods for portfolio optimisation are presented. The investigations reported in this thesis extend the Markowitz mean-variance model to the domain of quadratic mixed integer programming (QMIP) models, which are 'NP-hard' discrete optimisation problems. In addition to the modelling extensions, a number of challenging aspects of solution algorithms are considered. The relative performances of the sparse simplex (SSX) and interior point methods (IPM) are studied in detail. In particular, the roles of 'warmstart' and dual simplex are highlighted as applied to the construction of the efficient frontier, which requires processing a family of problems; that is, the portfolio planning model stated in a parametric form. A method of solving QMIP models using the branch and bound algorithm is first developed; this is followed by heuristics which improve the performance of the (discrete) solution algorithm. Some properties of the efficient frontier with discrete constraints are considered, and a method of computing the discrete efficient frontier (DEF) efficiently is proposed. The computational investigation considers the efficiency and effectiveness of the proposed algorithm with respect to its scale-up properties. The extensions of the real-world models and the proposed solution algorithms contribute new knowledge.
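The parametric family of problems behind the efficient frontier can be illustrated in the continuous (pre-integer) case, where the minimum-variance weights for a given target return admit a closed form via the Lagrangian of the Markowitz model. The sketch below uses an invented three-asset universe and does not model the QMIP extensions (cardinality or lot-size constraints) treated in the thesis:

```python
import numpy as np

# Illustrative three-asset universe (invented numbers, not from the thesis).
mu = np.array([0.08, 0.12, 0.15])          # expected returns
Sigma = np.array([[0.10, 0.02, 0.01],      # covariance matrix
                  [0.02, 0.12, 0.03],
                  [0.01, 0.03, 0.20]])

def frontier_weights(mu, Sigma, target):
    """Closed-form solution of  min w'Sigma w  s.t.  mu'w = target, 1'w = 1.

    Standard Lagrangian derivation for the mean-variance model without
    integer restrictions; short positions are allowed.
    """
    ones = np.ones(len(mu))
    inv = np.linalg.inv(Sigma)
    A = mu @ inv @ mu
    B = mu @ inv @ ones
    C = ones @ inv @ ones
    D = A * C - B * B
    lam = (C * target - B) / D
    gam = (A - B * target) / D
    return inv @ (lam * mu + gam * ones)

# Trace a small piece of the (continuous) efficient frontier by sweeping
# the target return -- the "family of problems" in parametric form.
for t in np.linspace(0.09, 0.14, 6):
    w = frontier_weights(mu, Sigma, t)
    print(f"target return {t:.3f} -> portfolio variance {w @ Sigma @ w:.5f}")
```

The discrete efficient frontier of the thesis would instead re-solve each member of this family under integer constraints, which is where branch and bound and the proposed heuristics enter.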

    Application of quasiparticle forces in quantum technologies

    The performance of many superconducting devices is diminished by long-lived Bogoliubov quasiparticle excitations present in the superconducting part. In normal-metal–insulator–superconductor structured micro-refrigerators, for example, the tunneling of quasiparticles into the normal metal, and the accompanying backflow of the heat just extracted from it, reduce the cooling efficiency. In superconducting qubits, incoherent quasiparticle tunneling through Josephson junctions leads to qubit decoherence and relaxation. While the associated rates are small compared to those of currently more serious noise sources, quasiparticle tunneling is expected to become relevant for meeting the high requirements of fault-tolerant quantum computing. Normal-metal quasiparticle traps, among other established techniques, are commonly used to redistribute the quasiparticles inside the superconducting part and reduce their density in the regions most important for device performance. In this thesis we quantitatively investigate the trapping performance of such normal-metal quasiparticle traps and, in particular, the role played by the superconducting proximity effect. The quasiclassical Green’s function approach based on the non-equilibrium Keldysh technique serves as the theoretical tool. As the central physical quantities of the stationary non-equilibrium state, the superconducting order parameter, the local density of quasiparticle states, and the quasiparticle density are put into context with the proximity effect. Two competing characteristics with opposing effects on the trapping performance are revealed, pointing to the existence of an ideal trap position with optimal trapping performance. Furthermore, the conversion between dissipative normal current and supercurrent mediated by Andreev reflection, and the resulting reduction of the quasiparticle density, is studied. 
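For reference, the local density of quasiparticle states mentioned above reduces, in a homogeneous BCS superconductor far from any trap, to the textbook form (a standard expression, not a result of the thesis); the proximity effect studied here modifies it near the normal-metal trap:

```latex
% BCS density of quasiparticle states, normalised to the
% normal-state value N_0, with superconducting gap \Delta.
N(E) = N_0 \,\mathrm{Re}\!\left[\frac{|E|}{\sqrt{E^{2}-\Delta^{2}}}\right]
```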
A further part of this thesis concerns the emulation of quantum field theory in curved spacetime, involving spontaneous particle creation through the conversion of virtual particles into real, detectable ones. We propose an experimental setup in which the dynamics of surface acoustic waves, or phonons, on a piezoelectric semiconductor mimics the propagation of a massless scalar quantum field on a curved spacetime, with an effective metric resembling to some extent that of a black hole and an expanding universe, including an acoustic event horizon for surface acoustic waves. An appropriate detection scheme indicating particle creation in the form of phonons employs electron-loaded dynamic quantum dots and a Stern-Gerlach gate for their readout. A non-thermal steady state for the electrons is predicted, which is ascribed to particle creation.
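Analogue-gravity setups of this kind typically rest on an effective metric of the Unruh acoustic form (a standard expression, not taken from the thesis); here $c$ is the local sound speed, $\mathbf{v}$ the background flow velocity, and $\rho$ the medium density:

```latex
% Acoustic line element for a massless scalar (sound) field;
% an acoustic horizon forms where |\mathbf{v}| = c.
ds^{2} = \frac{\rho}{c}\left[-\left(c^{2}-v^{2}\right)dt^{2}
         - 2\,\mathbf{v}\cdot d\mathbf{x}\,dt
         + d\mathbf{x}\cdot d\mathbf{x}\right]
```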