
    Practice, Skill Mix, and Education: The Evolving Role of Pharmacy Technicians in Great Britain

    Pharmacy technicians’ roles are rapidly evolving in Great Britain (GB) as they undertake more extended activities with increased autonomy across the different pharmacy sectors. This paper compares the GB pharmacy regulator’s initial education and training standards, introduced in 2017, with the qualifications currently used in practice and discusses whether future qualifications will be ‘fit for purpose’. In this context, knowledge, skills, and competence are reviewed to assess whether they will meet expectations and underpin the evolving pharmacy technician role as integral to healthcare provision. Based on drivers, policy change, and the changing GB healthcare landscape, the effectiveness of skill mix is analysed to establish whether it is being optimised to support person-centred pharmacy in response to the challenges and pressures faced within the NHS. On this basis, and given the limited evidence base, this review highlights a need for larger-scale research to reassure the pharmacy and wider healthcare professions, and the public, that the evolving pharmacy technician role presents no increased risk to patient safety and contributes significantly to releasing pharmacists’ time for person-centred clinical activities.

    Investigating the role of knowledge management in driving the development of an effective business process architecture

    Business Process Architecture (BPA) modelling methods are not dynamic and flexible enough to respond effectively to change. This creates a barrier, a lack of knowledge and learning capabilities, which can undermine the BPA’s support for a sustainable competitive advantage in an organisation. New business challenges are driving enterprises to adopt Knowledge Management (KM) as one means of improving their performance and competitiveness. However, shortcomings remain in utilising knowledge management in business processes: efforts have mostly been directed towards integrating knowledge management with business process management, not with BPAs. The idea of applying KM as a memory to be retrieved and updated as needed is no longer sufficient. The resource-based view suggests a number of key factors to be investigated and taken into consideration during the development of knowledge management systems. These key factors are known as Knowledge Management Enablers (KMEs). KMEs are crucial for representing KM and understanding how knowledge is created, shared and disseminated. They are also essential for identifying available assets and resources and for clarifying how organisational capabilities are created and utilised.

    This research investigates the role of knowledge management enablers in the development of an effective process architecture, one that is dynamic and supports a sustainable competitive advantage in an organisation. Identifying the KMEs, selecting an appropriate BPA method, aligning these KMEs with this method, and critically evaluating this alignment are the main objectives of this research. To accomplish the research aim and objectives, a resource-based and semantically enriched framework, the KMEOntoBPA, has been designed using KMEs to drive the process of BPA development. Organisational structure, culture, information technology, leadership, knowledge context and business repository have been selected as representative KMEs. The object-based BPA modelling approach, specifically the semantically enriched Riva BPA (srBPA) method, has been adopted in order to embrace the knowledge resources generated by the KMEs and utilise them in the derivation and re-configuration of its constituent elements. These knowledge resources are employed as business objects and treated as Candidate Essential Business Entities (CEBEs) in the Riva method, i.e. entities that characterise or represent a form of business of an organisation. The Design Science Research Methodology (DSRM) is used to guide the research phases, with an emphasis on the design and development, demonstration and evaluation of the research framework. The KMEOntoBPA has been demonstrated using representative core banking case studies of Treasury, Deposits and Financing, applied across the DSRM iterations in that order. The results reveal that utilising KMEs enables agile generation of representative CEBEs and their corresponding Riva BPA elements, which reflect the real business in each of the core banking case studies.

    This research also demonstrates the semantically enriched Riva BPA method to be an appropriate object-based method that aligns well with KMEs in exploiting knowledge resources for the development of a dynamic BPA with reference to robustness and learning capabilities. In addition, the research framework, i.e. the KMEOntoBPA, has shown an understanding of the flow of knowledge in the bank and has provided several possible advantages, such as more accurate service delivery and improved financial control. It also supports the sources of sustainable competitive advantage (SCA): technical capabilities, core competences and social capital. Finally, a number of significant contributions and artefacts have been attained, for example the aKMEOnt, the abstract ontology that utilises the six KMEs in this research to investigate the effectiveness of using such KMEs in driving the development of the BPA. These contributions, along with the research results, provide a guide for future research directions, such as using the aKMEOnt in the development of other business process modelling methods and in deriving the Enterprise Information Architecture (EIA) and Service-Oriented Architecture (SOA).
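
    As a rough illustration of the framework’s central move, deriving CEBEs as business objects from KME-generated knowledge resources, the following Python sketch uses made-up class and function names (KnowledgeResource, CEBE, derive_cebes) that are not taken from the thesis; it only shows the shape of the grouping, not the ontology-driven derivation itself.

        # Hedged sketch, not the thesis's artefact: KME-generated knowledge
        # resources are grouped by the business object they describe, yielding
        # Candidate Essential Business Entities (CEBEs) that seed a Riva-style BPA.
        from collections import defaultdict
        from dataclasses import dataclass, field

        @dataclass
        class KnowledgeResource:
            kme: str              # originating enabler, e.g. "culture", "business repository"
            business_object: str  # the business object this resource describes
            detail: str

        @dataclass
        class CEBE:
            name: str                                    # Candidate Essential Business Entity
            contributing_kmes: list = field(default_factory=list)

        def derive_cebes(resources):
            """Group KME-generated knowledge resources into CEBEs by business object."""
            grouped = defaultdict(list)
            for r in resources:
                grouped[r.business_object].append(r.kme)
            return [CEBE(name=obj, contributing_kmes=kmes) for obj, kmes in grouped.items()]

        # Illustrative Treasury resources (invented examples)
        treasury = [
            KnowledgeResource("business repository", "treasury deal", "historic deal records"),
            KnowledgeResource("information technology", "treasury deal", "deal capture system"),
            KnowledgeResource("organisational structure", "settlement instruction", "back-office unit"),
        ]
        for cebe in derive_cebes(treasury):
            print(cebe.name, "<-", cebe.contributing_kmes)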

    The Role of Safe Practices on Hospitals’ Total Factor Productivity.

    The dual aims of improving safety and productivity are a major part of the health care reform movement that hospital leaders must manage. Studies exploring the two phenomena conjointly and over time are critical to understanding how change in one dimension influences the other. A Malmquist approach is used to assess hospitals’ relative productivity levels over time. Analysis of variance (ANOVA) was used to assess whether or not the Malmquist Indices (MIs) correlate with the safe practices measure. The American Hospital Association’s annual survey and the Centers for Medicare and Medicaid Services’ Case Mix Index for fiscal years 2002–2006, along with the Leapfrog Group’s annual survey for 2006, were used for this study. Leapfrog Group respondents have significantly higher technological change (TC) and total factor productivity (TFP) than nonrespondents, without sacrificing technical efficiency change. Of the three MIs, TC (P < 0.10) and TFP (P < 0.05) had significant relationships with the National Quality Forum’s Safe Practices score. The ANOVA also indicates that the mean differences of the TFP measures progressed in a monotonic fashion up the Safe Practices scale. Adherence to the National Quality Forum’s Safe Practices recommendations had a major impact on hospitals’ operating processes and productivity. Specifically, there is evidence that hospitals reporting higher Safe Practices scores had above-average levels of TC and TFP gains over the period assessed. Leaders should strive for increased transparency to promote both quality improvement and increased productivity.
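
    For readers unfamiliar with the method, the sketch below shows the standard Malmquist decomposition the study relies on, TFP change as the product of efficiency change (EC) and technological change (TC), followed by a one-way ANOVA across score groups; the distance-function values and group data are hypothetical, not figures from the study.

        # Standard Malmquist decomposition: TFP change = EC * TC, computed from
        # distance functions against the period-t and period-t+1 frontiers.
        # All numbers below are invented for illustration only.
        from math import sqrt
        from scipy.stats import f_oneway  # one-way ANOVA, as used to compare MI groups

        def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
            """d_a_b = distance of the period-b observation against the period-a frontier."""
            ec = d_t1_t1 / d_t_t                              # efficiency change
            tc = sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))  # technological change
            return ec, tc, ec * tc                            # TFP change = EC * TC

        ec, tc, tfp = malmquist(0.82, 0.95, 0.78, 0.88)
        print(f"EC={ec:.3f} TC={tc:.3f} TFP={tfp:.3f}")

        # ANOVA across hypothetical Safe Practices score tiers (low/mid/high)
        low, mid, high = [0.98, 1.01, 0.99], [1.02, 1.05, 1.03], [1.06, 1.08, 1.10]
        print(f_oneway(low, mid, high))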

    Market fields structure & dynamics in industrial automation

    There is a research tradition in the economics of standards which addresses standards wars, antitrust concerns or positive externalities from standards. Recent research has also dealt with the process characteristics of standardisation, de facto standard-setting consortia and intellectual property concerns in the technology specification or implementation phase. Nonetheless, there are no studies which sufficiently analyse capabilities, comparative industry dynamics or incentive structures in the context of standard-setting. In my study, I address the characteristics of collaborative research and standard-setting as a new mode of deploying assets, beyond the motivations well known from R&D consortia or market alliances. On the basis of a case study of a leading user organisation in the market for industrial automation technology, as well as a descriptive network analysis of cross-community affiliations, I demonstrate a paradoxical relationship between cooperation and competition. More precisely, I explain how there can be a dual relationship between value creation and value capture with respect to exploration and exploitation. My case study emphasises the dynamics between knowledge stocks (knowledge alignment, narrowing and deepening) produced by collaborative standard-setting and innovation; it also sheds light on an evolutionary relationship between the exploration of assets and use cases and each firm's exploitation activities in the market. I derive standard-setting capabilities from an empirical analysis of membership structures, policies and incumbent firm characteristics in selected, but leading, user organisations. The results are as follows: the market for industrial automation technology is characterised by collaboration on standards, strong technological influences from other industries and network effects around standards. Further, system integrators play a decisive role in value creation in the customer-specific business case. Standard-setting activities appear to be loosely coupled to the products offered on the market. Core leaders in world standards in industrial automation own a variety of assets and are affiliated to many standard-setting communities rather than being exclusively committed to a few standards. Furthermore, their R&D ratios outperform those of peripheral members, and experience in standard-setting processes can be assumed. Standard-setting communities specify common core concepts as the basis for the development of each member's proprietary products, complementary technologies and industrial services. From a knowledge-based perspective, the targeted disclosure of certain knowledge can be used to achieve high innovation returns through systemic products which add proprietary features to open standards. Finally, the interplay between exploitation and exploration in the deployment of standard-setting capabilities, linked to cooperative, pre-competitive processes, leads to an evolution of common technology owned and exploited by the standard-setting community as a particular kind of innovation ecosystem.
    Keywords: standard-setting, innovation, industry dynamics and context, industrial automation
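
    The descriptive network analysis of cross-community affiliations mentioned above can be pictured as a bipartite firm-to-community graph projected onto firms; the sketch below, with invented firm names and an arbitrary, illustrative set of communities, only shows that kind of analysis and is not the study’s data or code.

        # Illustrative affiliation analysis: firms and standard-setting communities
        # form a bipartite graph; projecting onto firms turns shared memberships
        # into weighted ties. Names are placeholders, not study data.
        import networkx as nx
        from networkx.algorithms import bipartite

        firms = ["FirmA", "FirmB", "FirmC"]
        communities = ["CommunityX", "CommunityY", "CommunityZ"]

        B = nx.Graph()
        B.add_nodes_from(firms, bipartite=0)
        B.add_nodes_from(communities, bipartite=1)
        B.add_edges_from([("FirmA", "CommunityX"), ("FirmA", "CommunityZ"),
                          ("FirmB", "CommunityX"), ("FirmB", "CommunityY"),
                          ("FirmC", "CommunityZ")])

        # Breadth of affiliation per firm (degree in the bipartite graph)
        print(dict(B.degree(firms)))

        # Projection onto firms: edge weight = number of shared community memberships
        G = bipartite.weighted_projected_graph(B, firms)
        print(list(G.edges(data="weight")))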

    THE ROLE OF INTERNET OF THINGS (IOT) IN SMART HEALTHCARE SYSTEM

    Healthcare has evolved thanks to the Internet of Things (IoT) revolution. Health management has replaced treatment as the primary focus of healthcare. As a result, as healthcare develops toward patient-oriented and analytical applications, more data are being collected and pooled than ever before. In addition to discussing health data and patient-centred health management, this article also covers several facets of smart healthcare. One of the most important industries that the Internet of Things (IoT) has modernized is healthcare. The shrinking of sensors makes it possible to collect and evaluate these enormous volumes of data. IoT sensors can be used to connect medical equipment and resources in order to gather and process data. This paper provides an overview of some of the IoT's effects on the healthcare industry. Given the rise of IoT technologies, healthcare cannot remain outside this paradigm. The purpose of this article is to provide guidelines for achieving worldwide connectivity between medical environments and the Internet of Things (IoT).
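
    As a toy illustration of the gather-and-process pattern the article describes (not code from the article), the sketch below pools vitals readings from connected devices and screens them against simple, arbitrary thresholds before forwarding alerts.

        # Toy sketch: pooled IoT vitals readings are screened against illustrative
        # thresholds; flagged patients would be forwarded for clinical review.
        from dataclasses import dataclass
        from statistics import mean

        @dataclass
        class VitalsReading:
            patient_id: str
            heart_rate: int   # beats per minute
            spo2: float       # oxygen saturation, %

        def screen(readings, hr_limit=120, spo2_floor=92.0):
            """Return IDs of patients whose readings breach the (illustrative) thresholds."""
            flagged = set()
            for r in readings:
                if r.heart_rate > hr_limit or r.spo2 < spo2_floor:
                    flagged.add(r.patient_id)
            return flagged

        batch = [VitalsReading("p1", 88, 97.0), VitalsReading("p2", 131, 95.5),
                 VitalsReading("p3", 76, 90.8)]
        print(screen(batch))                         # -> {'p2', 'p3'}
        print(mean(r.heart_rate for r in batch))     # simple pooled statistic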

    Smart Roads and Autonomous Driving vs. Data Protection: the Problem of the Lawfulness of the Processing

    The paper highlights how smart mobility - and in particular its most advanced expression to date, autonomous driving - is a fundamental component of the smart city. However, technological development in this direction raises the issue of balancing these interests against the need to protect personal data, of which driverless cars collect a huge amount. From this perspective, the main issue is now recognized as the lack of an adequate legal basis; the solution this essay proposes is that the legislator provide for a task of public interest through the implementation of ad hoc legislation, of which a possible model is offered.

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, unified modelling language activity diagrams (UML AD) and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to specify temporal constraints between the starts and/or ends of processes, e.g. that the beginning of a process A precedes the start (or end) of a process B; however, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process; it would also help uncover the complexities of a system and model it in a consistent way, which is not possible with existing modelling techniques. These issues are addressed in this thesis by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. Thus, the modelling standards are evaluated for their most-used terminologies and constructs to identify their key components, which also assists in generalising the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing that the theory is sound and complete, the next step is to provide the instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances.
    Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables existing models to be analysed for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics presented by the axiomatic system. A real-life case from the King’s College Hospital accident and emergency (A&E) department’s trauma patient pathway is considered to validate the framework. It is divided into three patient flows to depict the journey of a patient with significant trauma, arriving at A&E, undergoing a procedure and subsequently being discharged. The hospital staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representation is presented to show the shortfalls of these modelling standards for modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
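
    To make the point-graph idea concrete, the sketch below encodes activities as start/end points and ordering constraints as directed edges, so an inconsistent model shows up as a cycle; it is an illustration of the general approach, not the thesis’s PG* tool or its PITL axiomatisation, and the activity names are invented.

        # Illustrative point graph: each activity contributes start/end points,
        # ordering constraints become directed edges, and a cycle signals an
        # inconsistent set of temporal constraints.
        import networkx as nx

        def interval(pg, name):
            """Add an activity's start/end points with the implicit start-before-end edge."""
            pg.add_edge(f"start({name})", f"end({name})")
            return name

        def precedes(pg, a, b):
            """Constrain activity a to finish before activity b starts."""
            pg.add_edge(f"end({a})", f"start({b})")

        pg = nx.DiGraph()
        triage, treatment, discharge = (interval(pg, n) for n in ("triage", "treatment", "discharge"))
        precedes(pg, triage, treatment)
        precedes(pg, treatment, discharge)

        print(nx.is_directed_acyclic_graph(pg))   # True -> the constraints are satisfiable
        print(list(nx.topological_sort(pg)))      # one admissible ordering of the points
        # Adding precedes(pg, discharge, triage) would create a cycle, i.e. an inconsistency.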

    Reasonable AI and Other Creatures. What Role for AI Standards in Liability Litigation?

    Standards play a vital role in supporting the policies and legislation of the European Union. The regulation of artificial intelligence (AI) is no exception, as made clear by the AI Act proposal. In particular, Articles 40 and 41 defer the concrete definition of safety and trustworthiness requirements, including risk management, data quality, transparency, human oversight, accuracy, robustness, and cybersecurity, to harmonised standards and common specifications. Besides, other types of standards and professional norms are also relevant to the governance of AI. These include European non-harmonised standards, international and national standards, professional codes and guidelines, and uncodified best practices. This contribution casts light on the relationship between standards and private law in the context of liability litigation for damage caused by AI systems. Despite the literature’s attention to the issue of liability for AI, the role of standardisation in this regard has hitherto been largely overlooked. Furthermore, while much research has been undertaken on the regulation of AI, comparatively little has dealt with its standardisation. This paper aims to fill this gap. Building on previous scholarship, the contribution demonstrates that standards and professional norms are substantially normative in spite of their private and voluntary nature. In fact, they shape private relationships for both normative and economic reasons. Indeed, these private norms enter the courtroom through explicit or implicit incorporation into contracts, as well as by informing general clauses such as reasonableness and the duty of care. Therefore, they represent the yardstick against which professionals’ performance and conduct are evaluated. Hence, a link between standards, safety, and liability can be established. Against this backdrop, the role of AI standards in private law is assessed. To set the scene, the article provides a bird’s-eye view of AI standardisation. The European AI standardisation initiative is analysed along with other institutional and non-institutional instruments. Finally, it is argued that AI standards contribute to defining the duty of care expected of developers and professional operators of AI systems. Hence, they might represent a valuable instrument for tackling the challenges posed by AI technology to extracontractual and contractual liability.

    The economic viability of an in-home monitoring system in the context of an aged care setting

    The aged care sector in Australia faces significant challenges. Although many of these issues have been clearly identified, their urgency has been further highlighted during the COVID-19 pandemic. Technology such as in-home monitoring is one way to address some of these challenges. However, the efficacy of technology must be considered together with its implementation and running costs to ensure that there is a return on investment and that it is economically viable as a solution. A pilot programme was run using the HalleyAssist® in-home monitoring system to test its efficacy. This article focuses on an economic analysis to better understand the financial viability of such systems. Using a secondary analysis approach, the findings identified that revenue could be generated by providing carers with additional services, such as real-time monitoring of the client, which can foster deeper relationships with the customer, along with healthcare cost savings for carers, service providers and government. Savings relate to earlier intervention in critical events identified by the system, as delays in treating some critical events can create far more severe and costly health outcomes. Further health cost savings can be made via trend analysis, which can reveal more nuanced health deterioration that is often missed. Implementing preventative measures on the basis of this identification can reduce the chance of critical events occurring, which carry much higher costs. Overall, monitoring systems lead to a transition from a reactive to a preventative service offering, delivering more targeted and personalised care.
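
    The return-on-investment logic the article assesses can be illustrated with a back-of-envelope calculation; every figure below is hypothetical and chosen only to show the shape of the comparison between costs, avoided-event savings and service revenue, not results from the pilot.

        # Back-of-envelope ROI sketch with purely hypothetical figures: monitoring
        # pays for itself when avoided-event savings plus service revenue exceed
        # implementation and running costs over the programme horizon.
        def net_benefit(setup_cost, annual_running_cost, clients,
                        events_avoided_per_client, cost_per_event,
                        service_revenue_per_client, years=3):
            """Simple multi-year net benefit, ignoring discounting for clarity."""
            annual_savings = clients * events_avoided_per_client * cost_per_event
            annual_revenue = clients * service_revenue_per_client
            return years * (annual_savings + annual_revenue - annual_running_cost) - setup_cost

        # Hypothetical programme: 200 clients, 0.1 avoided critical events per
        # client per year at $12,000 each, $300/year in add-on monitoring services.
        print(net_benefit(setup_cost=150_000, annual_running_cost=60_000,
                          clients=200, events_avoided_per_client=0.1,
                          cost_per_event=12_000, service_revenue_per_client=300))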