526,112 research outputs found

    Managing performance in quality management: A two-level study of employee perceptions and workplace performance

    Get PDF
    Purpose: This paper addresses potential effects of the control element in Quality Management. First, behavioural theories on how elements of performance management can affect organisational performance are examined. Second, theoretical models on how perceptions of work conditions may impact wellbeing and performance are considered. Direct and indirect pathways from performance management to productivity/quality are inferred. Methodology: Matched employee-workplace data from an economy-wide survey in Britain and two-level structural equation models are used to test the hypothesised associations. Findings: The use of practices in workplaces is inconsistent with a unified performance management approach. Distinct outcomes are expected from separate components of performance management, and some may be contingent on workplace size. For example, within Quality-planning, strategy dissemination is positively associated with workplace productivity; targets are negatively associated with perceptions of job demands and positively correlated with job satisfaction, which in turn can increase workplace productivity. With respect to Information & Analysis, keeping and analysing records, or monitoring employee performance via appraisals that assess training needs, are positively associated with workplace productivity and quality. Originality: This paper illustrates how control in Quality Management can be effective. Although the merits of performance management are subject to ongoing debate, arguments in the literature have tended to focus on performance appraisal. Analyses of economy-wide data linking performance management practices, within Quality Management, to employee perceptions of work conditions, wellbeing and aggregate performance are rare.
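
    The paper's method links employee-level perceptions (level 1) to workplace-level outcomes (level 2) via two-level structural equation models. As a loose, simplified stand-in for that design (not the authors' specification), a mixed-effects regression with employees nested in workplaces illustrates the two-level structure; all column names and values below are hypothetical.

```python
# Minimal sketch of a two-level setup: employees nested in workplaces,
# with a random intercept per workplace. A simplified stand-in for the
# paper's two-level SEM; every variable here is hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical matched employee-workplace data.
df = pd.DataFrame({
    "workplace_id":     [1, 1, 1, 2, 2, 3, 3, 3, 4, 4],
    "job_satisfaction": [4, 3, 5, 2, 3, 4, 4, 5, 1, 2],
    "job_demands":      [2, 3, 1, 4, 4, 2, 3, 1, 5, 4],
    "has_targets":      [1, 1, 1, 0, 0, 1, 1, 1, 0, 0],
})

# Level-1 model: satisfaction as a function of targets and perceived
# demands; the groups argument supplies the level-2 (workplace) nesting.
model = smf.mixedlm(
    "job_satisfaction ~ has_targets + job_demands",
    data=df,
    groups=df["workplace_id"],
)
result = model.fit()
print(result.summary())
```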

    Conserving socio-ecological landscapes: An analysis of traditional and responsive management practices for floodplain meadows in England

    Get PDF
    Contemporary practice in the conservation of socio-ecological landscapes draws both on a model of responsive management and on ideas about historic management. This study considered what evidence exists for the exercise of these approaches in the conservation of floodplain meadows in England, in order to inform understanding of conservation management and assessment practice. Evidence for a model of responsive management was limited, with managing stakeholders often alternating between this model and an alternative approach, called here the ‘traditional management approach’, based on ideas, narratives and prescriptions of long-established land management practices. Limited monitoring and assessment appeared to undermine the former model, whilst uncertainty over past long-standing management practices undermined the latter. As a result of the relative power of conservation actors over the farmers delivering site management, and their framings of meadows as ‘natural’ spaces, management tended to oscillate between aspects of these two approaches in a sometimes inconsistent manner. Conservation managers should consider the past motivating drivers and management practices that created the landscapes they wish to conserve, and bear in mind that these are necessarily implicated in aspects of the contemporary landscape value they wish to maintain. They should ensure that assessment activity captures a broad range of indicators of site value and condition, not only biological composition, and should also record data on site management operations in order to ensure management effectiveness.

    Living with inconsistencies in a multidatabase system

    Get PDF
    Integration of autonomous sources of information is one of the most important problems in the implementation of global information systems. This paper considers multidatabase systems as a typical architecture for global information services and addresses the problem of storing and processing inconsistent information in such systems. A new data model proposed in the paper separates sure from inconsistent information and introduces a system of elementary operations on containers holding sure and inconsistent information. A review of the implementation aspects in the environment of a typical relational database management system concludes the paper.
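
    The abstract describes a data model that separates "sure" from inconsistent information and defines elementary operations on containers of each kind. The paper's actual operator algebra is not reproduced here; the following is a minimal sketch of one plausible container type, with union and conflict-marking chosen as illustrative operations.

```python
# Minimal sketch of a container separating sure from inconsistent facts,
# loosely inspired by the multidatabase model described above; the
# paper's actual operation set is not reproduced here.
from dataclasses import dataclass, field

@dataclass
class Container:
    sure: set = field(default_factory=set)          # facts all sources agree on
    inconsistent: set = field(default_factory=set)  # facts sources disagree on

    def union(self, other: "Container") -> "Container":
        # A fact stays sure only if it is not contested anywhere;
        # contested facts remain in the inconsistent part.
        inconsistent = self.inconsistent | other.inconsistent
        sure = (self.sure | other.sure) - inconsistent
        return Container(sure, inconsistent)

    def mark_conflict(self, fact) -> None:
        # Demote a fact from sure to inconsistent when sources disagree.
        self.sure.discard(fact)
        self.inconsistent.add(fact)

# Integrating two autonomous sources that disagree on one attribute:
a = Container(sure={("emp1", "salary", 50_000)})
b = Container(sure={("emp1", "salary", 52_000)})
merged = a.union(b)
merged.mark_conflict(("emp1", "salary", 50_000))
merged.mark_conflict(("emp1", "salary", 52_000))
print(merged)
```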

    Dispersion Measurement of Ultra-High Numerical Aperture Fibers Covering Thulium, Holmium, and Erbium Emission Wavelengths

    Full text link
    We present broadband group velocity dispersion (GVD) measurements of commercially available ultra-high numerical aperture fibers (UHNA1, UHNA3, UHNA4, UHNA7 and PM2000D from Coherent-Nufern). Although these fibers are attractive for dispersion management in ultrafast fiber laser systems in the 2 µm wavelength region, experimental dispersion data in the literature are scarce and inconsistent. Here we demonstrate measurements using the spectral interferometry technique, covering the typically used erbium, thulium and holmium emission bands. The results are characterized in terms of the standard-deviation uncertainty and compared with previous literature reports. Fitting parameters are provided for each fiber, allowing straightforward replication of the measured dispersion profiles. This work is intended to facilitate the design of ultrafast fiber laser sources and the investigation of nonlinear optical phenomena.
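
    The abstract notes that fitting parameters are provided so the measured dispersion profiles can be replicated. The paper's fit form and coefficients are not given here; as a hedged illustration only, the snippet below fits and evaluates a simple polynomial GVD profile over the 1.5-2.1 µm band, with entirely made-up placeholder values.

```python
# Illustrative only: fit a polynomial to group-velocity-dispersion (GVD)
# samples and evaluate it across the 1.5-2.1 um band. The fit form and
# every number below are placeholders, not the paper's reported data.
import numpy as np

# Hypothetical measured GVD samples: wavelength (um) -> beta2 (ps^2/km).
wavelength_um = np.array([1.55, 1.70, 1.85, 2.00, 2.10])
beta2_ps2_km  = np.array([-20.0, -35.0, -55.0, -80.0, -95.0])

# Least-squares polynomial fit (degree 2), one possible fit form.
coeffs = np.polyfit(wavelength_um, beta2_ps2_km, deg=2)

# Evaluate the fitted dispersion profile on a dense wavelength grid.
grid = np.linspace(1.5, 2.1, 200)
beta2_fit = np.polyval(coeffs, grid)

print("fit coefficients (high -> low order):", coeffs)
print("beta2 at 2.0 um:", np.polyval(coeffs, 2.0), "ps^2/km")
```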

    Creating Fair Models of Atherosclerotic Cardiovascular Disease Risk

    Get PDF
    Guidelines for the management of atherosclerotic cardiovascular disease (ASCVD) recommend the use of risk stratification models to identify patients most likely to benefit from cholesterol-lowering and other therapies. These models have differential performance across race and gender groups, with inconsistent behavior across studies, potentially resulting in an inequitable distribution of beneficial therapy. In this work, we leverage adversarial learning and a large observational cohort extracted from electronic health records (EHRs) to develop a "fair" ASCVD risk prediction model with reduced variability in error rates across groups. We empirically demonstrate that our approach is capable of aligning the distribution of risk predictions conditioned on the outcome across several groups simultaneously for models built from high-dimensional EHR data. We also discuss the relevance of these results in the context of the empirical trade-off between fairness and model performance.
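
    The method is described as adversarial learning that aligns risk predictions conditioned on the outcome across groups (in the spirit of equalized odds). The sketch below is not the paper's implementation; it shows one common adversarial-debiasing pattern, where an adversary tries to recover group membership from (prediction, outcome) and the predictor is penalized when it succeeds. All shapes, names and data are hypothetical placeholders.

```python
# Sketch of adversarial debiasing in PyTorch: the predictor is trained so
# an adversary cannot recover group membership from (logit, outcome),
# roughly aligning outcome-conditional prediction distributions across
# groups. Illustration only, not the paper's implementation.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d = 512, 20
X = torch.randn(n, d)                       # hypothetical EHR features
y = torch.randint(0, 2, (n, 1)).float()     # outcome label (placeholder)
g = torch.randint(0, 2, (n, 1)).float()     # group label (placeholder)

predictor = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
adversary = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))
bce = nn.BCEWithLogitsLoss()
opt_p = torch.optim.Adam(predictor.parameters(), lr=1e-3)
opt_a = torch.optim.Adam(adversary.parameters(), lr=1e-3)
lam = 1.0  # strength of the fairness penalty

for step in range(200):
    # 1) Train the adversary to predict the group from (logit, outcome).
    logit = predictor(X).detach()
    opt_a.zero_grad()
    adv_loss = bce(adversary(torch.cat([logit, y], dim=1)), g)
    adv_loss.backward()
    opt_a.step()

    # 2) Train the predictor for accuracy while fooling the adversary.
    logit = predictor(X)
    opt_p.zero_grad()
    task_loss = bce(logit, y)
    fool_loss = bce(adversary(torch.cat([logit, y], dim=1)), g)
    (task_loss - lam * fool_loss).backward()
    opt_p.step()
```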

    Personalizable Knowledge Integration

    Get PDF
    Large repositories of data are used daily as knowledge bases (KBs) feeding computer systems that support decision making processes, such as in medical or financial applications. Unfortunately, the larger a KB is, the harder it is to ensure its consistency and completeness. The problem of handling KBs of this kind has been studied in the AI and databases communities, but most approaches focus on computing answers locally to the KB, assuming there is some single, epistemically correct solution. It is important to recognize that for some applications, as part of the decision making process, users consider far more knowledge than that which is contained in the knowledge base, and that sometimes inconsistent data may help in directing reasoning; for instance, inconsistency in taxpayer records can serve as evidence of possible fraud. Thus, the handling of this type of data needs to be context-sensitive, creating a synergy with the user in order to build useful, flexible data management systems. Inconsistent and incomplete information is ubiquitous and presents a substantial problem when trying to reason about the data: how can we derive an adequate model of the world, from the point of view of a given user, from a KB that may be inconsistent or incomplete? In this thesis we argue that in many cases users need to bring their application-specific knowledge to bear in order to inform the data management process. Therefore, we provide different approaches to handle, in a personalized fashion, some of the most common issues that arise in knowledge management. Specifically, we focus on (1) inconsistency management in relational databases, general knowledge bases, and a special kind of knowledge base designed for news reports; (2) management of incomplete information in the form of different types of null values; and (3) answering queries in the presence of uncertain schema matchings. We allow users to define policies to manage both inconsistent and incomplete information in their application in a way that takes into account both the user's knowledge of the problem and their attitude to error/risk. Using the frameworks and tools proposed here, users can specify when and how they want to manage or resolve the issues that arise from inconsistency and incompleteness in their data, in the way that best suits their needs.
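
    The thesis's central idea is that users declare policies for handling inconsistent or incomplete data rather than relying on one "epistemically correct" repair. The framework's actual policy language is not shown here; below is a small hypothetical illustration of the idea, resolving a conflicting attribute under a user-chosen policy.

```python
# Hypothetical illustration of user-defined inconsistency policies: given
# conflicting values for one attribute, the user picks how to resolve
# them. This mirrors the idea described above, not the thesis's framework.
from statistics import mean

POLICIES = {
    "latest":  lambda vals: vals[-1][1],                  # trust newest source
    "average": lambda vals: mean(v for _, v in vals),     # numeric compromise
    "flag":    lambda vals: ("INCONSISTENT", vals),       # keep for review
}

def resolve(conflicting_values, policy="flag"):
    """conflicting_values: list of (timestamp, value) pairs, oldest first."""
    return POLICIES[policy](conflicting_values)

# Disagreeing taxpayer incomes: a fraud analyst may prefer 'flag'
# (inconsistency as evidence), while an aggregator may prefer 'average'.
reported_income = [(2021, 48_000), (2022, 91_000)]
print(resolve(reported_income, policy="flag"))
print(resolve(reported_income, policy="average"))
```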

    Quantitative analysis of crisis: crisis identification and causality

    Get PDF
    Studies use different conceptual and operational definitions of crises. Different crisis identifications can lead to inconsistent conclusions and policy formulation even when the same analytical framework is applied. Moreover, most studies focus on only a few types of crises, a narrow focus that may not capture the multidimensionality of crises. Seven crisis types are analyzed here, namely (1) liquidity-type banking crises, (2) solvency-type banking crises, (3) balance of payments crises, (4) currency crises, (5) debt crises, (6) growth rate crises, and (7) financial crises. Crisis data were collected from 15 emerging economies on a quarterly basis over 1980-2002. The crisis identification exercise finds that multidimensionality, in which different crisis types occur within short periods, is one of the most important characteristics of recent crises. Further, Granger causality tests in five Asian economies (Indonesia, the Republic of Korea, Malaysia, the Philippines, and Thailand) find that currency crises tend to trigger other types of crises, and therefore exchange rate management is essential.
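
    The study's causal claims rest on Granger causality tests over quarterly crisis indicators. The original data are not reproduced; the sketch below runs statsmodels' grangercausalitytests on synthetic quarterly series, testing whether a currency-crisis indicator helps predict a banking-crisis indicator.

```python
# Sketch: Granger causality test on synthetic quarterly crisis indicators
# using statsmodels. The series are random placeholders, not the
# 1980-2002 emerging-economy data used in the study.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
quarters = 92  # 1980Q1-2002Q4

# Hypothetical 0/1 crisis indicators; currency crises lead banking crises
# by two quarters in this synthetic example.
currency = (rng.random(quarters) < 0.15).astype(float)
banking = np.roll(currency, 2) * (rng.random(quarters) < 0.7)

# Column order matters: test whether column 2 Granger-causes column 1.
data = np.column_stack([banking, currency])
results = grangercausalitytests(data, maxlag=4)
for lag, res in results.items():
    print(f"lag {lag}: F-test p-value = {res[0]['ssr_ftest'][1]:.4f}")
```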

    Developing a service for patients with very severe chronic obstructive pulmonary disease (COPD) within resources

    Get PDF
    Chronic obstructive pulmonary disease (COPD) is a common life-limiting illness that imposes a significant burden on patients and carers. Despite this, access to supportive and specialist palliative care is inconsistent, and implementing published good-practice recommendations can be challenging within current resources. The aim of this service development was to improve local service provision in Barnsley, within the currently available resources, for patients with very severe COPD: to improve patient identification and symptom management, increase advance care planning and the number of patients dying in their preferred place, and increase patient and carer support and satisfaction. To do this, a working group was formed, service problems were identified, and baseline data were collected to establish the needs of people with very severe COPD. A multidisciplinary team meeting was piloted and assessed through community matron feedback, patient case studies and an after-death analysis. These indicated a high level of satisfaction, with improvements in advance care planning, coordination of management, and support for patients' preferred place of care at the end of life. In conclusion, this is the first reported very severe COPD service development established in this way and within current resources. Preliminary data indicate that the development of the multidisciplinary team meeting has been positive, and the appointment of a coordinator will aid this development. Further evaluations, particularly seeking patient views and estimates of cost savings, will be performed.