
    Fairness of performance evaluation procedures and job satisfaction: the role of outcome-based and non-outcome based effects

    Prior management accounting studies on fairness perceptions have overlooked two important issues. First, no prior management accounting study has investigated how procedural fairness, by itself, affects managers' job satisfaction. Second, management accounting researchers have not demonstrated how conflicting theories on procedural fairness can be integrated and explained coherently. Our model proposes that the fairness of performance evaluation procedures affects job satisfaction through two distinct processes. The first is outcome-based, operating through the fairness of outcomes (distributive fairness). The second is non-outcome-based, operating through trust in one's superior and organisational commitment. Based on a sample of 110 managers, the results indicate that while procedural fairness perceptions affect job satisfaction through both processes, the non-outcome-based process is much stronger than the outcome-based process. These results may be used to develop a unified theory of procedural fairness effects.

    The interactive effects of different accounting controls on subordinates' behaviour and performance

    Prior research suggests that goal setting and an emphasis on meeting tight budget targets may influence the extent of subordinates' performance and slack creation. This study hypothesizes that other accounting controls may moderate these relationships. Specifically, it hypothesizes that: (i) budgetary performance is increased and (ii) budgetary slack creation is decreased when an emphasis on setting and meeting tight budget targets is complemented by a high extent of cost control. The results support a significant two-way interaction between Emphasis on setting and meeting tight budget targets and Cost control affecting budgetary performance. A significant two-way interaction between Emphasis on setting and meeting tight budget targets and Cost control affecting the propensity to create slack was also found for production managers. Marketing managers' propensity to create slack was found to be associated only with Emphasis on setting and meeting tight budget targets.

    Investigation of systematic errors for the hybrid and panoramic scanners

    The ability of terrestrial laser scanners (TLSs) to provide dense three-dimensional (3D) data in a short period of time has made them widely used for many purposes, such as documentation, management and analysis. However, as with other sensors, data obtained from TLSs can be impaired by errors from different sources, so a calibration routine is crucial to ensure data quality. Through self-calibration, this study performed system calibration for a hybrid (Leica ScanStation C10) and a panoramic (Faro Photon 120) scanner in a laboratory measuring 15.5 m x 9 m x 3 m, with more than a hundred planar targets fairly distributed across its surfaces. The four most significant parameters, derived from well-known error sources of geodetic instruments, are the constant (a0), collimation axis (b0), trunnion axis (b1) and vertical circle index (c0) errors. Data obtained from seven scan stations were processed, and statistical analysis (e.g. t-tests) showed significant errors for the calibrated scanners.
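The abstract does not state the functional model used, but the four named parameters are conventionally applied to raw observations in the total-station-style form sketched below; the function name and the notation (range rho, horizontal angle theta, elevation angle alpha) are illustrative assumptions, not the study's actual code.

```python
import math

def correct_observation(rho, theta, alpha, a0, b0, b1, c0):
    """Apply the four classic geodetic error terms to one raw TLS
    observation (angles in radians): a0 = constant (zero) range error,
    b0 = collimation-axis error, b1 = trunnion-axis error,
    c0 = vertical circle index error."""
    rho_c = rho + a0                       # constant offset on every range
    # collimation and trunnion errors both grow with elevation angle
    theta_c = theta + b0 / math.cos(alpha) + b1 * math.tan(alpha)
    alpha_c = alpha + c0                   # fixed offset on the vertical circle
    return rho_c, theta_c, alpha_c

# at zero elevation only a0, b0 and c0 act (tan(0) = 0)
print(correct_observation(10.0, 0.0, 0.0, 0.005, 0.0001, 0.0002, 0.0003))
```

In a self-calibration adjustment these four terms are estimated jointly with the target coordinates and scanner poses, which is why a strong network (many well-distributed targets, several stations) is needed.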

    Terrestrial laser scanners pre-processing: registration and georeferencing

    A terrestrial laser scanner (TLS) is a non-contact, optics-based instrument that collects three-dimensional (3D) data of a defined region of an object surface automatically, in a systematic pattern and at a high data collection rate. This capability has made TLS widely applied in numerous 3D applications. With its ability to provide dense 3D data, TLS has made the processing phase of constructing a complete 3D model much simpler and faster. Pre-processing is one of the phases involved, and it consists of registration and georeferencing procedures. Because many error sources affect TLS measurement, pre-processing is a crucial phase for identifying errors and outliers; any errors remaining at this stage can degrade the quality of the final TLS product. To address this issue, this study performed two experiments involving data collection for landslide monitoring and 3D topography. By implementing both direct and indirect pre-processing methods, the outcomes indicate that TLS is suitable for applications that require centimetre-level accuracy.

    A study about terrestrial laser scanning for reconstruction of precast concrete to support QLASSIC assessment

    Nowadays, terrestrial laser scanning shows the potential to improve construction productivity by measuring changes to objects in real-time applications. This paper presents the implementation of an efficient framework for precast concrete using terrestrial laser scanning that enables contractors to acquire accurate data and support the Quality Assessment System in Construction (QLASSIC). A Leica ScanStation C10, black/white targets, Autodesk Revit and Cyclone software were used in this study. The dimensions of the base precast concrete model provided by the company were used as a reference and compared against the Autodesk Revit model built from the terrestrial laser scanning data and against the conventional method (measuring tape). To support QLASSIC, the dimensional tolerance of cast in-situ and precast elements is +10 mm/-5 mm. The results showed that the root mean square error for the Revit model is 2.972 mm, while that for the measuring tape is 13.687 mm. This accuracy shows that terrestrial laser scanning has an advantage in construction jobs to support QLASSIC.
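The comparison reported here reduces to two checks per element: the root mean square error of each method against the reference dimensions, and whether each deviation falls inside the QLASSIC +10 mm/-5 mm tolerance band. A minimal sketch follows; the dimension values are invented for illustration and are not the study's data.

```python
import math

def rmse(measured, reference):
    """Root mean square error of measured dimensions vs. reference (mm)."""
    residuals = [m - r for m, r in zip(measured, reference)]
    return math.sqrt(sum(e * e for e in residuals) / len(residuals))

def within_tolerance(measured, reference, upper=10.0, lower=-5.0):
    """QLASSIC-style asymmetric tolerance: +10 mm / -5 mm per dimension."""
    return all(lower <= (m - r) <= upper for m, r in zip(measured, reference))

reference = [1200.0, 2400.0, 150.0]   # hypothetical design dimensions (mm)
scan      = [1202.5, 2397.0, 151.0]   # hypothetical scan-derived values (mm)

print(round(rmse(scan, reference), 3))   # -> 2.327
print(within_tolerance(scan, reference)) # -> True
```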

    Starting a movement: An epidemiological audit into the distribution and determinants of Clostridium difficile infection at an Australian tertiary hospital site

    Background: The emergence of hypervirulent strains of Clostridioides (Clostridium) difficile over the past few decades has cemented C. difficile infection (CDI) as the most common cause of nosocomial infectious diarrhoea within Australia. This report was initiated to better understand the burden of disease at the Bankstown-Lidcombe Hospital through analysis of CDI incidence, risk factors and treatment.
    Aims: The specific objectives of this study were two-fold: (1) to determine the prevalence of hospitalised patients affected with CDI and (2) to identify risk factors for CDI in hospitalised patients.
    Methods: A retrospective review of all consecutive CDI cases at the Bankstown-Lidcombe Hospital between 1 July 2014 and 31 December 2018 was performed. CDI incidence was calculated as the number of CDI cases observed per 10,000 patient days. Annual incidence and antibiotics predisposing to CDI were compared via univariate analysis and Student's t-tests. Treatment for CDI was compared using contingency analysis via Pearson's chi-squared test.
    Results: CDI diagnoses ranged from 3.2 to 4.6 (as a proportion of 10,000 occupied bed days) between 2014 and 2018. There was a significant decrease in CDI associated with macrolides between 2017 and 2018 (p = 0.03). There was a significant rise in CDI associated with beta-lactamase inhibitors and penicillins (e.g., tazobactam/piperacillin). The majority of CDI patients were treated with single-therapy metronidazole during their hospital stays.
    Conclusion: CDI risk minimisation presents a significant challenge to all hospital departments. This audit highlights the influence of antibiotic usage on in-patient CDI cases and the vital role of multidisciplinary teams (microbiologists, pathologists, physicians, surgeons and pharmacists) in managing and monitoring these patients.
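The incidence measure used in the audit is a simple rate normalised to occupied bed days. A one-line sketch, with invented case and bed-day counts (the audit's real figures are not reproduced here):

```python
def cdi_incidence(cases, occupied_bed_days):
    """CDI cases per 10,000 occupied bed days, the denominator used
    in the audit; inputs here are hypothetical."""
    return cases / occupied_bed_days * 10_000

# e.g. 32 cases over 100,000 occupied bed days
print(round(cdi_incidence(32, 100_000), 1))  # -> 3.2
```

Normalising to bed days rather than admissions keeps annual rates comparable even when hospital occupancy changes between years.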

    Investigation of datum constraints effect in terrestrial laser scanner self-calibration

    The ability of terrestrial laser scanners (TLSs) to provide rapid and dense three-dimensional (3D) data has made many 3D applications easier. However, as with other optical and electronic instruments, data from TLSs can be impaired by errors. Self-calibration, a method adopted from photogrammetry, is available to investigate those errors in TLS observations. However, the network configurations applied by the TLS and photogrammetry techniques are quite different, so further investigation is required to verify whether the photogrammetric principle regarding datum constraints selection is applicable to TLS self-calibration. To ensure a thorough assessment, the datum constraints analyses were carried out using three variant network configurations: (1) a minimum number of scan stations, (2) a minimum number of surfaces for target distribution, and (3) a minimum number of point targets. Through graphical and statistical analyses, the evaluation of datum constraints selection indicated that the parameter correlations obtained are significantly similar.

    A Development Framework for Rapid Metaheuristics Hybridization

    While metaheuristics are effective for solving large-scale combinatorial optimization problems, they result from time-consuming trial-and-error algorithm design tailored to specific problems. For this reason, a software tool for rapid prototyping of algorithms would save considerable resources. This paper presents a generic software framework that reduces development time through abstract classes and software reuse and, more importantly, aids design with support for user-defined strategies and hybridization of metaheuristics. Most interestingly, we propose a novel way of defining hybridization through the "request and response" metaphor, which forms an abstract concept for hybridization. Different hybridization schemes can now be formed with minimal coding, which gives our proposed Metaheuristics Development Framework (MDF) its uniqueness. To illustrate the concept, we restrict ourselves to two popular metaheuristics, Ant Colony Optimization and Tabu Search, and demonstrate MDF through the implementation of various hybridized models to solve the Traveling Salesman Problem.
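The abstract does not show MDF's actual API, so the sketch below is only a hypothetical rendering of the "request and response" idea: one metaheuristic issues a request (e.g. "improve this tour") and its partner answers with a response, so a hybridization scheme becomes a message exchange rather than hard-wired code. All class names and the toy search logic are illustrative assumptions.

```python
class Request:
    """A message one metaheuristic sends to another."""
    def __init__(self, kind, payload):
        self.kind, self.payload = kind, payload

class TabuSearch:
    """Stand-in intensifier: answers "improve" requests on a tour."""
    def handle(self, request):
        if request.kind == "improve":
            tour = request.payload
            # placeholder local move: reverse everything after the first city
            return tour[:1] + tour[1:][::-1]
        return request.payload

class AntColony:
    """Stand-in constructor that delegates improvement to a partner."""
    def __init__(self, partner):
        self.partner = partner               # hybridization partner
    def construct(self, cities):
        tour = sorted(cities)                # stand-in for pheromone-guided build
        # hybridization point: ask the partner to refine the tour
        return self.partner.handle(Request("improve", tour))

hybrid = AntColony(TabuSearch())
print(hybrid.construct([3, 1, 2, 0]))  # -> [0, 3, 2, 1]
```

Because the coupling is a single `handle(Request)` interface, swapping Tabu Search for another improver, or chaining several responders, requires no change to the constructing heuristic, which is the minimal-coding property the paper claims for its framework.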

    COVID-19 Related Mobility Reduction: Heterogeneous Effects on Sleep and Physical Activity Rhythms

    Mobility restrictions imposed to suppress coronavirus transmission can alter physical activity (PA) and sleep patterns. Characterization of response heterogeneity and its underlying reasons may assist in tailoring customized interventions. We obtained wearable data covering baseline, incremental movement restriction and lockdown periods from 1824 city-dwelling working adults aged 21 to 40 years, incorporating 206,381 nights of sleep and 334,038 days of PA. Four distinct rest-activity rhythms (RARs) were identified using k-means clustering of participants' temporally distributed step counts. Hierarchical clustering of the proportion of time spent in each of these RARs revealed four groups that expressed different mixtures of RAR profiles before and during the lockdown. Substantial but asymmetric delays in bedtime and wake time resulted in a 24-minute increase in weekday sleep duration with no loss in sleep efficiency. Resting heart rate declined by 2 bpm. PA dropped by an average of 38%. Three of the four groups were better able to maintain PA and weekday/weekend differentiation during lockdown. The least active group, comprising 51% of the sample, were younger and predominantly single. Habitually less active already, this group showed the greatest reduction in PA during lockdown, with little weekday/weekend difference. Among the different mobility restrictions, the removal of habitual social cues by lockdown had the largest effect on PA and sleep. Sleep and resting heart rate unexpectedly improved. RAR evaluation uncovered heterogeneity of responses to lockdown and can identify characteristics of persons at risk of decline in health and wellbeing.
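The first stage of the pipeline described above, k-means clustering of temporally distributed step counts, can be sketched in miniature. The toy profiles below (two-bin "morning vs. evening" step summaries rather than the study's full daily profiles) and the tiny pure-Python k-means are assumptions for illustration only.

```python
import random

def kmeans(profiles, k, iters=20, seed=0):
    """Minimal k-means on equal-length step-count profiles."""
    rng = random.Random(seed)
    centroids = rng.sample(profiles, k)          # initial centroids from data
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in profiles:
            # assign each profile to its nearest centroid (squared distance)
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            groups[dists.index(min(dists))].append(p)
        # recompute each centroid as the mean of its group
        centroids = [
            [sum(col) / len(g) for col in zip(*g)] if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids, groups

# two obvious toy RAR types: morning-active vs. evening-active profiles
profiles = [[9, 1], [8, 2], [1, 9], [2, 8]]
centroids, groups = kmeans(profiles, k=2)
print(sorted(len(g) for g in groups))
```

In the study, a second, hierarchical clustering step was then applied to each participant's mixture of RAR labels over time, grouping people by how their rhythm composition shifted between baseline and lockdown.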