6,823 research outputs found

    Reliability of vocational assessment: an evaluation of level 3 electro-technical qualifications

    Two ways to Grid: the contribution of Open Grid Services Architecture (OGSA) mechanisms to service-centric and resource-centric lifecycles

    Service Oriented Architectures (SOAs) support service lifecycle tasks, including Development, Deployment, Discovery and Use. We observe that there are two disparate ways to use Grid SOAs such as the Open Grid Services Architecture (OGSA) as exemplified in the Globus Toolkit (GT3/4). One is the traditional enterprise SOA use, where end-user services are developed, deployed and resourced behind firewalls for use by external consumers: a service-centric (or ‘first-order’) approach. The other supports end-user development, deployment, and resourcing of applications across organizations via the use of execution and resource management services: a resource-centric (or ‘second-order’) approach. We analyze and compare the two approaches using a combination of empirical experiments and an architectural evaluation methodology (scenario, mechanism, and quality attributes) to reveal common and distinct strengths and weaknesses. The impact of potential improvements (which are likely to be manifested by GT4) is estimated, and opportunities for alternative architectures and technologies are explored. We conclude by investigating whether the two approaches can be converged or combined, and whether they are compatible on shared resources.
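
    A minimal illustration of the two lifecycles, using hypothetical Python client classes rather than actual Globus Toolkit (GT3/4) APIs (ServiceClient, ExecutionServiceClient and the endpoints are invented for this sketch):

        # Hypothetical sketch: first-order vs second-order Grid SOA use.
        # ServiceClient and ExecutionServiceClient are illustrative stand-ins,
        # not Globus Toolkit classes.
        class ServiceClient:
            """First-order: invoke an end-user service the provider deployed."""
            def __init__(self, endpoint):
                self.endpoint = endpoint
            def invoke(self, operation, payload):
                print(f"calling {operation} on {self.endpoint}")
                return {"status": "ok"}

        class ExecutionServiceClient:
            """Second-order: stage and run the consumer's own application."""
            def __init__(self, endpoint):
                self.endpoint = endpoint
            def submit(self, executable, inputs):
                print(f"submitting {executable} to {self.endpoint}")
                return "job-42"

        # Service-centric: the provider developed, deployed and resourced 'render'.
        svc = ServiceClient("https://provider.example/render")
        svc.invoke("render", {"scene": "model.xml"})

        # Resource-centric: the consumer deploys its own code via an execution service.
        mgr = ExecutionServiceClient("https://provider.example/execution")
        job_id = mgr.submit("my_renderer.bin", ["model.xml"])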

    Modelling of heat emitters embedded within third order lumped parameter building envelope model

    A dynamic modelling approach for heat emitters embedded within an existing third order lumped parameter building envelope model is reported in this work. The model has been found to provide more accurate results at a negligible expense of computational time compared to a conventional quasi-dynamic model. The dynamic model is also preferred over the quasi-dynamic model because it allows for modelling emitters with high thermal capacity, such as under-floor heating. The recommendation of this approach is justified through a series of analyses and comparative tests for various circuit options, timesteps and control volumes.
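
    As a generic illustration of the lumped parameter idea (a sketch under invented parameter values, not the authors' model): a third order envelope reduces to three coupled first order energy balances, and embedding the emitter adds a fourth capacitance node whose heat input drives the network.

        # Generic third-order lumped-parameter envelope with an embedded
        # emitter node; all R/C values below are illustrative, not from the paper.
        from scipy.integrate import solve_ivp

        C_a, C_m, C_f, C_e = 1e5, 5e5, 2e6, 8e4   # J/K: air, internal mass, fabric, emitter
        R_am, R_mf, R_fo, R_ea = 0.01, 0.005, 0.05, 0.002  # K/W between nodes
        T_out, Q_em = 0.0, 3000.0                 # outdoor temp (C), emitter input (W)

        def rhs(t, T):
            Ta, Tm, Tf, Te = T
            dTa = ((Tm - Ta) / R_am + (Te - Ta) / R_ea) / C_a
            dTm = ((Ta - Tm) / R_am + (Tf - Tm) / R_mf) / C_m
            dTf = ((Tm - Tf) / R_mf + (T_out - Tf) / R_fo) / C_f
            dTe = (Q_em - (Te - Ta) / R_ea) / C_e
            return [dTa, dTm, dTf, dTe]

        sol = solve_ivp(rhs, (0.0, 24 * 3600.0), [18.0, 18.0, 15.0, 18.0], max_step=60.0)
        print(sol.y[:, -1])   # node temperatures after 24 h of heating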

    HEDONIC PRICE ESTIMATION FOR KANSAS WHEAT CHARACTERISTICS

    A hedonic price model is applied to a cross-sectional time-series data set of Kansas wheat characteristics. Results indicate that prices received by wheat producers reflect the presence of conventional quality characteristics of wheat and also milling and dough characteristics. Furthermore, the results indicate that the alternative sets of characteristics exhibit quality information that is, to some degree, independent of one another. Important conclusions regarding the efficiency of current grading and pricing practices for wheat are drawn from this analysis.
    Keywords: Crop Production/Industries
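
    A minimal sketch of a hedonic specification (variable names and data invented, not the study's): regress the price received on conventional quality, milling and dough characteristics, and read the fitted coefficients as implicit characteristic prices.

        # Illustrative hedonic regression; columns and data are synthetic.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        protein = rng.normal(12, 1.5, n)         # conventional quality characteristic
        test_weight = rng.normal(60, 2, n)
        flour_yield = rng.normal(75, 3, n)       # milling characteristic
        dough_strength = rng.normal(300, 40, n)  # dough characteristic
        price = (3 + 0.08 * protein + 0.02 * test_weight + 0.01 * flour_yield
                 + 0.001 * dough_strength + rng.normal(0, 0.1, n))

        X = sm.add_constant(np.column_stack([protein, test_weight,
                                             flour_yield, dough_strength]))
        fit = sm.OLS(price, X).fit()
        print(fit.params)   # implicit (hedonic) prices of the characteristics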

    Doctor of Philosophy

    Public health surveillance systems are crucial for the timely detection of and response to public health threats. Since the terrorist attacks of September 11, 2001, and the release of anthrax in the following month, there has been a heightened interest in public health surveillance. The years immediately following these attacks were met with increased awareness and funding from the federal government, which has significantly strengthened the United States' surveillance capabilities; however, despite these improvements, there are substantial challenges faced by today's public health surveillance systems. Problems with the current surveillance systems include: a) a failure to leverage unstructured public health data for surveillance purposes; and b) a lack of information integration and of the ability to leverage resources, applications or other surveillance efforts, because systems are built on a centralized model. This research addresses these problems by focusing on the development and evaluation of new informatics methods to improve public health surveillance. To address the problems above, we first identified a current public health surveillance workflow which is affected by the problems described and presents an opportunity for enhancement through current informatics techniques. The 122 Mortality Surveillance for Pneumonia and Influenza was chosen as the primary use case for this dissertation work. The second step involved demonstrating the feasibility of using unstructured public health data, in this case death certificates. For this we created and evaluated a pipeline, composed of a detection rule and a natural language processor, for the coding of death certificates and the identification of pneumonia and influenza cases. The second problem was addressed by presenting the rationale for creating a federated model by leveraging grid technology concepts and tools for the sharing and epidemiological analyses of public health data. As a case study of this approach, a secured virtual organization was created where users are able to access two grid data services, using death certificates from the Utah Department of Health, and two analytical grid services, MetaMap and R. A scientific workflow was created using the published services to replicate the mortality surveillance workflow. To validate these approaches, and to provide proofs of concept, a series of real-world scenarios was conducted.
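
    As a toy sketch of the detection-rule half of such a pipeline (terms and records invented; the dissertation pairs a rule with a full natural language processor such as MetaMap):

        # Toy detection rule for pneumonia-and-influenza (P&I) cases in
        # death-certificate cause-of-death text; terms and records are illustrative.
        import re

        PI_PATTERN = re.compile(r"\b(pneumonia|influenza|flu)\b", re.IGNORECASE)

        def is_pi_case(cause_text: str) -> bool:
            """Flag a certificate whose cause-of-death text mentions P&I terms."""
            return bool(PI_PATTERN.search(cause_text))

        certificates = [
            "Acute respiratory failure due to influenza A",
            "Metastatic colon cancer",
            "Aspiration pneumonia; Parkinson disease",
        ]
        pi_cases = [c for c in certificates if is_pi_case(c)]
        print(len(pi_cases), "of", len(certificates), "flagged as P&I")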

    Modeling causes of death: an integrated approach using CODEm

    Background: Data on causes of death by age and sex are a critical input into health decision-making. Priority setting in public health should be informed not only by the current magnitude of health problems but by trends in them. However, cause of death data are often not available or are subject to substantial problems of comparability. We propose five general principles for cause of death model development, validation, and reporting.
    Methods: We detail a specific implementation of these principles that is embodied in an analytical tool - the Cause of Death Ensemble model (CODEm) - which explores a large variety of possible models to estimate trends in causes of death. Possible models are identified using a covariate selection algorithm that yields many plausible combinations of covariates, which are then run through four model classes. The model classes include mixed effects linear models and spatial-temporal Gaussian Process Regression models for cause fractions and death rates. All models for each cause of death are then assessed using out-of-sample predictive validity and combined into an ensemble with optimal out-of-sample predictive performance.
    Results: Ensemble models for cause of death estimation outperform any single component model in tests of root mean square error, frequency of predicting correct temporal trends, and achieving 95% coverage of the prediction interval. We present detailed results for CODEm applied to maternal mortality and summary results for several other causes of death, including cardiovascular disease and several cancers.
    Conclusions: CODEm produces better estimates of cause of death trends than previous methods and is less susceptible to bias in model specification. We demonstrate the utility of CODEm for the estimation of several major causes of death.
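
    A stripped-down sketch of the ensemble step (not CODEm itself; models and data are invented): score each component model's out-of-sample error on held-out data and weight the components so that better predictors dominate the combined estimate.

        # Minimal ensemble-by-out-of-sample-error sketch; models and data synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 120)
        y = 2.0 * x + 5 + rng.normal(0, 2, x.size)
        train, test = slice(0, 90), slice(90, None)   # hold out the last 30 points

        def linear_model(xt): return np.polyval(np.polyfit(x[train], y[train], 1), xt)
        def cubic_model(xt):  return np.polyval(np.polyfit(x[train], y[train], 3), xt)
        def mean_model(xt):   return np.full_like(xt, y[train].mean())

        models = [linear_model, cubic_model, mean_model]
        rmse = np.array([np.sqrt(np.mean((m(x[test]) - y[test]) ** 2)) for m in models])
        weights = (1 / rmse) / (1 / rmse).sum()   # better out-of-sample fit -> more weight
        ensemble = sum(w * m(x) for w, m in zip(weights, models))
        print(dict(zip(["linear", "cubic", "mean"], weights.round(3))))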

    Quantum walk speedup of backtracking algorithms

    We describe a general method to obtain quantum speedups of classical algorithms which are based on the technique of backtracking, a standard approach for solving constraint satisfaction problems (CSPs). Backtracking algorithms explore a tree whose vertices are partial solutions to a CSP in an attempt to find a complete solution. Assume there is a classical backtracking algorithm which finds a solution to a CSP on n variables, or outputs that none exists, and whose corresponding tree contains T vertices, each vertex corresponding to a test of a partial solution. Then we show that there is a bounded-error quantum algorithm which completes the same task using O(sqrt(T) n^(3/2) log n) tests. In particular, this quantum algorithm can be used to speed up the DPLL algorithm, which is the basis of many of the most efficient SAT solvers used in practice. The quantum algorithm is based on the use of a quantum walk algorithm of Belovs to search in the backtracking tree. We also discuss how, for certain distributions on the inputs, the algorithm can lead to an exponential reduction in expected runtime.
    Comment: 23 pages; v2: minor changes to presentation
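
    For reference, a classical backtracking skeleton of the kind the result applies to (the toy constraint is invented): the tree it explores has T vertices, one consistency test per vertex, which the quantum walk search reduces to roughly O(sqrt(T) n^(3/2) log n) tests.

        # Classical backtracking for a toy CSP; the search tree has T vertices,
        # each costing one test of a partial solution.
        def backtrack(assignment, n, consistent):
            """Depth-first search over partial 0/1 assignments."""
            if not consistent(assignment):       # one 'test' per tree vertex
                return None
            if len(assignment) == n:
                return assignment
            for value in (0, 1):
                result = backtrack(assignment + [value], n, consistent)
                if result is not None:
                    return result
            return None

        # Toy constraint: no two adjacent variables may both be 1.
        def consistent(a):
            return all(not (a[i] and a[i + 1]) for i in range(len(a) - 1))

        print(backtrack([], 6, consistent))      # e.g. [0, 0, 0, 0, 0, 0]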

    COMPARATIVE STUDIES ON KEY INDICATORS USED IN PERFORMANCE MEASUREMENT SYSTEM OF POLYTECHNICS’ ACADEMIC STAFF

    The Polytechnic Transformation Plan was launched to reinforce the role of polytechnics and technical education in Malaysia. The third thrust of the Plan puts forth the need to equip polytechnics’ teaching personnel and support staff with high skills and competency (MoHE, 2009). As a result, the performance of teaching personnel needs to be evaluated to ensure the efficiency and effectiveness of teaching personnel in polytechnics, and thus it is crucial to ascertain the key indicators used. Based on the literature review, the tentative key indicators identified include teaching and supervision, research and innovation, administrative tasks, professional activities and services to the community. These key indicators are tested in the polytechnic context on a comparative basis between the Northern and Central Regions of Malaysia. The researchers employed a hybrid/mixed method as the research approach for this study because the method elaborates or develops the analysis by providing richer detail, and initiates new lines of thinking through attention to surprises and fresh insights. Amongst the six strategies introduced by Creswell (2003), the concurrent embedded strategy is implemented to empirically test the research objective. The purpose of this strategy is to use quantitative data and results to assist in the interpretation of qualitative findings through triangulation. The researchers interviewed the Directors and/or Deputy Directors/Heads of Department of the polytechnics on a face-to-face, semi-structured basis. In addition, the questionnaires developed are distributed to the academic staff of the polytechnics to gather their perspectives on the key indicators of the academic Performance Measurement System. The data collected via interviews are transcribed and translated into English for the data analysis process using thematic coding. Besides that, the quantitative data are described and analysed using the Statistical Package for the Social Sciences (SPSS) as a tool.
    Keywords: Academic Staff, Performance Measurement System, Polytechnic, Key Indicators