27 research outputs found

    Eleventh International Conference on the Bearing Capacity of Roads, Railways and Airfields

    Innovations in Road, Railway and Airfield Bearing Capacity – Volume 3 comprises the third part of contributions to the 11th International Conference on the Bearing Capacity of Roads, Railways and Airfields (2022). In anticipation of the event, it presents state-of-the-art information and research on the latest policies, traffic loading measurements, in-situ measurements and condition surveys, functional testing, deflection measurement evaluation, structural performance prediction for pavements and tracks, new construction and rehabilitation design systems, frost-affected areas, drainage and environmental effects, reinforcement, traditional and recycled materials, full-scale testing, and case histories of roads, railways and airfields. This edited work is intended for a global audience of road, railway and airfield engineers, researchers and consultants, as well as building and maintenance companies looking to further upgrade their practices in the field.

    A Semantic-Based Approach to Attain Reproducibility of Computational Environments in Scientific Workflows: A Case Study

    Abstract. Reproducible research in scientific workflows is often addressed by tracking the provenance of the produced results. While this approach allows inspecting intermediate and final results, improves understanding, and permits replaying a workflow execution, it does not ensure that the computational environment is available for subsequent executions to reproduce the experiment. In this work, we propose describing the resources involved in the execution of an experiment using a set of semantic vocabularies, so as to conserve the computational environment. We define a process for documenting the workflow application, management system, and their dependencies based on four domain ontologies. We then conduct an experimental evaluation using a real workflow application on an academic and a public Cloud platform. Results show that our approach can reproduce an equivalent execution environment of a predefined virtual machine image on both computing platforms.
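
    The four domain ontologies used in the paper are not reproduced in the abstract; the snippet below is only a minimal sketch of the general idea, recording an execution environment as RDF triples with rdflib under a hypothetical namespace with hypothetical class and property names.

        # Minimal sketch (not the paper's vocabularies): describe an execution
        # environment as RDF triples so it can be archived with the workflow's
        # provenance and later used to rebuild an equivalent machine.
        from rdflib import Graph, Literal, Namespace, RDF

        ENV = Namespace("http://example.org/computational-environment#")  # hypothetical

        g = Graph()
        g.bind("env", ENV)

        # Hypothetical description of one virtual machine and one installed package.
        vm = ENV["vm-workflow-worker"]
        g.add((vm, RDF.type, ENV.VirtualMachine))
        g.add((vm, ENV.operatingSystem, Literal("Ubuntu 20.04")))
        g.add((vm, ENV.vCPUs, Literal(4)))

        pkg = ENV["pkg-blast"]
        g.add((pkg, RDF.type, ENV.SoftwarePackage))
        g.add((pkg, ENV.version, Literal("2.13.0")))
        g.add((vm, ENV.hasInstalled, pkg))

        # Serialise the environment description alongside the workflow results.
        print(g.serialize(format="turtle"))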

    State of the art of cyber-physical systems security: An automatic control perspective

    Cyber-physical systems are integrations of computation, networking, and physical processes. Due to the tight cyber-physical coupling and to the potentially disruptive consequences of failures, security here is one of the primary concerns. Our systematic mapping study sheds light on how security is actually addressed when dealing with cyber-physical systems from an automatic control perspective. The resulting map of 138 selected studies is defined empirically and is based on, for instance, application fields, various system components, related algorithms and models, attack characteristics and defense strategies. It provides a powerful comparison framework for existing and future research on this topic, which is important for both industry and academia.

    Recent advances in low-cost particulate matter sensor: calibration and application

    Particulate matter (PM) has been monitored routinely due to its negative effects on human health and atmospheric visibility. Standard gravimetric measurements and current commercial instruments for field measurements are still expensive and laborious. The high cost of conventional instruments typically limits the number of monitoring sites, which in turn undermines the accuracy of real-time mapping of sources and hotspots of air pollutants because of insufficient spatial resolution. The new trends in PM concentration measurement are personalized portable devices for individual customers and the networking of large numbers of sensors to meet the demands of Big Data. Low-cost PM sensors have therefore been studied extensively due to their price advantage and compact size, and they are considered a good supplement to current monitoring sites for high spatial-temporal PM mapping. A major concern, however, is the accuracy of these low-cost PM sensors. Multiple types of low-cost PM sensors and monitors were calibrated against reference instruments. All of these units demonstrated high linearity against the reference instruments, with high R2 values for different types of aerosols over a wide range of concentration levels. The question of whether low-cost PM monitors can be considered a substitute for conventional instruments is discussed, together with how to qualitatively describe the improvement in data quality due to calibration. A limitation of these sensors and monitors is that their outputs depend strongly on particle composition and size, resulting in differences of up to a factor of ten in sensor output. Optical characterization of low-cost PM sensors (ensemble measurement) was conducted by combining experimental results with Mie scattering theory, and the reasons for their dependence on PM composition and size distribution were studied. To improve the accuracy of mass concentration estimation, an expression for K as a function of the geometric mean diameter, geometric standard deviation, and refractive index is proposed. To remove the influence of the refractive index, we propose a new design of a multi-wavelength sensor with a robust data-inversion routine to estimate the PM size distribution and refractive index simultaneously. The utility of the networked system with improved sensitivity was demonstrated by deploying it in a woodworking shop. Data collected by the networked system were used to construct spatiotemporal PM concentration distributions using an ordinary Kriging method and an Artificial Neural Network model, in order to elucidate particle generation and ventilation processes. Furthermore, for the outdoor environment, data reported by low-cost sensors were compared against satellite data; the remote sensing data could provide a daily calibration of these low-cost sensors, while the low-cost PM sensors could in turn provide better accuracy for characterising the microenvironment.
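
    The abstract does not reproduce the proposed expression for K. Purely as illustrative background (assuming spherical particles of density rho_p with a lognormal number size distribution, an assumption not taken from the thesis), the standard moment relation below shows why any scattering-to-mass conversion factor must depend on the geometric mean diameter and geometric standard deviation, with the refractive index entering through the optical response:

        M \;=\; \frac{\pi}{6}\,\rho_p\,N\,d_g^{3}\,\exp\!\left(\tfrac{9}{2}\ln^{2}\sigma_g\right),
        \qquad
        M \;=\; K(d_g,\sigma_g,m)\,S,

    where N is the total number concentration, S is the sensor's scattering-based output, and K is the conversion factor that the thesis parameterises in terms of the geometric mean diameter d_g, the geometric standard deviation sigma_g and the refractive index m.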

    Automatic deployment and reproducibility of workflow on the Cloud using container virtualization

    PhD Thesis. Cloud computing is a service-oriented approach to distributed computing that has many attractive features, including on-demand access to large compute resources. One type of cloud application is the scientific workflow, which plays an increasingly important role in building applications from heterogeneous components. Workflows are increasingly used in science as a means to capture, share, and publish computational analyses. Clouds can offer a number of benefits to workflow systems, including the dynamic provisioning of the resources needed for computation and storage, which has the potential to dramatically increase the ability to quickly extract new results from the huge amounts of data now being collected. However, there is an increasing number of Cloud computing platforms, each with different functionality and interfaces. It therefore becomes increasingly challenging to define workflows in a portable way so that they can be run reliably on different clouds. As a consequence, workflow developers face the problem of deciding which Cloud to select and, more importantly for the long term, how to avoid vendor lock-in. A further issue that has arisen with workflows is that it is common for them to stop being executable a relatively short time after they were created. This can be due to the external resources required to execute a workflow, such as data and services, becoming unavailable. It can also be caused by changes in the execution environment on which the workflow depends, such as changes to a library causing an error when a workflow service is executed. This "workflow decay" issue is recognised as an impediment to the reuse of workflows and the reproducibility of their results. It is becoming a major problem, as the reproducibility of science is increasingly dependent on the reproducibility of scientific workflows. In this thesis we present new solutions to address these challenges. We propose a new approach to workflow modelling that offers a portable and re-usable description of the workflow using the TOSCA specification language. Our approach addresses portability by allowing workflow components to be systematically specified and automatically deployed on a range of clouds, or in local computing environments, using container virtualisation techniques. To address the issues of reproducibility and workflow decay, our modelling and deployment approach has also been integrated with source control and container management techniques to create a new framework that efficiently supports dynamic workflow deployment, (re-)execution and reproducibility. To improve deployment performance, we extend the framework with a number of new optimisation techniques and evaluate their effect on a range of real and synthetic workflows. Ministry of Higher Education and Scientific Research in Iraq and Mosul University.
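
    The thesis framework itself (TOSCA models plus source control and container management) is not shown in the abstract; the snippet below is only a minimal sketch of the container-virtualisation idea it builds on, using the Docker SDK for Python to run one workflow step inside a pinned image so that the same execution environment can be recreated on another cloud or a local machine. The image tag and command are placeholders, not taken from the thesis.

        # Minimal sketch (not the thesis framework): run one workflow step inside a
        # pinned container image so the execution environment is reproducible.
        import docker

        client = docker.from_env()

        # Pinning an exact image tag (a placeholder here) is what lets re-execution
        # on another cloud or a local machine yield the same environment.
        IMAGE = "python:3.11.9-slim"
        client.images.pull(IMAGE)

        output = client.containers.run(
            IMAGE,
            ["python", "-c", "print('workflow step completed')"],
            remove=True,  # clean up the container after the step finishes
        )
        print(output.decode())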

    A data governance maturity evaluation model to enhance data management in Eastern Cape government departments

    The governance of data assets has become a topical issue in the public sector. Government departments are faced with increasingly complex data and information arising from multiple projects, different departments and divisions, and several stakeholders seeking data for divergent end uses. However, an exploratory study of the literature regarding data governance in government departments of the Eastern Cape province of South Africa suggests that there are no clear data governance processes in place within the departments. The research question "How can a data governance maturity evaluation model enhance data governance processes in the Eastern Cape government departments?" was derived from a perceived need for the province's government departments to manage their critical data assets in a manner that promotes accurate, verifiable and relevant fiscal and strategic planning. Following a review of current literature in the data governance domain, a conceptual data governance maturity evaluation model was developed. The conceptual model was influenced by the IBM data governance maturity model (2007) and was aimed at addressing gaps in that reference model to suit the context of the Eastern Cape government departments and the governance of their data assets. A qualitative phase of empirical data collection was conducted to test the components of the conceptual model. A quantitative instrument, derived from the findings of the qualitative study and from the components of the refined model, was administered to 50 participants in the same departments where the qualitative data was collected, with additional participants drawn from three other departments. Pragmatism was the guiding philosophy for the research, and the Contingency and Institutional theories form the theoretical grounding for the study. Design Science guidelines by Hevner et al. (2004), Peffers et al.'s (2008) Six Steps in Design Science and Drechsler & Hevner's (2016) Fourth Cycle of Design Science were employed to construct, improve, validate and evaluate the final artefact. The findings confirmed the literature in that data governance is lacking in government departments. It is asserted that the implementation of this model will improve the way data assets are recorded, used, archived and disposed of in government departments of the Eastern Cape. The outcome of this research was a data governance maturity evaluation model, together with a process document that gives a roadmap for moving from one maturity level to the next.

    A framework to manage sensitive information during its migration between software platforms

    Software migrations are mostly performed by organisations using migration teams. Such migration teams need to be aware of how sensitive information ought to be handled and protected during the implementation of migration projects, and there is a need to ensure that sensitive information is identified, classified and protected during the migration process. This thesis suggests how sensitive information in organisations can be handled and protected during migrations, using the migration from proprietary software to open source software as the basis for developing a management framework that can be used to manage such a migration process. A rudimentary management framework on information sensitivity during software migrations and a model of the security challenges during open source migrations are used to propose a preliminary management framework through a sequential explanatory mixed-methods case study. The preliminary management framework resulting from the quantitative data analysis is enhanced and validated to conceptualise the final management framework on information sensitivity during software migrations at the end of the qualitative data analysis. The final management framework is validated and found to be significant, valid and reliable using statistical techniques such as Exploratory Factor Analysis, reliability analysis and multivariate analysis, as well as a qualitative coding process. Information Science. D. Litt. et Phil. (Information Systems).
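
    The survey instrument and its items are not given in the abstract; purely as an illustrative sketch with synthetic data and hypothetical items, an exploratory factor analysis and a Cronbach's alpha reliability check for a 50-respondent instrument could be computed as follows with scikit-learn and NumPy.

        # Illustrative sketch only: EFA and Cronbach's alpha on synthetic survey data.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        # 50 respondents x 6 hypothetical Likert-scale items (values 1-5); synthetic
        # data, so the resulting numbers only illustrate the computation.
        responses = rng.integers(1, 6, size=(50, 6)).astype(float)

        # Exploratory factor analysis with two latent factors.
        fa = FactorAnalysis(n_components=2, random_state=0).fit(responses)
        print("Factor loadings:\n", fa.components_.T.round(2))

        # Cronbach's alpha for internal-consistency reliability of the item set.
        k = responses.shape[1]
        item_vars = responses.var(axis=0, ddof=1).sum()
        total_var = responses.sum(axis=1).var(ddof=1)
        alpha = k / (k - 1) * (1 - item_vars / total_var)
        print("Cronbach's alpha:", round(alpha, 2))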