7,207 research outputs found

    Characterisation of particulate matter originating from automotive occupant restraints

    Get PDF
    In 2012, in the UK, it was estimated that 6.5 million airbags required deployment at end of life. This process poses hazards such as occupational respiratory exposure to the solid particulate matter (PM) effluents produced during the production of inflation gases for airbag deployment. To date, methods for assessing effluent exposure have focused on vehicle occupants rather than occupational exposures and have mainly centred on direct, static measurement of particle mass. This research programme evaluated existing methods of assessment and defined novel methods for more comprehensive characterisation. The methods were employed to characterise sub-micron PM effluents from driver airbags using non-azide, solid propellant and hybrid inflators. Testing was undertaken using a differential mobility spectrometer (DMS), gravimetric filtration, high-speed photography and electron microscopy. A comparison of an effluent test tank and a vehicle of comparable volume showed that the tank was able to replicate a vehicle environment and provide measurements with acceptable levels of inter-test variability for test durations of >900 s. Characterisation of particle geometric mean diameter (GMD) and number concentration for airbag effluents showed that the dominant particles were below 150 nm in size, with smaller particles being emitted by hybrid airbags. Particle concentrations were also lower for hybrid airbags. Assessment of transient behaviour showed that, as time elapsed, concentration reduced whilst particle mean size increased. These data allowed identification of the propellants used in airbags, and a mathematical model was defined to describe the effluent characteristics for each propellant employed. The particle sizes measured by DMS compared well with those obtained from TEM images, which identified generally spherical particles, commonly accumulated as agglomerates. TEM also identified large concentrations of particles below the lower measurement range of the DMS, 1 μm. This research has provided verification of existing test methodologies and allowed a more comprehensive assessment of airbag effluents than previously presented in the literature.
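    The geometric mean diameter (GMD) and number concentration metrics mentioned above can be illustrated with a short, self-contained sketch. The Python snippet below is not taken from the thesis: the bin diameters and concentrations are hypothetical stand-ins for a binned DMS spectrum, and it simply shows how a number-weighted GMD, geometric standard deviation, and total number concentration are conventionally computed.

```python
import numpy as np

# Hypothetical DMS-style spectrum: bin midpoint diameters (nm) and
# number concentrations (particles/cm^3) per bin -- illustrative values only.
diameters_nm = np.array([10, 20, 40, 80, 150, 300])
number_conc = np.array([1.2e6, 3.5e6, 5.0e6, 2.1e6, 4.0e5, 5.0e4])

# Total number concentration is the sum over bins.
total_conc = number_conc.sum()

# Geometric mean diameter (GMD): exponential of the number-weighted mean of ln(d).
gmd_nm = np.exp(np.sum(number_conc * np.log(diameters_nm)) / total_conc)

# Geometric standard deviation (GSD), often reported alongside the GMD.
gsd = np.exp(np.sqrt(
    np.sum(number_conc * (np.log(diameters_nm) - np.log(gmd_nm)) ** 2) / total_conc
))

print(f"N_total = {total_conc:.2e} cm^-3, GMD = {gmd_nm:.1f} nm, GSD = {gsd:.2f}")
```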

    Grand Challenges of Traceability: The Next Ten Years

    Full text link
    In 2007, the software and systems traceability community met at the first Natural Bridge symposium on the Grand Challenges of Traceability to establish and address research goals for achieving effective, trustworthy, and ubiquitous traceability. Ten years later, in 2017, the community came together to evaluate a decade of progress towards achieving these goals. These proceedings document some of that progress. They include a series of short position papers, representing current work in the community organized across four process axes of traceability practice. The sessions covered topics including Trace Strategizing, Trace Link Creation and Evolution, Trace Link Usage, real-world applications of Traceability, and Traceability Datasets and benchmarks. Two breakout groups focused on the importance of creating and sharing traceability datasets within the research community, and discussed challenges related to the adoption of tracing techniques in industrial practice. Members of the research community are engaged in many active, ongoing, and impactful research projects. Our hope is that ten years from now we will be able to look back at a productive decade of research and claim that we have achieved the overarching Grand Challenge of Traceability, which seeks for traceability to be always present, built into the engineering process, and for it to have "effectively disappeared without a trace". We hope that others will see the potential that traceability has for empowering software and systems engineers to develop higher-quality products at increasing levels of complexity and scale, and that they will join the active community of software and systems traceability researchers as we move forward into the next decade of research.

    Gateway Electromagnetic Environmental Effects (E3) Requirements

    Get PDF
    Electromagnetic Compatibility (EMC) is essential to the success of any vehicle design that incorporates a complex assortment of electronic, electrical, and electromechanical systems and sub-systems that are expected to meet operational and performance requirements while exposed to a changing set of electromagnetic environments composed of both man-made and naturally occurring threats. The combined aspects of these environments are known as Electromagnetic Environmental Effects (E3). The attainment of EMC is accomplished through the application of sound engineering principles and practices that enable a complex vehicle or vehicles to operate successfully when exposed to the effects of their expected and/or specified electromagnetic environments.

    Optimizing Human Performance to Enhance Safety: A Case Study in an Automotive Plant

    Get PDF
    Human factors play a relevant role in the dynamic work environments of the manufacturing sector in terms of production efficiency, safety, and sustainable performance. This is particularly relevant in assembly lines where humans are widely employed alongside automated and robotic agents. In this situation, operators’ ability to adapt to different levels of task complexity and variability in each workstation has a strong impact on the safety, reliability, and efficiency of the overall production process. This paper presents an application of a theoretical and empirical method used to assess the matching of different workers to various workstations based on a quantified comparison between the workload associated with the tasks and the capability of the workers who can rotate among them. The approach allowed for the development of an algorithm designed to operationalise indicators for workload and task complexity requirements, considering the skills and capabilities of individual operators. This led to the creation of human performance (HP) indices. The HP indices were utilized to ensure a good match between requirements and capabilities, aiming to minimise the probability of human error and injuries. The developed and customised model demonstrated encouraging results in the specific case studies where it was applied, but also offers a generalizable approach that can extend to other contexts and situations where job rotations can benefit from effectively matching operators to suitable task requirements.
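    As a rough illustration of the kind of matching the abstract describes, the sketch below assumes a simplified human performance (HP) index defined as the ratio of an operator's capability score to a workstation's workload demand, and exhaustively searches rotations to maximise the worst-case match. The data, the hp_index definition, and the best_rotation helper are hypothetical; the paper's actual indicators and algorithm are not reproduced here.

```python
from itertools import permutations

# Hypothetical data: workload demand per workstation and capability per operator,
# both on the same normalised 0-1 scale (the paper's actual indicators differ).
workload = {"WS1": 0.80, "WS2": 0.55, "WS3": 0.65}
capability = {"Anna": 0.90, "Ben": 0.60, "Carla": 0.70}

def hp_index(cap, load):
    # Illustrative HP index: capability margin over workload demand.
    # Values below 1.0 flag a potential requirement/capability mismatch.
    return cap / load

def best_rotation(workload, capability):
    stations = list(workload)
    best, best_score = None, float("-inf")
    for order in permutations(capability):  # every operator-to-station assignment
        pairs = list(zip(order, stations))
        # Score a rotation by its weakest match so no single operator is overloaded.
        score = min(hp_index(capability[op], workload[ws]) for op, ws in pairs)
        if score > best_score:
            best, best_score = pairs, score
    return best, best_score

assignment, worst_hp = best_rotation(workload, capability)
print(assignment, f"worst-case HP index = {worst_hp:.2f}")
```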

    License to Supervise: Influence of Driving Automation on Driver Licensing

    Get PDF
    Using highly automated vehicles while the driver remains responsible for safe driving places new and demanding requirements on the human operator. This is because the automation creates a gap between the driver's responsibility and the human capability to take that responsibility, especially for unexpected or time-critical transitions of control. This gap is not addressed by current practices of driver licensing. Based on a literature review, this research collects drivers' requirements to enable safe transitions in control attuned to human capabilities. This knowledge is intended to help system developers and authorities identify the requirements on human operators to (re)take responsibility for safe driving after automation.

    New Directions in Compensation Research: Synergies, Risk, and Survival

    Get PDF
    We describe and use two theoretical frameworks, the resource-based view of the firm and institutional theory, as lenses for examining three promising areas of compensation research. First, we examine the nature of the relationship between pay and effectiveness. Does pay typically have a main effect or, instead, does the relationship depend on other human resource activities and organization characteristics? If the latter is true, then there are synergies between pay and these other factors, and thus conclusions drawn from main-effects models may be misleading. Second, we discuss a relatively neglected issue in pay research, the concept of risk as it applies to investments in pay programs. Although firms and researchers tend to focus on expected returns from compensation interventions, analysis of the risk, or variability, associated with these returns may be essential for effective decision-making. Finally, pay program survival, which has been virtually ignored in systematic pay research, is investigated. Survival appears to have important consequences for estimating pay plan risk and returns, and is also integral to the discussion of pay synergies. Based upon our two theoretical frameworks, we suggest specific research directions for pay program synergies, risk, and survival.

    Assessing the Human Factor in Truck Driving

    Get PDF
    Human factors assessment techniques are commonly applied to a variety of workplaces to examine the nature of operations and how key functions are controlled operationally; however, these tools appear to overlook key aspects of truck driving, particularly the driver's relationship to the driving experience. The fundamental issue is that completely decomposing truck driving and accurately documenting the truck driver's working environment is problematic. Demonstrating how a truck driver moves between each series of sub-tasks will therefore require a purpose-built assessment tool that is both practical and relevant to truck driving.

    Equity in Microscale Urban Design and Walkability: A Photographic Survey of Six Pittsburgh Streetscapes

    Get PDF
    This paper explores inequity in neighborhood walkability at the micro-scale level by qualitatively examining six streetscapes in Pittsburgh, Pennsylvania. A photographic survey is used to highlight differences in the quality and design of the built environment among pairs of streetscapes with high or low social vulnerability but approximately equal quantitative Walk Scores®. The survey revealed discernible differences in the quality and maintenance of the built environment between streetscapes in more and less disadvantaged neighborhoods. This was true of several characteristics expected to affect walkability, including enclosure, transparency, complexity, and tidiness. Streetscapes in neighborhoods with high social vulnerability exhibited less contiguous street walls, fewer windows and less transparent storefronts, less well-maintained infrastructure, fewer street cafés, and overall less complexity than those in neighborhoods with low social vulnerability. Implications for planning and policy are discussed.

    Development and certification of mixed-criticality embedded systems based on probabilistic timing analysis

    Get PDF
    An increasing variety of emerging systems relentlessly replaces or augments the functionality of mechanical subsystems with embedded electronics. Given their quantity, complexity, and use, the safety of such subsystems is an increasingly important matter. Accordingly, those systems are subject to safety certification to demonstrate the system's safety through rigorous development processes and hardware/software constraints. The massive increase in embedded processors' complexity renders the arduous certification task significantly harder to achieve. The focus of this thesis is to address the certification challenges in multicore architectures: despite their potential to integrate several applications on a single platform, their inherent complexity imperils their timing predictability and certification. Recently, the Measurement-Based Probabilistic Timing Analysis (MBPTA) technique emerged as an alternative means of dealing with hardware/software complexity. The innovation that MBPTA brings about is, however, a major departure from current certification procedures and standards. The particular contributions of this thesis include: (i) the definition of certification arguments for mixed-criticality integration upon multicore processors; in particular, we propose a set of safety mechanisms and procedures as required to comply with functional safety standards. For timing predictability, (ii) we present a quantitative approach to assess the likelihood of execution-time exceedance events with respect to the risk-reduction requirements of safety standards. To this end, we build upon the MBPTA approach and present the design of a safety-related source of randomization (SoR) that plays a key role in the platform-level randomization needed by MBPTA. Finally, (iii) we evaluate current certification guidance with respect to emerging high-performance design trends such as caches. Overall, this thesis pushes the certification limits in the use of multicore and MBPTA technology in Critical Real-Time Embedded Systems (CRTES) and paves the way towards their adoption in industry.

    An increasing variety of emerging systems replaces or augments the functionality of mechanical subsystems with embedded electronic components. The growth in the quantity and complexity of these electronic subsystems, as well as in their role, makes their safety a matter of increasing importance, so much so that the commercialisation of these critical systems is subject to rigorous certification processes in which the system's safety is guaranteed through strict constraints on the development and design of its hardware and software. This thesis addresses the new challenges and difficulties introduced by multicore processors in such critical systems: although their higher performance attracts industry interest in integrating multiple applications on a single platform, they entail greater complexity. Their architecture defies timing analysis by traditional methods and, likewise, their certification is increasingly complex and costly. To deal with these limitations, a novel Measurement-Based Probabilistic Timing Analysis (MBPTA) technique has recently been developed. The innovation of this technique, however, implies a major cultural change with respect to traditional certification standards and procedures. Along these lines, the contributions of this thesis are grouped around three main axes: (i) the definition of safety arguments for the certification of mixed-criticality applications on multicore platforms; in particular, safety mechanisms and fault diagnosis and reaction techniques compliant with the IEC 61508 standard are defined on a reference multicore architecture. Regarding timing analysis, (ii) we present the quantification of the probability of exceeding a timing bound and its relation to the risk-reduction requirements derived from functional safety standards; to this end, we build on the MBPTA technique and present the design of a safe random number source, a key component for achieving the randomisation properties required by MBPTA at the platform level. Finally, (iii) we extrapolate the current guidance for the certification of multicore architectures to an 8-core commercial solution and evaluate it against emerging high-performance design trends (caches). With these contributions, this thesis addresses the challenges that the use of multicore processors and MBPTA imply for the certification process of critical real-time systems and thereby facilitates their adoption by industry.
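    The exceedance-probability quantification mentioned in contribution (ii) is commonly illustrated with an extreme-value fit over measured execution times. The sketch below is a simplified, generic MBPTA-style example rather than the thesis's own method: the measurements are synthetic, the block size and candidate WCET bound are arbitrary, and a Gumbel distribution fitted with SciPy stands in for the full analysis (which also requires i.i.d. and representativeness checks on the measurements).

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# Hypothetical end-to-end execution-time measurements (microseconds) collected on a
# randomised platform; real MBPTA additionally validates the measurement protocol.
measurements = 1000 + rng.gamma(shape=9.0, scale=12.0, size=5000)

# Block maxima: keep the maximum of each block of runs, then fit a Gumbel (EVT) model.
block = 50
maxima = measurements[: len(measurements) // block * block].reshape(-1, block).max(axis=1)
loc, scale = gumbel_r.fit(maxima)

# Probability that a block-maximum execution time exceeds a candidate WCET bound,
# to be compared against the risk-reduction target derived from the safety standard.
candidate_wcet = 1400.0
p_exceed = gumbel_r.sf(candidate_wcet, loc=loc, scale=scale)
print(f"P(exec time > {candidate_wcet} us) ≈ {p_exceed:.2e}")
```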