9,570 research outputs found

    Large-Eddy Simulation of Microvortex Generators in a Turbulent Boundary Layer

    Get PDF
    The present study investigates the flow physics of MicroVortex Generators (MVGs) in order to improve their performance in turbulent boundary layers (TBLs). TBLs can be a challenging environment for MVGs because of the streamwise length of the shed vortex and its increased parasitic drag. Large-Eddy Simulation (LES) is used to properly resolve the turbulent boundary layer of a flat plate with a zero pressure gradient and an MVG vane. Three vane types are investigated (e423-Mod, triangular and rectangular) in a single-vane configuration. Important flow features were observed, such as a separation bubble on the leading edge of the rectangular vanes, which introduced unsteadiness into the vortex formation and degraded the MVG's efficiency. The e423-Mod and triangular vanes were observed to be more aerodynamically efficient. The triangular vane was found to be the most efficient when evaluated immediately downstream of the vane. However, the vortex from the triangular vane decayed very rapidly because it formed very close to the wall, which degraded its efficiency further downstream. The e423-Mod vane avoided this problem, but its drag was high relative to the strength of the generated vortex, and its vortex experienced a brief period of rapid decay immediately downstream, decreasing its efficiency. Further downstream, the vortex of the rectangular vane at 16° became the most efficient through a combination of low vane drag and low vortex decay in the TBL, demonstrating the need to consider a range of issues when designing an MVG.
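
    The abstract repeatedly weighs vortex strength against vane drag when ranking the designs. As a minimal sketch of that trade-off, the hypothetical figure of merit below divides vortex circulation by vane drag; the metric, the function name and all numbers are illustrative assumptions, not values from the study.

        def mvg_efficiency(circulation, vane_drag):
            # Hypothetical figure of merit: vortex strength per unit parasitic
            # drag. The paper's exact efficiency metric is not stated in the abstract.
            return circulation / vane_drag

        # Illustrative numbers only (not from the study):
        for name, gamma, drag in [("triangular", 0.12, 0.008),
                                  ("e423-Mod", 0.15, 0.014),
                                  ("rectangular 16 deg", 0.11, 0.007)]:
            print(f"{name}: {mvg_efficiency(gamma, drag):.1f}")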

    Automated hippocampal segmentation in patients with epilepsy: Available free online

    Get PDF
    Hippocampal sclerosis, a common cause of refractory focal epilepsy, requires hippocampal volumetry for accurate diagnosis and surgical planning. Manual segmentation is time-consuming and subject to interrater/intrarater variability. Automated algorithms perform poorly in patients with temporal lobe epilepsy. We validate a novel automated method and make it freely available online.

    Natural disaster preparation and response: a guide for state housing authorities

    Get PDF
    A natural disaster is a rapid onset event that threatens or causes death, injury or damage to property or the environment, requiring a coordinated multi-agency and community response. The most costly and significant impacts of natural disasters and other environmental emergencies are on buildings. Damage to or total loss of residential dwellings and social infrastructure especially accentuates hardship, homelessness, displacement and psychological trauma. For this reason, State Housing Authorities (SHAs) are among the key stakeholders with significant roles in disaster management. The overall aim of the project is to provide guidance for SHAs and to assist them in preparing for and responding to natural disasters and other environmental emergencies.

    A balance of trust in the use of government administrative data

    Get PDF
    Government departments and agencies around the world routinely collect administrative data produced by citizen interaction with the state. The UK government increasingly frames data as an ‘asset’. The potential in administrative data can be exploited by sharing and linking across datasets, but when the rhetoric of the benefits of data sharing is bound up in commercial exploitation, trustworthy motivations for sharing data come into question. Such questions are framed around two apparently conflicting public goods. The public good in re-using data to increase government efficiency and to enhance research is set against the public good in protecting privacy. Privacy is a collective as well as an individual benefit, enabling the public to participate confidently in citizen-state interactions. Balancing these public goods is challenging given rapidly evolving technology and data science. The analysis presented here draws on research undertaken by the authors as part of the Administrative Data Research Centre in England. Between 2014 and 2017, four case studies were conducted on government administrative data across education, transport, energy and health. The purpose of the research was to examine stakeholder perspectives in relation to administrative data sharing and re-use. The themes of trust, risk and consent were chosen to articulate the research questions and analysis: this article focuses on the findings related to trust. It explores the notion of trust in the collection, analysis, linkage and re-use of routinely collected government administrative data in England. It seeks to demonstrate that securing public trust in data initiatives is dependent on a broader balance of trust between a network of actors involved in data sharing and use

    A Computational Comparison of Optimization Methods for the Golomb Ruler Problem

    Full text link
    The Golomb ruler problem is defined as follows: given a positive integer n, locate n marks on a ruler such that the distances between all distinct pairs of marks are different from each other and the total length of the ruler is minimized. The Golomb ruler problem has applications in information theory, astronomy and communications, and it can be seen as a challenge for combinatorial optimization algorithms. Although constructing high-quality rulers is well studied, proving optimality is a far more challenging task. In this paper, we provide a computational comparison of different optimization paradigms, each using a different model (linear integer, constraint programming and quadratic integer), to certify that a given Golomb ruler is optimal. We propose several enhancements to improve the computational performance of each method by exploring bound tightening, valid inequalities, cutting planes and branching strategies. We conclude that a certain quadratic integer programming model solved through a Benders decomposition and strengthened by two types of valid inequalities performs best in terms of solution time on small Golomb ruler instances. On the other hand, a constraint programming model improved by range reduction and a particular branching strategy may have more potential for larger instances owing to its promising parallelization features.
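
    As a quick illustration of the feasibility condition the abstract defines (every pair of marks yields a distinct distance), the following minimal Python sketch checks whether a candidate set of marks is a Golomb ruler; the optimization models compared in the paper must encode exactly this constraint. The function name is our own.

        from itertools import combinations

        def is_golomb_ruler(marks):
            # A ruler is Golomb if all pairwise distances between marks are distinct.
            distances = [abs(a - b) for a, b in combinations(marks, 2)]
            return len(distances) == len(set(distances))

        print(is_golomb_ruler([0, 1, 4, 6]))  # True: the optimal 4-mark ruler, length 6
        print(is_golomb_ruler([0, 1, 2, 4]))  # False: distances 1 and 2 each occur twice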

    A novel walkability index for London predicts walking time in adults

    Get PDF
    Objective: To develop a novel walkability index for London and test it through measurement of associations between neighbourhood walkability and walking among adults, using data from the Whitehall II Study. Background: Physical activity is essential for health; walking is the easiest way to incorporate it into everyday life. Many studies have reported positive associations between neighbourhood walkability and walking, but the majority have focused on cities in North America and Australasia. Urban form with respect to street connectivity, residential density and land use mix (common components of walkability indices) is likely to differ in European cities. Methods: A walkability index for the 633 spatially contiguous census area statistics wards of London was constructed, comprising three core dimensions associated with walking behaviours: residential dwelling density, street connectivity and land use mix. Walkability was expressed as quartile scores, with wards scoring 1 being in the bottom 25% in terms of walkability and those scoring 4 in the top 25%. A neighbourhood walkability score was assigned to each London-dwelling Whitehall II Study participant (2003-04, N = 3020, mean ± SD age = 61.0 ± 6.0 years) as the walkability score of the ward in which their residential postcode fell. Associations between neighbourhood walkability and weekly walking time were measured using multiple logistic regression. Results: After adjustment for individual-level factors and area deprivation, people in the most walkable neighbourhoods were significantly more likely to spend ≥6 h/week walking (odds ratio 1.4; 95% confidence interval 1.1-1.9) than those in the least walkable. Conclusions: The walkability index constructed can predict walking time in adults: living in a more walkable neighbourhood is associated with longer weekly walking time. The index may help urban planners identify and design neighbourhoods in London with characteristics that are potentially more supportive of walking and, thereby, promote public health.
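
    The construction described (three components combined into a ward-level score, then cut into quartiles) can be sketched in a few lines of Python. The abstract does not give the exact weighting, so the sketch below assumes the common approach of summing standardised components, and the ward data are synthetic placeholders.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        n_wards = 633  # census area statistics wards of London (from the abstract)

        # Synthetic component values for illustration only.
        wards = pd.DataFrame({
            "dwelling_density": rng.gamma(2.0, 30.0, n_wards),
            "street_connectivity": rng.gamma(2.0, 10.0, n_wards),
            "land_use_mix": rng.uniform(0.0, 1.0, n_wards),
        })

        # Assumption: standardise each dimension and sum; the paper's exact
        # combination rule is not stated in the abstract.
        z = (wards - wards.mean()) / wards.std()
        wards["walkability"] = z.sum(axis=1)

        # Quartile scores as described: 1 = least walkable 25%, 4 = most walkable 25%.
        wards["quartile"] = pd.qcut(wards["walkability"], 4, labels=[1, 2, 3, 4])
        print(wards["quartile"].value_counts().sort_index())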

    Where do we go from here? An assessment of navigation performance using a compass versus a GPS unit

    Get PDF
    The Global Positioning System (GPS) looks set to replace the traditional map and compass for navigation tasks in military and civil domains. However, we may ask whether GPS has a real performance advantage over traditional methods. We present an exploratory study using a waypoint plotting task to compare the standard magnetic compass against a military GPS unit, for both expert and non-expert navigators. Whilst performance times were generally longer in setting up the GPS unit, once navigation was underway the GPS was more efficient than the compass. For medium- to long-term missions, this means that GPS could offer significant performance benefits, although the compass remains superior for shorter missions. Notwithstanding the performance times, significantly more errors, and more serious errors, occurred when using the compass. Overall, then, the GPS offers some clear advantages, especially for non-expert users. Nonetheless, concerns over the development of cognitive maps remain when using GPS technologies.

    Development of a novel walkability index for London, United Kingdom: cross-sectional application to the Whitehall II study

    Get PDF
    BACKGROUND: Physical activity is essential for health; walking is the easiest way to incorporate activity into everyday life. Previous studies report positive associations between neighbourhood walkability and walking, but most focused on cities in North America and Australasia. Urban form with respect to street connectivity, residential density and land use mix (common components of walkability indices) differs in European cities. The objective of this study was to develop a walkability index for London and test the index using walking data from the Whitehall II Study. METHODS: A neighbourhood walkability index for London was constructed, comprising factors associated with walking behaviours: residential dwelling density, street connectivity and land use mix. Three models were produced that differed in the land uses included. Neighbourhoods were operationalised at three levels of administrative geography: (i) 21,140 output areas, (ii) 633 wards and (iii) 33 local authorities. A neighbourhood walkability score was assigned to each London-dwelling Whitehall II Study participant (2003-04, N = 3020, mean ± SD age = 61.0 ± 6.0 years) based on residential postcode. The effect of changing the model specification and the units of enumeration on spatial variation in walkability was examined. RESULTS: There was a radial decay in walkability from the centre to the periphery of London. There was high inter-model correlation in walkability scores for any given neighbourhood operationalisation (0.92-0.98), and moderate-to-high correlation between neighbourhood operationalisations for any given model (0.39-0.70). After adjustment for individual-level factors and area deprivation, individuals in the most walkable neighbourhoods, operationalised as wards, were more likely to walk >6 h/week (OR = 1.4; 95% CI: 1.1-1.9) than those in the least walkable. CONCLUSIONS: Walkability was associated with walking time in adults. This walkability index could help urban planners identify and design neighbourhoods in London with characteristics more supportive of walking, thereby promoting public health.
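
    The reported association (OR = 1.4; 95% CI: 1.1-1.9) comes from a logistic regression adjusted for individual-level factors and area deprivation. A minimal sketch of that kind of model is below, using statsmodels; the variable names and synthetic data are our own assumptions, and the covariates are reduced to two for brevity.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 3020  # participant count from the abstract; all values below are synthetic

        df = pd.DataFrame({
            "walks_over_6h": rng.integers(0, 2, n),   # 1 if >6 h/week walking
            "walkability_q": rng.integers(1, 5, n),   # ward walkability quartile
            "age": rng.normal(61.0, 6.0, n),
            "area_deprivation": rng.normal(0.0, 1.0, n),
        })

        # Adjusted logistic regression; quartile 1 (least walkable) is the reference.
        model = smf.logit("walks_over_6h ~ C(walkability_q) + age + area_deprivation",
                          data=df).fit(disp=0)

        # Exponentiated coefficients give odds ratios with 95% confidence intervals.
        summary = pd.concat([np.exp(model.params).rename("OR"),
                             np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                            axis=1)
        print(summary)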

    Determination of urban volatile organic compound emission ratios and comparison with an emissions database

    Get PDF
    During the NEAQS-ITCT2k4 campaign in New England, anthropogenic VOCs and CO were measured downwind from New York City and Boston. The emission ratios of VOCs relative to CO and acetylene were calculated using a method in which the ratio of a VOC to acetylene is plotted versus the photochemical age; the intercept at a photochemical age of zero gives the emission ratio. The emission ratios determined in this way were compared to other measurement sets, including data from the same location in 2002, canister samples collected inside New York City and Boston, aircraft measurements from Los Angeles in 2002, and the average urban composition of 39 U.S. cities. All the measurements generally agree within a factor of two. The measured emission ratios also agree for most compounds within a factor of two with vehicle exhaust data, indicating that a major source of VOCs in urban areas is automobiles. A comparison with an anthropogenic emission database shows less agreement. Especially large discrepancies were found for the C2-C4 alkanes and most oxygenated species. As an example, the database overestimated toluene by almost a factor of three, which caused an air quality forecast model (WRF-CHEM) using this database to overpredict the toluene mixing ratio by a factor of 2.5 as well. On the other hand, the overall reactivity of the measured species and the reactivity of the same compounds in the emission database were found to agree within 30%. Copyright 2007 by the American Geophysical Union.
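
    The extrapolation step the abstract describes (plot the VOC-to-acetylene ratio against photochemical age, then read off the intercept at age zero) amounts to a linear fit in log space if both species are removed exponentially by OH at different rates. A minimal sketch under that assumption, with synthetic observations, is:

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic data for illustration: the VOC/acetylene ratio decays with
        # photochemical age because the VOC reacts faster with OH than acetylene.
        age_h = rng.uniform(0.0, 48.0, 200)   # photochemical age in hours (assumed range)
        true_emission_ratio = 0.8             # assumed value, not from the paper
        k_diff = 0.03                         # assumed OH loss-rate difference, 1/h
        ratio = true_emission_ratio * np.exp(-k_diff * age_h) * rng.lognormal(0.0, 0.1, 200)

        # Fit ln(ratio) against age: the intercept at age zero is ln(emission ratio).
        slope, intercept = np.polyfit(age_h, np.log(ratio), 1)
        print(f"estimated emission ratio: {np.exp(intercept):.2f}")  # ~0.8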

    The effects of two weeks high-intensity interval training on fasting glucose, glucose tolerance and insulin resistance in adolescent boys: a pilot study

    Get PDF
    This is the final version, available on open access from BMC via the DOI in this record. Availability of data and materials: the datasets generated and analysed during the current study are not publicly available due to ethical restrictions but are available from the corresponding author upon reasonable request. Background: Current evidence of the metabolic health benefits of high-intensity interval training (HIIT) is limited to longer training periods or studies conducted in overweight youth. This study assessed 1) fasting and postprandial insulin and glucose before and after 2 weeks of HIIT in healthy adolescent boys, and 2) the relationship between pre-intervention health outcomes and the effects of the HIIT intervention. Methods: Seven healthy boys (age: 14.3 ± 0.3 y, BMI: 21.6 ± 2.6, 3 participants classified as overweight) completed 6 sessions of HIIT over 2 weeks. Insulin resistance (IR) and blood glucose and insulin responses to a Mixed Meal Tolerance Test (MMTT) were assessed before (PRE) and 20 h and 70 h after (POST) the final HIIT session. Results: Two weeks of HIIT had no effect on fasting plasma glucose, insulin or IR at 20 h and 70 h POST HIIT, nor on the insulin and glucose response to the MMTT (all P > 0.05). There was a strong negative correlation between PRE-training IR and the change in IR after HIIT (r = −0.96, P < 0.05). Conclusion: Two weeks of HIIT did not elicit improvements in fasting or postprandial glucose or insulin health outcomes in a group of adolescent boys. However, the negative correlation between PRE IR and improvements after HIIT suggests that interventions of this type may be effective in adolescents with raised baseline IR. National Institute for Health Research (NIHR). Northcott Devon Medical Foundation.
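
    The headline relationship in the Results is a simple bivariate correlation between baseline IR and its change after training. A minimal Python sketch of that analysis is below; the study's raw data are not public, so the seven values here are illustrative stand-ins, not the study's measurements.

        import numpy as np
        from scipy import stats

        # Illustrative values for seven participants (not the study's data).
        pre_ir = np.array([0.6, 0.9, 1.1, 1.4, 1.8, 2.2, 2.9])    # baseline insulin resistance
        post_ir = np.array([0.8, 1.0, 1.1, 1.3, 1.5, 1.7, 1.9])   # after the 6 HIIT sessions
        delta_ir = post_ir - pre_ir

        # Pearson correlation between baseline IR and the change after HIIT.
        r, p = stats.pearsonr(pre_ir, delta_ir)
        print(f"r = {r:.2f}, p = {p:.3f}")  # strongly negative, as the abstract reports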