1,864 research outputs found
Examining trends in non-fatal strangulation among sexual assault survivors seeking Sexual Assault Nurse Examiner care from 2002 to 2017
Radio telemetry devices to monitor breathing in non-sedated animals
Radio telemetry equipment has improved significantly over the last 10-15 years and is increasingly used in research to monitor a variety of physiological parameters in non-sedated animals. The aim of this review is to provide an update on the current state of development of radio telemetry for recording respiration. Our literature review found only rare reports of respiratory studies via radio telemetry. Much of this article therefore reports our experience with our custom-built radio telemetry devices, designed to record respiratory signals together with numerous other physiological signals in lambs. Our current radio telemetry system can record 24 simultaneous signals 24 h/day for several days; to our knowledge, this is the highest number of physiological signals that can be recorded wirelessly. Our devices have been invaluable for studying respiration in our ovine models of preterm birth, reflux laryngitis, postnatal exposure to cigarette smoke, respiratory syncytial virus infection, and nasal ventilation, all of which are relevant to neonatal respiratory problems.
Validation of nonlinear PCA
Linear principal component analysis (PCA) can be extended to a nonlinear PCA by using artificial neural networks. But the benefit of curved components requires careful control of the model complexity. Moreover, standard techniques for model selection, including cross-validation and, more generally, the use of an independent test set, fail when applied to nonlinear PCA because of its inherently unsupervised character. This paper presents a new approach for validating the complexity of nonlinear PCA models by using the error in missing data estimation as a criterion for model selection. It is motivated by the idea that only the model of optimal complexity is able to predict missing values with the highest accuracy. While standard test set validation usually favours over-fitted nonlinear PCA models, the proposed model validation approach correctly selects the optimal model complexity.
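As a concrete illustration of this selection criterion, the following Python sketch trains bottleneck autoencoders of varying complexity (a common stand-in for nonlinear PCA) and scores each by how well it restores deliberately blanked-out test entries. The toy data, masking scheme, and network sizes are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of model selection for nonlinear PCA via missing-data
# estimation error. An autoencoder (MLPRegressor mapping X -> X through a
# bottleneck layer) stands in for nonlinear PCA; all parameters are
# illustrative, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy data on a curved 1-D manifold embedded in 3-D, plus noise.
t = rng.uniform(-1, 1, size=(500, 1))
X = np.hstack([t, t**2, t**3]) + 0.05 * rng.normal(size=(500, 3))
X_train, X_test = X[:400], X[400:]

def missing_data_error(hidden_units):
    """Fit a bottleneck autoencoder, then score it by how well it
    restores test entries that were deliberately blanked out."""
    ae = MLPRegressor(hidden_layer_sizes=(hidden_units, 1, hidden_units),
                      activation="tanh", max_iter=5000, random_state=0)
    ae.fit(X_train, X_train)
    errs = []
    for j in range(X_test.shape[1]):          # blank one column at a time
        X_miss = X_test.copy()
        X_miss[:, j] = X_train[:, j].mean()   # crude single imputation
        X_hat = ae.predict(X_miss)
        errs.append(np.mean((X_hat[:, j] - X_test[:, j]) ** 2))
    return float(np.mean(errs))

# The candidate with the lowest missing-data error is selected.
for h in (2, 8, 32):
    print(f"hidden units={h:3d}  missing-data MSE={missing_data_error(h):.4f}")
```

Under this criterion the least missing-data error marks the preferred complexity, mirroring the paper's argument that only the optimally complex model predicts missing values accurately.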
Learning curves for pediatric laparoscopy: how many operations are enough? The Amsterdam experience with laparoscopic pyloromyotomy
Few studies on the surgical outcomes of open pyloromyotomy (OP) versus laparoscopic pyloromyotomy (LP) in the treatment of hypertrophic pyloric stenosis have been published. The question arises as to how many laparoscopic procedures a surgeon requires to pass the learning curve, and which technique is best in terms of postoperative complications. This study aimed to evaluate and quantify the learning curve for the laparoscopic technique at the authors' center. A second goal was to compare the pre- and postoperative data of OP versus LP for infantile hypertrophic pyloric stenosis. A retrospective analysis was performed for 229 patients with infantile hypertrophic pyloric stenosis. Between January 2002 and September 2008, 158 infants underwent OP and 71 underwent LP. The median operating time differed significantly between the OP (33 min) and LP (40 min) groups. The median hospital stay after surgery was 3 days for the OP patients and 2 days for the LP patients (p = 0.002). The postoperative complication rates did not differ significantly between the OP (21.5%) and LP (21.1%) groups (p = 0.947). Complications were experienced by 31.5% of the first 35 LP patients; this rate decreased to 11.4% during the next 35 LP procedures (p = 0.041). Two perforations and three conversions occurred in the first LP group, compared with one perforation in the second LP group. The number of complications decreased significantly between the first and second groups of LP patients, and severe complications also decreased. This indicates that the learning curve for LP in the current series involved 35 procedures.
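For readers who want to sanity-check the learning-curve comparison, the snippet below recomputes the test of complication rates in the first versus second 35 LP patients. The integer counts are inferred from the quoted percentages, so this is an illustrative recomputation rather than the study's own analysis.

```python
# Back-of-envelope check of the learning-curve comparison above:
# complication rates in the first vs. second 35 LP patients.
# Counts (11/35 and 4/35) are inferred from the quoted percentages.
from scipy.stats import chi2_contingency

first_35 = [11, 35 - 11]    # [complications, no complications] ~ 31.5%
second_35 = [4, 35 - 4]     # ~ 11.4%

# Without Yates continuity correction, which reproduces p ~ 0.041.
chi2, p, _, _ = chi2_contingency([first_35, second_35], correction=False)
print(f"complication rate drop: {11/35:.1%} -> {4/35:.1%}, p = {p:.3f}")
```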
Farm systems assessment of bioenergy feedstock production: Integrating bio-economic models and life cycle analysis approaches
Climate change and energy security concerns have driven the development of policies that encourage bioenergy production. Meeting EU targets for the consumption of transport fuels from bioenergy by 2020 will require a large increase in the production of bioenergy feedstock. Initially an increase in 'first generation' biofuels was observed; however, 'food competition' concerns have generated interest in second generation biofuels (SGBs). These SGBs can be produced from co-products (e.g. cereal straw) or energy crops (e.g. miscanthus), with the former largely negating food competition concerns. In order to assess the sustainability of feedstock supply for SGBs, the financial, environmental and energy costs and benefits of the farm system must be quantified. Previous research has captured financial costs and benefits through linear programming (LP) approaches, whilst environmental and energy metrics have largely been undertaken within life cycle analysis (LCA) frameworks. Assessing the financial, environmental and energy sustainability of supplying co-product second generation biofuel (CPSGB) feedstocks at the farm level requires a framework that permits the trade-offs between these objectives to be quantified and understood. The development of a modelling framework for Managing Energy and Emissions Trade-Offs in Agriculture (the MEETA model) that combines bio-economic process modelling and LCA is presented, together with input data parameters obtained from literature and industry sources. The MEETA model quantifies arable farm inputs and outputs in terms of financial, energy and emissions results. The model explicitly captures fertiliser/crop-yield relationships, plus the incorporation of straw or its removal for sale, with the associated nutrient impacts of incorporation/removal on the following crop in the rotation. Key results of crop mix, machinery use, greenhouse gas (GHG) emissions per kg of crop product and energy use per hectare are in line with previous research and industry survey findings. Results show that the gross margin–energy trade-off is £36 GJ⁻¹, representing the gross margin forgone by maximising net farm energy cf. maximising farm gross margin. The gross margin–GHG emission trade-off is £0.15 kg⁻¹ CO2 eq, representing the gross margin forgone per kg of CO2 eq reduced when GHG emissions are minimised cf. maximising farm gross margin. The energy–GHG emission trade-off is 0.03 GJ kg⁻¹ CO2 eq, quantifying the reduction in net energy from the farm system per kg of CO2 eq reduced when minimising GHG emissions cf. maximising net farm energy. When both farm gross margin and net farm energy are maximised, all the cereal straw is baled for sale. Sensitivity analysis of the model with respect to different cereal straw prices shows that it becomes financially optimal to incorporate wheat straw at a price of £11 t⁻¹ for this co-product. Local market conditions for straw, and farmer attitudes towards incorporation or sale of straw, will affect the straw price at which farmers will supply this potential bioenergy feedstock, and these represent important areas for future research.
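The trade-off figures above come from comparing optimal farm plans under different objectives. The toy linear programme below illustrates the mechanics with scipy, choosing crop areas to maximise either gross margin or net energy and deriving the implied trade-off; all coefficients are invented for illustration and are not MEETA parameters.

```python
# Toy farm-level trade-off in the spirit of the MEETA analysis: a linear
# programme allocating land to activities under two alternative objectives.
# All per-hectare coefficients are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Per-hectare gross margin (GBP) and net energy (GJ) for three activities:
# wheat + straw sold, wheat + straw incorporated, miscanthus.
margin = np.array([620.0, 580.0, 450.0])
energy = np.array([95.0, 80.0, 210.0])
land_ha = 100.0

def maximise(obj):
    # linprog minimises, so negate the objective; single land constraint.
    res = linprog(-obj, A_ub=[[1.0, 1.0, 1.0]], b_ub=[land_ha],
                  bounds=[(0, None)] * 3)
    return res.x

x_margin = maximise(margin)    # plan maximising gross margin
x_energy = maximise(energy)    # plan maximising net energy

gm = margin @ x_margin - margin @ x_energy   # gross margin forgone (GBP)
en = energy @ x_energy - energy @ x_margin   # net energy gained (GJ)
print(f"trade-off: GBP {gm / en:.2f} per GJ of extra net energy")
```

The reported £36 GJ⁻¹ figure is exactly this kind of ratio: gross margin given up per unit of net energy gained when the objective switches.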
Scoping review on vector-borne diseases in urban areas : transmission dynamics, vectorial capacity and co-infection
BACKGROUND: Transmission dynamics, vectorial capacity, and co-infections have substantial impacts on vector-borne diseases (VBDs) affecting urban and suburban populations. Reviewing key factors can provide insight into priority research areas and offer suggestions for potential interventions. MAIN BODY: Through a scoping review, we identify knowledge gaps on transmission dynamics, vectorial capacity, and co-infections regarding VBDs in urban areas. Peer-reviewed and grey literature published between 2000 and 2016 was searched. We screened abstracts and full texts to select studies. Using an extraction grid, we retrieved general data, results, lessons learned and recommendations, future research avenues, and practice implications. We classified studies by VBD and country/continent and identified relevant knowledge gaps. Of 773 articles selected for full-text screening, 50 were included in the review: 23 based on research in the Americas, 15 in Asia, 10 in Africa, and one each in Europe and Australia. The largest body of evidence concerning VBD epidemiology in urban areas concerned dengue and malaria. Other diseases covered included the arboviruses chikungunya and West Nile virus, parasitic diseases such as leishmaniasis and trypanosomiasis, and the bacterial diseases rickettsiosis and plague. Most articles retrieved in our review combined transmission dynamics and vectorial capacity; only two combined transmission dynamics and co-infection. The review identified significant knowledge gaps on the role of asymptomatic individuals, the effects of co-infection and other host factors, and the impacts of climatic, environmental, and socioeconomic factors on VBD transmission in urban areas. Limitations included the narrowing of the search strategy (which missed classical modelling studies), the scarcity of studies on co-infections, the largely descriptive nature of most studies, and the fact that few offered concrete public health recommendations. More research is needed on transmission risk in homes and workplaces, given increasingly dynamic and mobile populations; the lack of studies on co-infection in particular hampers monitoring of infections transmitted by the same vector. CONCLUSIONS: Strengthening VBD surveillance and control, particularly for asymptomatic cases and mobile populations, and using early warning tools to predict rising transmission were the key strategies identified for public health policy and practice.
How do you say 'hello'? Personality impressions from brief novel voices
On hearing a novel voice, listeners readily form personality impressions of that speaker. Accurate or not, these impressions are known to affect subsequent interactions; yet the underlying psychological and acoustical bases remain poorly understood. Furthermore, studies have hitherto focussed on extended speech, as opposed to analysing the instantaneous impressions we form on first exposure. In this paper, through a mass online rating experiment, 320 participants rated 64 sub-second vocal utterances of the word 'hello' on one of 10 personality traits. We show that: (1) personality judgements of brief utterances from unfamiliar speakers are consistent across listeners; (2) a two-dimensional 'social voice space' with axes mapping Valence (Trust, Likeability) and Dominance, each driven by differing combinations of vocal acoustics, adequately summarises ratings in both male and female voices; and (3) a positive combination of Valence and Dominance results in increased perceived male vocal Attractiveness, whereas perceived female vocal Attractiveness is largely controlled by increasing Valence. Results are discussed in relation to the rapid evaluation of personality and, in turn, the intent of others, as being driven by survival mechanisms via approach or avoidance behaviours. These findings provide empirical bases for predicting personality impressions from acoustical analyses of short utterances and for generating desired personality impressions in artificial voices.
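For illustration, a low-dimensional trait space of the kind described in point (2) can be recovered by applying PCA to a speakers-by-traits matrix of mean ratings, as in the hypothetical sketch below; the random data and trait list are placeholders, not the study's ratings or its exact method.

```python
# Illustrative recovery of a 2-D "social voice space" from trait ratings:
# PCA on a speakers x traits matrix of mean ratings. Data are random
# placeholders, so the loadings here carry no real meaning.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
traits = ["Trust", "Likeability", "Dominance", "Attractiveness",
          "Warmth", "Confidence", "Aggressiveness", "Competence",
          "Femininity", "Masculinity"]

ratings = rng.normal(size=(64, len(traits)))   # 64 speakers x 10 traits

pca = PCA(n_components=2)
coords = pca.fit_transform(ratings)            # speaker positions in 2-D
print("variance explained:", pca.explained_variance_ratio_)

# Loadings show which traits drive each axis (Valence vs. Dominance
# in the paper's interpretation of real data).
print(dict(zip(traits, pca.components_[0].round(2))))
```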
A continuous mapping of sleep states through association of EEG with a mesoscale cortical model
Here we show that a mathematical model of the human sleep cycle can be used to obtain a detailed description of electroencephalogram (EEG) sleep stages, and we discuss how this analysis may aid in the prediction and prevention of seizures during sleep. The association between EEG data and the cortical model is found via locally linear embedding (LLE), a method of dimensionality reduction. We first show that LLE can distinguish between traditional sleep stages when applied to EEG data. It reliably separates REM and non-REM sleep and maps the EEG data to a low-dimensional output space where the sleep state changes smoothly over time. We also incorporate the concept of strongly connected components and use this as a method of automatic outlier rejection for EEG data. Then, by using LLE on a hybrid data set containing both sleep EEG and signals generated from the mesoscale cortical model, we quantify the relationship between the data and the mathematical model. This enables us to take any sample of sleep EEG data and associate it with a position among the continuous range of sleep states provided by the model; we can thus infer a trajectory of states as the subject sleeps. Lastly, we show that this method gives consistent results for various subjects over a full night of sleep and can be done in real time.
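A minimal sketch of the embedding step is shown below, using scikit-learn's locally linear embedding on synthetic per-epoch features; the feature construction and parameter choices are assumptions for illustration, not the authors' pipeline.

```python
# Minimal sketch of the LLE step described above: map high-dimensional
# per-epoch EEG features to a low-dimensional space in which the sleep
# state varies smoothly. The synthetic "EEG" features are placeholders.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(2)

# Fake feature vectors: e.g. band powers per 30-s epoch over one night.
n_epochs, n_features = 960, 30
drift = np.linspace(0, 4, n_epochs)[:, None]            # slow state drift
features = np.sin(drift + rng.normal(scale=0.1, size=(n_epochs, n_features)))

lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2)
embedded = lle.fit_transform(features)   # one 2-D point per epoch

# The sequence of embedded points can be read as a continuous trajectory
# of sleep states over the night.
print(embedded.shape)                    # (960, 2)
```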
- …