
    Coupling CAD and CFD codes within a virtual integration platform

    The Virtual Integration Platform (VIP) is an essential component of the VIRTUE project. It provides a system for combining disparate numerical analysis methods into a simulation environment. The platform allows for defining process chains, specifying which tools are to be used, and assigning users to perform the individual tasks. The platform also manages the data that are imported into or generated within a process, so that a version history of input and output can be evaluated. Within the VIP, a re-usable template for a given process chain can be created. A process chain is composed of one or more smaller tasks. For each of these tasks, a selection of available tools can be allocated. The advanced scripting methods in the VIP use wrappers for managing the individual tools. A wrapper allows communication between the platform and the tool, and passes input and output data as necessary, in most cases without modifying the tool in any way. In this way, third-party tools may also be used without the need for access to source code or special modifications. The included case study demonstrates several advantages of using the integration platform. A parametric propeller design process couples CAD and CFD codes to adapt the propeller to given operating constraints. The VIP template helped eliminate common user errors, and captured enough expert knowledge that a casual user could perform the given tasks with minimal guidance. Areas of improvement to in-house codes and to the overall process were identified while using the integration platform. Additionally, the process chain was designed to facilitate formal optimisation methods.
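The wrapper mechanism described above can be sketched as follows. This is a minimal, hypothetical illustration (class, function, and file names are mine, not from the VIP): the wrapper stages the input data for a task, invokes an unmodified third-party tool as a black box, and collects the output for the next task in the chain.

```python
import pathlib
import subprocess
import sys
import tempfile

class ToolWrapper:
    """Hypothetical sketch of a VIP-style wrapper: stages input, runs an
    unmodified external tool, and collects output for the process chain."""

    def __init__(self, command):
        # The external tool invocation, used as-is: no source access needed.
        self.command = list(command)

    def run(self, input_text):
        workdir = pathlib.Path(tempfile.mkdtemp())
        infile = workdir / "input.dat"
        outfile = workdir / "output.dat"
        infile.write_text(input_text)            # pass input to the tool
        subprocess.run(self.command + [str(infile), str(outfile)], check=True)
        return outfile.read_text()               # collect output for the chain

# Stand-in "tool": a file-copy command, treated as an opaque executable.
copy_tool = [sys.executable, "-c",
             "import shutil, sys; shutil.copy(sys.argv[1], sys.argv[2])"]
```

Because the wrapper only adapts file-based input and output, any tool with a command-line interface can be slotted into a chain this way.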

    An economic model of Tennessee's lumber and wood products industry

    The purpose of this study was to provide a better understanding of the contribution of the Tennessee lumber and wood products industry to the state economy. The approach taken was to develop equations that quantified historical relationships relevant to the lumber and wood products industry. This methodology provided a separate sector for the lumber and wood products industry, consistent with the state econometric model. The Tennessee Econometric Model (TEM II) provided the basic framework within which the lumber and wood products equations were formulated. The framework for the manufacturing sector equations in TEM II consists of separate equations for forecasting output, employment, and wages. This basic structure was used in formulating the Tennessee lumber and wood products equations. The constraints associated with working within the TEM II framework were considered significant in this study. In order to better identify structural relationships in the industry, an alternative set of output equations was developed for structural/simulation analysis. These equations were formulated using a different set of statistical and economic criteria from the forecasting equations. The equations resulting from the study provide valuable information about the performance of the lumber and wood products industry in the state economy. The forecasting equations, in their current form, provide forecasts for the industry in terms of output, employment, and wages for the 1979-1986 time period. The final form of the simulation equation provides a statistically valid method of impact analysis. Specifically, the impact of changes in the furniture and housing markets on the state lumber industry can be tentatively identified. Though the equations may not be incorporated into the state econometric model in their current form, the research accomplished in their formulation is valuable as a basis for further study of the industry.
Additional research is needed to determine whether the hardwood industry can be analyzed adequately as a separate sector of the state econometric model and/or by developing a more detailed satellite model.
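The forecasting structure described above, with separate equations per sector variable, can be illustrated by a minimal ordinary-least-squares sketch. The variable names and the regressor choice (housing and furniture demand, echoing the impact analysis) are illustrative assumptions, not the study's actual specification.

```python
import numpy as np

def fit_output_equation(housing, furniture, output):
    """Illustrative TEM II-style sector output equation (assumed form):
        output_t = b0 + b1 * housing_t + b2 * furniture_t + e_t,
    estimated by ordinary least squares."""
    X = np.column_stack([np.ones(len(housing)), housing, furniture])
    beta, *_ = np.linalg.lstsq(X, output, rcond=None)
    return beta  # [b0, b1, b2]
```

Separate fits of this kind, one each for output, employment, and wages, mirror the forecasting structure the abstract describes.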

    Balancing Safety and Availability: A Historical Perspective on the Pace of Drug Approval, 1950s-2009.

    Fabienne C. Meier-Abt and Bruno J. Strasser. Department of History of Medicine, Yale University, School of Medicine, New Haven, CT. Over the course of the past 50 years, drug approval processes have ranged from 42 days to more than 10 years. What are the consequences of slow or rapid drug approvals for drug safety and drug availability? How slow is too slow? How fast is too fast? These questions have engaged the public, the government, physicians and the pharmaceutical industry for decades. This essay adopts a historical approach to examine the search for the right balance between drug safety and drug availability in the changing political climates of the past 60 years. Before 1962, the discovery of life-saving antibiotics fostered an emphasis on drug availability and the rapid marketing of drugs. Against the background of the thalidomide crisis in the early 1960s, however, the drug approval process was reframed. The 1962 Kefauver-Harris Amendments ensured a new focus on drug safety rather than drug availability. Efficacy standards were introduced and safety standards raised, and as a result drug approval and drug marketing times increased. During the 1970s, the term drug lag was coined and rapidly endorsed by pharmaceutical companies, physicians, and conservative parties. The term referred to the unnecessary suffering of American patients as a result of the delayed market introduction of life-saving drugs in the United States. Against the background of general consumer movements, and as illustrated by the case of sodium valproate, patients, too, used the notion of drug lag as a political weapon to fight government regulation of the pharmaceutical industry.
In the context of the Reagan Administration's emphasis on economic deregulation and of the public health crisis caused by the emergence of AIDS, political pressure on the Food and Drug Administration rose, and the drug review process was revised to emphasize drug availability rather than drug safety. In the late 1980s and throughout the 1990s, several measures intended to reduce drug approval and drug marketing times were introduced, especially for drugs targeting life-threatening diseases. Finding the right balance between drug safety and drug availability has been a controversial task. As illustrated by the case of gefitinib, the current system depends very heavily on postmarketing studies and on trust in the pharmaceutical industry's ethical behavior. So far, however, the drug industry has not proven to deserve such trust, as exemplified by cases like rofecoxib. Hence, in 2009, the drug approval process awaits reframing once again. A renewed focus on drug safety, with more careful pre-approval studies and more thorough drug reviews, seems warranted.

    Can conventional forces really explain the anomalous acceleration of Pioneer 10/11?

    A conventional explanation of the correlation between the Pioneer 10/11 anomalous acceleration and spin-rate change is given. First, the rotational Doppler shift analysis is improved. Then, a relation between the radio beam reaction force and the spin-rate change is established. Computations are found to be in good agreement with observational data. The relevance of our result to the main Pioneer 10/11 anomalous acceleration is emphasized. Our analysis leads us to conclude that the latter may not be merely artificial. Comment: 9 pages, no figures
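The radio beam reaction force mentioned above lends itself to a quick order-of-magnitude check. The figures below are commonly quoted values for Pioneer 10, assumed here for illustration and not taken from this paper: a beam of power P produces a recoil acceleration a = P / (m c) on a craft of mass m.

```python
# Order-of-magnitude sketch of the radio-beam recoil (assumed figures).
P = 8.0          # W, nominal emitted radio power (assumption)
m = 241.0        # kg, approximate Pioneer 10 mass (assumption)
c = 2.998e8      # m/s, speed of light

a_beam = P / (m * c)   # recoil acceleration, ~1.1e-10 m/s^2
```

This is well below the often-quoted ~8.7e-10 m/s^2 anomaly, which is why the beam recoil alone cannot account for it and the spin-rate correlation matters.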

    Individualisation of time-motion analysis: a method comparison and case report series

    © Georg Thieme Verlag KG. This study compared the intensity distribution of time-motion analysis data when speed zones were categorized by different methods. 12 U18 players undertook a routine battery of laboratory- and field-based assessments to determine the running speeds corresponding to their respiratory compensation threshold (RCT), maximal aerobic speed (MAS), maximal oxygen consumption (vVO2max) and maximal sprint speed (MSS). Players' match demands were tracked using 5 Hz GPS units in 22 fixtures (50 eligible match observations). The percentage of total distance covered running at high speed (%HSR), very high speed (%VHSR) and sprinting were determined using the following speed thresholds: 1) arbitrary; 2) individualised (IND) using RCT, vVO2max and MSS; 3) individualised via MAS per se; 4) individualised via MSS per se; and 5) individualised using MAS and MSS as measures of locomotor capacities (LOCO). Using MSS in isolation resulted in 61% and 39% of players' %HSR and %VHSR, respectively, being incorrectly interpreted when compared to the IND technique. Estimating the RCT from fractional values of MAS resulted in erroneous interpretations of %HSR in 50% of cases. The present results suggest that practitioners and researchers should avoid using singular fitness characteristics to individualise the intensity distribution of time-motion analysis data. A combination of players' anaerobic threshold, MAS, and MSS characteristics is recommended to individualise player-tracking data.
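The zone-classification step underlying all five methods is the same; only the threshold changes. A minimal sketch, with illustrative threshold values and speed samples that are assumptions rather than the study's data:

```python
import numpy as np

def percent_distance_above(speeds_kmh, threshold_kmh, dt_s=0.2):
    """Share of total distance covered above a speed threshold.
    5 Hz GPS sampling -> dt = 0.2 s; distance per sample = v * dt."""
    v = np.asarray(speeds_kmh, dtype=float)
    d = (v / 3.6) * dt_s                   # metres covered in each sample
    return 100.0 * d[v > threshold_kmh].sum() / d.sum()

# Illustrative comparison: an arbitrary HSR cut-off versus a threshold
# individualised from a player's own fitness measure (e.g. near their MAS).
trace = [8, 12, 16, 21, 25, 30]            # km/h samples (made-up data)
arbitrary_hsr = percent_distance_above(trace, 14.4)
individual_hsr = percent_distance_above(trace, 17.0)
```

The gap between the two percentages for the same trace is exactly the kind of misinterpretation the study quantifies when a single fitness characteristic sets the threshold.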

    THE RELATIONSHIP BETWEEN MUSCULOSKELETAL STRENGTH, PHYSIOLOGICAL CHARACTERISTICS, AND KNEE KINESTHESIA FOLLOWING FATIGUING EXERCISE

    Fatiguing exercise may result in impaired functional joint stability and increased risk of unintentional injury. While there are several musculoskeletal and physiological characteristics related to fatigue onset, their relationship with proprioceptive changes following fatigue has not been examined. The purpose of this study was to establish the relationship between musculoskeletal and physiological characteristics and changes in proprioception, measured by threshold to detect passive motion (TTDPM), following fatiguing exercise. Twenty physically active females participated (age: 28.65 ± 5.6 years, height: 165.6 ± 4.3 cm, weight: 61.8 ± 8.0 kg, BMI: 22.5 ± 2.3 kg/m2, BF: 23.3 ± 5.4%). During Visit 1, subjects completed an exercise history and a 24-hour dietary questionnaire, as well as body composition, TTDPM familiarization, isokinetic knee strength, and maximal oxygen uptake/lactate threshold assessments. During Visit 2, subjects completed TTDPM and isometric knee strength testing prior to and following a fatiguing exercise protocol. Wilcoxon signed rank tests determined TTDPM and isometric knee strength changes from pre- to post-fatigue. Spearman's rho correlation coefficients determined the relationship between strength and physiological variables with pre- to post-fatigue changes in TTDPM and with pre-fatigue and post-fatigue TTDPM in extension and flexion (α=0.05). No significant differences were demonstrated from pre-fatigue to post-fatigue TTDPM despite a significant decrease in isometric knee flexion strength (P<0.01) and flexion/extension ratio (P<0.05) following fatigue. No significant correlations were observed between strength or physiological variables and changes in TTDPM from pre- to post-fatigue in extension or flexion. Flexion/extension ratio was significantly correlated with pre-fatigue TTDPM in extension (r=-0.231, P<0.05).
Peak oxygen uptake was significantly correlated with pre-fatigue (r=-0.500, P<0.01) and post-fatigue (r=-0.520, P<0.05) TTDPM in extension. No significant relationships were demonstrated between musculoskeletal and physiological characteristics and changes in TTDPM following fatigue. The results suggest that highly trained individuals may have better proprioception, and that the high fitness level of subjects in this investigation may have contributed to the absence of TTDPM deficits following fatigue despite subjects reaching a high level of perceptual and physiological fatigue. Future studies should consider various subject populations, other musculoskeletal strength characteristics, and different modalities of proprioception to determine the most important contributions to proprioceptive changes following fatigue.
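The Spearman's rho statistic used above is simply a Pearson correlation computed on ranks. A minimal sketch, assuming no tied values (a full implementation such as SciPy's `spearmanr` assigns average ranks to ties):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation as the Pearson correlation of ranks.
    Minimal sketch: assumes no tied values in x or y."""
    rank = lambda a: np.argsort(np.argsort(a))   # 0-based ranks, no ties
    return np.corrcoef(rank(x), rank(y))[0, 1]
```

A rank-based coefficient like this is a natural choice for the small sample (n = 20) in the study, since it does not assume normally distributed variables.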

    Abundance analysis of prime B-type targets for asteroseismology II. B6--B9.5 stars in the field of view of the CoRoT satellite

    The CoRoT satellite is collecting precise time-resolved photometry for tens of asteroseismology targets. To ensure a correct interpretation of the CoRoT data, the atmospheric parameters, chemical compositions, and rotational velocities of the stars must be determined. The main goal of the ground-based seismology support program for the CoRoT mission was to obtain photometric and spectroscopic data for stars in the fields monitored by the satellite. These ground-based observations were collected in the GAUDI archive. High-resolution spectra of more than 200 B-type stars are available in this database, and about 45% of them are analysed here. To derive the effective temperature of the stars, we used photometric indices. Surface gravities were obtained by comparing observed and theoretical Balmer line profiles. To determine the chemical abundances and rotational velocities, we used a spectrum synthesis method, which consisted of comparing the observed spectrum with theoretical ones based on the assumption of LTE. Atmospheric parameters, chemical abundances, and rotational velocities were determined for 89 late-B stars. The dominant species in their spectra are iron-peak elements. The average Fe abundance is 7.24+/-0.45 dex. The average rotational velocity is 126 km/sec, but there are 13 and 20 stars with low and moderate V sin i values, respectively. The analysis of this sample of 89 late B-type stars reveals many chemically peculiar (CP) stars. Some of them were previously known, but at least 9 new CP candidates, among which are at least two HgMn stars, are identified in our study. These CP stars as a group exhibit V sin i values lower than the stars with normal surface chemical composition. Comment: 21 pages, 13 figures, accepted to Astronomy and Astrophysics

    Characterisation of an n-type segmented BEGe detector

    A four-fold segmented n-type point-contact "Broad Energy" high-purity germanium detector, SegBEGe, has been characterised at the Max-Planck-Institut für Physik in Munich. The main characteristics of the detector are described, and first measurements concerning the detector properties are presented. The possibility of using mirror pulses to determine source positions is discussed, as well as charge losses observed close to the core contact.

    Crowdsourcing malaria parasite quantification: an online game for analyzing images of infected thick blood smears

    Background: There are 600,000 new malaria cases daily worldwide. The gold standard for estimating the parasite burden and the corresponding severity of the disease consists of manually counting the number of parasites in blood smears through a microscope, a process that can take more than 20 minutes of an expert microscopist's time. Objective: This research tests the feasibility of a crowdsourced approach to malaria image analysis. In particular, we investigated whether anonymous volunteers with no prior experience would be able to count malaria parasites in digitized images of thick blood smears by playing a Web-based game. Methods: The experimental system consisted of a Web-based game, in which online volunteers were tasked with detecting parasites in digitized blood sample images, coupled with a decision algorithm that combined the analyses from several players to produce an improved collective detection outcome. Data were collected through the MalariaSpot website. Random images of thick blood films containing Plasmodium falciparum at medium to low parasitemias, acquired by conventional optical microscopy, were presented to players. In the game, players had to find and tag as many parasites as possible in 1 minute. If players found all the parasites present in the image, they were presented with a new image. In order to combine the choices of different players into a single crowd decision, we implemented an image processing pipeline and a quorum algorithm that judged a parasite to be tagged when a group of players agreed on its position. Results: Over 1 month, anonymous players from 95 countries played more than 12,000 games and generated a database of more than 270,000 clicks on the test images. Results revealed that combining 22 games from nonexpert players achieved a parasite counting accuracy higher than 99%. This performance could also be obtained by combining 13 games from players trained for 1 minute.
Exhaustive computations measured the parasite counting accuracy for all players as a function of the number of games considered and the experience of the players. In addition, we propose a mathematical equation that accurately models the collective parasite counting performance. Conclusions: This research validates the online gaming approach for crowdsourced counting of malaria parasites in images of thick blood films. The findings support the conclusion that nonexperts are able to rapidly learn how to identify the typical features of malaria parasites in digitized thick blood samples, and that combining the analyses of several users provides parasite counting accuracy rates similar to those of expert microscopists. This experiment illustrates the potential of the crowdsourced gaming approach for performing routine malaria parasite quantification, and more generally for solving biomedical image analysis problems, with future potential for telediagnosis related to global health challenges.
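The quorum idea, accepting a tag only when enough distinct players agree on a position, can be sketched as follows. This is an illustrative greedy grouping, not the paper's actual image processing pipeline; the radius and quorum values are assumptions.

```python
import numpy as np

def quorum_detections(clicks, radius=10.0, quorum=3):
    """Illustrative quorum rule: clicks (pixel position, player id) are
    grouped by proximity; a group counts as a detected parasite when at
    least `quorum` distinct players tagged within `radius` pixels."""
    pts = [(np.asarray(pos, dtype=float), player) for pos, player in clicks]
    used = [False] * len(pts)
    accepted = []
    for i, (p, _) in enumerate(pts):
        if used[i]:
            continue
        # Greedily gather all unused clicks near this one.
        group = [j for j, (q, _) in enumerate(pts)
                 if not used[j] and np.linalg.norm(p - q) <= radius]
        if len({pts[j][1] for j in group}) >= quorum:
            for j in group:
                used[j] = True
            # Report the centroid of the agreeing clicks.
            accepted.append(tuple(np.mean([pts[j][0] for j in group], axis=0)))
    return accepted
```

Requiring agreement across distinct players, rather than a raw click count, is what filters out any single volunteer's spurious tags.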

    Spectroscopic Orbits for 15 Late-Type Stars

    Spectroscopic orbital elements are determined for 15 stars with periods from 8 to 6528 days, with six orbits computed for the first time. Improved astrometric orbits are computed for two stars, and one new orbit is derived. Visual orbits were previously determined for four stars, four stars are members of multiple systems, and five stars have Hipparcos G designations or have been resolved by speckle interferometry. For the nine binaries with previous spectroscopic orbits, we determine improved or comparable elements. For HD 28271 and HD 200790, our spectroscopic results support the conclusions of previous authors that the large values of their mass functions and the lack of a detectable secondary spectrum argue for the secondary in each case being a pair of low-mass dwarfs. The orbits given here may be useful in combination with future interferometric and Gaia satellite observations.
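The standard spectroscopic elements reported in such orbits (period P, periastron epoch T0, eccentricity e, argument of periastron omega, semi-amplitude K, systemic velocity gamma) define the radial-velocity curve through Kepler's equation. A minimal sketch of evaluating that curve:

```python
import numpy as np

def radial_velocity(t, P, T0, e, omega, K, gamma):
    """Keplerian radial-velocity curve from spectroscopic elements."""
    M = 2.0 * np.pi * (((t - T0) / P) % 1.0)        # mean anomaly
    E = M                                           # solve M = E - e sin E
    for _ in range(50):                             # Newton iteration
        E = E - (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
    nu = 2.0 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                          np.sqrt(1 - e) * np.cos(E / 2))   # true anomaly
    return gamma + K * (np.cos(nu + omega) + e * np.cos(omega))
```

Fitting this model to measured velocities is what yields the orbital elements; combining such spectroscopic solutions with interferometric or Gaia astrometry then constrains the individual component masses.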