
    SyZyGy: A Straight Interferometric Spacecraft System for Gravity Wave Observations

    Full text link
    We apply time-delay interferometry (TDI), unfolding the general triangular configuration, to the special case of a linear array of three spacecraft. We show that such an array ("SyZyGy") has, compared with an equilateral-triangle GW detector of the same scale, degraded (but non-zero) sensitivity at low frequencies (f << c/(array size)) but similar peak and high-frequency sensitivities to GWs. Sensitivity curves are presented for SyZyGys having various arm lengths. A number of technical simplifications result from the linear configuration: only one faceted (e.g., cubical) proof mass per spacecraft, intra-spacecraft laser metrology needed only at the central spacecraft, placement in a single appropriate orbit that reduces Doppler drifts so that no laser-beam modulation is required for ultra-stable-oscillator noise calibration, and little or no time-dependent articulation of the telescopes to maintain pointing. Because SyZyGy's sensitivity falls off more sharply at low frequency than that of an equilateral triangular array, it may be more useful for GW observations in the band between those of ground-based interferometers (10-2000 Hz) and LISA (0.1 mHz-0.1 Hz). A SyZyGy of ~1 light-second scale could, for the same instrumental assumptions as LISA, make observations in this intermediate-frequency GW band with 5-sigma sensitivity to sinusoidal waves of ~2.5 x 10^-23 in a year's integration. Comment: 13 pages, 6 figures; typos corrected, figure modified, references added
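    As a rough illustration of where a ~1 light-second detector sits relative to the bands quoted above, the short Python sketch below computes the arm transfer frequency f* = c/(2*pi*L), the scale above which an interferometric arm's GW response begins to roll off. The arm lengths are illustrative values chosen here, not figures taken from the paper's sensitivity curves.

        # Illustrative only: relates an arm length to its transfer frequency
        # f* = c / (2*pi*L). Arm lengths below are example values, not the paper's.
        import math

        C = 299_792_458.0  # speed of light, m/s

        def transfer_frequency_hz(arm_length_m: float) -> float:
            """Frequency above which a single arm's GW response starts to average out."""
            return C / (2.0 * math.pi * arm_length_m)

        for label, arm_m in [
            ("~1 light-second SyZyGy-scale arm", C * 1.0),  # ~3.0e8 m
            ("LISA-like 5e9 m arm", 5.0e9),
        ]:
            print(f"{label}: L = {arm_m:.2e} m, f* ~= {transfer_frequency_hz(arm_m):.3g} Hz")

    For the 1 light-second arm this gives f* of roughly 0.16 Hz, consistent with a detector whose peak sensitivity falls in the intermediate band between LISA and ground-based instruments.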

    Stakeholder Values Related to Black Bear Damage in Alabama

    Get PDF
    Members of several stakeholder groups in Alabama were surveyed regarding their experience with bear damage and their potential tolerance for bear damage assuming black bear numbers were to increase. Very little bear-related damage was reported. Regression analysis revealed that support for reintroduction, group affiliation, educational status, and knowledge of bears were important in explaining variation in the level of tolerance for potential bear-related damage. Members of commodity-related groups (e.g., beekeepers, cattlemen) were less likely to be tolerant of bear damage. Educational programs should be implemented before augmentation of the bear population in Alabama is attempted.
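    The regression described above relates a tolerance measure to respondent characteristics. A minimal sketch of that kind of analysis is below; the data file and column names are hypothetical placeholders, not the authors' actual survey variables or model.

        # Hypothetical sketch of a tolerance regression on survey responses.
        # Column names and the data file are assumptions, not from the study.
        import pandas as pd
        import statsmodels.formula.api as smf

        survey = pd.read_csv("bear_survey.csv")  # hypothetical survey extract

        # OLS of a tolerance score on the predictors named in the abstract;
        # C() treats group affiliation and education as categorical factors.
        model = smf.ols(
            "tolerance ~ support_reintroduction + C(group) + C(education) + bear_knowledge",
            data=survey,
        ).fit()
        print(model.summary())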

    The importance of financial market development on the relationship between loan guarantees for SMEs and local market employment rates

    Get PDF
    We empirically examine whether a major government intervention in the small-firm credit market yields significantly better results in markets that are less financially developed. The government intervention that we investigate is SBA-guaranteed lending. The literature on financing small and medium-size enterprises (SMEs) suggests that small firms may be exposed to a particular type of market failure associated with credit rationing, and SMEs in markets that are less financially developed will likely face a greater degree of this market failure. To test our hypothesis, we use the level of bank deposits per capita as our relative measure of financial market development, and we use local market employment rates as our measure of economic performance. After controlling for the appropriate cross-sectional market characteristics, we find that SBA-guaranteed lending has a significantly more (less) positive impact on the average annual level of employment when the local market is relatively less (more) financially developed. This result has important implications for public policy directives concerning where SBA-guaranteed lending should be directed.
    Keywords: Small Business Administration; Financial markets; Small business - Finance; Employment
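    The test described amounts to interacting SBA-guaranteed lending with a measure of local financial development in an employment regression. A hedged sketch of one such specification is below; the file name, variable names, and controls are placeholders for illustration, not the authors' dataset or exact model.

        # Illustrative cross-sectional specification: does the effect of SBA lending
        # on local employment depend on financial development (deposits per capita)?
        # All names here are placeholders, not the study's actual variables.
        import pandas as pd
        import statsmodels.formula.api as smf

        markets = pd.read_csv("local_markets.csv")  # hypothetical local-market data

        # The interaction term carries the hypothesis: a negative coefficient would
        # mean SBA lending matters more where deposits per capita are low.
        spec = (
            "employment_rate ~ sba_per_capita * deposits_per_capita"
            " + income_per_capita + population_density"
        )
        result = smf.ols(spec, data=markets).fit(cov_type="HC1")
        print(result.summary())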

    Kodiak Brown Bears Surf the Salmon Red Wave: Direct Evidence from GPS Collared Individuals

    Get PDF
    One of the goals of Ecosystem-Based Fisheries Management (EBFM) is recognizing and mitigating indirect effects of fisheries on trophic interactions. Most research on indirect effects has considered how the abundance of managed fishes influences trophic interactions with other species. However, recent work has shown that attributes besides abundance, such as life history variation, can strongly mediate species interactions. For example, phenological variation within prey species may enhance foraging opportunities for mobile predators by increasing the duration over which predators can target vulnerable life stages of prey. Here, we present direct evidence of individual brown bears (Ursus arctos middendorffi) exploiting variation in sockeye salmon spawning phenology by tracking salmon runs across a 2,800 km² region of Kodiak Island. Data from 40 GPS-collared brown bears show that bears visited multiple spawning sites in synchrony with the order of spawning phenology. The average time spent feeding on salmon was 67 days, while the average duration of spawning for a single population was only 40 days. The number of sites used was correlated with the number of days a bear exploited salmon, suggesting that phenological variation in the study area influenced bear access to salmon, a resource that strongly influences bear fitness. These results suggest that fisheries managers attempting to maximize harvest while minimizing impacts on brown bears should strive to protect the population diversity that underlies the phenological variation used by wildlife consumers. These results underscore the need to understand how fisheries affect life history diversity in addition to abundance in order to minimize negative effects of fisheries management on non-target species, a goal of EBFM.
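    One of the reported results is a correlation between the number of spawning sites a bear visited and the number of days it fed on salmon. A minimal sketch of that per-bear summary from collar data is below; the input file, columns, and the use of a Spearman correlation are assumptions for illustration, not the study's actual processing pipeline.

        # Hypothetical per-bear summary from GPS-collar visits to spawning sites.
        # Assumed columns: bear_id, site_id, date of a salmon-feeding visit.
        import pandas as pd

        visits = pd.read_csv("collar_visits.csv", parse_dates=["date"])  # hypothetical file

        per_bear = visits.groupby("bear_id").agg(
            sites_used=("site_id", "nunique"),
            days_on_salmon=("date", lambda d: (d.max() - d.min()).days + 1),
        )

        # Relationship noted in the abstract: more sites used ~ longer salmon season per bear.
        print(per_bear[["sites_used", "days_on_salmon"]].corr(method="spearman"))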

    The Use of the Decomposition Principle in Making Judgments

    Get PDF
    One hundred and fifty-one subjects were randomly divided into two groups of roughly equal size. One group was asked to respond to a decomposed version of a problem, and the other group was presented with the direct form of the problem. The results provided support for the hypotheses that people can make better judgments when they use the principle of decomposition, and that decomposition is especially valuable for problems about which the subject knows little. The results also suggest that accuracy may be higher when the subject provides the data and the computer analyzes it than when both steps are done implicitly by the subject.
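    The decomposition idea can be made concrete with a toy Fermi-style estimate: the subject supplies component judgments and the aggregation is done mechanically rather than guessing the final quantity directly. The sketch below is purely illustrative, with invented numbers and a classic example problem that is not drawn from the study.

        # Toy illustration of decomposed vs. direct estimation (numbers are invented).
        # Decomposition: judge easier components, let the machine do the arithmetic.

        def decomposed_estimate(components: dict[str, float]) -> float:
            """Multiply independently judged components into a final estimate."""
            result = 1.0
            for value in components.values():
                result *= value
            return result

        # Example: annual piano tunings in a city (a classic decomposition exercise).
        components = {
            "households": 800_000.0,
            "share_with_piano": 0.05,
            "tunings_per_piano_per_year": 1.0,
        }

        direct_guess = 10_000.0  # a single holistic judgment
        print("decomposed:", decomposed_estimate(components))  # 40,000.0
        print("direct guess:", direct_guess)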

    Electronic Medical Records as a Research Tool: Evaluating Topiramate Use at a Headache Center.

    Get PDF
    Background.—Electronic medical records (EMRs) are used in large healthcare centers to increase efficiency and accuracy of documentation. These databases may be utilized for clinical research or to describe clinical practices such as medication usage. Methods.—We conducted a retrospective analysis of EMR data from a headache clinic to evaluate clinician prescription use and dosing patterns of topiramate. The study cohort comprised 4833 unique de-identified records, which were used to determine topiramate dose and persistence of treatment. Results.—Within the cohort, migraine was the most common headache diagnosis (n = 3753, 77.7%), followed by tension-type headache (n = 338, 7.0%) and cluster or trigeminal autonomic cephalalgias (n = 287, 5.9%). Physicians prescribed topiramate more often for subjects with migraine and idiopathic intracranial hypertension (P < .0001) than for those with other conditions, and more often for subjects with coexisting conditions including obesity, bipolar disorder, and depression. The most common maintenance dose of topiramate was 100 mg/day; however, approximately 15% of subjects received either less than 100 mg/day or more than 200 mg/day. More than a third of subjects were prescribed topiramate for more than 1 year, and subjects with a diagnosis of migraine were prescribed topiramate for a longer period of time than those without migraine. Conclusions.—Findings from our study using EMR demonstrate that physicians use topiramate at many different doses and for many off-label indications. This analysis provided important insight into our patient populations and treatment patterns.
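    A retrospective EMR query of this kind reduces to a few group-by summaries once records are de-identified and flattened. The pandas sketch below shows the general shape of such an analysis; the file, column names, and the 365-day persistence threshold are assumptions for illustration, not the headache center's actual schema or methods.

        # Hypothetical EMR extract: one row per patient with diagnosis, topiramate dose,
        # and days on drug. Names are assumptions, not the clinic's schema.
        import pandas as pd

        records = pd.read_csv("emr_extract.csv")  # de-identified, hypothetical file

        on_tpm = records[records["topiramate_dose_mg"].notna()]

        # Dose distribution: modal maintenance dose and the tails noted in the abstract.
        print(on_tpm["topiramate_dose_mg"].value_counts().head())
        outside_range = (on_tpm["topiramate_dose_mg"] < 100) | (on_tpm["topiramate_dose_mg"] > 200)
        print("share outside 100-200 mg/day:", outside_range.mean())

        # Persistence: fraction treated for more than a year, by headache diagnosis.
        on_tpm = on_tpm.assign(over_one_year=on_tpm["days_on_topiramate"] > 365)
        print(on_tpm.groupby("diagnosis")["over_one_year"].mean())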

    790-2 Baseline Electrocardiogram Predicts 30-day Mortality Among 32,812 Patients with Acute Myocardial Infarction Treated with Thrombolysis

    Get PDF
    To determine the initial electrocardiographic variables predictive of survival among patients with acute myocardial infarction, we analyzed the baseline 12-lead ECGs of 32,812 patients enrolled in the GUSTO trial. All patients had ≥0.1 mV of ST-segment elevation in at least one lead and received thrombolytic therapy. Those with LBBB or a ventricular rhythm were excluded from analysis. Clinical follow-up was >99.5% complete. 2218 (6.8%) patients died within 30 days of the initial ECG. Death within 30 days was more common in patients with RBBB (17%), LAFB (14%), and LPFB (17%) than in those with a normal conduction pattern (6%). Patients with ECG evidence of previous MI in a location distinct from the acute MI had a higher risk of death (9.8% vs. 5.9%) than those without prior infarction (p<0.0001). The variable with the greatest univariate predictive power for 30-day survival was the sum of the absolute ST-segment deviation across all leads (χ²=341), as shown in the accompanying mortality curve. Other ST-segment variables that predicted 30-day survival were the sum of ST-segment elevation across all leads (χ²=287), the maximum ST elevation in any one lead (χ²=257), and the number of leads with ST elevation (χ²=250). In multivariate modeling, the sum of the absolute ST deviations, the number of leads with ST elevation, prior MI on ECG, RBBB, and LAFB each added independent prognostic information. We conclude that an ECG at the time of presentation contains substantial prognostic information that can be used to help stratify risk among thrombolytic-treated patients with acute myocardial infarction.
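    The analysis described combines univariate screening of ST-segment variables with a multivariate model of 30-day death. A hedged sketch of that workflow is below, using logistic regression as the multivariate step; the data file and column names are placeholders, and the modeling choices are assumptions rather than the GUSTO investigators' exact methods.

        # Illustrative prognostic modeling on baseline ECG variables (placeholder data).
        # Column names are assumptions; the outcome is death within 30 days (0/1).
        import pandas as pd
        import statsmodels.formula.api as smf

        ecg = pd.read_csv("gusto_baseline_ecg.csv")  # hypothetical extract

        # Univariate screen: fit each ST-segment variable alone, compare LR chi-square.
        for var in ["sum_abs_st_deviation", "sum_st_elevation",
                    "max_st_elevation", "n_leads_st_elevation"]:
            fit = smf.logit(f"death_30d ~ {var}", data=ecg).fit(disp=0)
            print(var, "LR chi2 ~", round(2 * (fit.llf - fit.llnull), 1))

        # Multivariate model with the independent predictors named in the abstract.
        multi = smf.logit(
            "death_30d ~ sum_abs_st_deviation + n_leads_st_elevation"
            " + prior_mi_on_ecg + rbbb + lafb",
            data=ecg,
        ).fit(disp=0)
        print(multi.summary())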