
    Datacasting V3.0

    Datacasting V3.0 provides an RSS-based feed mechanism for publishing the availability of Earth science data records in real time. It also provides a utility for subscribing to these feeds and automatically sifting through their items to identify and download the data records required for a specific application. Datacasting is a method by which multiple data providers can publish the availability of new Earth science data and users download only those files that meet a predefined need; for example, only data files related to a specific earthquake or region of the globe. Datacasting is a server-client architecture. The server-side software is used by data providers to create and publish metadata about recently available data according to the Datacasting RSS (Really Simple Syndication) specification. The client software subscribes to Datacasting RSS and other RSS-based feeds. By configuring filters associated with feeds, data consumers can use the client to identify and automatically download files that meet a specific need. On the client side, a Datacasting feed reader monitors the server for new feeds. The feed reader is tuned by the user, via a graphical user interface (GUI), to examine the content of the feeds and initiate a data pull once certain criteria are satisfied. The criteria might be, for example, to download sea surface temperature data for a particular region with cloud cover of less than 50% during daylight hours. After a granule is downloaded to the client, the user can visualize the data in the GUI. Based on the popular concept of podcasting, which gives listeners the capability to download only those MP3 files that match their preferences, Earth science Datacasting gives users a method to download only the Earth science data files required for a particular application.
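
    As a concrete illustration of the client-side behavior described here, the sketch below polls a feed and downloads only granules that pass a cloud-cover and daylight filter. The feed URL, the metadata namespace, and the cloudCover/dayNightFlag tag names are hypothetical stand-ins, not the actual Datacasting RSS specification.

        # A minimal sketch of a Datacasting-style feed consumer; the feed URL,
        # namespace, and metadata tag names below are hypothetical assumptions.
        import urllib.request
        import xml.etree.ElementTree as ET

        FEED_URL = "http://example.com/datacasting/sst.rss"   # hypothetical feed
        NS = {"dc": "http://example.com/datacasting/ns"}       # hypothetical namespace

        def fetch_matching_items(feed_url, max_cloud_cover=50.0):
            """Return (title, data_url) pairs whose metadata pass the filter."""
            with urllib.request.urlopen(feed_url) as resp:
                root = ET.fromstring(resp.read())
            matches = []
            for item in root.iterfind("./channel/item"):
                cloud = item.findtext("dc:cloudCover", default=None, namespaces=NS)
                day_night = item.findtext("dc:dayNightFlag", default="", namespaces=NS)
                enclosure = item.find("enclosure")
                if enclosure is None or cloud is None:
                    continue
                # Filter: daylight granules with less than 50% cloud cover.
                if float(cloud) < max_cloud_cover and day_night.lower() == "day":
                    matches.append((item.findtext("title", ""), enclosure.get("url")))
            return matches

        for title, url in fetch_matching_items(FEED_URL):
            print(title, "->", url)  # a real client would download and visualize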

    Characterizing a Wake-Free Safe Zone for the Simplified Aircraft-Based Paired Approach Concept

    The Federal Aviation Administration (FAA) has proposed a concept of operations geared towards achieving increased arrival throughput at U.S. airports, known as the Simplified Aircraft-based Paired Approach (SAPA) concept. In this study, a preliminary characterization of a wake-free safe zone (WFSZ) for the SAPA concept was performed. The experiment employed Monte Carlo simulations of varying approach profiles flown by aircraft pairs to closely spaced parallel runways. Three runway lateral spacings were investigated (750 ft, 1000 ft, and 1400 ft), along with no stagger and 1500 ft of stagger between runway thresholds. The paired aircraft were flown in a leader/trailer configuration, with potential wake encounters detected using a wake detection surface translating with the trailing aircraft. The WFSZ is characterized in terms of the smallest observed initial in-trail distance leading to a wake encounter anywhere along the approach path. The results suggest that the WFSZ can be characterized in terms of two primary altitude regions, in ground effect (IGE) and out of ground effect (OGE), with the IGE region being the limiting case owing to its significantly smaller WFSZ. Runway stagger was observed to reduce the WFSZ size only modestly, predominantly in the OGE region.
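
    The structure of such a Monte Carlo experiment can be sketched in a few lines. The wake-transport and encounter tests below are toy placeholders, and every parameter range is an illustrative assumption; the study's actual aircraft, wake, and detection-surface models are far more detailed.

        # Toy Monte Carlo sketch of the WFSZ experiment's structure; all models
        # and parameter ranges are illustrative assumptions, not the study's.
        import random

        RUNWAY_SPACINGS_FT = [750.0, 1000.0, 1400.0]

        def wake_encounter(spacing_ft, in_trail_ft, crosswind_kt, altitude_ft):
            # Toy test: the leader's wake drifts toward the adjacent runway with
            # the crosswind for the time it takes to cover the in-trail gap, and
            # persists longer in ground effect (IGE), mimicking the limiting case.
            drift_ft = crosswind_kt * 1.69 * (in_trail_ft / 220.0)  # kt -> ft/s; ~220 ft/s approach
            persistence = 1.0 if altitude_ft < 200.0 else 0.5       # IGE wakes persist longer
            return drift_ft * persistence >= spacing_ft

        def smallest_encounter_distance(spacing_ft, trials=100_000):
            # WFSZ metric from the study: smallest initial in-trail distance
            # that produced a wake encounter anywhere along the approach.
            worst = None
            for _ in range(trials):
                in_trail = random.uniform(500.0, 8000.0)   # initial in-trail distance, ft
                crosswind = random.uniform(0.0, 25.0)      # crosswind, kt
                altitude = random.uniform(0.0, 3000.0)     # altitude at the check, ft
                if wake_encounter(spacing_ft, in_trail, crosswind, altitude):
                    worst = in_trail if worst is None else min(worst, in_trail)
            return worst

        for spacing in RUNWAY_SPACINGS_FT:
            print(f"{spacing:.0f} ft spacing -> {smallest_encounter_distance(spacing)}")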

    Earth Science Datacasting v2.0

    The Datacasting software, which consists of a server and a client, has been developed as part of the Earth Science (ES) Datacasting project. The goal of ES Datacasting is to give scientists the ability to automatically and continuously download Earth science data that meets a precise, predefined need, and then to instantaneously visualize it on a local computer. This is achieved by applying the concept of podcasting to deliver science data over the Internet using RSS (Really Simple Syndication) XML feeds. By extending the RSS specification, scientists can filter a feed and download only the files required for a particular application (for example, only files that contain information about a particular event, such as a hurricane or flood). The extension also enables the client to understand the format of the data and visualize the information locally. The server component enables a data provider to create and serve basic Datacasting (RSS-based) feeds. The user can subscribe to any number of feeds, view the information related to each item within a feed (including pre-made browse images), manually download the files associated with items, and place these files in a local store. The client-server architecture enables users to: a) subscribe to and interpret multiple Datacasting feeds (with the same look and feel as a typical mail client), b) maintain a list of all items within each feed, c) filter these lists on the different metadata attributes contained within the feed, so that a list references only data files of interest, d) visualize the referenced data and associated metadata, e) download files referenced within a list, and f) automatically download files as new items become available.
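
    A minimal sketch of the filtering step in item c), assuming each feed item has been reduced to a dictionary of metadata attributes; the attribute names and operator set are illustrative, not the client's actual schema.

        # Sketch of metadata filtering over feed items; attribute names and the
        # operator set are illustrative assumptions.
        OPS = {
            "<":  lambda a, b: a < b,
            ">":  lambda a, b: a > b,
            "==": lambda a, b: a == b,
        }

        def matches(item, filters):
            """True if the item satisfies every (attribute, operator, value) rule."""
            return all(
                attr in item and OPS[op](item[attr], value)
                for attr, op, value in filters
            )

        items = [
            {"event": "hurricane", "cloud_cover": 30.0},
            {"event": "flood", "cloud_cover": 80.0},
        ]
        hurricane_filter = [("event", "==", "hurricane"), ("cloud_cover", "<", 50.0)]
        to_download = [it for it in items if matches(it, hurricane_filter)]
        print(to_download)  # only the hurricane granule passes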

    Wake Encounter Analysis for a Closely Spaced Parallel Runway Paired Approach Simulation

    A Monte Carlo simulation of simultaneous approaches performed by two transport-category aircraft from the final approach fix to a pair of closely spaced parallel runways was conducted to explore the aft boundary of the safe zone in which separation assurance and wake avoidance are provided. The simulation included variations in runway centerline separation, initial longitudinal spacing of the aircraft, crosswind speed, and aircraft speed during the approach. The simulation data showed that the majority of wake encounters occurred near or over the runway, and the aft boundaries of the safe zones were identified for all simulation conditions.
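
    One way to picture the reduction from raw simulated encounters to per-condition aft boundaries is sketched below; the record fields and the boundary rule (the closest-in encounter per condition) are illustrative assumptions, not the study's actual analysis code.

        # Sketch of aggregating simulated encounter records into per-condition
        # aft safe-zone boundaries; fields and values are hypothetical.
        from collections import defaultdict

        # (centerline_separation_ft, crosswind_kt, in_trail_spacing_ft_at_encounter)
        encounters = [
            (750.0, 5.0, 4200.0),
            (750.0, 5.0, 3900.0),
            (1000.0, 10.0, 5100.0),
        ]

        def aft_boundaries(records):
            # For each (separation, crosswind) condition, keep the smallest
            # in-trail spacing that produced a wake encounter: spacings closer
            # in than this value were encounter-free in the simulated runs, so
            # it marks the aft boundary of the safe zone for that condition.
            closest = defaultdict(lambda: float("inf"))
            for sep, wind, spacing in records:
                closest[(sep, wind)] = min(closest[(sep, wind)], spacing)
            return dict(closest)

        print(aft_boundaries(encounters))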

    Acute effects of nicotine on visual search tasks in young adult smokers

    Rationale: Nicotine is known to improve performance on tests involving sustained attention, and recent research suggests that nicotine may also improve performance on tests involving the strategic allocation of attention and working memory. Objectives: We used measures of accuracy and response latency combined with eye-tracking techniques to examine the effects of nicotine on visual search tasks. Methods: In experiment 1, smokers and non-smokers performed pop-out and serial search tasks. In experiment 2, we used a within-subject design and a more demanding search task for multiple targets. In both studies, 2-h abstinent smokers were asked to smoke one of their own cigarettes between baseline and test sessions. Results: In experiment 1, pop-out search times were faster after nicotine, without a loss in accuracy. Similar effects were observed for serial searches, but these were significant only at a trend level. In experiment 2, nicotine facilitated a strategic change in eye movements, resulting in a higher proportion of fixations on target letters. If the cigarette was smoked on the first trial (when the task was novel), nicotine additionally reduced the total number of fixations and refixations on all letters in the display. Conclusions: Nicotine improves visual search performance by speeding up search time and enabling a better focus of attention on task-relevant items. This appears to reflect more efficient inhibition of eye movements towards task-irrelevant stimuli and better active maintenance of task goals. When the task is novel, and therefore more difficult, nicotine lessens the need to refixate previously seen letters, suggesting an improvement in working memory.

    Influence of Beta-Blocker Continuation or Withdrawal on Outcomes in Patients Hospitalized With Heart Failure: Findings From the OPTIMIZE-HF Program

    Objectives: This study ascertains the relationship between continuation or withdrawal of beta-blocker therapy and clinical outcomes in patients hospitalized with systolic heart failure (HF). Background: Whether beta-blocker therapy should be continued or withdrawn during hospitalization for decompensated HF has not been well studied in a broad cohort of patients. Methods: The OPTIMIZE-HF (Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients with Heart Failure) program enrolled 5,791 patients admitted with HF in a registry with pre-specified 60- to 90-day follow-up at 91 academic and community hospitals throughout the U.S. Outcomes data were prospectively collected and analyzed according to whether beta-blocker therapy was continued, withdrawn, or not started. Results: Among 2,373 patients eligible for beta-blockers at discharge, 1,350 (56.9%) were receiving beta-blockers before admission and were continued on therapy, 632 (26.6%) were newly started, 79 (3.3%) had therapy withdrawn, and 303 (12.8%) were eligible but not treated. Continuation of beta-blockers was associated with a significantly lower risk- and propensity-adjusted hazard of post-discharge death (hazard ratio [HR]: 0.60; 95% confidence interval [CI]: 0.37 to 0.99, p = 0.044) and death/rehospitalization (odds ratio: 0.69; 95% CI: 0.52 to 0.92, p = 0.012) compared with no beta-blocker. In contrast, withdrawal of beta-blockers was associated with a substantially higher adjusted risk of mortality compared with continuation (HR: 2.3; 95% CI: 1.2 to 4.6, p = 0.013), but with a risk similar to that of HF patients eligible but not treated with beta-blockers. Conclusions: Continuation of beta-blocker therapy in patients hospitalized with decompensated HF is associated with lower post-discharge mortality risk and improved treatment rates. In contrast, withdrawal of beta-blocker therapy is associated with worse risk- and propensity-adjusted mortality. (Organized Program To Initiate Lifesaving Treatment In Hospitalized Patients With Heart Failure [OPTIMIZE-HF]; NCT00344513)

    A Student's Guide to Giant Viruses Infecting Small Eukaryotes: From Acanthamoeba to Zooxanthellae

    The discovery of infectious particles that challenge conventional thoughts concerning “what is a virus” has led to the evolution of a new field of study in the past decade. Here, we review knowledge and information concerning “giant viruses”, focusing not only on some of the best-studied systems but also making an effort to illuminate systems yet to be fully resolved. We conclude by demonstrating that there is an abundance of new host–virus systems that fall into this “giant” category, showing that this field of inquiry presents great opportunities for future research.

    108 AUROTHIOMALATE INHIBITS COX-2 EXPRESSION AND PGE2 PRODUCTION IN CHONDROCYTES BY INCREASING MKP-1 EXPRESSION AND DECREASING p38 AND JNK PHOSPHORYLATION

    The very high occurrence of cardiovascular events presents a major public health issue, because treatment remains suboptimal. Lowering LDL cholesterol (LDL-C) with statins, or with ezetimibe in combination with a statin, reduces major adverse cardiovascular events. The cardiovascular risk reduction in relation to the absolute LDL-C reduction is linear for most interventions, without evidence of attenuation or increased risk at low LDL-C levels. Opportunities for innovation in dyslipidaemia treatment should address the substantial risk of lipid-associated cardiovascular events among patients optimally treated per guidelines but who cannot achieve LDL-C goals and who could benefit from additional LDL-C-lowering therapy, or who experience side effects of statins. Fresh approaches are needed to identify promising drug targets early and develop them efficiently. The Cardiovascular Round Table of the European Society of Cardiology (ESC) convened a workshop to discuss new lipid-lowering strategies for cardiovascular risk reduction. Opportunities to improve treatment approaches and the efficient study of new therapies were explored. Circulating biomarkers may not be fully reliable proxy indicators of the relationship between treatment effect and clinical outcome. Mendelian randomization studies may better inform development strategies and refine treatment targets before Phase 3. Trials should match the drug to the appropriate lipid and patient profile, and guidelines may move towards a precision-based approach to individual patient management. Stakeholder collaboration is needed to ensure continued innovation and better international coordination of both regulatory aspects and guidelines. It should be noted that risk may also be addressed through increased attention to other risk factors such as smoking, hypertension, overweight, and inactivity.

    Early diagnosis of acute coronary syndrome.

    The diagnostic evaluation of acute chest pain has been augmented in recent years by advances in the sensitivity and precision of cardiac troponin assays, new biomarkers, improvements in imaging modalities, and the release of new clinical decision algorithms. This progress has enabled physicians to diagnose or rule out acute myocardial infarction earlier after the initial patient presentation, usually in emergency department settings, which may facilitate prompt initiation of evidence-based treatments, investigation of alternative diagnoses for chest pain, or discharge, and permit better utilization of healthcare resources. A non-trivial proportion of patients fall into an indeterminate category according to rule-out algorithms, and minimal evidence-based guidance exists for the optimal evaluation, monitoring, and treatment of these patients. Following a review of recent advances in the early diagnosis of acute coronary syndrome, the Cardiovascular Round Table of the ESC proposes approaches for the optimal application of early strategies in clinical practice to improve patient care. The following specific 'indeterminate' patient categories were considered: (i) patients with symptoms and high-sensitivity cardiac troponin <99th percentile; (iii) patients with high-sensitivity troponin >99th percentile but without dynamic change; and (iv) patients with symptoms and high-sensitivity troponin >99th percentile and dynamic change but without coronary plaque rupture/erosion/dissection. Definitive evidence is currently lacking on how to manage these patients whose early diagnosis is 'indeterminate', and these areas of uncertainty should be assigned a high priority for research.

    Lightning Jump Algorithm Development for the GOES-R Geostationary Lightning Mapper

    Current work on the lightning jump algorithm to be used in the GOES-R Geostationary Lightning Mapper (GLM) data stream is multifaceted, owing to the intricate interplay between storm tracking, GLM proxy data, and the performance of the lightning jump itself. This work outlines the progress of the last year, in which the performance of the lightning jump algorithm with automated storm tracking and GLM proxy data was assessed using over 700 storms from North Alabama. The cases analyzed coincide with previous semi-objective work performed using total lightning mapping array (LMA) measurements in Schultz et al. (2011). Analysis shows that key components of the algorithm (flash rate and sigma thresholds) have the greatest influence on its performance when validated against severe storm reports. Automated objective analysis using the GLM proxy data has shown probability of detection (POD) values around 60%, with false alarm rates (FAR) around 73%, using methodology similar to Schultz et al. (2011). However, when applying verification methods similar to those employed by the National Weather Service, POD values increase slightly (69%) and FAR values decrease (63%). The relationship between storm tracking and the lightning jump has also been tested in a real-time framework at NSSL. This system includes fully automated tracking by radar alone, real-time LMA and radar observations, and the lightning jump. Results indicate that the POD is strong, at 65%. However, the FAR is significantly higher than in Schultz et al. (2011) (50-80%, depending on various tracking/lightning jump parameters) when using storm reports for verification. Given known issues with Storm Data, the performance of the real-time jump algorithm is also being tested with high-density radar and surface observations from the NSSL Severe Hazards Analysis & Verification Experiment (SHAVE).
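
    For readers unfamiliar with the 2-sigma framework of Schultz et al. (2011), the sketch below shows the shape of such a test: a jump is flagged when the latest change in total flash rate exceeds twice the standard deviation of recent changes, subject to an activation threshold. Treat the binning, thresholds, and history length here as an illustrative reconstruction rather than the operational algorithm.

        # Sketch of a 2-sigma lightning jump test in the spirit of Schultz et
        # al. (2011); details are an illustrative reconstruction.
        import statistics

        def lightning_jump(flash_rates, sigma_threshold=2.0, min_rate=10.0):
            """flash_rates: average flash rates (flashes/min) in successive
            2-min bins, oldest first; True if the newest change is a jump."""
            if len(flash_rates) < 7 or flash_rates[-1] < min_rate:
                return False  # need history and an active storm
            # DFRDT: change in flash rate between consecutive 2-min periods.
            dfrdt = [b - a for a, b in zip(flash_rates, flash_rates[1:])]
            history, current = dfrdt[:-1], dfrdt[-1]
            return current > sigma_threshold * statistics.pstdev(history[-5:])

        rates = [2, 4, 6, 9, 11, 12, 30]  # flashes/min in successive 2-min bins
        print(lightning_jump(rates))      # True: the last change is anomalous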