
    Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and digital elevation model (DEM) data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest the needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather-model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
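    As a rough illustration of the combination step described above, the sketch below merges sparse GPS zenith wet delays with a weather-model PWV grid over DEM terrain. The function name, array layout, ~2 km wet-delay scale height, and the PWV-to-delay conversion factor (~6.2) are all illustrative assumptions, not the software's actual algorithm or interface.

```python
# Hedged sketch: combine sparse GPS zenith wet delays with a weather-model
# PWV grid and a DEM to form a higher-resolution tropospheric delay map.
# All names, constants, and the array layout are illustrative assumptions.
import numpy as np
from scipy.interpolate import griddata

def correction_map(gps_xy, gps_zwd, gps_elev, pwv_grid, dem, grid_xy,
                   scale_height=2000.0, pwv_to_zwd=6.2):
    """Return a zenith wet-delay map (same shape as dem), in meters.

    gps_xy:   (n, 2) station coordinates;  gps_zwd, gps_elev: (n,)
    pwv_grid: weather-model PWV in meters of water, same shape as dem
    grid_xy:  (dem.size, 2) flattened coordinates of the output grid
    """
    # 1. Refer each GPS delay to sea level, assuming wet delay decays
    #    roughly exponentially with elevation (scale height ~2 km).
    zwd_sea = gps_zwd * np.exp(gps_elev / scale_height)

    # 2. Interpolate sea-level delays to the output grid: GPS supplies
    #    the long wavelengths but its spacing misses fine structure.
    zwd_long = griddata(gps_xy, zwd_sea, grid_xy, method='linear')
    zwd_long = zwd_long.reshape(dem.shape)

    # 3. Map the interpolated field back down to the DEM terrain height.
    zwd_long *= np.exp(-dem / scale_height)

    # 4. Add short-wavelength structure from the weather model: convert
    #    PWV to wet delay and keep only the residual about its mean, so
    #    GPS still sets the absolute delay level.
    zwd_wx = pwv_to_zwd * pwv_grid
    return zwd_long + (zwd_wx - zwd_wx.mean())
```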

    Systematic adjudication of myocardial infarction end-points in an international clinical trial

    BACKGROUND: Clinical events committees (CEC) are used routinely to adjudicate suspected end-points in cardiovascular trials, but little information has been published about the various processes used. We reviewed results of the CEC process used to identify and adjudicate suspected end-point (post-enrolment) myocardial infarction (MI) in the large Platelet Glycoprotein IIb/IIIa in Unstable Angina: Receptor Suppression Using Integrilin (Eptifibatide) Therapy (PURSUIT) trial. METHODS: The PURSUIT trial randomised 10,948 patients with acute coronary syndromes to receive eptifibatide or placebo. A central adjudication process was established prospectively to identify all suspected MIs and adjudicate events based on protocol definitions of MI. Suspected MIs were identified by systematic review of data collection forms, cardiac enzyme results, and electrocardiograms. Two physicians independently reviewed all suspected events. If they disagreed on whether an MI had occurred, a committee of cardiologists adjudicated the case. RESULTS: The CEC identified 5005 patients with suspected infarction (46%), of whom 1415 (28%) were adjudicated as having end-point infarctions. As expected, the process identified more end-point events than did the site investigators. Absolute and relative treatment effects of eptifibatide were smaller when using CEC-determined MI rates rather than site-investigator-determined rates. The site-investigator reporting of MI and the CEC assessment of MI disagreed in 20% of the cases reviewed by the CEC. CONCLUSIONS: End-point adjudication by a CEC is important to provide standardised, systematic, independent, and unbiased assessment of end-points, particularly in trials that span geographic regions and clinical practice settings. Understanding the CEC process used is important in the interpretation of trial results and event rates.
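    The two-tier review logic described in METHODS can be summarised as a simple decision procedure. The sketch below is a hedged Python illustration; the class, field names, and escalation rule are drawn only from the abstract's description and are not PURSUIT's actual adjudication system.

```python
# Illustrative sketch of the adjudication flow described above (not
# PURSUIT's actual system): two independent physician reviews, with
# escalation to a cardiologist committee only when they disagree.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SuspectedMI:
    patient_id: str
    reviewer_a: bool                  # physician A: end-point MI occurred?
    reviewer_b: bool                  # physician B: end-point MI occurred?
    committee: Optional[bool] = None  # set only after committee review

def adjudicate(case: SuspectedMI) -> bool:
    """Return the final end-point determination for one suspected MI."""
    if case.reviewer_a == case.reviewer_b:
        return case.reviewer_a        # concordant reviews are final
    if case.committee is None:
        raise ValueError(f"{case.patient_id}: committee review required")
    return case.committee             # the committee breaks the tie
```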

    Disagreements between central clinical events committee and site investigator assessments of myocardial infarction endpoints in an international clinical trial: review of the PURSUIT study

    BACKGROUND: Limited information has been published regarding how specific processes for event adjudication can affect event rates in trials. We reviewed nonfatal myocardial infarctions (MIs) reported by site investigators in the international Platelet Glycoprotein IIb/IIIa in Unstable Angina: Receptor Suppression Using Integrilin (Eptifibatide) Therapy (PURSUIT) trial and those adjudicated by a central clinical events committee (CEC) to determine the reasons for differences in event rates. METHODS: The PURSUIT trial randomised 10,948 patients with acute coronary syndromes to receive eptifibatide or placebo. The primary end-point was death or post-enrolment MI at 30 days as assessed by the CEC; this end-point was also constructed using site-reported events. The CEC identified suspected MIs by systematic review of clinical, cardiac enzyme, and electrocardiographic data. RESULTS: The CEC identified 5005 (46%) suspected events, of which 1415 (28%) were adjudicated as MI. The site investigator and CEC assessments of whether an MI had occurred disagreed in 983 (20%) of the 5005 patients with suspected MI, mostly reflecting site misclassification of post-enrolment MIs (as enrolment MIs) or underreported periprocedural MIs. Patients for whom the CEC and site investigator agreed that no end-point MI had occurred had the lowest mortality at 30 days and between 30 days and 6 months, and those with agreement that an MI had occurred had the highest mortality. CONCLUSION: CEC adjudication provides a standard, systematic, independent, and unbiased assessment of end-points, particularly for trials that span geographic regions and clinical practice settings. Understanding the review process and the reasons for disagreement between CEC and site investigator assessments of MI is important for designing future trials and interpreting event rates between trials.

    Dopamine Transporter and Reward Anticipation in a Dimensional Perspective : A Multimodal Brain Imaging Study

    We would like to thank Christine Baron, Vincent Brulon, Stéphane LeHelleix, Stéphane Demphel, Claude Comtat, Frédéric Dollé, Philippe Gervais, and Renaud Maroy from the Service Hospitalier Frédéric Joliot for their efficient technical support and 11C radioligand preparation. We also thank Marie Prat, Audrey Pepin, and Audrey Mabondo for their help in PET processing, and Pr. Maria-Joao Santiago-Ribeiro and Dr Renaud de Beaurepaire for their involvement in the recruitment of participants.

    Being user-oriented: convergences, divergences, and the potentials for systematic dialogue between disciplines and between researchers, designers, and providers

    The challenge this panel addresses is drawn from intersecting literature reviews and critical commentaries focusing on: 1) user studies in multiple fields; and 2) the difficulties of bringing different disciplines and perspectives to bear on user‐oriented research, design, and practice. 1 The challenge is that while we have made some progress in collaborative work, we have some distance to go to become user‐oriented in inter‐disciplinary and inter‐perspective ways. The varieties of our approaches and solutions are, as some observers suggest, an increasing cacophony. One major difficulty is that most discussions are solution‐oriented, offering arguments of this sort ‐‐ if only we addressed users in this way… Each solution becomes yet another addition to the cacophony. This panel implements a central approach documented for its utility by communication researchers and long used by communication mediators and negotiators ‐‐ that of focusing not on communication but rather on meta‐communication: communicating about communication. The intent in the context of this panel is to help us refocus attention from too frequent polarizations between alternative solutions to the possibility of coming to understand what is behind the alternatives and where they point to experientially‐based convergences and divergences, both of which might potentially contribute to synergies. The background project for this panel comes from a series of in‐depth interviews with expert researchers, designers, and providers in three field groupings ‐‐ library and information science; human computer interaction/information technology; and communication and media studies. One set of interviews involved 5‐hour focus groups with directors of academic and public libraries serving 44 colleges and universities in central Ohio; the second involved one‐on‐one interviews averaging 50 minutes with 81 nationally and internationally known experts in the 3 fields, 25‐27 interviews per field. Using Dervin's Sense‐Making Methodological approach to interviewing, the expert interviews of both kinds asked each interviewee: what he/she considered to be the big unanswered questions about users and what explained why the questions have not been answered; and, what he/she saw as hindering versus helping in attempts to communicate about users across disciplinary and perspective gaps. 2 The panel consists of six teams, two from each field. Prior to the panel presentation at ASIST, each team will have read the set of interviews and completed impressionistic essays on the patterns and themes they saw as emerging. At this stage, team members will purposively not homogenize their differences and most will write solo‐authored essays that will be placed on a web‐site accessible to ASIST members prior to the November meeting. In addition, at least one systematic analysis will be completed and available online. 3 At the ASIST panel, each team's leader will present a brief and intentionally provocative impressionistic account of what his/her team came to understand about our struggles communicating across fields and perspectives about users. Again, each team will purposively not homogenize its own differences in viewpoints, but rather highlight them as fodder for discussion. A major purpose will be to invite audience members to join the panel in discussion. At least 20 minutes will be left open for this purpose.

    Characterizing Dynamic Changes in the Human Blood Transcriptional Network

    Gene expression data generated systematically in a given system over multiple time points provides a source of perturbation that can be leveraged to infer causal relationships among genes explaining network changes. Previously, we showed that food intake has a large impact on blood gene expression patterns and that these responses, either in terms of gene expression level or gene-gene connectivity, are strongly associated with metabolic diseases. In this study, we explored which genes drive the changes in gene expression patterns in response to time and food intake. We applied the Granger causality test and the dynamic Bayesian network to gene expression data generated from blood samples collected at multiple time points during the course of a day. Our simulation results show that combining many short time series is as powerful for inferring Granger causality as using a single long time series. Using the Granger causality test, we identified genes that were supported as the most likely causal candidates for the coordinated temporal changes in the network. These results show that PER1 is a key regulator of the blood transcriptional network, in which multiple biological processes are under circadian rhythm regulation. The fasted and fed dynamic Bayesian networks showed that over 72% of dynamic connections are self links. Finally, we show that different processes such as inflammation and lipid metabolism, which are disconnected in the static network, become dynamically linked in response to food intake, suggesting that increasing nutritional load leads to coordinated regulation of these biological processes. In conclusion, our results suggest that food intake has a profound impact on the dynamic co-regulation of multiple biological processes, such as metabolism, immune response, apoptosis, and circadian rhythm. These results could have broader implications for the design of studies of disease association and drug response in clinical trials.
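    For readers unfamiliar with the method, the sketch below runs a pairwise Granger causality test with statsmodels on synthetic data in which one series leads another by a single time point. The data, lag choice, and gene framing are illustrative assumptions; this is not the study's actual pipeline.

```python
# Hedged sketch of a pairwise Granger causality test of the kind the
# study applies to blood expression time series; data here are synthetic.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)

# Synthetic example: a candidate driver x (a PER1-like regulator, purely
# illustrative) whose fluctuations lead target y by one time point.
n = 200
x = rng.normal(size=n)
y = np.roll(x, 1) + 0.5 * rng.normal(size=n)

# grangercausalitytests expects a 2-column array and tests whether the
# SECOND column Granger-causes the FIRST.
data = np.column_stack([y, x])
res = grangercausalitytests(data, maxlag=2)

# p-value of the SSR F-test at lag 1: small values support "x drives y".
p_value = res[1][0]['ssr_ftest'][1]
print(f"lag-1 Granger p-value: {p_value:.3g}")
```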

    Origins Space Telescope: Baseline mission concept

    The Origins Space Telescope will trace the history of our origins from the time dust and heavy elements permanently altered the cosmic landscape to present-day life. How did galaxies evolve from the earliest galactic systems to those found in the Universe today? How do habitable planets form? How common are life-bearing worlds? To answer these alluring questions, Origins will operate at mid- and far-infrared (IR) wavelengths and offer powerful spectroscopic instruments and sensitivity three orders of magnitude better than that of the Herschel Space Observatory, the largest telescope flown in space to date. We describe the baseline concept for Origins recommended to the 2020 US Decadal Survey in Astronomy and Astrophysics. The baseline design includes a 5.9-m diameter telescope cryocooled to 4.5 K and equipped with three scientific instruments. A mid-infrared instrument (Mid-Infrared Spectrometer and Camera Transit spectrometer) will measure the spectra of transiting exoplanets in the 2.8 to 20 μm wavelength range and offer unprecedented spectrophotometric precision, enabling definitive exoplanet biosignature detections. The far-IR imager polarimeter will be able to survey thousands of square degrees with broadband imaging at 50 and 250 μm. The Origins Survey Spectrometer will cover wavelengths from 25 to 588 μm, making wide-area and deep spectroscopic surveys with spectral resolving power R ∼ 300, and pointed observations at R ∼ 40,000 and 300,000 with selectable instrument modes. Origins was designed to minimize complexity. The architecture is similar to that of the Spitzer Space Telescope and requires very few deployments after launch, while the cryothermal system design leverages James Webb Space Telescope technology and experience. A combination of current state-of-the-art cryocoolers and next-generation detector technology will enable Origins' natural background-limited sensitivity.
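    To make the quoted resolving powers concrete: R = λ/Δλ, so the smallest resolvable wavelength step scales inversely with R. A quick check at a representative far-IR wavelength (100 μm, chosen here purely for illustration, not a mission specification):

```python
# R = λ/Δλ relates resolving power to the smallest resolvable wavelength
# interval; 100 μm is an illustrative reference wavelength, not a spec.
wavelength_um = 100.0
for R in (300, 40_000, 300_000):
    delta_um = wavelength_um / R
    print(f"R ~ {R:>7,}: Δλ ≈ {delta_um:.1e} μm at {wavelength_um:.0f} μm")
```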