463 research outputs found

    The cosmic submillimeter background as a possible signature of the initial burst of galaxy formation

    We propose a heuristic model for the origin of the cosmic submillimeter background (SMB) reported by the Nagoya-Berkeley collaboration. The SMB is interpreted as a direct signature of an epoch of (initial) galaxy formation at z ~ 10-15. The sources of the SMB are proposed to be dust-shrouded starburst protogalaxies, similar to the luminous IRAS galaxies at low redshifts. We interpret them as the progenitors of the old stellar populations at low redshifts: ellipticals, bulges, and the stellar components of halos. The derived redshift of galaxy formation depends directly on the dust temperatures assumed for these objects. The corresponding look-back times are ~11.5 h_(75)^(-1) Gyr for Ω_0 = 0.1, or ~8.5 h_(75)^(-1) Gyr for Ω_0 = 1. The star formation history in an element of comoving volume was assumed to be a Gaussian in the rest frame, but this form is not critical for the models. Model spectra of the SMB were computed for values of the cosmological density parameter Ω_0 = 0.1 and 1, and the dust emissivity index n = 1 and 2. The largest allowed time scales for star formation in these models (expressed as the FWHM of the luminosity history) are in the range FWHM ~ 0.2-0.6 Gyr for the low-density models (Ω_0 = 0.1); for the high-density models (Ω_0 = 1), the allowed widths are about a factor of 2 lower. These widths are comparable to, or slightly larger than, the free-fall times for normal galaxies. In order not to overproduce the baryonic mass density, the initial mass function (IMF) in these starbursts must be biased toward high-mass stars; however, a substantial range in the IMF parameters is allowed. This postulated population of protogalaxies may be an important contributor to the diffuse soft X-ray background. Leaked (unobscured) starlight from these objects may give rise to a near-infrared background, at about the level detected by Matsumoto, Akiba, and Murakami. The predicted surface density of protogalaxies would be in the range ~10-100 arcmin^(-2), which is consistent with all relevant anisotropy measurements available at this time. The model also predicts that a considerable fraction of the mass density in the bulge and halo of our Galaxy would be provided by old white dwarfs, which may be detectable in deep surveys (a similar prediction was already made by Silk). Spectroscopic signatures of this population may be detectable with future space missions, e.g., SIRTF or LDR, and possibly also from the ground in the near-infrared and millimeter/submillimeter regions.
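
    As a minimal illustration of where the quoted look-back times come from, the sketch below integrates the Friedmann relation for a matter-plus-curvature cosmology with H_0 = 75 km/s/Mpc. The function names are ours and this is not the paper's code, just a check that Ω_0 = 0.1 and 1 give roughly 11.5 and 8.5 Gyr to z ~ 10-15.

    import numpy as np
    from scipy.integrate import quad

    H0 = 75.0                      # km/s/Mpc, i.e. h_75 = 1
    HUBBLE_TIME_GYR = 977.8 / H0   # 1/H0 expressed in Gyr

    def lookback_time_gyr(z, omega0):
        """Look-back time to redshift z in a matter + curvature model (no Lambda)."""
        E = lambda zp: np.sqrt(omega0 * (1 + zp)**3 + (1 - omega0) * (1 + zp)**2)
        integral, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), 0.0, z)
        return HUBBLE_TIME_GYR * integral

    for omega0 in (0.1, 1.0):
        print(f"Omega_0 = {omega0}: t_lb(z = 12) ~ {lookback_time_gyr(12, omega0):.1f} Gyr")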

    An FPGA-based Timing and Control System for the Dynamic Compression Sector

    A field-programmable gate array (FPGA) based timing and trigger control system has been developed for the Dynamic Compression Sector (DCS) user facility located at the Advanced Photon Source (APS) at Argonne National Laboratory. The DCS is a first-of-its-kind capability dedicated to dynamic compression science. All components of the DCS laser shock station (x-ray choppers, single-shot shutter, internal laser triggers, and shot diagnostics) must be synchronized with respect to the arrival of x-rays in the hutch. The FPGA, synchronized to the APS storage ring radio frequency (RF) clock (352 MHz), generates trigger signals for each stage of the laser and x-ray shutter system with low jitter. The system is composed of a Zynq FPGA, a debug card, line drivers, and a power supply. The delays and offsets of the trigger signals can be adjusted with high precision through a user-friendly graphical user interface (GUI). The details of the system architecture, timing requirements, and firmware and software implementation, along with a performance evaluation, are presented in this paper. The system offers low timing jitter (15.5 ps r.m.s.) with respect to the APS 352 MHz clock, suitable for the 50 ps r.m.s. x-ray bunch duration at the APS.
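
    As a rough sketch of the kind of arithmetic such a delay generator performs, the snippet below quantises a requested trigger delay to ticks of the 352 MHz RF clock; the names and interface are hypothetical, not the DCS firmware or GUI.

    RF_CLOCK_HZ = 352e6
    TICK_NS = 1e9 / RF_CLOCK_HZ    # one RF clock period, about 2.84 ns

    def delay_to_ticks(delay_ns):
        """Return the nearest integer tick count and the residual error in ns."""
        ticks = round(delay_ns / TICK_NS)
        return ticks, delay_ns - ticks * TICK_NS

    ticks, err = delay_to_ticks(1000.0)   # request a 1 microsecond delay
    print(f"{ticks} ticks, residual {err:+.3f} ns")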

    Using machine learning techniques to automate sky survey catalog generation

    We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results, which indicate that our approach is well suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
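
    The GID3* and O-BTree learners are not publicly available, so as a plainly labeled stand-in the sketch below trains scikit-learn's CART decision tree on fabricated star/galaxy features of the kind a survey pipeline measures; the feature meanings and data are invented for illustration.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n = 2000
    # Toy stand-ins for measured image attributes (magnitude, log-area, ellipticity, peak flux).
    X = rng.normal(size=(n, 4))
    y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)  # 1 = galaxy

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
    print(f"held-out accuracy: {tree.score(X_te, y_te):.2f}")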

    SpaceNet MVOI: a Multi-View Overhead Imagery Dataset

    Detection and segmentation of objects in overhead imagery is a challenging task. The variable density, random orientation, small size, and instance-to-instance heterogeneity of objects in overhead imagery call for approaches distinct from existing models designed for natural scene datasets. Though new overhead imagery datasets are being developed, they almost universally comprise a single view taken from directly overhead ("at nadir"), failing to address a critical variable: look angle. By contrast, views vary in real-world overhead imagery, particularly in dynamic scenarios such as natural disasters, where first looks are often over 40 degrees off-nadir. This represents an important challenge to computer vision methods, as changing view angle adds distortions, alters resolution, and changes lighting. At present, the impact of these perturbations on algorithmic detection and segmentation of objects is untested. To address this problem, we present an open-source Multi-View Overhead Imagery dataset, termed SpaceNet MVOI, with 27 unique looks from a broad range of viewing angles (-32.5 degrees to 54.0 degrees). Each of these images covers the same 665 square km geographic extent and is annotated with 126,747 building footprint labels, enabling direct assessment of the impact of viewpoint perturbation on model performance. We benchmark multiple leading segmentation and object detection models on: (1) building detection, (2) generalization to unseen viewing angles and resolutions, and (3) sensitivity of building footprint extraction to changes in resolution. We find that state-of-the-art segmentation and object detection models struggle to identify buildings in off-nadir imagery and generalize poorly to unseen views, presenting an important benchmark to explore the broadly relevant challenge of detecting small, heterogeneous target objects in visually dynamic contexts. Comment: Accepted into IEEE International Conference on Computer Vision (ICCV) 2019
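
    To make the per-angle evaluation concrete, the sketch below scores building-footprint masks at several look angles with intersection-over-union; the masks and the degradation model are fabricated by us, standing in for real model predictions on MVOI.

    import numpy as np

    def iou(pred, truth):
        """Intersection-over-union of two binary masks."""
        union = np.logical_or(pred, truth).sum()
        return np.logical_and(pred, truth).sum() / union if union else 1.0

    rng = np.random.default_rng(1)
    truth = rng.random((256, 256)) > 0.7                      # toy footprint mask
    for angle in (7.8, 29.1, 53.4):                           # sample look angles in degrees
        flips = rng.random(truth.shape) < 0.02 + angle / 500  # more errors further off-nadir
        pred = np.logical_xor(truth, flips)
        print(f"look angle {angle:5.1f} deg: IoU = {iou(pred, truth):.3f}")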

    High-resolution imaging of the double QSO 2345 + 007

    We present raw and maximum entropy restored images of the quasar pair (gravitational lens candidate) 2345 + 007 A and B. Restorations are performed using an implementation of the Gull-Skilling MEMSYS-3 package of maximum entropy method subroutines, designed to achieve subpixel resolution in certain data regimes. Extensive simulations of our data imply that we are able to detect structure in the restored images down to the 0.4" level. Using this method, we qualitatively confirm that component B is resolved and, at least at visual and red wavelengths, elongated in a direction almost perpendicular to the line joining A and B. We also find evidence for a color difference and for variation in the magnitude difference between the two components. We believe these data, in conjunction with recent spectroscopic results, favor the multiple-quasar rather than the gravitational-lens interpretation of the objects.
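
    MEMSYS-3 itself is proprietary, so the sketch below uses the well-known Richardson-Lucy iteration as a stand-in to illustrate iterative restoration of the same general kind; it is not the Gull-Skilling maximum entropy algorithm, and the toy data are ours.

    import numpy as np

    def richardson_lucy(data, psf, n_iter=50):
        """Multiplicative RL deconvolution; keeps the estimate positive throughout."""
        f = np.full_like(data, data.mean())
        for _ in range(n_iter):
            model = np.convolve(f, psf, mode="same")
            ratio = data / np.clip(model, 1e-12, None)
            f *= np.convolve(ratio, psf[::-1], mode="same")
        return f

    # Toy usage: blur two close point sources with a Gaussian PSF, then restore.
    truth = np.zeros(64); truth[[30, 34]] = [1.0, 0.6]
    psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0)**2); psf /= psf.sum()
    data = np.convolve(truth, psf, mode="same")
    restored = richardson_lucy(data, psf)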

    The effect of biomechanically focused injury prevention training on reducing anterior cruciate ligament injury risk among female community level athletes

    This study investigated changes in biomechanical risk factors following a 9-week body-weight based training intervention focused on dynamic control of the hip/trunk. Peak knee moments and lower limb muscle activation of female community level athletes (n=18), split into intervention (n=8) and comparison (n=10) groups, were measured during unplanned sidestepping before and after training. Following the 9 weeks, the comparison group showed decreased total activation of the muscles crossing the knee, accompanied by elevated peak knee valgus and internal rotation moments. These increases in peak knee valgus and internal rotation moments were not observed in the training intervention group. In the context of ACL injury risk, these findings suggest that participation in biomechanically focused training may mitigate the potentially deleterious effects of regular community level sport participation.

    Divergent confidence intervals among pre-specified analyses in the HiSTORIC stepped wedge trial: an exploratory post-hoc investigation

    BACKGROUND: The high-sensitivity cardiac troponin on presentation to rule out myocardial infarction (HiSTORIC) study was a stepped-wedge cluster randomised trial with long before-and-after periods, involving seven hospitals across Scotland. Results were divergent for the binary safety endpoint (type 1 or type 4b myocardial infarction or cardiac death) across certain pre-specified analyses, which warranted further investigation. In particular, the calendar-matched analysis produced an odds ratio in the opposite direction to the primary logistic mixed-effects model analysis. METHODS: Several post-hoc statistical models were fitted to each of the co-primary outcomes of length of hospital stay and safety events; these included adjusting for exposure time, incorporating splines, and fitting a random time effect. We improved control of patient characteristics over time by adjusting for multiple additional covariates using different methods: direct inclusion, regression adjustment for the propensity score, and weighting. A data augmentation approach was also conducted, aiming to reduce the effect of sparse-data bias. Finally, the raw data were examined. RESULTS: The new statistical models confirmed the results of the pre-specified trial analysis. In particular, the observed divergence between the calendar-matched and other analyses remained, even after performing the covariate adjustment methods and after using data augmentation. Divergence was particularly acute for the safety endpoint, which had an overall event rate of 0.36%. Examining the raw data was particularly helpful for assessing the sensitivity of the results to small changes in event rates and for identifying patterns in the data. CONCLUSIONS: Our experience shows the importance of conducting multiple pre-specified sensitivity analyses and examining the raw data, particularly for stepped wedge trials with low event rates or a small number of sites. Before-and-after analytical approaches that adjust for differences in patient populations but avoid direct modelling of the time trend should be considered in future stepped wedge trials with similar designs.
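
    As an illustration of one covariate adjustment method named above, the sketch below computes inverse-probability-of-treatment weights from a propensity score; the covariates and data are invented, and this is not the trial's analysis code.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def ipt_weights(X, treated):
        """Weights 1/e(X) for treated rows and 1/(1 - e(X)) for controls."""
        e = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
        return np.where(treated == 1, 1.0 / e, 1.0 / (1.0 - e))

    rng = np.random.default_rng(2)
    X = rng.normal(size=(500, 2))                              # toy patient covariates
    treated = (X[:, 0] + rng.normal(size=500) > 0).astype(int) # toy exposure indicator
    w = ipt_weights(X, treated)
    print(f"weight range: {w.min():.2f} to {w.max():.2f}")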

    Managing forensic recovery in the cloud

    As organisations move away from locally hosted computer services toward Cloud platforms, there is a corresponding need to ensure the forensic integrity of such instances. The primary reasons for concern are (i) the locus of responsibility, and (ii) the associated risk of legal sanction and financial penalty. Building upon previously proposed techniques for intrusion monitoring, we highlight the multi-level interpretation problem and propose enhanced monitoring of Cloud-based systems at diverse operational and data storage levels. This provides a basis for reviewing historical change across the hosted system and affords scope to identify any data impact from hostile action or 'friendly fire'.
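
    One concrete form such storage-level monitoring could take is periodic hashing of hosted files, so that later review can establish what changed and when; the scheme and names below are our sketch, not the authors' system.

    import hashlib, json, time
    from pathlib import Path

    def snapshot(root):
        """Map every file under root to the SHA-256 digest of its contents."""
        return {str(p): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in Path(root).rglob("*") if p.is_file()}

    def record_snapshot(root, log_path):
        """Append a timestamped snapshot to an audit log for later forensic review."""
        with open(log_path, "a") as log:
            log.write(json.dumps({"time": time.time(), "hashes": snapshot(root)}) + "\n")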