
    Long-Term Visual Object Tracking Benchmark

    We propose a new long video dataset (called Track Long and Prosper - TLP) and benchmark for single object tracking. The dataset consists of 50 HD videos from real-world scenarios, encompassing a duration of over 400 minutes (676K frames), making it more than 20-fold larger in average duration per sequence and more than 8-fold larger in total covered duration than existing generic datasets for visual tracking. The proposed dataset paves the way to suitably assess long-term tracking performance and to train better deep learning architectures (avoiding/reducing augmentation, which may not reflect real-world behaviour). We benchmark the dataset on 17 state-of-the-art trackers and rank them according to tracking accuracy and run-time speed. We further present a thorough qualitative and quantitative evaluation highlighting the importance of the long-term aspect of tracking. Our most interesting observations are (a) existing short-sequence benchmarks fail to bring out the inherent differences between tracking algorithms, which widen while tracking on long sequences, and (b) the accuracy of trackers drops abruptly on challenging long sequences, suggesting the need for research efforts in the direction of long-term tracking.
    Comment: ACCV 2018 (Oral)
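    The abstract ranks trackers by accuracy but does not spell out the metric here. As a minimal, hypothetical sketch (not the TLP benchmark's own evaluation code; names are illustrative), per-frame accuracy in single object tracking is commonly scored as the intersection-over-union (IoU) between predicted and ground-truth boxes, averaged over a sequence:

```python
# Minimal sketch: mean IoU of predicted vs. ground-truth boxes over one sequence.
# Boxes are (x, y, w, h) tuples; function names are illustrative, not from the paper.

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def mean_overlap(predicted_boxes, ground_truth_boxes):
    """Average per-frame IoU across all annotated frames of a (long) sequence."""
    scores = [iou(p, g) for p, g in zip(predicted_boxes, ground_truth_boxes)]
    return sum(scores) / len(scores) if scores else 0.0
```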

    Towards Space-like Photometric Precision from the Ground with Beam-Shaping Diffusers

    We demonstrate a path to hitherto unachievable differential photometric precisions from the ground, both in the optical and near-infrared (NIR), using custom-fabricated beam-shaping diffusers produced with specialized nanofabrication techniques. Such diffusers mold the focal-plane image of a star into a broad and stable top-hat shape, minimizing photometric errors due to non-uniform pixel response, atmospheric seeing effects, imperfect guiding, and the telescope-induced variable aberrations seen in defocusing. This PSF reshaping significantly increases the achievable dynamic range of our observations, increasing our observing efficiency and thus better averaging over scintillation. Diffusers work in both collimated and converging beams. We present diffuser-assisted optical observations demonstrating $62^{+26}_{-16}$ ppm precision in 30 minute bins on the nearby bright star 16 Cygni A (V=5.95) using the ARC 3.5m telescope---within a factor of $\sim$2 of Kepler's photometric precision on the same star. We also show a transit of WASP-85-Ab (V=11.2) and TRES-3b (V=12.4), where the residuals bin down to $180^{+66}_{-41}$ ppm in 30 minute bins for WASP-85-Ab---within a factor of $\sim$4 of the precision achieved by the K2 mission on this target---and to 101 ppm for TRES-3b. In the NIR, where diffusers may provide even more significant improvements over the current state of the art, our preliminary tests have demonstrated $137^{+64}_{-36}$ ppm precision for a $K_S = 10.8$ star on the 200" Hale Telescope. These photometric precisions match or surpass the expected photometric precision of TESS for the same magnitude range. This technology is inexpensive, scalable, easily adaptable, and can have an important and immediate impact on the observations of transits and secondary eclipses of exoplanets.
    Comment: Accepted for publication in ApJ. 30 pages, 20 figures
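    As an illustration of how a quoted precision "in 30 minute bins" is typically computed (a hedged sketch, not the authors' pipeline; function names and the synthetic data are hypothetical), one bins the normalized light curve into 30-minute intervals and reports the scatter of the bin means in parts per million:

```python
import numpy as np

def binned_precision_ppm(time_days, norm_flux, bin_minutes=30.0):
    """Standard deviation of bin-averaged normalized flux, in parts per million."""
    bin_days = bin_minutes / (24.0 * 60.0)
    edges = np.arange(time_days.min(), time_days.max() + bin_days, bin_days)
    which_bin = np.digitize(time_days, edges)
    bin_means = [norm_flux[which_bin == k].mean() for k in np.unique(which_bin)]
    return 1e6 * np.std(bin_means)

# Synthetic example: 4 hours at 30 s cadence with 500 ppm per-exposure white noise.
t = np.arange(0.0, 4.0 / 24.0, 30.0 / 86400.0)
flux = 1.0 + np.random.normal(0.0, 500e-6, t.size)
print(f"{binned_precision_ppm(t, flux):.0f} ppm in 30-minute bins")
```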

    ASCR/HEP Exascale Requirements Review Report

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand on the 2025 timescale is at least two orders of magnitude greater -- and in some cases more -- than what is currently available. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
    Comment: 77 pages, 13 figures; draft report, subject to further revision

    Effects of Thyroxine Exposure on Osteogenesis in Mouse Calvarial Pre-Osteoblasts

    The incidence of craniosynostosis is one in every 1,800-2,500 births. The gene-environment model proposes that if a genetic predisposition is coupled with environmental exposures, the effects can be multiplicative, resulting in severely abnormal phenotypes. At present, very little is known about the role of gene-environment interactions in modulating craniosynostosis phenotypes, but prior evidence suggests a role for endocrine factors. Here we report the effects of thyroid hormone exposure on murine calvaria cells. Murine-derived calvaria cells were exposed to critical doses of pharmaceutical thyroxine and analyzed after 3 and 7 days of treatment. Endpoint assays were designed to determine the effects of the hormone exposure on markers of osteogenesis and included a proliferation assay, a quantitative ALP activity assay, targeted qPCR for mRNA expression of Runx2, Alp, Ocn, and Twist1, a genechip array for 28,853 targets, and a targeted osteogenic microarray with qPCR confirmations. Exposure to thyroxine stimulated the cells to express ALP in a dose-dependent manner. No pattern of difference was observed for proliferation. Targeted RNA expression data confirmed expression increases for Alp and Ocn at 7 days in culture. The genechip array suggests substantive expression differences for 46 gene targets, and the targeted osteogenesis microarray indicated 23 targets with substantive differences. Eleven gene targets were chosen for qPCR confirmation because of their known association with bone or craniosynostosis (Col2a1, Dmp1, Fgf1, Fgf2, Igf1, Mmp9, Phex, Tnf, Htra1, Por, and Dcn). We confirmed substantive increases in mRNA for Phex, Fgf1, Fgf2, Tnf, Dmp1, Htra1, Por, Igf1, and Mmp9, and substantive decreases for Dcn. It appears thyroid hormone may exert its effects through increasing osteogenesis. The targets isolated suggest a possible interaction for those gene products associated with calvarial suture growth and homeostasis, as well as craniosynostosis. © 2013 Cray et al.
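    The qPCR confirmations above report relative mRNA increases and decreases. As an illustration only (the abstract does not state the study's exact quantification method, and the Ct values below are made up), relative expression is often computed with the 2^-ddCt method against a housekeeping gene and an untreated control:

```python
# Hypothetical 2^-ddCt fold-change calculation; Ct values are illustrative, not from the study.

def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression (treated vs. control) of a target gene, normalized to a
    reference (housekeeping) gene, using the standard 2^-ddCt method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# e.g. a target such as Alp after 7 days of thyroxine exposure vs. untreated control:
print(fold_change(ct_target_treated=22.1, ct_ref_treated=18.0,
                  ct_target_control=24.6, ct_ref_control=18.2))  # ~4.9-fold increase
```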

    DNA replication stress restricts ribosomal DNA copy number

    Ribosomal RNAs (rRNAs) in budding yeast are encoded by ~100–200 repeats of a 9.1 kb sequence arranged in tandem on chromosome XII, the ribosomal DNA (rDNA) locus. Copy number of rDNA repeat units in eukaryotic cells is maintained far in excess of the requirement for ribosome biogenesis. Despite the importance of the repeats for both ribosomal and non-ribosomal functions, it is currently not known how “normal” copy number is determined or maintained. To identify essential genes involved in the maintenance of rDNA copy number, we developed a droplet digital PCR-based assay to measure rDNA copy number in yeast and used it to screen a yeast conditional temperature-sensitive mutant collection of essential genes. Our screen revealed that low rDNA copy number is associated with compromised DNA replication. Further, subculturing yeast under two separate conditions of DNA replication stress selected for a contraction of the rDNA array independent of the replication fork blocking protein Fob1. Interestingly, cells with a contracted array grew better than their counterparts with normal copy number under conditions of DNA replication stress. Our data indicate that DNA replication stresses select for a smaller rDNA array. We speculate that this liberates scarce replication factors for use by the rest of the genome, which in turn helps cells complete DNA replication and continue to propagate. Interestingly, tumors from minichromosome maintenance 2 (MCM2)-deficient mice also show a loss of rDNA repeats. Our data suggest that a reduction in rDNA copy number may indicate a history of DNA replication stress, and that rDNA array size could serve as a diagnostic marker for replication stress. Taken together, these data begin to suggest the selective pressures that combine to yield a “normal” rDNA copy number.
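    A minimal sketch of the arithmetic behind a ddPCR copy-number estimate (illustrative only; the assay details and any normalization used in the screen are described in the paper, and the function and numbers below are hypothetical): the absolute concentration of the rDNA amplicon is divided by that of a single-copy reference locus measured from the same sample:

```python
def rdna_copies_per_genome(rdna_copies_per_ul, ref_copies_per_ul, ref_copies_per_genome=1):
    """Estimate the rDNA repeat count per genome from ddPCR absolute concentrations.

    rdna_copies_per_ul    -- measured concentration of the rDNA target amplicon
    ref_copies_per_ul     -- measured concentration of a single-copy reference locus
    ref_copies_per_genome -- copies of the reference locus per (haploid) genome
    """
    return rdna_copies_per_ul / ref_copies_per_ul * ref_copies_per_genome

# e.g. 15,000 rDNA copies/uL against 100 copies/uL of a single-copy gene -> ~150 repeats
print(rdna_copies_per_genome(15_000, 100))
```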

    Hierarchical Anatomical Brain Networks for MCI Prediction: Revisiting Volumetric Measures

    Owing to its clinical accessibility, T1-weighted MRI (Magnetic Resonance Imaging) has been extensively studied in the past decades for the prediction of Alzheimer's disease (AD) and mild cognitive impairment (MCI). The volumes of gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF) are the most commonly used measurements, resulting in many successful applications. It has been widely observed that disease-induced structural changes may not occur at isolated spots, but in several inter-related regions. Therefore, for better characterization of brain pathology, we propose in this paper a means to extract inter-regional correlation-based features from local volumetric measurements. Specifically, our approach involves constructing an anatomical brain network for each subject, with each node representing a Region of Interest (ROI) and each edge representing the Pearson correlation of tissue volumetric measurements between a pair of ROIs. As second-order volumetric measurements, network features are more descriptive but also more sensitive to noise. To overcome this limitation, a hierarchy of ROIs is used to suppress noise at different scales. Pairwise interactions are considered not only for ROIs at the same scale in the same layer of the hierarchy, but also for ROIs across different scales in different layers. To address the high dimensionality resulting from the large number of network features, a supervised dimensionality reduction method is further employed to embed a selected subset of features into a low-dimensional feature space while preserving discriminative information. We demonstrate with experimental results the efficacy of this embedding strategy in comparison with some other commonly used approaches. In addition, although the proposed method can be easily generalized to incorporate other metrics of regional similarity, the benefits of using Pearson correlation in our application are reinforced by the experimental results. Without requiring new sources of information, our proposed approach improves the accuracy of MCI prediction from that of conventional volumetric features to that of hierarchical network features, evaluated using data sets randomly drawn from the ADNI (Alzheimer's Disease Neuroimaging Initiative) dataset.
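    A minimal sketch of the single-scale network construction described above (assuming one row of tissue volumetric measurements per ROI for a given subject; the ROI hierarchy and the supervised embedding step are not reproduced here): the edge weights are the pairwise Pearson correlations, and the upper triangle of that matrix serves as the subject's network feature vector:

```python
import numpy as np

def network_features(roi_measures):
    """Upper triangle of the ROI-by-ROI Pearson correlation matrix, as one feature vector.

    roi_measures -- array of shape (n_rois, n_measures): one row per ROI holding its
                    volumetric measurements (e.g. GM/WM/CSF volumes) for a single subject.
    """
    corr = np.corrcoef(roi_measures)            # (n_rois, n_rois) edge weights
    upper = np.triu_indices_from(corr, k=1)     # drop the diagonal and duplicate edges
    return corr[upper]

# e.g. 90 ROIs, each described by 3 tissue volumes, for one subject:
subject = np.random.rand(90, 3)
print(network_features(subject).shape)          # (4005,) pairwise features
```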

    Do tabloids poison the well of social media? Explaining democratically dysfunctional news sharing

    This paper was accepted for publication in the journal New Media and Society; the definitive published version is available at https://doi.org/10.1177/1461444818769689.
    The use of social media for sharing political information and the status of news as an essential raw material for good citizenship are both generating increasing public concern. We add to the debates about misinformation, disinformation, and “fake news” using a new theoretical framework and a unique research design integrating survey data and analysis of observed news sharing behaviors on social media. Using a media-as-resources perspective, we theorize that there are elective affinities between tabloid news and misinformation and disinformation behaviors on social media. Integrating four data sets we constructed during the 2017 UK election campaign -- individual-level data on news sharing (N = 1,525,748 tweets), website data (N = 17,989 web domains), news article data (N = 641 articles), and data from a custom survey of Twitter users (N = 1,313 respondents) -- we find that sharing tabloid news on social media is a significant predictor of democratically dysfunctional misinformation and disinformation behaviors. We explain the consequences of this finding for the civic culture of social media and the direction of future scholarship on fake news.