
    OMEGA: a software tool for the management, analysis, and dissemination of intracellular trafficking data that incorporates motion type classification and quality control [preprint]

    MOTIVATION: Particle tracking coupled with time-lapse microscopy is critical for understanding the dynamics of intracellular processes of clinical importance. Spurred on by advances in the spatiotemporal resolution of microscopy and automated computational methods, this field is increasingly amenable to multi-dimensional high-throughput data collection schemes (Snijder et al., 2012). Typically, complex particle tracking datasets generated by individual laboratories are produced with incompatible methodologies that preclude direct comparison. There is therefore an unmet need for data management systems that facilitate data standardization, meta-analysis, and structured data dissemination. The integration of analysis, visualization, and quality control capabilities into such systems would eliminate the need for manual transfer of data to diverse downstream analysis tools. At the same time, it would lay the foundation for shared trajectory data, particle tracking, and motion analysis standards.

    RESULTS: Here, we present Open Microscopy Environment inteGrated Analysis (OMEGA), a cross-platform data management, analysis, and visualization system for particle tracking data, with particular emphasis on results from viral and vesicular trafficking experiments. OMEGA provides easy-to-use graphical interfaces to implement integrated particle tracking and motion analysis workflows while keeping track of error propagation and data provenance. Specifically, OMEGA: 1) imports image data and metadata from data management tools such as Open Microscopy Environment Remote Objects (OMERO; Allan et al., 2012); 2) tracks intracellular particles moving across time series of image planes; 3) facilitates parameter optimization and trajectory results inspection and validation; 4) performs downstream trajectory analysis and motion type classification; 5) estimates the uncertainty associated with motion analysis; and 6) facilitates storage and dissemination of analysis results, and analysis definition metadata, on the basis of our newly proposed Minimum Information About Particle Tracking Experiments (MIAPTE; Rigano & Strambio-De-Castillia, 2016; 2017) guidelines in combination with the OME-XML data model (Goldberg et al., 2005).
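    Step 4 of such workflows, motion type classification, commonly reduces each trajectory to a mean squared displacement (MSD) curve and bins trajectories by the fitted anomalous-diffusion exponent. The Python sketch below illustrates that general idea only; the function names and thresholds are illustrative assumptions, not OMEGA's actual implementation.

    ```python
    import numpy as np

    def mean_squared_displacement(xy: np.ndarray) -> np.ndarray:
        """MSD over all time lags for one 2D trajectory of shape (n_frames, 2)."""
        n = len(xy)
        return np.array([np.mean(np.sum((xy[lag:] - xy[:-lag]) ** 2, axis=1))
                         for lag in range(1, n)])

    def classify_motion(xy: np.ndarray, dt: float = 1.0) -> str:
        """Fit MSD(t) ~ t**alpha and bin alpha into coarse motion types."""
        msd = mean_squared_displacement(xy)
        lags = np.arange(1, len(msd) + 1) * dt
        # The slope of a log-log fit estimates the diffusion exponent alpha:
        # alpha well below 1 suggests confinement, alpha near 1 free
        # diffusion, and alpha above 1 directed transport.
        alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]
        if alpha < 0.7:      # illustrative threshold
            return "confined"
        if alpha > 1.3:      # illustrative threshold
            return "directed"
        return "diffusive"
    ```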

    The IPAC Image Subtraction and Discovery Pipeline for the intermediate Palomar Transient Factory

    We describe the near real-time transient-source discovery engine for the intermediate Palomar Transient Factory (iPTF), currently in operation at the Infrared Processing and Analysis Center (IPAC), Caltech. We coin this system the IPAC/iPTF Discovery Engine (or IDE). We review the algorithms used for PSF-matching, image subtraction, detection, photometry, and machine-learned (ML) vetting of extracted transient candidates. We also review the performance of our ML classifier. For a limiting signal-to-noise ratio of 4 in relatively unconfused regions, "bogus" candidates from processing artifacts and imperfect image subtractions outnumber real transients by ~10:1. This ratio can be considerably higher for image data with inaccurate astrometric and/or PSF-matching solutions. Despite this occasionally high contamination rate, the ML classifier is able to identify real transients with an efficiency (or completeness) of ~97% for a maximum tolerable false-positive rate of 1% when classifying raw candidates. All subtraction-image metrics, source features, ML probability-based real-bogus scores, contextual metadata from other surveys, and possible associations with known Solar System objects are stored in a relational database for retrieval by the various science working groups. We review our efforts in mitigating false positives and our experience in optimizing the overall system in response to the multitude of science projects underway with iPTF.
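    The quoted operating point (~97% efficiency at a 1% false-positive rate) corresponds to sweeping a cut on the real-bogus score. A minimal sketch of that bookkeeping, assuming labelled candidates where a higher score means more likely real (not IDE's actual code):

    ```python
    import numpy as np

    def efficiency_at_fpr(scores, labels, max_fpr=0.01):
        """Best efficiency achievable while false-positive rate <= max_fpr.

        scores: ML real-bogus scores (higher = more likely real)
        labels: 1 for real transients, 0 for bogus candidates
        """
        scores, labels = np.asarray(scores), np.asarray(labels)
        best_eff, best_cut = 0.0, None
        for cut in np.unique(scores):            # try each score as a cut
            keep = scores >= cut
            fpr = np.mean(keep[labels == 0])     # bogus passing the cut
            eff = np.mean(keep[labels == 1])     # real passing the cut
            if fpr <= max_fpr and eff > best_eff:
                best_eff, best_cut = eff, cut
        return best_cut, best_eff
    ```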

    The VISTA Science Archive

    We describe the VISTA Science Archive (VSA) and its first public release of data from five of the six VISTA Public Surveys. The VSA exists to support the VISTA Surveys through their lifecycle: the VISTA Public Survey consortia can use it during their quality control assessment of survey data products before submission to the ESO Science Archive Facility (ESO SAF); it supports their exploitation of survey data prior to its publication through the ESO SAF; and, subsequently, it provides the wider community with survey science exploitation tools that complement the data product repository functionality of the ESO SAF. This paper has been written in conjunction with the first public release of public survey data through the VSA and is designed to help its users understand the data products available and how the functionality of the VSA supports their varied science goals. We describe the design of the database and outline the database-driven curation processes that take data from nightly pipeline-processed and calibrated FITS files to create science-ready survey datasets. Much of this design, and the codebase implementing it, derives from our earlier WFCAM Science Archive (WSA), so this paper concentrates on the VISTA-specific aspects and on improvements made to the system in the light of experience gained in operating the WSA.
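    As a toy illustration of the kind of database-driven ingest step such curation begins with, the sketch below reads a calibrated FITS binary table and loads a few columns into a relational table. It is a sketch under stated assumptions (astropy is available; the extension carries columns named RA, DEC, and MAG), not the VSA codebase, which implements a far richer relational schema.

    ```python
    import sqlite3
    from astropy.io import fits

    def ingest_detections(fits_path: str, db_path: str) -> None:
        """Load a calibrated FITS detection table into a SQL table."""
        with fits.open(fits_path) as hdul:
            table = hdul[1].data          # first extension: the binary table
            db = sqlite3.connect(db_path)
            with db:                      # one transaction per ingested file
                db.execute("CREATE TABLE IF NOT EXISTS detection"
                           " (ra REAL, dec REAL, mag REAL)")
                db.executemany(
                    "INSERT INTO detection VALUES (?, ?, ?)",
                    ((float(r["RA"]), float(r["DEC"]), float(r["MAG"]))
                     for r in table))
            db.close()
    ```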

    Digital Image

    This paper considers the ontological significance of invisibility in relation to the question ‘what is a digital image?’ Its argument, in a nutshell, is that the emphasis on visibility comes at the expense of latency and is symptomatic of the style of thinking that has dominated Western philosophy since Plato. This privileging of visible content necessarily binds images to linguistic (semiotic and structuralist) paradigms of interpretation which promote representation, subjectivity, identity and negation over multiplicity, indeterminacy and affect. Photography is the case in point, because until recently critical approaches to photography had one thing in common: they all shared the implicit and incontrovertible understanding that photographs are a medium that must be approached visually; they took it as a given that photographs are there to be looked at; and they all agreed that it is only through the practices of spectatorship that the secrets of the image can be unlocked. Whatever subsequent interpretations followed, the priority of vision in relation to the image remained unperturbed. This undisputed belief in the visibility of the image has such a strong grasp on theory that it has imperceptibly bonded together otherwise dissimilar and sometimes contradictory methodologies, preventing them from noticing that which is most unexplained about images: the precedence of looking itself. This self-evident truth of visibility casts a long shadow on image theory because it blocks the possibility of inquiring after everything that is invisible, latent and hidden.

    Econometrics meets sentiment: an overview of methodology and applications

    The advent of massive amounts of textual, audio, and visual data has spurred the development of econometric methodology to transform qualitative sentiment data into quantitative sentiment variables, and to use those variables in an econometric analysis of the relationships between sentiment and other variables. We survey this emerging research field and refer to it as sentometrics, a portmanteau of sentiment and econometrics. We provide a synthesis of the relevant methodological approaches, illustrate with empirical results, and discuss useful software.
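    A toy version of the pipeline the survey describes: aggregate per-document sentiment scores into a per-period sentiment variable, then relate it to a target series by ordinary least squares. The numbers below are made up purely for illustration.

    ```python
    import numpy as np

    # Hypothetical inputs: one sentiment score per article, grouped by day,
    # and a daily target series (e.g. returns) of the same length.
    daily_scores = [[0.4, -0.1], [0.2], [-0.3, -0.5, 0.1]]
    sentiment = np.array([np.mean(day) for day in daily_scores])  # per-period index
    returns = np.array([0.012, -0.004, -0.019])

    # OLS intercept and slope for the regression returns ~ sentiment.
    X = np.column_stack([np.ones_like(sentiment), sentiment])
    beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
    print(f"intercept={beta[0]:.4f}, slope={beta[1]:.4f}")
    ```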