
    Integrating Remote Sensing and Geographic Information Systems

    Remote sensing and geographic information systems (GIS) comprise the two major components of geographic information science (GISci), an overarching field of endeavor that also encompasses global positioning systems (GPS) technology, geodesy and traditional cartography (Goodchild 1992, Estes and Star 1993, Hepner et al. 2005). Although remote sensing and GIS developed quasi-independently, the synergism between them has become increasingly apparent (Aronoff 2005). Today, GIS software almost always includes tools for display and analysis of images, and image processing software commonly contains options for analyzing 'ancillary' geospatial data (Faust 1998). The significant progress made in 'integration' of remote sensing and GIS has been well summarized in several reviews (Ehlers 1990, Mace 1991, Hinton 1996, Wilkinson 1996). Nevertheless, advances are so rapid that periodic reassessment of the state of the art is clearly warranted.

    Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    We describe the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra. Comment: Presented at PHYSTAT 2011, CERN, Geneva, Switzerland, January 2011; to be published in a CERN Yellow Report.
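
    The fitting strategy the abstract describes, maximizing a likelihood that includes nuisance parameters, can be illustrated with a small numerical sketch. The spectra, the 10% normalization uncertainty, and the parameter names below are illustrative assumptions rather than values from the paper, and only the simple multiplicative-factor case is shown.

```python
# A minimal sketch (not the paper's exact formulation) of a binned Poisson
# likelihood with one multiplicative nuisance parameter constrained by a
# Gaussian penalty term, maximized (as a minimized negative log-likelihood)
# numerically. All numbers below are invented for illustration.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

signal = np.array([2.0, 5.0, 9.0, 4.0])       # predicted signal spectrum
background = np.array([10.0, 8.0, 6.0, 5.0])  # predicted background spectrum
observed = np.array([13, 15, 14, 10])         # observed counts per bin
sigma_b = 0.10                                # assumed 10% background normalization uncertainty

def nll(params):
    """Negative log-likelihood in (mu, theta): signal strength and the
    background normalization nuisance parameter."""
    mu, theta = params
    expected = mu * signal + (1.0 + theta) * background
    if np.any(expected <= 0):
        return np.inf
    # Poisson term per bin plus a Gaussian constraint on the nuisance parameter.
    poisson_term = -poisson.logpmf(observed, expected).sum()
    constraint = 0.5 * (theta / sigma_b) ** 2
    return poisson_term + constraint

result = minimize(nll, x0=[1.0, 0.0], method="Nelder-Mead")
mu_hat, theta_hat = result.x
print(f"best-fit signal strength: {mu_hat:.2f}, nuisance parameter: {theta_hat:.3f}")
```

    Morphing parameters and per-bin statistical uncertainties on the predicted spectra would enter the same way: as additional arguments of the expected-counts function, each with its own constraint term.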

    Any Data, Any Time, Anywhere: Global Data Access for Science

    Data access is key to science driven by distributed high-throughput computing (DHTC), an essential technology for many major research projects such as High Energy Physics (HEP) experiments. However, achieving efficient data access becomes quite difficult when many independent storage sites are involved, because users are burdened with learning the intricacies of accessing each system and keeping careful track of data location. We present an alternate approach: the Any Data, Any Time, Anywhere (AAA) infrastructure. Combining several existing software products, AAA presents a global, unified view of storage systems (a "data federation"), a global filesystem for software delivery, and a workflow management system. We describe how one HEP experiment, the Compact Muon Solenoid (CMS), is utilizing the AAA infrastructure, and report some simple performance metrics. Comment: 9 pages, 6 figures; submitted to the 2nd IEEE/ACM International Symposium on Big Data Computing (BDC) 201
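
    To make the "global, unified view of storage systems" concrete, the toy sketch below models how a federation redirector might resolve a logical file name against independent storage sites. The class, site names, and paths are invented for illustration; this is not the AAA software, its protocols, or the CMS namespace.

```python
# Conceptual sketch (not the AAA implementation) of the idea behind a data
# federation: one logical namespace resolved against many independent sites.
# Site names and file catalogues below are invented for illustration.
from typing import Dict, List, Optional

class FederationRedirector:
    """Resolve a logical file name (LFN) to a site-specific access URL by
    asking each subscribed site whether it holds the file."""

    def __init__(self, site_catalogues: Dict[str, List[str]]):
        # site name -> list of LFNs that site claims to store
        self.site_catalogues = site_catalogues

    def locate(self, lfn: str) -> Optional[str]:
        for site, catalogue in self.site_catalogues.items():
            if lfn in catalogue:
                # In a real federation this URL would be served by the site's
                # storage element and discovered at query time.
                return f"root://{site}/{lfn}"
        return None  # file not present anywhere in the federation

redirector = FederationRedirector({
    "storage.site-a.example": ["/store/data/run1/events.root"],
    "storage.site-b.example": ["/store/mc/sample/events.root"],
})

# The user only ever names the logical file; the federation finds a replica.
print(redirector.locate("/store/mc/sample/events.root"))
```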

    Geocoded data structures and their applications to Earth science investigations

    A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
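
    As a rough illustration of the cellular versus linked distinction the abstract mentions, the sketch below encodes the same small map both as a raster grid and as a set of regions with cell lists, then merges them on shared geocodes. The 3x3 map, class codes, and representation details are invented and only loosely follow the paper's terminology.

```python
# Illustrative sketch of two geocoded data structures for one small map:
# a cellular (raster grid) representation and a linked (region-based)
# representation. The map and land-cover codes are hypothetical.
import numpy as np

# Cellular structure: each grid cell stores the attribute of that location.
raster = np.array([
    [1, 1, 2],
    [1, 2, 2],
    [3, 3, 2],
])  # 1 = water, 2 = forest, 3 = urban (hypothetical codes)

# Linked structure: each region stores its attribute once, plus the list of
# cell coordinates (a stand-in for boundary geometry) that it occupies.
linked = {
    "water":  [(0, 0), (0, 1), (1, 0)],
    "forest": [(0, 2), (1, 1), (1, 2), (2, 2)],
    "urban":  [(2, 0), (2, 1)],
}

# Merging data sets from diverse sources becomes a lookup on shared geocodes:
# here, rasterize the linked structure so it can be overlaid cell by cell.
codes = {"water": 1, "forest": 2, "urban": 3}
rebuilt = np.zeros_like(raster)
for name, cells in linked.items():
    for row, col in cells:
        rebuilt[row, col] = codes[name]

print(np.array_equal(raster, rebuilt))  # True: both encode the same map
```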

    Integrating Spatial Data Linkage and Analysis Services in a Geoportal for China Urban Research

    Many geoportals are now evolving into online analytical environments, where large amounts of data and various analysis methods are integrated. These spatiotemporal data are often distributed across different databases and exist in heterogeneous forms, even when they refer to the same geospatial entities. In addition, existing open standards lack sufficient expression of attribute semantics. Client applications or other services thus have to deal with unrelated preprocessing tasks, such as data transformation and attribute annotation, leading to potential inconsistencies. Furthermore, to build informative interfaces that guide users to quickly understand the analysis methods, an analysis service needs to explicitly model the method parameters, which are often interrelated and have rich auxiliary information. This work presents the design of the spatial data linkage and analysis services in a geoportal for China urban research. The spatial data linkage service aggregates multisource heterogeneous data into linked layers with flexible attribute mapping, giving client applications and services unified access, as if querying a single big table. The spatial analysis service extends the standard WPS service with parameter hierarchy and grouping, and adds data-dependent validation in its computation components. This platform can help researchers efficiently explore and analyze spatiotemporal data online.
    Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/110740/1/tgis12084.pd
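
    The "linked layer with flexible attribute mapping" idea can be sketched as a join over shared entity keys with source fields renamed to canonical ones. The column names, city codes, and attribute mapping below are invented for illustration and are not the geoportal's actual schema or service interface.

```python
# Minimal sketch (not the geoportal's actual service) of the linked-layer idea:
# heterogeneous attribute tables describing the same geospatial entities are
# mapped onto shared keys and queried as if they were one big table.
import pandas as pd

# Source A: census-style statistics keyed by an administrative city code.
census = pd.DataFrame({
    "city_code": ["110000", "310000"],
    "population_10k": [2189, 2487],
})

# Source B: a different provider using its own field names for the same cities.
economy = pd.DataFrame({
    "admin_id": ["110000", "310000"],
    "gdp_total": [40270, 43215],
})

# The linkage service's attribute mapping: source field -> canonical field.
attribute_mapping = {"admin_id": "city_code", "gdp_total": "gdp_100m_cny"}

linked_layer = census.merge(
    economy.rename(columns=attribute_mapping),
    on="city_code",
    how="outer",
)

# Client applications query the linked layer as a single unified table.
print(linked_layer)
```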

    From multisource data to clinical decision aids in radiation oncology:The need for a clinical data science community

    Big data are no longer an obstacle; now, by using artificial intelligence (AI), previously undiscovered knowledge can be found in massive data collections. The radiation oncology clinic daily produces a large amount of multisource data and metadata during its routine clinical and research activities. These data involve multiple stakeholders and users. Because of a lack of interoperability, most of these data remain unused, and powerful insights that could improve patient care are lost. Changing the paradigm by introducing powerful AI analytics and a common vision for empowering big data in radiation oncology is imperative. However, this can only be achieved by creating a clinical data science community in radiation oncology. In this work, we present why such a community is needed to translate multisource data into clinical decision aids.

    Digital Injustice: A Case Study of Land Use Classification Using Multisource Data in Nairobi, Kenya

    The utilisation of big data has emerged as a critical instrument for land use classification and decision-making processes due to its high spatiotemporal accuracy and its ability to reduce manual data collection. However, the reliability and feasibility of big data remain controversial, most importantly whether it can represent the whole population fairly. The present study incorporates multiple data sources to facilitate land use classification while demonstrating the existence of digital injustice caused by data bias. Using Nairobi, Kenya, as a case study and employing a random forest classifier as a benchmark, this research combines satellite imagery, night-time light images, building footprints, Twitter posts, and street view images. The land use classification results also reveal data bias resulting from the inadequate coverage of social media and street view data, potentially contributing to injustice in big data-informed decision-making. Strategies to mitigate such digital injustice are briefly discussed; a more in-depth exploration remains for future work.
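
    The benchmark setup the abstract names, a random forest classifier over features derived from several sources, can be sketched as follows. The feature names, the synthetic data, and the class labels are placeholders, not the study's actual Nairobi dataset or feature engineering.

```python
# Hedged sketch of a random forest land use classifier over multisource
# features. The data are random placeholders, so the model structure, not
# the accuracy, is the point of this example.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_parcels = 500

# One row per land parcel; columns stand in for multisource features.
X = np.column_stack([
    rng.normal(0.4, 0.2, n_parcels),   # e.g. a spectral index from satellite imagery
    rng.gamma(2.0, 5.0, n_parcels),    # night-time light radiance
    rng.poisson(30, n_parcels),        # building footprint count
    rng.poisson(5, n_parcels),         # geotagged tweet count
    rng.normal(0.0, 1.0, n_parcels),   # street-view-derived feature score
])
y = rng.integers(0, 4, n_parcels)      # four hypothetical land use classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

    The coverage bias the study discusses would appear here as systematically missing or zero-valued social media and street view features for under-represented parcels, which is exactly what the classifier then fails to learn.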

    Assessment of emergency medicine residents: a systematic review

    Background: Competency-based medical education is becoming the new standard for residency programs, including Emergency Medicine (EM). To inform programmatic restructuring, guide resources and identify gaps in publication, we reviewed the published literature on types and frequency of resident assessment. Methods: We searched MEDLINE, EMBASE, PsycInfo and ERIC from Jan 2005 - June 2014. MeSH terms included "assessment," "residency," and "emergency medicine." We included studies on EM residents reporting either of two primary outcomes: 1) assessment type and 2) assessment frequency per resident. Two reviewers screened abstracts, reviewed full text studies, and abstracted data. Reporting of assessment-related costs was a secondary outcome. Results: The search returned 879 articles; 137 articles were full-text reviewed; 73 met inclusion criteria. Half of the studies (54.8%) were pilot projects and one-quarter (26.0%) described fully implemented assessment tools/programs. Assessment tools (n=111) comprised 12 categories, most commonly: simulation-based assessments (28.8%), written exams (28.8%), and direct observation (26.0%). Median assessment frequency (n=39 studies) was twice per month/rotation (range: daily to once in residency). No studies thoroughly reported costs. Conclusion: EM resident assessment commonly uses simulation or direct observation, done once per rotation. Implemented assessment systems and assessment-associated costs are poorly reported. Moving forward, routine publication will facilitate transitioning to competency-based medical education.