
    Data Driven Discovery in Astrophysics

    We review some aspects of the current state of data-intensive astronomy, its methods, and some outstanding data analysis challenges. Astronomy is at the forefront of "big data" science, with exponentially growing data volumes and data rates and ever-increasing complexity, now entering the Petascale regime. Telescopes and observatories, both ground- and space-based and covering the full range of wavelengths, feed data via processing pipelines into dedicated archives, where it can be accessed for scientific analysis. Most of the large archives are connected through the Virtual Observatory framework, which provides interoperability standards and services and effectively constitutes a global data grid for astronomy. Making discoveries in this overabundance of data requires the application of novel machine learning tools. We describe some recent examples of such applications.
    Comment: Keynote talk in the proceedings of the ESA-ESRIN Conference: Big Data from Space 2014, Frascati, Italy, November 12-14, 2014; 8 pages, 2 figures
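    A common first step before applying the machine-learning tools this abstract alludes to is flagging outliers in a photometric catalog. The sketch below is a minimal, generic illustration using iterative sigma clipping on invented synthetic magnitudes; it is not a method or dataset from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic magnitudes: 1000 well-behaved sources plus three injected anomalies.
mags = rng.normal(loc=18.0, scale=0.05, size=1000)
mags = np.concatenate([mags, [15.0, 21.5, 22.0]])

def sigma_clip(x, nsigma=5.0, iters=3):
    """Iteratively mask points more than nsigma standard deviations from the mean."""
    mask = np.ones_like(x, dtype=bool)
    for _ in range(iters):
        mu, sd = x[mask].mean(), x[mask].std()
        mask = np.abs(x - mu) < nsigma * sd
    return mask

inliers = sigma_clip(mags)
n_outliers = int((~inliers).sum())  # the injected anomalies are flagged
```

    Sources flagged this way would then be passed to a classifier or anomaly-ranking model for follow-up; the clipping itself is deliberately simple and assumption-light.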

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not been fully recognized, and a methodology of its own has not yet been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.
    Comment: 59 pages

    General cost analysis for scholarly communication in Germany: results of the "Houghton Report" for Germany

    Management Summary: Conducted within the project “Economic Implications of New Models for Information Supply for Science and Research in Germany”, the Houghton Report for Germany provides a general cost and benefit analysis of scholarly communication in Germany, comparing different scenarios according to their specific costs and explicitly including the German National License Program (NLP). Based on the scholarly lifecycle process model outlined by Björk (2007), the study compared the following scenarios according to their accounted costs:
    - Traditional subscription publishing;
    - Open Access publishing (Gold Open Access; refers primarily to journal publishing where access is free of charge to readers, while the authors or funding organisations pay for publication);
    - Open Access self-archiving (authors deposit their work in online open access institutional or subject-based repositories, making it freely available to anyone with Internet access), further divided into (i) ‘Green Open Access’ self-archiving operating in parallel with subscription publishing, and (ii) the ‘overlay services’ model, in which self-archiving provides the foundation for overlay services (e.g. peer review, branding and quality control services);
    - the NLP.
    Within all scenarios, five core activity elements (fund research and research communication; perform research and communicate the results; publish scientific and scholarly works; facilitate dissemination, retrieval and preservation; study publications and apply the knowledge) were modelled and priced with all their constituent activities. Modelling the impacts of an increase in accessibility and efficiency resulting from more open access on returns to R&D over a 20-year period and then comparing costs and benefits, we find that the benefits of open access publishing models are likely to substantially outweigh the costs and that, while smaller, the benefits of the German NLP also exceed its costs.
    This analysis of the potential benefits of more open access to research findings suggests that different publishing models can make a material difference to the benefits realised as well as the costs faced. It seems likely that more Open Access would have substantial net benefits in the longer term and, while net benefits may be lower during a transitional period, they are likely to be positive both for ‘author-pays’ Open Access publishing and the ‘overlay journals’ alternatives (‘Gold Open Access’) and for parallel subscription publishing and self-archiving (‘Green Open Access’). The NLP returns substantial benefits and savings at a modest cost, yielding one of the highest benefit/cost ratios available from unilateral national policies during a transitional period (second only to that of ‘Green Open Access’ self-archiving). Whether ‘Green Open Access’ self-archiving in parallel with subscriptions is a sustainable model over the longer term is debatable, and what impact the NLP may have on the take-up of Open Access alternatives is also an important consideration, as is the potential for developments in Open Access or other scholarly publishing business models to significantly change the relative cost-benefit of the NLP over time. The results are comparable to those of previous studies from the UK and the Netherlands: Green Open Access in parallel with the traditional model yields the best benefit/cost ratio. Beyond its benefit/cost ratio, the case for the NLP rests on its enforceability. The true costs of toll-access publishing, besides the "buyback" of information, include the prohibition of access to research and knowledge for society.
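    The benefit/cost comparison over a 20-year horizon described above amounts to discounting annual cost and benefit streams to present value and taking their ratio. The sketch below shows the arithmetic only; the annual amounts and the discount rate are invented placeholders, not figures from the Houghton Report.

```python
# Hypothetical benefit/cost ratio over a 20-year horizon.
# All monetary figures and the discount rate are invented placeholders.
def npv(annual_amount, rate, years):
    """Net present value of a constant annual amount received for `years` years."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

rate = 0.035   # assumed discount rate
years = 20
costs = npv(10.0, rate, years)      # e.g. 10 (units of currency) per year in costs
benefits = npv(24.0, rate, years)   # e.g. 24 per year in returns to R&D
bc_ratio = benefits / costs         # ratio > 1 means benefits outweigh costs
```

    With constant annual streams the discounting cancels in the ratio; it matters once costs and benefits follow different time profiles, as in a transitional period.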

    The effects of problem-based learning in students reading comprehension for mastering the content and vocabulary acquisition

    English reading comprehension is among the most crucial language abilities in colleges where English is learned as a second language. A strong reading ability is essential in primary studies if students are to succeed in advanced education. However, the lecture and paper-and-pencil methods by which students are typically instructed are less relevant and effective in the 21st century: conventional teaching and learning leave students passive, demotivated, bored and shy. The purpose of this study is to examine the effectiveness of problem-based learning (PBL) on students' English reading comprehension for mastering the content and on vocabulary acquisition. The study used quantitative data from an experimental pretest-posttest design; the independent t-test and one-way ANOVA were performed to test the hypotheses of the study. The results indicated statistically significant differences between the experimental and control groups in students' reading comprehension for mastering the content and in vocabulary acquisition using PBL: students who received the PBL approach achieved higher performance outcomes than students who did not.

    Towards an Observational Appraisal of String Cosmology

    We review the current observational status of string cosmology when confronted with experimental datasets. We begin by defining common observational parameters and discuss how they are determined for a given model. We then review the observable footprints of several string theoretic models, discussing the significance of various potential signals. Throughout, we comment on present and future prospects of finding evidence for string theory in cosmology and on significant issues for the future.
    Comment: Review accepted for publication in the CQG focus issue on string cosmology. Minor clarifications and references added