95,181 research outputs found

    Grid Analysis of Radiological Data

    IGI-Global Medical Information Science Discoveries Research Award 2009. Grid technologies and infrastructures can contribute to harnessing the full power of computer-aided image analysis in clinical research and practice. Given the volume of data, the sensitivity of medical information, and the combined complexity of medical datasets and the computations expected in clinical practice, the challenge is to fill the gap between grid middleware and the requirements of clinical applications. This chapter reports on the goals, achievements, and lessons learned from the AGIR (Grid Analysis of Radiological Data) project. AGIR addresses this challenge through a combined approach. On one hand, extending the grid middleware with core grid medical services (data management, responsiveness, compression, and workflows) targets the requirements of medical data processing applications. On the other hand, grid-enabling a panel of applications ranging from algorithmic research to clinical use cases both exploits and drives the development of these services.

    The ARIEL Instrument Control Unit design for the M4 Mission Selection Review of the ESA's Cosmic Vision Program

    The Atmospheric Remote-sensing Infrared Exoplanet Large-survey mission (ARIEL) is one of the three current candidates for the ESA M4 (fourth medium mission) launch opportunity. The proposed Payload will perform a large unbiased spectroscopic survey from space of the nature of exoplanet atmospheres and interiors to determine the key factors affecting the formation and evolution of planetary systems. ARIEL will observe a large number (>500) of warm and hot transiting gas giants, Neptunes, and super-Earths around a wide range of host star types, targeting planets hotter than 600 K to take advantage of their well-mixed atmospheres. It will exploit primary and secondary transit spectroscopy in the 1.2-8 ÎŒm spectral range and broad-band photometry in the optical and near IR (NIR). The main instrument of the ARIEL Payload is the IR Spectrometer (AIRS), providing low-resolution spectroscopy in two IR channels: Channel 0 (CH0) for the 1.95-3.90 ÎŒm band and Channel 1 (CH1) for the 3.90-7.80 ÎŒm range. It is located at the intermediate focal plane of the telescope and common optical system, and it hosts two IR sensors and two cold front-end electronics (CFEE) for detector readout, a well-defined process calibrated for the selected target brightness and driven by the Payload's Instrument Control Unit (ICU). Comment: Experimental Astronomy, Special Issue on ARIEL (2017).

    The Evidence Hub: harnessing the collective intelligence of communities to build evidence-based knowledge

    Conventional document and discussion websites provide users with no help in assessing the quality or quantity of evidence behind any given idea. Moreover, the very meaning of what evidence is may not be unequivocally defined within a community, and may require deep understanding, common ground, and debate. An Evidence Hub is a tool to pool a community's collective intelligence on what counts as evidence for an idea. It provides an infrastructure for debating and building evidence-based knowledge and practice. An Evidence Hub is best thought of as a filter onto other websites: a map that distills the most important issues, ideas, and evidence from the noise by making clear why ideas and web resources may be worth further investigation. This paper describes the Evidence Hub concept and rationale, the breadth of user engagement, and the evolution of specific features, derived from our work with different community groups in the healthcare and educational sectors.

    Launching the Grand Challenges for Ocean Conservation

    The ten most pressing Grand Challenges in Ocean Conservation were identified at the Oceans Big Think and described in a detailed working document:
    1. A Blue Revolution for Oceans: Reengineering Aquaculture for Sustainability
    2. Ending and Recovering from Marine Debris
    3. Transparency and Traceability from Sea to Shore: Ending Overfishing
    4. Protecting Critical Ocean Habitats: New Tools for Marine Protection
    5. Engineering Ecological Resilience in Near Shore and Coastal Areas
    6. Reducing the Ecological Footprint of Fishing through Smarter Gear
    7. Arresting the Alien Invasion: Combating Invasive Species
    8. Combatting the Effects of Ocean Acidification
    9. Ending Marine Wildlife Trafficking
    10. Reviving Dead Zones: Combating Ocean Deoxygenation and Nutrient Runoff

    Harnessing data flow and modelling potentials for sustainable development

    Tackling some of the global challenges relating to health, poverty, business, and the environment is known to depend heavily on the flow and utilisation of data. However, while enhancements in data generation, storage, modelling, and dissemination, and the related integration of global economies and societies, are fast transforming the way we live and interact, the resulting dynamic, globalised information society remains digitally divided. On the African continent in particular, the division has resulted in a gap between knowledge generation and its transformation into tangible products and services, which Kirsop and Chan (2005) attribute to a broken information flow. This paper proposes some fundamental approaches for a sustainable transformation of data into knowledge for the purpose of improving people's quality of life. Its main strategy is based on a generic data-sharing model providing access to data-utilising and data-generating entities in a multidisciplinary environment. It highlights the great potential of unsupervised and supervised modelling in tackling the typically predictive challenges we face. Using both simulated and real data, the paper demonstrates how some of the key parameters may be generated and embedded in models to enhance their predictive power and reliability. Its main outcomes include a proposed implementation framework setting the scene for the creation of decision support systems capable of addressing the key issues in society. It is expected that a sustainable data flow will forge synergies between the private sector and academic and research institutions within and between countries. It is also expected that the paper's findings will help in the design and development of knowledge extraction from data in the wake of cloud computing and, hence, contribute towards improving people's overall quality of life.
    To avoid high implementation costs, selected open source tools are recommended for developing and sustaining the system. Key words: Cloud Computing, Data Mining, Digital Divide, Globalisation, Grid Computing, Information Society, KTP, Predictive Modelling and STI
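The unsupervised/supervised split the abstract describes can be sketched on simulated data. This is an illustration only, not the paper's actual model: the data, the two-group structure, and the simple cluster and nearest-class-mean methods are all assumptions chosen to show how structure is first recovered without labels and then a predictive model's reliability is gauged on held-out data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated data with two latent groups, echoing the paper's use of
# simulated datasets to demonstrate predictive modelling (assumption:
# two well-separated Gaussian clusters).
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Unsupervised step: a minimal k-means loop recovers the group structure
# without using the labels at all.
centroids = X[[0, -1]].copy()
for _ in range(10):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([X[labels == k].mean(axis=0) for k in range(2)])

# Supervised step: a nearest-class-mean classifier, with a held-out set
# to estimate predictive reliability rather than training accuracy.
idx = rng.permutation(len(X))
train, test = idx[:300], idx[300:]
means = np.array([X[train][y[train] == k].mean(axis=0) for k in range(2)])
pred = np.linalg.norm(
    X[test][:, None, :] - means[None, :, :], axis=2
).argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

In line with the abstract's recommendation of open source tools, only NumPy is used here; the same pattern carries over directly to fuller open source libraries.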

    Teaching and Learning in Statistics: Harnessing the power of modern statistical software to improve students' statistical reasoning and thinking

    The reproducibility crisis in science has launched a global discussion about the need to restructure the way statistics is taught across a wide range of disciplines. While this need has been recognized and discussed in the academic community for many years, the impetus for educational reform of statistics was boosted by Ioannidis (2005), which drew a great deal of attention to the inappropriate use of statistical reasoning. The availability of data across business and research has increased dramatically in recent years. This access to data means that almost every member of society needs a skill set that allows them to think critically about the inferences that can validly be drawn to improve decisions based on data. One way of improving statistical literacy and thinking is through the identification and use of appropriate statistical software that gives students, and other practitioners with basic training, access to modern statistical modeling techniques on a platform that allows them to focus on outcomes. A key component of using AutoStat for teaching statistical thinking is in alleviating the need for coding, which allows instructors to focus on key concepts, questions, and outcomes. Alston-Knox, C., Strickland, C., Gazos, T., & Mengersen, K. (2019). Teaching and Learning in Statistics: Harnessing the power of modern statistical software to improve students' statistical reasoning and thinking. In HEAD'19. 5th International Conference on Higher Education Advances. Editorial Universitat PolitĂšcnica de ValĂšncia. 1171-1178. https://doi.org/10.4995/HEAD19.2019.9239

    Changing primary science education by identifying, representing, and analysing variation in data-based observations from integrated STEM activities

    Making observations to describe natural phenomena is an emphasis of primary science education. In the early years of schooling, those observations are often qualitative and seldom used to make decisions. There is, however, the potential to add value to the established curriculum by providing young students the opportunity to record data-based observations as part of a science inquiry. Such an approach, set within integrated STEM contexts, supports students in gathering empirical evidence from observation and experimentation. This presentation will provide examples from a research project entitled Modelling with Data: Advancing STEM in the Primary Curriculum that illustrate the potential for learning about science topics explored through data-based inquiries to foster outcomes in the relevant STEM disciplines across the primary years of schooling. Science topics include the manufacture of machine-made versus hand-made products, the transfer of heat, the application of force, the dispersal of seeds, the viscosity of liquids, and the growth of plants (Fitzallen & Watson, 2020). Common to all activities was the implementation of the Practice of Statistics (Watson et al., 2018) as the mathematics component of STEM, which involved a statistical inquiry cycle of: formulate questions, collect data, analyse data, and interpret results (Franklin et al., 2007). Embedded within the inquiry process were the gathering of variable data related to the questions posed, the representation of data in ways that account for trends within the variability seen, and the interpretation of the data in light of that variability. Also central to many of the activities was student use of the exploratory data analysis software TinkerPlotsℱ (Watson & Fitzallen, 2016), which served to scaffold student learning outcomes.
    REFERENCES
    Fitzallen, N., & Watson, J. (2020). Using the practice of statistics to design students' experiences in STEM education. In B. Shelley, K. te Riele, & T. Crellin (Eds.), Harnessing the transformative power of education (pp. 74-99). Koninklijke Brill.
    Franklin, C., Kader, G., Mewborn, D., Moreno, J., Peck, R., Perry, M., & Scheaffer, R. (2007). Guidelines for assessment and instruction in statistics education (GAISE) report: A pre-K-12 curriculum framework. American Statistical Association. https://www.amstat.org/docs/default-source/amstat-documents/gaiseprek-12_full.pdf
    Watson, J., & Fitzallen, N. (2016). Statistical software and mathematics education: Affordances for learning. In L. English & D. Kirshner (Eds.), Handbook of international research in mathematics education (3rd ed., pp. 563-594). Taylor and Francis.
    Watson, J., Fitzallen, N., Fielding-Wells, J., & Madden, S. (2018). The practice of statistics. In D. Ben-Zvi, K. Makar, & J. Garfield (Eds.), International handbook of research in statistics education (pp. 105-137). Springer.
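The analyse-data step of the inquiry cycle described above can be sketched in a few lines. TinkerPlotsℱ is graphical software, so this NumPy sketch with hypothetical measurement data only illustrates the kind of variation comparison students make in the machine-made versus hand-made products activity; the measurements, units, and spreads are assumptions, not the project's materials.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical length measurements in mm (assumption): machine-made items
# cluster tightly around the target, hand-made items vary more.
machine = rng.normal(50.0, 0.5, 30)
hand = rng.normal(50.0, 2.5, 30)

def summarise(name, data):
    # The "analyse data" step: describe centre and spread so the
    # variability in the two processes can be compared directly.
    print(f"{name}: mean={data.mean():.1f} mm, "
          f"sd={data.std(ddof=1):.2f} mm, "
          f"range={np.ptp(data):.2f} mm")

summarise("machine-made", machine)
summarise("hand-made", hand)
```

The interpret-results step then rests on the spread statistics: the larger standard deviation and range of the hand-made sample, not the near-identical means, is what distinguishes the two manufacturing processes.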
