
    A Review of High School Level Astronomy Student Research Projects over the last two decades

    Since the early 1990s, with the arrival of a variety of new technologies, the capacity for authentic astronomical research at the high school level has skyrocketed. This potential, however, has not fulfilled the bright-eyed hopes and dreams of the early pioneers, who expected to revolutionise science education through the use of telescopes and other astronomical instrumentation in the classroom. In this paper, a general history and analysis of these attempts is presented. We define what we classify as an Astronomy Research in the Classroom (ARiC) project and note the major dimensions on which these projects differ, before describing the 22 major student research projects active since the early 1990s. This is followed by a discussion of the major issues identified as affecting the success of these projects, and we provide suggestions for similar attempts in the future.
    Comment: Accepted for Publication in PASA. 26 pages.

    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power; this has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping to avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
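    Among the techniques the abstract lists, Monte Carlo sensitivity analysis lends itself to a compact illustration. The sketch below is a hypothetical toy, not code from the JESPP project: the model function, its inputs, and their ranges are invented for illustration. The idea is to sample uncertain inputs at random, run the model, and estimate how strongly each input co-varies with the output.

```python
# Minimal Monte Carlo sensitivity sketch. The "model" and its
# parameter ranges are hypothetical placeholders, not JESPP code.
import random
import statistics

def model(attrition_rate, sensor_range):
    # Toy "mission effectiveness" score in [0, 100].
    return 100.0 * (1.0 - attrition_rate) * min(sensor_range / 50.0, 1.0)

def correlation(xs, ys):
    # Pearson correlation, computed from scratch to stay dependency-free.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

samples = [(random.uniform(0.0, 0.5), random.uniform(10.0, 80.0))
           for _ in range(10_000)]
outputs = [model(a, s) for a, s in samples]

# Inputs whose correlation is near +/-1 dominate the output's variability.
print("attrition_rate:", correlation([a for a, _ in samples], outputs))
print("sensor_range: ", correlation([s for _, s in samples], outputs))
```

    A production study would use variance-based indices and farm the replications out across processors, which is where the paper's High Performance Parallel Computing theme enters.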

    Experiences in teaching grid computing to advanced level students

    The development of teaching materials for future software engineers is critical to the long-term success of the grid. At present, however, there is considerable turmoil in the grid community, both in the standards and in the technology base underpinning those standards. In this context, it is especially challenging to develop teaching materials that have a lifetime beyond the next wave of grid middleware and standards. In addition, the current way in which grid security is supported and delivered has two key problems. Firstly, in the case of the UK e-Science community, scalability issues arise from a central certificate authority. Secondly, the current security mechanisms used by the grid community are not fine-grained enough. In this paper we outline how these issues are being addressed through the development of a grid computing module supported by an advanced authorisation infrastructure at the University of Glasgow.
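    To make the fine-grained point concrete, the sketch below contrasts certificate-only access (the subject is authenticated, full stop) with role-based authorisation decisions made per action and resource. It is a hypothetical illustration: the roles, resources, and policy table are invented and do not describe the actual Glasgow infrastructure.

```python
# Hypothetical sketch of fine-grained, role-based authorisation,
# in contrast to an all-or-nothing "holds a valid certificate" check.
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    subject: str           # distinguished name from the user's certificate
    roles: frozenset       # role attributes asserted by a trusted authority

# Policy: which roles may perform which actions on which resources.
POLICY = {
    ("submit_job", "teaching-cluster"): {"student", "lecturer"},
    ("submit_job", "research-cluster"): {"lecturer"},
    ("read_results", "teaching-cluster"): {"student", "lecturer"},
}

def authorise(cred, action, resource):
    # Grant access only if one of the subject's roles is permitted for
    # this (action, resource) pair -- not merely because the subject
    # presented a valid certificate.
    allowed = POLICY.get((action, resource), set())
    return bool(cred.roles & allowed)

alice = Credential("CN=Alice,O=Glasgow", frozenset({"student"}))
assert authorise(alice, "submit_job", "teaching-cluster")
assert not authorise(alice, "submit_job", "research-cluster")
```

    The certificate only answers "who is this?"; the per-(action, resource) policy answers "what may they do?", which is the granularity the abstract argues current grid security lacks.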

    Evolving a software development methodology for commercial ICTD projects

    This article discusses the evolution of a “DistRibuted Agile Methodology Addressing Technical Ictd in Commercial Settings” (DRAMATICS) that was developed in a global software corporation to support ICTD projects from initial team setup, through ICT system design, development, and prototyping, to scaling up and transitioning to sustainable commercial models. We developed the methodology using an iterative Action Research approach in a series of commercial ICTD projects over a period of more than six years. Our learning is reflected in distinctive methodology features that support the development of contextually adapted ICT systems, collaboration with local partners, involvement of end users in design, and the transition from research prototypes to scalable, long-term solutions. We offer DRAMATICS as an approach that others can appropriate and adapt to their particular project contexts. We report on the methodology's evolution and provide evidence of its effectiveness in the projects where it has been used.

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society, and the CSE community is at the core of this transformation. However, a combination of disruptive developments (including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers) is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade.
    Comment: Major revision, to appear in SIAM Review.
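    As a small, concrete instance of the "computational experiment" the report describes, the sketch below numerically integrates the one-dimensional heat equation, probing behaviour that is tedious to obtain in closed form for arbitrary initial data. The equation, grid size, and material parameter are illustrative choices, not taken from the report.

```python
# Toy CSE-style computational experiment: explicit finite differences
# for the 1-D heat equation u_t = alpha * u_xx on a rod of length L,
# with fixed zero-temperature ends. Parameters are illustrative only.
alpha, L, nx, nt = 0.01, 1.0, 51, 2000
dx = L / (nx - 1)
dt = 0.4 * dx * dx / alpha   # respects the explicit-scheme stability limit

# Initial condition: a hot spot in the middle of a cold rod.
u = [0.0] * nx
u[nx // 2] = 1.0

for _ in range(nt):
    un = u[:]
    for i in range(1, nx - 1):
        u[i] = un[i] + alpha * dt / (dx * dx) * (un[i + 1] - 2 * un[i] + un[i - 1])

print(f"peak temperature after {nt * dt:.2f}s: {max(u):.4f}")
```

    The same pattern, scaled to three dimensions, billions of grid points, and parallel machines, is exactly where the architectural complexity the report worries about becomes the dominant concern.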

    Cactus: Issues for Sustainable Simulation Software

    The Cactus Framework is an open-source, modular, portable programming environment for the collaborative development and deployment of scientific applications using high-performance computing. Its roots reach back to 1996 at the National Center for Supercomputing Applications and the Albert Einstein Institute in Germany, where its development was jumpstarted. Since then, the Cactus framework has witnessed major changes in hardware infrastructure as well as in its own community. This paper describes its endurance through these past changes and, drawing upon lessons from its past, also discusses its future.
    Comment: Submitted to the Workshop on Sustainable Software for Science: Practice and Experiences 201
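    The modularity the abstract emphasises is Cactus's core design: a small coordinating core (its "flesh") drives independently developed components ("thorns") that register routines into scheduled phases. The toy sketch below mimics that composition style only; it is not the Cactus API (Cactus itself is C/Fortran-based), and the phase names and state are invented.

```python
# Toy sketch of flesh-and-thorns style composition: modules register
# routines into named schedule phases, and the core drives the run.
class Flesh:
    def __init__(self):
        self.schedule = {"startup": [], "evolve": [], "analysis": []}

    def register(self, phase, fn):
        # A "thorn" contributes a routine to one phase of the schedule.
        self.schedule[phase].append(fn)

    def run(self, steps):
        for fn in self.schedule["startup"]:
            fn()
        for step in range(steps):
            for fn in self.schedule["evolve"]:
                fn(step)
        for fn in self.schedule["analysis"]:
            fn()

flesh = Flesh()
state = {"t": 0.0}
flesh.register("startup", lambda: state.update(t=0.0))
flesh.register("evolve", lambda step: state.update(t=state["t"] + 0.1))
flesh.register("analysis", lambda: print(f"final time: {state['t']:.1f}"))
flesh.run(steps=10)
```

    Decoupling components from the driver in this way is what lets a framework like Cactus survive the hardware and community changes the paper recounts: thorns can be replaced without rewriting the flesh.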