OPTIMIZATION OF SIMULATIONS IN OPENSIMPPLLE
Computer software has become an integral tool in exploring scientific concepts and computational models. Models such as OpenSIMPPLLE use a complex set of rules developed by experts to predict the impact of fires, disease, and wildlife on large-scale landscapes.
OpenSIMPPLLE’s simulations are time-consuming when projecting far into the future, so the software needs to execute more efficiently to complete simulations faster; greater speed would also let users run simulations with more timesteps in less time.
The work described here identifies three methods for increasing efficiency: refactoring expensive operations, applying design patterns, and introducing parallelism. The main objective of this work is to examine whether combining parallelism with efficient design yields an optimal runtime, while analyzing the best approaches for implementing parallel techniques.
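The parallelism named above can be sketched in a few lines. The example below is illustrative only and is not OpenSIMPPLLE's actual code: it distributes independent stochastic simulation replicates across workers, the kind of embarrassingly parallel workload where projections with different random seeds share no state. All function names here are hypothetical.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def simulate_replicate(seed, timesteps=1000):
    """Stand-in for one stochastic landscape projection (hypothetical;
    OpenSIMPPLLE's real rule set is far more complex)."""
    rng = random.Random(seed)
    burned = 0.0
    for _ in range(timesteps):
        burned += rng.random()  # toy per-timestep fire-spread outcome
    return burned

def run_serial(seeds):
    return [simulate_replicate(s) for s in seeds]

def run_parallel(seeds, workers=4):
    # Independent replicates can simply be mapped across workers;
    # for CPU-bound pure-Python work a ProcessPoolExecutor would be
    # the better choice, but the structure is identical.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_replicate, seeds))
```

Because each replicate is seeded deterministically, the parallel run reproduces the serial results exactly, which makes this refactoring easy to validate.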
Libraries in transition: evolving the information ecology of the Learning Commons: a sabbatical report
This sabbatical report studied various models in order to determine best practices for the design, implementation, and service of a Learning Commons, a library service model which functionally and spatially integrates library services, information technology services, and media services to provide a continuum of services to the user.
Building scalable digital library ingestion pipelines using microservices
CORE, a harvesting service offering access to millions of open access research papers from around the world, has shifted its harvesting process from a monolithic approach to a microservices infrastructure. In this paper, we explain how we rearranged and re-scheduled our old ingestion pipeline, present CORE's move to managing microservices, and outline the tools we use in the new, optimised ingestion system. In addition, we discuss the inefficiencies of our old harvesting process, the advantages and challenges of our new ingestion system, and our future plans. We conclude that by adopting a microservices architecture we achieved a scalable and distributed system that will support CORE's future performance and evolution.
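The general idea of decomposing a monolithic ingestion pipeline into independent, queue-connected stages can be sketched as below. The stage names and stubbed logic are hypothetical and are not CORE's actual services; in a real deployment each stage would be a separate process behind a message broker rather than a thread.

```python
from queue import Queue
from threading import Thread

SENTINEL = None  # marks end-of-stream between stages

def download(inbox, outbox):
    """Stage 1: fetch each record (stubbed; a real service does HTTP)."""
    for url in iter(inbox.get, SENTINEL):
        outbox.put({"url": url, "pdf": b"%PDF..."})
    outbox.put(SENTINEL)

def extract(inbox, outbox):
    """Stage 2: extract text from the fetched document (stubbed)."""
    for doc in iter(inbox.get, SENTINEL):
        doc["text"] = f"text of {doc['url']}"
        outbox.put(doc)
    outbox.put(SENTINEL)

def run_pipeline(urls):
    q1, q2, q3 = Queue(), Queue(), Queue()
    stages = [Thread(target=download, args=(q1, q2)),
              Thread(target=extract, args=(q2, q3))]
    for t in stages:
        t.start()
    for u in urls:
        q1.put(u)
    q1.put(SENTINEL)
    results = list(iter(q3.get, SENTINEL))
    for t in stages:
        t.join()
    return results
```

Because each stage only reads from one queue and writes to another, stages can be scaled, rescheduled, or replaced independently, which is the scalability property a microservices decomposition is after.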
Creative User-Centered Visualization Design for Energy Analysts and Modelers
We enhance a user-centered design process with techniques that deliberately promote creativity to identify opportunities for the visualization of data generated by a major energy supplier. Visualization prototypes developed in this way prove effective in a situation where data sets are largely unknown and requirements open, enabling successful exploration of possibilities for visualization in Smart Home data analysis. The process gives rise to novel designs and design metaphors, including data sculpting. It suggests: that the deliberate use of creativity techniques with data stakeholders is likely to contribute to successful, novel and effective solutions; that being explicit about creativity may contribute to designers developing creative solutions; and that using creativity techniques early in the design process may result in a creative approach persisting throughout the process. The work constitutes the first systematic visualization design for a data-rich source that will be increasingly important to energy suppliers and consumers as Smart Meter technology is widely deployed. It is novel in explicitly employing creativity techniques at the requirements stage of visualization design and development, paving the way for further use and study of creativity methods in visualization design.
Virtual geological outcrops - fieldwork and analysis made less exhaustive?
For geologists studying outcrops in the field, there is an ever‐increasing need for the acquisition of accurate and comprehensive data, whatever their purpose. Fortunately, this need is mirrored by an expanding range of digital data capturing technologies that provide the possibility of examining geological outcrops in minute detail from the desktop. Although difficult technologically, there is also a need to combine differing datasets into a single, accurate, digital model that will allow field geologists to place their data in a wider context. This paper examines the techniques available, and highlights new Light Detection and Ranging (LIDAR) technology which should prove to be a unifying technique, being able to combine images and local coordinates on‐site.
Astronomical Spectroscopy
Spectroscopy is one of the most important tools that an astronomer has for studying the universe. This chapter begins by discussing the basics, including the different types of optical spectrographs, with extension to the ultraviolet and the near-infrared. Emphasis is given to the fundamentals of how spectrographs are used, and the trade-offs involved in designing an observational experiment. It then covers observing and reduction techniques, noting that some of the standard practices of flat-fielding often actually degrade the quality of the data rather than improve it. Although the focus is on point sources, spatially resolved spectroscopy of extended sources is also briefly discussed. Discussion of differential extinction, the impact of crowding, multi-object techniques, optimal extractions, flat-fielding considerations, and determining radial velocities and velocity dispersions provides the spectroscopist with the fundamentals needed to obtain the best data. Finally, the chapter combines the previous material by providing some examples of real-life observing experiences with several typical instruments.
Comment: An abridged version of a chapter to appear in Planets, Stars and Stellar Systems, to be published in 2011 by Springer. Slightly revise