Of impacts, agents, and functions: An interdisciplinary meta-review of smart home energy management systems research
Smart home energy management systems (SHEMS) have long been viewed as a promising opportunity to manage the way households use energy. Research on this topic has emerged across a variety of disciplines, focusing on different pieces of the SHEMS puzzle without offering a holistic vision of how these technologies and their users will influence home energy use moving forward. This paper presents the results of a systematic, interdisciplinary meta-review of SHEMS literature, assessing the extent to which it discusses the role of various SHEMS components in driving energy benefits. Results reveal a bias towards technical perspectives and controls approaches that seek to drive energy impacts such as load management and energy savings through SHEMS without user or third-party participation. Not only are techno-centric approaches more common, there is also a lack of integration of these approaches with user-centric, information-based solutions for driving energy impacts. These results suggest future work should investigate more holistic solutions for optimal impacts on household energy use. We hope these results will provoke a broader discussion about how to advance research on SHEMS to capitalize on their potential contributions to demand-side management initiatives moving forward.
Accelerating scientific codes by performance and accuracy modeling
Scientific software is often driven by multiple parameters that affect both
accuracy and performance. Since finding the optimal configuration of these
parameters is a highly complex task, it is extremely common that the software is
used suboptimally. In a typical scenario, accuracy requirements are imposed,
and attained through suboptimal performance. In this paper, we present a
methodology for the automatic selection of parameters for simulation codes, and
a corresponding prototype tool. To be amenable to our methodology, the target
code must expose the parameters affecting accuracy and performance, and there
must be formulas available for error bounds and computational complexity of the
underlying methods. As a case study, we consider the particle-particle
particle-mesh method (PPPM) from the LAMMPS suite for molecular dynamics, and
use our tool to identify configurations of the input parameters that achieve a
given accuracy in the shortest execution time. When compared with the
configurations suggested by expert users, the parameters selected by our tool
yield reductions in the time-to-solution ranging between 10% and 60%. In other
words, for the typical scenario where a fixed number of core-hours is granted
and simulations of a fixed number of timesteps are to be run, usage of our tool
may allow up to twice as many simulations. While we develop our ideas using
LAMMPS as computational framework and use the PPPM method for dispersion as
case study, the methodology is general and valid for a range of software tools
and methods.
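The selection strategy the abstract describes can be sketched as constrained optimization: among configurations whose predicted error bound meets the accuracy target, choose the one with the lowest predicted cost. The sketch below is illustrative only; the parameter names, error-bound formula, and cost formula are made up and are not taken from the actual tool or from the PPPM method.

```python
# Hedged sketch of parameter auto-selection: enumerate candidate
# configurations, keep those whose (assumed) error bound meets the
# accuracy target, and return the cheapest by the (assumed) cost model.
from itertools import product

def select_parameters(candidates, error_bound, cost, target_accuracy):
    """Return the cheapest configuration meeting the accuracy target."""
    feasible = [c for c in candidates if error_bound(c) <= target_accuracy]
    if not feasible:
        raise ValueError("no configuration meets the accuracy target")
    return min(feasible, key=cost)

# Toy candidate grid over two hypothetical parameters: mesh size n, cutoff r.
grid = [dict(n=n, r=r) for n, r in product([16, 32, 64], [2.0, 3.0, 4.0])]
best = select_parameters(
    grid,
    error_bound=lambda c: 1.0 / (c["n"] * c["r"] ** 2),  # made-up bound
    cost=lambda c: c["n"] ** 3 + 50 * c["r"] ** 3,       # made-up complexity
    target_accuracy=1e-3,
)
```

In practice the exhaustive scan would be replaced by whatever search the tool performs, but the feasibility-then-minimize structure is the essence of trading accuracy headroom for execution time.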
Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection
In order to avoid the "Midas Touch" problem, gaze-based interfaces for
selection often introduce a dwell time: a fixed amount of time the user must
fixate upon an object before it is selected. Past interfaces have used a
uniform dwell time across all objects. Here, we propose a gaze-based browser
using a two-step selection policy with variable dwell time. In the first step,
a command, e.g. "back" or "select", is chosen from a menu using a dwell time
that is constant across the different commands. In the second step, if the
"select" command is chosen, the user selects a hyperlink using a dwell time
that varies between different hyperlinks. We assign shorter dwell times to more
likely hyperlinks and longer dwell times to less likely hyperlinks. In order to
infer the likelihood each hyperlink will be selected, we have developed a
probabilistic model of natural gaze behavior while surfing the web. We have
evaluated a number of heuristic and probabilistic methods for varying the dwell
times using both simulation and experiment. Our results demonstrate that
varying dwell time improves the user experience in comparison with fixed dwell
time, resulting in fewer errors and increased speed. While all of the methods
for varying dwell time resulted in improved performance, the probabilistic
models yielded much greater gains than the simple heuristics. The best
performing model reduces error rate by 50% compared to 100ms uniform dwell time
while maintaining a similar response time. It reduces response time by 60%
compared to 300ms uniform dwell time while maintaining a similar error rate.

Comment: This is an Accepted Manuscript of an article published by Taylor & Francis in the International Journal of Human-Computer Interaction on 30 March, 2018, available online: http://www.tandfonline.com/10.1080/10447318.2018.1452351 . For an eprint of the final published article, please access: https://www.tandfonline.com/eprint/T9d4cNwwRUqXPPiZYm8Z/ful
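The core idea of assigning shorter dwell times to more likely hyperlinks can be sketched with a simple mapping from selection probability to dwell duration. The linear mapping below and the 100-300 ms range are assumptions chosen only to mirror the uniform baselines quoted above; the paper's probabilistic model of gaze behavior is not reproduced here.

```python
# Hedged sketch: interpolate dwell time linearly from an estimated
# selection probability, so p=1 yields the minimum dwell (t_min)
# and p=0 yields the maximum dwell (t_max). Link names are illustrative.

def dwell_times(probs, t_min=0.1, t_max=0.3):
    """Map each link's selection probability to a dwell time in seconds."""
    return {link: t_max - p * (t_max - t_min) for link, p in probs.items()}

times = dwell_times({"home": 0.8, "about": 0.15, "legal": 0.05})
# The likeliest link ("home") receives the shortest dwell time.
```

A real system would also clip the output to usability limits and renormalize probabilities as the model updates during browsing.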
Enhancing speed and scalability of the ParFlow simulation code
Regional hydrology studies are often supported by high resolution simulations
of subsurface flow that require expensive and extensive computations. Efficient
usage of the latest high performance parallel computing systems becomes a
necessity. The simulation software ParFlow has been demonstrated to meet this
requirement and shown to have excellent solver scalability for up to 16,384
processes. In the present work we show that the code requires further
enhancements in order to fully take advantage of current petascale machines. We
identify ParFlow's way of parallelization of the computational mesh as a
central bottleneck. We propose to reorganize this subsystem using fast mesh
partition algorithms provided by the parallel adaptive mesh refinement library
p4est. We realize this in a minimally invasive manner by modifying selected
parts of the code to reinterpret the existing mesh data structures. We evaluate
the scaling performance of the modified version of ParFlow, demonstrating good
weak and strong scaling up to 458k cores of the Juqueen supercomputer, and test
an example application at large scale.

Comment: The final publication is available at link.springer.co
Grid simulation services for the medical community
The first part of this paper presents a selection of medical simulation applications, including image reconstruction, near real-time registration for neuro-surgery, enhanced dose distribution calculation for radio-therapy, inhaled drug delivery prediction, plastic surgery planning and cardio-vascular system simulation. The latter two topics are discussed in some detail. In the second part, we show how such services can be made available to the clinical practitioner using Grid technology. We discuss the developments made and the experience gained during the EU project GEMSS, which provides reliable, efficient, secure and lawful medical Grid services.
Panel on future challenges in modeling methodology
This panel paper presents the views of six researchers and practitioners of simulation modeling. Collectively we attempt to address a range of key future challenges to modeling methodology. It is hoped that the views expressed in this paper, and the presentations made by the panelists at the 2004 Winter Simulation Conference, will raise awareness and stimulate further discussion on the future of modeling methodology in areas such as modeling problems in business applications, human factors and geographically dispersed networks; rapid model development and maintenance; legacy modeling approaches; markup languages; virtual interactive process design and simulation; standards; and Grid computing.
Modular System for Shelves and Coasts (MOSSCO v1.0) - a flexible and multi-component framework for coupled coastal ocean ecosystem modelling
Shelf and coastal sea processes extend from the atmosphere through the water
column and into the sea bed. These processes are driven by physical, chemical,
and biological interactions at local scales, and they are influenced by
transport and cross strong spatial gradients. The linkages between domains and
many different processes are not adequately described in current model systems.
Their limited integration level in part reflects lacking modularity and
flexibility; this shortcoming hinders the exchange of data and model components
and has historically imposed supremacy of specific physical driver models. We
here present the Modular System for Shelves and Coasts (MOSSCO,
http://www.mossco.de), a novel domain and process coupling system
tailored, but not limited, to the coupling challenges of and applications in
the coastal ocean. MOSSCO builds on the existing coupling technology Earth
System Modeling Framework and on the Framework for Aquatic Biogeochemical
Models, thereby creating a unique level of modularity in both domain and
process coupling; the new framework adds rich metadata, flexible scheduling,
configurations that allow several tens of models to be coupled, and tested
setups for coastal coupled applications. That way, MOSSCO addresses the
technology needs of a growing marine coastal Earth System community that
encompasses very different disciplines, numerical tools, and research
questions.

Comment: 30 pages, 6 figures, submitted to Geoscientific Model Development Discussions
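The modular coupling the MOSSCO abstract describes, components that exchange fields under a common driver, can be sketched generically. The sketch below illustrates the idea of domain/process coupling in general; the class names, field names, and scheduling are assumptions for illustration and are not MOSSCO's or ESMF's actual interface.

```python
# Hedged sketch of modular coupling: each component declares the fields
# it exports and imports, and a driver steps all components in order,
# sharing exported fields through a common state dictionary.

class Component:
    def __init__(self, name, exports, imports=()):
        self.name, self.exports, self.imports = name, list(exports), tuple(imports)

    def step(self, state):
        # Placeholder physics: each exported field just records its producer.
        for field in self.exports:
            state[field] = f"{field} from {self.name}"

def run_coupled(components, n_steps):
    """Step every component in order, exchanging fields via `state`."""
    state = {}
    for _ in range(n_steps):
        for comp in components:
            missing = [f for f in comp.imports if f not in state]
            if missing:
                raise KeyError(f"{comp.name} needs {missing} before stepping")
            comp.step(state)
    return state

# Hypothetical two-component setup: a biogeochemistry model consuming
# a field produced by an ocean model.
ocean = Component("ocean", exports=["temperature"])
biogeo = Component("biogeochemistry", exports=["oxygen"], imports=["temperature"])
state = run_coupled([ocean, biogeo], n_steps=2)
```

A production framework additionally handles regridding between meshes, per-field metadata, and flexible scheduling of components with different time steps, which is the machinery the abstract attributes to ESMF and MOSSCO.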