Nucleus: A Pilot Project
Early in 2016, an environmental scan was conducted by the Research Library
Data Working Group for three purposes:
1.) Perform a survey of the data management landscape at Los Alamos National
Laboratory in order to identify local gaps in data management services.
2.) Conduct an environmental scan of external institutions to benchmark
budgets, infrastructure, and personnel dedicated to data management.
3.) Draft a research data infrastructure model that aligns with the current
workflow and classification restrictions at Los Alamos National Laboratory.
This report is a summary of those activities and the draft for a pilot data
management project.
Comment: 13 pages, report
Regional Data Archiving and Management for Northeast Illinois
This project studies the feasibility and implementation options for establishing a regional data archiving system to help monitor
and manage traffic operations and planning for the northeastern Illinois region. It aims to provide clear guidance to the
regional transportation agencies, from both technical and business perspectives, about building such a comprehensive
transportation information system. Several implementation alternatives are identified and analyzed. This research is carried
out in three phases.
In the first phase, existing documents related to ITS deployments in the broader Chicago area are summarized, and a
thorough review is conducted of similar systems across the country. Various stakeholders are interviewed to collect
information on all data elements that they store, including the format, system, and granularity. Their perception of a data
archive system, such as potential benefits and costs, is also surveyed. In the second phase, a conceptual design of the
database is developed. This conceptual design includes system architecture, functional modules, user interfaces, and
examples of usage. In the last phase, the possible business models for the archive system to sustain itself are reviewed. We
estimate initial capital and recurring operational/maintenance costs for the system based on realistic information on the
hardware, software, labor, and resource requirements. We also identify possible revenue opportunities.
A few implementation options for the archive system are summarized in this report, namely:
1. System hosted by a partnering agency
2. System contracted to a university
3. System contracted to a national laboratory
4. System outsourced to a service provider
The costs, advantages, and disadvantages of each of these recommended options are also provided. (Report ICT-R27-22)
A Relational Database Model for Managing Accelerator Control System Software At Jefferson Lab
The operations software group at the Thomas Jefferson National Accelerator
Facility faces a number of challenges common to facilities managing a large
body of software developed in-house. Developers include members of the software
group, operators, hardware engineers and accelerator physicists. One management
problem has been ensuring that all software has an identified maintainer who is
still working at the lab. In some cases, locating source code for 'orphaned'
software has also proven to be difficult. Other challenges include enforcing
minimal standards for versioning and documentation, segregating test software
from operational software, encouraging better code reuse, consolidating
input/output file storage and management, and tracking software dependencies.
This paper will describe a relational database model for tracking the
information necessary to solve the problems above. The instantiation of that
database model provides the foundation for various productivity- and
consistency-enhancing tools for automated (or at least assisted) building,
versioning, documenting, and installation of software.
Comment: ICALEPCS 2001, PSN#WEAP07
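The abstract above describes a relational model tracking maintainers, source locations, and dependencies. A minimal sketch of what such a schema might look like is shown below; the table and column names (staff, software, dependency, repo_path) are illustrative assumptions, not the actual Jefferson Lab design.

```python
import sqlite3

# Illustrative sketch only: table and column names are assumptions,
# not the actual schema described in the paper.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE staff (
    username TEXT PRIMARY KEY,
    active   INTEGER NOT NULL             -- 1 if still working at the lab
);
CREATE TABLE software (
    id         INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    maintainer TEXT REFERENCES staff(username),
    repo_path  TEXT,                      -- location of the source code
    is_test    INTEGER NOT NULL DEFAULT 0 -- segregate test vs. operational
);
CREATE TABLE dependency (
    software_id   INTEGER REFERENCES software(id),
    depends_on_id INTEGER REFERENCES software(id)
);
""")
cur.execute("INSERT INTO staff VALUES ('alice', 1), ('bob', 0)")
cur.executemany(
    "INSERT INTO software (name, maintainer, repo_path) VALUES (?, ?, ?)",
    [("beamctl", "alice", "/src/beamctl"),
     ("magconf", "bob",   None)])

# Find 'orphaned' software: maintainer has left, or source location unknown.
orphans = cur.execute("""
    SELECT s.name FROM software s
    LEFT JOIN staff t ON s.maintainer = t.username
    WHERE COALESCE(t.active, 0) = 0 OR s.repo_path IS NULL
""").fetchall()
print([name for (name,) in orphans])   # magconf is orphaned
```

A query of this shape directly answers the paper's motivating question of which software lacks an identified, still-present maintainer.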
Transition UGent: a bottom-up initiative towards a more sustainable university
The vibrant think-tank ‘Transition UGent’ engaged over 250 academics, students and members of the university management in suggesting objectives and actions for the Sustainability Policy of Ghent University (Belgium). Founded in 2012, this bottom-up initiative succeeded in placing sustainability high on the policy agenda of our university. Through discussions within 9 working groups and using the transition management method, Transition UGent developed system analyses, sustainability visions and transition paths for 9 fields of Ghent University: mobility, energy, food, waste, nature and green, water, art, education and research. At the moment, many of these visions and ideas are finding their way into concrete actions and policies.
In our presentation we focused on the broad participative process, on the most remarkable structural results (e.g. a formal and ambitious Sustainability Vision and a student-led Sustainability Office) and on recent actions and experiments (e.g. a sustainability assessment of food supply in student restaurants, artistic COP21 activities, ambitious mobility plans, food leftovers projects, an education network on sustainability controversies, a transdisciplinary platform on Sustainable Cities). We concluded with some recommendations and reflections on this transition approach, on the important role of ‘policy entrepreneurs’ and student involvement, on lock-ins and bottlenecks, and on convincing skeptical leaders.
Identifying common problems in the acquisition and deployment of large-scale software projects in the US and UK healthcare systems
Public and private organizations are investing increasing amounts into the development of
healthcare information technology. These applications are perceived to offer numerous benefits.
Software systems can improve the exchange of information between healthcare facilities. They
support standardised procedures that can help to increase consistency between different service
providers. Electronic patient records ensure minimum standards across the trajectory of care when
patients move between different specializations. Healthcare information systems also offer economic
benefits through efficiency savings; for example by providing the data that helps to identify potential
bottlenecks in the provision and administration of care. However, a number of high-profile failures
reveal the problems that arise when staff must cope with the loss of these applications. In particular,
teams have to retrieve paper-based records that often lack the detail held in electronic systems.
Individuals who have only used electronic information systems face particular problems in learning
how to apply paper-based fallbacks. The following pages compare two different failures of
Healthcare Information Systems in the UK and North America. The intention is to ensure that future
initiatives to extend the integration of electronic patient records will build on the ‘lessons learned’
from previous systems.
New Hampshire University Research and Industry Plan: A Roadmap for Collaboration and Innovation
This University Research and Industry plan for New Hampshire is focused on accelerating innovation-led development in the state by partnering academia’s strengths with the state’s substantial base of existing and emerging advanced industries. These advanced industries are defined by their deep investment in and connections to research and development, and by the high-quality jobs they generate across production, new product development and administrative positions involving skills in science, technology, engineering and math (STEM).
Big Data and Analysis of Data Transfers for International Research Networks Using NetSage
Modern science is increasingly data-driven and collaborative in nature. Many scientific disciplines, including genomics, high-energy physics, astronomy, and atmospheric science, produce petabytes of data that must be shared with collaborators all over the world. The National Science Foundation-supported International Research Network Connection (IRNC) links have been essential to enabling this collaboration, but as data sharing has increased, so has the amount of information being collected to understand network performance. New capabilities to measure and analyze the performance of international wide-area networks are essential to ensure end users are able to take full advantage of such infrastructure for their big data applications. NetSage is a project to develop a unified, open, privacy-aware network measurement and visualization service to address the needs of monitoring today's high-speed international research networks. NetSage collects data on both backbone links and exchange points, which can amount to as much as 1 Tb per month. This puts a significant strain on hardware, not only in terms of the storage needed to hold multi-year historical data, but also in terms of the processor and memory needed to analyze the data and understand network behaviors. This paper describes the basic NetSage architecture, its current data collection and archiving approach, and the constraints of handling vast amounts of monitoring data while providing useful, extensible visualization to end users.
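The storage strain described above is commonly tamed by rolling raw measurements up into coarser aggregates before long-term archiving. The sketch below shows one such rollup under stated assumptions; the record fields (link, ts, bytes) and the hourly granularity are hypothetical, not NetSage's actual data format or policy.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sketch of one way to reduce multi-year monitoring storage:
# sum raw per-sample byte counts into hourly per-link buckets, after which
# the raw rows could be discarded or moved to cold storage.
raw_samples = [
    {"link": "pacwave", "ts": "2016-03-01T00:05:00", "bytes": 4_000_000},
    {"link": "pacwave", "ts": "2016-03-01T00:35:00", "bytes": 6_000_000},
    {"link": "pacwave", "ts": "2016-03-01T01:10:00", "bytes": 2_000_000},
]

def hourly_rollup(samples):
    """Sum byte counts per (link, hour-start) bucket."""
    buckets = defaultdict(int)
    for s in samples:
        hour = datetime.fromisoformat(s["ts"]).replace(minute=0, second=0)
        buckets[(s["link"], hour.isoformat())] += s["bytes"]
    return dict(buckets)

rollup = hourly_rollup(raw_samples)
print(rollup)
```

The trade-off is the one the abstract raises: aggregation keeps multi-year history affordable, but discards the per-flow detail needed for some analyses, so the retention granularity must match the intended visualizations.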