Data Systems Dynamic Simulator
The Data System Dynamic Simulator (DSDS) is a discrete-event simulation tool. It was developed for NASA for the specific purpose of evaluating candidate architectures for data systems of the Space Station era. DSDS provides three methods for meeting this requirement. First, the user has access to a library of standard pre-programmed elements; these elements represent tailorable components of NASA data systems and can be connected in any logical manner. Second, DSDS supports the development of additional elements, which gives the more sophisticated DSDS user the option of extending the standard element set. Third, DSDS supports the use of data stream simulation. "Data streams" is the name given to a technique that ignores packet boundaries but is sensitive to rate changes. Because rate changes are rare compared to packet arrivals in a typical NASA data system, data stream simulations require a fraction of the CPU run time. Additionally, the data stream technique is considerably more accurate than another commonly-used optimization technique.
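The event-count savings described above can be illustrated with a toy model (the input format and function names here are purely illustrative, not the actual DSDS interface): per-packet simulation schedules one event per packet arrival, while data-stream simulation schedules one event per rate change.

```python
def simulate_packets(rate_changes, duration):
    """Count events when every packet arrival is simulated.

    rate_changes: list of (start_time_s, packets_per_second) tuples,
    a hypothetical input format for this sketch.
    """
    events = 0
    segments = zip(rate_changes, rate_changes[1:] + [(duration, 0)])
    for (t0, rate), (t1, _) in segments:
        events += int((t1 - t0) * rate)   # one event per packet arrival
    return events

def simulate_stream(rate_changes):
    """Count events when only rate changes are simulated."""
    return len(rate_changes)              # one event per rate change

# A link whose rate changes 3 times over 10 seconds:
changes = [(0.0, 1000), (4.0, 500), (7.0, 2000)]
print(simulate_packets(changes, 10.0))   # 11500 events
print(simulate_stream(changes))          # 3 events
```

With only three rate changes against 11,500 packet arrivals, the stream-level simulation processes roughly four thousand times fewer events, which is the source of the CPU run-time savings the abstract claims.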
Developing and Enhancing Data Systems
This brief focuses on considerations for developing and enhancing data systems. The other two briefs focus on developing a coherent plan for effectively using data and on supporting the effective use of a data system.
Industrial Data Systems, Inc.
Industrial Data Systems Corporation (IDS) is a Houston-based general engineering and services firm targeted toward the energy industry. Founded in 1985, the company has grown to annual revenues of $68 million. Together, they potentially will be able to fully meet client needs for both upstream and downstream engineering and services support in the oil/gas, refining, chemicals, and petrochemicals industries. Now they must make it work. (Contact author for a copy of the complete report.)
Big data for monitoring educational systems
This report considers "how advances in big data are likely to transform the context and methodology of monitoring educational systems within a long-term perspective (10-30 years) and impact the evidence based policy development in the sector". Here, big data are "large amounts of different types of data produced with high velocity from a high number of various types of sources." Five independent experts were commissioned by Ecorys, responding to the themes of students' privacy, educational equity and efficiency, student tracking, assessment, and skills. The experts were asked to take a "macro perspective on governance on educational systems at all levels from primary, secondary education and tertiary – the latter covering all aspects of tertiary from further, to higher, and to VET", prioritising primary and secondary levels of education.
Cancer Surveillance using Data Warehousing, Data Mining, and Decision Support Systems
This article discusses how data warehousing, data mining, and decision support systems can reduce the national cancer burden and the oral complications of cancer therapies, especially as related to oral and pharyngeal cancers. An information system is presented that will deliver the necessary information technology to clinical, administrative, and policy researchers and analysts in an effective and efficient manner. The system will deliver the technology and knowledge that users need to readily: (1) organize relevant claims data, (2) detect cancer patterns in general and special populations, (3) formulate models that explain the patterns, and (4) evaluate the efficacy of specified treatments and interventions with the formulations. Such a system can be developed through a proven adaptive design strategy, and the implemented system can be tested on State of Maryland Medicaid data (which includes women, minorities, and children).
Designing Traceability into Big Data Systems
Providing an appropriate level of accessibility and traceability to data or process elements (so-called Items) in large volumes of data, often Cloud-resident, is an essential requirement in the Big Data era. Enterprise-wide data systems need to be designed from the outset to support usage of such Items across the spectrum of business use rather than from any specific application view. The design philosophy advocated in this paper is to drive the design process using a so-called description-driven approach, which enriches models with meta-data and description and focuses the design process on Item re-use, thereby promoting traceability. Details are given of the description-driven design of big data systems at CERN, in health informatics, and in business process management. Evidence is presented that the approach leads to design simplicity and consequent ease of management, thanks to loose typing and the adoption of a unified approach to Item management and usage.
Comment: 10 pages, 6 figures, in Proceedings of the 5th Annual International Conference on ICT: Big Data, Cloud and Security (ICT-BDCS 2015), Singapore, July 2015.
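The description-driven idea of enriching Items with meta-data so that provenance survives every processing step can be sketched roughly as follows (all class and field names here are illustrative assumptions, not the actual CERN design):

```python
from dataclasses import dataclass, field

@dataclass
class Description:
    """Meta-data describing an Item: the 'description' layer of a
    description-driven design. Field names are illustrative only."""
    name: str
    item_type: str
    provenance: list = field(default_factory=list)

@dataclass
class Item:
    """A data or process element whose meaning is carried by its
    Description, so any application can interpret and trace it
    without application-specific schema knowledge."""
    description: Description
    payload: object

    def derive(self, new_payload, step: str) -> "Item":
        # Record the processing step in the meta-data,
        # preserving traceability across re-use.
        d = Description(self.description.name,
                        self.description.item_type,
                        self.description.provenance + [step])
        return Item(d, new_payload)

raw = Item(Description("run-42", "dataset"), [1, 2, 3])
calibrated = raw.derive([1.1, 2.2, 3.3], "calibration-v1")
print(calibrated.description.provenance)  # ['calibration-v1']
```

Because every derived Item carries its full processing history in the description layer rather than in application code, any consumer can reconstruct how a result was produced, which is the traceability property the abstract emphasises.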
Sampled-data control of nonlinear systems
This chapter provides some of the main ideas resulting from recent developments in sampled-data control of nonlinear systems. We have tried to bring the basic parts of the new developments within the comfortable grasp of graduate students. Instead of presenting the more general results that are available in the literature, we opted to present their less general versions that are easier to understand and whose proofs are easier to follow. We note that some of the proofs we present have not appeared in the literature in this simplified form. Hence, we believe that this chapter will serve as an important reference for students and researchers who wish to learn about this area of research.
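The setting the chapter refers to can be summarised by a standard formulation (the symbols below are generic and not necessarily the chapter's notation): a continuous-time nonlinear plant driven by a discrete-time controller through sample and zero-order hold,

\[
\dot{x}(t) = f(x(t), u(t)), \qquad u(t) = u(t_k), \quad t \in [t_k, t_{k+1}), \qquad t_{k+1} = t_k + T,
\]

where \(T\) is the sampling period. A common "emulation" approach first designs a continuous-time stabilizing feedback \(u = \kappa(x)\) and then implements it as \(u(t_k) = \kappa(x(t_k))\), asking for which sampling periods \(T\) the closed-loop stability properties are preserved.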
