357 research outputs found

    Managing the Ethical Dimensions of Brain-Computer Interfaces in eHealth: An SDLC-based Approach

    A growing range of brain-computer interface (BCI) technologies is being employed for purposes of therapy and human augmentation. While much thought has been given to the ethical implications of such technologies at the ‘macro’ level of social policy and ‘micro’ level of individual users, little attention has been given to the unique ethical issues that arise during the process of incorporating BCIs into eHealth ecosystems. In this text a conceptual framework is developed that enables the operators of eHealth ecosystems to manage the ethical components of such processes in a more comprehensive and systematic way than has previously been possible. The framework’s first axis defines five ethical dimensions that must be successfully addressed by eHealth ecosystems: 1) beneficence; 2) consent; 3) privacy; 4) equity; and 5) liability. The second axis describes five stages of the systems development life cycle (SDLC) process whereby new technology is incorporated into an eHealth ecosystem: 1) analysis and planning; 2) design, development, and acquisition; 3) integration and activation; 4) operation and maintenance; and 5) disposal. Known ethical issues relating to the deployment of BCIs are mapped onto this matrix in order to demonstrate how it can be employed by the managers of eHealth ecosystems as a tool for fulfilling ethical requirements established by regulatory standards or stakeholders’ expectations. Beyond its immediate application in the case of BCIs, we suggest that this framework may also be utilized beneficially when incorporating other innovative forms of information and communications technology (ICT) into eHealth ecosystems
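    The framework is conceptual, but its two-axis structure lends itself to a direct representation as a data structure. The Python sketch below is a hypothetical illustration, not tooling from the paper; the example issue entry and the helper open_issues are invented for demonstration.

    # Hypothetical sketch: the framework's 5x5 matrix of ethical dimensions
    # (first axis) by SDLC stages (second axis), with issues mapped onto cells.
    from itertools import product

    DIMENSIONS = ["beneficence", "consent", "privacy", "equity", "liability"]
    STAGES = [
        "analysis and planning",
        "design, development, and acquisition",
        "integration and activation",
        "operation and maintenance",
        "disposal",
    ]

    # Each (dimension, stage) cell holds the issues an eHealth ecosystem operator
    # must address for that combination.
    matrix = {cell: [] for cell in product(DIMENSIONS, STAGES)}

    # Invented example entry: consent revisited when a BCI is brought online.
    matrix[("consent", "integration and activation")].append(
        "confirm that users can withdraw consent before the BCI goes live"
    )

    def open_issues(dimension):
        # Stages where a given ethical dimension still has recorded issues.
        return [stage for stage in STAGES if matrix[(dimension, stage)]]

    print(open_issues("consent"))  # -> ['integration and activation']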

    Cell me the money: unlocking the value in the mobile payment ecosystem

    This report examines the challenges and benefits of mobile commerce in the United States. The report is based on a survey of senior executives from the mobile payment value chain. Survey results shed light on the key barriers that have traditionally challenged the mobile payment market in the United States, including the lack of revenue-sharing agreements, a dearth of consumer knowledge, low levels of demand and competing platforms in a fragmented market. Getting ahead of the curve will require companies to develop mutually beneficial business models and take advantage of further innovations made on the mobile platform. Ultimately, mobile carriers and financial institutions must come to the table and sacrifice in the short-term to create an opportunity to win big down the road

    QOS Monitoring in Middleware

    Monitoring the system and its services is unavoidable if quality of service (QoS) is to be provided. This paper describes monitoring of middleware components, which express the business logic of e-business applications, and of the environment in which they execute. Middleware components have become important because of their prominent role in Enterprise Application Integration (EAI) and in the implementation of n-tier web applications. We have developed a framework and a supporting toolkit that enable a QoS specialist/engineer to set up monitoring and reporting. Two types of monitored data are collected. The QoS engineer identifies critical activities within a component so that their delays can be captured. Also collected are data from probes in the sub-systems forming the environment in which the components execute. We discuss in detail our approach to the instrumentation of monitoring probes. Probes have been successfully instrumented for a number of (sub)systems that form this environment, including operating systems, networks, and database systems
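    The paper's own toolkit is not reproduced here. As a rough illustration of the two kinds of monitored data described above, the following Python sketch pairs a timing probe for a critical activity inside a component with a simple environment probe; the names activity_probe, os_load_probe, and measurements are invented for this example and are not from the paper.

    # Hypothetical sketch (not the paper's toolkit): capture delays of critical
    # activities and sample a metric from the surrounding environment.
    import os
    import time
    from contextlib import contextmanager

    measurements = []  # (probe_name, value) records collected for later reporting

    @contextmanager
    def activity_probe(name):
        # Capture the delay of a critical activity identified by the QoS engineer.
        start = time.perf_counter()
        try:
            yield
        finally:
            measurements.append((name, time.perf_counter() - start))

    def os_load_probe():
        # Environment probe: sample the 1-minute load average (Unix-like systems only).
        load1, _, _ = os.getloadavg()
        measurements.append(("os.load1", load1))

    # Usage: wrap a critical activity within a component, then sample the environment.
    with activity_probe("order_validation"):
        time.sleep(0.05)  # stand-in for the component's business logic
    os_load_probe()
    print(measurements)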

    Satellite Networks: Architectures, Applications, and Technologies

    Since global satellite networks are moving to the forefront in enhancing the national and global information infrastructures, owing to communication satellites' unique networking characteristics, a workshop was organized to assess the progress made to date and to chart the future. This workshop provided the forum to assess the current state of the art, identify key issues, and highlight emerging trends in next-generation architectures, data protocol development, communication interoperability, and applications. Presentations covering an overview of the field, the state of the art in research, development, deployment, and applications, and future trends in satellite networks are assembled

    Issues Related to the Emergence of the Information Superhighway and California Societal Changes, IISTPS Report 96-4

    The Norman Y. Mineta International Institute for Surface Transportation Policy Studies (IISTPS) at San José State University (SJSU) conducted this project to review the continuing development of the Internet and the Information Superhighway. Emphasis was placed on an examination of the impact on commuting and working patterns in California, and an analysis of how public transportation agencies, including Caltrans, might take advantage of the new communications technologies. The document reviews the technology underlying the current Internet “structure” and examines anticipated developments. It is important to note that much of the research for this limited-scope project was conducted during 1995, and the topic is so rapidly evolving that some information is almost automatically “dated.” The report also examines how transportation agencies are basically similar in structure and function to other business entities, and how they can continue to utilize the emerging technologies to improve internal and external communications. As part of a detailed discussion of specific transportation agency functions, it is noted that the concept of a “Roundtable Forum,” growing out of developments in Concurrent Engineering, can provide an opportunity for representatives from multiple jurisdictions to utilize the Internet for more coordinated decision-making. The report also includes an extensive analysis of demographic trends in California in recent years, such as commuting and recreational activities, and identifies how the emerging technologies may impact future changes

    2017 DWH Long-Term Data Management Coordination Workshop Report

    On June 7 and 8, 2017, the Coastal Response Research Center (CRRC)[1], the NOAA Office of Response and Restoration (ORR), and the NOAA National Marine Fisheries Service (NMFS) Restoration Center (RC) co-sponsored the Deepwater Horizon Oil Spill (DWH) Long Term Data Management (LTDM) workshop at the ORR Gulf of Mexico (GOM) Disaster Response Center (DRC) in Mobile, AL. In the wake of the DWH Natural Resource Damage Assessment (NRDA) settlement, there has been a focus on restoration planning, implementation, and monitoring of the ongoing DWH-related research. This means that data management, accessibility, and distribution must be coordinated among various federal, state, local, non-governmental organization (NGO), academic, and private sector partners. The scope of DWH far exceeded that of any other spill in the U.S., with an immense amount of data (e.g., 100,000 environmental samples, 15 million publicly available records) gathered during the response and damage assessment phases of the incident, as well as data that continues to be produced from research and restoration efforts. The challenge with this influx of data is checking its quality, documenting data collection, storing data, integrating it into useful products, managing it, and archiving it for long-term use. In addition, data must be available to the public in an easily queried and accessible format. Answering questions regarding the success of the restoration efforts will be based on data generated for years to come. The data sets must be readily comparable, representative, and complete; be collected using cross-cutting field protocols; be as interoperable as possible; meet standards for quality assurance/quality control (QA/QC); and be unhindered by conflicting or ambiguous terminology. During the data management process for the NOAA NRDA for the DWH disaster, NOAA developed a data management warehouse and visualization system that will be used as a long-term repository for accessing/archiving NRDA injury assessment data. This serves as a foundation for the restoration project planning and monitoring data for the next 15 or more years. The main impetus for this workshop was to facilitate public access to the DWH data collected and managed by all entities by developing linkages to, or data exchanges among, applicable GOM data management systems. There were 66 workshop participants (Appendix A), representing a variety of organizations, who met at NOAA’s GOM Disaster Response Center (DRC) in order to determine the characteristics of a successful common operating picture for DWH data, to understand the systems that are currently in place to manage DWH data, and to make the DWH data interoperable between data generators, users, and managers. The external partners for these efforts include, but are not limited to, the RESTORE Council, the Gulf of Mexico Research Initiative (GoMRI), the Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC), the National Academy of Sciences (NAS) Gulf Research Program, the Gulf of Mexico Alliance (GOMA), and the National Fish and Wildlife Foundation (NFWF).
    The workshop objectives were to: foster collaboration among the GOM partners with respect to data management and integration for restoration planning, implementation, and monitoring; identify standards, protocols, and guidance for LTDM being used by these partners for DWH NRDA, restoration, and public health efforts; obtain feedback and identify next steps for the work completed by the Environmental Disasters Data Management (EDDM) Working Groups; and work towards best practices for public distribution of and access to this data. The workshop consisted of plenary presentations and breakout sessions. The workshop agenda (Appendix B) was developed by the organizing committee. The workshop presentation topics included: results of a pre-workshop survey, an overview of data generation, the uses of DWH long-term data, an overview of LTDM, an overview of existing LTDM systems, an overview of data management standards/protocols, results from the EDDM working groups, flow diagrams of existing data management systems, and a vision for managing big data. The breakout sessions included discussions of: issues/concerns for data stakeholders (e.g., data users, generators, managers), interoperability, ease of discovery/searchability, data access, data synthesis, data usability, and metadata/data documentation. [1] A list of acronyms is provided on Page 1 of this report