
    2017 DWH Long-Term Data Management Coordination Workshop Report

    On June 7 and 8, 2017, the Coastal Response Research Center (CRRC)[1], NOAA Office of Response and Restoration (ORR), and NOAA National Marine Fisheries Service (NMFS) Restoration Center (RC) co-sponsored the Deepwater Horizon Oil Spill (DWH) Long Term Data Management (LTDM) workshop at the ORR Gulf of Mexico (GOM) Disaster Response Center (DRC) in Mobile, AL. In the wake of the DWH Natural Resource Damage Assessment (NRDA) settlement, ongoing DWH-related research has focused on restoration planning, implementation, and monitoring. This means that data management, accessibility, and distribution must be coordinated among various federal, state, local, non-governmental organization (NGO), academic, and private sector partners. The scope of DWH far exceeded any other spill in the U.S., with an immense amount of data (e.g., 100,000 environmental samples, 15 million publicly available records) gathered during the response and damage assessment phases of the incident, as well as data that continues to be produced from research and restoration efforts. The challenge with this influx of data is checking its quality, documenting its collection, storing it, integrating it into useful products, managing it, and archiving it for long-term use. In addition, data must be available to the public in an easily queried and accessible format. Answering questions regarding the success of the restoration efforts will depend on data generated for years to come. The data sets must be readily comparable, representative, and complete; be collected using cross-cutting field protocols; be as interoperable as possible; meet standards for quality assurance/quality control (QA/QC); and be unhindered by conflicting or ambiguous terminology. During the data management process for the NOAA NRDA for the DWH disaster, NOAA developed a data management warehouse and visualization system that will be used as a long-term repository for accessing and archiving NRDA injury assessment data. This serves as a foundation for restoration project planning and monitoring data for the next 15 or more years. The main impetus for this workshop was to facilitate public access to the DWH data collected and managed by all entities by developing linkages to, or data exchanges among, applicable GOM data management systems. The 66 workshop participants (Appendix A), representing a variety of organizations, met at the DRC to determine the characteristics of a successful common operating picture for DWH data, to understand the systems currently in place to manage DWH data, and to make DWH data interoperable among data generators, users, and managers. The external partners for these efforts include, but are not limited to: the RESTORE Council, Gulf of Mexico Research Initiative (GoMRI), Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC), National Academy of Sciences (NAS) Gulf Research Program, Gulf of Mexico Alliance (GOMA), and National Fish and Wildlife Foundation (NFWF).
The workshop objectives were to: foster collaboration among the GOM partners with respect to data management and integration for restoration planning, implementation, and monitoring; identify standards, protocols, and guidance for LTDM being used by these partners for DWH NRDA, restoration, and public health efforts; obtain feedback and identify next steps for the work completed by the Environmental Disasters Data Management (EDDM) Working Groups; and work towards best practices for public distribution of, and access to, these data. The workshop consisted of plenary presentations and breakout sessions. The workshop agenda (Appendix B) was developed by the organizing committee. The presentation topics included: results of a pre-workshop survey, an overview of data generation, the uses of DWH long-term data, an overview of LTDM, an overview of existing LTDM systems, an overview of data management standards/protocols, results from the EDDM working groups, flow diagrams of existing data management systems, and a vision for managing big data. The breakout sessions included discussions of: issues/concerns for data stakeholders (e.g., data users, generators, managers), interoperability, ease of discovery/searchability, data access, data synthesis, data usability, and metadata/data documentation. [1] A list of acronyms is provided on Page 1 of this report

    The Investment Process Used By Private Equity Firms: Does The Affect Heuristic Impact Decision-Making?

    Individuals utilize heuristics in order to simplify problems, which may lead to biases in decision-making. The research question of this study is: “How does the affect heuristic impact the investment process of private equity decision-makers reviewing proposals?” Through an exploratory multi-case analysis, insight is provided into complex private equity decisions by studying biases in the investment process. This is a study of private equity groups’ (PEG) decision-making process when they consider businesses for investment. Qualitative data was generated from semi-structured interviews with twenty private equity decision-makers. The deliberative heuristics applied in the teaser review are learned from process experience and guide the deliberation on whether to proceed. Simplifying heuristics are applied in the more informal review process. Organizational learning was exhibited, as the PEGs have modified their investment structures based on previous experiences. The study indicates that experience and learning lead to the construction of an affect heuristic that subsequently impacts investments. It also confirms the need for strategic decision-makers to recognize their own biases and adjust their processes accordingly. A significant practical implication of this study is the insight it provides into the views of PEG decision-makers as they anticipate the need to supplement the management team; this insight is helpful to business owners and their advisors. The study highlights the opportunities for biases in PEG decision-making processes. Accessing decision-makers at larger PEGs and approaching more middle-market firms would broaden the results. A paper based on this dissertation is available from Wiley at https://doi.org/10.1111/jsbm.12451

    Methodology for implementing information management models within enterprise resource planning systems. Application in small and medium-sized enterprises

    The Next Generation of Manufacturing Systems (SGSF, by its Spanish acronym) seeks to meet the requirements of new business models in contexts of intelligence, agility, and adaptability within a global and virtual environment. Enterprise Resource Planning (ERP) with product data management (PDM) and product lifecycle management (PLM) support provides business management solutions based on a coherent use of information technologies for implementation in CIM (Computer-Integrated Manufacturing) systems, with a high degree of adaptability to the desired organizational structure. In general, such implementations have long been under way in large companies, with far smaller (almost nil) uptake among SMEs. This doctoral thesis defines and develops a new implementation methodology for the automatic generation of information in the business processes of companies whose requirements are adapted to the needs of the SGSF, within enterprise resource planning (ERP) systems, taking the influence of the human factor into account. The validity of the theoretical model of this methodology has been verified by implementing it in an SME in the engineering sector. To establish the state of the art on this topic, a specific methodology based on the Shewhart/Deming continuous improvement cycle was designed and applied, using the bibliographic search and analysis tools available online with access to the corresponding databases

    Coordination of DWH Long-Term Data Management: The Path Forward Workshop Report

    Following the 2010 DWH Oil Spill, a vast amount of environmental data was collected (e.g., 100,000+ environmental samples, 15 million+ publicly available records). The volume of data collected introduced a number of challenges, including: data quality assurance, data storage, data integration, and long-term preservation and availability of the data. An effort to tackle these challenges began in June 2014, at a workshop focused on environmental disaster data management (EDDM) with respect to response and subsequent restoration. The previous EDDM collaboration improved communication and collaboration among a range of government, industry, and NGO entities involved in disaster management. In June 2017, the first DWH Long-Term Data Management (LTDM) workshop focused on reviewing existing data management systems, opportunities to advance integration of these systems, the availability of data for restoration planning, project implementation, and restoration monitoring efforts, and providing a platform for increased communication among the various GOM data entities. The June 2017 workshop resulted in the formation of three working groups: Data Management Standards, Interoperability, and Discovery/Searchability. These working groups spent 2018 coordinating and addressing various complex topics related to DWH LTDM. On December 4th and 5th, 2018, the Coastal Response Research Center (CRRC), NOAA Office of Response and Restoration (ORR), and NOAA National Marine Fisheries Service (NMFS) Restoration Center (RC) co-sponsored a workshop entitled Deepwater Horizon Oil Spill (DWH) Long-Term Data Management (LTDM): The Path Forward at the NOAA Gulf of Mexico (GOM) Disaster Response Center (DRC) in Mobile, AL

    The Use of ICT Tools in Tackling Insecurity and Terrorism Problem in Nigeria

    The paper seeks ICT tool solutions to crime and insurgent attacks in Nigeria by providing a broad view of the Public Security Communications System (PSCS), Public Safety Networks (PSNs), and the National Security Information Centre (NSIC), and some ways that ICT-based technologies can assist security agencies in being more efficient and effective in their operations for national development. In addition, some efficient and effective techniques to tackle insurgency were presented. Keywords: PSCS, PSNs, NSIC, Insecurity, Crime, Insurgency, Boko-Haram, Nigeria

    Relationship between hinterland connectivity with logistics performance: a case of Sarawak, Malaysia


    Tags Are Related: Measurement of Semantic Relatedness Based on Folksonomy Network

    Folksonomy and tagging systems, which allow users to interactively annotate a pool of shared resources using descriptive tags, have enjoyed phenomenal success in recent years. Concepts are organized as a map in the human mind; however, the tags in a folksonomy, which reflect users' collaborative cognition of information, are isolated under current approaches. What we do in this paper is estimate the semantic relatedness among tags in a folksonomy: are tags semantically related, rather than isolated? We introduce different algorithms to form folksonomy networks, connecting tags through users' collaborative tagging or through resource context. We then apply multiple measures of semantic relatedness to the folksonomy networks to investigate the semantic information within them. The results show that the connections between tags have relatively strong semantic relatedness, and that the relatedness decreases dramatically as the distance between tags increases. What we find in this paper could provide useful insights for designing future folksonomy-based systems, constructing the semantic web in the current state of the Internet, and developing natural language processing applications
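
    The abstract describes the network construction only at a high level; the sketch below shows one plausible reading under stated assumptions: tags are linked when they annotate the same resource (the resource-context variant), and relatedness is scored by Jaccard overlap of each tag's resource set. The toy annotations and the Jaccard measure are illustrative, not the paper's actual algorithms.

```python
from collections import defaultdict
from itertools import combinations

# Toy folksonomy: (user, resource, tag) annotations, as produced by
# a collaborative tagging system. Purely illustrative data.
annotations = [
    ("u1", "r1", "python"), ("u1", "r1", "programming"),
    ("u2", "r1", "programming"), ("u2", "r2", "music"),
    ("u3", "r2", "jazz"), ("u3", "r2", "music"),
    ("u1", "r3", "python"), ("u3", "r3", "programming"),
]

# Resource-context network: connect tags that annotate the same resource,
# weighting each edge by its co-occurrence count.
tags_by_resource = defaultdict(set)
for user, resource, tag in annotations:
    tags_by_resource[resource].add(tag)

edge_weights = defaultdict(int)
for tags in tags_by_resource.values():
    for a, b in combinations(sorted(tags), 2):
        edge_weights[(a, b)] += 1

# One simple relatedness measure over the network: Jaccard overlap of
# the resource sets annotated by each tag.
resources_by_tag = defaultdict(set)
for user, resource, tag in annotations:
    resources_by_tag[tag].add(resource)

def jaccard(tag_a, tag_b):
    ra, rb = resources_by_tag[tag_a], resources_by_tag[tag_b]
    return len(ra & rb) / len(ra | rb) if ra | rb else 0.0

print(dict(edge_weights))
print(jaccard("python", "programming"))  # high for tags used together
```

    Directly connected tags (distance 1 in this network) score highest; relatedness computed along longer paths would fall off with distance, which is the pattern the abstract reports.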

    Prediction method of cigarette draw resistance based on correlation analysis

    Cigarette draw resistance monitoring is incomplete and relies on a single indicator, lacking correlation analysis and preventive modeling, which results in substandard cigarettes reaching the market. To address this problem without increasing hardware cost, this paper uses multi-indicator correlation analysis to predict cigarette draw resistance. First, the monitoring process for draw resistance is analyzed within the existing quality control framework, and optimization ideas are proposed. In addition, for the three production units, the cut tobacco supply (VE), the tobacco rolling (SE), and the cigarette-forming (MAX) units, direct and potential factors associated with draw resistance are explored based on linear and non-linear correlation analysis. Then, the correlates of draw resistance are used as inputs to a machine learning model, and the predicted values of draw resistance are used as outputs. Finally, this research also innovatively verifies the practical application value of draw resistance prediction: the distribution characteristics of substandard cigarettes are analyzed based on the prediction results, the time interval in which substandard cigarettes are produced is determined, a probability model of substandard cigarettes being sampled is derived, and the reliability of the prediction results is further verified by example. The results show that the prediction model based on correlation analysis performed well over three months of actual production.
    Comment: Preprint, submitted to Computers and Electronics in Agriculture. For any suggestions or improvements, please contact me directly by e-mail
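
    The pipeline the abstract describes (a correlation screen over process indicators, then a regression model on the retained correlates) can be sketched as below. This is a minimal illustration on synthetic data: the column names, the Pearson threshold, and the random-forest choice are assumptions, not the paper's actual indicators or model.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for process data from the three production units
# (VE, SE, MAX); column names are illustrative, not the paper's.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "ve_cut_width": rng.normal(0.9, 0.05, n),     # no real effect here
    "se_rod_density": rng.normal(230.0, 10.0, n),
    "max_vent_rate": rng.normal(28.0, 3.0, n),
})
df["draw_resistance"] = (
    4.0 * df["se_rod_density"] / 230.0
    - 0.04 * df["max_vent_rate"]
    + rng.normal(0.0, 0.1, n)
)

# Step 1: correlation screen -- keep indicators whose |Pearson r| with
# draw resistance exceeds a threshold. (The paper also uses non-linear
# correlation; a rank correlation could be added at this step.)
corr = df.corr()["draw_resistance"].drop("draw_resistance")
features = corr[corr.abs() > 0.3].index.tolist()

# Step 2: fit a regressor on the retained correlates and evaluate.
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["draw_resistance"], random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
print("selected:", features)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```

    Predicted draw resistance for upcoming production can then be compared against specification limits to flag the time windows in which substandard cigarettes are likely being produced, as the abstract's application analysis describes.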

    CWI-evaluation - Progress Report 1993-1998
