153,818 research outputs found

    A network approach for managing and processing big cancer data in clouds

    Get PDF
    Translational cancer research requires integrative analysis of multiple levels of big cancer data to identify and treat cancer. Cancer data is decentralised, continually growing and being updated, and content living on or archived across different information sources partially overlaps, creating redundancies as well as contradictions and inconsistencies. To address these issues, we develop a data network model and technology for constructing and managing big cancer data. To support this data network approach to data processing and analysis, we employ a semantic content network and adopt the CELAR cloud platform. A prototype implementation shows that the CELAR cloud can satisfy the on-demand needs of various data resources for managing and processing big cancer data

    Helium synthesis, neutrino flavors, and cosmological implications

    Get PDF
    The problem of the production of helium in big bang cosmology is re-examined in the light of several recent astrophysical observations. These data, and theoretical particle physics considerations, lead to some important inconsistencies in the standard big bang model and suggest that a more complicated picture is needed. Thus, recent constraints on the number of neutrino flavors, as well as constraints on the mean density (openness) of the universe, need not be valid

    The Hidden Inconsistencies Introduced by Predictive Algorithms in Judicial Decision Making

    Full text link
    Algorithms, from simple automation to machine learning, have been introduced into judicial contexts to ostensibly increase the consistency and efficiency of legal decision making. In this paper, we describe four types of inconsistencies introduced by risk prediction algorithms. These inconsistencies threaten to violate the principle of treating similar cases similarly and often arise from the need to operationalize legal concepts and human behavior into specific measures that enable the building and evaluation of predictive algorithms. These inconsistencies, however, are likely to be hidden from their end-users: judges, parole officers, lawyers, and other decision-makers. We describe the inconsistencies and their sources, and propose various possible indicators and solutions. We also consider the issue of inconsistencies due to the use of algorithms in light of current trends towards more autonomous algorithms and less human-understandable behavioral big data. We conclude by discussing judges' and lawyers' duties of technological ("algorithmic") competence and call for greater alignment between the evaluation of predictive algorithms and corresponding judicial goals

    Data pre-processing: Case of sensor data consistency based on bi-temporal concepts

    Get PDF
    The volume, velocity, variety, veracity and value of data currently produced and consumed by different types of information systems have turned big data into a phenomenon of study. Regarding data variety, temporal data is a common source of potential inconsistency. This paper reports on a research endeavor to treat the problem of minimizing inconsistencies in temporal databases due to the unavailability of big data. This problem often occurs when the same query is executed on the same data set at different points in time. To address this issue, we propose query optimization strategies, based on query transformation and rewriting rules, to amend data consistency in temporal databases. We validate the proposed strategies via a case scenario in sensor data analysis and via manual data input, in both local and distributed query environments
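    The bi-temporal idea behind the abstract above can be sketched in a few lines: each record carries a valid-time interval (when the reading holds in the real world) and a transaction-time interval (when the database knew about it), and a query is rewritten to pin both axes. The class and function names below are illustrative assumptions, not the paper's implementation.

    ```python
    from dataclasses import dataclass
    from datetime import datetime

    FOREVER = datetime.max

    @dataclass
    class Reading:
        sensor_id: str
        value: float
        valid_from: datetime  # real-world validity interval
        valid_to: datetime
        tx_from: datetime     # database knowledge interval
        tx_to: datetime

    def as_of(rows, valid_at, tx_at):
        # Pin both temporal axes so the query is repeatable: re-running
        # it with the same (valid_at, tx_at) pair always returns the
        # same snapshot, even after later corrections arrive.
        return [r for r in rows
                if r.valid_from <= valid_at < r.valid_to
                and r.tx_from <= tx_at < r.tx_to]

    # A reading recorded on Jan 2, then corrected on Jan 5: the old row
    # is closed in transaction time rather than overwritten.
    rows = [
        Reading("s1", 20.0, datetime(2024, 1, 1), FOREVER,
                datetime(2024, 1, 2), datetime(2024, 1, 5)),   # superseded
        Reading("s1", 21.5, datetime(2024, 1, 1), FOREVER,
                datetime(2024, 1, 5), FOREVER),                # correction
    ]

    # Same valid-time question, two transaction-time snapshots:
    before = as_of(rows, datetime(2024, 1, 3), datetime(2024, 1, 3))
    after = as_of(rows, datetime(2024, 1, 3), datetime(2024, 1, 6))
    ```

    Running the same query with an unpinned transaction time would return different answers before and after the correction; fixing `tx_at` is what makes repeated executions consistent.
    
    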

    Design of a Container Stock Information System Using the Scrum Method at PT Putra Batam Jasa Mandiri Utama

    Get PDF
    PT Putra Batam Jasa Mandiri Utama is a logistics company based in Batam, Indonesia, whose main focus is shipping containers. The company offers two kinds of services; this journal focuses on its depot services. PT Putra Batam Jasa Mandiri Utama faces a major data-inconsistency problem in the application it currently uses. To address this ongoing issue, the researcher uses Scrum as the basis for planning and building a new inventory management system, driven by user feedback and improvements from the researcher. User feedback, such as inconsistencies in data and the lack of an export and import stock report feature in the current system, has created a big gap in PT Putra Batam Jasa Mandiri Utama's depot SOP (Standard Operating Procedure). As a result, the application the researcher has developed using the Laravel framework is an upgrade over what PT Putra Batam Jasa Mandiri Utama is now using; it includes features such as the Dashboard, which functions as the main bridge between Admin and Field Tally

    Adaptive Normalization in Streaming Data

    Full text link
    In today's digital era, data are everywhere, from the Internet of Things to health-care and financial applications. This leads to potentially unbounded, ever-growing big data streams that need to be utilized effectively. Data normalization is an important preprocessing technique for data analytics. It helps prevent mismodeling and reduces the complexity inherent in the data, especially for data integrated from multiple sources and contexts. Normalization of a big data stream is challenging because of evolving inconsistencies, time and memory constraints, and the non-availability of the whole data beforehand. This paper proposes a distributed approach to adaptive normalization for big data streams. Using sliding windows of fixed size, it provides a simple mechanism to adapt the statistics for normalizing changing data in each window. Implemented on Apache Storm, a distributed real-time stream data framework, our approach exploits distributed data processing for efficient normalization. Unlike other existing adaptive approaches that normalize data for a specific use (e.g., classification), ours does not. Moreover, our adaptive mechanism allows flexible control, via user-specified thresholds, of the normalization tradeoff between time and precision. The paper illustrates our proposed approach along with a few other techniques and experiments on both synthesized and real-world data. The normalized data obtained from our proposed approach, on 160,000 instances of data stream, improves over the baseline by 89% with 0.0041 root-mean-square error compared with the actual data
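    The core mechanism described above, re-estimating normalization statistics over a fixed-size sliding window, can be sketched as follows. This is a minimal single-node illustration using min-max scaling; the class name, window policy, and handling of constant windows are assumptions for the sketch, not the paper's Storm implementation.

    ```python
    from collections import deque

    class SlidingWindowNormalizer:
        """Adaptive min-max normalization over a fixed-size sliding
        window: each incoming value is scaled against the statistics
        of the most recent window_size values only, so the scaling
        tracks drift in the stream."""

        def __init__(self, window_size):
            # deque with maxlen automatically evicts the oldest value
            self.window = deque(maxlen=window_size)

        def update(self, x):
            # Fold the new value into the window, then normalize it
            # against the window's current min/max statistics.
            self.window.append(x)
            lo, hi = min(self.window), max(self.window)
            if hi == lo:  # degenerate window: constant data so far
                return 0.0
            return (x - lo) / (hi - lo)

    stream = [10, 12, 50, 11]
    norm = SlidingWindowNormalizer(window_size=3)
    scaled = [norm.update(x) for x in stream]
    ```

    Because the statistics come only from the current window, a spike such as 50 stretches the scale temporarily and then slides out, which is the adaptivity the abstract refers to; a global min-max would keep distorting all later values.
    
    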

    Nuclear Scattering at Very High Energies

    Get PDF
    We discuss the current understanding of nuclear scattering at very high energies. We point out several serious inconsistencies in present-day models, which pose major problems for any interpretation of data from high energy nuclear collisions. We outline how to develop a fully self-consistent formalism which, in addition, uses all the knowledge available from studying electron-positron annihilation and deep inelastic scattering, providing a solid basis for further developments concerning secondary interactions. Comment: Invited talk at the International Workshop on Relativistic Aspects of Nuclear Physics, Caraguatatuba, Brazil, Oct. 17-20, 200

    Inconsistencies in English Language Teaching in Pakistan: A Comparison between Public and Private Institutions

    Get PDF
    The purpose of this study was to evaluate the substantial implementation of the communicative approach in teaching English as a foreign language at the higher secondary level in Pakistan. It also attempted to differentiate between public and private institutions in CLT application. A questionnaire, an observation schedule and subsequent interviews were used to collect data. The results revealed a big gap between what teachers claim to do in the classroom and what they actually do. The teaching methodology adopted by private-sector teachers was found to be comparatively closer to the tenets of the communicative approach. During the interviews, the respondents tried to justify the inconsistencies in their teaching style. The examination washback effect was a justification commonly put forward by teachers in both sectors. Inexperienced teachers, the physical environment, strategic facilities, over-crowded classrooms and the non-availability of teaching material were some of the other excuses given by the interviewees. Key words: ELT, teaching methodology, CLT, GTM, inconsistencies

    Big Data Quality Challenges in the Context of Business Analytics

    Get PDF
    Big data creates a variety of business possibilities and helps to gain competitive advantage through prediction, optimization and adaptability. Much big data analysis does not consider the impact of errors or inconsistencies across the different sources from which the data originates, or how frequently the data is acquired. This thesis examines big data quality challenges in the context of business analytics. The intent of the thesis is to improve knowledge of big data quality issues and of testing big data. Most of the quality challenges are related to understanding the data, coping with messy source data and interpreting analytical results. Producing analytics requires subjective decisions along the analysis pipeline, and analytical results may not lead to objective truth. Errors in big data are not corrected as in traditional data; instead, the focus of testing moves towards process-oriented validation
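    The shift towards process-oriented validation mentioned above can be illustrated with a small sketch: rather than repairing individual bad records, each pipeline stage checks invariants on the batch it emits and reports violations. The function name, fields and thresholds below are illustrative assumptions, not from the thesis.

    ```python
    def validate_stage(records, min_rows=1, required_fields=("id", "value")):
        """Process-oriented check on one pipeline stage's output:
        returns a list of issue descriptions (empty means the batch
        passed). Records are plain dicts for the sketch."""
        issues = []
        if len(records) < min_rows:
            issues.append("batch smaller than expected")
        for i, rec in enumerate(records):
            missing = [f for f in required_fields if f not in rec]
            if missing:
                issues.append(f"record {i} missing fields: {missing}")
        return issues

    clean = [{"id": 1, "value": 3.2}, {"id": 2, "value": 4.1}]
    messy = [{"id": 3}]  # 'value' dropped somewhere upstream

    clean_issues = validate_stage(clean)
    messy_issues = validate_stage(messy)
    ```

    The point of the design is that the check attaches to the process (the stage's output contract), not to any one record: an upstream schema change surfaces as a reported invariant violation instead of silently propagating into the analytics.
    
    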