The Emergence of Big Data Policing
The past decade has seen both the proliferation of surveillance in everyday life and the rise of “big data.” Through extensive qualitative research focusing on the Los Angeles Police Department (LAPD), PRC faculty research associate Sarah Brayne explores whether and how adopting big data analytics transforms police surveillance practices. This brief demonstrates that in some cases the adoption of big data analytics merely amplifies prior practices, while in others it fundamentally transforms surveillance activities.
Population Research Center
Big Data of the Past
Big Data is not a new phenomenon. History is punctuated by regimes of data acceleration, characterized by feelings of information overload accompanied by periods of social transformation and the invention of new technologies. During these moments, private organizations, administrative powers, and sometimes isolated individuals have produced important datasets, organized following a logic that was often subsequently superseded but was, at the time, nevertheless coherent. To be translated into relevant sources of information about our past, these document series need to be redocumented using contemporary paradigms. The intellectual, methodological, and technological challenges linked to this translation process are the central subject of this article.
Will the tachyonic Universe survive the Big Brake?
We investigate a Friedmann universe filled with a tachyon scalar field, which behaves as dustlike matter in the past, while it is able to accelerate the expansion rate of the universe at late times. The comparison with type Ia supernova (SNIa) data allows for evolutions driving the universe into a Big Brake. Some of the evolutions leading to a Big Brake exhibit a large variation of the equation-of-state parameter at low redshifts, which is potentially observable with future data though hardly detectable with present SNIa data. The soft Big Brake singularity occurs at finite values of the scale factor, vanishing energy density and Hubble parameter, but diverging deceleration and infinite pressure. We show that the geodesics can be continued through the Big Brake and that our model universe will eventually recollapse in a Big Crunch. Although the time to the Big Brake strongly depends on the present values of the tachyonic field and of its time derivative, the time from the Big Brake to the Big Crunch represents a kind of invariant timescale for all field parameters allowed by SNIa.
Comment: v2: slightly expanded, 14 pages, 5 figures, 3 tables; version to be published in Phys.Rev.
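The singularity structure described in the abstract can be restated compactly in standard FRW notation (this only formalizes the limits quoted above, with a_BB denoting the finite scale factor at the Big Brake):

```latex
a \to a_{\mathrm{BB}} < \infty,\qquad
\rho \to 0,\qquad
H \equiv \frac{\dot a}{a} \to 0,\qquad
p \to +\infty,\qquad
\ddot a \to -\infty ,
```

i.e. a soft (sudden) future singularity: the scale factor, density, and expansion rate stay finite or vanish, while the pressure and deceleration diverge.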
Using Ontologies for Semantic Data Integration
While big data analytics is considered one of the most important paths to competitive advantage for today’s enterprises, data scientists spend a comparatively large amount of time in the data preparation and data integration phase of a big data project. This shows that data integration is still a major challenge in IT applications. Over the past two decades, the idea of using semantics for data integration has become increasingly crucial, and has received much attention in the AI, database, web, and data mining communities. Here, we focus on a specific paradigm for semantic data integration, called Ontology-Based Data Access (OBDA). The goal of this paper is to provide an overview of OBDA, pointing out both the techniques that are at the basis of the paradigm and the main challenges that remain to be addressed.
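The core OBDA idea — a relational source exposed as a virtual graph through mappings, with queries posed over the ontology vocabulary — can be sketched in a few lines. This is an illustrative toy only (real OBDA systems such as Ontop use R2RML mappings and rewrite queries into SQL); every table, prefix, and term name below is invented for the example.

```python
# Minimal OBDA-flavored sketch: map relational rows to a virtual
# triple graph, then answer a query over the ontology vocabulary.
# All names (emp_table, ex:, rdf:type) are invented for illustration.

# A relational source: rows of a hypothetical "emp" table.
emp_table = [
    {"id": 1, "name": "Ada", "dept": "R&D"},
    {"id": 2, "name": "Lin", "dept": "Sales"},
]

def emp_mapping(row):
    """Mapping assertion: turn one source row into ontology-level triples."""
    subject = f"ex:emp/{row['id']}"
    return [
        (subject, "rdf:type", "ex:Employee"),
        (subject, "ex:name", row["name"]),
        (subject, "ex:worksIn", row["dept"]),
    ]

def virtual_triples():
    """Expose the source as a virtual RDF graph, produced on demand."""
    for row in emp_table:
        yield from emp_mapping(row)

def employee_names():
    """Toy conjunctive query over the ontology: names of all Employees."""
    triples = list(virtual_triples())
    employees = {s for (s, p, o) in triples
                 if p == "rdf:type" and o == "ex:Employee"}
    return sorted(o for (s, p, o) in triples
                  if p == "ex:name" and s in employees)

print(employee_names())  # -> ['Ada', 'Lin']
```

The point of the paradigm is that `employee_names` never touches the table schema directly: the mapping layer is the only place where source structure appears, which is what makes the ontology a stable interface over heterogeneous sources.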
Entity Identification Problem in Big and Open Data
Big and Open Data provide great opportunities for businesses to enhance their competitive advantage if utilized properly. However, during the past few years of research on Big and Open Data processing, we have encountered a major challenge in entity identification reconciliation when trying to establish accurate relationships between entities from different data sources. In this paper, we present our innovative Intelligent Reconciliation Platform and Virtual Graphs solution that addresses this issue. With this solution, we are able to efficiently extract Big and Open Data from heterogeneous sources and integrate them into a common analysable format. Further enhanced with the Virtual Graphs technology, entity identification reconciliation is processed dynamically to produce more accurate results at system runtime. Moreover, we believe that our technology can be applied to a wide diversity of entity identification problems in several domains, e.g., e-Health, cultural heritage, and company identities in the financial world.
Ministerio de Ciencia e Innovación TIN2013-46928-C3-3-
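The entity identification problem the abstract describes can be made concrete with a minimal matching sketch. This is not the paper's Intelligent Reconciliation Platform — just a crude stand-in showing the core task: deciding which records in two sources refer to the same real-world entity. All record data and field names are invented.

```python
# Illustrative entity-reconciliation sketch: link company records across
# two heterogeneous sources via a normalized name key. All data invented.
import re

def normalize(name):
    """Crude canonical key: lowercase, drop punctuation and legal suffixes."""
    key = re.sub(r"[^a-z0-9 ]", "", name.lower())
    key = re.sub(r"\b(inc|ltd|gmbh|sa|co)\b", "", key)
    return " ".join(key.split())

source_a = [{"id": "A1", "company": "Acme, Inc."},
            {"id": "A2", "company": "Globex GmbH"}]
source_b = [{"id": "B7", "org": "ACME Inc"},
            {"id": "B9", "org": "Initech Ltd"}]

def reconcile(a_records, b_records):
    """Map each record in source A to its match in source B (or None)."""
    index = {normalize(r["org"]): r["id"] for r in b_records}
    return {r["id"]: index.get(normalize(r["company"])) for r in a_records}

print(reconcile(source_a, source_b))  # -> {'A1': 'B7', 'A2': None}
```

Real systems replace the exact-key lookup with fuzzy similarity, blocking, and graph-based evidence (as the Virtual Graphs approach suggests), but the input/output shape of the problem is the same.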
Big data quality dimensions: a systematic literature review
Although big data has become an integral part of businesses and society, there is still concern about the quality aspects of big data. Past research has focused on identifying various dimensions of big data. However, the research is scattered and there is a need to synthesize the ever-evolving phenomenon of big data. This research aims at providing a systematic literature review of the quality dimensions of big data. Based on a review of 17 articles from academic research, we present a set of key quality dimensions of big data.