Modeling of traffic data characteristics by Dirichlet Process Mixtures
Conference Theme: Green Automation Toward a Sustainable Society

This paper presents a statistical method for modeling large volumes of traffic data by Dirichlet Process Mixtures (DPM). Traffic signals are generally defined by their spatial-temporal characteristics, some of which are common or similar across a set of signals, while a minority of signals have characteristics inconsistent with the majority; these are termed outliers. Outlier detection aims to segment and eliminate them in order to improve signal quality, and the problem is non-trivial. Because traffic signals share a high degree of spatial-temporal similarity both within a signal and between different types of traffic signals, traditional modeling approaches are ineffective at distinguishing these similarities and discerning their differences. This paper makes three contributions to modeling traffic data characteristics by DPM. First, a new generic statistical model for traffic data is proposed based on DPM. Second, the model achieves an outlier detection rate of 96.74% on a database of 764,027 vehicles. Third, the proposed model is scalable to the entire road network. © 2012 IEEE.
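As an illustrative sketch of the general technique the abstract describes (not the paper's actual pipeline), a Dirichlet Process Mixture can be approximated with scikit-learn's truncated variational implementation, flagging low-likelihood points as outliers. The synthetic data and the 5% score threshold are assumptions for illustration only.

```python
# Illustrative DPM-based outlier detection on synthetic "traffic-like"
# data; the data, truncation level, and threshold are assumptions.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Majority cluster of "normal" signals plus a few inconsistent points.
normal = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(500, 2))
outliers = rng.normal(loc=[5.0, 5.0], scale=0.5, size=(10, 2))
X = np.vstack([normal, outliers])

dpm = BayesianGaussianMixture(
    n_components=10,  # truncation level of the Dirichlet process
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

# Points with low log-likelihood under the fitted mixture are treated
# as outliers; here the cut-off is the 5th percentile of the scores.
scores = dpm.score_samples(X)
threshold = np.percentile(scores, 5)
is_outlier = scores < threshold
```

The DP prior lets the number of effective mixture components adapt to the data, which is what makes the approach generic across traffic signal types.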
A survey of statistical network models
Networks are ubiquitous in science and have become a focal point for
discussion in everyday life. Formal statistical models for the analysis of
network data have emerged as a major topic of interest in diverse areas of
study, and most of these involve a form of graphical representation.
Probability models on graphs date back to 1959. Along with empirical studies in
social psychology and sociology from the 1960s, these early works generated an
active network community and a substantial literature in the 1970s. This effort
moved into the statistical literature in the late 1970s and 1980s, and the past
decade has seen a burgeoning network literature in statistical physics and
computer science. The growth of the World Wide Web and the emergence of online
networking communities such as Facebook, MySpace, and LinkedIn, and a host of
more specialized professional network communities has intensified interest in
the study of networks and network data. Our goal in this review is to provide
the reader with an entry point to this burgeoning literature. We begin with an
overview of the historical development of statistical network modeling and then
we introduce a number of examples that have been studied in the network
literature. Our subsequent discussion focuses on a number of prominent static
and dynamic network models and their interconnections. We emphasize formal
model descriptions, and pay special attention to the interpretation of
parameters and their estimation. We end with a description of some open
problems and challenges for machine learning and statistics.

Comment: 96 pages, 14 figures, 333 references
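For concreteness, the 1959 probability models on graphs the review traces the field back to are the Erdős–Rényi random graphs. A minimal sketch of the G(n, p) model, in which each possible edge is included independently with probability p (not code from the survey itself):

```python
# Erdos-Renyi G(n, p): every one of the n*(n-1)/2 possible undirected
# edges is present independently with probability p.
import numpy as np

def sample_gnp(n: int, p: float, rng: np.random.Generator) -> np.ndarray:
    """Return a symmetric 0/1 adjacency matrix drawn from G(n, p)."""
    trials = rng.random((n, n)) < p      # i.i.d. Bernoulli(p) trials
    adj = np.triu(trials, k=1)           # keep the upper triangle only
    return (adj | adj.T).astype(int)     # symmetrize: undirected graph

rng = np.random.default_rng(42)
A = sample_gnp(100, 0.05, rng)
# Expected edge count is p * n * (n - 1) / 2 = 0.05 * 4950 = 247.5
n_edges = A.sum() // 2
```

Later static and dynamic network models discussed in the survey can be read as structured departures from this independent-edge baseline.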
Proceedings of the 35th International Workshop on Statistical Modelling : July 20- 24, 2020 Bilbao, Basque Country, Spain
466 p. The International Workshop on Statistical Modelling (IWSM) is a reference workshop for promoting statistical modelling and applications of Statistics, in a broad sense, for researchers, academics, and industrialists. Unfortunately, the global COVID-19 pandemic did not allow holding the 35th edition of the IWSM in Bilbao in July 2020. Despite the situation, and following the spirit of the Workshop and the Statistical Modelling Society, we are delighted to bring you this proceedings book of extended abstracts.
Coding shape inside the shape
The shape of an object lies at the interface between vision and cognition, yet the field of statistical shape analysis is far from developing a general mathematical model to represent shapes that would allow computational descriptions to express some simple tasks that are carried out robustly and effortlessly by humans. In this thesis, novel perspectives on shape characterization are presented where the shape information is encoded inside the shape. The representation is free from the dimensions of the shape, hence the model is readily extendable to any shape embedding dimension (i.e., 2D, 3D, 4D). A very desirable property is that the representation can fuse shape information with other types of information available inside the shape domain; an example would be reflectance information from an optical camera. Three novel fields are proposed within the scope of the thesis, namely "Scalable Fluctuating Distance Fields", "Screened Poisson Hyper-Fields", and "Local Convexity Encoding Fields", which are smooth fields obtained by encoding desired shape information. "Scalable Fluctuating Distance Fields", which encode parts explicitly, are presented as an interactive tool for tumor protrusion segmentation and as an underlying representation for tumor follow-up analysis. Secondly, "Screened Poisson Hyper-Fields" provide a rich characterization of the shape that encodes global, local, interior, and boundary interactions. Low-dimensional embeddings of the hyper-fields are employed to address problems of shape partitioning, 2D shape classification, and 3D non-rigid shape retrieval. Moreover, the embeddings are used to translate the shape matching problem into an image matching problem, utilizing the existing arsenal of image matching tools that could not be utilized in shape matching before. Finally, the "Local Convexity Encoding Fields" are formed by encoding information related to local symmetry and local convexity-concavity properties.
The representation performance of the shape fields is presented both qualitatively and quantitatively. The descriptors obtained using the regional encoding perspective outperform existing state-of-the-art shape retrieval methods over public benchmark databases, which is highly motivating for further study of regional-volumetric shape representations
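The core idea of encoding shape information as a field defined inside the shape domain can be sketched with a plain Euclidean distance transform as a stand-in; the thesis's fields (fluctuating distance fields, screened Poisson hyper-fields) are more elaborate, and this toy disk example is purely an assumption for illustration.

```python
# Toy example of a field living inside a shape: each interior pixel of a
# binary mask stores its distance to the shape boundary. The same call
# works unchanged for 3D volumes, matching the dimension-agnostic idea.
import numpy as np
from scipy.ndimage import distance_transform_edt

# Binary mask of a simple 2D shape: a filled disk on a 64x64 grid.
yy, xx = np.mgrid[0:64, 0:64]
shape_mask = (xx - 32) ** 2 + (yy - 32) ** 2 <= 20 ** 2

# Distance transform: zero outside the shape, increasing toward the
# medial interior, so the field itself encodes part/boundary structure.
field = distance_transform_edt(shape_mask)
```

Any additional per-pixel information available inside the domain (e.g. reflectance) can be stacked alongside such a field, which is the fusion property the abstract highlights.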
Low-dimensional representations of neural time-series data with applications to peripheral nerve decoding
Bioelectronic medicines, implanted devices that influence physiological states by peripheral neuromodulation, have promise as a new way of treating diverse conditions from rheumatism to diabetes. We here explore ways of creating nerve-based feedback for the implanted systems to act in a dynamically adapting closed loop.
In a first empirical component, we carried out decoding studies on in vivo recordings of cat and rat bladder afferents. In a low-resolution dataset, we selected informative frequency bands of the neural activity using information theory, which we then related to bladder pressure. In a second, high-resolution dataset, we analysed the population code for bladder pressure, again using information theory, and proposed an informed decoding approach that promises enhanced robustness and automatic re-calibration by creating a low-dimensional population vector.
Coming from a different direction of more general time-series analysis, we embedded a set of peripheral nerve recordings in a space of main firing characteristics by dimensionality reduction in a high-dimensional feature-space and automatically proposed single efficiently implementable estimators for each identified characteristic. For bioelectronic medicines, this feature-based pre-processing method enables an online signal characterisation of low-resolution data where spike sorting is impossible but simple power-measures discard informative structure. Analyses were based on surrogate data from a self-developed and flexibly adaptable computer model that we made publicly available.
The wider utility of two feature-based analysis methods developed in this work was demonstrated on a variety of datasets from across science and industry. (1) Our feature-based generation of interpretable low-dimensional embeddings for unknown time-series datasets answers a need for simplifying and harvesting the growing body of sequential data that characterises modern science. (2) We propose an additional, supervised pipeline to tailor feature subsets to collections of classification problems. On a literature-standard library of time-series classification tasks, we distilled 22 generically useful estimators and made them easily accessible.
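The feature-based embedding idea can be sketched as: map each time series to a small vector of cheap, interpretable estimators, then reduce that feature space to a low-dimensional embedding. The three features and the PCA step below are illustrative choices on surrogate data, not the thesis's actual estimator set or pipeline.

```python
# Feature-based time-series embedding: per-series features, then PCA.
import numpy as np
from sklearn.decomposition import PCA

def simple_features(x: np.ndarray) -> np.ndarray:
    """Three cheap, efficiently implementable estimators for one series."""
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]   # lag-1 autocorrelation
    return np.array([x.mean(), x.std(), lag1])

rng = np.random.default_rng(1)
# Surrogate dataset: 50 noisy sine waves and 50 white-noise series.
sines = [np.sin(np.linspace(0, 20, 200)) + 0.3 * rng.standard_normal(200)
         for _ in range(50)]
noise = [rng.standard_normal(200) for _ in range(50)]
features = np.array([simple_features(x) for x in sines + noise])

# Low-dimensional embedding of the feature space; structured series
# separate from white noise mainly along the autocorrelation feature.
embedding = PCA(n_components=2).fit_transform(features)
```

Because each feature is a single simple estimator, the same characterisation can run online on low-resolution recordings where spike sorting is impossible.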
Mathematics and Algorithms in Tomography
This is the eighth Oberwolfach conference on the mathematics of tomography. Modalities represented at the workshop included X-ray tomography, sonar, radar, seismic imaging, ultrasound, electron microscopy, impedance imaging, photoacoustic tomography, elastography, vector tomography, and texture analysis.
International Society for Disease Surveillance Conference 2011: Building the Future of Public Health Surveillance
Daniel Reidpath - ORCID: 0000-0002-8796-0420 https://orcid.org/0000-0002-8796-0420