Group Importance Sampling for Particle Filtering and MCMC
Bayesian methods and their implementations by means of sophisticated Monte
Carlo techniques have become very popular in signal processing in recent
years. Importance Sampling (IS) is a well-known Monte Carlo technique that
approximates integrals involving a posterior distribution by means of weighted
samples. In this work, we study the assignment of a single weighted sample
which compresses the information contained in a population of weighted samples.
Part of the theory that we present as Group Importance Sampling (GIS) has been
employed implicitly in different works in the literature. The provided analysis
yields several theoretical and practical consequences. For instance, we discuss
the application of GIS into the Sequential Importance Resampling framework and
show that Independent Multiple Try Metropolis schemes can be interpreted as a
standard Metropolis-Hastings algorithm, following the GIS approach. We also
introduce two novel Markov Chain Monte Carlo (MCMC) techniques based on GIS.
The first one, the Group Metropolis Sampling method, produces a Markov chain
of sets of weighted samples. All these sets are then employed for obtaining a
unique global estimator. The second one is the Distributed Particle
Metropolis-Hastings technique, where different parallel particle filters are
jointly used to drive an MCMC algorithm. Different resampled trajectories are
compared and then tested with a proper acceptance probability. The novel
schemes are tested in different numerical experiments such as learning the
hyperparameters of Gaussian Processes, two localization problems in a wireless
sensor network (with synthetic and real data) and the tracking of vegetation
parameters given satellite observations, where they are compared with several
benchmark Monte Carlo techniques. Three illustrative Matlab demos are also
provided.
Comment: To appear in Digital Signal Processing. Related Matlab demos are
provided at https://github.com/lukafree/GIS.gi
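As context for the weighted-sample machinery the abstract builds on, here is a minimal self-normalized importance sampling sketch with a single "group" particle obtained by resampling. This is plain IS for illustration only, not the authors' GIS scheme; the target, proposal, and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target pi(x) ∝ exp(-x^2 / 2) (illustrative choice).
def log_target(x):
    return -0.5 * x**2

def importance_sampling(n=5000, sigma_q=2.0):
    # Draw from a wider Gaussian proposal q(x) = N(0, sigma_q^2).
    x = rng.normal(0.0, sigma_q, size=n)
    log_q = -0.5 * (x / sigma_q)**2 - np.log(sigma_q * np.sqrt(2 * np.pi))
    log_w = log_target(x) - log_q          # unnormalized log-weights
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                           # self-normalized weights
    est_mean = np.sum(w * x)               # estimate of E[x] under the target
    z_hat = np.mean(np.exp(log_w))         # estimate of the normalizing constant
    # Compress the whole weighted population into one resampled particle,
    # in the spirit of summarizing a group by a single weighted sample.
    x_group = rng.choice(x, p=w)
    return est_mean, z_hat, x_group
```

For the standard-normal target above, `est_mean` should be near 0 and `z_hat` near sqrt(2*pi) ≈ 2.51.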
Model-Agnostic Semantics in Shipboard Systems Against Data Availability Attacks
Data availability attacks, such as latency attacks, data dropouts, and time synchronization attacks (TSAs), remain a prime concern: they affect not only the network level but also impair the control system performance and stability of the physical layer in medium voltage DC (MVDC) shipboard power systems (SPSs). To address their impact on the physical layer, we propose a model-agnostic semantic architecture with process-aware delay compensation capabilities. Unlike traditional model predictive controllers (MPCs) with a limited prediction horizon, the proposed architecture offers long event-driven prediction even under large random measurement delays. The semantic prediction policy is governed by the inner control-loop dynamics of power generation modules (PGMs) to provide reconstructed signals for delay compensation. Its robustness has been extensively tested and validated on a nominal 12 kV two-zone MVDC SPS in an OPAL-RT environment for the above-mentioned attacks. Overall, the proposed model-agnostic estimator (MAE) has the potential to significantly improve the resilience of SPSs against cyber-attacks without revealing any model information.
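For contrast with the model-agnostic approach described above, here is a minimal model-based delay-compensation sketch: a one-dimensional rollout predictor that replays buffered control inputs through a nominal plant model. The dynamics, gain, and delay below are illustrative assumptions; this is the classical alternative the paper improves on, not its MAE, which avoids exposing any model information.

```python
# Nominal scalar plant: x[k+1] = a*x[k] + b*u[k].
# The controller receives the state measurement delayed by d steps.
a, b, d = 0.9, 0.5, 3

def compensate(x_delayed, recent_inputs):
    """Estimate the present state from a d-step-old measurement by
    rolling the nominal model forward over the last d applied inputs."""
    x = x_delayed
    for u in recent_inputs[-d:]:
        x = a * x + b * u
    return x
```

When the nominal model matches the true plant, the rollout reproduces the current state exactly; model mismatch is precisely where such compensation degrades and a learned, model-agnostic predictor becomes attractive.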
A project management approach for a mining stope
Abstract: Continuously increasing costs and weakening commodity prices have forced mining companies to focus on improving their efficiency in order to enhance production rates. Because of this, more companies are treating their daily mining operations as a project in order to achieve this goal. This paper focuses on an underground stoping panel project where platinum group metals (PGMs) are extracted using conventional drill-and-blast mining methods. A conventional stope can be classified as a project due to its uncertain and unpredictable characteristics, many variations, and a large number of interdependencies. These interdependencies may be minor linked activities whose characteristics tend to increase the risk of failure. The 'project management approach' applied to this case study should consider the risks associated with each event and serve as a method to avoid disruption caused by unforeseen events. In project management, specific methods are applied to achieve objectives. In this study the critical chain project management (CCPM) approach and the event chain project management (ECPM) approach are compared to determine which is more applicable for use in an underground stope. The aim is to improve the efficiency of day-to-day stoping activities using a project management approach. The day-to-day operations are guided by a definite goal: achieving the most effective blast. This approach will improve project planning, thereby assisting in preparation for any uncertainty.
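Both CCPM and ECPM build on the same scheduling core: a forward pass over the activity network to find the longest (critical) path. A toy sketch of that computation follows; the activity names and durations are hypothetical, not taken from the study's motion-time data.

```python
# Hypothetical stoping-cycle activities: name -> (duration in hours, predecessors).
activities = {
    "drill":     (3.0, []),
    "charge":    (1.5, ["drill"]),
    "blast":     (0.5, ["charge"]),
    "ventilate": (2.0, ["blast"]),
    "support":   (2.5, ["blast"]),
    "clean":     (3.0, ["ventilate", "support"]),
}

def earliest_finish(acts):
    """Forward pass: earliest finish time per activity. The maximum over
    the final activity's predecessors defines the critical chain, which
    CCPM protects with buffers and ECPM perturbs with event chains."""
    ef = {}
    def visit(name):
        if name not in ef:
            dur, preds = acts[name]
            ef[name] = dur + max((visit(p) for p in preds), default=0.0)
        return ef[name]
    for name in acts:
        visit(name)
    return ef
```

With the durations above the project finishes at 10.5 hours, driven by the drill-charge-blast-support-clean chain; shaving time off any activity on that chain shortens the whole cycle, which is exactly the efficiency lever the paper pursues.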
Electroencephalography brain computer interface using an asynchronous protocol
A dissertation submitted to the Faculty of Science,
University of the Witwatersrand, in fulfilment of the
requirements for the degree of Master of Science. October 31, 2016.
Brain Computer Interface (BCI) technology is a promising new channel for communication
between humans and computers, and consequently other humans. This technology has the
potential to form the basis for a paradigm shift in communication for people with disabilities or
neuro-degenerative ailments. The objective of this work is to create an asynchronous BCI that
is based on a commercial-grade electroencephalography (EEG) sensor. The BCI is intended
to allow a user of possibly low income means to issue control signals to a computer by using
modulated cortical activation patterns as a control signal. The user achieves this modulation
by performing a mental task such as imagining waving the left arm until the computer performs
the action intended by the user. In our work, we make use of the Emotiv EPOC headset to
perform the EEG measurements. We validate our models by assessing their performance when
the experimental data is collected using clinical-grade EEG technology. We make use of a
publicly available data-set in the validation phase.
We apply signal processing concepts to extract the power spectrum of each electrode from
the EEG time-series data. In particular, we make use of the fast Fourier transform (FFT).
Specific bands in the power spectra are used to construct a vector that represents an abstract
state the brain is in at that particular moment. The selected bands are motivated by insights
from neuroscience. The state vector is used in conjunction with a model that performs classification. The exact purpose of the model is to associate the input data with an abstract
classification result, which can then be used to select the appropriate set of instructions to be executed
by the computer. In our work, we make use of probabilistic graphical models to perform
this association.
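The band-power feature extraction described above can be sketched as follows. The band edges (theta, alpha, beta) and the 128 Hz sampling rate are illustrative assumptions, not the dissertation's exact configuration.

```python
import numpy as np

fs = 128  # assumed sampling rate in Hz

def band_powers(eeg_window, bands=((4, 8), (8, 13), (13, 30))):
    """Per-electrode power in selected frequency bands via the FFT.
    eeg_window: array of shape (n_samples, n_electrodes).
    Returns a flat state vector of shape (n_bands * n_electrodes,)."""
    n = eeg_window.shape[0]
    # Power spectrum of each electrode's time series.
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=0)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(spectrum[mask].sum(axis=0))  # band power per electrode
    return np.concatenate(feats)
```

The resulting vector is the abstract brain-state representation fed to the downstream classifier at each time step.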
The performance of two probabilistic graphical models is evaluated in this work. As a
preliminary step, we perform classification on pre-segmented data and we assess the performance
of the hidden conditional random fields (HCRF) model. The pre-segmented data has a trial
structure such that each data file contains the power spectra measurements associated with only
one mental task. The objective of the assessment is to determine how well the HCRF models the
spatio-spectral and temporal relationships in the EEG data when mental tasks are performed
in the aforementioned manner. In other words, the HCRF is to model the internal dynamics
of the data corresponding to the mental task. The performance of the HCRF is assessed over
three and four classes. We find that the HCRF can model the internal structure of the data
corresponding to different mental tasks.
As the final step, we perform classification on continuous data that is not segmented and
assess the performance of the latent dynamic conditional random fields (LDCRF). The LDCRF
is used to perform sequence segmentation and labeling at each time-step so as to allow the
program to determine which action should be taken at that moment. The sequence segmentation
and labeling is the primary capability that we require in order to facilitate an asynchronous
BCI protocol. The continuous data has a trial structure such that each data file contains the
power spectra measurements associated with three different mental tasks. The mental tasks
are randomly selected at 15 second intervals. The objective of the assessment is to determine
how well the LDCRF models the spatio-spectral and temporal relationships in the EEG data,
both within each mental task and in the transitions between mental tasks. The performance of
the LDCRF is assessed over three classes for both the publicly available data and the data we
obtained using the Emotiv EPOC headset. We find that the LDCRF produces a true positive
classification rate of 82.31% averaged over three subjects, on the validation data which is in the
publicly available data. On the data collected using the Emotiv EPOC, we find that the LDCRF
produces a true positive classification rate of 42.55% averaged over two subjects.
In the two assessments involving the LDCRF, the random classification strategy would
produce a true positive classification rate of 33.34%. It is thus clear that our classification
strategy provides above random performance on the two groups of data-sets. We conclude that
our results indicate that creating low-cost EEG based BCI technology holds potential for future
development. However, as discussed in the final chapter, further work on both the software and
low-cost hardware aspects is required in order to improve the performance of the technology as
it relates to the low-cost context.
Assessment of the Methodology for Establishing the EU List of Critical Raw Materials - Annexes
This report presents the results of work carried out by the Directorate General (DG) Joint Research Centre (JRC) of the European Commission (EC), in close cooperation with Directorate-General for Internal Market, Industry, Entrepreneurship and SMEs (GROW), in the context of the revision of the EC methodology that was used to identify the list of critical raw materials (CRMs) for the EU in 2011 and 2014 (EC 2011, 2014). As a background report, it complements the corresponding Guidelines Document, which contains the “ready-to-apply” methodology for updating the list of CRMs in 2017. This background report highlights the needs for updating the EC criticality methodology, the analysis and the proposals for improvement with related examples, discussion and justifications. However, a few initial remarks are necessary to clarify the context, the objectives of the revision and the approach.
As the in-house scientific service of the EC, DG JRC was asked to provide scientific advice to DG GROW in order to assess the current methodology, identify aspects that have to be adapted to better address the needs and expectations of the list of CRMs, and ultimately propose an improved and integrated methodology. This work was conducted in close consultation with the ad hoc working group on CRMs, which participated in regular discussions and provided informed expert feedback. The analysis and subsequent revision started from the assumption that the methodology used for the 2011 and 2014 CRM lists proved to be reliable and robust; therefore, the JRC mandate was focused on fine-tuning and/or targeted incremental methodological improvements. An in-depth re-discussion of the fundamentals of criticality assessment and/or major changes to the EC methodology were not within the scope of this work.
High priority was given to ensuring good comparability with the criticality exercises of 2011 and 2014. The existing methodology was therefore retained, except for specific aspects for which there were policy and/or stakeholder needs on the one hand, or strong scientific reasons for refinement of the methodology on the other. This was partially facilitated through intensive dialogue with DG GROW, the CRM ad hoc working group, and other key EU and extra-EU stakeholders.
Application of the event chain project management methodology to a mining stope
Abstract: Although South Africa possesses more than 80% of the world's platinum reserves, its reputation for reliability in supplying platinum to global markets is under threat. This is due to the 49% decrease in output per worker (1999–2014), while domestic costs have risen by more than 10% annually for the past five years. In addition, the continued decline in the commodity price by 38.3% (2012–2016) has resulted in a significant portion of the sector producing at a loss in 2015. The Chamber of Mines (now the Minerals Council of South Africa) has suggested that solutions to improve productivity and reduce cost pressures are required. This research aims to provide operational excellence through the application of event chain project management (ECPM) to improve productivity and reduce operational costs. A case study was used to carry out research in platinum mines, with data collected using a motion-time study to measure the current efficiency of operations in each mining stope through actual activity durations. The results indicate that, through the application of the event chain project management methodology, risks affecting the mining stope schedule can be managed, the efficiency of operations was improved by reducing the time spent on each activity, productivity was increased by shortening the project duration, and operational costs were reduced in the process.