Metaheuristic Algorithms for Convolution Neural Network
Modern optimization techniques are typically either heuristic or metaheuristic, and they have been used to solve optimization problems across science, engineering, and industry. However, implementation strategies that use metaheuristics to improve the accuracy of convolutional neural networks (CNNs), a well-known deep learning method, are still rarely investigated. Deep learning is a class of machine learning techniques whose aim is to move closer to the goal of artificial intelligence: creating a machine that can successfully perform any intellectual task a human can carry out. In this paper, we propose implementation strategies for three popular metaheuristic approaches, namely simulated annealing, differential evolution, and harmony search, to optimize a CNN. The performance of these metaheuristic methods in optimizing a CNN on MNIST and CIFAR classification was evaluated and compared. Furthermore, the proposed methods were also compared with the original CNN. Although the proposed methods increase computation time, they also improve accuracy (by up to 7.14 percent).

Comment: Article ID 1537325, 13 pages. Received 29 January 2016; Revised 15 April 2016; Accepted 10 May 2016. Academic Editor: Martin Hagan. In Hindawi Publishing, Computational Intelligence and Neuroscience, Volume 2016 (2016).
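As a rough illustration of the simulated-annealing component named above, here is a minimal, generic sketch in plain Python. The toy `sphere` objective stands in for a CNN's validation error; all names and parameter values are hypothetical, not the authors' implementation.

```python
import math
import random

def simulated_annealing(loss, init, step=0.5, t0=1.0, cooling=0.95, iters=200, seed=0):
    """Minimise `loss` over a real vector with simulated annealing.

    A worse candidate is accepted with probability exp(-delta / T), so the
    search can escape local minima early on; the temperature T decays
    geometrically, making the walk greedier over time.
    """
    rng = random.Random(seed)
    x = list(init)
    best, best_loss = list(x), loss(x)
    cur_loss, t = best_loss, t0
    for _ in range(iters):
        # Propose a random perturbation of the current point.
        cand = [xi + rng.uniform(-step, step) for xi in x]
        delta = loss(cand) - cur_loss
        # Always accept improvements; accept worsenings with prob exp(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, cur_loss = cand, cur_loss + delta
            if cur_loss < best_loss:
                best, best_loss = list(x), cur_loss
        t *= cooling
    return best, best_loss

# Toy objective standing in for a CNN's validation error:
sphere = lambda v: sum(vi * vi for vi in v)
sol, val = simulated_annealing(sphere, [3.0, -2.0])
```

In the paper's setting, the vector being perturbed would encode CNN weights or hyperparameters and `loss` would be a (costly) training/validation evaluation, which is why the abstract reports increased computation time.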
Parameterized Algorithmics for Computational Social Choice: Nine Research Challenges
Computational Social Choice is an interdisciplinary research area involving
Economics, Political Science, and Social Science on the one side, and
Mathematics and Computer Science (including Artificial Intelligence and
Multiagent Systems) on the other side. Typical computational problems studied
in this field include the vulnerability of voting procedures against attacks,
or preference aggregation in multi-agent systems. Parameterized Algorithmics is
a subfield of Theoretical Computer Science seeking to exploit meaningful
problem-specific parameters in order to identify tractable special cases of problems that are computationally hard in general. In this paper, we propose nine of our favorite research challenges concerning the parameterized complexity of problems appearing in this context.
Allocation in Practice
How do we allocate scarce resources? How do we fairly allocate costs? These
are two pressing challenges facing society today. I discuss two recent projects
at NICTA concerning resource and cost allocation. In the first, we have been
working with FoodBank Local, a social startup working in collaboration with
food bank charities around the world to optimise the logistics of collecting
and distributing donated food. Before we can distribute this food, we must
decide how to allocate it to different charities and food kitchens. This gives
rise to a fair division problem with several new dimensions, rarely considered
in the literature. In the second, we have been looking at cost allocation
within the distribution network of a large multinational company. This also has
several new dimensions rarely considered in the literature.

Comment: To appear in Proc. of the 37th edition of the German Conference on Artificial Intelligence (KI 2014), Springer LNC
Evaluation of second-level inference in fMRI analysis
We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and (2) data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of three phases. First, one proceeds via a general linear model for a test image that pools information from different subjects; we evaluate models that take first-level (within-subject) variability into account and models that do not. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate three commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by familywise error rate correction. FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference.
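For readers unfamiliar with the third phase, the two classic corrections mentioned can be sketched in a few lines of plain Python. This is a generic illustration of Bonferroni (familywise error rate) versus Benjamini–Hochberg (FDR) thresholding, not the paper's toolchain; the p-values are made up.

```python
def bonferroni(pvals, alpha=0.05):
    """Familywise error rate control: reject p_i iff p_i <= alpha / m."""
    m = len(pvals)
    return [p <= alpha / m for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    """FDR control: sort the p-values, find the largest rank k with
    p_(k) <= (k / m) * alpha, and reject the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    reject = [False] * m
    for i in order[:k]:
        reject[i] = True
    return reject

# Hypothetical p-values for six voxels/tests:
pvals = [0.001, 0.008, 0.020, 0.041, 0.20, 0.74]
# Bonferroni rejects 2 of 6, BH rejects 3 — FDR is the more liberal
# criterion, consistent with its greater variability noted above.
```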
Gender and Age Related Effects While Watching TV Advertisements: An EEG Study
The aim of the present paper is to show how variation in EEG frontal cortical asymmetry relates to the general appreciation perceived during the observation of TV advertisements, in particular considering the influence of gender and age. Specifically, we investigated the influence of gender on the perception of a car advertisement (Experiment 1) and the influence of age on a chewing gum commercial (Experiment 2). Experiment 1 showed statistically significantly higher approach values for the men's group throughout the commercial. Experiment 2 showed significantly lower values for older adults during the spot, which contained scenes they did not particularly enjoy. In both studies, there was no statistically significant difference between the experimental populations in the scene presenting the product offering, suggesting the absence, in our study, of a bias towards the specific product in the evaluated populations. This evidence underlines the importance of creativity in advertising in order to attract the target population.
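The frontal-asymmetry index behind such approach/withdrawal readings is conventionally the difference of log alpha-band power between homologous right and left frontal electrodes (higher values indicate relatively greater left-frontal activity, read as approach motivation). A minimal sketch with synthetic signals follows; the periodogram-based band power and all signal parameters are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def alpha_asymmetry(left, right, fs, band=(8.0, 13.0)):
    """Frontal alpha asymmetry: ln(alpha power, right) - ln(alpha power, left).

    Band power is summed from the periodogram within `band` (Hz). Because
    alpha is inversely related to cortical activity, a positive score is
    conventionally read as greater relative left-frontal activation.
    """
    def bandpower(x):
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum()
    return np.log(bandpower(right)) - np.log(bandpower(left))

# Synthetic example: a 10 Hz rhythm that is stronger on the left channel.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
left = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
right = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
score = alpha_asymmetry(left, right, fs)  # negative: more alpha on the left
```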
EEG Resting-State Brain Topological Reorganization as a Function of Age
Resting-state connectivity has been increasingly studied to investigate the effects of aging on the brain. A reduced organization in the communication between brain areas was demonstrated by combining a variety of different imaging technologies (fMRI, EEG, and MEG) and graph theory. In this paper, we propose a methodology to gain new insights into resting-state connectivity and its variations with age, by combining advanced techniques of effective connectivity estimation, a graph theoretical approach, and classification by SVM. We analyzed high-density EEG signals recorded at rest from 71 healthy subjects (age: 20–63 years). Weighted and directed connectivity was computed by means of Partial Directed Coherence based on a General Linear Kalman filter approach. To preserve the information collected by the estimator, weighted and directed graph indices were extracted from the resulting networks. A relation between brain network properties and subject age was found, indicating a tendency of the network towards a more random organization with increasing age. This result is also confirmed by dividing the whole population into two subgroups according to age (young and middle-aged adults): significant differences exist in terms of network organization measures. Classification of the subjects by means of such indices returns an accuracy greater than 80%.
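One standard index of the kind alluded to is the characteristic path length of a weighted, directed network, which drops as the topology becomes more integrated (more random-like). A plain-Python sketch follows; the 3-node connectivity matrix is hypothetical (e.g., PDC-like weights), not the authors' data or code.

```python
def characteristic_path_length(w):
    """Characteristic path length of a weighted, directed network.

    `w` is a connectivity matrix (w[i][j] = strength of influence i -> j,
    0 when absent). Stronger connections are treated as shorter distances
    (d = 1 / w), shortest paths are found with Floyd–Warshall, and the
    average shortest-path length over all ordered pairs is returned.
    """
    n = len(w)
    inf = float("inf")
    d = [[0.0 if i == j else (1.0 / w[i][j] if w[i][j] > 0 else inf)
          for j in range(n)] for i in range(n)]
    for k in range(n):                      # Floyd–Warshall relaxation
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    paths = [d[i][j] for i in range(n) for j in range(n) if i != j]
    return sum(paths) / len(paths)

# Hypothetical 3-node effective-connectivity matrix:
w = [[0.0, 0.8, 0.0],
     [0.0, 0.0, 0.5],
     [0.4, 0.0, 0.0]]
L = characteristic_path_length(w)  # 2.875 for this toy network
```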
Toward a General-Purpose Heterogeneous Ensemble for Pattern Classification
We perform an extensive study of the performance of different classification approaches on twenty-five datasets (fourteen image datasets and eleven UCI data mining datasets). The aim is to find General-Purpose (GP) heterogeneous ensembles (requiring little to no parameter tuning) that perform competitively across multiple datasets. The state-of-the-art classifiers examined in this study include the support vector machine, Gaussian process classifiers, random subspace of AdaBoost, random subspace of rotation boosting, and deep learning classifiers. We demonstrate that a heterogeneous ensemble based on simple sum-rule fusion of different classifiers performs consistently well across all twenty-five datasets. The most important result of our investigation is demonstrating that some very recent approaches, including the heterogeneous ensemble we propose in this paper, are capable of outperforming an SVM classifier (implemented with LibSVM), even when both kernel selection and SVM parameters are carefully tuned for each dataset.
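The sum rule mentioned above simply adds the classifiers' per-class posterior estimates and predicts the class with the largest total (equivalently, the mean). A minimal sketch for one sample, with hypothetical probability vectors rather than the paper's actual classifier outputs:

```python
def sum_rule_fusion(prob_sets):
    """Fuse classifiers by summing their per-class posterior estimates;
    return the index of the class with the largest total."""
    n_classes = len(prob_sets[0])
    totals = [sum(p[c] for p in prob_sets) for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: totals[c])

# Three hypothetical classifiers scoring one sample over 3 classes:
svm_p   = [0.60, 0.30, 0.10]
gp_p    = [0.20, 0.50, 0.30]
boost_p = [0.10, 0.55, 0.35]
pred = sum_rule_fusion([svm_p, gp_p, boost_p])  # class 1 wins by summed score
```

Note that the fused decision (class 1) disagrees with the single strongest classifier's vote (class 0 for the SVM), which is exactly how such fusion can smooth over individual classifiers' errors.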
Neurophysiological Profile of Antismoking Campaigns
Over the past few decades, antismoking public service announcements (PSAs) have been used by governments to promote healthy behaviours in citizens, for instance against drinking and driving and against smoking. The effectiveness of such PSAs has been suggested especially for young people. To date, PSA efficacy is still mainly assessed through traditional methods (questionnaires and metrics) and can be evaluated only after the PSAs have been broadcast, leading to a waste of economic resources and time in the case of Ineffective PSAs. One possible countermeasure to such ineffective use of PSAs is the evaluation of the cerebral reaction to the PSA in particular segments of the population (e.g., old, young, and heavy smokers). In addition, it is crucial to gather such cerebral activity in response to PSAs that have been assessed to be effective against smoking (Effective PSAs), comparing the results to the cerebral reactions to PSAs that have been certified to be not effective (Ineffective PSAs). The eventual differences between the cerebral responses toward the two PSA groups will provide crucial information about the possible outcome of new PSAs before their broadcasting. This study focused on the adult population, investigating the cerebral reaction to the vision of different PSA images that have already been shown to be Effective or Ineffective for the promotion of antismoking behaviour. Results showed how variables such as gender and smoking habits can influence the perception of PSA images, and how different communication styles of antismoking campaigns could facilitate the comprehension of the PSA's message and thus enhance the related impact.