Online classifier adaptation for cost-sensitive learning
In this paper, we propose the problem of online cost-sensitive classifier
adaptation and the first algorithm to solve it. We assume we have a base
classifier for a cost-sensitive classification problem, but it is trained with
respect to a cost setting different from the desired one. Moreover, we also have
some training data samples streaming to the algorithm one by one. The problem
is to adapt the given base classifier to the desired cost setting using the
streaming training samples online. To solve this problem, we propose to learn a
new classifier by adding an adaptation function to the base classifier, and to
update the adaptation function parameter according to the streaming data
samples. Given an input data sample and the cost of misclassifying it, we update
the adaptation function parameter by minimizing the cost-weighted hinge loss
while simultaneously respecting the previously learned parameter. The proposed
algorithm is compared to both online and off-line cost-sensitive algorithms on
two cost-sensitive classification problems, and the experiments show that it
not only outperforms them in classification performance, but also requires
significantly less running time.
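The update described in the abstract (adding an adaptation term to a fixed base classifier and adjusting its parameter per streaming sample by minimizing a cost-weighted hinge loss while staying close to the previous parameter) can be illustrated with a minimal sketch. The linear form of the adaptation term, the passive-aggressive-style step size, and the names `adapt_online`, `base_score`, and `lam` are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def adapt_online(base_score, stream, lam=1.0):
    """Online adaptation of a fixed base classifier (hedged sketch).

    base_score : callable returning the base classifier's real-valued score f0(x)
    stream     : iterable of (x, y, cost) with y in {-1, +1} and cost >= 0
    lam        : cap trading off closeness to the previous parameter against
                 reducing the cost-weighted hinge loss on the new sample
    """
    w = None  # parameter of the linear adaptation term w . x added to f0(x)
    for x, y, cost in stream:
        x = np.asarray(x, dtype=float)
        if w is None:
            w = np.zeros_like(x)
        score = base_score(x) + w @ x          # adapted decision value
        loss = max(0.0, 1.0 - y * score)       # hinge loss on this sample
        if loss > 0.0:
            # passive-aggressive-style closed-form step: the update size grows
            # with the cost-weighted loss but is capped by lam
            tau = min(lam, cost * loss / (x @ x + 1e-12))
            w = w + tau * y * x                # move w to reduce the weighted loss
    return w
```

In this reading, the cap `min(lam, ...)` is what keeps the update close to the previously learned parameter: a single sample can only move `w` by a bounded amount, however large its cost-weighted loss.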
COST 733 - WG4: Applications of weather type classification
The main objective of the COST Action 733 is to achieve a general numerical method for
assessing, comparing and classifying typical weather situations in the European regions. To
accomplish this goal, different workgroups are established, each with their specific aims:
WG1: Existing methods and applications (finished); WG2: Implementation and development of
weather types classification methods; WG3: Comparison of selected weather types
classifications; WG4: Testing methods for various applications.
The main task of Workgroup 4 (WG4) in COST 733 is the testing of the selected weather
type classification methods for various applications. In more detail, WG4 focuses on the following topics:
• Selection of dedicated applications (using results from WG1),
• Performance of the selected applications using the available weather types provided by WG2,
• Intercomparison of the application results obtained with the different methods,
• Final assessment of the results and uncertainties,
• Presentation and release of results to the other WGs and external interested parties,
• Recommendation of specifications for a new (common) method to WG2.
In order to address these specific aims, various applications are selected and WG4 is divided into
subgroups accordingly:
1. Air quality
2. Hydrology (& Climatological mapping)
3. Forest fires
4. Climate change and variability
5. Risks and hazards
Simultaneously, special attention is paid to several broader topics of relevance to other
COST Actions, such as phenology (COST 725), biometeorology (COST 730), agriculture (COST 734)
and mesoscale modelling and air pollution (COST 728).
Sub-groups are established to identify the advantages and disadvantages of the different classification
methods for the different applications. Focus is given to data requirements, spatial and temporal
scale, domain area, specific …
Soft Methodology for Cost-and-error Sensitive Classification
Many real-world data mining applications need varying cost for different
types of classification errors and thus call for cost-sensitive classification
algorithms. Existing algorithms for cost-sensitive classification are
successful in terms of minimizing the cost, but can result in a high error rate
as the trade-off. The high error rate holds back the practical use of those
algorithms. In this paper, we propose a novel cost-sensitive classification
methodology that takes both the cost and the error rate into account. The
methodology, called soft cost-sensitive classification, is established from a
multicriteria optimization problem of the cost and the error rate, and can be
viewed as regularizing cost-sensitive classification with the error rate. The
simple methodology allows immediate improvements of existing cost-sensitive
classification algorithms. Experiments on the benchmark and the real-world data
sets show that our proposed methodology indeed achieves lower test error rates
and similar (sometimes lower) test costs than existing cost-sensitive
classification algorithms. We also demonstrate that the methodology can be
extended for considering the weighted error rate instead of the original error
rate. This extension is useful for tackling unbalanced classification problems.
Comment: A shorter version appeared in KDD '1
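The soft cost-sensitive idea above can be read as training any existing cost-sensitive learner on a blend of the given cost matrix and the plain 0/1 error matrix. The sketch below shows only that blending step; the function name, the normalization by the maximum cost, and the weight `alpha` are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np

def soften_cost_matrix(cost, alpha=0.5):
    """Blend a cost matrix with the 0/1 error matrix (hedged sketch).

    cost  : (K, K) array, cost[y, k] = cost of predicting class k when the truth is y
    alpha : weight on the plain error rate
    """
    cost = np.asarray(cost, dtype=float)
    zero_one = 1.0 - np.eye(cost.shape[0])   # 0 on the diagonal, 1 off it
    # scale the original costs so the two criteria live on comparable ranges
    scaled = cost / max(cost.max(), 1e-12)
    return (1.0 - alpha) * scaled + alpha * zero_one

# any existing cost-sensitive learner can then be trained on the blended matrix
example_cost = np.array([[0.0, 1.0, 5.0],
                         [2.0, 0.0, 1.0],
                         [10.0, 3.0, 0.0]])
soft_cost = soften_cost_matrix(example_cost, alpha=0.3)
```

Setting `alpha = 0` recovers purely cost-sensitive training, while `alpha = 1` reduces to ordinary error-minimizing classification, which matches the abstract's description of regularizing the cost with the error rate.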
Resource and cost management
These educational lecture notes cover the fundamentals of a general theory of resource and cost management, the classification of costs for decision-making, methods of constructing the cost functions of an enterprise, the analysis of the relationship between costs, volume and profits, methods and systems of cost calculation, and the principles of a cost management system. Designed for students of the fields of study 073 «Management» and 076 «Entrepreneurship, trade and exchange activity».
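One of the listed topics, the relationship between costs, volume and profits, comes down to the standard break-even relation; a tiny illustration follows, with made-up figures and a hypothetical function name.

```python
def break_even_volume(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units that must be sold for profit to reach zero (cost-volume-profit analysis)."""
    contribution_margin = price_per_unit - variable_cost_per_unit  # profit contributed by each unit
    return fixed_costs / contribution_margin

# hypothetical figures: 50,000 in fixed costs, unit price 25, unit variable cost 15
print(break_even_volume(50_000, 25, 15))  # -> 5000.0 units
```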
Cost-Sensitive Classification: Empirical Evaluation of a Hybrid Genetic Decision Tree Induction Algorithm
This paper introduces ICET, a new algorithm for cost-sensitive
classification. ICET uses a genetic algorithm to evolve a population of biases
for a decision tree induction algorithm. The fitness function of the genetic
algorithm is the average cost of classification when using the decision tree,
including both the costs of tests (features, measurements) and the costs of
classification errors. ICET is compared here with three other algorithms for
cost-sensitive classification - EG2, CS-ID3, and IDX - and also with C4.5,
which classifies without regard to cost. The five algorithms are evaluated
empirically on five real-world medical datasets. Three sets of experiments are
performed. The first set examines the baseline performance of the five
algorithms on the five datasets and establishes that ICET performs
significantly better than its competitors. The second set tests the robustness
of ICET under a variety of conditions and shows that ICET maintains its
advantage. The third set looks at ICET's search in bias space and discovers a
way to improve the search.
Comment: See http://www.jair.org/ for any accompanying file
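The fitness ICET's genetic algorithm minimizes is described above as the average cost of classification, combining the costs of the tests the tree performs with the costs of its errors. A small sketch of such a fitness computation follows; the callables `tree_predict` and `tests_used` and the cost-table layout are hypothetical placeholders, not ICET's actual tree representation.

```python
def average_total_cost(tree_predict, tests_used, X, y, test_costs, error_costs):
    """Average per-example cost of a decision tree, counting test and error costs.

    tree_predict : callable mapping a feature vector to a predicted class index
    tests_used   : callable mapping a feature vector to the indices of the
                   features the tree actually inspects for that example
    test_costs   : test_costs[i] is the cost of measuring feature i
    error_costs  : error_costs[true][pred] is the misclassification cost
    """
    total = 0.0
    for features, true_class in zip(X, y):
        predicted = tree_predict(features)
        total += sum(test_costs[i] for i in tests_used(features))  # cost of tests performed
        total += error_costs[true_class][predicted]                # cost of the final decision
    return total / len(X)
```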
Automatic Environmental Sound Recognition: Performance versus Computational Cost
In the context of the Internet of Things (IoT), sound sensing applications
are required to run on embedded platforms where notions of product pricing and
form factor impose hard constraints on the available computing power. Whereas
Automatic Environmental Sound Recognition (AESR) algorithms are most often
developed with limited consideration for computational cost, this article seeks
which AESR algorithm can make the most of a limited amount of computing power
by comparing the sound classification performance as a function of its
computational cost. Results suggest that Deep Neural Networks yield the best
ratio of sound classification accuracy across a range of computational costs,
while Gaussian Mixture Models offer a reasonable accuracy at a consistently
small cost, and Support Vector Machines stand between both in terms of
compromise between accuracy and computational cost.
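The comparison of accuracy against computational cost suggests a simple protocol that is easy to reproduce: train several model families and record accuracy together with a cost proxy such as wall-clock prediction time. The sketch below uses scikit-learn stand-ins on synthetic data purely as an assumption; it is not the article's benchmark, models, or features.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.mixture import GaussianMixture

# placeholder data standing in for extracted audio features
X, y = make_classification(n_samples=2000, n_features=40, n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def timed_accuracy(predict, X_te, y_te):
    """Return test accuracy and the wall-clock time spent predicting."""
    start = time.perf_counter()
    pred = predict(X_te)
    elapsed = time.perf_counter() - start
    return np.mean(pred == y_te), elapsed

# neural-network and SVM stand-ins share the usual fit/predict interface
for name, model in [("MLP", MLPClassifier(max_iter=500, random_state=0)),
                    ("SVM", SVC())]:
    model.fit(X_tr, y_tr)
    acc, secs = timed_accuracy(model.predict, X_te, y_te)
    print(f"{name}: accuracy={acc:.3f}, predict time={secs:.4f}s")

# GMM stand-in: fit one mixture per class, classify by the higher likelihood
gmms = [GaussianMixture(n_components=4, random_state=0).fit(X_tr[y_tr == c]) for c in (0, 1)]
acc, secs = timed_accuracy(
    lambda X: np.argmax(np.stack([g.score_samples(X) for g in gmms], axis=1), axis=1),
    X_te, y_te)
print(f"GMM: accuracy={acc:.3f}, predict time={secs:.4f}s")
```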
