Workshop on Nonstationary Models of Pattern Recognition and Classifier Combinations (NMRPC) Session 1
Time and Date: 16:20 - 18:00 on 2nd June 2015
Room: M110
Chair: Michal Woźniak
480 | An algebraic approach to combining classifiers Abstract: In distributed classification, each learner observes its environment and deduces a classifier. As a learner has only a local view of its environment, classifiers can be exchanged among the learners and integrated, or merged, to improve accuracy. However, the operation of merging is not defined for most classifiers. Furthermore, the classifiers that have to be merged may be of different types in settings such as ad-hoc networks, in which several generations of sensors may be creating classifiers. We introduce decision spaces as a framework for merging possibly different classifiers. We formally study the merging operation as an algebra and prove that it satisfies a desirable set of properties. The impact of time is discussed for the two main data mining settings. Firstly, decision spaces can naturally be used with non-stationary distributions, such as the data collected by sensor networks, as the impact of a model decays over time. Secondly, we introduce an approach for stationary distributions, such as homogeneous databases partitioned over different learners, which ensures that all models have the same impact. We also present a method that uses storage flexibly to achieve different types of decay for non-stationary distributions. |
Philippe Giabbanelli, Joseph Peters |
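The decision-space algebra itself is not spelled out in the abstract above. Purely as an illustration of the time-decay idea for non-stationary settings, here is a minimal Python sketch, with hypothetical names and an assumed exponential half-life scheme, of down-weighting older classifiers when their votes are combined; it is not the paper's merging operation.

    # Hypothetical sketch: exponentially decaying influence of older models.
    # This is NOT the decision-space algebra of the paper; it only illustrates
    # the idea that a model's impact decays over time in a non-stationary setting.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class TimedModel:
        created_at: float            # time step at which the classifier was learned
        predict: Callable            # maps a feature vector to a class label

    def weighted_vote(models: List[TimedModel], x, now: float, half_life: float = 10.0):
        """Combine class votes, halving a model's weight every `half_life` steps."""
        votes = {}
        for m in models:
            age = now - m.created_at
            w = 0.5 ** (age / half_life)     # older models contribute less
            label = m.predict(x)
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)

For the stationary setting described in the abstract, the analogous sketch would simply assign every model the same weight.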
36 | Power LBP: A novel texture operator for smiling and neutral facial display classification Abstract: Texture operators are commonly used to describe image content for many purposes. Recently, they have found application in the task of emotion recognition, especially with the local binary pattern (LBP) method. This paper introduces a novel texture operator called power LBP, which defines a new ordering schema based on absolute intensity differences. Its definition and interpretation are given. The performance of the suggested solution is evaluated on the problem of smiling and neutral facial display recognition. In order to evaluate the accuracy of the power LBP operator, its discriminative capacity is compared to that of several members of the LBP family. Moreover, the influence of the applied classification approach is also considered, by presenting results for k-nearest neighbour, support vector machine, and template matching classifiers. Furthermore, results for several databases are compared. |
Bogdan Smolka, Karolina Nurzynska |
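The power LBP ordering is only named in the abstract above, so the sketch below (assuming NumPy; the function name lbp_3x3 is ours) shows the classical 8-neighbour LBP code that power LBP builds on; the paper's reordering by absolute intensity differences is not reproduced here.

    import numpy as np

    def lbp_3x3(image: np.ndarray) -> np.ndarray:
        """Classical 8-neighbour LBP codes for a 2D grayscale image (border pixels skipped)."""
        img = image.astype(np.int32)
        h, w = img.shape
        centre = img[1:h - 1, 1:w - 1]
        codes = np.zeros((h - 2, w - 2), dtype=np.int32)
        # Offsets of the 8 neighbours, enumerated clockwise from the top-left pixel.
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
        for bit, (dy, dx) in enumerate(offsets):
            neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            # Set the bit when the neighbour is at least as bright as the centre pixel.
            codes += (neighbour >= centre).astype(np.int32) << bit
        return codes.astype(np.uint8)

A histogram of these 8-bit codes is the texture descriptor typically fed to the classifiers mentioned in the abstract (k-NN, SVM, template matching).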
657 | Incremental Weighted One-Class Classifier for Mining Stationary Data Streams Abstract: Data streams and big data analytics are among the most popular contemporary machine learning problems. More and more real-life problems generate massive and continuous amounts of data. Standard classifiers cannot cope with the large volume of the training set and/or the changing nature of the environment. In this paper, we deal with the problem of continuously arriving objects that, with each time interval, may contribute new, useful knowledge to the pattern classification system. This is known as stationary data stream mining. One-class classification is a very useful tool for stream analysis, as it can be used for tackling outliers, noise, the appearance of new classes, or imbalanced data, to name a few. We propose a novel version of the incremental One-Class Support Vector Machine that assigns weights to each object according to its level of significance. This allows us to train more robust one-class classifiers on incremental streams. We present two schemes for estimating weights for new, incoming data and examine their usefulness on a number of benchmark datasets. We also analyze the time and memory requirements of our method. Results of experimental investigations prove that our method can achieve better one-class recognition quality than the algorithms used so far. |
Bartosz Krawczyk, Michal Wozniak |
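The two weight-estimation schemes of the paper above are not described in the abstract, so the following sketch is only a generic illustration of chunk-wise, weighted one-class learning with scikit-learn's OneClassSVM (whose fit method accepts per-object sample_weight). The boundary-distance weighting, the buffer policy, and the class name are our assumptions, not the authors' method.

    import numpy as np
    from sklearn.svm import OneClassSVM

    class IncrementalWeightedOCSVM:
        """Chunk-by-chunk one-class learner that refits a weighted OC-SVM on a bounded buffer."""

        def __init__(self, nu=0.1, gamma="scale", buffer_size=2000):
            self.nu, self.gamma, self.buffer_size = nu, gamma, buffer_size
            self.model = None
            self.X_buf = None
            self.w_buf = None

        def partial_fit(self, X_chunk):
            if self.model is None:
                weights = np.ones(len(X_chunk))          # first chunk: uniform weights
            else:
                # Placeholder significance measure: objects near the current boundary
                # (small |decision_function|) are treated as more informative.
                scores = np.abs(self.model.decision_function(X_chunk))
                weights = 1.0 / (1.0 + scores)
            # Append the chunk to a bounded buffer; the oldest objects are dropped.
            if self.X_buf is None:
                self.X_buf, self.w_buf = X_chunk, weights
            else:
                self.X_buf = np.vstack([self.X_buf, X_chunk])[-self.buffer_size:]
                self.w_buf = np.concatenate([self.w_buf, weights])[-self.buffer_size:]
            self.model = OneClassSVM(nu=self.nu, gamma=self.gamma)
            self.model.fit(self.X_buf, sample_weight=self.w_buf)
            return self

        def predict(self, X):
            return self.model.predict(X)                 # +1 = target class, -1 = outlier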
659 | Wagging for Combining Weighted One-Class Support Vector Machines Abstract: Most machine learning problems assume that we have at our disposal objects originating from two or more classes. By learning from a representative training set, a classifier is able to estimate proper decision boundaries. However, in many real-life problems obtaining objects from some of the classes is difficult, or even impossible. In such cases, we are dealing with one-class classification, or learning in the absence of counterexamples. Such recognition systems must display a high robustness to new, unseen objects that may belong to an unknown class. That is why ensemble learning has become an attractive perspective in this field. In our work, we propose a novel one-class ensemble classifier based on wagging. A weighted version of boosting is used, and the output weights for each object are used directly in the process of training Weighted One-Class Support Vector Machines. This introduces diversity into the pool of one-class classifiers and extends the competence of the formed ensemble. Experimental analysis, carried out on a number of benchmarks and backed up with statistical analysis, proves that the proposed method can outperform state-of-the-art ensembles dedicated to one-class classification. |
Bartosz Krawczyk, Michal Wozniak |
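The abstract above feeds per-object weights directly into Weighted One-Class SVMs, but the exact weight-generation procedure is not detailed there. The sketch below therefore substitutes the classical wagging recipe of drawing random instance weights (an exponential distribution is assumed here) and passes them as sample_weight to scikit-learn's OneClassSVM; the function names and the majority-vote combiner are our own illustration, not the paper's algorithm.

    import numpy as np
    from sklearn.svm import OneClassSVM

    def train_wagged_ocsvm(X: np.ndarray, n_members: int = 10, nu: float = 0.1,
                           random_state: int = 0):
        """Train an ensemble of weighted OC-SVMs, each on randomly re-weighted data."""
        rng = np.random.default_rng(random_state)
        ensemble = []
        for _ in range(n_members):
            weights = rng.exponential(scale=1.0, size=len(X))   # random per-object weights
            clf = OneClassSVM(nu=nu, gamma="scale")
            clf.fit(X, sample_weight=weights)                   # weighted OC-SVM member
            ensemble.append(clf)
        return ensemble

    def predict_majority(ensemble, X: np.ndarray) -> np.ndarray:
        """Combine members by majority vote over their {-1, +1} outputs."""
        votes = np.stack([clf.predict(X) for clf in ensemble])  # shape: (members, samples)
        return np.sign(votes.sum(axis=0) + 1e-9).astype(int)    # ties break to +1 (target)

Varying the random weights per member is what introduces the diversity into the pool of one-class classifiers that the abstract refers to.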