FUZZY CLASSIFIER DESIGN WITH COEVOLUTIONARY ALGORITHMS APPLIED TO SPEAKER IDENTIFICATION


Abstract

The problem of speaker identification is considered in this article. The «Japanese Vowels» classification problem from the UCI repository is used as source data. The problem is solved with a fuzzy classifier, a classification method that is able to extract cause-and-effect relations from source data. A new method of fuzzy classifier rule base design with coevolutionary algorithms is applied: a multistep fuzzy classifier design based on repeated application of the previously developed one-step fuzzy classifier design with self-tuning coevolutionary algorithms. A computational investigation of fuzzy classifier design with coevolutionary algorithms for different numbers of speakers and different numbers of fuzzy rules was performed. The proposed method achieves acceptable classification efficiency on a test sample: from 0.985 for two speakers to 0.786 for nine speakers.

Full text

Speaker identification is an important problem related to spoken dialogue systems. It can be formulated as a classification problem, so special classification methods are required. One of them is a fuzzy classifier [1]. A fuzzy classifier is a classification algorithm based on the extraction of fuzzy rules from numerical data. Its advantage over other classification algorithms, such as neural networks, is that fuzzy rules are linguistic expressions that humans can understand as cause-and-effect relations. Thus a fuzzy classifier is one of the data mining methods for knowledge discovery.

Fuzzy classifier design can be considered as an optimization problem: we need to find the best fuzzy classifier. It includes two subproblems: rule base generation and membership function tuning. The first subproblem is the more sophisticated one because of its huge dimension and discrete variables, so in this paper we consider only the fuzzy rule base generation problem. As fuzzy rule base generation is a computationally complicated problem, a popular way of solving it is genetic-based machine learning [2; 3]. There are two basic ways of applying a genetic algorithm to obtain a fuzzy rule base: Michigan-style and Pittsburgh-style. In the Michigan approach [4] chromosomes are individual rules, and a rule set is represented by the entire population. In the Pittsburgh approach [5] chromosomes are whole rule sets. The problem of the Michigan approach is the conflict between the fitness of an individual rule and the performance of the fuzzy rule set, while Pittsburgh-style systems require a lot of computational effort. A combination of the Michigan and Pittsburgh methods is therefore a promising approach. In [6] a hybridization of both approaches, using the Michigan method as a mutation operator in a Pittsburgh-style algorithm, is presented.

Another problem with genetic algorithm application is setting the algorithm parameters. This problem is especially essential for optimization problems of high computational complexity such as fuzzy rule base generation. There are some methods of GA parameter setting; we suggest a special procedure named the cooperative-competitive coevolutionary algorithm for solving this problem [7]. A new method of combining the Michigan and Pittsburgh approaches for fuzzy classifier rule base design with coevolutionary algorithms was developed in [6].

The next idea is multistep fuzzy classifier design. After forming multiple fuzzy classifiers we have a set of fuzzy classifiers for each classification problem. The natural step is collective forming of a fuzzy rule base using the set of classifiers generated with our approach. This makes it possible to increase classification efficiency and to decrease the diversity of classification efficiency. For collective forming of a fuzzy classifier the cooperative-competitive coevolutionary algorithm can be applied again, and this procedure can be repeated several times. Thus we have formulated a multistep procedure of fuzzy classifier design. We have implemented this method and obtained results for a number of classification problems from the UCI repository. In this paper a convergence investigation of multistep fuzzy classifier design is also presented: we observe the dynamics of such parameters as classification performance, standard deviation of classification performance, and the number of unique fuzzy rules in a set of classifiers.
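To make the notion of a fuzzy rule base concrete, the sketch below shows how a small set of rules of the form «IF x1 is A1 AND ... AND xn is An THEN class c» classifies a pattern using triangular membership functions, the product t-norm, and winner-takes-all aggregation. This is only an illustration under our own assumptions (the function names, the membership shapes, and the t-norm are ours); it is not the implementation used in [1] or [6].

```python
# Illustrative sketch of fuzzy-rule classification (not the authors' exact code).
# A rule is a pair (antecedent, class label): the antecedent holds one fuzzy-set
# index per attribute; index None means "don't care" for that attribute.
import numpy as np

def triangular_sets(n_sets, lo=0.0, hi=1.0):
    """Centers and half-width of n_sets evenly spaced triangular sets on [lo, hi]."""
    centers = np.linspace(lo, hi, n_sets)
    width = (hi - lo) / (n_sets - 1)
    return centers, width

def membership(x, center, width):
    """Triangular membership value of scalar x in the set centered at `center`."""
    return max(0.0, 1.0 - abs(x - center) / width)

def firing_strength(x, antecedent, centers, width):
    """Product of per-attribute membership values (AND modeled by the product t-norm)."""
    s = 1.0
    for value, idx in zip(x, antecedent):
        if idx is not None:                      # skip "don't care" attributes
            s *= membership(value, centers[idx], width)
    return s

def classify(x, rules, centers, width):
    """Winner-takes-all: return the class of the rule with the highest firing strength."""
    best_class, best_strength = None, -1.0
    for antecedent, label in rules:
        s = firing_strength(x, antecedent, centers, width)
        if s > best_strength:
            best_class, best_strength = label, s
    return best_class

# Example: two attributes scaled to [0, 1], three fuzzy sets ("low/medium/high").
centers, width = triangular_sets(3)
rules = [((0, None), "class A"),   # IF x1 is low THEN class A
         ((2, 1),    "class B")]   # IF x1 is high AND x2 is medium THEN class B
print(classify(np.array([0.1, 0.5]), rules, centers, width))  # -> "class A"
```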
Fuzzy classifier design with coevolutionary algorithms was applied to speaker identification. The «Japanese Vowels» classification problem from the UCI repository [8] is used as source data; the task is to distinguish nine male speakers by their utterances of two Japanese vowels /ae/. We varied the number of speakers and the number of fuzzy rules used and performed the corresponding numerical experiments. The details of fuzzy classifier design with coevolutionary algorithms are described in Section 2, the investigation of multistep fuzzy classifier design is presented in Section 3, results of numerical experiments for the speaker identification problem are presented in Section 4, and conclusions are given in Section 5. (The work was supported by the Ministry of Education and Science of the Russian Federation, project № 16.740.11.0742.)

Fuzzy Classifier Design with Coevolutionary Algorithms. A new method of combining the Michigan and Pittsburgh approaches for fuzzy classifier rule base design with coevolutionary algorithms was developed in [6]. Fuzzy classifier rule base design consists of two main stages, in addition to forming the initial population of fuzzy rules using a priori information from the learning sample. At the first stage the Michigan method is used to search for fuzzy rules with a high grade of certainty. At the second stage the Pittsburgh method is applied to search for a subset of rules with good performance and a given number of rules; a constraint on the number of rules is used at this stage. This method requires less computational effort than multiobjective optimization for fuzzy rule extraction, and numerical experiments have shown that it has further advantages.

Another problem with genetic algorithm application is setting the algorithm parameters, which is especially essential for optimization problems of high computational complexity such as fuzzy rule base generation. We suggest a special procedure named the cooperative-competitive coevolutionary algorithm for solving this problem [7]. This method combines the ideas of cooperation and competition among subpopulations in the coevolutionary algorithm. We tested this algorithm on some computationally simple problems to prove its efficiency and then used it for fuzzy rule base forming. The coevolutionary algorithm for unconstrained optimization is applied at the first stage (Michigan approach), and the coevolutionary algorithm for constrained optimization is used at the second stage (Pittsburgh approach).

Michigan-style stage. The chromosomes are fuzzy rules. The chromosome length is equal to the number of attributes, and each gene is an index of the corresponding fuzzy number. The fitness function is the certainty grade of the fuzzy rule calculated on the learning sample [4]. After each generation, parents and children are combined into a common array, and different fuzzy rules with the best fitness values for each class are selected into the next generation. This population forming method provides diversity of rules for each class and diversity of classes in the population. For each generation the classification performance of the population as a whole is calculated, and the population with the best classification performance is used for the next stage of fuzzy classifier generation. The cooperative-competitive coevolutionary genetic algorithm for unconstrained optimization is applied at this stage.
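A minimal sketch of this Michigan-style stage follows. The certainty-grade fitness below is a simplified variant of the one in [1; 4] (the winning class's share of the total rule activation on the learning sample), and the selection routine keeps the best distinct rules of each class. It reuses firing_strength from the previous sketch; it is an illustration under these assumptions, not the authors' code.

```python
# Sketch of the Michigan-style stage: chromosomes are single rules, the fitness is a
# certainty grade computed on the learning sample, and the next population keeps the
# best distinct rules of every class.
import random

def rule_consequent_and_certainty(antecedent, X, y, centers, width, classes):
    """Sum firing strengths per class; the winning class becomes the consequent, and
    the certainty grade (rule fitness) is its share of the total activation."""
    beta = {c: 0.0 for c in classes}
    for x, label in zip(X, y):
        beta[label] += firing_strength(x, antecedent, centers, width)
    total = sum(beta.values())
    if total == 0.0:
        return None, 0.0                      # the rule matches no training pattern
    winner = max(beta, key=beta.get)
    return winner, beta[winner] / total

def random_rule(n_attributes, n_sets):
    """A chromosome: one fuzzy-set index (or None = 'don't care') per attribute."""
    return tuple(random.choice([None] + list(range(n_sets))) for _ in range(n_attributes))

def next_population(rules_with_fitness, pop_size, classes):
    """Combine parents and offspring, then keep the best *distinct* rules of each
    class, preserving diversity of rules and of classes in the population."""
    quota = pop_size // len(classes)
    selected, seen = [], set()
    for c in classes:
        candidates = sorted((r for r in rules_with_fitness if r[1] == c),
                            key=lambda r: r[2], reverse=True)
        kept = 0
        for antecedent, label, _ in candidates:
            if antecedent not in seen and kept < quota:
                selected.append((antecedent, label))
                seen.add(antecedent)
                kept += 1
    return selected
```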
Pittsburgh-style stage. The chromosomes are fuzzy rule sets. The chromosome length is equal to the population size of the Michigan-style stage, and the genes are binary: value «1» means that the corresponding rule is used in the set, value «0» means that it is not. The fitness function is the classification performance. A constraint on the number of rules is used; this value is specified by the researcher, because it is better to have a small number of rules in the final rule base. The cooperative-competitive coevolutionary genetic algorithm for constrained optimization is applied, and the method of forming the new generation is standard.

Thus this method of combining the Michigan and Pittsburgh approaches for fuzzy classifier rule base generation provides the advantages of both methods simultaneously. At the Michigan-style stage we obtain different rules with high certainty grades for different classes, and at the Pittsburgh-style stage we obtain the rule set with the maximum classification performance and the required number of rules. As the candidate rules are fixed and cannot be changed, the Pittsburgh-style stage does not require a lot of computational resources. A special method of initial population generation makes it possible to use a priori information from the learning sample.

Multistep Fuzzy Classifier Design. The developed method of fuzzy classifier rule base design has been applied to a number of classification machine learning problems from the UCI repository [8]:
- Credit (Australia-1) (14 attributes, 2 classes);
- Liver Disorder (6 attributes, 2 classes);
- Iris (4 attributes, 3 classes);
- Yeast (8 attributes, 10 classes);
- Glass Identification (9 attributes, 7 classes);
- Landsat Images (4 attributes, 6 classes).

Having formed multiple fuzzy classifiers, we had a set of fuzzy classifiers for each problem. Although the diversity of classification efficiency is acceptable, the problem is that only a small number of fuzzy rules are repeated within a set of fuzzy classifiers (see Table 1; the feasible number of rules is given in the Rules column). A natural step for increasing fuzzy rule repeatability is collective design of the fuzzy rule base using a set of classifiers generated with our approach. In this case the Pittsburgh-style stage of fuzzy classifier forming is performed again: the set of fuzzy rule bases is the analog of the rule pool generated after the Michigan-style stage, and the constraint on the feasible number of rules is used as well. For collective forming of the fuzzy classifier the cooperative-competitive coevolutionary algorithm can be applied again, and this procedure can be repeated several more times. So we obtain a multistep procedure of fuzzy classifier design. A threshold value of the classification performance increase, of the standard deviation decrease, or of the decrease in the number of unique rules can be used as a stopping criterion for the multistep procedure.

The action sequence for multistep fuzzy classifier forming is the following (see the code sketch below):
1) select the starting fuzzy rules with a special procedure (repeat n times);
2) perform one-step fuzzy classifier forming (repeat n times);
3) form the initial population for the next iteration from the n fuzzy classifiers;
4) if the stopping criterion is satisfied, stop; otherwise go to step 2).

Using the multistep procedure, fuzzy rule repeatability must increase and the diversity of classification efficiency must decrease; besides, it is possible to increase classification performance. We have implemented this method and confirmed these expectations. Some statistical investigations were performed for all problems.
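The following sketch puts the Pittsburgh-style subset selection and the multistep loop together. It reuses classify from the earlier sketch; ga.michigan_stage, ga.pittsburgh_stage, and ga.select_subset are placeholders for the cooperative-competitive coevolutionary optimizers described above, and rejecting oversized subsets is a simplification of the constrained coevolutionary GA of [7]. It is a sketch under those assumptions, not the authors' implementation.

```python
# Sketch of the Pittsburgh-style stage and of the multistep loop around it.
# A chromosome is a binary mask over a fixed pool of rules; its fitness is the
# classification performance of the selected subset on the learning sample.
import numpy as np

def subset_fitness(mask, rule_pool, X, y, centers, width, max_rules):
    selected = [r for r, used in zip(rule_pool, mask) if used]
    if not selected or len(selected) > max_rules:
        return 0.0                                   # infeasible: too many rules
    hits = [classify(x, selected, centers, width) == label for x, label in zip(X, y)]
    return float(np.mean(hits))

def one_step_design(X, y, centers, width, max_rules, ga):
    """Michigan stage (rule search) followed by Pittsburgh stage (subset search).
    `ga` is a placeholder for the coevolutionary optimizers described in the text."""
    rule_pool = ga.michigan_stage(X, y)              # best distinct rules per class
    mask = ga.pittsburgh_stage(
        lambda m: subset_fitness(m, rule_pool, X, y, centers, width, max_rules))
    return [r for r, used in zip(rule_pool, mask) if used]

def multistep_design(X, y, centers, width, max_rules, ga, n=10, eps=0.005):
    """Run one-step design n times, then repeatedly pool the n rule bases and let
    the Pittsburgh-style stage select new bases from the pooled rules, stopping
    when the average performance improves by less than eps."""
    bases = [one_step_design(X, y, centers, width, max_rules, ga) for _ in range(n)]
    prev_avg = 0.0
    while True:
        pool = [r for base in bases for r in base]   # union of the current bases
        bases = [ga.select_subset(pool, max_rules) for _ in range(n)]
        perf = [subset_fitness([1] * len(b), b, X, y, centers, width, max_rules)
                for b in bases]
        if np.mean(perf) - prev_avg < eps:
            return bases[int(np.argmax(perf))]       # the best final rule base
        prev_avg = float(np.mean(perf))
```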
For each problem the maximum, average, and minimum classification performance (the correctly classified part of the test sample), the standard deviation of the classification performance, and the number of unique rules over 10 rule bases are presented in Tables 1-7; the feasible number of rules is given in the Rules column. The stopping criterion is an average classification performance increase of less than 0.005 or a standard deviation equal to 0. In some cases one or two additional steps were performed to provide an equal number of steps for the same problem. We can see that the number of unique rules and the standard deviation of the classification performance decrease at each step of fuzzy classifier design; classification performance also increases for all problems when multistep fuzzy classifier design is used.

For the first two problems a comparison with alternative classification methods has been performed. These algorithms are the Bayesian approach, the multilayer perceptron, boosting [9], bagging [10], the random subspace method (RSM) [11], and cooperative coevolution ensemble learning (CCEL) [12; 13]. It should be noted that the base algorithms for bagging, boosting, RSM, and CCEL are not fuzzy classifiers; they are conventional classification methods. The performance value is the part of the test sample that is classified correctly by an algorithm. The classification performance comparison with the alternative algorithms for the Credit (Australia-1) and Liver Disorder problems is presented in Table 8.

Table 1. Results of Numerical Experiments for One-step Fuzzy Classifier Forming

| Problem | Rules | Maximum | Average | Minimum | Standard deviation | Unique rules in 10 bases |
|---|---|---|---|---|---|---|
| Credit (Australia-1) | 10 | 0.870 | 0.827 | 0.758 | 0.02482 | 100 |
| Credit (Australia-1) | 20 | 0.890 | 0.861 | 0.841 | 0.01231 | 200 |
| Credit (Australia-1) | 30 | 0.891 | 0.873 | 0.854 | 0.01035 | 300 |
| Liver Disorder | 10 | 0.687 | 0.666 | 0.632 | 0.01500 | 85 |
| Liver Disorder | 15 | 0.710 | 0.682 | 0.655 | 0.01669 | 136 |
| Liver Disorder | 20 | 0.725 | 0.692 | 0.655 | 0.01731 | 183 |
| Iris | 3 | 0.947 | 0.908 | 0.767 | 0.05643 | 12 |
| Iris | 4 | 0.973 | 0.951 | 0.900 | 0.02623 | 20 |
| Iris | 5 | 0.987 | 0.971 | 0.940 | 0.01303 | 27 |
| Iris | 6 | 0.987 | 0.975 | 0.933 | 0.01073 | 30 |
| Yeast | 20 | 0.598 | 0.573 | 0.540 | 0.01801 | 200 |
| Yeast | 30 | 0.606 | 0.586 | 0.555 | 0.01710 | 300 |
| Yeast | 60 | 0.626 | 0.593 | 0.542 | 0.02207 | 560 |
| Glass Identification | 20 | 0.757 | 0.737 | 0.706 | 0.01388 | 196 |
| Glass Identification | 30 | 0.827 | 0.781 | 0.757 | 0.01831 | 290 |
| Landsat Images | 10 | 0.849 | 0.838 | 0.821 | 0.00783 | 68 |
| Landsat Images | 15 | 0.857 | 0.847 | 0.836 | 0.00416 | 91 |
| Landsat Images | 20 | 0.857 | 0.849 | 0.835 | 0.00546 | 156 |

Table 2. Results of Multistep Fuzzy Classifier Design for Credit (Australia-1)

| Parameter | Rules | 1st iteration | 2nd iteration | 3rd iteration |
|---|---|---|---|---|
| Maximum performance | 10 | 0.870 | 0.891 | 0.891 |
| Maximum performance | 20 | 0.890 | 0.919 | 0.919 |
| Maximum performance | 30 | 0.891 | 0.926 | 0.928 |
| Average performance | 10 | 0.827 | 0.888 | 0.891 |
| Average performance | 20 | 0.861 | 0.918 | 0.919 |
| Average performance | 30 | 0.873 | 0.924 | 0.926 |
| Minimum performance | 10 | 0.758 | 0.886 | 0.891 |
| Minimum performance | 20 | 0.841 | 0.910 | 0.919 |
| Minimum performance | 30 | 0.854 | 0.922 | 0.925 |
| Standard deviation | 10 | 0.02482 | 0.00174 | 0.00000 |
| Standard deviation | 20 | 0.01231 | 0.00269 | 0.00000 |
| Standard deviation | 30 | 0.01035 | 0.00171 | 0.00127 |
| Unique rules in 10 bases | 10 | 100 | 25 | 17 |
| Unique rules in 10 bases | 20 | 200 | 51 | 40 |
| Unique rules in 10 bases | 30 | 300 | 99 | 76 |

Table 3. Results of Multistep Fuzzy Classifier Design for Liver Disorder

| Parameter | Rules | 1st iteration | 2nd iteration | 3rd iteration | 4th iteration |
|---|---|---|---|---|---|
| Maximum performance | 10 | 0.687 | 0.713 | 0.716 | 0.716 |
| Maximum performance | 15 | 0.710 | 0.739 | 0.739 | 0.742 |
| Maximum performance | 20 | 0.725 | 0.757 | 0.757 | 0.757 |
| Average performance | 10 | 0.666 | 0.705 | 0.714 | 0.716 |
| Average performance | 15 | 0.682 | 0.731 | 0.735 | 0.738 |
| Average performance | 20 | 0.692 | 0.748 | 0.754 | 0.755 |
| Minimum performance | 10 | 0.632 | 0.699 | 0.710 | 0.716 |
| Minimum performance | 15 | 0.655 | 0.719 | 0.728 | 0.733 |
| Minimum performance | 20 | 0.655 | 0.739 | 0.751 | 0.751 |
| Standard deviation | 10 | 0.01731 | 0.00449 | 0.00229 | 0.00000 |
| Standard deviation | 15 | 0.01669 | 0.00608 | 0.00411 | 0.00280 |
| Standard deviation | 20 | 0.01500 | 0.00554 | 0.00229 | 0.00246 |
| Unique rules in 10 bases | 10 | 85 | 55 | 20 | 13 |
| Unique rules in 10 bases | 15 | 136 | 68 | 46 | 31 |
| Unique rules in 10 bases | 20 | 183 | 80 | 53 | 45 |

Table 4. Results of Multistep Fuzzy Classifier Design for Yeast

| Parameter | Rules | 1st iteration | 2nd iteration | 3rd iteration | 4th iteration |
|---|---|---|---|---|---|
| Maximum performance | 20 | 0.598 | 0.609 | 0.617 | 0.621 |
| Maximum performance | 30 | 0.606 | 0.641 | 0.651 | 0.651 |
| Maximum performance | 60 | 0.626 | 0.674 | 0.676 | 0.678 |
| Average performance | 20 | 0.573 | 0.605 | 0.614 | 0.618 |
| Average performance | 30 | 0.586 | 0.633 | 0.647 | 0.649 |
| Average performance | 60 | 0.593 | 0.668 | 0.672 | 0.675 |
| Minimum performance | 20 | 0.540 | 0.602 | 0.610 | 0.617 |
| Minimum performance | 30 | 0.555 | 0.625 | 0.640 | 0.647 |
| Minimum performance | 60 | 0.542 | 0.662 | 0.667 | 0.672 |
| Standard deviation | 20 | 0.01801 | 0.00241 | 0.00246 | 0.00198 |
| Standard deviation | 30 | 0.01710 | 0.00431 | 0.00339 | 0.00123 |
| Standard deviation | 60 | 0.02207 | 0.00429 | 0.00241 | 0.00215 |
| Unique rules in 10 bases | 20 | 200 | 97 | 55 | 46 |
| Unique rules in 10 bases | 30 | 300 | 142 | 93 | 75 |
| Unique rules in 10 bases | 60 | 560 | 263 | 205 | 165 |

Table 5. Results of Multistep Fuzzy Classifier Design for Landsat Images

| Parameter | Rules | 1st iteration | 2nd iteration | 3rd iteration | 4th iteration |
|---|---|---|---|---|---|
| Maximum performance | 10 | 0.849 | 0.851 | 0.853 | 0.853 |
| Maximum performance | 15 | 0.857 | 0.861 | 0.862 | 0.862 |
| Maximum performance | 20 | 0.857 | 0.864 | 0.866 | 0.866 |
| Average performance | 10 | 0.838 | 0.850 | 0.852 | 0.852 |
| Average performance | 15 | 0.847 | 0.859 | 0.860 | 0.862 |
| Average performance | 20 | 0.849 | 0.863 | 0.865 | 0.866 |
| Minimum performance | 10 | 0.821 | 0.848 | 0.851 | 0.852 |
| Minimum performance | 15 | 0.836 | 0.856 | 0.857 | 0.861 |
| Minimum performance | 20 | 0.835 | 0.862 | 0.863 | 0.865 |
| Standard deviation | 10 | 0.00783 | 0.00107 | 0.00036 | 0.00014 |
| Standard deviation | 15 | 0.00416 | 0.00144 | 0.00148 | 0.00037 |
| Standard deviation | 20 | 0.00546 | 0.00090 | 0.00097 | 0.00034 |
| Unique rules in 10 bases | 10 | 68 | 38 | 23 | 15 |
| Unique rules in 10 bases | 15 | 91 | 63 | 46 | 33 |
| Unique rules in 10 bases | 20 | 156 | 100 | 72 | 56 |

Table 6. Results of Multistep Fuzzy Classifier Design for Iris

| Parameter | Rules | 1st iteration | 2nd iteration |
|---|---|---|---|
| Maximum performance | 3 | 0.947 | 0.980 |
| Maximum performance | 4 | 0.973 | 0.980 |
| Maximum performance | 5 | 0.987 | 0.987 |
| Maximum performance | 6 | 0.987 | 0.993 |
| Average performance | 3 | 0.908 | 0.980 |
| Average performance | 4 | 0.951 | 0.980 |
| Average performance | 5 | 0.971 | 0.987 |
| Average performance | 6 | 0.975 | 0.993 |
| Minimum performance | 3 | 0.767 | 0.980 |
| Minimum performance | 4 | 0.900 | 0.980 |
| Minimum performance | 5 | 0.940 | 0.987 |
| Minimum performance | 6 | 0.933 | 0.993 |
| Standard deviation | 3 | 0.05643 | 0.00000 |
| Standard deviation | 4 | 0.02623 | 0.00000 |
| Standard deviation | 5 | 0.01303 | 0.00000 |
| Standard deviation | 6 | 0.01073 | 0.00000 |
| Unique rules in 10 bases | 3 | 12 | 5 |
| Unique rules in 10 bases | 4 | 20 | 15 |
| Unique rules in 10 bases | 5 | 27 | 13 |
| Unique rules in 10 bases | 6 | 30 | 18 |

Table 7. Results of Multistep Fuzzy Classifier Design for Glass Identification

| Parameter | Rules | 1st iteration | 2nd iteration | 3rd iteration | 4th iteration | 5th iteration |
|---|---|---|---|---|---|---|
| Maximum performance | 20 | 0.757 | 0.836 | 0.846 | 0.850 | 0.850 |
| Maximum performance | 30 | 0.827 | 0.874 | 0.888 | 0.888 | 0.888 |
| Average performance | 20 | 0.737 | 0.824 | 0.838 | 0.846 | 0.850 |
| Average performance | 30 | 0.781 | 0.861 | 0.880 | 0.886 | 0.886 |
| Minimum performance | 20 | 0.706 | 0.813 | 0.827 | 0.841 | 0.850 |
| Minimum performance | 30 | 0.757 | 0.827 | 0.874 | 0.883 | 0.883 |
| Standard deviation | 20 | 0.01388 | 0.00737 | 0.00630 | 0.00409 | 0.00000 |
| Standard deviation | 30 | 0.01831 | 0.01354 | 0.00471 | 0.00246 | 0.00246 |
| Unique rules in 10 bases | 20 | 196 | 121 | 81 | 55 | 40 |
| Unique rules in 10 bases | 30 | 290 | 181 | 107 | 86 | 66 |
Table 8. The Classification Performance Comparison for Different Algorithms

| Algorithm | Credit (Australia-1) | Liver Disorder |
|---|---|---|
| Multistep fuzzy classifier design | 0.928 | 0.757 |
| One-step fuzzy classifier design | 0.891 | 0.725 |
| Bayesian approach | 0.847 | 0.629 |
| Multilayer perceptron | 0.833 | 0.693 |
| Boosting | 0.760 | 0.656 |
| Bagging | 0.847 | 0.630 |
| Random subspace method | 0.852 | 0.632 |
| Cooperative coevolution ensemble learning | 0.866 | 0.644 |

Speaker Identification Problem Solving. The «Japanese Vowels» problem [8] is to distinguish nine male speakers by their utterances of two Japanese vowels /ae/. Nine male speakers uttered the two vowels /ae/ successively. For each utterance, 12-degree linear prediction analysis was applied to obtain a discrete-time series of 12 LPC cepstrum coefficients. Thus one utterance by a speaker forms a time series whose length is in the range 7-29, and each point of the time series has 12 features (the 12 coefficients). There are 640 time series in total; we used one set of 270 time series for training and the other set of 370 time series for testing. For applying the fuzzy classifier we use the mean value of each coefficient over the time series (a preprocessing sketch is given after Table 9). In the numerical experiments we increased the number of speakers (the number of classes) sequentially from 2 to 9 and varied the number of fuzzy rules used. For each combination of parameters (number of speakers and number of fuzzy rules) we performed 20 algorithm runs for statistical investigation. The effectiveness parameter is the correctly classified part of the test sample. The results of the numerical experiments are presented in Table 9, which shows the average and maximum values of the effectiveness parameter for each combination of parameters.

Table 9. Results of Speaker Identification Problem Solving

| Number of speakers | 60 rules | 100 rules | 200 rules | 300 rules | 400 rules | 500 rules | 600 rules |
|---|---|---|---|---|---|---|---|
| 2 (average) | 0.842 | 0.877 | 0.930 | 0.942 | 0.943 | 0.948 | 0.952 |
| 2 (maximum) | 0.924 | 0.924 | 0.969 | 0.985 | 0.985 | 0.985 | 0.985 |
| 3 (average) | 0.570 | 0.660 | 0.768 | 0.822 | 0.845 | 0.868 | 0.872 |
| 3 (maximum) | 0.649 | 0.720 | 0.837 | 0.876 | 0.896 | 0.916 | 0.916 |
| 4 (average) | 0.625 | 0.664 | 0.762 | 0.806 | 0.821 | 0.842 | 0.856 |
| 4 (maximum) | 0.736 | 0.745 | 0.827 | 0.872 | 0.891 | 0.909 | 0.900 |
| 5 (average) | 0.555 | 0.622 | 0.730 | 0.782 | 0.813 | 0.827 | 0.827 |
| 5 (maximum) | 0.640 | 0.705 | 0.827 | 0.827 | 0.885 | 0.885 | 0.878 |
| 6 (average) | 0.591 | 0.689 | 0.777 | 0.828 | 0.844 | 0.839 | 0.865 |
| 6 (maximum) | 0.662 | 0.791 | 0.846 | 0.902 | 0.895 | 0.902 | 0.908 |
| 7 (average) | 0.501 | 0.605 | 0.733 | 0.791 | 0.809 | 0.830 | 0.845 |
| 7 (maximum) | 0.620 | 0.655 | 0.803 | 0.857 | 0.867 | 0.882 | 0.887 |
| 8 (average) | 0.467 | 0.558 | 0.689 | 0.746 | 0.761 | 0.776 | 0.790 |
| 8 (maximum) | 0.517 | 0.620 | 0.735 | 0.779 | 0.814 | 0.810 | 0.830 |
| 9 (average) | 0.399 | 0.515 | 0.649 | 0.685 | 0.724 | 0.742 | 0.755 |
| 9 (maximum) | 0.481 | 0.559 | 0.700 | 0.732 | 0.786 | 0.786 | 0.786 |
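The sketch below illustrates the preprocessing described above: each utterance is a variable-length time series of 12 LPC cepstrum coefficients, and the fuzzy classifier is fed the per-coefficient mean, i.e. one 12-dimensional vector per utterance. The file names and the blank-line-separated block format follow the UCI «Japanese Vowels» distribution as we recall it and should be treated as assumptions.

```python
# Sketch: turn the Japanese Vowels time series into fixed 12-dimensional patterns.
import numpy as np

def load_utterances(path):
    """Read a Japanese Vowels file: utterances are separated by blank lines,
    and each non-blank line holds the 12 coefficients of one analysis frame."""
    utterances, frames = [], []
    with open(path) as f:
        for line in f:
            if line.strip():
                frames.append([float(v) for v in line.split()])
            elif frames:
                utterances.append(np.array(frames))
                frames = []
    if frames:
        utterances.append(np.array(frames))
    return utterances

def to_feature_vectors(utterances):
    """Average each coefficient over the frames of an utterance (length 7-29),
    producing one 12-dimensional pattern per utterance for the classifier."""
    return np.array([u.mean(axis=0) for u in utterances])

# Usage (file name is an assumption about the dataset layout):
# X_train = to_feature_vectors(load_utterances("ae.train"))
```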
The first result of our work is the convergence investigation of multistep fuzzy classifier design. Having generated some fuzzy classifiers, we are able to construct a more effective classifier from the previous ones by applying the cooperative-competitive coevolutionary algorithm again; semantically similar fuzzy classifiers are generated in this way. The approach of multistep fuzzy classifier forming has the following features:
1) it improves classification performance without increasing the number of rules;
2) it reduces the diversity of performance values over multiple algorithm runs, i.e. the method has higher statistical stability;
3) it reduces the number of unique fuzzy rules over multiple algorithm runs;
4) the trends corresponding to features 1-3 slow down as the number of steps increases, so we can conclude that the multistep procedure of fuzzy classifier design has the convergence property;
5) the method is more effective for more complicated classification problems (more attributes and classes).

Comparison of the fuzzy classifier design methods with alternative classification methods by performance value demonstrates that both fuzzy classifier forming methods are at least as efficient as present-day classification algorithms, and in some cases more efficient.

The next result is the solution of the speaker identification problem using multistep fuzzy classifier design. The proposed method provides acceptable classification efficiency on the test sample: from 0.985 for two speakers to 0.786 for nine speakers. We can also see that, as the number of speakers increases, it is necessary to increase the number of fuzzy rules used in order to obtain appropriate results. However, permanently increasing the number of fuzzy rules is not a good idea, because beyond a certain number of rules the classification effectiveness does not increase and can even decrease. We can also see that in some cases increasing the number of speakers increases the effectiveness parameter; this depends on the speaker features. If we add a speaker whose acoustic voice features differ very much from those of the previous speakers, the effectiveness parameter increases.

About the authors

R. Sergienko

Siberian State Aerospace University named after academician M. F. Reshetnev

Email: romaserg@list.ru
Candidate of Science (Engineering), senior lecturer at the Department of System Analysis and Operations Research, Siberian State Aerospace University named after academician M. F. Reshetnev. Graduated from the Siberian State Aerospace University named after academician M. F. Reshetnev in 2010. Areas of scientific interest: intelligent information technologies, intelligent data analysis.

References

  1. Ishibuchi H., Nakashima T., Murata T. Performance evaluation of fuzzy classifier systems for multidimensional pattern classification problems // IEEE Trans. on Systems, Man, and Cybernetics. 1999. Vol. 29. P. 601-618.
  2. Cordón O., Herrera F., Hoffmann F., Magdalena L. Genetic Fuzzy Systems: Evolutionary Tuning and Learning of Fuzzy Knowledge Bases. Singapore : World Scientific, 2001.
  3. Herrera F. Genetic fuzzy systems: taxonomy, current research trends and prospects // Evolutionary Intelligence. 2008. Vol. 1, № 1. P. 27-46.
  4. Holland J. H., Reitman J. S. Cognitive systems based on adaptive algorithms // Pattern-Directed Inference Systems. San Diego, CA : Academic Press, 1978.
  5. Smith S. F. A Learning System Based on Genetic Adaptive Algorithms : PhD thesis. Department of Computer Science, University of Pittsburgh, Pennsylvania, 1980.
  6. Sergienko R. B., Semenkin E. S., Bukhtoyarov V. V. Michigan and Pittsburgh Methods Combining for Fuzzy Classifier Generating with Coevolutionary Algorithm for Strategy Adaptation // Proc. of 2011 IEEE Congress on Evolutionary Computation. New Orleans, LA, USA, 2011.
  7. Sergienko R. B., Semenkin E. S. Competitive cooperation for strategy adaptation in coevolutionary genetic algorithm for constrained optimization // Proc. of 2010 IEEE Congress on Evolutionary Computation. 2010. P. 1626-1631.
  8. UCI Machine Learning Repository. URL: http://kdd.ics.uci.edu/.
  9. Schapire R. The boosting approach to machine learning: an overview // MSRI Workshop on Nonlinear Estimation and Classification. Berkeley, CA, 2001.
  10. Breiman L. Arcing classifiers // The Annals of Statistics. 1998. Vol. 26. P. 801-849.
  11. Ho T. K. The random subspace method for constructing decision forests // IEEE Trans. on Pattern Analysis and Machine Intelligence. 1998. Vol. 20. P. 832-844.
  12. Zhuravlev J. I. An algebraic approach to recognition or classification problems // Pattern Recognition and Image Analysis. 1998. Vol. 8, № 1. P. 59-100.
  13. Voroncov V. V., Kanevsky D. U. Cooperative coevolution algorithm ensemble learning // Tavricheskiy Vestnik Informatiki i Matematiki (Tavria's Gerald of Informatics and Mathematics). 2005. P. 51-66.


© Sergienko R.B., 2012

This article is available under the Creative Commons Attribution 4.0 International License.
