C4.5 Classification Decision Tree Algorithm Implemented in MATLAB
function D = C4_5(train_features, train_targets, inc_node, region)
% Classify using Quinlan's C4.5 algorithm
% Inputs:
%   features    - Train features
%   targets     - Train targets
%   inc_node    - Percentage of incorrectly assigned samples at a node
%   region      - Decision region vector: [-x x -y y number_of_points]
%
% Outputs:
%   D           - Decision surface
%
% NOTE: In this implementation it is assumed that a feature vector with fewer
% than 10 unique values (the parameter Nu) is discrete, and will be treated as
% such. Other vectors will be treated as continuous.

[Ni, M]     = size(train_features);
inc_node    = inc_node*M/100;
Nu          = 10;

%For the decision region
N           = region(5);
mx          = ones(N,1) * linspace(region(1), region(2), N);
my          = linspace(region(3), region(4), N)' * ones(1,N);
flatxy      = [mx(:), my(:)]';

%Preprocessing
%[f, t, UW, m]  = PCA(train_features, train_targets, Ni, region);
%train_features = UW * (train_features - m*ones(1,M));
%flatxy         = UW * (flatxy - m*ones(1,N^2));

%Find which of the input features are discrete, and discretize the
%corresponding dimension on the decision region
discrete_dim = zeros(1,Ni);
for i = 1:Ni,
    Nb = length(unique(train_features(i,:)));
    if (Nb <= Nu),
        %This is a discrete feature
        discrete_dim(i)  = Nb;
        [H, flatxy(i,:)] = high_histogram(flatxy(i,:), Nb);
    end
end

%Build the tree recursively
disp('Building tree')
tree = make_tree(train_features, train_targets, inc_node, discrete_dim, max(discrete_dim), 0);

%Make the decision region according to the tree
disp('Building decision surface using the tree')
targets = use_tree(flatxy, 1:N^2, tree, discrete_dim, unique(train_targets));

D = reshape(targets, N, N);
%END

function targets = use_tree(features, indices, tree, discrete_dim, Uc)
%Classify recursively using a tree

targets = zeros(1, size(features,2));

if (tree.dim == 0)
    %Reached the end of the tree
    targets(indices) = tree.child;
    return
end

%This is not the last level of the tree, so:
%First, find the dimension we are to work on
dim  = tree.dim;
dims = 1:size(features,1);

%And classify according to it
if (discrete_dim(dim) == 0),
    %Continuous feature
    in      = indices(find(features(dim, indices) <= tree.split_loc));
    targets = targets + use_tree(features(dims, :), in, tree.child(1), discrete_dim(dims), Uc);
    in      = indices(find(features(dim, indices) >  tree.split_loc));
    targets = targets + use_tree(features(dims, :), in, tree.child(2), discrete_dim(dims), Uc);
else
    %Discrete feature
    Uf = unique(features(dim,:));
    for i = 1:length(Uf),
        in      = indices(find(features(dim, indices) == Uf(i)));
        targets = targets + use_tree(features(dims, :), in, tree.child(i), discrete_dim(dims), Uc);
    end
end
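The listing above relies on a helper `make_tree` (not included in this excerpt) that, at each node, picks the split C4.5 scores best by gain ratio: the information gain of a candidate split divided by its split information. As a language-neutral illustration of that criterion only, here is a small Python sketch; the function names `entropy` and `gain_ratio` are illustrative and are not part of the MATLAB toolbox code.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """C4.5 split criterion for a discrete feature:
    information gain divided by split information."""
    n = len(labels)
    # Partition the labels by feature value
    groups = {}
    for v, y in zip(feature_values, labels):
        groups.setdefault(v, []).append(y)
    # Conditional entropy of the labels after the split
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    gain = entropy(labels) - cond
    # Split information = entropy of the partition sizes themselves
    split_info = entropy(list(feature_values))
    return gain / split_info if split_info > 0 else 0.0

# A perfectly separating feature achieves the maximal ratio of 1.0
print(gain_ratio(['a', 'a', 'b', 'b'], [0, 0, 1, 1]))  # 1.0
```

Dividing by the split information is what distinguishes C4.5 from ID3: it penalizes many-valued features that would otherwise look artificially informative.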