This article presents the source code of the ELM (Extreme Learning Machine), in the hope that it offers some reference value for developers solving related programming problems. If you need it, follow along below!
ELM and SVM were once more popular methods than CNN; this post keeps the code as a memento:
function [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm(train_data, test_data, Elm_Type, NumberofHiddenNeurons, ActivationFunction)

% Usage: elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
% OR: [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
%
% Input:
% TrainingData_File - Filename of training data set
% TestingData_File - Filename of testing data set
% Elm_Type - 0 for regression; 1 for (both binary and multi-classes) classification
% NumberofHiddenNeurons - Number of hidden neurons assigned to the ELM
% ActivationFunction - Type of activation function:
% 'sig' for Sigmoidal function
% 'sin' for Sine function
% 'hardlim' for Hardlim function
% 'tribas' for Triangular basis function
% 'radbas' for Radial basis function (for additive type of SLFNs instead of RBF type of SLFNs)
%
% Output:
% TrainingTime - Time (seconds) spent on training ELM
% TestingTime - Time (seconds) spent on predicting ALL testing data
% TrainingAccuracy - Training accuracy:
% RMSE for regression or correct classification rate for classification
% TestingAccuracy - Testing accuracy:
% RMSE for regression or correct classification rate for classification
%
% MULTI-CLASS CLASSIFICATION: THE NUMBER OF OUTPUT NEURONS IS AUTOMATICALLY SET EQUAL TO THE NUMBER OF CLASSES
% FOR EXAMPLE, if there are 7 classes in all, there will be 7 output
% neurons; if neuron 5 has the highest output, the input belongs to the 5-th class
%
% Sample1 regression: [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm('sinc_train', 'sinc_test', 0, 20, 'sig')
% Sample2 classification: elm('diabetes_train', 'diabetes_test', 1, 20, 'sig')
%%%%    Authors:    MR QIN-YU ZHU AND DR GUANG-BIN HUANG
%%%%    NANYANG TECHNOLOGICAL UNIVERSITY, SINGAPORE
%%%%    EMAIL:      EGBHUANG@NTU.EDU.SG; GBHUANG@IEEE.ORG
%%%%    WEBSITE:    http://www.ntu.edu.sg/eee/icis/cv/egbhuang.htm
%%%%    DATE:       APRIL 2004

%%%%%%%%%%% Macro definition
REGRESSION=0;
CLASSIFIER=1;

%%%%%%%%%%% Load training dataset
%train_data=load(TrainingData_File);
T=train_data(:,1)';
P=train_data(:,2:size(train_data,2))';
clear train_data;                                   %   Release raw training data array

%%%%%%%%%%% Load testing dataset
%test_data=load(TestingData_File);
TV.T=test_data(:,1)';
TV.P=test_data(:,2:size(test_data,2))';
clear test_data;                                    %   Release raw testing data array

NumberofTrainingData=size(P,2);
NumberofTestingData=size(TV.P,2);
NumberofInputNeurons=size(P,1);

if Elm_Type~=REGRESSION
    %%%%%%%%%%%% Preprocessing the data of classification
    sorted_target=sort(cat(2,T,TV.T),2);
    label=zeros(1,1);                               %   Find and save in 'label' class label from training and testing data sets
    label(1,1)=sorted_target(1,1);
    j=1;
    for i = 2:(NumberofTrainingData+NumberofTestingData)
        if sorted_target(1,i) ~= label(1,j)
            j=j+1;
            label(1,j) = sorted_target(1,i);
        end
    end
    number_class=j;
    NumberofOutputNeurons=number_class;

    %%%%%%%%%% Processing the targets of training
    temp_T=zeros(NumberofOutputNeurons, NumberofTrainingData);
    for i = 1:NumberofTrainingData
        for j = 1:number_class
            if label(1,j) == T(1,i)
                break;
            end
        end
        temp_T(j,i)=1;
    end
    T=temp_T*2-1;

    %%%%%%%%%% Processing the targets of testing
    temp_TV_T=zeros(NumberofOutputNeurons, NumberofTestingData);
    for i = 1:NumberofTestingData
        for j = 1:number_class
            if label(1,j) == TV.T(1,i)
                break;
            end
        end
        temp_TV_T(j,i)=1;
    end
    TV.T=temp_TV_T*2-1;
end                                                 %   end if of Elm_Type

%%%%%%%%%%% Calculate weights & biases
start_time_train=cputime;

%%%%%%%%%%% Random generate input weights InputWeight (w_i) and biases BiasofHiddenNeurons (b_i) of hidden neurons
InputWeight=rand(NumberofHiddenNeurons,NumberofInputNeurons)*2-1;
BiasofHiddenNeurons=rand(NumberofHiddenNeurons,1);
tempH=InputWeight*P;
clear P; % Release input of training data
ind=ones(1,NumberofTrainingData);
BiasMatrix=BiasofHiddenNeurons(:,ind);              %   Extend the bias matrix BiasofHiddenNeurons to match the dimension of H
tempH=tempH+BiasMatrix;

%%%%%%%%%%% Calculate hidden neuron output matrix H
switch lower(ActivationFunction)
    case {'sig','sigmoid'}
        %%%%%%%% Sigmoid
        H = 1 ./ (1 + exp(-tempH));
    case {'sin','sine'}
        %%%%%%%% Sine
        H = sin(tempH);
    case {'hardlim'}
        %%%%%%%% Hard Limit
        H = double(hardlim(tempH));
    case {'tribas'}
        %%%%%%%% Triangular basis function
        H = tribas(tempH);
    case {'radbas'}
        %%%%%%%% Radial basis function
        H = radbas(tempH);
        %%%%%%%% More activation functions can be added here
end
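% Note: at this point H is a NumberofHiddenNeurons x NumberofTrainingData matrix;
% entry (i,j) is the activation g(w_i * x_j + b_i) of the i-th hidden neuron on
% the j-th training sample.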
clear tempH;                                        %   Release the temporary array for calculation of hidden neuron output matrix H

%%%%%%%%%%% Calculate output weights OutputWeight (beta_i)
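% With the hidden-layer parameters fixed, training reduces to the linear system
% H' * OutputWeight = T'; the Moore-Penrose pseudoinverse below returns the
% minimum-norm least-squares solution for OutputWeight (beta).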
OutputWeight=pinv(H') * T'; % slower implementation
%OutputWeight=inv(eye(size(H,1))/C+H * H') * H * T'; % faster method 1
%implementation; one can set regularization factor C properly in classification applications
%OutputWeight=(eye(size(H,1))/C+H * H') \ H * T'; % faster method 2
%implementation; one can set regularization factor C properly in classification applications

%If you use faster methods or kernel method, PLEASE CITE in your paper properly:
%Guang-Bin Huang, Hongming Zhou, Xiaojian Ding, and Rui Zhang, "Extreme Learning Machine for Regression and Multi-Class Classification," submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence, October 2010.

end_time_train=cputime;
TrainingTime=end_time_train-start_time_train        %   Calculate CPU time (seconds) spent for training ELM

%%%%%%%%%%% Calculate the training accuracy
Y=(H' * OutputWeight)'; % Y: the actual output of the training data
if Elm_Type == REGRESSION
    TrainingAccuracy=sqrt(mse(T - Y))               %   Calculate training accuracy (RMSE) for regression case
end
clear H;

%%%%%%%%%%% Calculate the output of testing input
start_time_test=cputime;
tempH_test=InputWeight*TV.P;
clear TV.P; % Release input of testing data
ind=ones(1,NumberofTestingData);
BiasMatrix=BiasofHiddenNeurons(:,ind);              %   Extend the bias matrix BiasofHiddenNeurons to match the dimension of H
tempH_test=tempH_test + BiasMatrix;
switch lower(ActivationFunction)
    case {'sig','sigmoid'}
        %%%%%%%% Sigmoid
        H_test = 1 ./ (1 + exp(-tempH_test));
    case {'sin','sine'}
        %%%%%%%% Sine
        H_test = sin(tempH_test);
    case {'hardlim'}
        %%%%%%%% Hard Limit
        H_test = hardlim(tempH_test);
    case {'tribas'}
        %%%%%%%% Triangular basis function
        H_test = tribas(tempH_test);
    case {'radbas'}
        %%%%%%%% Radial basis function
        H_test = radbas(tempH_test);
        %%%%%%%% More activation functions can be added here
end
TY=(H_test' * OutputWeight)'; % TY: the actual output of the testing data
end_time_test=cputime;
TestingTime=end_time_test-start_time_test           %   Calculate CPU time (seconds) spent by ELM predicting the whole testing data

if Elm_Type == REGRESSION
    TestingAccuracy=sqrt(mse(TV.T - TY))            %   Calculate testing accuracy (RMSE) for regression case
end

if Elm_Type == CLASSIFIER
    %%%%%%%%%% Calculate training & testing classification accuracy
    MissClassificationRate_Training=0;
    MissClassificationRate_Testing=0;

    for i = 1 : size(T, 2)
        [x, label_index_expected]=max(T(:,i));
        [x, label_index_actual]=max(Y(:,i));
        if label_index_actual~=label_index_expected
            MissClassificationRate_Training=MissClassificationRate_Training+1;
        end
    end
    TrainingAccuracy=1-MissClassificationRate_Training/size(T,2)

    for i = 1 : size(TV.T, 2)
        [x, label_index_expected]=max(TV.T(:,i));
        [x, label_index_actual]=max(TY(:,i));
        if label_index_actual~=label_index_expected
            MissClassificationRate_Testing=MissClassificationRate_Testing+1;
        end
    end
    TestingAccuracy=1-MissClassificationRate_Testing/size(TV.T,2)
end
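Note that this version of elm() receives the training and testing data as matrices (the load calls are commented out), so the file-name samples in the header comment need one extra step. A minimal usage sketch, assuming 'sinc_train' and 'sinc_test' are ASCII files in the layout the code expects (first column is the target, remaining columns are the inputs):

train_data = load('sinc_train');
test_data  = load('sinc_test');
[TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = ...
    elm(train_data, test_data, 0, 20, 'sig');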
This concludes the article on the ELM (Extreme Learning Machine) source code. We hope it is helpful to programmers!