IJTSRD | May-Jun 2017
Available Online @www.ijtsrd.com
International Journal of Trend in Scientific Research and Development, Volume 1(4), ISSN: 2456-6470
higher dimensional space has been used. It can be shown that the solution for w is given by:

w = Σ_i α_i y_i x_i   (15)

Therefore, w·φ(x_t) + b can be rewritten thus:

Σ_i α_i y_i φ(x_i)·φ(x_t) + b = Σ_i α_i y_i K(x_i, x_t) + b   (16)

Thus, the kernel function can be used rather than actually making the transformation to the higher dimensional space, since the data appear only in dot-product form. The γ and C are free hyper-parameters that the user can supply. These two parameters play a great role in the performance of the SVM. Several methods have been used to select optimal values of these parameters, including cross-validation, particle swarm optimization and genetic optimization. In this paper, however, the genetic optimization approach is used.

When crossover takes place, the crossover operator exchanges parts of two single chromosomes to produce children, and the mutation operator changes the gene value at some randomly chosen location of the chromosome [10]. The process of evaluation, selection, and recombination is iterated until the population converges to an acceptable solution. Genetic Algorithms (GA) have been applied in various areas of computer vision, such as weight optimization of artificial neural networks [2; 10; 14; 15] and video segmentation [4].

3 SVM parameter selection based on Genetic Algorithm

Using the Genetic Algorithm, the two parameters C and γ of the SVM model can be optimized. The process of optimizing these parameters is shown in Figure 2.

[Figure 2: Process of optimizing the SVM parameters (Initialization, Train SVM, …)]
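Because the data enter only through dot products, the decision function of Eq. (16) can be evaluated directly with the kernel, never forming the high-dimensional mapping. A minimal sketch with hypothetical support vectors and multipliers (not values from this paper):

```python
import math

def rbf_kernel(x, z, gamma):
    # K(x, z) = exp(-gamma * ||x - z||^2): the feature-space dot product
    # phi(x).phi(z), computed without ever forming phi explicitly
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def decision(x_t, X, y, alpha, b, gamma):
    # Eq. (16): f(x_t) = sum_i alpha_i * y_i * K(x_i, x_t) + b
    return sum(a_i * y_i * rbf_kernel(x_i, x_t, gamma)
               for a_i, y_i, x_i in zip(alpha, y, X)) + b

# toy example: one support vector per class
X, y, alpha = [[0.0], [1.0]], [1, -1], [1.0, 1.0]
print(decision([0.0], X, y, alpha, 0.0, 1.0) > 0)   # positive side of the hyper-plane
```

The sign of f(x_t) then assigns the test point to one of the two classes.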
In this research work, the C and γ values of the RBF are simultaneously coded, each as one gene. Each gene is coded using randomized binary numbers of length 5. The minimum value is 0.1 while the maximum value is 100. The number of bits v used to represent the value of a gene must satisfy:

2^v ≥ (max_i − min_i)/ε + 1,   i = 1, 2, 3, …   (17)

where v is the string length, min_i is the minimum value of the gene, max_i is the maximum value of the gene and ε is the error tolerance. A chromosome represents one RBF. The length of the chromosome is obtained by concatenating the bits representing each gene, as shown in Figure 3.

[Figure 3: Sample chromosome encoding of C and γ of the RBF using strings]

d. Perform Crossover. In this research work, one-point crossing is adopted. The specific operation is to randomly set one crossing point among individual strings. When crossing is executed, the partial configurations before and after the point are exchanged, giving birth to two new offspring.
e. Mutation. For two-value code strings, the mutation operation reverses gene values according to a random number generated between zero and one.
f. Termination Criterion. The termination condition is the maximum number of generations.

Genetic control parameters dictate how the algorithm will behave, and changing these parameters can change the computational result. These parameters are the population size, crossing probability, mutation probability and termination condition. In this work the population size N is 50, the crossing probability Pc is 0.8, the mutation probability Pm is 0.5, and the terminative condition is a MAXGEN of 100.
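The encoding and the two operators above can be sketched as follows. The sample bit strings are hypothetical; note that with min 0.1, max 100 and 5 bits, Eq. (17) implies an error tolerance of roughly 3.2:

```python
import math
import random

GENE_MIN, GENE_MAX, BITS = 0.1, 100.0, 5   # ranges and string length from the text

def required_bits(gmin, gmax, eps):
    # Eq. (17): smallest v with 2**v >= (max - min) / eps + 1
    return math.ceil(math.log2((gmax - gmin) / eps + 1))

def decode(bits, gmin=GENE_MIN, gmax=GENE_MAX):
    # map a binary gene back to a real value in [gmin, gmax]
    return gmin + int(bits, 2) * (gmax - gmin) / (2 ** len(bits) - 1)

def one_point_crossover(p1, p2):
    # step (d): cut both strings at one random point and swap the tails
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def mutate(chrom, pm):
    # step (e): flip a bit whenever a uniform draw falls below pm
    return ''.join(b if random.random() >= pm else '10'[int(b)] for b in chrom)

# a chromosome concatenates the C gene and the gamma gene (Figure 3)
chrom = '00010' + '00110'                  # hypothetical example
C, gamma = decode(chrom[:BITS]), decode(chrom[BITS:])
```

Decoding this hypothetical chromosome gives C ≈ 6.5 and γ ≈ 19.4, values of the same order as those reported in Section 6.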
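Putting the pieces together, the GA search over (C, γ) can be sketched as a generational loop. The fitness callback is an assumption: a common choice in GA-SVM tuning is the (cross-validated) accuracy of an SVM trained with the decoded parameters. The control parameters N = 50, Pc = 0.8 and MAXGEN = 100 follow the text; the mutation rate here is a much smaller 0.05, an assumption made so the sketch converges:

```python
import random

N, PC, PM, MAXGEN, BITS = 50, 0.8, 0.05, 100, 5

def decode(bits, lo=0.1, hi=100.0):
    # binary gene -> real value in [lo, hi]
    return lo + int(bits, 2) * (hi - lo) / (2 ** len(bits) - 1)

def params(chrom):
    # first gene encodes C, second encodes gamma (Figure 3)
    return decode(chrom[:BITS]), decode(chrom[BITS:])

def run_ga(score):
    # score(C, gamma) -> fitness, e.g. SVM validation accuracy (assumed)
    pop = [''.join(random.choice('01') for _ in range(2 * BITS))
           for _ in range(N)]
    for _ in range(MAXGEN):
        graded = sorted(pop, key=lambda c: score(*params(c)), reverse=True)
        nxt = graded[:2]                               # elitism: keep the best two
        while len(nxt) < N:
            p1, p2 = random.sample(graded[:N // 2], 2)  # pick parents from the fitter half
            if random.random() < PC:                    # one-point crossover
                cut = random.randint(1, 2 * BITS - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            nxt += [''.join(b if random.random() >= PM else '10'[int(b)]
                            for b in c) for c in (p1, p2)]
        pop = nxt[:N]
    return params(max(pop, key=lambda c: score(*params(c))))
```

With a real fitness, score would train the SVM on the training set for each candidate (C, γ) and return its validation accuracy.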
4.2 GA-SVM Object Classification

In this work, the training of the SVM consists of providing the SVM with data for vehicles and pedestrians. Data for each class consist of a set of n-dimensional vectors. A Radial Basis Function (RBF) kernel is applied to the SVM, which then attempts to construct a hyper-plane in the n-dimensional space that maximizes the margin between the two input classes. The SVM type used in this work is C-SVM with a non-linear classifier, where C is the cost hyper-parameter [2]. The SVM is trained using 1D radial signals extracted from the silhouettes of the labeled images of humans and vehicles. Given a training set consisting of multiple labeled images of the human and vehicle classes, an SVM classifier is trained as follows: 1D feature data are extracted from the training set images to create the vectors X = (x1, x2, ..., xL) for human images and Y = (y1, y2, ..., yK) for vehicle images, where L and K are the total numbers of training images for the human and vehicle classes respectively. To train the SVM, X = (x1, x2, ..., xL) is used as the positively labeled training data and Y = (y1, y2, ..., yK) as the negatively labeled training data. The SVM is then trained to maximize the hyper-plane margin between the respective classes (ω1, ω2). To classify an object F, it must be assigned to one of the p possible classes (ω1, ω2); in this case p takes two values. For each object class, an SVM is used to determine the class that the object F belongs to given the measurement vector D, where in this case D is the set of 1D radial features.

5 Implementation of the Proposed Model

In order to evaluate the classification performance of the proposed SVM model, some video data were obtained from major roads in Akure, Nigeria. The background subtraction algorithm was applied to detect objects from the videos. Features of these objects are extracted and stored in a database. The data are divided randomly into an 80-item training dataset and a 333-item test dataset. The training dataset consists of 40 vehicles and 40 pedestrians. The test dataset consists of 110 vehicles and 223 pedestrians. The class vehicle is assigned 1 while the class human is assigned 0. The GA-SVM model is trained using [3]. In order to compare the classification performance of the proposed model with other classifiers such as the normal SVM, K-Means, K-NN, ANN and GA-ANN, the training is done with the same set of data, and the same testing dataset is used to evaluate each classifier.

In the proposed GA-SVM model, the SVM adopts the Radial Basis Function as the kernel. The parameters C and γ of the SVM are optimized using the Genetic Algorithm, and these optimal parameters are then used to train the SVM model. In the normal SVM model, the SVM also uses the RBF kernel, but the parameters C and γ are randomly selected.

In the K-Nearest Neighbour training, data for vehicles and pedestrians are provided to the K-NN: many instances of vehicles and pedestrians with their corresponding class labels. After the training is done, K-NN can then be used to classify test instances. To classify a new instance, the instance is supplied along with the number of neighbours k, which defines the neighbourhood in which the training data are consulted. The test item is compared with the training data, the k nearest instances are checked, and the majority class among them dictates the class the instance belongs to. In this research, k is set to 5.

For classification using the K-Means algorithm, given a training set of human and vehicle classes, a K-Means clustering algorithm is trained using the vectors X = (x1, x2, ..., xL) for human and Y = (y1, y2, ..., yL) for vehicle data, where L is the total number of training images. To train the K-Means, the number of classes and the mean of each class are provided. K-Means then assigns each data item to one of the classes based on the Euclidean distance of the data to the mean of each cluster; the minimum distance determines which cluster the data falls into. After all the instances are clustered, the means are re-evaluated and the process continues until a given terminating condition is fulfilled. To cluster a new data item, the distances of the new item to each of the evaluated cluster means are calculated, and the mean with the minimum distance gives the class of the new item.

Object classification using the ANN uses 32 input nodes, a hidden layer with 32 neurons and an output layer with one neuron. In this approach, shape features extracted from the detected moving objects are used to train a neural network classifier to recognize the human and vehicle classes. The learning algorithm used is the back-propagation algorithm. The dimension of the input data is thirty-two.
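The K-NN rule used for comparison (majority vote among the k = 5 nearest training items) can be sketched as below; the feature vectors are hypothetical stand-ins, not the paper's silhouette features:

```python
import math
from collections import Counter

def knn_classify(query, train_X, train_y, k=5):
    # Euclidean distance from the test item to every training item
    dists = sorted((math.dist(query, x), label)
                   for x, label in zip(train_X, train_y))
    # the majority class among the k nearest instances wins
    return Counter(label for _, label in dists[:k]).most_common(1)[0][0]

# toy data: class 1 = vehicle, class 0 = human (labels as in Section 5)
X = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 8), (8, 9)]
y = [0, 0, 0, 1, 1, 1]
print(knn_classify((0.5, 0.5), X, y))   # -> 0
```

With k = 5 the query near the origin sees three class-0 and two class-1 neighbours, so the majority vote returns class 0.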
The network has one hidden layer with thirty-two neurons and one output neuron. The network is trained using back-propagation, a supervised learning algorithm in which the input vectors are supplied together with the desired output. The back-propagation network (BPN) learns during the training epochs; several epochs are needed before the network can sufficiently learn and provide a satisfactory result. The number of epochs used is 500, with a momentum of 0.5 and a learning rate of 0.3. The initial weights are randomly initialized to small random numbers less than 1 using random number generators.

Object classification using the neuro-genetic model (GA-ANN) applies the genetic algorithm to optimize the weights before the ANN is trained with the optimized weights. In this case there are (32*32+32) weights, which constitute a chromosome. The population size N is 50, the crossing probability Pc is 0.8, the mutation probability Pm is 0.015 and the terminative condition is a MAXGEN of 100. This combination gives an excellent empirical performance.

6 Results and discussion

Table 1 shows the analysis results for GA-SVM, SVM, GA-ANN, K-MEANS, ANN and K-NN. For the GA-SVM, 80 data items are used to train the model. The support vector values from the training are then saved. The gamma parameter obtained is 19.28 and the C value is 6.49. For testing, 333 data items consisting of 110 vehicle data and 223 pedestrian data are used. Out of the 110 vehicles, all are classified correctly, while of the 223 human class data items only 2 are misclassified. The error in classification is 0.61% and the accuracy is 99.39%. For the normal SVM, the error in classification is 0.9% and the accuracy is 99.1%. For GA-ANN, the error in classification is 0.9% and the accuracy is 99.1%. For K-MEANS, the error in classification is 1.2% and the accuracy is 98.8%. For ANN, the error in classification is 1.5% and the accuracy is 98.5%, and for K-NN, the error in classification is 4.8% and the accuracy is 95.2%. Figure 6 shows the classification accuracy. From the figure, it is shown that the proposed model has the highest classification accuracy.

Table 1: Analysis results for the classifiers

S/N  PARAMETER                    GA-SVM  SVM    GA-ANN  K-MEANS  ANN    K-NN
1    Number of observed features  413     413    413     413      413    413
2    Number of training set       80      80     80      80       80     80
3    Number of classes            2       2      2       2        2      2
4    Dimension of input features  32      32     32      32       32     32
5    Number of test set           333     333    333     333      333    333
6    Total recognized             331     330    330     329      328    317
7    Classification accuracy      99.39%  99.1%  99.1%   98.8%    98.5%  95.2%

[Figure 6: Classification accuracy of GA-SVM, SVM, GA-ANN, K-MEANS, ANN and K-NN]
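As a quick arithmetic check on Table 1, each accuracy in row 7 follows from the "total recognized" count in row 6 over the 333 test items (the reported 99.39%/0.61% pair for GA-SVM differs from the computed 99.40%/0.60% only by rounding):

```python
# recognized counts per classifier, taken from row 6 of Table 1
recognized = {'GA-SVM': 331, 'SVM': 330, 'GA-ANN': 330,
              'K-MEANS': 329, 'ANN': 328, 'K-NN': 317}
TEST_ITEMS = 333

for name, n in recognized.items():
    print(f'{name}: {100 * n / TEST_ITEMS:.2f}% accuracy, '
          f'{100 * (TEST_ITEMS - n) / TEST_ITEMS:.2f}% error')
```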
[2] Akintola K.G. and Olabode O. (2014). A Neuro-Genetic Model for Moving Object Classification in Video Surveillance Systems. International Journal of Research in Computer Applications and Robotics, Vol. 2, No. 8, pp. 143-151, July 2014.

[3] Chang C.C. and Lin C.J. (2001). LIBSVM: A Library for Support Vector Machines. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.

[4] Chiu P., Girgensohn A., Polak W., Rieffel E. and Wilcox L. (2000). A Genetic Algorithm for Video Segmentation and Summarization. Proceedings of the IEEE International Conference on Multimedia and Expo, Vol. 3, pp. 1329-1332, New York, 30th July-2nd August 2000.

[10] Montana D.J. and Davis L.D. (1989). Training Feed-Forward Networks Using Genetic Algorithms. Proceedings of the International Joint Conference on Artificial Intelligence, IJCAI-89, Morgan Kaufmann Publishers, Detroit, MI, pp. 762-767.

[11] Michel P. and El-Kaliouby R. (2003). Real Time Facial Expression Recognition in Video Using Support Vector Machines. In Proceedings of ICMI, Nov. 5-7, 2003, Vancouver, British Columbia, Canada.

[12] Ramya G. and Srclatha M. (2014). Multiple Object Tracking Using Support Vector Machine. Global Journal of Researches in Engineering, Vol. 14, Issue 6, Version 1.