Aerospace and Electronic Systems Magazine April 2017 - 34
Supervised Learning Algorithms for Spacecraft Attitude Determination and Control System Health Monitoring
The shortest training time was obtained for the polynomial kernel function (7.4 s during SVM training), and the smallest number of SVs was obtained for both the polynomial and Gaussian kernel functions (10 SVs during SVM training). Figures 7 and 8 show samples of contour plots for the nonlinear SVM classifier using the Gaussian kernel.
Table 2 presents the testing performance of the best classifiers obtained during the training process using (Set-1) and (Set-2). The testing performance is reported in terms of the testing efficiency obtained with (Set-T).
The results in Table 2 show that when the training set is reduced from (Set-1) to (Set-2), the testing efficiency is also reduced: for (Set-1) the maximum testing efficiency is 96.98%, while for (Set-2) it is 94.29%. This demonstrates the ability of the proposed SVM classifier to learn from a small number of data patterns and highlights the computational efficiency of the proposed approach.
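The training and testing procedure described above can be sketched as follows. This is a minimal illustration using scikit-learn, not the article's implementation; the telemetry sets (Set-1), (Set-2), and (Set-T) are not available here, so a synthetic two-class data set stands in for the nominal/faulty patterns, and the parameter values (C, γ, n) are taken from the text.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the nominal/faulty telemetry patterns.
X, y = make_classification(n_samples=1000, n_features=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

best = None
for C in (500, 1000):  # high error-penalty values, as in the text
    for kernel, params in (("rbf", {"gamma": 0.1}),    # Gaussian, gamma = 0.1
                           ("poly", {"degree": 10})):  # polynomial, n = 10
        clf = SVC(C=C, kernel=kernel, **params).fit(X_train, y_train)
        eff = clf.score(X_test, y_test)        # testing efficiency
        n_sv = clf.support_vectors_.shape[0]   # number of support vectors
        if best is None or eff > best[0]:
            best = (eff, kernel, C, n_sv)

print(f"best: kernel={best[1]}, C={best[2]}, "
      f"efficiency={best[0]:.4f}, SVs={best[3]}")
```

On the real telemetry, the same loop over C and the kernel parameters would reproduce the sweep summarized in Figure 6 and Table 2.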
Figure 6. Best performance obtained during training of the SVMs with (Set-1) for nominal and faulty data, for different values of the kernel parameters of the Gaussian and polynomial kernels.

From the results illustrated in Figure 6, the following observations are worth noting.
1. Effect of the error penalty C
For the kernel functions under investigation, the best performance is obtained at high values of C = 500 and 1000; as C increases, the training efficiency improves. The maximum training efficiency is 96.98% at C = 1000.
2. Effect of the kernel parameters
For the polynomial kernel, as n increases, both the number of SVs and the training time decrease, while the training efficiency increases. The best performance for the polynomial kernel function is 96.13% for C = 1000, n = 10. For the Gaussian kernel, as γ decreases, both the number of SVs and the training efficiency increase, while the training time decreases. The best performance for the Gaussian kernel function is 96.98% for C = 1000, γ = 0.1.
3. Effect of the type of the kernel function
The best training efficiency was obtained with the Gaussian kernel function (96.98% during SVM training).

PLS-DA ALGORITHM AND SIMCA-P SOFTWARE ANALYSIS
Models are built using the new PLS-DA algorithm and compared with the SIMCA-P software developed by Umetrics. PLS-DA attempts to locate the directions that maximize discrimination and separation between groups. The key steps in building PLS-DA models are data preparation, scaling, and selecting the number of PCs. First, the data must be in tab-delimited text format with no headings or extra characters before being imported into MATLAB. If portions of the data are missing, a special function within the algorithm should be used to ensure the algorithm functions properly. Once the data matrix is loaded, all subsequent calculations are performed within the algorithm.
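The data-preparation step can be illustrated as follows. This sketch is in Python rather than MATLAB, and the inline data stands in for a real telemetry file: read a tab-delimited matrix with no headers, fill missing entries, and apply the usual mean-centering and unit-variance scaling before the matrix is handed to the algorithm.

```python
import io
import pandas as pd

# Stand-in for a tab-delimited telemetry file; the second row has a
# missing entry (empty field between two tabs).
raw = "1.0\t2.0\t0.5\n0.9\t\t0.4\n1.1\t2.1\t0.6\n"
df = pd.read_csv(io.StringIO(raw), sep="\t", header=None)

# Replace missing values with the column mean, then mean-center and
# scale each variable to unit variance (standard PLS pre-treatment).
df = df.fillna(df.mean())
X = (df - df.mean()) / df.std(ddof=1)
print(X.shape)  # data matrix ready for the PLS-DA algorithm
```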
The PLS-DA algorithm and the SIMCA-P software produce the score vectors shown in the score plot, which represent the principal components (new variables) ti, and the Pi, which represent the loading vectors (eigenvectors of the covariance matrix), or variable coefficients. The score plot is a summary of the relationships among the observations and shows how the observations are projected from the original space to the latent, low-dimensional space. The loading plot displays the relationships among the variables; the loadings are the weights that combine the original variables to form the scores.
Geometrically, they represent the direction of the projection plane in the K-space, and this direction indicates which variables are important and which are not. They also show how the important variables combine to separate the clusters of observations, or to define trends among observations over time. The score and loading plots are complementary and superimposable: an interesting pattern seen in the score plot can be interpreted by looking along the corresponding direction in the loading plot, and the variables lying in each quarter of the loading plot contribute to the changes in the observations in the score plot.
The first model is developed by applying our PLS-DA algorithm coded in MATLAB. The model is validated against spacecraft telemetry for the first time. The PLS-DA model is built using the training set (Set-1), which contains 810 observations (777 nominal and 33 faulty, represented by the symbols (Ni) and (Fi), respectively),