objects in the foreground that are smaller than the filter size (a parameter of the erosion function), while dilation helps to more completely recover the boundaries of detected objects. Because erosion can distort object boundaries, dilation is applied after erosion as a corrective operation.
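As a rough illustration of this erode-then-dilate step (a minimal sketch assuming Python with OpenCV, not the LDspectral implementation itself; the kernel size is a hypothetical stand-in for the filter-size parameter):

import cv2

def filter_foreground(mask, kernel_size=5):
    # mask: 8-bit single-channel foreground image (nonzero = foreground).
    # kernel_size: illustrative stand-in for the erosion filter-size parameter.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (kernel_size, kernel_size))
    eroded = cv2.erode(mask, kernel)    # removes foreground objects smaller than the kernel
    return cv2.dilate(eroded, kernel)   # restores object boundaries distorted by erosion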
The foreground binarization actor takes the output of the foreground filter and converts it into binary form, where each pixel is classified as either foreground or background. The binary conversion is performed by applying a threshold and classifying a pixel as foreground whenever its value exceeds the threshold. The specific threshold that is employed is determined empirically (off-line) to enhance classification accuracy.
The resulting binary image is then processed by the foreground output actor, which stores the classification results for each image as a separate file in a given output directory. The files generated in the output directory are indexed so that they can easily be matched with their corresponding input frames from the given multispectral data set.
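These thresholding and indexed-output steps could be realized roughly as follows (a minimal Python/OpenCV sketch; the threshold value, file naming scheme, and directory are placeholders, not details from LDspectral):

import os
import cv2

def binarize_and_store(filtered_frames, out_dir, threshold=50):
    # filtered_frames: iterable of 8-bit single-channel foreground images.
    # threshold: determined empirically off-line (placeholder value here).
    os.makedirs(out_dir, exist_ok=True)
    for idx, frame in enumerate(filtered_frames):
        # Pixels above the threshold become foreground (255); the rest become background (0).
        _, binary = cv2.threshold(frame, threshold, 255, cv2.THRESH_BINARY)
        # Indexed file names allow each result to be matched with its input frame.
        cv2.imwrite(os.path.join(out_dir, "foreground_%04d.png" % idx), binary)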
We performed experiments applying LDspectral with the background subtraction subsystem shown in Figure 3. The experiments were run on a laptop computer equipped with an AMD A8-4500M CPU, 4 GB of RAM, and the Ubuntu 14.04 LTS operating system. Results from these experiments are discussed in the following section.

EXPERIMENTS
In our experiments involving BSS in conjunction with background
subtraction, we applied the novel data set for multispectral background subtraction that was published recently by Benezeth et al.
[1]. From this data set, we experimented with multispectral video input that contains 1102 images, where each image contains
separate components in 7 different spectral bands. Among these
7 bands, 6 are in the visible spectrum and the remaining one is in
the near-infrared spectrum. We divided this set of images into 735
images (approximately 2/3) for training and 367 images for testing.
Here, the training phase is applied to optimize the performance
of each two-band subset. Given a band subset {bs1, bs2}, training is
used to optimize the relative weightings for these bands when they
are fused in the image combination actor described previously.
More specifically, suppose that x1 and x2 are two corresponding
pixel values [pixel values at the same image coordinates (a, b)]
in bands bs1 and bs2, respectively, and let y denote the pixel value
at coordinates (a, b) in the output of the image combination actor.
Then y is derived by
y = α × x1 + (1 − α) × x2,   (1)

where α (0 ≤ α ≤ 1) is a parameter of the image combination actor that is used to control the relative weightings of the two input
bands. We refer to this parameter α as the pairwise band combination (PBC) parameter.
Based on this formulation of pixel-level fusion for a two-band
subset, our training phase is used to optimize the image combination parameter α. This training process is carried out for each distinct pair {bs1, bs2} of bands to yield a corresponding PBC parameter value A(s1, s2) that controls the relative weighting of pixels
when combining bands bs1 and bs2.
For each distinct pair {bs1, bs2} of bands, the training phase
involves performing an exhaustive search over α ∈ {0, 0.1, 0.2, ..., 1}, and then selecting the PBC value (with ties
broken arbitrarily) that leads to the highest average accuracy for
the background subtraction subsystem of Figure 3. This selected value is then used in the testing phase to assess the accuracy
produced by using the band subset {bs1, bs2} for background
subtraction.
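A sketch of this exhaustive search is given below; run_background_subtraction and average_f_measure are hypothetical stand-ins for the background subtraction subsystem of Figure 3 and for the accuracy evaluation, not functions from LDspectral:

def train_pbc(band1_frames, band2_frames, ground_truth,
              run_background_subtraction, average_f_measure):
    # Exhaustive search for the PBC parameter of one two-band subset:
    # try alpha in {0, 0.1, ..., 1.0} and keep the value with the highest
    # average accuracy (ties broken by the first value encountered).
    best_alpha, best_score = 0.0, -1.0
    for step in range(11):
        alpha = step / 10.0
        masks = run_background_subtraction(band1_frames, band2_frames, alpha)
        score = average_f_measure(masks, ground_truth)
        if score > best_score:
            best_alpha, best_score = alpha, score
    return best_alpha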
The measure of accuracy employed in these experiments is the harmonic mean of precision and recall, whose use for background subtraction is motivated, for example, in [1]. This harmonic mean is defined as
Fmeasure = 2 × (recall × precision) / (recall + precision).   (2)


Here, precision and recall are defined by
precision = nc / nf,  and  recall = nc / ng,   (3)


where nc is the number of correctly classified foreground pixels,
nf is the number of pixels classified as foreground, and ng is the
number of foreground pixels in the ground truth.
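As a small illustration of how (2) and (3) are evaluated in practice (a sketch assuming NumPy boolean masks for the predicted and ground-truth foreground, not code from LDspectral):

import numpy as np

def f_measure(predicted, ground_truth):
    # predicted, ground_truth: boolean arrays where True marks foreground pixels.
    n_c = np.logical_and(predicted, ground_truth).sum()  # correctly classified foreground pixels
    n_f = predicted.sum()                                # pixels classified as foreground
    n_g = ground_truth.sum()                             # foreground pixels in the ground truth
    if n_f == 0 or n_g == 0:
        return 0.0
    precision = n_c / n_f
    recall = n_c / n_g
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)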
Table 1 and Table 2 show experimental results using the offline analysis capabilities of LDspectral to evaluate processing
trade-offs among different one- and two-band combinations (i.e.,
where the set of selected bands is restricted to contain only one or
two elements). Table 1 shows the background subtraction accuracy
that is experimentally observed for different one- and two-band
combinations, while Table 2 shows the processing times for different combinations. In each of these tables, the diagonal entries
give the results for single-band processing, while each entry at row
a and column b when a ≠ b gives the results from joint processing
of the bands indexed by a and b. In each of these tables, elements
below the diagonal are not shown since they are symmetric with
respect to the diagonal. As mentioned above, we employ 1102 images in each of these experiments; these images form the complete set from the multispectral data set [1] for which ground truth is provided.
From Table 1, we see that the accuracy provided by LDspectral is significantly higher on average compared with the results
presented in [1] for the same video data set. This demonstrates the
effectiveness of LDspectral in optimizing the accuracy of background subtraction.
Experimentally derived data of the form shown in Table 1 and
Table 2 can be used as the subset selection profiles to guide BSS,
as illustrated in Figure 2. Additionally, the processing times in Table 2 place lower bounds on how short the reconfiguration interval Tr can be.
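To illustrate one plausible way such profiles could drive the selection (an assumption about usage, not a procedure specified here), the following sketch picks the most accurate band subset whose profiled per-frame processing time fits within a given budget derived from Tr; the dictionary-based profile format is hypothetical:

def select_band_subset(accuracy_profile, time_profile, time_budget):
    # accuracy_profile, time_profile: dicts mapping a band subset
    # (e.g., frozenset({1, 4})) to its profiled accuracy and per-frame time.
    feasible = [s for s in accuracy_profile if time_profile[s] <= time_budget]
    if not feasible:
        return None
    return max(feasible, key=lambda s: accuracy_profile[s])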
Table 3 shows the optimized values for the PBC parameters
that were derived through the training procedure for processing of
two-band subsets. The rows and columns of Table 3 correspond,
respectively, to x1 and x2 in (1). For example, when S consists of
bands 2 and 3, we use α = 0.3, and when S consists of bands 1 and
4, we use α = 0.6. The diversity of the values in this table demon-
