Aerospace and Electronic Systems Magazine March 2017 - 60

Autonomous Quadrotor for Accurate Positioning
To perform the landing, fuzzy logic rules were used to calculate velocity setpoints, and another ROS package, ar_track_alvar, was used to find markers on the ground and locate the landing point. This
procedure differs from commercially available UAVs, such as DJI
products, which do not natively allow offboard control and rely on proprietary control algorithms.
The vision algorithms demand substantial computational power. For
that task, an ODROID U3 platform was used, which is an embedded
system with ARM architecture based on a quad-core 1.7 GHz processor.
The ODROID runs Linux as its operating system, allowing
the installation of ROS and providing interoperability with the Pixhawk.
The use of the ARM architecture in the ODROID aims to reduce operational costs, mitigate heating problems on the board, and
improve power consumption [2]. Communication between
the Pixhawk and the ODROID is accomplished using the Mavlink protocol [3] over a UART (universal asynchronous receiver/transmitter) serial port. However,
as this is not the main goal of this paper, this manuscript aims
to explain the procedures to obtain a high-accuracy autonomous
flying robot using PX4FLOW.
There are three usual configurations for embedded control
systems [3]: Embedded computer with ROS connected to the controller board through a UART or USB connection. In this mode, the companion computer works as a Mavlink interface between the ground
station (GS) and the Pixhawk. Routines must run on the companion computer,
written in C++ or other standard high-level
languages, using serial communication protocols and Mavlink libraries. The route processing and all mission commands are sent
through the wireless communication link and translated into Mavlink messages by the companion computer. In this configuration, the
processing capacity of the onboard computer does not need to be
high; however, the data communication link needs sufficient
bandwidth if computer vision solutions are used.
Radio connection between the GS and the Pixhawk: In this
method, data flows through a radio link between the GS and the
controller, but ROS is not used. Thus, the low-level access
that ROS provides jointly with the Mavlink protocol must be built
from scratch for the systems to interact.
Companion computer + ROS + Wi-Fi link: This configuration
generally provides the most flexibility, as ROS simplifies
low-level access to the hardware involved, allowing
the use of several path-planning libraries and offboard control algorithms. The embedded processing system handles the image-processing tasks of the navigation system, reducing the required data
bandwidth. With less bandwidth needed, the communication
link can work at lower frequencies, increasing the operating distance.
None of the configurations described fully meets the project
requirements for the AUAV developed in this work. The aerial
vehicle was designed to operate in regions distant from the operating base, making communication via Wi-Fi impracticable.
Thus, a hybrid configuration was implemented, combining the advantages
of each of the classical architectures.
Primarily, the flight mode acts as the auto-mission mode with
setpoints defined in advance; the velocity, attitude, and altitude
can be predetermined using Mavlink messages during the pre-mission stage. When the aircraft begins the landing procedure, the
ODROID takes over mission control in offboard mode.
The route to the landing point is calculated by the vision algorithm, based on fuzzy-logic metrics responsible for tracking the
landing marker. Therefore, the AUAV can land in areas
different from the home position of the flight.
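The hand-off described above can be sketched as a simple mode-switching routine. This is an illustrative sketch, not the authors' code: the class name, the altitude threshold, and the trigger logic are hypothetical; the mode strings follow PX4's naming convention.

```python
# Hypothetical sketch of the hybrid scheme: AUTO.MISSION flies the
# pre-set waypoints; once landing starts, the companion computer (ODROID)
# switches the vehicle to OFFBOARD and streams vision-based setpoints.

AUTO_MISSION = "AUTO.MISSION"
OFFBOARD = "OFFBOARD"

class HybridMissionController:
    def __init__(self, landing_trigger_alt=5.0):
        # Altitude threshold (m, hypothetical) below which the companion
        # computer takes over with offboard velocity setpoints.
        self.landing_trigger_alt = landing_trigger_alt
        self.mode = AUTO_MISSION

    def update(self, altitude, landing_requested):
        # Remain in AUTO.MISSION until the landing procedure begins and
        # the aircraft is low enough for the vision-based controller.
        if landing_requested and altitude <= self.landing_trigger_alt:
            self.mode = OFFBOARD
        return self.mode

ctrl = HybridMissionController()
print(ctrl.update(altitude=30.0, landing_requested=False))  # AUTO.MISSION
print(ctrl.update(altitude=4.0, landing_requested=True))    # OFFBOARD
```

In a real system the mode change would be requested through MAVROS's set-mode service; here only the decision logic is shown.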
Data exchange between the Pixhawk and the companion
computer is managed by MAVROS. Communication between the Pixhawk and the GS was likewise supported by the Mavlink
protocol. The message-exchange functionalities provided by the MAVROS package were the basis for the
control and configuration algorithms developed for the AUAV
built in this work.

OPTICAL FLOW FOR DATA POSITION ENHANCEMENT
The OF technique is based on image analysis through integration
and differentiation over pixel-movement patterns between two input images [3]. It is important to note that image acquisition
and processing must keep computational costs low, since robotic applications such as this one have to provide precise, fast responses to robot movements.
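To illustrate the principle, the sketch below estimates the displacement of a small patch between two frames by sum-of-absolute-differences (SAD) block matching, one of the simplest optical-flow schemes. This is a toy example of ours, not the PX4FLOW firmware, which is far more optimized.

```python
# Toy SAD block matching: find the (row, col) shift that best aligns a
# patch of frame0 inside frame1. Frames are plain nested lists of ints.

def sad(a, b):
    # Sum of absolute differences between two equal-size patches.
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def patch(img, r, c, size):
    return [row[c:c + size] for row in img[r:r + size]]

def match_patch(frame0, frame1, r, c, size=3, search=2):
    """Return the (dr, dc) shift minimizing SAD within the search window."""
    ref = patch(frame0, r, c, size)
    best = None
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if rr < 0 or cc < 0 or rr + size > len(frame1) or cc + size > len(frame1[0]):
                continue
            cost = sad(ref, patch(frame1, rr, cc, size))
            if best is None or cost < best[0]:
                best = (cost, dr, dc)
    return best[1], best[2]

# Synthetic 8x8 frames: a bright 3x3 blob shifted one pixel down and right.
frame0 = [[0] * 8 for _ in range(8)]
frame1 = [[0] * 8 for _ in range(8)]
for i in range(3):
    for j in range(3):
        frame0[2 + i][2 + j] = 255
        frame1[3 + i][3 + j] = 255

print(match_patch(frame0, frame1, 2, 2))  # (1, 1): blob moved down-right
```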
To make it possible for the image-processing unit to
estimate system position, it must be programmed taking into
account the optical sensor's hardware characteristics, such as
the lens focal length and curvature, the field of view, and the sensor's
rotation angle relative to the aircraft. These and other aspects are
handled by the flight controller through the equation:


\[
\begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix}
=
\begin{bmatrix}
-\hat{f}/Z & 0 & u/Z \\
0 & -\hat{f}/Z & v/Z
\end{bmatrix}
\begin{bmatrix} v_x \\ v_y \\ v_z \end{bmatrix}
+
\begin{bmatrix}
uv/\hat{f} & -\left(\hat{f} + u^{2}/\hat{f}\right) & v \\
\hat{f} + v^{2}/\hat{f} & -uv/\hat{f} & -u
\end{bmatrix}
\begin{bmatrix} \omega_x \\ \omega_y \\ \omega_z \end{bmatrix}
\tag{1}
\]

where the pixel coordinates (u, v) are measured relative to the principal point
(u0, v0), the point where the optical axis
intersects the image plane; (u̇, v̇) is the resulting pixel velocity; f̂ is the camera focal length normalized by
the pixel size; Z is approximately the distance from the camera
to the observed point; and υ = (vx, vy, vz, ωx, ωy, ωz)^T is the velocity
vector comprising the translational and angular camera velocities.
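Equation (1) can be transcribed directly into code. The sketch below is an illustrative implementation of ours (function name and example values are not from the paper); it predicts the pixel velocity of a point given the camera motion.

```python
# Direct transcription of (1): (u, v) in pixels relative to the principal
# point, f_hat the pixel-normalized focal length, Z the scene depth, and
# (vx, vy, vz, wx, wy, wz) the translational and angular camera velocities.

def pixel_velocity(u, v, f_hat, Z, vx, vy, vz, wx, wy, wz):
    u_dot = (-f_hat / Z) * vx + (u / Z) * vz \
        + (u * v / f_hat) * wx - (f_hat + u**2 / f_hat) * wy + v * wz
    v_dot = (-f_hat / Z) * vy + (v / Z) * vz \
        + (f_hat + v**2 / f_hat) * wx - (u * v / f_hat) * wy - u * wz
    return u_dot, v_dot

# Pure lateral translation of 1 m/s at 5 m depth (f_hat = 100 px):
print(pixel_velocity(0, 0, 100.0, 5.0, 1.0, 0, 0, 0, 0, 0))  # (-20.0, 0.0)

# A pitch rate alone (wy = 0.1 rad/s) also produces flow in u, which is
# the translation/rotation ambiguity that the IMU must resolve:
print(pixel_velocity(0, 0, 100.0, 5.0, 0, 0, 0, 0, 0.1, 0))  # (-10.0, 0.0)
```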
The PX4FLOW used in this project is equipped with a camera
of 752 x 480 pixels resolution and a focal length of 16 mm. The
processor is an ARM Cortex-M4F, and the board carries an IMU to reduce movement ambiguity. Either a sonar or a lidar can be used to
increase precision during operations; however, due to the use of
vision algorithms and a fuzzy logic controller, those sensors
proved unnecessary.
Using image processing alone, it is not possible to determine
whether the robot is moving down (in the negative Z direction),
the ground plane is moving up (in the positive Z direction), or both.
Because of this ambiguity, an IMU precisely
calibrated against the flight controller sensors is required.
Therefore, the pixel velocity in the X and Y directions relative to the
optical sensor can be estimated from consecutive images using (1),
and this can be correlated with the UAV position.
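The de-rotation step implied above can be made concrete. At the principal point (u = v = 0), (1) reduces to u̇ = -(f̂/Z)vx - f̂ωy and v̇ = -(f̂/Z)vy + f̂ωx, so gyro rates from the IMU remove the rotational component and the depth Z scales the remainder into a metric velocity. The function below is our sketch of that inversion, not the authors' code.

```python
# Recover translational velocity (vx, vy) from measured flow (u_dot, v_dot)
# at the principal point, given gyro rates (wx, wy) and scene depth Z.
# Derived by inverting (1) at u = v = 0.

def flow_to_velocity(u_dot, v_dot, f_hat, Z, wx, wy):
    vx = -Z * (u_dot + f_hat * wy) / f_hat
    vy = -Z * (v_dot - f_hat * wx) / f_hat
    return vx, vy

# A measured flow of -30 px/s with a 0.1 rad/s pitch rate (f_hat = 100 px,
# Z = 5 m) decomposes into -10 px/s of rotation and -20 px/s of genuine
# translation, i.e., vx = 1.0 m/s:
print(flow_to_velocity(-30.0, 0.0, 100.0, 5.0, 0.0, 0.1))
```

Integrating the recovered velocity over time (position += vx * dt) is what correlates the optical flow with the UAV position.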
