A sensor-correction equation was developed to improve the SRS distance measurements in adverse conditions, in which high gradients of temperature or relative humidity exist. The same was done for the IRS. The SRS and IRS were then coupled and used for UAS attitude estimation and obstacle detection during the landing procedure. Distances were acquired along the roll and pitch axes through four sensors (Figure 1). From the distances between the UAS and the ground (a fixed surface) at the four sensor positions, it is possible to extract the attitude (Figure 1).

After coupling the IRS and SRS, it was possible to integrate their measurements. For the integration, a variance-minimization approach and subsequently a Kalman filter were used. The variance minimization was performed by considering

x̂ = x_IR + w (x_SR − x_IR)

where the subscripts SR and IR refer, respectively, to the SRSs and IRSs, and w is a weighting coefficient. The variance-minimizing weight is

ŵ = σ²_IR / (σ²_IR + σ²_SR)

The OPS was a Raspberry Pi camera module. The OPS completes the PERSEO platform, enhancing UAS distance and obstacle detection during the landing procedure. In this case as well, a quadrotor was used as a test bench. The vision-based landing system was based on the processing of a single image using the space resection method (SRM). The goal was to determine the camera orientation parameters (position and attitude) in a fixed reference system. The reference system was realized by a special landing pattern (Figure 2). The determination of the full orientation assumes the availability of enough information in the object space. The mapping of a point x in the object space to a point x′ in the camera space (expressed in homogeneous coordinates) is fully described by a projection matrix P [4]:

x′ = Px    (3)

Figure 1. Attitude estimation from sensor distance acquisition.
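The attitude extraction from the four range measurements can be sketched as follows. This is a minimal illustration under an assumed mounting geometry (sensor pairs separated by a known baseline along each axis, locally flat ground); the function name and baseline are hypothetical, not taken from the article.

```python
import math

def attitude_from_distances(d_front, d_rear, d_left, d_right, baseline):
    """Estimate pitch and roll (radians) from four downward-looking
    range measurements taken at the ends of the pitch and roll axes.

    Assumed geometry: each sensor pair is mounted `baseline` meters
    apart, and the ground below is locally flat.
    """
    # A difference in range across an axis tilts that axis.
    pitch = math.atan2(d_front - d_rear, baseline)
    roll = math.atan2(d_left - d_right, baseline)
    return pitch, roll
```

With equal distances on all four sensors the vehicle is level; any front/rear or left/right imbalance maps directly to a pitch or roll angle.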
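The variance-minimization step can be sketched in a few lines: the fused estimate x̂ = x_IR + w(x_SR − x_IR) with ŵ = σ²_IR / (σ²_IR + σ²_SR) weights each sensor by the inverse of its variance. The function name and the fused-variance expression are an assumed implementation of this standard result, not code from the article.

```python
def fuse(x_ir, x_sr, var_ir, var_sr):
    """Variance-minimizing fusion of an infrared (IR) and a sonar (SR)
    range measurement.

    Returns the fused estimate x_hat = x_ir + w * (x_sr - x_ir), with
    w = var_ir / (var_ir + var_sr), and the variance of x_hat.
    """
    w = var_ir / (var_ir + var_sr)
    x_hat = x_ir + w * (x_sr - x_ir)
    # Variance of the optimally weighted combination (always smaller
    # than either individual variance).
    var_hat = var_ir * var_sr / (var_ir + var_sr)
    return x_hat, var_hat
```

When one sensor is much noisier (e.g., sonar in a strong humidity gradient), w pulls the fused estimate toward the more reliable one, which is the same update a scalar Kalman filter performs at each measurement step.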
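The projective mapping x′ = Px of equation (3) can be illustrated with a minimal sketch. The projection matrix below (focal length 500 px, principal point at (320, 240), camera at the object-space origin) uses assumed illustrative values, not the calibration from the article.

```python
import numpy as np

def project(P, X):
    """Map a 3-D object-space point X to 2-D image coordinates via a
    3x4 projection matrix P, i.e., x' = P x in homogeneous coordinates."""
    x_h = P @ np.append(X, 1.0)   # homogeneous image point
    return x_h[:2] / x_h[2]       # dehomogenize

# Illustrative P = K [I | 0]: assumed intrinsics, camera at the origin.
f, cx, cy = 500.0, 320.0, 240.0
P = np.array([[f, 0.0, cx, 0.0],
              [0.0, f, cy, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
```

Space resection inverts this relation: given enough known pattern points X and their measured images x′, the entries of P (and hence camera position and attitude) can be solved for.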
The inspection highlighted that the proposed methodology achieves high precision (on a subcentimeter scale) during the landing approach phases (at a distance from the ground of about 2 m). This performance is not achievable with a low-cost positioning device such as a global navigation satellite system (GNSS) receiver.

Figure 2. Landing pattern used in the vision-based landing system.

IEEE A&E Systems Magazine, July 2017