Low-Cost Virtual Training and Analysis: ATESS and Project Bluejay (RCAF Journal - FALL 2014 - Volume 3, Issue 4)



By Major Eric North, Second Lieutenant Ben Frans, Second Lieutenant Jolanta Matusewicz, Maria Correa, and Colton Harrison-Steel


The Aerospace Telecommunications Engineering Support Squadron (ATESS) is an organization within Canada’s Department of National Defence (DND). ATESS provides a wealth of engineering products and services to partner organizations within DND, including engineering analysis as well as design and production of aerospace equipment. Support was provided by ATESS during an operational test and evaluation (OT&E) of the CH146 Griffon helicopter in 2009 and again in 2010, in partnership with the Canadian Forces Electronic Warfare Centre (CFEWC), the Land Aviation Test and Evaluation Facility, and the CH146 Weapon System Manager. Members of ATESS assisted with the design and installation of electrical and mechanical assemblies for the United States Department of Defense Advanced Range Data System (ARDS), a system that provides precise time-space-position information, or flight-path data, during aircraft operational trials. Flight-path data was required by exercise personnel to determine the position and orientation of the aircraft during each flight, both for safety and range control and for post-flight assessment of aircraft manoeuvres.

While on-board navigation systems provided aircrew with situational awareness during each mission, the ability to record and retrieve flight-path data was not readily available. Flight-path recording systems included with ARDS were costly, required special arrangements for use in Canadian aircraft, and could only be used in a limited geographical area. Furthermore, a number of ground-based stations and terminals needed to be in functional condition and fully operational during trials.

Upon return to Canada, members from ATESS spoke with colleagues in CFEWC to assess the viability of a smaller, low-cost version of ARDS that could be installed in Royal Canadian Air Force (RCAF) aircraft for missions in both domestic and deployed operations.[1] This review, along with discussions among team members regarding flight-path playback, resulted in the development of a prototype system, a smaller and lighter demo system, and a suite of software tools for logging and displaying flight-path data. The hardware for both the prototype and demo systems was self-powered, cost less than $5000, and operated independently of flight-data recording systems and data-links. The prototype was treated as a “modification-free” aircraft mission kit with the possibility for installation at a number of different flight-station positions on multiple aircraft, both in Canada and abroad. Hardware for the prototype and demo systems included the MTi-G by XSens: a small attitude and heading reference system (AHRS) and global positioning system (GPS) employing micro-electro-mechanical-system technology and capable of supporting applications for moving vehicles. The software suite consisted of a variety of commercial and open-source programs. Using open-source software, the authors implemented a toolset to analyse data from several trajectories in support of the project. Using the methodology presented in the Analysis tools section, results from land-vehicle experiments with a GPS were used to validate the sensor outputs of an integrated GPS / inertial measurement unit (IMU). This validation was necessary to ensure correct operation of the GPS/IMU during subsequent airborne experiments. Also included in the software suite was a set of modifications, developed by the team, to an open-source flight simulator called FlightGear. With this software, users could log trajectories, analyse their data, and then display the resultant flight paths in a virtual environment, complete with custom map overlays.

Applications for Bluejay include post-mission analysis of flight paths and regimes in several training environments, such as RCAF pilot training, including phase I primary, phase II basic, and phase III helicopter/multi-engine. Other potential applications include: OT&E with applications in electronic warfare; flight-path characterization for use by Director Flight Safety in conducting accident investigations; engineering test and evaluation; review and refinement of aerobatic demonstrations; and airborne tactics as well as other manoeuvres.

The outline of this paper is as follows: The Experiments section provides a description of the hardware and software used in the project in addition to a description of the experiments; the Results and Discussion section presents the findings from experiments; and the Conclusions section wraps up the discussion and outlines future work.



Experiments

Two hardware variants were developed in support of this project: (1) a prototype consisting of a metal enclosure, sensors, processor and other components suitable for installation in aircraft and (2) a demonstration system consisting of portions of the internals of the prototype system in a much smaller, lighter configuration. The following sections provide details on both the prototype and demo systems for Project Bluejay.

Prototype System

Hardware for the prototype system for Bluejay consisted of position and attitude sensors in addition to a power source and a miniature computer. An XSens MTi-G AHRS/GPS was used as the primary sensor for position and attitude, while a GlobalSat DG-100 GPS receiver provided the vehicle’s position as a secondary sensor to validate some of the outputs from the MTi-G during land-vehicle experiments. This equipment was housed in an enclosure suitable for use in aviation, with switches, circuit breakers, and connectors for internal and external power as well as provision for external antennas and communications with a host computer. Fasteners, construction material, wiring, switches, interconnects, and other components were selected based on their availability, compliance with aerospace standards, and suitability for flight. The enclosure was constructed with a removable lid in addition to a small door to facilitate access to data storage ports and other connectors on the miniature computer. Owing to the versatility of the design, the enclosure can be mounted in an aircraft at various flight-station positions using tie-downs or an adapter plate affixed to either the cabin floor or other suitable structure. The prototype system is shown in Figures 1[2] and 2.

Figure 1 illustrates the layout of the workings of the prototypical hardware used in Bluejay. The larger, left-hand portion of the figure includes the airborne hardware and software components, including the Habley BIS 6620 computer, with links extending from it to power, memory, power control interface and the attitude heading reference system, called XSENS. This reference system includes a global positioning system receiver, which is linked to an antenna. A separate antenna also extends from the DG100 GPS logger. The smaller, right-hand portion of the figure includes the ground-based hardware, including the Ethernet connection, charger, and external power supply. These are connected to the airborne hardware via the power control and interface.  End figure 1.

Figure 1. Diagram of prototype hardware for Bluejay


Figure 2 provides a three-dimensional view of the parts of the prototype as they would appear when removed from the enclosure that houses them. The cover is lifted off, with the inner workings described in Figure 1 shown in the central portion of the figure. The bottom portion of the enclosure includes the switches and connection inputs where the ground-based components (not shown in the figure) would connect to the inner hardware through the casing. End Figure 2.

Figure 2. Exploded view of prototype enclosure.



Demo system

A demonstration (demo) system was developed in response to requirements to conduct air vehicle tests using a small remote-control (RC) aircraft. Compared to the prototype, the demo system (shown in Figure 3[3]) used a much smaller power supply along with a smaller single-board computer (SBC). The SBC was equipped with a processor architecture that differed from the prototype system but was compatible with Debian-based Linux, the chosen operating system. Instead of being mounted inside an enclosure, the MTi-G and GPS antennas were mounted external to the aircraft on an adapter plate conforming to the upper camber of the wing. The other components, including SBC and power source, were installed using available space within the fuselage of the aircraft aft of the engine compartment. Upon completion of flights, a data transfer cable was connected to the SBC to transfer logged data to the host.

Figure 3 illustrates the Bluejay demonstration system hardware. The BeagleBoard is the largest section, encompassing the computer, memory, and interface, with a ground Ethernet connection. The computer connects to a mobile power supply, which is subsequently connected to a second power supply linked to a ground external power source. The computer also connects via universal serial bus to an attitude heading reference system, which includes a global positioning system fitted with an antenna. End Figure 3.

Figure 3. Demo system hardware for Bluejay



Data collection

Software for Bluejay is divided into three categories: data collection, analysis, and display. Software for data collection resided in an embedded computer in both the prototype and demo systems, running on a Linux kernel with low central-processing-unit overhead. It was designed to be flexible and script-based, with provision to support multiple sensors. The embedded computer can be configured to run in stand-alone mode or networked with a host for debugging and transfer of sensor logs. A time-synchronization feature enables all sensors and the computer to share a common coordinated universal time “time-stamp” available from the GPS of the MTi-G. Users can extract data from a secure digital memory card post-flight; alternatively, a direct Ethernet connection can be made to a host to transfer the logged data. A depiction of the data-flow sequence for Bluejay is shown in Figure 4[4].

Figure 4 portrays the data-flow sequence from Bluejay to the host computer. Data originates in Bluejay with the Attitude and Orientation Sensor Data, which passes over a serial-to-universal-serial-bus link to the log file. Data from the log file then transfers from Bluejay to the host computer via an Ethernet connection. The first stage in the host computer is Smoothing, Orientation and Correction. The data then transfers to the final stage (Flight-Path Playback) via the generic protocol. End Figure 4.

Figure 4. Data-flow sequence for Bluejay
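The logging step above can be sketched in Python. This is a minimal illustration, not the project's actual scripts: the record format, the injectable clock, and the function names are assumptions, and in practice the record source would be a sensor's serial port rather than an in-memory list.

```python
import datetime

def stamp_records(records, clock=None):
    """Prefix each raw sensor record with a common UTC time-stamp.

    `records` is any iterable of text lines (in practice, lines read
    from a sensor's serial port); `clock` is injectable for testing and
    stands in for the shared UTC source described above.
    """
    clock = clock or (lambda: datetime.datetime.now(datetime.timezone.utc))
    for raw in records:
        raw = raw.strip()
        if not raw:
            continue  # skip blank lines between sensor messages
        yield f"{clock().isoformat()},{raw}"

def log_to_file(records, path):
    """Append time-stamped records to a comma-separated log file."""
    with open(path, "a", encoding="utf-8") as log:
        for line in stamp_records(records):
            log.write(line + "\n")
```

Because every record carries the same UTC time base, logs from multiple sensors can later be merged and compared during post-processing.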



Analysis tools

Analysis software was written in SciLab and was used to post-process data from sensors. SciLab is cross-platform and open-source; it contains many useful features, including correlation, de‑noising, and provision for coordinate system transformation in addition to plotting and other analysis tools. The following paragraphs outline a portion of the tool-set developed in SciLab in support of Project Bluejay.

In the absence of a reference solution for attitude (pitch, roll, and yaw), data obtained from GPS can be used in a rudimentary fashion to estimate vehicle heading in addition to pitch.[5]

This calculation for pitch can be used to quickly assess the performance of an AHRS/GPS such as the MTi‑G; however, low-pass filtering and/or a moving average may be required to attenuate sensor noise from the GPS. In addition, it must be applied carefully, given the low update rate of the GPS along with its underlying assumptions, including the absence of side-slip and other vehicle dynamics. In certain situations, the calculations can be used for sign correction and for estimating pitch values, providing a coarse assessment of sensor quality, particularly for the MTi-G in confirming that an appropriate scenario for extended Kalman filtering is used. Other sensors employing an integrated GPS/IMU with data fusion may also benefit from this simple calculation (e.g., in the absence of a reference solution where parameters of the data fusion must be tuned, particularly when the quality of the attitude outputs is in doubt).
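As a hedged sketch of this idea (the exact equation is the one cited in endnote 5, which is not reproduced here), heading can be estimated from the east and north velocity components reported by the GPS, and pitch from the vertical velocity relative to ground speed; the functions below illustrate the calculation under the no-side-slip assumption discussed above.

```python
import math

def heading_pitch_from_velocity(v_east, v_north, v_up):
    """Rudimentary heading and pitch estimates (degrees) from GPS
    velocity, assuming no side-slip and benign vehicle dynamics."""
    heading = math.degrees(math.atan2(v_east, v_north)) % 360.0
    ground_speed = math.hypot(v_east, v_north)
    pitch = math.degrees(math.atan2(v_up, ground_speed))
    return heading, pitch

def moving_average(samples, window):
    """Simple moving average, one way to attenuate GPS sensor noise
    before applying the estimate above."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]
```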

Depending on the start-up configuration and the clock source for each sensor, the time-stamps of multiple sensors may not be properly aligned with respect to each other. One approach to determining the difference(s) between the time-stamps of multiple sensors is to visually compare like outputs of the sensors using plotting tools. Attempts are then made to match certain features of these outputs, noting how far each feature in the output of one sensor lies from the same feature in the output of the other. This method is adequate if, for instance: (1) the number of sensors to be compared is small; (2) there is a large degree of similarity between the like outputs of the sensors; (3) features in the output of each sensor are easily recognizable using visual cues such as steep transitions from one data point to the next; (4) imprecision in aligning time-stamps can be tolerated; and (5) the difference between the time-stamps of two or more sensors remains constant over the entire data-set from each sensor for the given experiment.[6] In view of the above considerations, the authors opted for a moderately robust approach from the digital-filtering literature to determine the time lag between the MTi-G and DG-100 during land-vehicle experiments. This approach could be further generalized depending on the type of sensors used in a given application and may be of utility for certain designs involving sensor networks.

By-products of the approach include: the ability to determine the similarity in time offsets across all data obtained from two sensors (i.e., similarity as a measure of the extent of variation in time offsets for all minimum root mean square [RMS] values); the number of occurrences of a given time offset; and the similarity of data between two sensors (i.e., the magnitude of all minimum RMS values as a method for determining similarity between data from both sensors).
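The offset search can be illustrated as a scan over candidate lags that keeps the lag minimizing the RMS difference over the overlapping samples. This is a simplified reconstruction of the idea, not the exact digital-filtering approach the authors drew from the literature; it assumes both series share a common sample rate.

```python
def rms(values):
    """Root mean square of a sequence of differences."""
    return (sum(v * v for v in values) / len(values)) ** 0.5

def estimate_lag(ref, meas, max_lag):
    """Return (best_lag, best_rms): the sample offset of `meas`
    relative to `ref` minimizing the RMS difference over the overlap.
    A positive lag means `meas` trails `ref`."""
    best_lag, best_rms = 0, float("inf")
    for lag in range(-max_lag, max_lag + 1):
        diffs = [meas[i + lag] - r
                 for i, r in enumerate(ref)
                 if 0 <= i + lag < len(meas)]
        if diffs and rms(diffs) < best_rms:
            best_lag, best_rms = lag, rms(diffs)
    return best_lag, best_rms
```

The minimum RMS value itself doubles as the similarity measure noted above, and tallying the winning lag across segments of the data gives the number of occurrences of a given time offset.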


Display and after-action review

The display software allowed users to play back their trajectory data in a three-dimensional simulation. It ran on a host processor and was an adaptation of an open-source, platform-independent simulator known as FlightGear. Other software was assessed for playing back trajectory data, including an open-source animation tool called Blender as well as Google Earth, Microsoft Flight Simulator X, and X-Plane. Ultimately, FlightGear was chosen because it is open-source and aviation-friendly, with a large library of terrain, aerodromes, and aircraft. Additional features, such as accurate positioning of aerodromes on the reference geoid, the ability to connect to a weather server, and the possibility of other forms of expansion, made FlightGear a good choice for Bluejay. The software was well suited to this project, as custom map overlays could be created and superimposed over existing terrain with good alignment between raster images and FlightGear artefacts such as aerodromes.[7] Control input of the aircraft in FlightGear is normally provided by an operator, but the software release has provisions for playing back flights, which was of interest for this project.

The interface between the log file (generated using Bluejay hardware) and FlightGear was straightforward. FlightGear allows for the creation of a custom input protocol, also known as a generic protocol. A .XML file was written that related the headings in the log file produced in-flight to the associated properties in FlightGear. Loading commenced before running FlightGear: options were passed on the command line to turn off the default flight dynamics model, and other parameters were specified, including playback speed, the location of the log file, and the selection of the .XML file containing the generic protocol. Once initialization was complete, a simulated aircraft followed the path and orientation of the recorded flight in time and space. Among other things, playback in FlightGear allowed the user to view the recorded flight from a variety of perspectives using various aircraft types selected at start-up.
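A generic-protocol file of the kind described above can be sketched as follows. The chunk list, property choices, and log-column order here are illustrative assumptions rather than the project's actual protocol, although the property paths are standard FlightGear property-tree locations.

```python
# Each (name, property) pair becomes one <chunk> in the generic
# protocol; columns in the log file must appear in the same order.
CHUNKS = [
    ("latitude", "/position/latitude-deg"),
    ("longitude", "/position/longitude-deg"),
    ("altitude", "/position/altitude-ft"),
    ("roll", "/orientation/roll-deg"),
    ("pitch", "/orientation/pitch-deg"),
    ("heading", "/orientation/heading-deg"),
]

def protocol_xml(chunks):
    """Emit a minimal FlightGear generic-protocol .XML document that
    maps comma-separated log columns to simulator properties."""
    lines = ['<?xml version="1.0"?>', "<PropertyList>", " <generic>",
             "  <input>",
             "   <line_separator>newline</line_separator>",
             "   <var_separator>,</var_separator>"]
    for name, node in chunks:
        lines += ["   <chunk>",
                  f"    <name>{name}</name>",
                  "    <type>double</type>",
                  f"    <node>{node}</node>",
                  "   </chunk>"]
    lines += ["  </input>", " </generic>", "</PropertyList>"]
    return "\n".join(lines)
```

With such a file installed as, say, Protocol/bluejay.xml, playback might be started with options along the lines of `fgfs --fdm=null --generic=file,in,20,flight.log,bluejay` (the file name and update rate here are illustrative).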

A patch[8] was applied to the FlightGear source code to enable the loading of custom raster maps in place of FlightGear’s procedurally generated default terrain. Prior to start-up, custom maps needed to be transformed into a format supported by the software. FlightGear supports image files known as tiles in a compressed .DDS image format. These tiles are not intrinsically geo-referenced; however, their filenames index the geo-referenced tile locations. A process and a script were created to take the corner geodetic coordinates from the initial .TIF file and then crop, rename, compress, and transfer the resultant file to the appropriate folder in the FlightGear data directories. The process for converting custom map overlays for use in FlightGear is shown in Figure 5[9].


Figure 5 shows the process for converting custom map overlays. It begins with the Geospatial Data Abstraction Library (GDAL). There are three steps at this stage: (1) the user’s geo-referenced map files are created, (2) these files are converted to GeoTIFF format, and (3) the files are then passed to the Python script, where they are renamed. Using the NVIDIA toolkit, the files are compressed to .DDS format. Finally, the files are moved to the proper FlightGear folder. End Figure 5.

Figure 5. Process for converting custom map overlays
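The renaming step in Figure 5 reflects FlightGear's scenery-directory convention, in which tiles are bucketed into 10-degree and 1-degree directories named from the south-west corner. A sketch of that naming computation follows; the surrounding script and file handling are assumptions for illustration.

```python
import math

def scenery_dir(lat, lon):
    """Return the FlightGear scenery sub-path for a geodetic corner,
    e.g. (44.2, -77.5) -> 'w080n40/w078n44'."""
    def tag(value, pos, neg, width):
        # e.g. tag(-80, "e", "w", 3) -> "w080"
        letter = neg if value < 0 else pos
        return f"{letter}{abs(value):0{width}d}"

    lat1, lon1 = math.floor(lat), math.floor(lon)    # 1-degree tile
    lat10 = math.floor(lat / 10.0) * 10              # 10-degree chunk
    lon10 = math.floor(lon / 10.0) * 10
    chunk = tag(lon10, "e", "w", 3) + tag(lat10, "n", "s", 2)
    tile = tag(lon1, "e", "w", 3) + tag(lat1, "n", "s", 2)
    return f"{chunk}/{tile}"
```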


Land Experiments

While the hardware for Bluejay was designed for airborne operation, trajectory tests needed to be conducted on the ground to assess the output from the sensors and to support development of both hardware and software. Several land-vehicle experiments were conducted using the GlobalSat GPS (DG-100) and the XSens GPS/IMU (MTi-G) in a configuration similar to the one shown in Figure 3. Once placed inside the vehicle with both GPS antennas having an unobstructed view of the sky, the sensors were connected to a laptop for configuration, data capture, and post-processing. The benefit of land-vehicle testing is that it is considerably less expensive than experiments on a full-sized aircraft. Several locations close to 8 Wing Trenton were chosen for the land-vehicle experiments, including roads north of Highway 401. On-line elevation-profiling software called VeloRoutes was used to assess candidate routes for maximum change in elevation over short distances, in order to obtain appreciable variations in the pitch of the vehicle. Once suitable trajectories were identified, experiments using both the MTi-G and DG-100 were conducted along the chosen routes; data was logged for post-processing in SciLab and eventual display in FlightGear. Results from the land-vehicle experiments are presented in the Results and Discussion section.

Air Experiments

A decision was made to conduct air experiments using a small RC aircraft in lieu of a full-scale aircraft. The small form factor and low weight of the demo system for Bluejay were of great benefit during these experiments. A local RC flying club was selected as the location to conduct tests; flights were conducted over a number of calm, clear evenings to minimize interference from wind and other environmental effects. The RC aircraft was equipped with the Bluejay demo system in addition to a GPS-enabled, high-definition (HD) video camera, which enabled comparison of each portion of the flight with the corresponding cockpit view in FlightGear during playback. Figures 6 and 7 depict the experimental setup of the demo system.


Figure 6 shows the Bluejay hardware being installed in a miniature remote-controlled airplane. End Figure 6.

Figure 6. Demo hardware during initial phase of installation


Figure 7 shows the remote-controlled airplane receiving some final adjustments just prior to takeoff. End Figure 7.

Figure 7. Demo hardware during final phase of installation


Results and Discussion


Data from the MTi-G and DG-100 collected during land experiments was examined using the methodology described in the Analysis tools section. Location data obtained from the DG-100 was post-processed to provide a mock reference for azimuth and pitch, with pitch calculated using a variation of the equation listed in endnote 5. The alignment of data from both sensors is presented in Figure 8. Several features in each data-set are apparent:

  1. There is a noticeable gap in data from the DG-100 starting at approximately 2200 seconds for both data-sets: this was due to a feature of the DG-100 that prevented logging of data while the vehicle remained motionless for several minutes.
  2. While most of data-set 1 for both sensors appears to be closely matched, there were several portions of data-set 2, in particular between 1050 and 2050 seconds, where outputs from the MTi-G did not match the pitch calculated using the DG-100, due in part to modelling errors in the equation in endnote 5.
  3. Of importance is that the authors were able to employ rudimentary time-delay estimation to determine an appropriate time offset between sensors for both data-sets, despite taking a simplified approach to modelling pitch using data from the DG-100.


Figure 8 includes two graphs showing flight data. In the left graph, separate lines represent the azimuth from sensors in degrees on the Y axis over the time in seconds on the X axis. A blue line records the calculated azimuth and a black line represents the measured azimuth. The second graph uses the same two line colours to depict the calculated pitch and measured pitch in degrees on the Y axis over time in seconds represented on the X axis. End Figure 8.

Figure 8. Data from sensors: (top) Azimuth from sensors; (bottom) Pitch from sensors with time correction applied.




Upon successful completion of flights, data captured using the demo system was transferred to a host via Ethernet, post-processed in software, and then passed to FlightGear for visual analysis. Simulated flight paths in FlightGear were compared with playback of the video captured by the HD video camera during flights. Results from air experiments comparing the HD video with the simulated view from the FlightGear cockpit perspective are presented in Figures 9, 10, and 11.

Figure 9 consists of two images. The one on the left is a natural view from the cockpit of the sky and landscape as the aircraft is engaged in a slight left roll. On the right is a simulated view of the same image. The clouds and the blue sky are more defined in the simulated view, but the landscape does not appear as focused as it is in the natural view with the colours of the trees and fields mostly grey and white. End Figure 9.

Figure 9. Comparison of HD video camera with view from cockpit in FlightGear. Aircraft is engaged in a slight turn to the left along with introduction of a small amount of left roll.


Figure 10 consists of two images. On the left is a natural view from the cockpit of the sky and the landscape as the aircraft is engaged in a sharp left roll. On the right is a simulated view of the same image. The clouds and the blue sky are more defined in the simulated view, but the landscape does not appear as focused as it is in the natural view, with no colour in the trees and fields. End Figure 10.

Figure 10. Comparison of HD video camera with view from cockpit in FlightGear. Aircraft engaged in left roll.


Figure 11 consists of two images. On the left is a natural view from the cockpit of the sky and the landscape as the aircraft is engaged in left roll that is larger than the left roll in Figure 10. On the right is a simulated view of the same image. The clouds and the blue sky are more defined in the simulated view, but the landscape does not appear as focused as it is in the natural view, with no colour in the trees and fields. End Figure 11.

Figure 11. Comparison of HD video camera with view from cockpit in FlightGear. Aircraft engaged in large left roll.



The left-side image in each of the above figures is a still from the HD video taken during testing of the demo system on board the RC aircraft. The right-side image is a screenshot of the FlightGear playback of the same flight. In the playback image, a higher-definition map is overlaid on the custom topographic map to facilitate referencing of landmarks between the two still images. The comparison was effective: the blue roof of a nearby barn provided a good reference across all sets of images, and a very distinct cleared area appears grey in both the video and the simulated playback of the flight.

In addition to the cockpit-view comparison between recorded video and simulated playback, project members were able to exploit FlightGear’s after-action review capability by pausing, seeking, and rewinding portions of the simulated flight-path, in addition to increasing and decreasing the playback rate. Furthermore, multiple camera angles, points of view, and perspectives both inside and outside of the aircraft were possible, along with a “chase” view that greatly illustrated various phases of the simulated playback of flight paths. Combined with custom overlays, including topographic maps and detailed terrain, all of these features provided an impressive after-action review capability.


Conclusions

This paper presented work on a project called Bluejay, a collection of hardware and software for post-mission analysis of flight-path data. Hardware for this project was designed to be small, self-powered, and low cost and to operate independently of flight-data recording systems and data-links. Wherever possible, software in support of the project was selected on the basis of being platform-independent, low-cost or preferably open-source, available for public distribution, and in widespread use. This paper demonstrated that hardware for Bluejay could be treated as a “modification-free” aircraft mission kit with the possibility for installation at a number of different flight-station positions on multiple aircraft, both in Canada and abroad. Two separate systems, namely a prototype and a demonstration system, were developed using low-cost components and were scalable in physical size to permit sensors to be evaluated. Using open-source software, the authors implemented a tool set to analyse data from several trajectories in support of the project. Using the methodology presented in the Analysis tools section, results from land-vehicle experiments with a GPS were used to validate the sensor outputs of an integrated GPS/IMU. This validation was necessary to ensure correct operation of the GPS/IMU during subsequent airborne experiments. Results from experiments were presented using several open-source programs, including a modified version of the open-source flight simulator FlightGear, to demonstrate the capabilities of the system. The authors propose that pilots in the RCAF, along with operators in other militaries and various organizations, would be able to use Bluejay as an after-action review capability: operators could log their flights and then display the resultant flight paths in a virtual environment complete with custom map overlays.

With regard to a stand-alone data-logging system, notwithstanding the fact that the demo hardware is quite small and able to be flown on a miniature RC aircraft, work is needed to determine to what extent the hardware can be further miniaturized and per-unit costs reduced. Follow-on discussions are required to determine the best installation location in various aircraft, as is a decision on whether to keep the hardware self-powered or enable connection to aircraft power. In addition, consideration should be given to the optimal installation of supporting hardware such as antennas, data-transfer cables, and other cabling. In certain cases, it may be possible to integrate Bluejay hardware into existing mission kits with an aim of minimizing any potential impacts on operational and technical airworthiness.

Further work is needed to determine potential uses for Bluejay, including adaptation of software for electronic flight bags and tablet computers. Given the proliferation of these devices in various aviation markets, it seems reasonable to offer a “one-stop” solution for logging, analysing, and displaying flight paths, all on the same device. Many tablets are being fitted with gyroscopes, accelerometers, magnetometers, and GPS, which certainly facilitate the use of these devices in a data-logging role. In addition, certain tablets are equipped with antenna pass-through to improve quality and availability of GPS position-velocity-time data.

Future initiatives for Bluejay software could include provision for multiple aircraft operating simultaneously during replay of flight paths, particularly for applications involving close-formation flying. Visual aids such as boxes and approach plates should be explored. Of note, FlightGear already possesses a glide-slope visualization that could be quite useful for after-action review of approaches. Further potential enhancements include the representation of a functional cockpit, with logged data fed to instruments to display position, attitude, and heading, as well as provision for a moving map during playback.


Major Eric North enrolled in the Canadian Armed Forces as an Aerospace Engineer in 1998. He graduated from the Royal Military College of Canada (RMCC), Kingston, Ontario, in May 2002 with a Bachelor of Engineering. In 2007, Major North commenced a Master of Applied Science in Electrical Engineering, taking courses at RMCC as well as a robotics course at Queen’s University. He completed his postgraduate studies in May 2009 with a thesis in navigation and instrumentation and was subsequently posted to ATESS at 8 Wing Trenton. Major North is presently the Commanding Officer of 14 Software Engineering Squadron at 14 Wing Greenwood, Nova Scotia.

Second Lieutenant Ben Frans joined the RCAF as an Avionics Technician in 2003. In 2012, he completed a degree in Electrical Engineering at the Royal Military College of Canada. He is now continuing training as an Aerospace Engineering Officer. His interests include synthetic environment motion tracking, augmented reality, virtual reality, tele-presence, and windsurfing.

Second Lieutenant Jolanta Matusewicz completed a master’s degree in Aerospace Engineering from the University of Texas at Arlington in 2006. She joined the RCAF as an Aerospace Engineering Officer in 2012 and was employed at ATESS as part of her occupation training. She is presently attending second language training at the 8 Wing Language School in Trenton, Ontario.

Maria Correa received her Bachelor of Science degree in Electronic Engineering from Universidad Distrital (Bogotá, Colombia) in 1991. She has extensive experience in various areas of the avionics field in maintenance, repair as well as research and development applied to commercial and military aircraft.  Since 2000, Ms. Correa has been a design engineer and project officer at ATESS.

Colton Harrison-Steel is enrolled in Mechanical and Materials Engineering at University of Western Ontario. Mr. Harrison-Steel was employed at ATESS on multiple engagements in support of the successful completion of his degree.



AHRS―attitude and heading reference system

ARDS―Advanced Range Data System

ATESS―Aerospace Telecommunications Engineering Support Squadron

CFEWC―Canadian Forces Electronic Warfare Centre


DND―Department of National Defence

GDAL―Geospatial Data Abstraction Library

GPS―global positioning system

HD―high definition

IMU―inertial measurement unit

OT&E―operational test and evaluation

RC―remote control

RCAF―Royal Canadian Air Force

RMCC―Royal Military College of Canada

RMS―root mean square

SBC―single-board computer

USB―universal serial bus



1. Earlier initiatives in navigation and instrumentation were reviewed during the preparation of this paper to obtain a better understanding of various techniques for estimating the position and orientation of moving vehicles, notably the work presented in J. Georgy et al., “Low-cost Three-dimensional Navigation Solution for RISS/GPS Integration Using Mixture Particle Filter,” IEEE Transactions on Vehicular Technology 59, no. 2 (February 2010): 599–615; U. Iqbal et al., “Experimental Results on an Integrated GPS and Multisensor System for Land Vehicle Positioning,” International Journal of Navigation and Observation, vol. 2009; and E. North et al., “Improved Inertial/Odometry/GPS Positioning of Wheeled Robots even in GPS-denied Environments,” InTech (February 2012). Additional sources were reviewed in regard to the experimental set-up for navigation and instrumentation of aerospace vehicles, including: C. Cutright and M. Braasch, “GPS and INS Flight Test Instrumentation of a Fully Aerobatic Turbojet Aircraft,” IEEE Aerospace Conference Proceedings 3 (2002); Z. J. Huang and J. C. Fang, “Integration of MEMS Inertial Sensor-based GNC of a UAV,” International Journal of Information Technology 11, no. 10 (2005); and D-H. Hwang et al., “Design of a Low-cost Attitude Determination GPS/INS Integrated Navigation System,” GPS Solutions 9, no. 4 (2005): 294–311. Modelling and simulation with applications for capability modernization were also considered during the preparation of this paper through J. Landolt and J. Evans, “R&D Initiatives in Modelling and Simulation for Capability Modernization of the Canadian Air Force,” Canadian Military Journal (Spring 2001): 37–42; along with an appreciation for DND’s planned direction towards increased use of simulation technology as seen in J. L. D. Lachance, Projecting Power: Alternative Futures for Canada’s Air Force in 2020 (Trenton, ON: Canadian Forces Aerospace Warfare Centre, 2010); and K. Truss, “Canada’s Air Synthetic Environment Centre: Enabling Force Transformation,” The Royal Canadian Air Force Journal 1, no. 3 (2012): 61–63. Motivation for this project came in part from prior work conducted by Major Adam Cybanski, a member of the Directorate of Flight Safety, in regard to flight-path recording and visualization.  (return)

2. Colours used in Figure 1 represent the components that are physically packaged together in the next higher assembly.  (return)       

3. Colours used in Figure 3 represent the components that are physically packaged together in the next higher assembly.   (return)

4. Colours used in Figure 4 represent the components that are physically packaged together in the next higher assembly.   (return)

5. The calculation for pitch is: ρ(k) = tan⁻¹(∆h/∆d), where ρ(k) represents the pitch at sample k; ∆h = h(k) − h(k − 1), the change in altitude between the present and previous samples; and ∆d = d(k) − d(k − 1), the corresponding change in distance travelled.  (return)
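The per-sample pitch calculation described in note 5 can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `pitch_estimates` is hypothetical, and `math.atan2` is used in place of a plain arctangent so that a zero change in distance does not cause a division-by-zero error.

```python
import math

def pitch_estimates(altitudes, distances):
    """Estimate pitch (radians) at each sample k >= 1 from the change in
    altitude and the change in distance travelled, per rho(k) = atan(dh/dd)."""
    pitches = []
    for k in range(1, len(altitudes)):
        dh = altitudes[k] - altitudes[k - 1]   # change in altitude
        dd = distances[k] - distances[k - 1]   # change in distance travelled
        pitches.append(math.atan2(dh, dd))     # atan2 tolerates dd == 0
    return pitches
```

For example, a 10-metre climb over 10 metres of horizontal travel yields a pitch of π/4 radians (45 degrees).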

6. Other methods employing digital filtering may be preferred to the above; the literature describes several systems and processes for time-delay estimation of signals. See C. Y. Wuu and A. Pearson, “On Time Delay Estimation Involving Received Signals,” IEEE Transactions on Acoustics, Speech, and Signal Processing 32, no. 4 (1984): 828–35.  (return)
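One common approach to the time-delay estimation problem mentioned in note 6 is to find the lag that maximizes the cross-correlation between the two signals. The sketch below is purely illustrative (the function name is hypothetical, and it assumes equal-length signals with an integer-sample delay); it is not the filter-based method of the cited reference.

```python
def estimate_delay(reference, delayed):
    """Estimate the integer-sample delay of `delayed` relative to
    `reference` by finding the lag maximizing their cross-correlation."""
    n = len(reference)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n):
        # Correlate reference[i] against delayed[i + lag] over the overlap.
        score = sum(reference[i] * delayed[i + lag] for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

In practice, a vectorized correlation (for example via a signal-processing library) would be preferred for long records, but the principle is the same.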

7. FlightGear source code was readily available for download in several versions, making it convenient to work with; accessed September 26, 2014, http://www.flightgear.org.  (return)

8. Created by B. Laniel. Brest photo scenery: a FlightGear patch for overlaying raster images in simulated environments, accessed September 26, 2014, http://wiki.flightgear.org/photoscenery.  (return)

9. Colours used in Figure 5 indicate the differing pieces of software.  (return)


