Instantaneous Position and Orientation of the Body
Segments as an Arbitrary Object in 3D Space by
Merging Gyroscope and Accelerometer Information
J.A. Barraza-Madrigal^{*} R. Muñoz-Guerrero^{*} L. Leija-Salas^{*} R. Ranta^{**} ^{*} Centro de Investigación y de Estudios Avanzados (CINVESTAV-IPN). ^{**} Centre de Recherche en Automatique de Nancy (CRAN-CNRS)
Keywords: instantaneous orientation, direction cosine matrix, quaternions, inertial measurement unit, offset and drifting deviation. 
Introduction
Joint evaluation is important in orthopedics, traumatology and rheumatology for diagnosis, prognosis, therapy and research, as well as for the design and fabrication of measuring devices used in surgical instrumentation, prostheses and orthoses [1].
The kinematics of joint evaluation requires determining both the position and orientation of the body segments involved; these are normally modelled as rigid bodies related to an inertial reference system [2]. This is commonly achieved by tracking the movement of a group of reflective markers attached to the body segment of interest using an optical motion capture system [3]. Nonetheless, such systems are expensive and cannot be used outside the laboratory, since they require a specific, controlled environment.
During the last decade, body-mounted sensors such as gyroscopes and accelerometers, also known as inertial sensors, have been used to obtain kinematic values, offering the advantage of identifying human motion in a wide variety of environments due to their low cost, small size and low power consumption [4]. However, inertial sensors do not measure the position or orientation of the body segment directly, but rather physical quantities related to the motion of the object to which the sensors are fixed [5]. On one hand, a gyroscope measures angular velocity which, if initial conditions are known, may be integrated over time to compute the sensor's orientation [6], [7]. In theory, both the position and orientation of an arbitrary object or body segment can be estimated by integrating the angular rate data obtained from an attached gyroscope. Although these data can be translated into meaningful three-dimensional information, it has been reported that the integration of any amount of noise in the angular velocity data causes a cumulative error in the estimated orientation, known as drifting deviation [8]. On the other hand, the accelerometer measures the earth's gravitation and provides an absolute orientation reference, allowing the angle of the sensor unit with respect to gravity to be measured [9], [10]. This method is appropriate when the acceleration due to motion is negligible with respect to gravity, and less accurate when accelerations due to motion corrupt the measured direction of gravity. Furthermore, accelerometer signals do not contain information about the rotation around the vertical axis and therefore do not give a complete description of the orientation [11].
Hence, computing a single orientation estimate requires the optimal fusion of accelerometer and gyroscope information [12]. Recently, many studies of human motion based on inertial sensors have proposed different methods to combine gyroscope and accelerometer information for kinematic joint evaluation. Dejnabadi et al. [4] proposed a method for measuring uniaxial joint angles using a combination of accelerometers and gyroscopes. The model was based on the acceleration of the joint center of rotation, obtained mathematically by shifting the location of the physical sensors. Joint angles were computed by estimating absolute angles, without any integration of sensor information. Luinge et al. [13] described a method that used constraints in the elbow to measure the orientation of the lower arm with respect to the upper arm. Accelerometer and gyroscope orientations were combined using rotation matrices. A calibration method to determine the orientation of each sensor was decisive for the accuracy of the measured orientation. Zhang et al. [14] used a particle filter to fuse gyroscope and accelerometer information, compensating drifting deviation and improving estimation accuracy. Additionally, quaternions were used to represent the orientations of upper limb segments, thereby avoiding singularities. Tadano et al. [8] proposed a method of three-dimensional gait analysis using wearable sensors and quaternion calculations. Accelerometer information was used to estimate the initial orientation of the sensors, and gyroscope information was used to estimate angular displacements during gait by integrating angular velocity. Sensor orientations were converted to body segment orientations by a rotation matrix obtained from a calibration trial. Favre et al. [15] proposed two different methods to fuse gyroscope and accelerometer information: quaternion-based integration of angular velocity, and orientation correction using gravity. Both methods are based on recognizing characteristic samples which provide the vertical orientation of the sensor.
This work presents an algorithm to determine both the position and orientation of an arbitrary object in 3D space, oriented toward kinematic joint evaluation. The proposed algorithm combines gyroscope and accelerometer information, establishing two suitable coordinate systems that describe the object's movement with respect to a reference coordinate system. Both the position and orientation were obtained by estimating the spatial relationship between the coordinate systems through a Direction Cosine Matrix (DCM), computed from the combination of three consecutive rotations around each of the main axes of the evaluated system using quaternions. A proportional-integral (PI) feedback controller applies the amount of rotation needed to align the orientations estimated from the gyroscope and the accelerometer, forcing them to converge proportionally, eliminating drifting deviation and improving accuracy by decreasing the noise response.
METHODOLOGY
A combination of a 3-axis gyroscope and a 3-axis accelerometer was used to estimate both the position and orientation of an arbitrary object in 3D space. Gyroscope information (G_{x},G_{y},G_{z}) was used to define a coordinate system describing object movement (O_{mov}) through the gyroscope angular rate (ω_{x},ω_{y},ω_{z}). Accelerometer information (A_{x},A_{y},A_{z}) was used to establish an absolute coordinate reference (O_{ref}) based on the acceleration of gravity (λ), in order to compensate the drifting deviation produced by integrating gyroscope measurement errors over time.
Sensors and data acquisition
Gyroscope and accelerometer information were obtained from an IMU (InvenSense MPU6050) configured at its lowest feature rates [16], with a full scale range of ±250^{o}/s and ±2g and a bandwidth of 256Hz and 260Hz, respectively. Full scale range and bandwidth were defined considering the common values reported in kinematic joint evaluation: gyroscope from ±200^{o}/s to ±600^{o}/s, accelerometer from ±2g to ±5g and bandwidth from 2Hz to 200Hz [4], [8], [11], [15], [17], [18]. A 16-bit Microchip digital signal controller (dsPIC30F6014A) was used to perform the fusion of gyroscope and accelerometer information and to implement the algorithm for assessing the orientation of an object in 3D space. Communication between the IMU and the dsPIC was established through the Inter-Integrated Circuit serial protocol (I^{2}C) at 400kHz, its standard operating frequency. The response of the implemented algorithm was assessed through a virtual model designed in V-Realm Builder 2.0. Communication between the dsPIC and the virtual environment was achieved using MATLAB R2013a.
Orientation estimate
The orientation estimation problem involves two coordinate systems: one attached to the arbitrary object, describing its movement (O_{mov}), and another located at some point on the earth but not attached to the object, establishing a reference frame (O_{ref}). Since the coordinates of a vector depend on the frame in which it is represented, an arbitrary vector can be represented in the rotating frame; consequently, these two frames can rotate relative to each other independently [19]. This rotation is estimated by a 4D complex number, also called a quaternion (Q), which is composed of a scalar (q_{0}) and a 3D vector (q) with q = [q_{1},q_{2},q_{3}], Eq.(1).
Q = (q_{0}, q) = q_{0} + q_{1}i + q_{2}j + q_{3}k  (1)
The rotational vector Q is associated with the sine (S) and the cosine (C) of the rotation angles σ = [α,ϕ,θ] around each of the main axes (x,y,z) of the evaluated coordinate system in time, Eq.(2).
Q_{x} = (C_{α/2}, S_{α/2}, 0, 0),  Q_{y} = (C_{ϕ/2}, 0, S_{ϕ/2}, 0),  Q_{z} = (C_{θ/2}, 0, 0, S_{θ/2})  (2)
Quaternions can be composed to express several rotations through a quaternion product Q_{comp}, Eq.(3). Q_{comp} can also be used to express the rotation on a 3D space, Eq.(4).
Q_{comp} = Q_{a} ⊗ Q_{b} = (q_{a0},q_{a}) ⊗ (q_{b0},q_{b})  (3)
         = {q_{a0}q_{b0} − q_{a}·q_{b}, q_{a} × q_{b} + q_{a0}q_{b} + q_{b0}q_{a}}
v′ = Q ⊗ (0, v) ⊗ Q^{*}  (4)
where v is the vector to be rotated and Q^{*} = (q_{0}, −q) is the conjugate of Q.
To transform vectors from one coordinate system to another using Eq.(4), a Direction Cosine Matrix (DCM) is formed as shown in Eq.(5); it establishes the spatial relationship between the coordinate systems O_{mov} and O_{ref}, which allows the orientation to be estimated [20], [21].
DCM =
[ q_{0}^{2}+q_{1}^{2}−q_{2}^{2}−q_{3}^{2}    2(q_{1}q_{2}+q_{0}q_{3})    2(q_{1}q_{3}−q_{0}q_{2}) ]
[ 2(q_{1}q_{2}−q_{0}q_{3})    q_{0}^{2}−q_{1}^{2}+q_{2}^{2}−q_{3}^{2}    2(q_{2}q_{3}+q_{0}q_{1}) ]
[ 2(q_{1}q_{3}+q_{0}q_{2})    2(q_{2}q_{3}−q_{0}q_{1})    q_{0}^{2}−q_{1}^{2}−q_{2}^{2}+q_{3}^{2} ]  (5)
Direction cosine matrix improvement
To estimate the orientation of an object in 3D space, the DCM is updated by incremental changes whenever either O_{mov} or O_{ref} rotates. These incremental changes can be established by integrating the DCM as shown in Eq.(6).
DCM(t+Δt) = DCM(t)(I + Ω(ω)Δt)  (6)
where I is the identity matrix and Ω(ω) is the skew-symmetric matrix of the angular rate.
Because the DCM is formed from Q_{comp}, Eq.(4), it can be noted that by expressing the elements of Q_{comp} as small-angle approximations rather than identities, Q_{comp} can be reduced to Q_{red}, Eq.(7).
Q_{red} ≈ (1, σ/2) = [1, α/2, ϕ/2, θ/2]  (7)
To improve the DCM for orientation estimation, incremental changes were obtained by integrating Q_{red} over time, yielding the improved quaternion Q_{imp} as shown in Eq.(8).
Q_{imp}(t+Δt) = Q_{imp}(t) ⊗ Q_{red}  (8)
It should be noted that numerical errors in the approximations reduce the orthogonality between the rotational axes of the coordinate systems [22]. Consequently, Q_{imp} is normalized before being substituted into Eq.(5) in order to form an improved DCM, Eq.(9).
Q_{imp} = Q_{imp} / √(q_{0}^{2}+q_{1}^{2}+q_{2}^{2}+q_{3}^{2})  (9)
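The update of Eqs.(7)-(9) — small-angle quaternion, quaternion product and renormalization — can be sketched as follows. This is a minimal illustration under the small-angle assumption, not the authors' dsPIC implementation.

```python
import math

def quat_mult(qa, qb):
    # Quaternion product, as in Eq.(3)
    a0, a1, a2, a3 = qa
    b0, b1, b2, b3 = qb
    return (a0*b0 - a1*b1 - a2*b2 - a3*b3,
            a0*b1 + a1*b0 + a2*b3 - a3*b2,
            a0*b2 + a2*b0 + a3*b1 - a1*b3,
            a0*b3 + a3*b0 + a1*b2 - a2*b1)

def update_quaternion(q, omega, dt):
    """One improvement step: Q_red ≈ (1, σ/2) with σ = ω·Δt (Eq.7),
    composed with the running quaternion (Eq.8) and renormalized (Eq.9)."""
    q_red = (1.0, 0.5*omega[0]*dt, 0.5*omega[1]*dt, 0.5*omega[2]*dt)
    q = quat_mult(q, q_red)
    n = math.sqrt(sum(c*c for c in q))   # normalization, Eq.(9)
    return tuple(c / n for c in q)
```

Integrating a constant rate of π/2 rad/s about the z axis for one second yields a quaternion close to a 90-degree rotation, illustrating how the renormalization keeps the approximation stable over many steps.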
Coordinate system that describes the object movement (O_{mov})
The gyroscope output (ω_{x},ω_{y},ω_{z}) is used as the main source of information to define the coordinate system that describes object movement. O_{mov} therefore has its origin at the triaxial gyroscope, with axes along the gyroscope axes (G_{x},G_{y},G_{z}). By relating the angular rate of change over time (Δt) to the rotation during the movement of the IMU, a single estimate of the gyroscope orientation (σ_{gyro}) can be determined, Eq.(10).
σ_{gyro} = Σ(ω − ω_{off})Δt  (10)
where ω_{off} is the intrinsic offset of the gyroscope, estimated once at the beginning of the test as the mean value of the data recorded during one second in a static position.
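The offset estimation and the integration of Eq.(10) can be sketched as follows (function names are illustrative, not from the original implementation):

```python
def estimate_offset(static_samples):
    """Gyroscope offset ω_off: per-axis mean of the samples recorded
    during one second in a static position."""
    n = len(static_samples)
    return tuple(sum(s[i] for s in static_samples) / n for i in range(3))

def integrate_gyro(samples, offset, dt):
    """σ_gyro = Σ(ω − ω_off)·Δt, Eq.(10): accumulate the offset-corrected
    angular rate over time, per axis."""
    sigma = [0.0, 0.0, 0.0]
    for w in samples:
        for i in range(3):
            sigma[i] += (w[i] - offset[i]) * dt
    return tuple(sigma)
```

With a perfectly constant offset the integrated angle is exact; in practice, any residual noise in ω accumulates, which is precisely the drifting deviation discussed next.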
Note that σ_{gyro} could be used to estimate both the position and orientation of an arbitrary object or body segment by forming the DCM. However, constructing the DCM from σ_{gyro} alone is not only inefficient but also inconvenient, since any amount of noise in the angular velocity data results in integration errors and, consequently, a cumulative gyroscope drift in the DCM. Hence, an additional source of information is needed for drift compensation.
Drifting estimation
Accelerometer information (λ_{x},λ_{y},λ_{z}) defines the coordinate reference system used to assess drifting deviation. O_{ref} therefore has its origin at the triaxial accelerometer, with axes along the accelerometer axes (A_{x},A_{y},A_{z}). Assuming that the accelerometer measures only the acceleration of gravity, an appropriate convention is that the direction of gravity ĝ = [0,0,1] defines the vertical axis z [12], [23]. Thus the inertial direction vector can be computed from the normalized accelerometer measurements, Eq.(11).
λ̂ = [λ_{x}, λ_{y}, λ_{z}] / √(λ_{x}^{2}+λ_{y}^{2}+λ_{z}^{2})  (11)
Given the DCM estimated from the gyroscope measurements, a gravity vector is determined as in Eq.(12).
v̂ = [DCM_{13}, DCM_{23}, DCM_{33}]  (12)
Finally, considering that the magnitude of the cross product between two vectors is proportional to the sine of the angle between them, and that its direction is perpendicular to both [24], the drifting deviation is detected by taking the cross product between the measured gravity vector λ̂ and the estimated gravity vector v̂, which yields both the axis and the amount of rotation needed to align the gyroscope-based estimate σ_{gyro} with the accelerometer reference, Eq.(13).
σ_{accel} = λ̂ × v̂  (13)
Data integration
Slow movements are related to low noise levels [25]. Hence, drifting deviation can be corrected by adding the total amount of rotation σ_{accel} to the estimated orientation σ_{gyro}, in order to align the coordinate systems they represent, Eq.(14).
σ = σ_{gyro} + σ_{accel}  (14)
Nonetheless, σ_{accel} is susceptible to high noise levels due to variations in velocity during movement. Therefore, a portion of σ_{accel}, computed by multiplying σ_{accel} by a weight ω_{σ}, is fed to a proportional-integral (PI) controller, Eq.(15), before being added to σ_{gyro} to obtain σ, Eq.(14). In this way, a proportional amount of the rotation needed to align O_{mov} and O_{ref} is applied, instead of the total correction. Consequently, both coordinate systems are forced to converge proportionally, thereby cancelling the drift.
σ_{PI} = K_{p}e(t) + K_{i}Σe(t)Δt,  with e(t) = ω_{σ}σ_{accel}  (15)
The proportional and integral parameters (K_{p} and K_{i}) were established based on the Good Gain method for PI(D) controller tuning reported in [26], [27].
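The fusion of Eqs.(14)-(15) can be sketched as below. The gains and the weight are illustrative defaults, not the tuned Good Gain parameters of the paper:

```python
class PIFusion:
    """PI feedback fusion: a weighted accelerometer correction is passed
    through a PI controller (Eq.15) before being added to the gyroscope
    orientation (Eq.14). Gains kp, ki and weight are assumptions."""

    def __init__(self, kp=0.5, ki=0.1, weight=0.5):
        self.kp, self.ki, self.weight = kp, ki, weight
        self.integral = [0.0, 0.0, 0.0]          # running Σ e(t)·Δt

    def step(self, sigma_gyro, sigma_accel, dt):
        sigma = []
        for i in range(3):
            e = self.weight * sigma_accel[i]      # weighted correction ω_σ·σ_accel
            self.integral[i] += e * dt
            correction = self.kp * e + self.ki * self.integral[i]
            sigma.append(sigma_gyro[i] + correction)
        return tuple(sigma)
```

With σ_{accel} = 0 the gyroscope estimate passes through unchanged, while a persistent σ_{accel} pulls the output toward the accelerometer reference, which is the proportional convergence described above.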
RESULTS AND DISCUSSION
Proposed algorithms
Gyroscope and accelerometer information were used to establish two suitable coordinate systems describing the object movement, O_{mov}, and the reference coordinate system, O_{ref}. The description of an object in 3D space was assessed by estimating the spatial relationship between the coordinate systems. Commonly, this relationship is given by both the position and orientation of the object with respect to a reference system. However, if both systems overlap at the origin, there is no change of position between them [24]; therefore, an arbitrary object in 3D space can be described by its orientation alone. This orientation is determined through a DCM computed from the combination of three consecutive rotations around each of the main axes of the evaluated system using quaternions. Fig. 1 shows a block diagram representing the proposed algorithms.
In Fig. 1, it can be observed that two different solutions are proposed to estimate the amount of rotation σ, from which the quaternion Q defining the rotation in 3D space is computed: 1. estimation of σ by combining σ_{gyro} and σ_{accel} without any noise reduction, (σ_{gyro}&σ_{accel}); 2. estimation of σ by combining σ_{gyro} and σ_{accel} using a PI feedback controller, (σ_{gyro}&σ_{accel})&PI. Note that Q is used to form the DCM, from which the orientation of an object in 3D space is defined. Likewise, two different solutions are proposed to estimate the DCM: 1. estimation of the DCM from Q_{comp}, which defines the orientation through the DCM product; 2. estimation of the DCM from Q_{imp}, which defines the orientation through the quaternion product.
Fig. 1. Block diagram of the proposed algorithms: estimation of Q from σ as (σ_{gyro}&σ_{accel}) without any noise reduction; estimation of Q from σ as (σ_{gyro}&σ_{accel})&PI using a PI feedback controller; estimation of the DCM from Q_{comp}; estimation of the DCM from Q_{imp}.
Estimating differences between proposed algorithms
In order to establish the differences between the proposed algorithms, the behavior of the sensor is analyzed by estimating the orientation both in a static position and during movement, representing the orientation as Euler angles computed from the DCM, Eq.(16).
α = atan2(DCM_{23}, DCM_{33})
ϕ = asin(DCM_{13})  (16)
θ = atan2(DCM_{12}, DCM_{11})
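Eq.(16) translates directly into code; the only adjustment in this sketch is that the matrix indices are shifted to zero-based:

```python
import math

def euler_from_dcm(dcm):
    """Euler angles from the DCM, Eq.(16), with 0-based indices:
    alpha = atan2(DCM23, DCM33), phi = asin(DCM13), theta = atan2(DCM12, DCM11)."""
    alpha = math.atan2(dcm[1][2], dcm[2][2])
    phi   = math.asin(dcm[0][2])
    theta = math.atan2(dcm[0][1], dcm[0][0])
    return alpha, phi, theta
```

For the identity DCM all three angles are zero, i.e. the moving and reference frames are aligned.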
Behavior analysis of the estimated orientation in static position
For the analysis of behavior in a static position, five minutes of data were recorded, allowing the comparison of drifting deviation over time. To assess the behavior of the proposed algorithms, the orientation estimated using either σ_{gyro} or σ_{accel} alone is included in the analysis; the comparison thus covers the sensors used separately as well as fused through the proposed algorithms.
In Fig. 2, it can be observed that the orientation estimated from σ_{gyro} alone shows little noise but drifts over time. On the other hand, the orientation estimated from σ_{accel} does not drift over time but is quite noisy, especially when compared to σ_{gyro}. When analyzing the orientation estimated through either (σ_{gyro}&σ_{accel}) or (σ_{gyro}&σ_{accel})&PI, the drifting deviation appears to be cancelled and the noise reduced. Despite the considerable noise reduction of (σ_{gyro}&σ_{accel})&PI compared to (σ_{gyro}&σ_{accel}), the drifting deviation in the z axis is compensated but not eliminated, maintaining the behavior of σ_{gyro}. The main reason is that both algorithms depend on sensor fusion to compensate drifting deviation, and the accelerometer information can only be used to estimate the orientation of an object about the X and Y axes, but not the Z axis, because of its dependence on the acceleration of gravity, as reported in [11].
Fig. 2. Behavior analysis of the estimated orientation in a static position, comparing drifting deviation in the orientation estimated from σ_{gyro} and σ_{accel} independently, as well as through the proposed algorithms (σ_{gyro}&σ_{accel}) and (σ_{gyro}&σ_{accel})&PI.
Table 1. Drifting deviation after 5 minutes and RMSE of the estimated orientation in static position.

                          X axis              Y axis              Z axis
                          Drift 5min  RMSE    Drift 5min  RMSE    Drift 5min  RMSE
σ_{gyro}                  22.85^{o}   1.3e4   45.15^{o}   1.4e4   19.87^{o}   1.8e4
σ_{accel}                 0.34^{o}    0.18    0.17^{o}    0.21    0.002^{o}   0.007
σ_{gyro}&σ_{accel}        0.33^{o}    0.18    0.12^{o}    0.21    9.26^{o}    0.007
(σ_{gyro}&σ_{accel})&PI   0.001^{o}   0.04    0.10^{o}    0.05    9.21^{o}    0.007
Consequently, drifting deviation in this axis cannot be completely eliminated. To verify these assumptions, Table 1 shows a comparison of the amount of drifting deviation in degrees, as well as the amount of noise in static position, represented by the Root Mean Square Error (RMSE). Similarly, Fig. 3 shows a comparison of the orientation estimated in the Z axis, in a range of −90 to 90 degrees, using only σ_{gyro} or σ_{accel}, demonstrating the mentioned dependence on the acceleration of gravity.
Fig. 3. Comparison of the estimated orientation in the Z axis.
Fig. 4. Behavior analysis of the estimated orientation during movement in a range of motion from −90 to 90 degrees, comparing drifting deviation between σ_{gyro} and the proposed algorithm (σ_{gyro}&σ_{accel})&PI for each axis.
Behavior analysis of the estimated orientation during movement
Although σ_{accel} is susceptible to high noise levels, it can be used to estimate both the position and orientation of an arbitrary object or body segment about the X and Y axes, which is sufficient for some applications [2], [17]. However, although σ_{accel} does not drift during the static analysis, it was demonstrated that it cannot be used to estimate either the position or the orientation in full 3D space, because of its dependence on the acceleration of gravity. Despite the fact that σ_{gyro} is subject to drifting deviation, some authors suggest that it could be useful in the evaluation of body segments due to its low susceptibility to noise [2], [17]. Accordingly, for the analysis of behavior during movement, six repetitions from −90 to 90 degrees were performed for each axis, comparing σ_{gyro} with the proposed algorithm (σ_{gyro}&σ_{accel})&PI, in order to determine both the amount of drifting deviation and the noise during movement. As assumed during the static analysis, in Fig. 4 (X and Y axes) it can be noted that because of the drifting deviation in σ_{gyro}, the estimated orientation is lost over time during movement, even in a short period. On the other hand, the orientation estimated using (σ_{gyro}&σ_{accel})&PI is almost as noise-free as σ_{gyro} and as resistant to drift as σ_{accel}, demonstrating an optimal sensor fusion. In addition, although the Z axis maintains the behavior of σ_{gyro}, the sensor fusion not only compensates drifting deviation and noise, as can be seen in Table 2, but also allows the orientation of an arbitrary object or body segment in 3D space to be estimated successfully.
To represent the orientation of an object in 3D space, the DCM was formed from σ instead of using Euler angles alone, because of its singularity-free orientation representation. Considering that an arbitrary object in 3D space can be represented as a vector through the coordinate triplet O_{xyz}, the orientation of an object in 3D space is given by Eq.(17).
O_{xyz} = DCM·[x, y, z]^{T}  (17)
Table 2. Drifting deviation after 30 s and RMSE of the estimated orientation during movement.

                          X axis               Y axis               Z axis
                          Drift (30s)  RMSE     Drift (30s)  RMSE     Drift (30s)  RMSE
σ_{gyro}                  48.82^{o}   24.12^{o}  4.39^{o}    11.36^{o}  15.15^{o}   7.14^{o}
(σ_{gyro}&σ_{accel})&PI   2.59^{o}    2.75^{o}   1.32^{o}    2.98^{o}   15.15^{o}   7.14^{o}
Fig. 5. Virtual representation of the estimated orientation of an object during movement, in a range of motion from 0 to 90 degrees about the X axis, based on the sensor orientation.
Finally, O_{xyz} defines the orientation of an object in 3D space based on the sensor orientation, represented in a virtual environment as a cube performing a trajectory from 0 to 90 degrees about the X axis, as shown in Fig. 5.
Conclusion
Estimating the orientation of an arbitrary object or body segment in 3D space cannot be properly achieved using gyroscope or accelerometer information separately. Although gyroscope angular rate data can be translated into meaningful 3D information, it was demonstrated that the integration of any amount of signal noise causes a cumulative drifting deviation over time. On the other hand, accelerometer information provides an absolute orientation reference which allows the orientation with respect to gravity to be estimated. Nonetheless, it was demonstrated that the accelerometer cannot give a complete description of the orientation, because accelerometer signals do not contain information about the rotation around the vertical z axis.
Results from the present study show that the proposed algorithm, including the PI feedback controller, not only eliminates drifting deviation but also reduces noise both in static position and during movement, thereby estimating the orientation of an arbitrary object in 3D space. It should be noted that the proposed algorithm simplifies the equations by reducing the number of operations to be computed, and consequently the computational cost, in contrast to methods based on the usual computation of quaternions and rotation matrices.
Finally, considering that the parameters of the proposed algorithm were configured according to kinematic joint evaluation, the obtained results demonstrate that it is possible to estimate the instantaneous position and orientation of body segments as an arbitrary object in 3D space by merging gyroscope and accelerometer information.
The results obtained, together with the low cost, small size and low power consumption of the employed sensors, are encouraging for the design of a wearable system for ambulatory kinematic joint evaluation as future work.
ACKNOWLEDGMENT
The authors thank the Consejo Nacional de Ciencia y Tecnología (CONACyT, México) for the scholarship granted to Barraza-Madrigal J.A. and for its support of this research. The authors also thank Ing. Eladio Cardiel Perez for his valuable assistance in this study.
References
 T. Shimada, “Normal Range of Motion of Joints in Young Japanese People,” Bulletin of allied medical sciences Kobe BAMS (Kobe), 1988; 4:103109.
 D. Giansanti, V. Macellari, G. Maccioni and A. Cappozzo, "Is it Feasible to Reconstruct Body Segment 3D Position and Orientation Using Accelerometric Data?" IEEE Transactions on Biomedical Engineering, 2003; 50(4):476483.
 Sen Suddhajit, R. Abboud, D. Ming, B. Wan, Y. Liao and Q. Gong, “A motion simulation and biomechanical analysis of the shoulder joint using a whole human model,” in 4th International Conference on Biomechanical Engineering and Informatics (BMEI), 2011; 4:23222326.
 H. Dejnabadi, B. M. Jolles and K. Aminian, “A New Approach to Accurate Measurement of Uniaxial Joint Angles Based on a Combination of Accelerometers and Gyroscopes,” IEEE Transactions on Biomedical Engineering, 2005; 52(8):14781484.
 M. A. Sabatini, “Estimating ThreeDimensional Orientation of Human Body Parts by Inertial/Magnetic Sensing,” Sensors, 2011; 11(2):14891525.
 J. Bortz, “A New Mathematical Formulation for Strapdown Inertial Navigation,” IEEE Transactions on Aerospace and Electronic Systems, 1971; AES7(1):6166.
 M. Ignagni, “Optimal Strapdown Attitude Integration Algorithms,” Journal of Guidance Control and Dynamics, 1990; 13(2):363369.
 S. Tadano, R. Takeda and H. Miyagawa, “Three Dimensional Gait Analysis Using Wearable Acceleration and Gyro Sensors Based on Quaternion Calculations,” Sensors, 2013; 13(7):93219343.
 E. Bernmark and C. Wiktorin, “A triaxial accelerometer for measuring arm movements,” Applied Ergonomics, 2002; 33(6):541547.
 G. Hansson, P. Asterlan, N. Holmer and S. Skerfving, “Validity and reliability of triaxial accelerometers for inclinometry in posture analysis,” Medical and Biological Engineering Computing, 2001; 39(4):405413.
 H. Luinge and P. Veltink, “Measuring orientation of human body segments using miniature gyroscopes and accelerometers,” Medical and Biological Engineering and Computing, 2005; 43(2):273282.
 S. O. Madgwick, A. J. Harrison and R. Vaidyanathan, “Estimation of IMU and MARG orientation using a gradient descent algorithm,” in IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 2011.
 J. Luige Henk and H. Peter, “Inclination Measurement of Human Movement Using a 3D Accelerometer With Autocalibration,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2004; 12(1):112121.
 Z. Zhang, Z. Huang and J. Wu, “Hierarchical Information Fusion for Human Upper Limb Motion Capture,” in 12th International Conference on Information Fusion, Seattle, WA, USA, 2009.
 J. Favre, B. Jolles, O. Siegrist and K. Aminian, “Quaternionbased fusion of gyroscopes and accelerometers to improve 3D angle measurement,” Electronics Letters, 2006; 42(11):612614.
 InvenSense, MPU6000 and MPU6050 Register Map and Descriptions, Revision 4.0, 03/09/2012: 1315.
 E. L. Morin, S. A. Reid and J. M. Stevenson, “Characterization of Upper Body Accelerations for Task Performance in Humans,” in Proceedings of the 25th Annual Conference of the IEEE EMBS, Cancun, México, 2003.
 A. Olivares, J. Górriz, J. Ramirez and G. Olivares, “Accurate human limb angle measurement: sensor fusion through Kalman, least mean squares and recursive leastsquares adaptive filtering,” Measurement Science and Technology, 2010; 22(2):115.
 E. Bekir, “Introduction to Modern Navigation Systems,” World Scientific Publishing Co. Pte. Ltd., 2007.
 H. Schneider and G. Barker, “Matrices and Linear Algebra,” Holt Rinehart and Winston Inc. New York, 1968.
 C. Cullen, Matrices and Linear Transformations, AddisonWesley Co. Reading M.A, 1972.
 W. Premerlani and P. Bizard, Direction Cosine Matrix: Theory, 17 May 2009
 S. O.H Madgwick, “An efficient orientation filter for inertial and inertial/magnetic sensors arrays,” Technical report, Department of Mechanical Engineering, University of Bristol, 2010.
 B. Antonio, Fundamentos de Robótica, McGraw Hill, 1997.
 R. Mahony, T. Hamel and J.M. Pflimlin, “Nonlinear Complementary Filters on the Special Orthogonal Group,” IEEE Transactions on Automatic Control, 2008; 53(5):12031218.
 F. Haugen, “The Good Gain method for PI(D) controller tuning,” TechTeach, 2010: 17.
 F. Haugen, “The good Gain method for simple experimental tuning of PI controllers,” Modeling, Identification and Control, 2012; 33(4):141152.