They have a good instructional manual, and the initial paper linked above documents the various HRV methodologies very well. Regarding error correction, they use a cubic spline approach:
In order to replace invalid beats, the software therefore provides automatic correction of the RR series, inspired by the algorithm developed by Kamath and Fallen (Kamath and Fallen, 1995). First, false beats are detected using Cheung's algorithm (Cheung, 1981): a high and low threshold are set for the relative variation in successive RR intervals (+32.5% and −24.5%, respectively). Second, for each detected error, the number of missing beats is estimated by comparing the total time duration within the error period with the duration of the immediately preceding beat. If the number of beats to recalculate is 3 or less, a cubic spline interpolation is done. This configuration generally originates from a single supraventricular or ventricular ectopic beat, or from an isolated R-peak misdetection. For 4 or more successive errors, the missing beats are interpolated by copying and inserting the same number of previous RRs between the first and last valid RR. Although these corrections generally result in a clean RR signal, this type of automatic algorithm can produce inconsistent values when the original EKG or RR signal is too corrupted, with lengthy portions of successive false beats. It is therefore recommended to visually inspect and review each series before performing HRV analyses. In addition to RR correction, the software enables excluding parts of the signal from analysis, and the number of corrections is displayed in the results.
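As a rough illustration of the detect-then-interpolate scheme described above, here is a minimal Python sketch. This is not the HRVanalysis source code; the function and parameter names are mine. It flags beats using the +32.5%/−24.5% relative-change thresholds and spline-interpolates short runs of flagged beats; the copy-and-insert branch for 4 or more successive errors is omitted.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def correct_rr(rr_ms, high=0.325, low=0.245, max_spline=3):
    """Sketch of threshold-based RR artifact detection and cubic
    spline correction (hypothetical implementation, not HRVanalysis).

    A beat is flagged when the relative change from the previous RR
    exceeds +32.5% or falls below -24.5%. Runs of up to `max_spline`
    flagged beats are replaced by cubic spline interpolation over the
    surrounding valid beats.
    """
    rr = np.asarray(rr_ms, dtype=float)
    rel = np.diff(rr) / rr[:-1]
    bad = np.zeros(len(rr), dtype=bool)
    bad[1:] = (rel > high) | (rel < -low)

    idx = np.arange(len(rr))
    spline = CubicSpline(idx[~bad], rr[~bad])
    corrected = rr.copy()

    # walk through contiguous runs of flagged beats
    i = 0
    while i < len(rr):
        if bad[i]:
            j = i
            while j < len(rr) and bad[j]:
                j += 1
            if j - i <= max_spline:
                corrected[i:j] = spline(idx[i:j])
            # longer runs would use the copy-and-insert rule (not shown)
            i = j
        else:
            i += 1
    return corrected, bad
```

For example, a single missed beat in an otherwise steady 800 ms series shows up as one ~1600 ms interval; the function flags it (and the over-threshold drop back down) and splines both samples back toward 800 ms.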
Although I will not go through a full tutorial of this software, a few things should be pointed out. In the preferences, there is an option to automatically correct artifacts. Under most circumstances this should be checked, unless one intends to do beat-by-beat correction (which is possible).
Here is what the RR series looks like in the program:
This was from my Polar H10 indoor cycling recording from the last post. The artifacts are in red. The boxed areas are the 5 minute intervals at 170 vs 190w, as well as the last 2 minutes of each section used for computation.
Here is what a cleaner file looks like with only one artifact (arrow). The RR intervals are all about 500 msec:
In the last post, I compared 5 minutes of cycling at about 170 vs 190 watts, looking at simultaneous tracings of an artifact-free recording (Hexoskin) vs the Polar H10 (1-4% artifacts).
A close up of the last 2 minutes of the 190w section (20w above VT1) looks like this:
- Each blue spike upward is an artifact. The RR series is plotted over time. If a beat is missed, the RR gap widens, hence the blue spike in RR time. Since this is a 2 minute window at a HR of 137 bpm, there are 274 beats. Factoring in 6 artifacts (counting the 4th blue spike twice), the percent artifact would be about 2.2% - well within the "acceptable" range.
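The arithmetic above is easy to reproduce; a one-line helper (the function name is mine) using the figures from the text:

```python
def artifact_percent(hr_bpm, window_min, n_artifacts):
    """Percent of beats in a window that are artifacts,
    estimating total beats as mean HR x window length."""
    beats = hr_bpm * window_min
    return n_artifacts / beats * 100

# 2 minute window at 137 bpm -> 274 beats; 6 artifacts -> ~2.2%
print(round(artifact_percent(137, 2, 6), 1))  # 2.2
```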
With no artifact correction, the DFA a1 was about .9 with a HR of 137 bpm:
- After automatic cubic spline artifact correction, the analysis changes.
- The DFA a1 decreases to .759:
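For readers curious what the DFA a1 index actually measures, here is a minimal sketch of the standard short-term detrended fluctuation analysis recipe (window sizes of 4-16 beats). This is not the Kubios or HRVanalysis implementation, just the textbook algorithm they are based on.

```python
import numpy as np

def dfa_alpha1(rr_ms, scales=range(4, 17)):
    """Short-term DFA (alpha-1) of an RR series - illustrative sketch.

    Integrates the mean-centered series, linearly detrends it within
    non-overlapping windows of 4-16 beats, and returns the log-log
    slope of fluctuation vs window size.
    """
    rr = np.asarray(rr_ms, dtype=float)
    y = np.cumsum(rr - rr.mean())          # integrated profile
    scales = np.array(list(scales))
    fluct = []
    for n in scales:
        n_win = len(y) // n
        segs = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        resid = []
        for seg in segs:                   # linear detrend each window
            coef = np.polyfit(t, seg, 1)
            resid.append(seg - np.polyval(coef, t))
        fluct.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope
```

An uncorrelated (white noise) RR series yields a1 near .5, while strongly correlated series push a1 toward 1 and above, which is why spline-smoothed corrections that add local correlation can inflate the index.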
A 30 second earlier window (same net length) shows similar results:
How does the same sequence look with no artifacts?
The same 2 minute block of 190w cycling (about 20w above VT1 for 5 minutes) - Hexoskin tracing with no artifacts:
No RR spikes as expected, steady RR times.
The DFA a1 and other HRV parameters came out to this:
- The DFA a1 is .498 with a mean HR of 137 bpm.
- If the 2 minute window is shifted 30 seconds earlier the DFA a1 result is about the same:
- The bottom line is that artifact correction done by Kubios automatic, Kubios threshold, or a different HRV software using cubic spline interpolation will all lead to elevation of the DFA a1 index compared to a recording with no artifacts (Hexoskin). This seems to be a limitation of the artifact correction methods and not an intrinsic bug in Kubios.
- The HRVanalysis software is a powerful tool and in many respects comparable to the free version of Kubios. The team responsible for its creation and maintenance should be proud of their work.
- As discussed in the previous post, reliance on the DFA a1 change from correlated to uncorrelated white noise values to delineate training zone 1 to 2 transition (VT1) could be of questionable accuracy with current artifact correction methods.
- It is beyond the scope of this brief post to speculate on what limit of artifacts is reasonable for accurate DFA a1 results. However, the Polar tracings in this and the previous post had 1-4% artifacts, which is generally deemed acceptable. It clearly is not.
- The effects of artifacts on the DFA a1 decline associated with higher power intensities may not be as critical, since we are less interested in zone 3 effects.