Improving Healthcare Outcomes, One Player at a Time

Concurrent Validity of the Digital Trail Making Test

Digital cognitive testing has begun across multiple areas of practice. Simple conversions of paper-and-pencil tests to computer interfaces are not sufficient to support practice because the multiple advantages of computerized testing are not realized. This study examines the concurrent validity of two digital versions of the Trail Making Test (TMT) against the paper version and introduces a new metric to the TMT, throughput. Using iterative user-centred design, we created two software versions of the TMT. Each application runs autonomously. The Sand Trails version is aesthetically enhanced to suggest the user is drawing the trail in sand. The Asteroid Trails version is gamified with an exploration narrative and a rich visual interface. Twenty-one community-living veterans who reported post-concussion symptoms took all three versions of the test in a counter-balanced study. Concurrent validity was supported, with correlations generally above .5 for both the graphically enhanced and gamified versions of the test. Weaker correlations between the paper version and the Asteroid version were observed, suggesting the enriched visual interface places an additional cognitive load on the task. This is an opportunity to further explore how multiple constructs can be measured simultaneously through a digital interface.