This study evaluates whether home videos collected through a game-based mobile app can provide diagnostic insight into autism spectrum disorder (ASD). The researchers used automated data set annotations to analyze the gaze fixation patterns and visual scanning methods of 95 children with ASD and their neurotypical peers. They found that gaze fixation patterns differed between the two cohorts and that distinctive visual scanning patterns were present in children with ASD. A deep learning model trained on coarse gaze fixation annotations demonstrated modest predictive power in identifying ASD. The study highlights the potential of heterogeneous video data sets collected from mobile devices to quantify visual patterns and provide insights into ASD. This article was authored by Maya Varma, Peter Washington, Brianna Christman, and others.