Beyond the limits of vision, humans can recognize and classify textures through the interplay of auditory and tactile feedback when touching a surface with a tool. Inspired by this, we present a large-scale texture classification method that analyzes audio-tactile cross-modal congruence in the frequency domain, outperforming a variety of multi-modal features. For more information, please check out our paper and presentation video.
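As an illustration of the general idea (not our exact pipeline), frequency-domain congruence between an audio recording and a tactile signal can be sketched with magnitude-squared coherence; the function name `cross_modal_congruence`, the sampling rate, and the synthetic signals below are all assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import coherence

def cross_modal_congruence(audio, tactile, fs=1000, nperseg=256):
    """Estimate frequency-domain congruence between an audio signal and a
    tactile (e.g. accelerometer) signal via magnitude-squared coherence.
    Returns (frequencies, coherence); coherence values lie in [0, 1]."""
    f, cxy = coherence(audio, tactile, fs=fs, nperseg=nperseg)
    return f, cxy

# Illustrative synthetic example: two noisy observations of the same
# surface vibration should show high coherence at the shared frequency.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1 / 1000)
vibration = np.sin(2 * np.pi * 120 * t)  # shared 120 Hz texture component
audio = vibration + 0.5 * rng.standard_normal(t.size)
tactile = vibration + 0.5 * rng.standard_normal(t.size)

f, cxy = cross_modal_congruence(audio, tactile)
peak_freq = f[np.argmax(cxy)]  # expected near the shared 120 Hz component
```

In a real system the two signals would come from a microphone and a tool-mounted accelerometer, and the resulting coherence spectrum (or similar congruence features) could serve as input to a texture classifier.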