I'm Stajrada Nalescu from the Retouch Lab, presenting our smart bracelet that supports tactile communication and interaction. There is a need for haptic devices that encode conversational touch in the digital domain, enabling transcription and translation of tactile sign languages (TSLs). Individuals with combined vision and hearing loss communicate through TSLs, which exist in rich variations based on geographic location and sensory impairment. The deafblind manual alphabet is the Australian TSL employed in our work.

Prior methods for supporting tactile communication via assistive haptics include Braille interfaces, haptic wearables, tactile graphics displays, and screen-based accessibility features. Haptic output devices can translate text or speech into tactile patterns actuated onto the hand. We address the additional need of digitally encoding tactile input while leaving the hand free for natural touch interactions.

To determine whether TSL gestures can be differentiated by measuring skin acceleration, we placed two 42-element sensing arrays over the back and palm of the hand to capture a subset of letters. Each letter yielded distributions of RMS acceleration that reflected its contact location and unique spectrotemporal signature. Furthermore, repetitions of the same letter show high correlations relative to distinct letters, even when considering data from only a subset of sensors near the wrist.

We then designed a smart bracelet using four sensors at the wrist and investigated its capability to classify TSL letters using common signal processing, feature extraction, and classification methods. Three-axis accelerations from each sensor channel are filtered and processed using principal component analysis, yielding a principal component that is then normalized across all four sensing channels and used to compute features. We extract 96 time-domain, frequency-domain, and spectrotemporal features.
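The per-channel pipeline described above can be sketched as follows. This is a minimal illustration, not the lab's implementation: it projects each sensor's three-axis acceleration onto its first principal axis, normalizes jointly across the four channels, and computes just three representative features per channel (peak-to-peak amplitude, RMS, and spectral centroid) rather than the full set of 96; the sampling rate `fs` is an assumed placeholder.

```python
import numpy as np

def principal_component(xyz):
    """Project three-axis acceleration (N, 3) onto its first principal axis."""
    centered = xyz - xyz.mean(axis=0)
    # SVD of the centered data; the first right-singular vector is the principal axis
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

def extract_features(channels, fs=1000.0):
    """channels: list of (N, 3) arrays, one per wrist sensor (four here).

    Returns a flat feature vector with three features per channel.
    """
    pcs = [principal_component(c) for c in channels]
    # Normalize jointly across all sensing channels
    scale = max(np.abs(np.concatenate(pcs)).max(), 1e-12)
    pcs = [p / scale for p in pcs]
    feats = []
    for p in pcs:
        # Time-domain features: peak-to-peak amplitude and RMS
        feats.append(p.max() - p.min())
        feats.append(np.sqrt(np.mean(p ** 2)))
        # Frequency-domain feature: spectral centroid of the DFT magnitude
        mag = np.abs(np.fft.rfft(p))
        freqs = np.fft.rfftfreq(len(p), d=1.0 / fs)
        feats.append((freqs * mag).sum() / max(mag.sum(), 1e-12))
    return np.array(feats)
```

In practice, summary statistics over short-time Fourier transform bands and other spectrotemporal descriptors would be appended the same way to reach the full feature set.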
Examples include peak-to-peak amplitude, spectral centroid, and summary statistics across frequency bands after computing discrete or short-time Fourier transforms. Feature vectors are passed to three common classifiers, with the support vector machine yielding the highest classification accuracy: 93% when including features from all four sensors, while features from only two sensors maintain robust performance. Future work will aim to reduce misclassifications and incorporate diverse datasets from trained TSL signers. We further envision supporting real-time interpretation of TSL, where signing can occur as fast as five letters per second. Thank you for your time.
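The classification step can be sketched with scikit-learn, assuming it is available. This is not the study's code or data: the 96-dimensional feature vectors below are synthetic stand-ins with well-separated class means, and the kernel and cross-validation settings are illustrative choices, not the authors' reported configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: 96-dimensional feature vectors for five letter classes
rng = np.random.default_rng(0)
n_per_class, n_classes, n_feats = 30, 5, 96
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_feats))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Standardize features, then fit an RBF-kernel support vector machine
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
```

Dropping columns of `X` corresponding to individual sensors would mimic the two-sensor ablation mentioned above.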