Deafblind people have both vision and hearing impairments, which create significant communication challenges and make navigating digital systems impossible without expensive peripherals. I'm David Kuttner, and I'm presenting a study in which a vibrotactile interface for Android was evaluated by two deafblind participants.

High cost is a key factor restricting access to assistive devices such as hearing aids and Braille displays. Scott Davert, a deafblind professional who gave feedback on the project, explained that the average person pays $400 to $1,000 for a mobile phone, but his Braille display costs a minimum of $1,800 and as much as $5,000.

We present Morse Input and Output, or MIO, a low-cost, easy-to-learn vibrotactile interface for deafblind users' interactions with computers. The interface's design, including the choice to use Morse code, was informed by interviews with professionals and academics who support, educate, and study deafblind people and deafblindness. I encourage you to read this section of the paper if you're planning to do work relating to accessibility: their feedback was invaluable, and much of it applies more broadly than to just this project.

The interface was integrated into an app that allows the user to perform a simple text recognition and entry task. Tapping the top of the screen emits the target word as a sequence of vibrations, and the user then enters the same word using a keypad across the bottom half of the screen with three buttons: a dot, a square, and a dash. The diagram on the right shows the output vibrations for the word TEA: a dash, a dot, and a dot-dash, for the letters T, E, and A respectively.

Our experiments involved two deafblind users, AC and WK, both recruited through the Anseldyn Foundation in Ireland, which supports deafblind people.
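The vibration output described above can be sketched as follows. This is illustrative only: the timing constants and the subset of letters in the Morse table are assumptions, not values from the MIO paper, and the function name is hypothetical.

```python
# Sketch of Morse-to-vibration encoding (illustrative; timings are assumed,
# not taken from the MIO paper).

MORSE = {"T": "-", "E": ".", "A": ".-", "C": "-.-.", "S": "..."}

DOT_MS = 100         # assumed duration of a short vibration
DASH_MS = 300        # assumed duration of a long vibration
GAP_MS = 100         # assumed pause between symbols within a letter
LETTER_GAP_MS = 300  # assumed pause between letters

def word_to_vibration_pattern(word):
    """Convert a word into an alternating [off, on, off, on, ...] list of
    millisecond durations, the pattern format Android's Vibrator expects."""
    pattern = []
    for i, letter in enumerate(word.upper()):
        pattern.append(LETTER_GAP_MS if i else 0)  # off-time before letter
        for j, symbol in enumerate(MORSE[letter]):
            if j:
                pattern.append(GAP_MS)             # off-time between symbols
            pattern.append(DOT_MS if symbol == "." else DASH_MS)
    return pattern
```

For "TEA" this yields a long buzz (T), a short buzz (E), then a short and a long buzz (A), with pauses between letters.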
In the experiments, the users were asked to perform the recognition and input tasks first on individual letters, then on words, building up to the short sentence "the cat eats". Participants were then interviewed about their experience of the system, following the questionnaire from the System Usability Scale (SUS).

Our results were mixed. Participant AC said the system was hard to use, but that "if staff helped me, I can learn to do it." She scored the system at 4 out of 100 points on the SUS scale. Participant WK, who was more proficient with technology before the experiment, repeated the experiment a second time. She scored 57.5 points following the first experiment and 75 points following the second. She explained, "I feel very confident because I know where everything is," and said she would like to use the keypad for typing messages when she does not have her Braille keyboard with her. This scatterplot shows the improvement with experience of both participants; note the sharp decrease in time taken to enter target words for WK during the second experiment.

In conclusion, MIO is a cheap and learnable interface for deafblind people, and a viable fallback method for deafblind people with smartphones. Future work should include integration into the operating system; WK suggested vibrating the sender's identity when a message is received. You can find the code for this project on GitHub. Thank you.
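For readers unfamiliar with how the SUS scores quoted above are derived, the standard scoring procedure can be sketched as follows. The example responses in the usage note are made up for illustration; they are not the participants' actual questionnaire answers.

```python
# Standard System Usability Scale (SUS) scoring: ten 1-5 Likert responses
# are converted to a 0-100 score.

def sus_score(responses):
    """responses: ten answers (1-5) to the SUS questionnaire, in order.
    Odd-numbered items contribute (answer - 1); even-numbered items
    contribute (5 - answer); the sum is multiplied by 2.5."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for item, answer in enumerate(responses, start=1):
        total += (answer - 1) if item % 2 == 1 else (5 - answer)
    return total * 2.5
```

For example, all-neutral answers (a 3 on every item) produce a score of 50.0, which helps put AC's score of 4 and WK's scores of 57.5 and 75 in context.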