Welcome to our presentation on distributed virtual reality for hearing research. Interactivity is evident not only in speaker changes, but also in non-verbal features such as eye, head, and autonomous movements, as well as in facial expressions.

Modern hearing devices increasingly interact with the movement behavior of their users; examples are highly selective beamformers that provide the largest benefit in the frontal direction. Interactive communication behavior is therefore increasingly studied in hearing aid research, for example in our gesture lab, where audio-visual environments are reproduced with a 300-degree video projection and via 45 loudspeakers arranged on a sphere.

Conducting such experiments remotely is not possible with conventional video-conferencing systems. To overcome these limitations, we developed a system for distributed audio and sensor-data transmission, the OV box system. All components are optimized for minimal acoustic delay and for the best audio quality. The headphones can be equipped with a motion tracker for interactive head-tracked binaural rendering, but also to transmit behavioral data to other peers for data logging or for remote interactive audio-visual rendering. Other biophysical sensors can be used as well, as long as they can send their data as UDP messages.

The system was initially developed during the pandemic with music applications in mind, but it has also been used in research. In a study by Marie Hartwig, it was shown that the combination of virtualization and telepresence had no effect on movement behavior.

We would like to thank our lab team for their support, the Deutsche Forschungsgemeinschaft for funding this research, and you for your attention.
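The sensor interface mentioned in the talk can be sketched briefly: any biophysical sensor qualifies as long as it emits its readings as UDP messages. The sketch below, in Python, shows a minimal sender under assumed details; the host, port, and JSON message format are illustrative choices, not the actual OV box protocol.

```python
import json
import socket

def send_sensor_reading(sock, host, port, reading):
    """Serialize one sensor reading and send it as a single UDP datagram.

    The JSON payload here is a hypothetical format for illustration;
    the real system only requires that the data arrive as UDP messages.
    """
    payload = json.dumps(reading).encode("utf-8")
    sock.sendto(payload, (host, port))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example: send one heart-rate sample to a local receiver
    # (address and port are placeholders).
    send_sensor_reading(sock, "127.0.0.1", 9877, {"sensor": "hr", "bpm": 72})
    sock.close()
```

Because UDP is connectionless, a sensor can start streaming without any handshake, which keeps latency low at the cost of no delivery guarantee.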