In recent years, video games have evolved from analogue games to hyperrealistic digital games, and this evolution is not over: the industry now focuses on hyperimmersive virtual reality games. Affordable new devices such as head-mounted displays make it possible to experience virtual reality (VR) even at home. Applications are increasingly immersive due to a rising degree of interactivity, and the field of application ranges from gaming to serious uses such as education, training, or industrial applications. So far, most VR applications are not accessible to blind players, because they rely mainly on visual representation. Allowing players to interact with the game environment through active human echolocation, and to use it for orientation, might be of interest to both blind and sighted people; several studies have shown the potential of this approach. Human echolocation could thus improve the VR experience and provide access for the blind. We present an experiment that investigates human echolocation in the context of VR. Is it possible to localize a wall? Does it work intuitively enough to be of actual benefit in an orientation task within a purely acoustic environment? Dynamic binaural reproduction using a microphone and headphones allows participants to explore the virtual scene with self-produced sounds.
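To make the idea of binaural echo rendering concrete, the following is a minimal, illustrative sketch (not the system used in the study): a self-produced click is reflected by a virtual wall, and the echo reaches each ear with a distance-dependent round-trip delay plus an interaural time difference (ITD). The spherical-head Woodworth ITD approximation, the head radius, the sample rate, and all function names are assumptions introduced here for illustration.

```python
import math

# All constants are illustrative assumptions, not values from the study.
SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 44_100     # Hz
HEAD_RADIUS = 0.0875     # m, simple spherical-head model

def echo_delays(distance_m: float, azimuth_rad: float) -> tuple[int, int]:
    """Return (left, right) echo delays in samples for a wall at the given
    distance and azimuth (0 = straight ahead, positive = to the right)."""
    round_trip = 2.0 * distance_m / SPEED_OF_SOUND          # seconds
    # Woodworth ITD approximation for a spherical head.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(azimuth_rad) + azimuth_rad)
    # A wall to the right means the echo arrives later at the left ear.
    left = round_trip + max(itd, 0.0)
    right = round_trip + max(-itd, 0.0)
    return round(left * SAMPLE_RATE), round(right * SAMPLE_RATE)

def render_echo(click: list[float], distance_m: float, azimuth_rad: float,
                attenuation: float = 0.5) -> tuple[list[float], list[float]]:
    """Mix the dry click with its wall echo into a stereo (left, right) pair."""
    d_l, d_r = echo_delays(distance_m, azimuth_rad)
    n = len(click) + max(d_l, d_r)
    left = [0.0] * n
    right = [0.0] * n
    for i, s in enumerate(click):
        left[i] += s                       # dry, self-produced sound
        right[i] += s
        left[i + d_l] += attenuation * s   # delayed, attenuated reflection
        right[i + d_r] += attenuation * s
    return left, right
```

For example, a wall 2 m straight ahead yields a round-trip delay of about 11.7 ms, roughly 514 samples at 44.1 kHz, in both ears; turning the head shifts the ITD, which is the cue a dynamic reproduction would update in real time.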