Remote teleoperation using Virtual Reality (VR) allows users to experience high degrees of immersion and embodiment. Equipped with a variety of sensors, VR headsets have the potential to offer automatic adaptation to users' personal preferences and modes of operation. However, to achieve this goal, VR users must be uniquely identifiable. In this paper, we investigate the possibility of identifying VR users teleoperating a simulated robotic arm by their forms of interaction with the VR environment. In particular, in addition to standard head and eye data, our framework uses hand-tracking data provided by a Leap Motion sensor. Our first set of experiments shows that it is possible to identify users with an accuracy close to 100% by aggregating the session data and training/testing with a 70/30 split. Our second set of experiments shows that, even when training and testing on separate sessions, it is still possible to identify users with a satisfactory accuracy of 89.23%.
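The aggregated-session identification setup described above can be sketched as a standard supervised classification pipeline. The snippet below is an illustrative sketch only, not the paper's implementation: the choice of a random-forest classifier, the synthetic feature data standing in for head/eye/hand tracking features, and all parameter values are assumptions for demonstration.

```python
# Illustrative sketch: identify users from tracking-derived features
# using a stratified 70/30 train/test split over aggregated session data.
# NOTE: the feature data here is synthetic; in the paper's setting the
# features would come from head, eye, and Leap Motion hand-tracking streams.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, samples_per_user, n_features = 5, 200, 12  # assumed sizes

# Synthetic stand-in: each user's interaction style yields a distinct
# feature distribution (here, a different mean per user).
X = np.vstack([
    rng.normal(loc=u, scale=0.5, size=(samples_per_user, n_features))
    for u in range(n_users)
])
y = np.repeat(np.arange(n_users), samples_per_user)  # user identity labels

# Aggregate all sessions, then hold out 30% for testing, stratified by user.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"identification accuracy: {clf.score(X_te, y_te):.2f}")
```

The cross-session experiment differs only in the split: instead of a random 70/30 partition, all samples from one session would form the test set while the remaining sessions form the training set, which is a harder setting and explains the lower (though still high) accuracy.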