The abbreviation VBE stands for Visual, Bio-sensing, and Eye-tracking application. The app provides a platform not only to track and record various behaviors of participants, but also to connect and synchronize all incoming data streams from connected measurement devices such as video cameras, microphones, physiological sensors, and eye-trackers. The vision and scope of the BMSlab is to seek solutions to societal challenges by implementing state-of-the-art technology. The VBE enables us to study (effective) interaction between multiple human actors more objectively. Although synchronization of video capture and bio-sensors is available at the individual level (e.g., iMotions), such an application has been missing for analysis at the team level, for example when a manager meets with their team members. Precise synchronization of images and sound is needed to combine the data and to examine episodes or events within such contexts holistically. Moreover, technically synchronizing video codings with skin-conductance measures (e.g., using Matlab) is possible but extremely time-consuming and error-prone. With the VBE app, the different metrics are precisely synchronized, which enables the precise study of group interactions in real time. Combining video coding with sensor technology is relatively new in the subfields of management, educational, and social sciences. The VBE app provides several advantages, such as:
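The core operation behind this kind of synchronization is aligning discrete coded video events with a continuously sampled physiological stream on a shared clock. As a minimal sketch (not the VBE implementation; all function names, sampling rates, and data are hypothetical), nearest-timestamp matching can be done as follows:

```python
import bisect

def align_events(event_times, sample_times, samples, tolerance=0.05):
    """For each coded video event, find the physiological sample
    closest in time, within `tolerance` seconds on a shared clock."""
    aligned = []
    for t in event_times:
        i = bisect.bisect_left(sample_times, t)
        # Candidates: the sample just before and just after the event.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(sample_times):
                if best is None or abs(sample_times[j] - t) < abs(sample_times[best] - t):
                    best = j
        if best is not None and abs(sample_times[best] - t) <= tolerance:
            aligned.append((t, samples[best]))
        else:
            aligned.append((t, None))  # no sample close enough in time
    return aligned

# Hypothetical data: skin-conductance samples at ~32 Hz and two coded events.
sc_times = [0.00, 0.03125, 0.0625, 0.09375, 0.125]
sc_values = [1.1, 1.2, 1.3, 1.2, 1.4]
events = [0.06, 0.13]
print(align_events(events, sc_times, sc_values))
# → [(0.06, 1.3), (0.13, 1.4)]
```

Doing this by hand for every event and every sensor is exactly the time-consuming, error-prone step the text describes; an application that timestamps all streams against one clock at capture time removes it.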

  • Collects rich physiological and behavioral data by combining data from multiple sources and sensors
  • Allows for second-by-second longitudinal assessment of behavior and interactions
  • Provides more objective measures of social interaction. Instead of relying on mere perceptions of behavioral styles or stress, independent codings and sensors provide more objective information about social interaction. In doing so, common method bias, a recurring problem in social science research, is reduced.