New Cutting-Edge Eye-Tracking Technology


Labvanced has just released a cutting-edge, deep-learning-based eye-tracking algorithm that uses the webcam to track the gaze of participants. We are highly confident that it is not only much better than webgazer.js and other webcam-based eye-tracking algorithms, but we also believe its accuracy is good enough for at least 80-90% of the eye-tracking research normally done in the lab. We have made several demos, so please try them out and give us feedback on this approach. You can also use these studies as templates for your own eye-tracking research on Labvanced. Please note that this algorithm runs completely client-side and no video or face data is stored on our system, so your privacy and your participants’ privacy are guaranteed!

A few things to note:

  • Calibration currently takes 7-8 minutes. We will cut this down over time to about 4-5 minutes.
  • It requires a decent computer/laptop; we will soon provide an initial “system check” to exclude participants whose equipment is too old, or for whom it might not work for other reasons.
  • Further improvements in accuracy will most likely come in the next months, by leveraging larger training sets and improving drift correction.

If you have any questions about our new eye-tracking technology, or if you are interested in using our platform, please email us; we will also be happy to provide a free demo via Google Meet at your convenience.

Labvanced Team

Head Tracking, a New Feature From Labvanced


A new state-of-the-art feature was recently added to the platform: Head Tracking via a regular computer’s webcam or a phone’s camera. We believe this feature will be a great help to researchers in psychology.

Head motion is physically complex and carries rich information. Head gestures (nodding, shaking, tilting, tossing, dipping, thrusting, dropping, etc.) are also a way of conveying emotions. Importantly, compared to eye tracking, head tracking doesn’t need any calibration and works immediately.

Head Tracking can be used to:

1- Show whether the participant was facing the screen or whether their head was tilted left or right during the experiment.

2- Show, during any experiment, whether the participant was paying attention and directing their face towards the screen or somewhere else.

3- Get metrics about where the face is relative to the screen, e.g. centered, to the left, or to the right, as well as a relative distance to the screen.
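To illustrate the kind of metrics described in point 3, here is a minimal, hypothetical sketch of how head tilt, horizontal screen position, and relative distance could be derived from two eye landmarks in a webcam frame. This is a generic computation for illustration only, not Labvanced’s actual implementation; the function name and the `ref_eye_dist` calibration constant are assumptions.

```python
import math

def head_metrics(left_eye, right_eye, frame_width, ref_eye_dist=60.0):
    """Estimate simple head metrics from two eye landmarks.

    left_eye, right_eye: (x, y) pixel positions of the eye centers.
    frame_width: width of the webcam frame in pixels.
    ref_eye_dist: assumed inter-eye pixel distance at a reference
                  viewing distance (hypothetical calibration constant).
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]

    # Roll (tilt) angle in degrees: 0 when the eyes are level,
    # positive/negative when the head is tilted to one side.
    tilt = math.degrees(math.atan2(dy, dx))

    # Horizontal position of the face midpoint relative to the frame
    # center, scaled to [-1, 1]: negative = left, positive = right.
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    offset = (mid_x - frame_width / 2.0) / (frame_width / 2.0)

    # Relative distance: the eyes appear farther apart in pixels the
    # closer the face is to the camera, so distance scales inversely.
    eye_dist = math.hypot(dx, dy)
    rel_distance = ref_eye_dist / eye_dist if eye_dist else float("inf")

    return tilt, offset, rel_distance
```

For example, eyes at (290, 240) and (350, 240) in a 640-pixel-wide frame give zero tilt, a centered face, and a relative distance of 1.0 under the assumed reference.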

We made a demo study in our library to show our new Head Tracking capabilities.

We think that our new Head Tracking technology, combined with our visual study builder (which has almost every stimulus a psychology researcher can think of), is the perfect choice for researchers who want to use Head Tracking and build their study quickly and easily, without wasting time on complex coding. Labvanced also offers a fully integrated platform designed to host any psychological or UX study, and we are proud to say that our platform has all the features a psychology researcher needs to build any kind of study, even the most complex one.

Labvanced’s platform is also tested and trusted by many prestigious universities and labs around the world.

So if you are thinking about bringing your research to Labvanced, we will be happy to have a meeting with you via Google Meet. If you are interested, contact Leah today via email or chat to schedule a meeting.

Labvanced Team, the new generation of psychology software
