FaceAPI is a real-time head-tracking engine that uses webcam input to extract 3D position and orientation coordinates from each frame of video. And it works!
I implemented it in the Blender Game Engine for the cockpit view of my BGE Air Race game. Here is how I got it working in BGE:
1. Download FaceAPI here:
LINK
2. Download FaceApiStreamer here:
LINK (it exports 6-degrees-of-freedom head tracking over a UDP socket connection)
3. Acquire the values from FaceApiStreamer in BGE with Python code like this (not sure if it is quite right, but it kinda works):
from socket import socket, AF_INET, SOCK_DGRAM, error

controller = GameLogic.getCurrentController()
own = controller.owner

if own["once"]:
    # Set the socket parameters
    host = "127.0.0.1"
    port = 29129
    # Create the UDP socket, bind it, and use a short timeout
    # so a missing packet never stalls the game loop
    GameLogic.UDPSock = socket(AF_INET, SOCK_DGRAM)
    GameLogic.UDPSock.bind((host, port))
    GameLogic.UDPSock.settimeout(0.01)
    own["once"] = False

try:
    data, svrip = GameLogic.UDPSock.recvfrom(1024)
    # FaceApiStreamer sends six space-separated floats per packet;
    # the second and third are stored as zPos/yPos respectively
    fields = data.split(' ')
    own['xPos'] = float(fields[0])
    own['zPos'] = float(fields[1])
    own['yPos'] = float(fields[2])
    own['pitch'] = float(fields[3])
    own['yaw'] = float(fields[4])
    own['roll'] = float(fields[5])
except (error, ValueError, IndexError):
    # No packet arrived this frame (or it was malformed):
    # keep the last good values
    pass
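Before wiring this into BGE, it can help to confirm that FaceApiStreamer is actually sending packets. The standalone listener below is a sketch for that, assuming the same port (29129) and the same six-floats-per-packet format the script above expects; adjust if your streamer is configured differently.

```python
import socket

def parse_packet(text):
    """Split one FaceApiStreamer packet ("x y z pitch yaw roll")
    into a list of six floats; raise ValueError on malformed data."""
    values = [float(v) for v in text.split()]
    if len(values) != 6:
        raise ValueError("expected 6 floats, got %d" % len(values))
    return values

if __name__ == '__main__':
    # Listen on the same address the BGE script binds to
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('127.0.0.1', 29129))
    while True:
        data, addr = sock.recvfrom(1024)
        print(parse_packet(data.decode('ascii')))
```

Run it with the streamer active; if nothing prints, the problem is the streamer setup, not the BGE script.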
Here is a blend file:
LINK
You will need FaceAPI installed and FaceApiStreamer running in the background.
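One more tip: raw head-tracking output tends to jitter, which is very noticeable on a cockpit camera. A simple exponential moving average per tracked value tames it. This helper is a hypothetical addition of mine, not part of the blend file; you would feed each of the six values through its own instance before writing them to the camera.

```python
class Smoother:
    """Exponential moving average for one tracked value.
    alpha near 1.0 follows the head quickly; near 0.0 smooths hard."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.value = None  # no sample seen yet

    def update(self, sample):
        if self.value is None:
            # First sample: start from it directly
            self.value = sample
        else:
            # Blend the new sample with the running average
            self.value = self.alpha * sample + (1.0 - self.alpha) * self.value
        return self.value
```

Tune `alpha` to taste: I would start around 0.5 and lower it if the view still shakes.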