This program seems like it has potential for a "live puppeteer" setup, kind of like Disney's "Living Character Initiative" program.
If it were used for that (which I think MANY people would want to do), there would be a few things to take into consideration:
The jaw would have to be controlled either by a button on a joystick or keyboard, or by a sound-to-motion audio driver built into the software (a rough sketch of that idea is below).
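Just to illustrate the sound-to-motion idea, here is a minimal Python sketch: it reads the microphone, computes a rough loudness value per block, and maps that onto a jaw servo angle. The `sounddevice`/`numpy` packages, the angle ranges, and the send_jaw_angle() helper are all my assumptions, not anything the actual program provides.

```python
import numpy as np
import sounddevice as sd

JAW_CLOSED = 0      # servo angle (degrees) with the mouth shut
JAW_OPEN = 45       # servo angle at full open
GAIN = 300          # tune so normal speech spans the full range

def send_jaw_angle(angle):
    # Placeholder: replace with whatever serial/servo command the software exposes.
    print(f"jaw -> {angle:.1f} deg")

def audio_callback(indata, frames, time, status):
    # RMS loudness of this block of samples, scaled into a servo angle.
    loudness = float(np.sqrt(np.mean(indata ** 2)))
    angle = min(JAW_OPEN, JAW_CLOSED + loudness * GAIN)
    send_jaw_angle(angle)

# 50 ms blocks give roughly 20 jaw updates per second.
with sd.InputStream(channels=1, samplerate=16000, blocksize=800,
                    callback=audio_callback):
    input("Talking moves the jaw; press Enter to stop.\n")
```

Something like this is basically what the "moving mouth" boxes sold for Halloween props do, just in software.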
If using 2-axis eyes, you would have to find a way to control them. It could be as easy as moving the mouse left/right/up/down (sketched below), or more complicated (writing a program to "co-puppet" the eyes, making them move around naturally and look left or right when the head turns that direction).
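For the simple mouse-driven version, a sketch might look like the following: read the pointer position and map it onto pan/tilt angles for the eye servos. The `pynput` package, the screen size, the angle ranges, and send_eye_angles() are assumptions for illustration only.

```python
import time
from pynput.mouse import Controller

SCREEN_W, SCREEN_H = 1920, 1080   # adjust to the puppeteer's monitor
PAN_RANGE = 30                    # +/- degrees left/right
TILT_RANGE = 20                   # +/- degrees up/down

mouse = Controller()

def send_eye_angles(pan, tilt):
    # Placeholder for the actual servo/board command.
    print(f"eyes -> pan {pan:+.1f}, tilt {tilt:+.1f}")

while True:
    x, y = mouse.position
    # Map pixel position to -1..1, then to servo degrees.
    pan = ((x / SCREEN_W) * 2 - 1) * PAN_RANGE
    tilt = ((y / SCREEN_H) * 2 - 1) * TILT_RANGE
    send_eye_angles(pan, tilt)
    time.sleep(0.05)   # ~20 updates per second
```

The fancier "co-puppet" mode would replace the mouse read with a routine that adds small random saccades and biases the eyes toward wherever the head axis is pointed.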
Then it would be up to the user to set up a camera so they could see who they were talking to.