Day 7: July 21, 2010

Activities:

  1. Project Discussions
  2. Second Life -- Imported BVH files (from the mocap session) into the Gesture Warden avatar
  3. GestureCloud website mockup and discussion of requirements
  4. Continued programming of the robotic 'arm' in PureData (Jim Ruxton)

Project Discussions:

  • Harvesting physical motion via accelerometer data:
    • Accelerometer feeds physical motion data to a PureData/Python application, which will then interface with Second Life to trigger avatar/object behaviour (see the sketch after this list)
    • Accelerometer can be a raw component, or the accelerometer in an iPod touch could be used via the TouchOSC app (the simplest solution for the prototype)
  • BVH files will be used to drive avatar movement/gestures, or possibly a multi-prim object (e.g. an animal, a hand)
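
One plausible way for the PureData/Python application to talk to Second Life is an HTTP-in script on an in-world object: the object requests a URL, and the external application POSTs accelerometer readings to it. The sketch below is illustration only, not a decided design; the comma-separated body format and the hover-text response are assumptions.

    // Hypothetical LSL endpoint for the accelerometer bridge: requests an
    // HTTP-in URL that the external PureData/Python app can POST data to.
    default
    {
        state_entry()
        {
            llRequestURL();  // ask the region for a temporary HTTP-in URL
        }

        http_request(key id, string method, string body)
        {
            if (method == URL_REQUEST_GRANTED)
            {
                // body contains the granted URL; the external app POSTs here
                llOwnerSay("POST accelerometer data to: " + body);
            }
            else if (method == "POST")
            {
                // Assumed body format: "x,y,z" (comma-separated floats);
                // react to the motion data -- here we simply display it
                llSetText("accel: " + body, <1.0, 1.0, 1.0>, 1.0);
                llHTTPResponse(id, 200, "ok");
            }
        }
    }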

Second Life Activities:

  • BVH files from the motion capture session (Day 6) were provided by Tian Yue, who also operated the motion capture recording software during the session
  • The BVH files were imported into the Gesture Warden avatar, via the following process:
    • Upload an 'Animation' via the Second Life Inventory screen, and choose one of the BVH files
    • In the upload dialog, ensure that 'looping' is checked
    • After the animation is uploaded, it can be found under the Animations folder in the avatar's inventory
    • To play the animation: right-click on the animation file and choose either Play Locally (only you can see the animation) or Play in World (everyone can see the animation)
  • *Upload limitation: we discovered that BVH files must be under 30 seconds in duration; longer files are rejected by Second Life
  • Experimented with techniques for stringing together several animations:
    • Via Gestures
      • In Inventory, right-click and select 'New Gesture'
      • For each animation we wanted to include in the gesture, we had to add 3 steps: Start Animation, Wait (to let the animation play fully), and Stop Animation. Since all animations in a gesture play simultaneously by default, this ensured that the current animation had stopped before continuing to the next one
      • *Observed limitation: Second Life would not permit gestures containing more than about 25 steps
    • Using A Script
      • Added a script to an object, which could play a series of animations when touched (using an llStartAnimation call for each animation)
      • Encountered problems with llStartAnimation when referencing our custom animations by name or UUID; only built-in SL animations such as 'sit' would work (see the sketch after this list)
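
A likely explanation for the llStartAnimation problem is that the function only accepts the name of a built-in animation or of an animation sitting in the prim's own inventory, and only acts once the avatar has granted PERMISSION_TRIGGER_ANIMATION. Below is a minimal sketch along those lines; it assumes the custom animations have been copied into the object's inventory, and the 10-second wait per clip is a placeholder.

    // Sketch: play the prim's inventory animations in sequence on touch.
    list anims;      // animation names found in this prim's inventory
    integer current; // index of the clip currently playing

    default
    {
        state_entry()
        {
            // Collect whatever animations are in the prim's inventory
            integer n = llGetInventoryNumber(INVENTORY_ANIMATION);
            integer i;
            for (i = 0; i < n; ++i)
                anims += llGetInventoryName(INVENTORY_ANIMATION, i);
        }

        touch_start(integer total)
        {
            // llStartAnimation works only after the avatar grants permission
            llRequestPermissions(llDetectedKey(0), PERMISSION_TRIGGER_ANIMATION);
        }

        run_time_permissions(integer perm)
        {
            if (perm & PERMISSION_TRIGGER_ANIMATION)
            {
                current = 0;
                llStartAnimation(llList2String(anims, current));
                llSetTimerEvent(10.0);  // assumed clip length; adjust per animation
            }
        }

        timer()
        {
            llStopAnimation(llList2String(anims, current));
            current += 1;
            if (current < llGetListLength(anims))
                llStartAnimation(llList2String(anims, current));
            else
                llSetTimerEvent(0.0);  // sequence finished
        }
    }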

GestureCloud Website:

  • Showed the first mockup for the GestureCloud website (Ken Leung)
  • Discussed further requirements for the website:
    • Sections for the Toronto-based and Beijing-based phases of the project, with image documentation (sketches and photographs), summary text, and a full appendix for each
    • Video documentation section

References:

Websites:

BVH (BioVision Hierarchy) Specification:
http://www.cs.wisc.edu/graphics/Courses/cs-838-1999/Jeff/BVH.html
http://en.wikipedia.org/wiki/Biovision_Hierarchy

TouchOSC:
http://hexler.net/software/touchosc

Contacts:

Wang Limin
CAFA Digital Media Lab
(Motion Capture)

Tian Yue
yuetian.cn@gmail.com
CAFA Digital Media Lab
(Motion Capture)