OpenSim and Xbox Kinect sensor integration

I had fun recently experimenting with the Xbox Kinect sensor paired with OpenSim.

The project I work on is called PLANE, Pathways for Learning Anywhere, anytime — a Network for Educators. One of our core missions is connecting educators from all sectors, such as state-run, independent and Catholic schools.

The virtual worlds are one component of the PLANE stack of features, which takes a gameful and playful approach to helping educators carry out their professional development activities.

Navigating OpenSim with Xbox Kinect. (Image courtesy Stanley Yip.)

We chose to use OpenSim as it fits nicely with our philosophy of ‘Do it. Share it. Lead it.’ It’s easy for our end users to create and build in-world objects, it’s open source, and it gives us a way to contribute back to the community around it. We were also able to integrate our single sign-on process with the Imprudence viewer so that it auto-launches and logs the user in seamlessly.
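
To give a sense of how that auto-launch works, here is a minimal sketch assuming the standard Second Life-style command-line options (-loginuri and -login) that viewers descended from the 1.x codebase, including Imprudence, accept. The install path, grid URI and credentials below are placeholders; in our setup the credentials come from the single sign-on service rather than being hard-coded.

    # Minimal auto-launch sketch. Assumes Imprudence accepts the usual
    # Second Life-style flags -loginuri and -login; the path, URI and
    # credentials are placeholders, not our production configuration.
    import subprocess

    VIEWER = r"C:\Program Files\Imprudence\imprudence.exe"  # placeholder install path
    LOGIN_URI = "http://grid.example.org:8002/"             # placeholder OpenSim login URI

    def launch_viewer(first_name, last_name, password):
        # -loginuri points the viewer at the OpenSim grid;
        # -login supplies the avatar credentials so the user skips the login screen.
        subprocess.Popen([
            VIEWER,
            "-loginuri", LOGIN_URI,
            "-login", first_name, last_name, password,
        ])

    launch_viewer("Jane", "Educator", "token-from-sso")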

Using the Xbox Kinect sensor with OpenSim opens up a whole range of fun and exciting educational possibilities.

“As distinct from off-the-shelf games that have the goals already programmed in, using OpenSim and Kinect allows you to easily tailor-make experiences to suit your own unique classroom setting,” said Brendan Jones, one of the physical education teachers on our team. “OpenSim makes it easy to build your own virtual environments.”

We also see this as something to introduce into professional development sessions for educators. Teachers need to have fun too.

Flying in OpenSim using body movement. (Image courtesy Stanley Yip.)

The two methods I played with were FAAST (Flexible Action and Articulated Skeleton Toolkit) and SLKinect2. There are some really smart and creative people out there making these tools! Both are easy to follow and have fun results.

The hardware needed was an Xbox Kinect sensor bar, a USB power cable for the Kinect sensor, a PC and access to OpenSim. Each of the two methods took about 30 minutes to set up.

FAAST is quite configurable. In fact, it was so configurable that some of my gestures overlapped, which confused the Kinect sensor and triggered unwanted movements. With some trial and error I was able to fine-tune the gesture list to what I wanted. Setting up walk, start fly, stop fly, turn left, turn right, go backwards, go forwards and wave hello was simply a matter of drag and drop; the sketch below gives a feel for what those gesture-to-key bindings amount to.
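
This is a rough illustration of the idea behind those bindings, not FAAST's actual configuration format or API: the gesture names and the read_gesture helper are hypothetical stand-ins for the Kinect skeleton tracking that FAAST does for you, and the key presses simply go to whichever window has focus (the viewer).

    # Illustrative only: not FAAST's real configuration or code.
    # The idea is to hold down the viewer's movement key while the
    # matching body gesture is detected, and release it otherwise.
    import time
    import pyautogui  # sends key events to the focused window

    # Hypothetical gesture names mapped to standard viewer movement keys.
    GESTURE_TO_KEY = {
        "lean_forward": "w",        # go forwards
        "lean_back": "s",           # go backwards
        "lean_left": "a",           # turn left
        "lean_right": "d",          # turn right
        "raise_both_arms": "home",  # toggle fly
    }

    def read_gesture():
        """Hypothetical stand-in for FAAST's Kinect skeleton tracking."""
        return None

    def run():
        held = None
        while True:
            key = GESTURE_TO_KEY.get(read_gesture())
            if key != held:
                if held:
                    pyautogui.keyUp(held)   # release the previous movement key
                if key:
                    pyautogui.keyDown(key)  # hold the new movement key
                held = key
            time.sleep(0.05)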

SLKinect2 enabled real-time puppeting of avatar movements. The movements you make in real life are picked up by the Kinect sensor and mirrored onto the avatar in OpenSim, and it is very accurate and responsive. FAAST and SLKinect2 are independent pieces of software, but if they were somehow melded together the range of possibilities would be even greater.
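
Purely as an illustration of the puppeting concept, and not SLKinect2's actual code or protocol, the loop below shows the basic idea: every frame, copy the joint rotations the Kinect tracks onto the matching avatar bones in-world. Every name here is a hypothetical placeholder.

    # Conceptual sketch only: the helpers are hypothetical placeholders,
    # not SLKinect2 functions.
    def read_kinect_joints():
        """Placeholder for a Kinect skeleton frame: {joint_name: rotation}."""
        return {}

    def set_avatar_bone(joint_name, rotation):
        """Placeholder for applying a rotation to the avatar bone in OpenSim."""
        pass

    def mirror_one_frame():
        for joint_name, rotation in read_kinect_joints().items():
            set_avatar_bone(joint_name, rotation)  # avatar bone follows the real joint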
