Bots get new life in OpenSim

[Editor: Recent OpenSim upgrades have substantially expanded support for NPCs (non-player characters), also known as bots or AIs. NPCs are typically used in games, as enemies for players to defeat or as characters who help players in their quests. However, they are increasingly being used for business as well.]

I had fun over the last few weeks putting together a demo showing how immersive environments can be used to support planning and training for emergency services.

As well as showing indoor and outdoor environments, we wanted to show how we could degrade the experience, plunging the area into genuine darkness (zero illumination, not the false “midnight” setting in OpenSim or Second Life) so that people had to navigate by torch-light. Very evocative.

We also had our suspect package billowing smoke, which made it hard to see avatars even a few metres away.

(Image courtesy David Burden.)

The shot above shows a couple of policemen and emergency vehicles outside a shopping centre; our scenario focussed on a suspect package inside the centre.

Note that the policeman on the left is an osNPC [OpenSim non-player character] bot running our osNPC management package, which includes simple chatbot functionality (with an optional link to our Discourse speech engine) and navigation using a topological map.

Since the osNPC functionality was restored to OpenSim, we’ve been building a set of tools to add functionality and to help manage single and multiple bots.
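For anyone who wants to experiment, the underlying OSSL calls are straightforward. Below is a minimal sketch (not our management package, just the raw primitives) that spawns an NPC from a saved appearance notecard and walks it to a waypoint. It assumes the OSSL NPC functions are enabled in OpenSim.ini, and the notecard name is hypothetical.

    key gNpc = NULL_KEY;

    default
    {
        touch_start(integer n)
        {
            if (gNpc == NULL_KEY)
            {
                // Spawn the NPC two metres east of this object, using an
                // appearance notecard saved earlier with osOwnerSaveAppearance
                vector pos = llGetPos() + <2.0, 0.0, 0.0>;
                gNpc = osNpcCreate("PC", "Smith", pos, "PC-Appearance");
                osNpcSay(gNpc, "Constable Smith on scene.");

                // Walk to a waypoint 10 m away, staying on the ground
                osNpcMoveToTarget(gNpc, pos + <10.0, 0.0, 0.0>, OS_NPC_NO_FLY);
            }
            else
            {
                // A second touch tidies the NPC away again
                osNpcRemove(gNpc);
                gNpc = NULL_KEY;
            }
        }
    }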

While osNPC works well for crowd bots where no interaction is needed, once you start to create more NPC-type bots with whom the user interacts, you start to spot some significant limitations, like the fact that the bots don’t listen!

The best workaround we’ve found at the moment is to put an attachment on the bot which carries all the additional functionality needed, such as Listen and Sense (sketched below).
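In LSL terms the plug boils down to a couple of standard event handlers. The channel number and message format below are illustrative, not our actual protocol; the point is that the plug only senses and relays, and does no processing of its own.

    integer CONTROL_CHANNEL = -771234;   // hypothetical private channel

    default
    {
        state_entry()
        {
            // Hear public chat around the NPC wearing this attachment
            llListen(PUBLIC_CHANNEL, "", NULL_KEY, "");
            // Scan for avatars within 10 m, every 5 seconds
            llSensorRepeat("", NULL_KEY, AGENT, 10.0, PI, 5.0);
        }

        listen(integer channel, string name, key id, string msg)
        {
            // Relay the raw chat straight to the control object
            llRegionSay(CONTROL_CHANNEL, "CHAT|" + name + "|" + msg);
        }

        sensor(integer n)
        {
            integer i;
            for (i = 0; i < n; ++i)
                llRegionSay(CONTROL_CHANNEL,
                    "NEAR|" + llDetectedName(i) + "|" + (string)llDetectedKey(i));
        }
    }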

However, you then bump into another problem: NPC appearances are saved and changed as complete outfits, and an outfit includes any attachments, so every saved appearance must carry the attachment (which we call a plug). The more changes you make to the plug, and the more outfits you have, the more of a management nightmare it becomes. Our solution is to put the absolute minimum in the plug, and to run the majority of the actual processing (as opposed to the sensing) in a more accessible control object.
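Here is a minimal sketch of a matching control object, using the same illustrative channel and message format as the plug above. Because it lives in a normal in-world object rather than inside every saved outfit, it can be updated freely.

    integer CONTROL_CHANNEL = -771234;   // must match the plug
    key gNpc = NULL_KEY;                 // set when the NPC is created

    default
    {
        state_entry()
        {
            llListen(CONTROL_CHANNEL, "", NULL_KEY, "");
        }

        listen(integer channel, string name, key id, string msg)
        {
            list parts = llParseString2List(msg, ["|"], []);
            string msgType = llList2String(parts, 0);

            if (msgType == "CHAT" && gNpc != NULL_KEY)
            {
                string speaker = llList2String(parts, 1);
                // In a real system the text would be handed to the chatbot
                // here; this canned reply just shows where that call sits
                osNpcSay(gNpc, "I heard you, " + speaker + ".");
            }
            else if (msgType == "NEAR")
            {
                // e.g. turn to face, greet, or log the nearby avatar
            }
        }
    }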

One reason we’ve been able to add functionality to osNPCs so quickly is that we’ve been able to reuse our avatar interface languages, so the bots work instantly with our chatbot, route-finding, crowd and AI management tools. We have two languages: Avatar Action Markup Language (AAML), which defines the commands sent to a bot to make it do things, and Avatar Sensing Markup Language (ASML), which defines the sensations, including text chat and nearby avatars and objects, that the avatar (or rather the plug) “sees.”
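To give a flavour of the split, here is what a pair of messages might look like. This is purely illustrative: the element names are invented for this post, not the real AAML or ASML schemas. An action document flows to the bot; a sensation document flows back from the plug.

    <!-- AAML: a command sent to the bot (illustrative only) -->
    <aaml avatar="pc-smith">
      <say>Please move back behind the cordon.</say>
      <moveTo waypoint="mall-entrance"/>
    </aaml>

    <!-- ASML: a sensation reported by the plug (illustrative only) -->
    <asml avatar="pc-smith">
      <heard from="Visitor" channel="0">Which way is the exit?</heard>
      <nearby type="avatar" name="Visitor" range="2.5"/>
    </asml>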

The net result of all this is that we are rapidly getting to the stage where we can deploy quite sophisticated foreground bots, driven by our chatbot and AI apps, alongside background crowd bots, both of which make any training simulation feel far more real than an empty building.

It also means that we can look at using bots to create more socially oriented training sessions, and at using them for more conventional process and activity simulation and data-visualisation work. We’re already looking at how we could drive them using data from the VAST Challenge, an annual competition dealing with the challenges of visualising and analysing big data.

Now all we need is for the OpenSim team to implement the new ray-casting and pathfinding functionality being introduced into Second Life!