When we started OpenSim many years ago (back with MW, lbsa71 and many others who have since drifted on), we had grand ambitions. Yes, it started as a fun tinkering project that we spun out of our original activities — understanding Second Life from the code we collected in libsecondlife. But we quickly saw the potential to build a new virtual world that could scale to a larger audience than Second Life ever achieved. (Even during Second Life’s peak of 2006 to 2008, it was common knowledge that it was failing to retain new users.)
For a while, there were some big names adopting the project in droves. Nearly every major tech company had some involvement — or at least one employee contributing — to OpenSim at some point. IBM had an entire team of OpenSim developers and was running internal conferences using the project. During my involvement, the OpenSim software was downloaded hundreds of thousands of times. In the years since, it’s found its way into many surprising places, from NASA to university courses.
It’s gratifying to see OpenSim still soldiering on 12 years later, in great part through the efforts of the educators who’ve embraced it, and through worlds like OSGrid, which maintains a small but dedicated user community, along with a host of other enterprises, projects and grids using the software.
And while OpenSim didn’t become the breakout success we hoped it would, I learned a lot from it about building virtual world platforms — and about what they need. (I picked up some follow-on lessons from the years I spent building other virtual worlds for various entertainment partners too, but that’s another story – and another article.)
You can’t build just half a platform
This was our original mistake. Having the Second Life viewer around gave us a leg up — we only had to build half the project. But without any ability to modify the viewer, we could not customize OpenSim’s user experience, or make real architectural improvements. A few side projects like the Open Metaverse Project had, in hindsight, a better idea — and probably deserved a bit more support than they got.
The Second Life viewer was eventually open sourced, and later, relicensed so developers could improve it — but the combination of a monolithic viewer codebase, and too much momentum to simply clone Second Life, killed any real attempts to do it properly.
That’s not to say people haven’t tried – for example, the team over at realXtend made a strong effort at it. But ultimately, I think this was the biggest mistake we made.
Virtual worlds shouldn’t reinvent the wheel
This is true of Second Life and OpenSim, and numerous other virtual worlds and MMOs — attempting to build key features and functionality by creating them from scratch, when better options already exist.
At the time, the list of free or cheap 3D engines could be counted on one hand — Torque, Ogre3D, Irrlicht, etc. But today, we have dozens of fantastic high-end options, including Unity, Unreal, Lumberyard, CryEngine, and Unigine. And for anyone willing to shell out real cash, Unreal, CryEngine, id Tech and others have been available all along.
Building your own graphics engine from scratch, however, is a dumb idea. It’s an insanely complex piece of software. Throw in a few thousand graphics cards and chipsets, plus their various drivers, and you’ve got a recipe for a monumental compatibility and support headache, to say nothing of trying to stay current with the latest and greatest in 3D features. Going it alone just wastes a ton of talent reinventing the wheel.
Like graphics, virtual world networking is hard. Scaling a robust, high-bandwidth, real-time protocol is one of computer science’s hardest problems — tools like RakNet (now open source!) make it a lot easier. Easily half of all the effort that went into OpenSim was spent on the thorny problem of, “this protocol is complex and requires sending way too many small packets”.
But we didn’t even need to do that. Much of the problem was streaming heavy data (like textures), which is decidedly not a good fit for a UDP-based delivery system. The Internet solved this problem in 1993 with HTTP; the reason OpenSim didn’t use it for so long was the viewer, which only gained support for HTTP-delivered textures relatively late in its history.
Graphics and rendering are hard, so send that to middleware. Low-level networking and reliability are hard, so outsource that. Lots of hard scaling problems have been solved already by incredibly clever people — load balancing and mirroring make many problems go away. The standard Internet already has a pretty robust content delivery system, so use it — especially for delivering large files like textures and 3D models, which are ideally suited for technologies like HTTP.
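To make the point concrete, here’s a minimal sketch of what “use the standard Internet for assets” looks like in practice. It stands up a throwaway local HTTP server (the asset ID in the URL and the payload are purely illustrative, not OpenSim’s actual asset API) and fetches a texture blob in a single request — with standard Cache-Control headers that any off-the-shelf proxy or CDN could honour, for free:

```python
# Sketch: serving a texture asset over plain HTTP instead of a custom
# UDP protocol. Hypothetical names throughout (TEXTURE, /assets/...).
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

TEXTURE = bytes(range(256)) * 64  # stand-in for a 16 KiB texture blob


class AssetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Standard headers are all a cache or CDN needs to mirror this:
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(len(TEXTURE)))
        self.send_header("Cache-Control", "public, max-age=86400")
        self.end_headers()
        self.wfile.write(TEXTURE)

    def log_message(self, *args):  # keep the demo quiet
        pass


# Bind an ephemeral port on localhost and serve in the background.
server = HTTPServer(("127.0.0.1", 0), AssetHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/assets/some-texture-uuid"
with urllib.request.urlopen(url) as resp:
    data = resp.read()  # the whole blob arrives in one request

server.shutdown()
```

No reliability layer, no reassembly of small packets, no custom congestion logic — the HTTP stack and everything behind it (load balancers, caches, mirrors) does that work for you.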
It’s staggering to imagine the opportunity cost of so many development hours that went into feature reinvention in OpenSim that could have been spent elsewhere — like, for example, friction-free accessibility.
Which takes me to my next point:
Virtual worlds must be accessible — immediately
Even among gamers, the percentage of people willing to download and install a client, then endure a time-consuming, multi-step login process, is vanishingly small. For the same reason, web and mobile access matter too. We know from our own efforts that if you ask someone to download or install something, half of the people who sign up won’t.
Today’s consumers aren’t tied to desktops either – the web is now mobile, and I find myself using my phone more and more, switching to my desktop only to get work done. You need to be where the users are – and that, in my opinion, means friction-free and device-agnostic experiences.
Virtual worlds must be future proof
Graphics standards from 2003 are not the same as graphics standards in 2018, or, more importantly, in 2025. A virtual world must be built around that assumption. A related point: plan for form factors to evolve over time. Between 2003, when Second Life went public, and 2018, we’ve gained web apps and smartphones, video game consoles that are ever more Internet- and multiplayer-focused, and now AR and VR headsets slowly gaining traction. A virtual world that can’t grow with these platforms is destined for near-obsolescence.
IP protection needs to be baked into the virtual world from day one
The Second Life–OpenSim model relies on copying assets and storing each copy separately — so deleting copyright-infringing content became insanely difficult, since there’s no easy way to delete all the copies.
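For contrast, here’s a minimal sketch (not OpenSim’s actual data model — the class and method names are invented for illustration) of a reference-based store, where every inventory entry points at one canonical, content-addressed record. A takedown is then a single delete, rather than a hunt for every scattered copy:

```python
# Hypothetical reference-based asset store: content stored once,
# inventories hold only IDs, so a takedown removes it for everyone.
import hashlib


class AssetStore:
    def __init__(self):
        self._assets = {}  # content hash -> asset bytes

    def put(self, data: bytes) -> str:
        asset_id = hashlib.sha256(data).hexdigest()
        self._assets[asset_id] = data  # stored exactly once
        return asset_id                # inventories keep only this ID

    def get(self, asset_id: str):
        return self._assets.get(asset_id)

    def takedown(self, asset_id: str):
        # One delete; every reference now resolves to nothing.
        self._assets.pop(asset_id, None)


store = AssetStore()
ref = store.put(b"infringing mesh data")

# Two users "own" the item, but both hold the same reference.
alice_inventory = [ref]
bob_inventory = [ref]

store.takedown(ref)
# store.get(ref) now returns None for both users at once.
```

The design choice matters for IP enforcement: in a copy-per-user model, each infringing copy is an independent row to find and delete; in a reference model, ownership and content are decoupled, so enforcement acts on the single canonical record.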
Any active user of either platform knows the massive drama and IP theft allegations that plague a system where IP protections are added long after launch. Worse, this creates a strong disincentive for content creators to remain faithful to the world.
I’m convinced this is a key reason for creator burnout in the larger worlds, and why OpenSim failed to take off: Without confidence that their intellectual property is safe, much of the creator class quickly lost faith; and the treadmill of building new content constantly to stay ahead of pirates harms the mental and financial health of existing creators.
Apple’s App Store review and approval model, while far from perfect, is one OpenSim and Second Life should have followed – at least for mass distribution of content. It adds some friction to publication, but it has created a massive new community and market of successful app creators. Developers who build apps for iOS make a lot more money from their apps than those on Android, despite Android’s far larger audience.
These are just some of the lessons I’ve learned the hard way over the last 15 years, and that we’re now putting into practice in Sinespace. For instance, to strengthen our IP protection, we recently announced a Certified Creators program. I’d invite readers to have a look and offer feedback — both positive and negative. Whatever happens, there’s no doubt there’s a lot more ahead to learn and adapt in virtual worlds over the next 20 years, as there has been in the past 20.