When the Dreamcast came out in the fall of 1999, most people were excited by Soulcalibur's stunning graphics and the fact that Sega eclipsed EA's long-running Madden series with a single release. But while I certainly spent hours playing Soulcalibur, I spent hundreds more breeding and evolving Chao, the cute little virtual pets cleverly hidden away in Sonic Adventure's Chao Garden. And 25 years later, virtual pets are making a big comeback in new and interesting ways.
AC Thursday
In his weekly column, Android Central Senior Content Producer Nick Sutrich covers all aspects of VR, from new hardware to new games to upcoming technologies and more.
The experience was made even better by the fact that I could put my Chao on the Dreamcast's Visual Memory Unit (VMU) and take them with me all day. I spent many hours in 8th grade racking up points to evolve my pets, and later that year Sega released Seaman – one of the weirdest virtual pets ever – a creature you could talk to (and that talked back) through a microphone attached to the controller. They're still some of my favorite gaming memories of all time.
Fast forward more than two decades and Sega may be a shadow of its former self, but virtual pets are just as popular now as they were back in the Tamagotchi days. Niantic, the folks behind the ever-popular Pokémon Go, released Peridot last year and recently followed it with a VR tie-in called Hello Dot on Meta Quest that lets you play with your virtual pets in a notably new way.
And some of the folks behind Job Simulator and Vacation Simulator are bringing modern LLM-based AI engines to NPCs and even virtual pets. That means the days of someone like Leonard Nimoy having to record hundreds or thousands of voice lines – as Sega did with Seaman – are over. Virtual pets are back, and I couldn't be happier about it.
Give me a pet
My renewed interest in virtual pets began when my then 8-year-old son got a Tamagotchi for Christmas. A year later, Neko Atsume Kitty Collector made its mixed reality debut, right after the release of Meta Quest 3, and I dreamed of being out and about with virtual pets all day and then coming home to play with them in VR.
That's where Peridot and Hello Dot come in. I reached out to Asim Ahmed, head of global product marketing at Niantic, to learn more about the company's goals for Peridot and Hello Dot and whether the company plans to further integrate both games into a cohesive virtual pet ecosystem.
I was excited about what I discovered.
While most of Niantic's games aim to get people to “get out more and explore the world around them,” as Ahmed put it, Peridot does things a little differently. There's still the classic Niantic formula that's been around since the Ingress days, where you're encouraged to walk around the real world and discover new virtual creatures and items, but Peridot blends the real and virtual worlds in a way the company hasn't pushed since the early Pokémon Go days: through your smartphone's camera.
Peridot uses your phone's spatial understanding of the real world to make it seem like your virtual pet is walking alongside you, visible only through your phone's magic window. When you're out and about, Peridot encourages you to take out your phone and play with your pet no matter where you are, and new locations might reveal items or discoveries you've never seen before.
Imagine what will happen when Google and Magic Leap launch their first Android XR-based smart glasses and you no longer have to hold a device in your hand to see your pet. You'd have it with you all day, instead of only when you remember to take out your phone and launch the app.
In some ways, Rec Room's My Little Monsters update already does this, albeit in a roundabout way. Rec Room is one of the most popular multiplayer VR games available, and part of that popularity comes from letting non-VR players join in on the fun. That means you can grow and evolve your pet on the go, then play with it at home in VR. But it's a bit more convoluted than what I'd hope for from a true Chao Garden successor.
At the moment, Hello Dot probably comes closest to that, but in its early stages it's a rather slim gaming experience that's mostly fun for a few minutes at a time. It follows the same basic gameplay as Peridot but offers a far more immersive way to interact with your pet. Instead of just tapping your phone screen to pet it, you can actually bend down and pat its head, stroke its chin, or even pick it up for a cuddle.
Until Hello Dot and Neko Atsume, few games offered that kind of interaction, mostly because the pet was trapped inside an electronic device. In some ways, that limitation still exists – the difference is that it now feels more real to your brain, because you can move around your pet and interact with it naturally, as if it were actually there.
Ironically, while Niantic is keen to get people outdoors, Hello Dot and Peridot don't have a good way to take your pet with you – at least not yet. Ahmed says Niantic has a full roadmap for Hello Dot and hopes to shape the future of the game in tandem with the mobile version, Peridot, and create new, previously unseen ways to interact with virtual pets.
The next stage of evolution
While virtual pets almost always “evolve” to the next level rather than growing the way real pets do, the way we interact with them hasn't evolved much since the 1990s. Tamagotchi let you care for your pet on the go, while games like Creatures on PC gave players a far more detailed world to play in, but we always had to click or tap to interact with them.
This is now set to change with the emergence of companies like AstroBeam, which are harnessing LLM-based AI engines to make interacting with in-game NPCs (non-player characters) feel more like interacting with real people or creatures.
The future of NPCs is coming! Introducing our multiplayer NPCs with LLM technology. Multiple people can chat and interact with the same NPC in real time in VR! pic.twitter.com/A3dpxGRNoQ – 22 August 2024
Using LLM-based AI models to drive NPC interactions isn't especially new – Skyrim players have been using ChatGPT to create smarter NPCs for over a year now – but AstroBeam wants to take the concept further and use it to make multiplayer games more inclusive.
I spoke to Devin Reimer, founder of AstroBeam, former CEO of Owlchemy Labs, and one of the minds behind Job Simulator, to learn more about how the NPC paradigm will change.
As Reimer puts it, the current way of bolting ChatGPT-style AI models onto NPCs in existing games mostly feels odd and “really awful.” A big part of that is that while it's cool to hear an NPC react in an eerily intelligent, human-like way, most implementations are missing one key expectation: actions.
That's because all of these implementations are mods of existing games, rather than games built from the ground up around AI. Reimer describes his company's work as conceptually similar to the creation of Job Simulator.
“When VR first started, people tried to put together what they already had… but it turns out that almost everything that was successful was designed from the ground up for VR.”
Talking to our phones or computers doesn't yet feel like talking to a person, but Reimer thinks that can change with realistic reactions, movements, and contextual actions and responses – especially in VR. He points out that in the video example above, the robot not only looks at the person it's talking to, but also uses hand gestures and other expressions to make the interaction feel “right.”
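AstroBeam hasn't shared how its system works under the hood, but the basic shape of the idea is easy to sketch: instead of asking the AI model for a line of dialogue alone, the game asks it for a line plus an action the engine already knows how to animate. The Python below is a rough illustration only – the shopkeeper scenario, the ask_llm stand-in, and the action names are all invented for this example:

```python
# Rough sketch (not AstroBeam's actual code): an LLM-driven NPC that returns
# an action the game engine can animate, not just a line of dialogue.
import json

ALLOWED_ACTIONS = {"wave", "point_at_player", "hand_item", "shrug", "idle"}

def ask_llm(prompt: str) -> str:
    """Stand-in for a real chat-completion API call; stubbed so the sketch runs."""
    return '{"say": "Sure, here you go!", "action": "hand_item"}'

def npc_respond(player_utterance: str) -> tuple[str, str]:
    prompt = (
        "You are a shopkeeper NPC. Reply ONLY as JSON with two keys: "
        f"'say' (one short sentence) and 'action' (one of {sorted(ALLOWED_ACTIONS)}).\n"
        f"Player said: {player_utterance!r}"
    )
    reply = json.loads(ask_llm(prompt))
    action = reply.get("action", "idle")
    if action not in ALLOWED_ACTIONS:  # never let the model invent an animation
        action = "idle"
    return reply.get("say", ""), action

line, action = npc_respond("Can I have that potion?")
print(f'NPC says: "{line}" -> engine plays animation: {action}')
```

Constraining the model to a fixed list of animations the engine actually has is the unglamorous part, but it's what turns an eerily smart chatbot into a character that appears to do things.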
When I think back to the hours I spent playing the Dreamcast version of Seaman, one of the most impressive (and disturbing) things about it was how emotionally the fish-man reacted when I spoke to it. Of course, the game didn't really “understand” what I was saying in any human sense; it mostly listened for keywords and used them to fake a human reaction.
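For the curious, that era of “understanding” was closer to a lookup table than a conversation. The sketch below isn't Sega's code and the lines are invented, but it captures the trick:

```python
# Loose approximation of 1990s-style "understanding": scan the recognized
# speech for keywords and play back a canned reaction. All lines invented.
CANNED_REACTIONS = {
    "hello": "...You again.",
    "food": "Feed me, then. I'm not going to catch it myself.",
    "name": "You gave me a name. I didn't ask for one.",
}

def seaman_style_reply(player_speech: str) -> str:
    speech = player_speech.lower()
    for keyword, reaction in CANNED_REACTIONS.items():
        if keyword in speech:
            return reaction
    return "Hmm. I have no idea what you're talking about."  # the fallback players heard a lot

print(seaman_style_reply("Do you want some food?"))
```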
What could handle that better than modern AI? LLMs like Google Gemini have proven they can understand not only what we say to them, but also drawings and things they see through cameras. So imagine how much better the virtual pet concept could be if your pet could actually understand you.
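To get a feel for what that looks like today, here's a minimal sketch using Google's generative AI Python SDK; the API key, the model name, and the snapshot file are placeholders, and SDK details change over time:

```python
# Minimal sketch: asking a multimodal LLM about something "seen" through a camera.
# Assumes the google-generativeai package and an API key; model name is illustrative.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")

# A frame your pet's "eyes" (the headset or phone camera) just captured.
frame = Image.open("living_room_snapshot.jpg")

response = model.generate_content(
    ["Describe what you see and suggest how a playful virtual pet might react to it.", frame]
)
print(response.text)
```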
I don't need my pet to talk to me like the fish in Seaman, but how cool would it be if it could understand more than a handful of simple commands and instead grow to fully understand language? That, without a doubt, is the true next generation of virtual pets.