Imagine a world where real, 'nuts and bolts' robots are operated by AI-powered Unmanned Aerial Vehicles (UAVs), which are in turn supervised by humans.

This futuristic world spans both a physical space (where the humans and robots exist) and a virtual space (where the AI-powered UAVs exist).

Using physical UAVs to control a swarm of robots in a real-world environment would be resource-intensive and logistically challenging, and this is where the "Swarm Metaverse" can play a hugely important role.

According to Dr Aya Hussein, the lead researcher on the Swarm Metaverse project at UNSW Canberra, the introduction of AI-powered virtual UAVs simplifies the role of human supervisors by significantly reducing the stress and complexity of managing several robots or UAVs at once.

"In the Swarm Metaverse we set-up AI-powered navigators that control virtual UAVs, each acting upon a sub-swarm of physical robots. Within this mixed physical-virtual world, an endless number of swarm guidance tactics can be tested, while also avoiding high costs and operational complexities of physical UAV platforms," Dr Hussein said.

A human supervisor controls the virtual UAVs in the Swarm Metaverse using gesture-based commands and views the swarm through a graphical user interface, either on a screen or through a virtual reality (VR) headset. The Swarm Metaverse allows human supervisors to use common gestures, minimizing the need for specialized training.

"Communication among people often involves gestures, for instance pointing to an object is much easier than using spoken language to describe its exact location. Gesture-based interfaces are really useful in a virtual world," Dr Hussein said.

The advantages of using the Swarm Metaverse concept

The Swarm Metaverse offers two distinct advantages.

The first is that it provides a cost-effective way to test robot and UAV swarm guidance tactics without the need for expensive physical platforms.

The second is that with every use, the Swarm Metaverse's AI continues to learn and improve, leading to increasingly higher levels of autonomy in future swarm operations.

The "sheepdog model"

In the Swarm Metaverse, a model inspired by the interactions between a sheepdog and a flock of sheep is used: the sheepdog is the UAV's virtual navigator, the sheep are the robots, and the human within the Swarm Metaverse is the farmer who instructs the sheepdog.

The principle of this model is that the more often a sheepdog herds a flock of sheep under the supervision of a farmer, the better it becomes at herding in the future; the same applies to a virtual navigator guiding UAVs and robots. The sheepdog's primary responsibility is to reduce the workload of the farmer.

"The sheepdog model enhances human performance by allowing them to function as supervisors rather than operators. This model improves the human-robot interaction, especially in complex situations," Dr Hussein said.

Adjustable levels of human control and autonomy

Autonomy levels in the Swarm Metaverse range from low to high.

At the lowest level of autonomy, humans directly guide and instruct a virtual navigator, while the highest level involves minimal human interaction once the task objective has been communicated to the virtual navigators.

Mid-level autonomy strikes a balance, allowing human influence without overwhelming the supervisor and offering flexibility in how the robot and UAV swarm is controlled.
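One simple way to picture these adjustable levels is as a configuration setting that determines how often a virtual navigator defers to its human supervisor. The sketch below is purely illustrative; the level names and the confidence threshold are assumptions, not details from the research.

```python
from enum import Enum

class AutonomyLevel(Enum):
    LOW = "low"     # human guides and instructs the navigator step by step
    MID = "mid"     # human can intervene, but the navigator mostly acts on its own
    HIGH = "high"   # human only states the task objective up front

def needs_human_input(level: AutonomyLevel, navigator_confidence: float) -> bool:
    """Hypothetical rule for when the navigator asks its supervisor for guidance."""
    if level is AutonomyLevel.LOW:
        return True                          # always defer to the human
    if level is AutonomyLevel.MID:
        return navigator_confidence < 0.7    # ask only when unsure (threshold is illustrative)
    return False                             # HIGH: proceed once the objective is set

# Example: a mid-autonomy navigator asks for help only in ambiguous situations.
print(needs_human_input(AutonomyLevel.MID, navigator_confidence=0.4))   # True
print(needs_human_input(AutonomyLevel.HIGH, navigator_confidence=0.4))  # False
```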

Dr Hussein hopes the research into the Swarm Metaverse will reduce the burden on humans in stressful, high-workload scenarios where multiple UAV or robot activities are being undertaken at once.