Unity brings robotics design and learning capabilities to the metaverse.

Unity, the San Francisco-based game and 3D content creation and management platform, announced on November 10 the launch of Unity Simulation Pro and Unity SystemGraph to advance AI-powered simulation, testing, and training of complex systems.

With the increasing use of robotics in supply chains and manufacturing, such software is critical to ensuring efficient and safe operation.

Unity's senior vice president of artificial intelligence, Danny Lange, told VentureBeat in an email that Unity SystemGraph uses a node-based approach to model the complex logic commonly found in electrical and mechanical systems. "This allows roboticists and engineers to easily model small systems and group them into larger, more complex systems, allowing them to prototype systems, test and analyze their behavior, and make optimal design decisions without access to real equipment," said Lange. Unity Simulation Pro, Unity's runtime engine, provides headless rendering, eliminating the need to project every image onto a screen, which the company says increases simulation efficiency by up to 50% and reduces costs.
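The node-based composition Lange describes can be sketched very loosely in plain Python. All names below are illustrative, not SystemGraph's actual API: the idea is simply that small components are modeled individually and then wired into larger systems.

```python
# Hypothetical sketch of node-based system modeling (illustrative names,
# not Unity SystemGraph's actual API). Small components expose step(),
# and a Group chains them into a larger, more complex system.

class Node:
    """A small component that transforms an input value into an output."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn  # the component's behavior

    def step(self, value):
        return self.fn(value)

class Group(Node):
    """A larger system built by wiring smaller nodes together in sequence."""
    def __init__(self, name, nodes):
        super().__init__(name, fn=None)
        self.nodes = nodes

    def step(self, value):
        # Feed each node's output into the next, like edges in a graph.
        for node in self.nodes:
            value = node.step(value)
        return value

# Toy actuator model: voltage -> current -> torque.
driver = Node("driver", lambda volts: volts / 2.0)   # 2-ohm winding
motor = Node("motor", lambda amps: amps * 0.5)       # 0.5 N*m per amp
actuator = Group("actuator", [driver, motor])

print(actuator.step(12.0))  # 12 V -> 6 A -> 3.0 N*m
```

Grouped systems like `actuator` can themselves be nested inside larger groups, which is the prototyping-without-hardware workflow the quote describes.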

 

Robotics Use Cases 

"Unity Simulation Pro is the only product built from the ground up to provide distributed rendering, allowing multiple graphics processing units (GPUs) to simultaneously render the same Unity project or simulation environment, locally or in a private cloud," said Lange. That means Unity can today simulate multiple robots faster than real time, with tens, hundreds, or thousands of sensors.
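The idea behind distributed rendering, several workers sharing the frames of one simulation, can be loosely sketched in plain Python. Threads stand in for GPU workers here, and none of the names are Unity's actual API:

```python
# Illustrative sketch (not Unity's API): distributed rendering splits the
# frames of one simulation across several workers so they render at once.
# Threads stand in for GPUs in this toy version.

from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_id):
    # Stand-in for offscreen ("headless") rendering: no window or screen
    # is involved, so output can feed training pipelines directly.
    return f"frame-{frame_id:04d}"

def render_distributed(num_frames, num_workers):
    # Workers share the frame queue, mimicking several GPUs rendering
    # the same simulation environment simultaneously; map() keeps order.
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(render_frame, range(num_frames)))

if __name__ == "__main__":
    frames = render_distributed(num_frames=8, num_workers=4)
    print(frames[:2])  # ['frame-0000', 'frame-0001']
```

With real GPU rendering, each frame is far more expensive than this stub, which is where spreading the work across machines on a LAN or in a cloud pays off.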

According to Lange, users in markets such as robotics, autonomous driving, drones, and agricultural technology create simulations containing millions of square feet of warehouse space, dozens of robots, and environments with hundreds of sensors and models. These simulations allow users to test software in a realistic virtual world, train robot operators, or try physical integrations before implementing them in the real world. All of this is faster, more economical, and safer in the metaverse.

"A more specific use case is exploring collaborative mapping and mission planning for  indoor and outdoor robotic systems using Unity Simulation Pro," said Lange. He added that some users have built  simulated 4,000-square-foot buildings in larger forested areas and are trying to figure out how to map the environment using a combination of drones, off-road mobile robots, and walking robots. The company says it's working on enabling creators to create and simulate sensors and  mechatronic systems for use in simulations. 

The primary purpose of Unity SystemGraph is to allow people building models with physically accurate camera and lidar models, via SensorSDK, to leverage SystemGraph's library of prebuilt models and easily customize them for specific use cases.

Customers can now model at scale, iterate quickly, and run more tests to gain insight at a fraction of the cost, Unity said. Customers such as Volvo Cars, the Allen Institute for Artificial Intelligence, and Carnegie Mellon University are already seeing results, the company added.

Although several companies have built simulators specifically for AI applications such as robotics or synthetic data generation, the ease of use of Unity's development tools has been demonstrated by companies such as Roblox, Aarki, Chartboost, MathWorks, and Mobvista. Lange says this is evident in the size of Unity's existing user base of more than 1.5 million creators using its editor tools.

Unity says organizations building toward the industrial metaverse continue to push the limits of what its modeling technologies can do.

"As these simulations become more complex in terms of the size of the environment, the number of sensors used in that environment, or the number of avatars running in it, the demand for our products increases. Unity Simulation Pro's unique distributed rendering capabilities take advantage of the increased GPU computing resources available to clients in the cloud or on a LAN to render these simulations faster than real time. This is not possible with many open source rendering technologies or native Unity products; all of those render at less than 50% of real time," said Lange.

Future AI-based Technology 

Unity expects AI adoption to skyrocket by 2022, with two main drivers. "On the one hand, companies like Unity will continue to offer products that lower barriers to entry for a broader set of customers. This is combined with falling costs for compute, sensors, and other hardware components," said Lange. "On the customer side, the main trends driving adoption are widespread labor shortages and the need for operational efficiency. All of this accelerates the economics driving adoption of these technologies from both directions."

Unity has redoubled its efforts to create purpose-built products for simulation users. These allow users to mirror the real world in simulated environments with multiple sensors, avatars, and agents, greatly improving performance at a lower cost. The company says it will help customers take their first steps into the industrial metaverse.

Unity will showcase Unity Simulation Pro and Unity SystemGraph in an in-depth session at the Unity AI Summit on November 18, 2021.