
Robotics simulator

Modelling is the process of producing a model: a physical, mathematical, or logical representation of a system, entity, phenomenon, or process that captures its construction and working. A model is similar to, but simpler than, the real system it represents, and it helps the analyst predict the effect of changes to that system. Simulation is the operation of a model over time or space, and it helps analyze the performance of an existing or proposed system.

 

A robotics simulator is a simulator used to create applications for a physical robot without depending on the actual machine, thus saving cost and time. In some cases, these applications can be transferred onto the physical robot (or rebuilt) without modifications.

 

The last five years have marked a surge in interest in and use of smart robots, which operate in dynamic and unstructured environments and may interact with humans. Emerging artificial intelligence techniques will endow the next generation of robots with mobility and decision-making skills. These robots will be flexible and reconfigurable; interact with humans; and operate in environments that are unstructured, uncertain, and rapidly changing in time. They are expected to assume new roles such as operating on highways as autonomous vehicles, assisting social workers in nursing homes, tutoring young learners in schools, managing oil spills underwater, working in the adverse and cluttered environments of search-and-rescue missions, and remotely performing surgeries. While physically testing these robots before deployment is mandatory, simulation can accelerate the engineering design process, make it more cost effective, and enable more thorough testing.

 

Robotics simulators are invaluable tools that allow developers to rapidly and inexpensively design, prototype, and test robots in a controlled environment without the need for physical hardware. Simulation is particularly promising for verification and validation (V&V) of robotic systems, potentially providing an automated, cost-effective, and scalable alternative to the manual and expensive process of field testing.

 

The term robotics simulator can refer to several different robotics simulation applications. For example, in mobile robotics applications, behavior-based robotics simulators allow users to create simple worlds of rigid objects and light sources and to program robots to interact with these worlds. Behavior-based simulation allows for actions that are more biological in nature when compared to simulators that are more binary, or computational. In addition, behavior-based simulators may “learn” from mistakes and are capable of demonstrating the anthropomorphic quality of tenacity.
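
To make the behavior-based idea concrete, here is a minimal, simulator-agnostic sketch of a Braitenberg-style light-seeking behavior in a toy 2D world; all names and constants are illustrative rather than taken from any particular tool.

```python
import math
from dataclasses import dataclass

# Toy behavior-based simulation sketch: a two-sensor robot steers toward a
# light source (Braitenberg vehicle style). All names/values are illustrative.

@dataclass
class Robot2D:
    x: float
    y: float
    heading: float  # radians

LIGHT = (5.0, 5.0)  # position of the light source in the toy world

def sensor_reading(robot: Robot2D, offset: float) -> float:
    """Light intensity at a sensor mounted at +/- offset radians from heading."""
    sx = robot.x + 0.2 * math.cos(robot.heading + offset)
    sy = robot.y + 0.2 * math.sin(robot.heading + offset)
    d2 = (LIGHT[0] - sx) ** 2 + (LIGHT[1] - sy) ** 2
    return 1.0 / (1.0 + d2)  # brighter when closer

def step(robot: Robot2D, dt: float = 0.1) -> None:
    left = sensor_reading(robot, +0.5)
    right = sensor_reading(robot, -0.5)
    # Crossed excitation: the left sensor drives the right wheel and vice
    # versa, so the robot turns toward the light.
    v_left, v_right = 2.0 * right, 2.0 * left
    robot.heading += (v_right - v_left) * dt
    speed = 0.5 * (v_left + v_right)
    robot.x += speed * math.cos(robot.heading) * dt
    robot.y += speed * math.sin(robot.heading) * dt

robot = Robot2D(0.0, 0.0, 0.0)
for _ in range(200):
    step(robot)
print(f"final position: ({robot.x:.2f}, {robot.y:.2f})")
```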

 

There are two time-consuming stages in the design of a new robot: the mechanical design and the control policy design. The former is concerned with producing a solution that can execute a predefined set of tasks. The latter is concerned with endowing the robot with smarts to actually carry out the tasks that it has the potential to execute. For both mechanical and control policy design, one typically produces prototypes that iteratively improve on previous versions until a certain prototype is acceptable; this prototype becomes a candidate solution. The iterative process to produce the candidate solution is time-consuming.

 

Additionally, it can be expensive, unsafe (for humans or the robot hardware), and sometimes impractical (if designing a rover for Mars, testing cannot be done in Martian conditions). Moving from a model-free to a model-based design approach, i.e., carrying out the iterative loop in simulation, can reduce the time associated with the design process. Indeed, changing a rover suspension design to assess trafficability in simulation can be as fast as modifying a handful of parameters in a template file that defines the geometry of the vehicle. By comparison, physically modifying the suspension of a prototype is significantly more time-consuming.
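
As a hedged illustration of how lightweight such a change can be, the snippet below edits a few suspension-related parameters in a hypothetical URDF-style vehicle template; the file name, joint naming convention, and numeric values are assumptions for illustration, not part of any specific tool.

```python
import xml.etree.ElementTree as ET

# Sketch: tweak suspension parameters in a hypothetical vehicle template.
# "rover_template.urdf", the joint names, and the values below are assumptions.
tree = ET.parse("rover_template.urdf")
root = tree.getroot()

new_damping = "120.0"   # N*s/m, assumed value
new_stiffness = "9000"  # N/m, assumed value

for joint in root.iter("joint"):
    # Assumed naming convention: suspension joints contain "suspension".
    if "suspension" in joint.get("name", ""):
        dynamics = joint.find("dynamics")
        if dynamics is not None:
            dynamics.set("damping", new_damping)
            dynamics.set("friction", "0.5")

# Some simulator-specific extensions carry spring stiffness; update if present.
for elem in root.iter("spring_stiffness"):
    elem.text = new_stiffness

tree.write("rover_candidate_v2.urdf")
print("wrote rover_candidate_v2.urdf with updated suspension parameters")
```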

 

Additional time and cost savings are realized in the subsequent stage, the testing of the candidate design. Before it becomes the solution, the candidate design is subjected to extensive additional testing in which a limited collection of candidate-design clones is assessed via a predefined evaluation process. Physically building the collection of candidate designs is time-consuming and costly, since there is no assembly line ready yet and each clone is handcrafted. In simulation, testing the candidate might be as simple as copying the model files to different folders and conducting the predefined evaluation process using high-throughput, parallel computing.
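
A minimal sketch of that clone-and-evaluate workflow might look as follows, assuming a hypothetical headless evaluation script (run_evaluation.py) that prints a scalar score; the folder layout and command-line flags are assumptions.

```python
import shutil
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

# Sketch of high-throughput candidate evaluation: clone the model files into
# per-trial folders and run a (hypothetical) headless evaluation in parallel.
TEMPLATE_DIR = Path("candidate_design")   # assumed folder holding model files
N_TRIALS = 16

def evaluate(trial: int) -> float:
    trial_dir = Path(f"trials/trial_{trial:03d}")
    shutil.copytree(TEMPLATE_DIR, trial_dir, dirs_exist_ok=True)
    # "run_evaluation.py" is a placeholder for the predefined evaluation
    # process (e.g., a headless simulation with a trial-specific seed).
    result = subprocess.run(
        ["python", "run_evaluation.py", "--model-dir", str(trial_dir),
         "--seed", str(trial)],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())  # assumed: the script prints a score

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=8) as pool:
        scores = list(pool.map(evaluate, range(N_TRIALS)))
    print(f"mean score over {N_TRIALS} trials: {sum(scores) / len(scores):.3f}")
```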

 

Approaches to verification of autonomous systems are in their infancy. Approaches to verification and “debugging” of autonomous robotic systems that learn online are essentially nonexistent. Repeatability, particularly with respect to stress/corner cases; full control of the experiment insofar as the “environment” is concerned; and the lack of risk to human and hardware damage are three attributes that can make simulation instrumental in establishing principled protocols for autonomous system verification and, by extension, industry standards and guidelines. Against this backdrop, as autonomous systems cannot foresee unknowns that they may encounter, novel formal verification schemes can be developed to verify specifications such as safety in real time under assumed uncertainty in the system and its environment.
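
As a toy illustration of the real-time flavor of such checks, the sketch below monitors a minimum-clearance specification at every step under an assumed worst-case uncertainty bound; it is a simplistic runtime monitor, not a formal verification scheme, and all values are assumptions.

```python
# Illustrative runtime safety monitor: flag a violation of a minimum-distance
# specification while accounting for an assumed bound on sensing uncertainty.
D_MIN = 0.5      # required clearance in meters (assumed specification)
EPSILON = 0.1    # assumed worst-case sensing/state uncertainty in meters

def safety_monitor(measured_distance: float) -> bool:
    """Return True if the spec 'distance >= D_MIN' still holds in the worst
    case allowed by the uncertainty bound."""
    worst_case = measured_distance - EPSILON
    return worst_case >= D_MIN

# Example readings from a simulated run (values are made up).
for t, d in enumerate([1.2, 0.9, 0.7, 0.58, 0.55]):
    if not safety_monitor(d):
        print(f"t={t}: potential safety violation (measured {d:.2f} m)")
        break
```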

 

Simulation can play an important role in providing insights into multirobot, collaborative scenarios. Collaborative, multirobot systems can exclusively comprise robots interacting with each other based on their own local decision-making algorithms that factor in sensed and/or shared information, or they can include human interaction, as in search-and-rescue scenarios. As the number of robot–robot and/or human–robot interactions increases, so does the complexity of designing and verifying these systems. Physical testing and verification are daunting, as the collection of scenarios to probe grows quadratically with the number of agents in the system. Moreover, it is difficult to systematically test, under real conditions, multirobot systems used in environmental monitoring, off-road mobility/survivability, surveillance, or infrastructure management, due in part to the stiff challenges posed by operating groups of agents in such environments. Simulation is very convenient in such scenarios, given that it can also be used to probe for interagent connectivity failure or hostile network penetration.
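
Two of these points are easy to make concrete in code: the quadratic growth of robot–robot pairs with the number of agents, and simulated probing for interagent connectivity failure. The sketch below is illustrative only; the communication radius and per-link failure probability are assumptions.

```python
import itertools
import random

# Quadratic growth of pairwise interactions: n agents give n*(n-1)/2 pairs.
for n in (5, 10, 20, 40):
    print(f"{n} agents -> {n * (n - 1) // 2} robot-robot pairs to consider")

# Probe interagent connectivity under random link failure (assumed model).
random.seed(0)
positions = {i: (random.uniform(0, 100), random.uniform(0, 100)) for i in range(10)}
COMM_RADIUS = 40.0   # assumed communication range in meters
P_LINK_FAIL = 0.2    # assumed per-link failure probability

def links():
    """Yield surviving communication links for one randomized trial."""
    for a, b in itertools.combinations(positions, 2):
        dx = positions[a][0] - positions[b][0]
        dy = positions[a][1] - positions[b][1]
        in_range = (dx * dx + dy * dy) ** 0.5 <= COMM_RADIUS
        if in_range and random.random() > P_LINK_FAIL:
            yield a, b

def is_connected(edges):
    """Breadth-first search over the surviving communication graph."""
    adj = {i: set() for i in positions}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, frontier = {0}, [0]
    while frontier:
        node = frontier.pop()
        for nxt in adj[node] - seen:
            seen.add(nxt)
            frontier.append(nxt)
    return len(seen) == len(positions)

failures = sum(not is_connected(list(links())) for _ in range(1000))
print(f"network disconnected in {failures / 10:.1f}% of 1000 simulated trials")
```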

 

Hostile/adversarial attacks aim at more than multirobot network penetration; the robot's control stack and sensing system are two other targets. One of the most compelling cases for the use of simulation in robotics can be made in conjunction with the need to thoroughly check the consequences of such attacks. Do the checks designed to guard against adversarial attacks work as expected? What is a safe way to counteract an adversarial attack? The answers to these and similar questions are critical for safety and regulatory purposes, yet direct hardware testing is costly and potentially unsafe (bulky robots gone awry can become outright dangerous). Simulating the response of the robotic system in cases of failure, while potentially performing a task, permits the investigation of fail-safe strategies for the system as well as the design of security and anti-breach policies. Without simulation, one is limited in what and how many scenarios can be tested.

 

 

A robust and feature-rich set of four or five simulation tools available in the open-source domain is critical to advancing the state of the art in robotics. Validated open-source platforms democratize the simulation-in-robotics effort and inspire/inform future, more refined open-source or commercial efforts. Trusted open-source solutions are quickly embraced; support the idea of reproducibility/verifiability in science; and have the side effect of immediately raising the bar for the commercial tools, which must necessarily up the ante. Owing to the breadth of robotics applications, it is likely that no single platform will emerge as the solution of choice for all targeted simulation scenarios, which provides the rationale for the “four or five simulation platforms” recommendation above.

 

The use of a robotics simulator for developing a robot control program is highly recommended regardless of whether an actual robot is available. The simulator allows robot programs to be conveniently written and debugged off-line, with the final version of the program then tested on an actual robot. This holds primarily for industrial robotics applications, since the success of off-line programming depends on how similar the robot's real environment is to the simulated environment.

 

Sensor-based robot actions are much more difficult to simulate and/or to program off-line, since the robot motion depends on the instantaneous sensor readings in the real world.
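
One common mitigation is to close the sensing loop in simulation while injecting imperfections into the simulated sensor stream, so that sensor-driven logic is exercised more realistically; the simple noise-plus-latency wrapper below is an assumed, illustrative model rather than any simulator's sensor interface.

```python
import collections
import random

# Sketch: wrap an idealized simulated range sensor with assumed noise and
# latency so that sensor-driven control logic sees more realistic readings.
random.seed(42)

class NoisyDelayedSensor:
    def __init__(self, true_reading_fn, sigma=0.02, delay_steps=3):
        self._truth = true_reading_fn          # callable returning ideal value
        self._sigma = sigma                    # assumed Gaussian noise std (m)
        initial = true_reading_fn()
        self._buffer = collections.deque([initial] * delay_steps,
                                         maxlen=delay_steps)

    def read(self) -> float:
        noisy = self._truth() + random.gauss(0.0, self._sigma)
        self._buffer.append(noisy)
        return self._buffer[0]                 # oldest sample = delayed reading

# Example: a fake "true" distance that shrinks as the robot approaches a wall.
state = {"d": 2.0}
sensor = NoisyDelayedSensor(lambda: state["d"])

for step in range(50):
    reading = sensor.read()
    # Simple sensor-driven rule: stop when the (delayed, noisy) reading < 0.5 m.
    if reading < 0.5:
        print(f"stop commanded at step {step}, reading {reading:.2f} m")
        break
    state["d"] -= 0.05  # the robot moves 5 cm toward the wall per step
```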

 

Challenges

Researchers from the School of Computer Science at Carnegie Mellon University wrote a paper aimed at developing a grounded understanding of the ways developers use simulation in their process and the challenges they face in doing so. This type of understanding can guide the development of more effective simulators and testing techniques for modern robotics development that are better suited to developer needs and that can ultimately result in higher-quality robots.

 

To this end, they conducted a study of robotics developers to understand how developers perceive simulation-based testing and what challenges they face while using simulators. Their survey of 82 participants confirms that simulation is a popular tool among robotics developers and that testing is its most common use case. From the participants' responses, they identified 10 challenges that make it difficult for developers to use simulation in general, for testing, and specifically for automated testing.

 

Reality gap: The simulator does not sufficiently replicate the real-world behavior of the robot to a degree that is useful.

Complexity: The time and resources required to set up a sufficiently accurate, useful simulator could be better spent on other activities.

Lacking capabilities: Simulators may not possess all of the capabilities that users desire, or those simulators that do may be prohibitively expensive.

Reproducibility: Simulations are non-deterministic, making it difficult to repeat simulations, recreate issues encountered in simulation or on real hardware, and track down problems.

Scenario and environment construction: It is difficult to create the scenarios and environments required for testing the system in simulation.

Resource costs: The computational overhead of simulation requires special hardware and computing resources, which adds to the financial cost of testing.

Automation features: The simulator is not designed to be used for automated testing and does not allow headless, scripted, or parallel execution (see the test-automation sketch after this list).

Simulator reliability: The simulation is not reliable enough to be used in test automation in terms of the stability of the simulator software, and the timing and synchronization issues introduced by the simulator.

Interface stability: The simulator’s interface is not stable enough or sufficiently well documented to work with existing code or testing pipelines.
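
Several of these challenges (reproducibility, automation features, simulator reliability) surface together when simulation is wired into automated test pipelines. The sketch below shows one common pattern, a headless, seeded simulation run driven from a test script; the sim_runner command, its flags, and the results format are assumptions rather than any particular simulator's interface.

```python
import json
import subprocess

# Sketch of an automated, headless, seeded simulation test. The simulator
# command line ("sim_runner"), its flags, and the JSON result format are all
# assumptions; substitute the interface of whichever simulator you use.
def run_sim_test(world: str, seed: int, timeout_s: int = 300) -> dict:
    cmd = [
        "sim_runner", "--headless",
        "--world", world,
        "--seed", str(seed),          # fixed seed to chase reproducibility
        "--results", "results.json",
    ]
    subprocess.run(cmd, check=True, timeout=timeout_s)
    with open("results.json") as f:
        return json.load(f)

def test_no_collision_in_warehouse_scenario():
    # Repeat with the same seed: flaky outcomes here point at simulator-side
    # non-determinism or timing/synchronization issues.
    for _ in range(3):
        results = run_sim_test("warehouse.world", seed=1234)
        assert results["collisions"] == 0
```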

 

To date, robotics simulation has almost exclusively drawn on rigid body dynamics. Indeed, the underlying modeling is simpler, the software implementation effort is more reasonable, and the simulation runtimes are shorter. Looking ahead, support for soft robotics is critical in several fields, e.g., human–robot interaction (HRI) and biomimetic robots. It is anticipated that embracing compliance in the robotics models will elicit new approaches to handling frictional contact, with the potential benefit of alleviating numerical artifacts/paradoxes brought to the fore by the rigid body model. Generating through simulation sensory-motor data that match the multiresolution dynamics, noise, softness, etc., of sensors and actuators during complex tasks that include both compliance and frictional contacts is poised to open the door to a systematic study of the sensory-motor space for robotic manipulation and locomotion.
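
Since rigid body dynamics with frictional contact is the workhorse described above, a minimal sketch using the open-source PyBullet engine illustrates a typical setup; the bundled URDF assets and the friction coefficient below are examples and assumptions, not anything prescribed here.

```python
import pybullet as p
import pybullet_data

# Minimal rigid-body-dynamics sketch with frictional contact in PyBullet.
p.connect(p.DIRECT)                      # headless physics server
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)

plane = p.loadURDF("plane.urdf")                       # bundled example asset
robot = p.loadURDF("r2d2.urdf", basePosition=[0, 0, 0.5])

# Assumed friction coefficient for the ground contact.
p.changeDynamics(plane, -1, lateralFriction=0.8)

p.setTimeStep(1.0 / 240.0)
for _ in range(240):                     # simulate one second
    p.stepSimulation()

pos, orn = p.getBasePositionAndOrientation(robot)
print("robot base settled at:", [round(c, 3) for c in pos])
p.disconnect()
```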

 

The issue of establishing human models that capture mechanical attributes of the body and/or psychological and cognitive traits of human behavior is cross-cutting.  Applications in which HRI will come into play include robotic surgery, which is relevant in surgery training and remotely treating patients in remote/disaster zones; assisting seniors with tasks such as dressing, personal hygiene, cleaning, and cooking; and assisting individuals with limited ambulatory ability with transportation needs, etc. The body of work in the area of HRI modeling is meager, which explains the limited knowledge vis-à-vis the issue of human cognitive performance in HRI. In this context, there is a very limited set of science-based requirements and thresholds for safe human–robot interaction.

 

Militarily relevant settings

In complex, militarily relevant settings, robotic vehicles have not yet demonstrated operationally relevant speed and are not reliably autonomous. Unmanned and autonomous ground vehicles have the potential to revolutionize military and civilian navigation. Military vehicles, however, present unique challenges related to autonomous navigation that are not encountered in civilian applications. These include a high percentage of off-road navigation, navigation in hostile environments, navigation in GPS-denied environments, and navigation in urban environments where little data regarding road networks are available.

 

While the past decade has seen increased use of simulation in developing field robotics, the military off-road environment is especially challenging and complex. Computers need to re-create three-dimensional surfaces, compliant soils and vegetation, and hundreds of obstacle classes. Software also needs to take into account lower fidelity or limited mapping data, unique platform-surface interactions, continuous motion planning, and no defined road networks or driving rules. In addition, modeling high speed off-road performance of sensors/modalities, sensor-to-terrain representations, autonomous platforms, and autonomous control remains a software and processing challenge.

 

DARPA’s RACER-Sim project is seeking innovations in technologies that bridge the gap from simulation to the real world and significantly reduce the cost of off-road autonomy development. Over a four-year timeline, RACER-Sim will investigate technologies that are applicable to the off-road environment in the areas of algorithm development, simulation element technologies, and simulator content generation.

 

 

Common Robot Simulators

Customarily, simulation in robotics calls for the interplay of three types of submodels: robots, synthetic worlds, and sensors. In some cases, the human component comes into play, and, for multirobot scenarios, one might need to simulate the communication layer.
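
At the implementation level, this decomposition often appears directly in code as separate world, robot, and sensor submodels wired into a single stepping loop. The sketch below is purely structural, uses no particular simulator's API, and all class names are illustrative.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Structural sketch only: a simulation composed of world, robot, and sensor
# submodels (a communication layer could be added for multirobot scenarios).
@dataclass
class World:
    obstacles: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class Sensor:
    name: str
    def sample(self, world: "World", pose: Tuple[float, float, float]) -> float:
        # Placeholder measurement model; a real simulator would ray-cast, etc.
        return 0.0

@dataclass
class Robot:
    pose: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    sensors: List[Sensor] = field(default_factory=list)
    def step(self, readings: List[float], dt: float) -> None:
        pass  # control and dynamics updates would go here

class Simulation:
    def __init__(self, world: World, robots: List[Robot]):
        self.world, self.robots = world, robots
    def step(self, dt: float = 0.01) -> None:
        for robot in self.robots:
            readings = [s.sample(self.world, robot.pose) for s in robot.sensors]
            robot.step(readings, dt)

sim = Simulation(World(), [Robot(sensors=[Sensor("lidar")])])
for _ in range(100):
    sim.step()
```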

 

Modern simulators tend to provide the following features:

  • Fast robot prototyping, either within the simulator itself or using external tools.
  • Physics engines for realistic movement; most simulators use Bullet, ODE, or PhysX.
  • Realistic 3D rendering; standard 3D modeling tools or third-party tools can be used to build the environments.
  • Dynamic robot bodies with scripting (C, C++, Perl, Python, Java, URBI, and MATLAB are used by Webots; Python is used by Gazebo); see the controller sketch below.

One of the most popular applications of robotics simulators is 3D modeling and rendering of a robot and its environment. This type of robotics software provides a virtual robot that can emulate the motion of an actual robot within a real work envelope. Some robotics simulators use a physics engine for more realistic motion generation of the robot.
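
As a concrete example of the scripting item above, the following is a minimal sketch of a Webots-style Python controller for a differential-drive robot; the device names and the sensor threshold are assumptions that must match the particular robot model in the simulated world.

```python
# Minimal Webots-style Python controller sketch (device names are assumptions
# and must match the robot model used in the simulated world).
from controller import Robot

robot = Robot()
timestep = int(robot.getBasicTimeStep())

# Hypothetical device names for a differential-drive robot.
left_motor = robot.getDevice("left wheel motor")
right_motor = robot.getDevice("right wheel motor")
sensor = robot.getDevice("distance sensor")
sensor.enable(timestep)

# Velocity control: disable position control by setting an infinite target.
left_motor.setPosition(float("inf"))
right_motor.setPosition(float("inf"))

MAX_SPEED = 6.28  # rad/s, assumed motor limit

while robot.step(timestep) != -1:
    # Simple reactive behavior: slow the left wheel when an obstacle is near,
    # which turns the robot away from it.
    obstacle = sensor.getValue() > 80.0  # threshold is an assumption
    left_speed = 0.3 * MAX_SPEED if obstacle else 0.5 * MAX_SPEED
    right_speed = 0.5 * MAX_SPEED
    left_motor.setVelocity(left_speed)
    right_motor.setVelocity(right_speed)
```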

 

Popular simulators, such as Gazebo, V-REP, and Webots, have been used to simulate a variety of systems including industrial robots, unmanned aerial vehicles, and autonomous (self-driving) cars. Numerous companies involved in the autonomy sector, such as Uber, NVIDIA, and Waymo, use simulation on a large scale to develop, train, and test their algorithms. The high demand for simulation in this sector has led to the development of a new generation of specialized simulators, such as CARLA, LGSVL, AirSim, and AADS.
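
To give a flavor of that specialized generation of tools, here is a minimal sketch against the CARLA Python client; it assumes a CARLA server is already running on localhost:2000 and that the bundled vehicle blueprints and maps are available.

```python
import carla

# Minimal CARLA client sketch: spawn a vehicle and let the built-in autopilot
# drive it for a while. Assumes a CARLA server is running on localhost:2000.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

blueprint = world.get_blueprint_library().filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(blueprint, spawn_point)

try:
    vehicle.set_autopilot(True)
    for _ in range(200):
        world.wait_for_tick()            # advance with the server's ticks
        loc = vehicle.get_location()
        print(f"vehicle at x={loc.x:.1f}, y={loc.y:.1f}")
finally:
    vehicle.destroy()
```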

 

1. Webots

Webots is a 3D simulation platform developed by Cyberbotics and used in service and industrial robot simulations. It is an open-source, multi-platform desktop application that provides a complete development environment to model, program, and simulate robots. The tool supports Windows, Linux, and macOS and is among the most widely used simulation packages for education and research. Any robot can be modeled, programmed, and simulated in C, C++, Java, Python, MATLAB, or URBI, and the software is compatible with external libraries such as OpenCV.

2. Gazebo

Gazebo is a multi-robot simulator with support for a wide range of sensors and objects. The software is compatible with ROS and other Willow Garage robotics platforms. Gazebo offers the ability to accurately and efficiently simulate populations of robots in complex indoor and outdoor environments, providing a robust physics engine, high-quality graphics, and convenient programmatic and graphical interfaces. Gazebo is free and has a vibrant community.

3. V-REP

V-REP (now CoppeliaSim), designed by Coppelia Robotics, is one of the most advanced 3D simulators for industrial robots. The tool supports a wide range of programming languages, including C/C++, Python, Java, Lua, MATLAB, and Urbi. It can be used to develop algorithms and simulate automation scenarios, and the platform is used in education as well as by engineers for remote monitoring and safety double-checking.

 

4. RoboWorks

RoboWorks is a 3D simulation tool developed by Newtonium. The software can be used to simulate the behavior of industrial and service robots in a virtual 3D world. RoboWorks offers support for C/C++, the C/C++ interpreter Ch, VB, VB.NET, LabVIEW, and other languages.

5. Blender

Blender is a powerful tool to design and simulate service robots in complex environments. The platform is compatible with Windows (XP, Vista, 7), Linux, OS X, FreeBSD, and Sun. With its 3D content support, Blender is one of the most advanced design tools that can be used to simulate robot behavior in virtual worlds.

6. RoboLogix

RoboLogix is a 3D industrial simulation package developed by Logic Design. The platform was designed for real-world emulation of robotics applications using a five-axis industrial robot, so that the program installed on the robot can be developed and tested across a wide range of practical applications. The platform offers support for a wide range of industrial robots, including ABB, Fanuc, and Kawasaki.

References and Resources also include:

https://www.pnas.org/content/118/1/e1907856118

https://sites.google.com/site/ruijiaoli/resources/roboticssimulationsoftwarelist

https://arxiv.org/pdf/2004.07368.pdf
