2020-08-27T22:00:00Z
Organization: Open Source Robotics Foundation
Mentor: Mabel Zhang
Student: Martiño Crespo (marticres@gmail.com)
Link to GSoC project: Google Summer of Code
This summer I was selected to work on a tactile sensor plugin for Ignition Gazebo, the next-generation successor to Gazebo Classic. I would like to thank my mentor @mabelzhang and the people at Open Robotics for making me feel part of the team from day 1. If you are thinking about applying for GSoC 2021, especially at Open Robotics, just do it!
Choosing a project
Ever since I started considering applying for GSoC, I knew that the topic would be robotics. Apart from finding it extremely interesting, it is where most of my academic and work experience lies, although more as an integrator than as a software developer. When it comes to the latter, names like ROS, Gazebo or Open Robotics are very well known, and I could not think of a better organization for my project. Among the projects they offered, working on a tactile sensor plugin really caught my eye, which is why I decided to contact my mentor. I really liked her responsive attitude, and she even gave me some information and links to prepare my application. I believe that a realistic timeline and objectives (I had only taken a few software development courses back in my Bachelor's, which I made very clear), as well as showing some initiative (I wrote a very simple, admittedly silly, Gazebo plugin before the application deadline), allowed me to be selected for the project.
The optical tactile sensor plugin
Tactile sensors are devices that measure information coming from physical interaction with their environment, like your fingers or your skin do. Instead of focusing on one specific brand and model of tactile sensor available on the market and simulating it, we shifted our objectives and efforts toward simulating a specific sensor technology: the optical tactile sensor. These sensors usually consist of a light source, a deformable surface, a lens and a camera. One of the aspects that makes them very popular is that you can apply existing computer vision algorithms to extract information from the camera data.
As you can see in the previous screenshot from the tutorial, we used a contact sensor and a depth camera to simulate this behaviour. As objects come into contact with the former, surface normals are computed from the values returned by the latter. It is important to note that the values coming from the contact sensor are not currently used in any computation; they only serve to visualize the contacts as a quick way to check that the plugin is working correctly. Due to lack of time, we could not fuse the information coming from these two sensors into a more realistic measurement.
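To make that idea concrete, here is a minimal, self-contained sketch (not the actual plugin code) of how surface normals can be estimated from the organized point cloud a depth camera produces, by taking cross products of vectors to neighboring points:

```cpp
#include <cmath>
#include <vector>

// Minimal 3D vector type so the sketch is self-contained.
struct Vec3
{
  double x{0}, y{0}, z{0};
};

Vec3 operator-(const Vec3 &a, const Vec3 &b)
{
  return {a.x - b.x, a.y - b.y, a.z - b.z};
}

Vec3 Cross(const Vec3 &a, const Vec3 &b)
{
  return {a.y * b.z - a.z * b.y,
          a.z * b.x - a.x * b.z,
          a.x * b.y - a.y * b.x};
}

Vec3 Normalized(const Vec3 &v)
{
  double n = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
  return n > 0 ? Vec3{v.x / n, v.y / n, v.z / n} : v;
}

// Estimate one normal per pixel of an organized point cloud (a width x height
// grid of 3D points, as produced by a depth camera). The normal at (i, j) is
// the normalized cross product of the vectors to the right and lower neighbors.
std::vector<Vec3> ComputeNormals(const std::vector<Vec3> &cloud,
                                 int width, int height)
{
  std::vector<Vec3> normals(cloud.size());
  for (int j = 0; j + 1 < height; ++j)
  {
    for (int i = 0; i + 1 < width; ++i)
    {
      const Vec3 &p = cloud[j * width + i];
      Vec3 dx = cloud[j * width + (i + 1)] - p;  // step along the row
      Vec3 dy = cloud[(j + 1) * width + i] - p;  // step along the column
      normals[j * width + i] = Normalized(Cross(dx, dy));
    }
  }
  return normals;
}
```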
The following is an intermediate result showing how the plugin interacts with different Ignition Fuel models included in the simulation. Note how the plugin visualizes the surface normals of the objects it is pointed at:
The plugin also provides rich information for objects with considerable variation in their surface normals, like the drill below. It currently outputs the surface normals on a topic, which you could use to infer an optimal grasp, determine which part of the object you are touching, or even try to guess what the sensor is touching!
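Since the normals come out on a topic, any Ignition Transport subscriber can consume them. Below is a minimal sketch of such a subscriber; note that the topic name is my assumption for illustration (check what the plugin actually advertises, e.g. with `ign topic -l`), and I am assuming the normals arrive packed into an `ignition::msgs::PointCloudPacked` message:

```cpp
#include <iostream>
#include <string>

#include <ignition/msgs/pointcloud_packed.pb.h>
#include <ignition/transport/Node.hh>

// Print a short summary every time the plugin publishes a batch of normals.
void OnNormals(const ignition::msgs::PointCloudPacked &_msg)
{
  std::cout << "Received " << _msg.width() * _msg.height()
            << " surface normals" << std::endl;
}

int main()
{
  ignition::transport::Node node;

  // Hypothetical topic name; replace it with the one the plugin advertises.
  const std::string topic = "/optical_tactile_sensor/normals";
  if (!node.Subscribe(topic, OnNormals))
  {
    std::cerr << "Failed to subscribe to " << topic << std::endl;
    return 1;
  }

  // Block until Ctrl-C.
  ignition::transport::waitForShutdown();
  return 0;
}
```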
However, the final implementation only outputs the normals that lie inside the contact surface, so you have to actually touch an object, just like any real tactile sensor. Unlike the previous screenshots, in the following image the normals are visualized using simpler geometry to improve performance, since that lowers message transport and marker rendering costs.
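As an aside on that performance point: in Ignition Gazebo, visual markers are requested through the `/marker` service, so every visualized normal has a transport and rendering cost. The following standalone sketch (not the plugin's code; the positions and values are made up for illustration) draws a single normal as a line segment:

```cpp
#include <ignition/math/Vector3.hh>
#include <ignition/msgs/marker.pb.h>
#include <ignition/msgs/Utility.hh>
#include <ignition/transport/Node.hh>

int main()
{
  ignition::transport::Node node;

  // Build a LINE_LIST marker: each consecutive pair of points is one segment.
  ignition::msgs::Marker markerMsg;
  markerMsg.set_ns("tactile_normals");
  markerMsg.set_id(1);
  markerMsg.set_action(ignition::msgs::Marker::ADD_MODIFY);
  markerMsg.set_type(ignition::msgs::Marker::LINE_LIST);

  // A single made-up normal drawn from a contact point along its direction.
  ignition::math::Vector3d point{0.0, 0.0, 0.5};
  ignition::math::Vector3d normal{0.0, 0.0, 1.0};
  double length = 0.03;  // visualized normal length in meters
  ignition::msgs::Set(markerMsg.add_point(), point);
  ignition::msgs::Set(markerMsg.add_point(), point + normal * length);

  // One-way request to the marker service provided by Ignition Gazebo.
  node.Request("/marker", markerMsg);
  return 0;
}
```

A LINE_LIST lets many normals share a single marker message, which is exactly the kind of simplification that keeps transport and rendering costs down.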
The process
As there was no related prior work in Ignition, this was also an exploratory project: we had to try new things, propose ideas and iterate on the design. We left behind some ideas that we thought would work but eventually did not, and unfortunately some details could not be finished.
To communicate updates on the work being done, we established a weekly meeting, though email and Slack were always there when necessary. I believe the communication between mentor and student was really fluid, which without a doubt helped the project accomplish its most important goals.
Conclusion
Tactile sensors allow you to sense physical interactions with the environment. This GSoC project aimed to implement the first step in the simulation of an optical tactile sensor using a depth camera and a contact sensor. The plugin allows you to:
- Measure surface normals of the objects in contact.
- Visualize them in Ignition Gazebo.
- Access them through a topic.
- Tune the plugin with parameters like resolution, visualization and more.
- Turn the plugin on and off through a service (see the configuration sketch below).
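For anyone wanting to try it, a system plugin like this is loaded and configured from SDF. The snippet below is only a sketch: the filename, plugin name and parameter tags are illustrative assumptions on my part, so please check the plugin's tutorial for the actual ones. The on/off service mentioned above can then be called at runtime, for instance with the `ign service` command-line tool.

```xml
<!-- Illustrative configuration only: tag names are assumptions;
     see the plugin tutorial for the real ones. -->
<plugin filename="ignition-gazebo-opticaltactileplugin-system"
        name="ignition::gazebo::systems::OpticalTactilePlugin">
  <!-- Start with the sensor enabled. -->
  <enabled>true</enabled>
  <!-- Visualize one normal every N depth-image pixels; larger values
       mean fewer markers and better performance. -->
  <visualization_resolution>15</visualization_resolution>
  <!-- Length of the visualized normals, in meters. -->
  <force_length>0.03</force_length>
</plugin>
```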
I hope this is useful to the community or as an entry point to anyone willing to continue this work. Contributions are very welcome!
Thanks,
Martiño Crespo