Assembly of physical structure
Physical assembly of the structure, including the gantry, the setup of all the sensors (cameras and LiDARs), cabling, the collaborative robot, and the connections to the computer.
Creation and adaptation of drivers for Velodynes and cameras
The creation and adaptation of drivers is essential to ensure communication with ROS. For the Velodynes, this included using the ROS velodyne_driver package. To communicate with several Velodynes at the same time, they must be connected to a network switch and each assigned a different IP address. For the Astra cameras there is also a ROS driver; however, it does not work well in multi-camera mode due to USB overflow. This problem is not yet solved, since each camera requires its own dedicated USB port, and our efforts led us to conclude that the current computer cannot handle this because of USB incompatibilities with its motherboard.
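As an illustration of how data from two LiDARs can be consumed once each device has its own IP address and is launched in its own namespace, the minimal sketch below subscribes to both point cloud topics with rospy. The namespaces /velodyne_top and /velodyne_side are assumptions made for this example, not necessarily the names used in the cell.

#!/usr/bin/env python
# Minimal sketch: reading point clouds from two Velodyne VLP-16 units that
# were launched in separate namespaces (one per device IP behind the switch).
# The namespace names below are assumptions made for this example.
import rospy
from sensor_msgs.msg import PointCloud2


def make_callback(name):
    def callback(msg):
        # Report how many points each LiDAR is delivering.
        n_points = msg.width * msg.height
        rospy.loginfo_throttle(1.0, "%s: %d points", name, n_points)
    return callback


if __name__ == "__main__":
    rospy.init_node("dual_velodyne_listener")
    rospy.Subscriber("/velodyne_top/velodyne_points", PointCloud2,
                     make_callback("velodyne_top"))
    rospy.Subscriber("/velodyne_side/velodyne_points", PointCloud2,
                     make_callback("velodyne_side"))
    rospy.spin()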
Development of a fully simulated collaborative cell (larcc)
I developed a fully functional simulated collaborative cell mimicking the real one at LAR. This involved creating xacro descriptions for each cell component, as well as simulating the cameras, LiDARs and collaborative robot in Gazebo. At the moment, the cell works completely in real time: it simulates all of the above and is able to plan and execute UR10e trajectories using the motion planning toolbox.
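A minimal sketch of how such a trajectory could be planned and executed from Python follows, assuming the motion planning toolbox in question is MoveIt and that the UR10e is exposed through the usual "manipulator" planning group (both are assumptions of this example, as is the target pose used).

#!/usr/bin/env python
# Minimal sketch: planning and executing a UR10e motion with MoveIt.
# Assumes the simulated cell exposes the arm as the "manipulator" group,
# which is the usual name in UR MoveIt configurations.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

if __name__ == "__main__":
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("larcc_motion_demo")

    arm = moveit_commander.MoveGroupCommander("manipulator")

    # Example target pose in the planning frame (values are illustrative only).
    target = Pose()
    target.position.x = 0.4
    target.position.y = 0.2
    target.position.z = 0.5
    target.orientation.w = 1.0

    arm.set_pose_target(target)
    success = arm.go(wait=True)   # plan and execute
    arm.stop()                    # make sure there is no residual movement
    arm.clear_pose_targets()

    rospy.loginfo("Motion %s", "succeeded" if success else "failed")
    moveit_commander.roscpp_shutdown()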
Several tests of the sensor drivers (Orbbec Astra and Velodyne VLP-16) were performed to obtain robust communication with the sensors through ROS. These tests also required adapting the drivers to work with multiple sensors simultaneously. It is now possible to acquire sensory data from the 3D LiDARs and the RGB-D cameras, and experiments are being conducted to determine how many sensors can be used at the same time. To optimize the bandwidth used for communication between the sensors and the computer, the cameras are being configured to an appropriate resolution and the LiDARs are being configured to monitor a 180º spherical calotte instead of the default 360º (a possible way of doing this is sketched below). The following figure shows an example of sensory data visualized in real time in the cell. The video can be viewed at https://www.youtube.com/watch?v=miV3Ge0ebUw .
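One possible way of restricting a Velodyne to a 180º field of view from Python is through the dynamic_reconfigure parameters of the velodyne_pointcloud cloud node (view_direction and view_width), assuming the driver version in use exposes them. The node name below is an assumption made for this example; the real names depend on how the drivers are launched in the cell.

#!/usr/bin/env python
# Minimal sketch: limiting a VLP-16 to a 180 degree field of view by adjusting
# the velodyne_pointcloud cloud node through dynamic_reconfigure.
# The node name "/velodyne_top/velodyne_pointcloud_cloud_node" is an assumption
# made for this example.
import math
import rospy
from dynamic_reconfigure.client import Client

if __name__ == "__main__":
    rospy.init_node("velodyne_fov_config")
    client = Client("/velodyne_top/velodyne_pointcloud_cloud_node", timeout=10)
    # view_direction is the centre of the kept sector (radians) and
    # view_width is its angular extent: pi rad corresponds to 180 degrees.
    client.update_configuration({"view_direction": 0.0,
                                 "view_width": math.pi})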