
Revision 5 (Quentin Labourey, 2018-01-16 15:36) → Revision 6/8 (Ellon Paiva Mendes, 2018-01-16 15:46)

h1. 01/16 

 *Participants:*  
 * From MAG: Vincent Bissonnette, Clément Bazerque, Le Bihan, Raphaël (?) 
 * From LAAS: Andrea, Ellon, Quentin 

 *Goal of the meeting:* As we are going to test our algorithm chains in MORSE, we need to coordinate our efforts on the simulator and collaborate where we can. As Clément has already started working full-time on MORSE, we went there to see what is already available and how we can catch up and improve.

h2. MORSE at MAG

 MAG's goal at the moment is to produce a full working chain integrated into MORSE (and if possible, with our help, directly on our rovers), *by the end of February*. The first version of the processing chains they want to integrate is the following: 
 !/home/quentin/Desktop/Chain1.png! 
 <pre> 
 Stereovision => Disparity computation => DEM building => Navigation Map => Path planning => Motion control 
 </pre> 
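The chain above is essentially a sequential composition of processing stages. A minimal Python sketch of that composition follows; every stage body here is a placeholder stub for illustration only, not the actual CNES/MORSE code:

```python
# Sketch of the processing chain as function composition.
# All stage bodies are toy stubs, not the real CNES algorithms.

def compute_disparity(left, right):
    # Stub: real code would run stereo matching on the image pair.
    return [l - r for l, r in zip(left, right)]

def build_dem(disparity):
    # Stub: real code would triangulate disparities into elevations.
    return [d * 0.5 for d in disparity]

def build_navigation_map(dem):
    # Stub: mark cells below an elevation threshold as traversable.
    return [e < 1.0 for e in dem]

def plan_path(nav_map, goal):
    # Stub: keep only traversable cell indices up to the goal cell.
    return [i for i, free in enumerate(nav_map[:goal + 1]) if free]

def run_chain(left, right, goal):
    disparity = compute_disparity(left, right)
    nav_map = build_navigation_map(build_dem(disparity))
    return plan_path(nav_map, goal)

print(run_chain([5, 4, 3], [2, 3, 2], 2))  # [1, 2] with these stub rules
```

The point is only the dataflow: each stage consumes exactly the previous stage's output, which is what makes the stages independently replaceable (e.g. by LAAS modules later).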

 All the algorithms come from CNES and need integration inside MORSE. The following sensor/actuator flows are already implemented: 
 * Lidar 
 * Stereobench 
 * Pan-tilt unit 
 * Robot control (did not ask which one, but from Simon's report it is an RMP440) 

 At the moment, the interface between MORSE and the algorithms is made via YARP, but they are working on a ROS interface. Scenarios are orchestrated through a simple state machine written in Python. The scenario implemented so far is "Acquire a scan and move forward for a given distance". No localization is performed in MORSE: the pose of the robot is given directly by MORSE and does not come from a MAG algorithm. The state machine thus has two states: 
 * Move pan-tilt 
 * Go to point 
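A two-state orchestration of this kind could look like the following sketch in Python (the state names come from the meeting; the class and action bodies are hypothetical, not MAG's actual code):

```python
# Toy sketch of the "acquire a scan, then move forward" scenario machine.
# The actions only log; the real machine would command MORSE components.

class ScanAndMove:
    def __init__(self):
        self.state = "MOVE_PAN_TILT"
        self.log = []

    def step(self):
        if self.state == "MOVE_PAN_TILT":
            self.log.append("acquire scan")   # would drive the pan-tilt unit + sensor
            self.state = "GO_TO_POINT"
        elif self.state == "GO_TO_POINT":
            self.log.append("move forward")   # would send a motion goal to the robot
            self.state = "MOVE_PAN_TILT"

sm = ScanAndMove()
sm.step()
sm.step()
print(sm.log)  # ['acquire scan', 'move forward']
```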
 


 *Concerning datatypes:* Right now, they are not using ASN.1-compliant datatypes. Vincent told us that the ESROCOS ASN.1 compiler exists, but they did not interface anything with the C structs it produces. He proposes that we use ROS datatypes for now and look into proper interfacing afterwards. We agreed on this. 
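Agreeing on ROS datatypes means exchanging, for example, poses in the field layout of @geometry_msgs/PoseStamped@. A plain-Python stand-in with that layout (a sketch for discussion, requiring no ROS installation; the classes are ours, not the real ROS messages) could be:

```python
from dataclasses import dataclass, field

# Plain-Python stand-in mirroring the field layout of ROS geometry_msgs:
# a PoseStamped is a header (frame id, timestamp) plus a pose
# (position point + orientation quaternion).

@dataclass
class Point:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class Quaternion:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    w: float = 1.0  # identity rotation by default

@dataclass
class PoseStamped:
    frame_id: str = "map"
    stamp: float = 0.0
    position: Point = field(default_factory=Point)
    orientation: Quaternion = field(default_factory=Quaternion)

p = PoseStamped(frame_id="rover", stamp=1.5, position=Point(x=2.0))
print(p.frame_id, p.position.x)  # rover 2.0
```

Keeping the same field layout as the ROS messages should make a later swap to the real types (or to ASN.1-generated structs) mostly mechanical.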

h2. How LAAS can contribute

 * First element is to provide MAG with accurate simulations of our rovers, in order to test their algorithms and to ease integration directly on the rovers. They require: 
 ** Sensor streams: 
 *** Velodyne 
 *** Stereovision 
 *** Sick LDMRS 
 ** Transform tree of the rovers 
 ** Accurate control flow 

 * Second element would be to provide a localization module to integrate into the chain, so that the full chain comes from InFuse algorithms => we propose to use Andrea's visual odometry. 
 * Third element is to provide a basic version of the DPM. We would then have the following processing chain: 
 <pre> 
 Stereovision => Disparity => DEM 
 ||=> Visual Odometry ======= ^ 
 </pre>
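With the visual odometry branch added, DEM building consumes both the disparity and the latest pose estimate, instead of the ground-truth pose from MORSE. A toy sketch of that fan-out/fan-in (all function bodies are placeholder stubs, not the InFuse implementations):

```python
# Toy sketch: stereovision output feeds both disparity computation and
# visual odometry; DEM building fuses the two branches.

def compute_disparity(left, right):
    # Stub for the disparity branch.
    return [l - r for l, r in zip(left, right)]

def visual_odometry(prev_pose, left, right):
    # Stub for the VO branch: pretend the rover advanced 1.0 m.
    return prev_pose + 1.0

def build_dem(disparity, pose):
    # Stub: georeference each elevation with the current pose estimate.
    return [(pose, d * 0.5) for d in disparity]

pose = 0.0
left, right = [5, 4], [2, 3]
disparity = compute_disparity(left, right)
pose = visual_odometry(pose, left, right)
dem = build_dem(disparity, pose)
print(dem)  # [(1.0, 1.5), (1.0, 0.5)]
```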