A General Control Architecture for Visual Servoing and Physical Interaction Tasks for Fully-actuated Aerial Vehicles

1. Description

The objective of this work is to develop a control architecture that allows fully-actuated multi-rotor aerial vehicles (MRAVs) to physically interact with the environment.

The scenario designed to validate this control architecture is a pick-and-place application. In this setting, a fully-actuated hexa-rotor has to scan the surrounding area to search for bricks to be collected. The bricks are detected by means of an onboard monocular camera and ArUco tags placed on the bricks. Once a brick has been collected, the robot scans the area to search for the placing location, which is marked with a different tag. The robot is equipped with a custom gripper that enables it to collect the detected bricks; it consists of an electromagnet controlled by means of an Arduino board.
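
To make the detection step concrete, the following is a minimal sketch of ArUco-based tag detection with OpenCV (contrib "aruco" module, pre-4.7 API); the dictionary, camera index, and red-dot visualization are illustrative assumptions, not the actual onboard vision module.

    // Minimal sketch: detect ArUco tags in the camera stream and mark each
    // detected tag with a red dot (dictionary, camera index and drawing assumed).
    #include <vector>
    #include <opencv2/opencv.hpp>
    #include <opencv2/aruco.hpp>

    int main() {
      cv::VideoCapture cap(0);  // onboard monocular camera (index assumed)
      cv::Ptr<cv::aruco::Dictionary> dict =
          cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);
      cv::Mat frame;
      while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);
        for (const auto& c : corners) {
          // One red dot per detected tag, as displayed in the onboard camera frames.
          cv::Point2f center = (c[0] + c[1] + c[2] + c[3]) * 0.25f;
          cv::circle(frame, center, 4, cv::Scalar(0, 0, 255), cv::FILLED);
        }
        cv::imshow("detections", frame);
        if (cv::waitKey(1) == 27) break;  // Esc to quit
      }
      return 0;
    }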

Contact with the environment, either with the brick to be collected or with the placing surface, is detected by means of onboard measurements, which are fed to a wrench observer. This algorithm computes an estimate of the external wrench (forces and torques) applied to the aerial robot by the environment.
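
For illustration, the sketch below shows the translational part of a standard momentum-based external force observer; the Eigen-based interface, the constant gain, and the fixed time step are assumptions, not the observer actually running on the robot.

    // Minimal sketch of a momentum-based external force observer
    // (translational part only), under assumed notation and interface.
    #include <Eigen/Dense>

    class ForceObserver {
    public:
      ForceObserver(double mass, double gain, double dt)
          : m_(mass), K_(gain), dt_(dt) {}

      // v: linear velocity in the world frame, R: body-to-world rotation,
      // f_cmd: total commanded thrust expressed in the body frame.
      const Eigen::Vector3d& update(const Eigen::Vector3d& v,
                                    const Eigen::Matrix3d& R,
                                    const Eigen::Vector3d& f_cmd) {
        const Eigen::Vector3d gravity(0.0, 0.0, -9.81 * m_);
        // Integrate the known forces plus the current estimate ...
        integral_ += (R * f_cmd + gravity + f_ext_) * dt_;
        // ... and compare with the measured linear momentum: the mismatch is
        // attributed to the external force applied by the environment.
        f_ext_ = K_ * (m_ * v - integral_);
        return f_ext_;
      }

    private:
      double m_, K_, dt_;
      Eigen::Vector3d integral_ = Eigen::Vector3d::Zero();
      Eigen::Vector3d f_ext_ = Eigen::Vector3d::Zero();
    };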

2. Simulations

Simulations are performed within the Gazebo simulator.

In the following, a screenshot of a simulation is provided.

Figure 1. Snapshot of the simulation environment in Gazebo.

3. Experiments

The experiments were conducted in the indoor arena available at LAAS-CNRS (Toulouse, France), a picture of which is provided in the following.

Figure 2. Picture of the indoor arena available at LAAS-CNRS and experimental setup.

The figure also shows the initial experimental setup for the validation campaign.

The robot used for the experimental campaign is a custom-designed fully-actuated hexa-rotor shown in the figure below.

Figure 3. Picture of the fully-actuated hexa-rotor designed and built at LAAS-CNRS.

The robot features a down-facing camera, namely an Intel RealSense T265 module. It is equipped with an onboard computer running all the software components (state machine, control architecture, and vision modules). Six motor-propeller actuation units are placed symmetrically with respect to the main body. Their particularity is that they are not collinear, which lends the full-actuation property to the system.
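
As a sketch of why the non-collinear propellers make the platform fully actuated (with assumed, standard notation): stacking the force and torque contributions of the six tilted propellers gives a square allocation matrix mapping the six thrust intensities to the 6D body wrench, and if the tilt angles make this matrix full rank, forces and torques can be commanded independently in all six directions.

    % Assumed notation: z_i and p_i are the thrust direction and position of
    % propeller i in the body frame, k_i its drag-to-thrust ratio, c_i = +/-1
    % its spinning direction, and u_i its thrust intensity.
    \begin{equation*}
      \begin{bmatrix} f \\ \tau \end{bmatrix}
      =
      \underbrace{\begin{bmatrix}
        z_1 & \cdots & z_6 \\
        p_1 \times z_1 + c_1 k_1 z_1 & \cdots & p_6 \times z_6 + c_6 k_6 z_6
      \end{bmatrix}}_{G \in \mathbb{R}^{6 \times 6}}
      \begin{bmatrix} u_1 \\ \vdots \\ u_6 \end{bmatrix},
      \qquad \operatorname{rank}(G) = 6 .
    \end{equation*}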

The figure below shows a snapshot taken during the experiments, while the robot passes over two bricks. As can be seen from the frame acquired by the camera, reported in the bottom right, the bricks are detected by the onboard camera: a red dot marker is displayed for each detected tag.

A red dot marker is also present for the detected ArUco tag of the placing location, represented by the white plate in the image.

Figure 4. Picture of the experiments. On the bottom right, a frame from the onboard camera is provided.

4. Technical details

4.1. Control architecture

Figure 5. Block diagram of the developed control architecture.

The figure above shows a block diagram of the robot control architecture. A Hybrid Visual Servoing module generates the desired trajectory that drives the robot towards the goal (the position of a collectable brick or the placing location). This trajectory is filtered by an Admittance Filter, which relies on the estimate of the external wrench applied to the robot provided by the Wrench Observer. The filtered reference trajectory is the one that allows the robot to be compliant with the applied external forces and torques. It is then tracked by a Geometric Controller, which generates the motor commands sent to the robot actuators.
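
As an illustration of the Admittance Filter block, the following is a minimal sketch of a translational admittance filter with Euler integration; the Eigen-based interface and the virtual inertia, damping, and stiffness values are placeholders, not the actual implementation.

    // Minimal sketch: make the reference trajectory compliant by enforcing
    // M (a_r - a_d) + D (v_r - v_d) + K (x_r - x_d) = f_ext (assumed gains).
    #include <Eigen/Dense>

    struct AdmittanceFilter {
      Eigen::Matrix3d M = Eigen::Matrix3d::Identity();        // virtual inertia
      Eigen::Matrix3d D = 5.0 * Eigen::Matrix3d::Identity();  // virtual damping
      Eigen::Matrix3d K = 10.0 * Eigen::Matrix3d::Identity(); // virtual stiffness
      Eigen::Vector3d x_r = Eigen::Vector3d::Zero();  // filtered reference position
      Eigen::Vector3d v_r = Eigen::Vector3d::Zero();  // filtered reference velocity

      // x_d, v_d, a_d: desired trajectory from the visual servoing;
      // f_ext: external force from the wrench observer; dt: time step.
      void update(const Eigen::Vector3d& x_d, const Eigen::Vector3d& v_d,
                  const Eigen::Vector3d& a_d, const Eigen::Vector3d& f_ext,
                  double dt) {
        const Eigen::Vector3d a_r =
            a_d + M.inverse() * (f_ext - D * (v_r - v_d) - K * (x_r - x_d));
        v_r += a_r * dt;
        x_r += v_r * dt;
      }
    };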

4.2. Software implementation

We rely on the Telekyb3 architecture for the robot communication, and on GenOM3 and pocolibs to implement and run each module on the onboard computer mounted on the robot. Each module is implemented in C/C++. A state machine controlling the robot and the execution of the task is realized in TCL. A screenshot of this state machine running is provided in the figure below.

Figure 6. Screenshot of the state machine running in simulation along with the Gazebo simulator.

This figure also shows the Gazebo simulator running, which is used as the simulation environment.

4.3. Hardware

4.3.1. Gripper device

The gripper device is composed of:

  • an electromagnet

  • an H-bridge to drive the electromagnet

  • a 24V step-up DC/DC converter

  • an Arduino board to control the H-bridge

In the following, a picture of the actual gripper is shown alongside the driving circuitry.

Figure 7. Picture of the gripper device.

In the following figure, a schematic detailing the interconnections between the different components is provided.

Figure 8. Schematic of the electronic components adopted in the gripper device.

Two digital pins of the Arduino board command the gates of the MOSFETs in the H-bridge, which in turn drives the electromagnet. In this way, it is possible to drive the electromagnet between +24V and -24V. The electromagnet was empirically found to exert a larger attracting force at -24V, while being neutralized at +24V, since in its resting configuration (i.e. 0V) it still exerts a light pulling force. The step-up DC/DC converter is used to stabilize the voltage powering the H-bridge and was chosen so as to provide enough current (power). The Arduino board is connected to the PC through USB.
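
A minimal Arduino sketch of this driving logic is given below; the pin numbers, the single-character serial protocol, and the polarity mapping are illustrative assumptions, not the actual firmware.

    // Minimal sketch: drive the two H-bridge gates from two digital pins and
    // switch polarity on single-character commands received over USB serial.
    const int PIN_A = 5;  // gate of one H-bridge half (pin number assumed)
    const int PIN_B = 6;  // gate of the other half (pin number assumed)

    void setMagnet(int state) {
      // state > 0: attract, state < 0: neutralize, state == 0: idle (0V).
      digitalWrite(PIN_A, state > 0 ? HIGH : LOW);
      digitalWrite(PIN_B, state < 0 ? HIGH : LOW);
    }

    void setup() {
      pinMode(PIN_A, OUTPUT);
      pinMode(PIN_B, OUTPUT);
      setMagnet(0);
      Serial.begin(115200);  // commands arrive over the USB link from the PC
    }

    void loop() {
      if (Serial.available() > 0) {
        char c = Serial.read();
        if (c == 'G') setMagnet(+1);        // grasp a brick
        else if (c == 'R') setMagnet(-1);   // release / neutralize
        else if (c == '0') setMagnet(0);    // back to idle
      }
    }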

Title:

A General Control Architecture for Visual Servoing and Physical Interaction Tasks for Fully-actuated Aerial Vehicles

Authors:

Gianluca Corsini[1], Martin Jacquet[1], Antonio Enrique Jimenez-Cano[1], Amr Afifi[2], Daniel Sidobre[1], Antonio Franchi[2][1]

Cite as:

G. Corsini, M. Jacquet, A. E. Jimenez-Cano, A. Afifi, D. Sidobre, A. Franchi, "A General Control Architecture for Visual Servoing and Physical Interaction Tasks for Fully-actuated Aerial Vehicles", in The 1st AIRPHARO Workshop on Aerial Robotic Systems Physically Interacting with the Environment, Oct 2021, Biograd na Moru, Croatia. doi: 10.1109/AIRPHARO52252.2021.9571053.

Keywords:

Aerial Robotics; Multi-rotor aerial vehicles; Physical interaction; Fully-actuation; Hybrid Visual Servoing (HVS); Wrench observer; Admittance filter

5.2. Released software


1. LAAS-RIS - Équipe Robotique et InteractionS.
2. EEMCS - Faculty of Electrical Engineering, Mathematics and Computer Science.