Firefly Eindhoven - Prototype and TMC Demo

From Control Systems Technology Group


Revision as of 14:08, 20 May 2018

Contents

Introduction

Motivation

Duarte noted that "physics" should be mentioned.

Initial concept

  • Synchronization with music
  • LEDs omitted
  • 5 segments in the trajectory

Video

LED Animation

  • How to design it from scratch (a new animation)
  • How was it programmed?

TMC event

To make the TMC event a success, the drone needed to fly in a different environment than the soccer field on which it had been tested. To verify that the drone would fly correctly, the UWB system was set up in the cage in which the drone was going to fly, and the drone was updated with the correct distances between the beacons. After uploading the right parameters, the drone was moved by hand through the field to find the boundaries within which it could fly without risking a crash into the net. During this, a close eye was kept on the position calculated on the drone, to check for spikes or jumps that might cause problems during the flight. The trajectory was then scaled and shifted to the cage's position and dimensions. When testing the drone, it was noticed that the z position oscillated heavily. This led us to suspect that the floor was reflecting too strongly, so it was decided to place the soccer field floor on top of the cage floor to get rid of the light reflections.
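Scaling and shifting the trajectory to the cage can be sketched as a linear map between two bounding boxes. The waypoint format, field and cage dimensions, and the safety margin below are illustrative assumptions, not the team's actual values:

```python
# Hypothetical sketch: linearly remap (x, y, z) waypoints from the soccer
# field bounding box onto the (smaller) TMC cage bounding box.
# All dimensions below are made up for illustration.

def rescale_trajectory(waypoints, src_min, src_max, dst_min, dst_max):
    """Map each waypoint from the source box (field) to the destination
    box (cage) with a per-axis scale and shift."""
    scaled = []
    for point in waypoints:
        new_point = tuple(
            dst_min[i] + (p - src_min[i]) / (src_max[i] - src_min[i])
                       * (dst_max[i] - dst_min[i])
            for i, p in enumerate(point)
        )
        scaled.append(new_point)
    return scaled

# Example: map the corners of a 12 m x 9 m field onto a 4 m x 3 m cage
# area, keeping a 0.5 m margin from the net on each side.
field_traj = [(0.0, 0.0, 1.0), (12.0, 9.0, 1.0)]
cage_traj = rescale_trajectory(
    field_traj,
    src_min=(0.0, 0.0, 0.0), src_max=(12.0, 9.0, 2.0),
    dst_min=(0.5, 0.5, 0.0), dst_max=(4.5, 3.5, 2.0),
)
```

Because the map is applied per axis, the margins found by walking the drone along the net translate directly into the destination box limits.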

The second day of testing was used to get the synchronization between the drone, the music and the visualization working. To make sure the music and visualization started at the same time as the drone show, the value of the switch that turns on trajectory following on the drone was sent to the laptop over the ZigBee link. When this value changed, the Simulink model sent a UDP signal to a second MATLAB instance that managed the music. With this hierarchy, the time delays were all very similar, so no further synchronization between the drone and the simulation was needed.
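The start-trigger hierarchy described above can be sketched as a small UDP sender: when the trajectory-following switch value received over ZigBee changes, a datagram is fired at the process managing the music and visualization. The port number and message contents below are assumptions for illustration, not the team's actual protocol:

```python
# Sketch of the start trigger: forward the drone's trajectory-following
# switch state as a UDP datagram to the (hypothetical) music process,
# so music and visualization start together with the show.
import socket

MUSIC_PORT = 5005  # hypothetical port of the MATLAB music instance

def on_switch_change(new_value, port=MUSIC_PORT):
    """Send "START" when the switch turns on, "STOP" when it turns off."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"START" if new_value else b"STOP"
    sock.sendto(payload, ("127.0.0.1", port))
    sock.close()
    return payload
```

Since UDP is connectionless and the receiver runs on the same laptop, the one-way delay is small and roughly equal for music and visualization, which is why no further synchronization was needed.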

[[File:TMCFlow.png|thumb|right|400px|Vectors pointing in the φ = 0 direction and in the direction of the drone. LEDs 1 and 2 are the top LEDs.]]

Using these architectures, it was possible to get the drone flying and visualized, with take-off and landing as the only human interaction. During the final tests, it was noticed that the drone oscillated heavily during the part of the show where it had to hover in place. Tests and checks traced the problem to the ultra-wideband positioning: the cage in which the drone had to fly had been moved slightly by the TMC organizers, and in the process the beacons had shifted a few centimeters with respect to each other. This made the UWB position estimates inaccurate by up to tens of centimeters. The new inter-beacon distances were therefore determined and uploaded, after which the drone flew well again.
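Recomputing the inter-beacon distances after the cage moved amounts to taking pairwise Euclidean distances between the re-measured beacon positions. The coordinates below are illustrative; in practice the positions would be re-measured in the cage:

```python
# Sketch: recompute the distances between every pair of UWB beacons from
# their (re-measured) 3D coordinates, so the values uploaded to the drone
# match reality again. Beacon layout below is a made-up 4 m x 3 m rectangle.
import math

def beacon_distances(beacons):
    """Return the Euclidean distance between every pair of beacons,
    keyed by the pair of beacon indices."""
    dists = {}
    for i in range(len(beacons)):
        for j in range(i + 1, len(beacons)):
            dists[(i, j)] = math.dist(beacons[i], beacons[j])
    return dists

beacons = [(0.0, 0.0, 2.0), (4.0, 0.0, 2.0), (4.0, 3.0, 2.0), (0.0, 3.0, 2.0)]
d = beacon_distances(beacons)
```

A shift of a few centimeters in one beacon changes several of these pairwise distances at once, which is consistent with the position estimate degrading by tens of centimeters until the table was refreshed.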

During the event, the drone once failed to start its landing sequence and had to be shut down manually. This bug could not be replicated, so no cause has been found and nothing has been changed. Since it never happened again, it is assumed to be a rare deadlock in the code that still has to be found.

Thanks to the good preparation and hard work of the team, the TMC event was a great success, and the team took first place in the TMC competition, where it competed against other technological ideas for startup companies.
