PRE2015 3 Groep2 week5

Approaching Users

Experiment 1: Landing distance

Figure 1: Picture of the first experiment for determining the landing distance. Strips of tape on the ground mark the distance in steps of 0.5 m.

The variable "landing distance" concerns the distance at which users are still comfortable with the drone around. The optimal distance that users prefer and the nearest distance at which people still feel comfortable with the drone are determined with an experiment. The subject (a user) stands on a given spot (l = 0). The distances 0.5, 1, 1.5 … 7 meters from the test subject are marked on the ground with masking tape. The drone starts at a distance of 7 meters (= lstart), as seen in figure 1, and approaches the person at a steady speed of approximately v = 1 m/s at a height of h = 1 meter. Whenever the test subject feels that the current distance between him and the drone is the most comfortable distance for the drone to land, the test subject gives a sign and the drone is ordered to land (lend). The subject then redoes the test to determine the nearest distance at which he or she still feels comfortable. These distances are measured and rounded to the nearest 0.25 m. The results are shown below.

Experiment   Optimal distance (m)   Nearest distance (m)
1            2.25                   1.0
2            2.75                   0.75
3            2.5                    1.0
4            2.25                   0.75
5            2.0                    0.75
6            1.75                   0.5
7            3.5                    2.0
8            3.5                    1.75
9            1.75                   0.5


The mean value of the optimal distance is 2.47 m with a standard deviation of 0.67. The nearest distance has a mean of 1.00 m with a standard deviation of 0.53. These means give the landing distances resulting from this experiment: the optimal landing distance is 2.5 m and the nearest landing distance is 1.0 m. The drone should be programmed to keep these distances as the first option and as the starting point of the landing procedure.
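
These values can be reproduced directly from the table; the Python sketch below is a minimal check, assuming the reported numbers are sample (n-1) standard deviations, and it also illustrates the 0.25 m rounding applied to the raw measurements.

  # Check of the experiment 1 statistics from the measured distances above.
  from statistics import mean, stdev

  def round_to_quarter(distance_m):
      # Raw readings are rounded to the nearest 0.25 m, as described in the procedure.
      return round(distance_m * 4) / 4

  optimal = [2.25, 2.75, 2.5, 2.25, 2.0, 1.75, 3.5, 3.5, 1.75]
  nearest = [1.0, 0.75, 1.0, 0.75, 0.75, 0.5, 2.0, 1.75, 0.5]

  print(round_to_quarter(2.3))                              # a raw reading of 2.3 m becomes 2.25 m
  print(round(mean(optimal), 2), round(stdev(optimal), 2))  # 2.47 0.67
  print(round(mean(nearest), 2), round(stdev(nearest), 2))  # 1.0 0.53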

Points of improvement

Though the experiment led to a clear conclusion, apart from a few outliers, some variables that might influence the results have not been touched upon. The experiment above gives a general idea of the distance to keep from users, but it does, for instance, not distinguish between different users. These differences may for example concern:

  • Age
  • Sex
  • Experience (with drones)
  • Height of the user
  • General (in)security or character of the user

The drone itself can also influence the results:

  • Approach speed
  • Size of the drone
  • Appearance of the drone
  • Noise
  • Wind generated by the drone

Further research will have to show to what extent these factors play a role when approaching users with drones.

Experiment 2: Way of approach

Figure 2: Schematic representation of the experiment setup. Situation A, displayed in red. Situation B, displayed in green. Situation C, displayed in blue.

It is not only interesting to look at the best landing distance, but also at the way the drone approaches the user. For this, a distinction is made between three different situations. For a description of these situations, see the list below and figure 2. In all these situations the test person is positioned at l = 0 m. The drone starts at a distance lstart and height hstart.

Situation A
The drone flies horizontally to a certain distance lend then the drone lands vertically.
Situation B
The drone flies diagonally, at an angle α, to a certain point at distance lend and height hend. Then the drone lands vertically.
Situation C
The drone lowers itself vertically to a certain height hend. It then flies horizontally to a certain distance lend before it lands vertically on the ground.

For the distance lstart, 8 m is chosen. The ending distance lend is chosen at roughly 2.5 meters, according to the results of experiment 1.
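
As a rough illustration of the three approaches, the sketch below generates the waypoints of situations A, B and C. The values of lstart and lend follow the text; hstart, hend and the (distance, height) waypoint representation are assumed for illustration only.

  # Sketch of the three approach trajectories as waypoint lists (x = distance to user, z = height).
  from math import atan2, degrees

  L_START, L_END = 8.0, 2.5   # m, from the text
  H_START, H_END = 2.0, 1.0   # m, assumed example values

  # Situation A: horizontal flight at H_START, then a vertical landing at L_END.
  path_a = [(L_START, H_START), (L_END, H_START), (L_END, 0.0)]

  # Situation B: diagonal flight at angle alpha down to (L_END, H_END), then a vertical landing.
  path_b = [(L_START, H_START), (L_END, H_END), (L_END, 0.0)]
  alpha = degrees(atan2(H_START - H_END, L_START - L_END))  # descent angle of the diagonal leg

  # Situation C: vertical descent to H_END, horizontal flight to L_END, then a vertical landing.
  path_c = [(L_START, H_START), (L_START, H_END), (L_END, H_END), (L_END, 0.0)]

  print(round(alpha, 1))  # about 10.3 degrees for these example values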


After each test variation, the test person is asked to rate the experience on the scale very bad / bad / neutral / good / very good.

Experiment Very bad Bad Neutral Good Very good
1A X
1B X
1C X
2A X
2B X
2C X

Autonomous landing

Test setup

A major problem during the autonomous landing phase is that the drone needs to know where it is on the map it is creating. It is not safe to assume that the drone flies at a constant speed, since it does not; doing so would introduce a position error that can lead to a complete failure of the landing.

Since the test setup has been moved indoors, the situation created will not be realistic, but it will be very controllable. The location of the drone will be determined with markers, placed on the ground or on objects, which the drone can detect with its downward-facing camera. These markers will be placed 50 cm apart from their neighboring markers.
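
A minimal sketch of how such markers could be turned into a position estimate is given below. The 0.5 m marker spacing comes from the text; the pixel-to-metre conversion of the downward-facing camera and the function name are assumptions for illustration.

  # Sketch of marker-based position estimation along the test track.
  MARKER_SPACING_M = 0.5  # markers are placed 50 cm apart (from the text)

  def estimate_position(marker_index, pixel_offset_x, metres_per_pixel=0.002):
      # marker_index: index of the marker detected below the drone (0 at the start of the track).
      # pixel_offset_x: horizontal offset of that marker from the image centre, in pixels.
      # metres_per_pixel: assumed ground resolution of the downward-facing camera.
      return marker_index * MARKER_SPACING_M - pixel_offset_x * metres_per_pixel

  # Example: marker 6 detected, 40 px behind the image centre -> roughly 2.92 m along the track.
  print(estimate_position(6, 40))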

In the first situation the system will be developed in a two-dimensional world. There will be height differences and the drone can move forwards and backwards; no movement to the left or right will be possible. If the drone succeeds in landing in such a randomized environment, the second situation will be looked at.

In the second situation a third dimension will be added: movement to the left and right. This will be a more realistic environment, but still far from the environment a drone will encounter in a real-world scenario.

In both situations there will only be objects that the drone can detect from the height it uses to scan the environment. Since the markers are used to determine the location of the drone, they will be placed on a level surface; this could be an object or the ground. There will also be enough space to land the drone in a user-friendly way.
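
To illustrate the requirement of enough space for a user-friendly landing, the sketch below checks a one-dimensional occupancy list (one cell per 0.5 m of track, matching the marker spacing) for free space around the preferred landing distance of 2.5 m from experiment 1. The 1.0 m clearance and the occupancy data are assumed example values.

  # Sketch of a landing-space check on a one-dimensional occupancy list.
  CELL_SIZE_M = 0.5  # each cell covers 0.5 m of the track; True means an obstacle

  def can_land(occupancy, landing_distance_m, clearance_m=1.0):
      # Return True if every cell within clearance_m of the landing spot is free.
      centre = int(round(landing_distance_m / CELL_SIZE_M))
      half = int(round(clearance_m / CELL_SIZE_M))
      return not any(occupancy[max(0, centre - half):centre + half + 1])

  # Example: an obstacle around 4 m; landing at the preferred 2.5 m is still possible.
  track = [False] * 16
  track[8] = True
  print(can_land(track, 2.5))   # True
  print(can_land(track, 3.5))   # False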