Embedded Motion Control 2013 Group 9


Group members

Name: Student id: Email:
Jeroen Lamers 0771264 j.w.lamers@student.tue.nl
Rens Samplonius 0785119 r.j.samplonius@student.tue.nl
Haico Ploegmakers 0775395 h.e.c.w.ploegmakers@student.tue.nl
Frank Evers 0789890 f.evers@student.tue.nl
Filipe Catarino 0821789 f.freire.catarino@student.tue.nl

Planning

Week 1 (Everybody)
  • Install Ubuntu / ROS / Gazebo / the QT editor
  • Start the tutorials for ROS / C++
  • Set up the VPN
Week 2 (Everybody)
  • Finish all tutorials
  • Start brainstorming
  • Get a feeling for the robot by running the simulations
  • Identify the sensor characteristics
Week 3
  • Create the architecture of nodes and functions (Frank & Filipe)
  • Write the specific functions for the Corridor Competition (Rens & Jeroen & Haico)
  • Get experience with PICO, first experiment (18/9) (Everybody)
Week 4 (Everybody)
  • Test the concept code on PICO (23/9)
  • Finalize the code for the PICO competition
  • Corridor Competition (25/9)
  • Decide all necessary functions for the maze
Week 5
  • Rewrite the function to drive straight
  • Write a function to determine the junction type
  • Write a function to take corners based on the corner type
Week 6 ...
Week 7 Not yet planned

Progress

Week 1: September 2 - September 8

The first week mainly consisted of installing the Ubuntu OS with all the required ROS/Gazebo software. This did not go smoothly, as there is no readily available Ubuntu installation for Mac (one member), and missing one tiny step in the tutorials could result in a non-working environment.

It was decided that everyone should do all the tutorials to know how everything works, and that we would work with the software provided by the course material (QT for C++ and ROS Fuerte).

Week 2: September 9 - September 15

During the second week everyone finished the tutorials for ROS and C++.

Robot Sensors

Researching the given example with the simulation environment shows that the laser data is acquired by scanning over a 270 degree range with increments of 0.25 degrees. This information is saved in a vector and can be used to obtain the distance for every angle at a certain time.
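
As a minimal sketch (the topic name /pico/laser and the use of the standard sensor_msgs::LaserScan message are assumptions for illustration, not taken from the course material), the scan can be read in a callback and an index converted back to its angle:

  #include <ros/ros.h>
  #include <sensor_msgs/LaserScan.h>

  // Keep the latest scan; each element of 'ranges' is the distance measured
  // at angle angle_min + index * angle_increment.
  sensor_msgs::LaserScan last_scan;

  void laserCallback(const sensor_msgs::LaserScan::ConstPtr& msg)
  {
      last_scan = *msg;
  }

  // Distance measured closest to the requested angle (radians, 0 = straight ahead).
  double distanceAtAngle(double angle)
  {
      if (last_scan.ranges.empty() || last_scan.angle_increment == 0.0)
          return -1.0;  // no scan received yet
      int index = (int)((angle - last_scan.angle_min) / last_scan.angle_increment);
      if (index < 0 || index >= (int)last_scan.ranges.size())
          return -1.0;  // outside the 270 degree field of view
      return last_scan.ranges[index];
  }

  int main(int argc, char** argv)
  {
      ros::init(argc, argv, "laser_example");
      ros::NodeHandle n;
      ros::Subscriber sub = n.subscribe("/pico/laser", 1, laserCallback);
      ros::spin();
      return 0;
  }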

Week 3: September 16 - September 22

The third week consisted of 2 parts:

  • Creating the architecture of the system by dividing all the necessary steps into functions.
  • Focus on the corridor competition

System functions & Architecture

The group was split into two teams, with two people focusing on the architecture of the system. By doing this early, it is clear how the system is going to be built. There is one main file that is able to call all the different functions that are required for the situation at hand. The focus, of course, is on the basic functions of the system for the corridor competition. These functions are:

  • Driving in a straight line in the middle of the corridor
  • Detecting the corner on either the left or the right side
  • Turn into the corner and leave the corridor
  • Stop outside the corridor


Corridor Competition concept

Because the contest dictates that the robot will start between the walls, facing along the corridor, the idea is that the robot drives forward through the corridor. Because the robot is placed there by human hand, and the walls might not be exactly parallel, there has to be some kind of control. By measuring the closest distance to the walls with the laser, the robot is able to calculate where the middle of the corridor is. The laser is also used to calculate the angle between the robot and the wall.
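
A sketch of how the closest wall point on each side could be found from the raw laser ranges (the struct and function names are illustrative, not the names used in our code, and we assume angles greater than zero point to the left of the robot):

  #include <vector>
  #include <limits>
  #include <cstddef>

  // Illustrative: find the closest measurement on one side of the robot.
  struct WallInfo
  {
      double distance;  // closest distance to the wall [m]
      double angle;     // angle at which that distance is measured [rad]
  };

  WallInfo closestWall(const std::vector<float>& ranges,
                       double angle_min, double angle_increment,
                       bool left_side)
  {
      WallInfo wall;
      wall.distance = std::numeric_limits<double>::max();
      wall.angle = 0.0;

      for (std::size_t i = 0; i < ranges.size(); ++i)
      {
          double angle = angle_min + i * angle_increment;
          // Skip beams pointing to the other side of the robot.
          if ((left_side && angle <= 0.0) || (!left_side && angle >= 0.0))
              continue;
          if (ranges[i] < wall.distance)
          {
              wall.distance = ranges[i];
              wall.angle = angle;
          }
      }
      return wall;
  }

When the robot is in the middle of the corridor the left and right distances are equal, and when it is aligned with the walls the closest beam on each side points at roughly 90 degrees, so the deviation from 90 degrees gives the orientation error.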

The concept program for the corridor competition is drawn in the following schematic:

Emc09 Corridor.png

To make this program possible, the system has a main loop with a switch statement that activates the function required at that moment. When, for example, the corner is detected, the loop switches to the corner function.
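
A minimal sketch of such a loop (the state names and behaviour functions below are assumptions used for illustration, not the names from our actual code):

  #include <ros/ros.h>

  // Illustrative states for the corridor competition.
  enum CorridorState { DRIVE_STRAIGHT, TAKE_CORNER, EXIT_CORRIDOR, STOPPED };

  CorridorState state = DRIVE_STRAIGHT;

  // Each behaviour reads the latest laser data, publishes a velocity command
  // and changes 'state' when its exit condition is met.
  void driveStraight() { /* steer to the middle; on corner: state = TAKE_CORNER;  */ }
  void takeCorner()    { /* turn into the exit;  when done: state = EXIT_CORRIDOR; */ }
  void exitCorridor()  { /* drive out and stop;  when done: state = STOPPED;       */ }

  int main(int argc, char** argv)
  {
      ros::init(argc, argv, "corridor_node");
      ros::NodeHandle n;
      ros::Rate rate(20);  // control loop at 20 Hz

      while (ros::ok() && state != STOPPED)
      {
          ros::spinOnce();  // process incoming sensor callbacks

          switch (state)
          {
              case DRIVE_STRAIGHT: driveStraight(); break;
              case TAKE_CORNER:    takeCorner();    break;
              case EXIT_CORRIDOR:  exitCorridor();  break;
              default: break;
          }
          rate.sleep();
      }
      return 0;
  }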

Driving Straight: To drive straight, the robot measures the closest distance to each wall and the corresponding angle. Using this information, the orientation of the robot and its position in the corridor are determined. Using a controller, the robot drives forward and corrects its position until it is moving forward in the middle of the corridor.
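
A minimal sketch of such a controller (the gains, forward speed and message type are placeholder assumptions):

  #include <geometry_msgs/Twist.h>

  // Illustrative proportional controller: keep the robot centred between the
  // walls and aligned with them. angle_error is the corridor direction minus
  // the robot's heading (zero when driving parallel to the walls).
  geometry_msgs::Twist straightCommand(double dist_left, double dist_right,
                                       double angle_error)
  {
      const double K_pos   = 1.0;  // gain on the lateral position error
      const double K_angle = 2.0;  // gain on the orientation error

      double lateral_error = dist_left - dist_right;  // zero when centred

      geometry_msgs::Twist cmd;
      cmd.linear.x  = 0.2;  // constant forward speed [m/s]
      cmd.angular.z = K_pos * lateral_error + K_angle * angle_error;
      return cmd;
  }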

Corner Detection: The corner is detected by monitoring the laser measurement at an angle of 90 degrees to the side. If there is a sudden change in that distance (an opening in the wall), the system switches to the corner function.
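
A sketch of this check (the jump threshold and the bookkeeping of the previous value are our own assumptions):

  #include <vector>
  #include <cmath>

  // Illustrative: watch the beam pointing 90 degrees to the left and report a
  // corner when its distance suddenly jumps compared to the previous scan.
  bool cornerDetected(const std::vector<float>& ranges,
                      double angle_min, double angle_increment,
                      double& previous_side_distance)
  {
      const double jump_threshold = 0.5;  // [m], to be tuned experimentally

      int index = (int)((M_PI / 2.0 - angle_min) / angle_increment);
      double side_distance = ranges[index];

      bool corner = (side_distance - previous_side_distance) > jump_threshold;
      previous_side_distance = side_distance;
      return corner;
  }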

Turn into Corner: After detecting the corner, the robot turns while keeping the corner at the same distance. After this it drives straight until the walls disappear, and then stops.
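
Purely as an illustration of keeping the corner at a constant distance while turning (the gain and forward speed are placeholders), the turn could be commanded like this:

  #include <geometry_msgs/Twist.h>

  // Illustrative: turn around a corner point on the left while keeping the
  // measured distance to it equal to the distance at which it was detected.
  geometry_msgs::Twist cornerCommand(double corner_distance,
                                     double desired_distance)
  {
      const double K_dist = 1.5;  // gain on the distance error

      geometry_msgs::Twist cmd;
      cmd.linear.x  = 0.2;  // keep moving forward [m/s]
      // Too far from the corner -> steer towards it (turn left, positive z),
      // too close -> steer away from it.
      cmd.angular.z = K_dist * (corner_distance - desired_distance);
      return cmd;
  }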

Week 4: September 23 - September 29

Test on the Robot

After we created a program for the Corridor Competition, the simulation showed good results. By running the program with the robot in various positions in the corridor, we concluded that the program was functioning properly. After that came the testing on the robot. There were a lot of problems with the robot, so we were not able to do a normal test until one day before the competition.

In the end the tests on the robot were successful. The robot was able to drive straight through the corridor, although due to hysteresis, friction and the low controller gains it drove in a stretched S-shape, like a snake. The corner, however, was very smooth.

Emc09 controller.jpg

The Competition

After some problems where the robot would turn right after initialization, the competition was a success on the second try. Group 9 set the third-fastest time of the four groups that were able to finish the competition.

Youtube corri.png


Strategy

After the competition and a brainstorm session, we came to the conclusion that this method is only effective for straight corridors with a single exit. Looking at the maze that is provided with the simulation, it is clear that there are a lot of different types of situations. Also, decisions on where to go at certain junctions are not supported by the current code. Therefore the strategy should include 'mapping'. This way there is a way to detect the path in front of the robot while also saving everything that was discovered before. This should prevent driving into already explored parts.

Week 5: September 30 - October 6

This week we started with a totally new design for the PICO software architecture. For the corridor competition we only focused on a particular task: driving straight and taking one corner. In the real maze more functions and software features will be required. We started by separating the existing code into even more individual functions. We split our group into three subgroups:

  • Frank and Filipe look into analyzing the laser data and come up with a matrix of “edges” from which lines can be determined.
  • Rens and Jeroen will use this information to determine a trajectory through the junctions based on “nodes”.
  • Haico will look into the arrow detection with the camera.

Finding 'Hard Points'

The way to find corners is based on scanning the entire range of the laser. From the position of the robot, a corner can be detected by finding a sudden change in the distance between neighbouring measurement directions. Using this theory, there are two types of corners: one where the middle is closest and the distance grows from this point (a corner to turn around), and one that is the opposite of this (a corner to turn before). This way the robot can drive to a junction, determine the type, and then use this information to take the specific corner.
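
A sketch of how such jumps could be extracted from a single scan (the threshold and the classification rule are our own assumptions):

  #include <vector>
  #include <cmath>
  #include <cstddef>

  // Illustrative 'hard point' detection: walk through consecutive beams and
  // record every position where the measured distance jumps.
  enum CornerType { TURN_AROUND, TURN_BEFORE };

  struct HardPoint
  {
      std::size_t index;  // beam index at which the jump occurs
      CornerType  type;
  };

  std::vector<HardPoint> findHardPoints(const std::vector<float>& ranges)
  {
      const double jump_threshold = 0.3;  // [m], to be tuned experimentally
      std::vector<HardPoint> points;

      for (std::size_t i = 1; i < ranges.size(); ++i)
      {
          double diff = ranges[i] - ranges[i - 1];
          if (std::fabs(diff) > jump_threshold)
          {
              HardPoint p;
              p.index = i;
              // Distance grows away from the nearer point: a corner to turn
              // around; distance suddenly drops: a corner to turn before.
              p.type = (diff > 0.0) ? TURN_AROUND : TURN_BEFORE;
              points.push_back(p);
          }
      }
      return points;
  }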

Determine Trajectory in Junctions

After the 'hard points' are known, the junction type can be determined. This gives us the possibility to pick one of the directions and also drive through it. Because there are only so many possible junctions, it is possible to prepare for all of them and determine the trajectory.
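
Because the set of possible junctions is limited, they can simply be enumerated; a possible (purely illustrative) set of cases to prepare for:

  // Illustrative junction types; the actual classification used in our code
  // may differ.
  enum JunctionType
  {
      STRAIGHT,      // plain corridor, no openings
      CORNER_LEFT,   // single exit to the left
      CORNER_RIGHT,  // single exit to the right
      T_JUNCTION,    // exits to the left and right
      CROSSING,      // exits to the left, right and straight ahead
      DEAD_END       // no exit at all
  };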

Vision

Haico looked for a solution to detect the arrows with the camera.

Week 6: October 7 - October 13