Mobile Robot Control 2021 Group 3


Group Members

Students

Name                  Student ID   Email
Cecile Conrad         1016008      c.conrad@student.tue.nl
Lars Hagendoorn       1013526      l.s.hagendoorn@student.tue.nl
Ward Klip             1015501      w.m.klip@student.tue.nl
Sjoerd Leemrijse      1009082      s.j.leemrijse@student.tue.nl
Eduard-Istvan Szucs   1536419      e.szucs@student.tue.nl
Matthijs Teurlings    1504274      m.h.b.teurlings@student.tue.nl

Tutor

Jordy Senden
j.p.f.senden@tue.nl


Task Division

To show which task each group member is, or was, working on, we made an Excel table in which every student records their contribution to each task. Please note that this table is updated regularly.

The current state of the table can be viewed, or updated by group members, via the following link: [Task division.]

Everyone in this group has editing rights to the Excel sheet under the link above; everyone else has viewing rights only.

Design Document

To give an overview of the project, a design document was created, which can be found here. It describes the requirements, the components and their specifications, and the functions and interfaces. Its content is used to better understand what needs to be done in the Escape Room Challenge and the Hospital Challenge.


Escape Room Challenge

Introduction

In this year's version of the course, we use a simulation that reproduces the behavior of the real PICO robot.

The first major milestone of this course is the Escape Room Challenge, in which we are faced with the task of driving our robot out of a square room through a corridor. The walls of this environment are not perfectly straight and its corners are not perfectly perpendicular. This requires a robust design of our code, so that the robot still operates successfully if it encounters, for example, a slightly different corner angle. To achieve this goal, we can use the data coming from the laser scanner as well as the data coming from the encoders attached to the wheels of the robot.

We are given two trials to complete the challenge. A trial ends if our robot does at least one of the following:

  • Bumps into a wall (a slight touch may be tolerated if the judges consider it acceptable);
  • Does not move or perform any action for 30 seconds;
  • Spends more than 5 minutes in total in the room.

The challenge is completed if the robot does not bump into any wall, respects the time limits, and crosses the finish line, which is placed at least 3 m into the corridor, with its entire rear wheel.

The robot's behavior

The figure below shows the state machine on which the robot's behavior for the Escape Room Challenge is based.

The state machine of the robot in the Escape Room Challenge.

The robot starts by moving forward until it finds a wall closer than a certain threshold distance. Once a wall is found, the robot turns to the left while correcting its orientation relative to the wall: if the detected wall is not perpendicular to the robot's front direction, the turn is adjusted accordingly. After turning left, the robot keeps a constant distance to the wall on its right side and moves forward until it encounters another wall. When faced with such an inner corner, it turns left again, and the algorithm repeats.

If the distance to the wall on the right suddenly jumps to a higher value, the algorithm detects this "jump" and turns the robot to the right instead, since this indicates the opening of the corridor. Once the robot enters the corridor, it again keeps a constant distance to the wall on its right side while driving forward, and in this way it crosses the finish line further down the corridor.
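
To make this behavior more concrete, the sketch below shows one possible way of expressing the state transitions described above as a simple switch-based function. It is only an illustration: the state and flag names (DriveForward, wallAhead, rightDistanceJump, and so on) are hypothetical placeholders, not the identifiers used in our actual code.

// Minimal sketch of the Escape Room state machine described above.
// The state and flag names are illustrative placeholders, not the
// identifiers used in our actual code.
#include <iostream>

enum class State { DriveForward, TurnLeft, FollowRightWall, TurnRightIntoCorridor };

struct Sensing {
    bool wallAhead;          // a wall is closer than the stop threshold (0.35 m)
    bool rightDistanceJump;  // the distance to the right wall suddenly increases
};

State nextState(State current, const Sensing& s) {
    switch (current) {
        case State::DriveForward:
            // Drive straight until a wall comes within the stop threshold.
            return s.wallAhead ? State::TurnLeft : State::DriveForward;
        case State::TurnLeft:
            // After turning left (corrected for the wall orientation),
            // start following the wall that is now on the right.
            return State::FollowRightWall;
        case State::FollowRightWall:
            // A sudden jump in the right-hand distance indicates the corridor;
            // an inner corner ahead means another left turn.
            if (s.rightDistanceJump) return State::TurnRightIntoCorridor;
            if (s.wallAhead)         return State::TurnLeft;
            return State::FollowRightWall;
        case State::TurnRightIntoCorridor:
            // Once inside the corridor, keep following the right wall
            // until the finish line is crossed.
            return State::FollowRightWall;
    }
    return current;
}

int main() {
    State s = nextState(State::DriveForward, {true, false});
    std::cout << "Next state: " << static_cast<int>(s) << std::endl;  // prints 1 (TurnLeft)
    return 0;
}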

In the following section, we describe the code. As mentioned before, we use only the data from the laser scanner and the odometry, so the code description is split accordingly into two separate parts.

Software description

Based on the laser scanner
The laser scanner range (not to scale).

Before going into the software description of the laser scanner part, the image on the right shows the range of the laser scanner. The minimum angle at which the robot can sense is -2 radians (approximately -114.6 degrees) and the maximum angle is 2 radians (114.6 degrees), measured with respect to the x-direction, i.e. the front direction of the robot. This gives a total range of 4 radians, or 229.2 degrees, in which the sensor collects data from the environment.

Another key aspect is the scanning range in the radial direction, which is bounded between a minimum of 0.1 m and a maximum of 10 m.

The angular difference between two consecutive laser rays is given by the angle_increment variable, which is approximately 0.004 radians. This value is used later to calculate the angle at which the robot has to rotate, based on the index of the laser ray from which a measurement originates.
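
To make the relation between a laser-ray index and its angle explicit, the following small sketch shows how such an angle can be computed from the quantities above. The variable names and the example index are illustrative assumptions, not names taken from the simulator's interface.

// Sketch of how the angle of a single laser ray can be obtained from the
// scanner parameters described above. The constants and names are illustrative;
// in practice they come from the laser data provided by the framework.
#include <cstdio>

int main() {
    const double angle_min       = -2.0;    // [rad], right-most ray
    const double angle_increment =  0.004;  // [rad], spacing between rays (approx.)

    // Angle of the i-th ray, measured from the robot's forward (x) direction.
    const int i = 500;                                     // example ray index
    const double angle_i = angle_min + i * angle_increment;

    std::printf("Ray %d points at %.3f rad\n", i, angle_i);  // ~0.0 rad: straight ahead
    return 0;
}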

We now describe the software choices and the steps taken to accomplish the goal of the challenge:

  • We start by creating a struct whose members are two double variables, called distance and angle, and a boolean variable called found. The use of each member is explained further on in this code description.
  • We then create a class called LRFT that initializes the data collected by the laser scanner. Within this class, two boolean functions are defined: available(), which checks whether data is coming from the laser, and Wall_in_stop_radius, which checks whether the robot is positioned too close to a wall, i.e. closer than a defined threshold of 0.35 m.
  • Another function is called Closest_to_wall. Here, the members of the struct mentioned above are filled in as follows. The member distance returns the distance sensed by the laser scanner, provided it lies between the minimum and maximum range; this prevents the use of irrelevant data that appears when objects are out of range and the values carry no useful information. The member angle returns the angle of the direction towards the sensed point, computed from the minimum angle, the angle increment, and the index at which the measurement was collected. The member found returns whether or not a useful value was found within the range covered by the scanner. A simplified sketch of this struct and the LRFT class is given after this list.
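
As a reference for the description above, the sketch below shows one possible shape of this struct and the LRFT class. It is a simplified stand-in: the laser data is represented by a plain vector of ranges plus the scan parameters discussed earlier, and every name not mentioned in the text (such as ClosestPoint) is a hypothetical placeholder rather than part of our actual implementation.

// Simplified sketch of the struct and the LRFT class described above.
// The laser data is represented here by a plain std::vector of ranges plus the
// scan parameters; in the real code this comes from the robot's IO layer, so
// everything except the names mentioned in the text is an illustrative assumption.
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

struct ClosestPoint {          // hypothetical name for the struct described above
    double distance = 0.0;     // distance to the closest sensed point [m]
    double angle    = 0.0;     // angle of that point w.r.t. the robot's front [rad]
    bool   found    = false;   // true if a valid measurement was found
};

class LRFT {
public:
    LRFT(std::vector<float> ranges, double angle_min, double angle_increment,
         double range_min, double range_max)
        : ranges_(std::move(ranges)), angle_min_(angle_min),
          angle_increment_(angle_increment), range_min_(range_min), range_max_(range_max) {}

    // True if laser data has been received.
    bool available() const { return !ranges_.empty(); }

    // True if any valid measurement is closer than the stop radius of 0.35 m.
    bool Wall_in_stop_radius() const {
        const double stop_radius = 0.35;
        for (float r : ranges_)
            if (r > range_min_ && r < range_max_ && r < stop_radius) return true;
        return false;
    }

    // Closest valid point in the scan: its distance, its angle (computed from
    // the minimum angle, the increment and the ray index) and whether one was found.
    ClosestPoint Closest_to_wall() const {
        ClosestPoint cp;
        cp.distance = range_max_;
        for (std::size_t i = 0; i < ranges_.size(); ++i) {
            const double r = ranges_[i];
            if (r <= range_min_ || r >= range_max_) continue;  // skip out-of-range data
            if (r < cp.distance) {
                cp.distance = r;
                cp.angle    = angle_min_ + static_cast<double>(i) * angle_increment_;
                cp.found    = true;
            }
        }
        return cp;
    }

private:
    std::vector<float> ranges_;
    double angle_min_, angle_increment_, range_min_, range_max_;
};

int main() {
    LRFT lrft({1.2f, 0.8f, 2.5f}, -2.0, 0.004, 0.1, 10.0);  // tiny artificial scan
    const ClosestPoint cp = lrft.Closest_to_wall();
    if (cp.found)
        std::cout << "Closest wall at " << cp.distance << " m, angle "
                  << cp.angle << " rad" << std::endl;
    return 0;
}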

Hospital Challenge