Mobile Robot Control 2021 Group 3

Group Members

Students

  • Cecile Conrad, 1016008, c.conrad@student.tue.nl
  • Lars Hagendoorn, 1013526, l.s.hagendoorn@student.tue.nl
  • Ward Klip, 1015501, w.m.klip@student.tue.nl
  • Sjoerd Leemrijse, 1009082, s.j.leemrijse@student.tue.nl
  • Eduard-Istvan Szucs, 1536419, e.szucs@student.tue.nl
  • Matthijs Teurlings, 1504274, m.h.b.teurlings@student.tue.nl

Tutor

  • Jordy Senden, j.p.f.senden@tue.nl


Introduction

Due to COVID-19 the pressure on hospitals has increased enormously, exposing the ongoing shortage of medical personnel. This reduces the care that can be provided to those in need of medical attention. To reduce the workload of the nurses, robotic devices could be deployed to assist with, for example, the retrieval of patients' medicines. During this course, PICO's software is developed with this purpose in mind. In the first part, the basics of the software are presented and tested in the Escape Room Challenge; in the second part, a more detailed software design is developed for the Hospital Challenge.


Design Document

To get an overview of the project, a design document was created, which can be found here. In this document, the requirements, the components and specifications, and the functions and interfaces are described. This content is then used to better understand what will be done in the Escape Room Challenge and the Hospital Challenge.


Escape Room Challenge

Introduction

In this year's version of the course, we use a simulation that reproduces the exact behavior of the real PICO robot.

The first major milestone of this course is the Escape Room Challenge, in which we are faced with the task of driving our robot out of a square room, through a corridor. The environment is said to have walls that are not perfectly straight and corners that are not perfectly perpendicular. This requires a more robust design of our code, so that if, for example, the robot encounters a slightly different corner angle, it will still operate successfully. To achieve this goal, we can use the data coming from the laser scanner, as well as the data coming from the encoders attached to the wheels of the robot.

We are given two trials to complete the challenge. A trial ends if our robot does at least one of the following:

  • Bumps into a wall (a slight touch is allowed if the judges consider it acceptable);
  • Has not moved nor performed any action for 30 seconds;
  • Spends more than 5 minutes in total in the room.

The challenge is completed if the robot does not bump into any wall, respects the time limitations, and its entire rear wheel has passed the finish line, which is placed at least 3 m into the corridor.

The robot's behavior

Figure: The state machine of the robot in the Escape Room Challenge.

The figure above shows the state machine from which the robot's behavior for the Escape Room Challenge is created. At the beginning of the challenge PICO rotates at most 130 degrees; since PICO has a viewing range of 4 rad, which corresponds to approximately 230 degrees, turning a maximum of 130 degrees scans the whole 360-degree range. If the corridor is found during this scan, the middle of the corridor opening is set as a target. When the target is reached, the robot turns to directly face the corridor and moves into it. There it follows the wall on its right side and turns only if a wall or another corridor is detected. If PICO does not find a corridor during the initial scan, it moves forward until it finds a wall. Once a wall is found, the robot turns to the left while also correcting its orientation relative to the wall: if the detected wall is not exactly perpendicular to the robot's front direction, the turn is adjusted accordingly. After turning left, the robot keeps a constant distance to the wall on its right side and moves forward until it encounters another wall. When faced with an inner corner, it again turns left, and the same procedure repeats. If the distance to the wall on the right suddenly jumps to a much larger value, the algorithm detects this and turns the robot to the right, on the assumption that there is a corridor there. Once it has entered, it again keeps a constant distance to the wall on its right side while driving forward, and in this way it crosses the finish line further down the corridor.
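To make this behavior more concrete, a minimal sketch of such a state machine is given below. The state names and the boolean observations are hypothetical and only illustrate the transitions described above; they are not the group's actual code.

    // Hypothetical states mirroring the behavior described above.
    enum class State { InitialScan, DriveToCorridor, FindWall, FollowRightWall, InCorridor, Finished };

    // Pure transition function: given the current state and a few boolean observations
    // (names assumed for this sketch), return the next state.
    State nextState(State s, bool corridorSeen, bool targetReached, bool wallAhead,
                    bool rightGapDetected, bool finishCrossed)
    {
        switch (s)
        {
        case State::InitialScan:     // rotate up to 130 degrees and look for the corridor
            return corridorSeen ? State::DriveToCorridor : State::FindWall;
        case State::DriveToCorridor: // drive to the middle of the opening, face it and enter
            return targetReached ? State::InCorridor : State::DriveToCorridor;
        case State::FindWall:        // drive forward until a wall is close, then turn left
            return wallAhead ? State::FollowRightWall : State::FindWall;
        case State::FollowRightWall: // keep the right wall at a constant distance, turn left at inner corners
            return rightGapDetected ? State::InCorridor : State::FollowRightWall;
        case State::InCorridor:      // follow the right wall until the finish line is crossed
            return finishCrossed ? State::Finished : State::InCorridor;
        case State::Finished:
            return State::Finished;
        }
        return s;
    }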

Corridor detection

Figure: The laser scanner range (not to true scale).

In the Escape Room Challenge the only goal is to detect a corridor and drive through it; for now, obstacle detection and orientation are not considered relevant. The software should be able to do the following (a sketch of possible data structures for these steps is given below the list):

  • Divide the LRF data into data segments.
  • Fit lines through the segments.
  • Use these lines to compute edges and corners.
  • Determine target positions.
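As an illustration only, these steps could operate on data structures such as the ones sketched below; the type names are assumptions made for this explanation and not necessarily those used in the group's code.

    #include <vector>

    // A laser measurement after conversion to Cartesian coordinates (see the next subsection).
    struct Point { double x; double y; };

    // A segment: a run of consecutive scan points assumed to belong to the same wall.
    using Segment = std::vector<Point>;

    // A line fitted through a segment, stored by its two end points.
    struct Line { Point start; Point end; };

    // Edges and corners follow from intersecting or terminating the fitted lines;
    // a corridor target can then be chosen from two such corner points.
    struct Corner { Point position; };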

Data segmentation

First the data from the laser range finder (LRF) is transformed from polar to Cartesian coordinates:
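    x_i = r_i · cos(θ_i),    y_i = r_i · sin(θ_i)

where r_i denotes the measured range of laser ray i and θ_i its angle relative to the robot's forward (x) direction (notation introduced here for illustration).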


//under construction :)

As shown in the laser scanner range figure above, the minimum angle at which the robot can sense is -2 radians, which is approximately -114.6 degrees, and the maximum angle is 2 radians, or 114.6 degrees, both measured relative to the x-direction, i.e. the front direction of the robot. This gives a total range of 4 radians, or 229.2 degrees, in which the sensor is capable of collecting data from the environment.

Another key aspect is the radial scanning range: measurements are bounded between a minimum of 0.1 m and a maximum of 10 m.

The angle between two consecutive laser rays is given by the angle_increment variable, which is approximately 0.004 radians. This value is used later to calculate the angle at which the robot has to rotate, based on the index of the laser ray from which a certain measurement originates.
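As a small illustration (function and variable names assumed for this sketch), the angle of laser ray i follows directly from the minimum angle and the increment:

    // Angle of laser ray i, measured from the robot's forward (x) direction.
    // With angle_min ≈ -2.0 rad and angle_increment ≈ 0.004 rad, ray 500 lies at
    // roughly -2.0 + 500 * 0.004 = 0.0 rad, i.e. straight ahead.
    double rayAngle(double angle_min, double angle_increment, int i)
    {
        return angle_min + i * angle_increment;
    }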






We now describe the software choices and the steps taken to accomplish the goal of the challenge (a sketch combining the elements below is given after the list):

  • We start by creating a struct whose members are two double variables, called distance and angle, and a boolean variable, called found. The use of each member is explained further below.
  • We then create a class called LRFT that initializes the data collected by the laser scanner. Within this class two boolean functions are defined: available(), which checks whether there is data coming from the laser, and Wall_in_stop_radius, which checks whether the robot is positioned too close to a wall, meaning closer than a defined value of 0.35 m.
  • Another function is called Closest_to_wall. Here, the members of the struct mentioned above are filled in as follows. The variable distance returns the distance sensed by the laser scanner, restricted to values between the minimum and maximum range; this prevents the use of irrelevant data that appears when objects are out of range and the values carry no useful information. The variable angle returns the angle of the direction towards the sensed point, computed from the minimum angle, the increment, and the index at which the measurement was collected. The variable found returns whether or not a useful value has been found in the range covered by the scanner.
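A minimal sketch of how these pieces could fit together is shown below. The struct name, the member layout of the class, and the exact filtering are assumptions made for this illustration; the real class is initialized with the data received from the laser scanner.

    #include <cstddef>
    #include <vector>

    // Struct holding the result of a search through the scan (name assumed for this sketch).
    struct ScanResult
    {
        double distance;   // distance sensed by the laser scanner, within the valid range
        double angle;      // angle of the corresponding ray, relative to the robot's front
        bool   found;      // whether a useful value was found at all
    };

    // Sketch of the LRFT class described above.
    class LRFT
    {
    public:
        bool available() const { return !ranges.empty(); }   // is laser data coming in?

        bool Wall_in_stop_radius() const                      // any wall closer than 0.35 m?
        {
            for (double r : ranges)
                if (r > range_min && r < 0.35) return true;
            return false;
        }

        ScanResult Closest_to_wall() const
        {
            ScanResult res{0.0, 0.0, false};
            double best = range_max;
            for (std::size_t i = 0; i < ranges.size(); ++i)
            {
                // Skip out-of-range measurements, which carry no useful information.
                if (ranges[i] < range_min || ranges[i] > range_max) continue;
                if (ranges[i] < best)
                {
                    best = ranges[i];
                    res.distance = ranges[i];
                    res.angle = angle_min + i * angle_increment;   // angle from ray index
                    res.found = true;
                }
            }
            return res;
        }

    private:
        std::vector<double> ranges;       // filled from the laser scanner data
        double angle_min = -2.0;          // rad
        double angle_increment = 0.004;   // rad
        double range_min = 0.1;           // m
        double range_max = 10.0;          // m
    };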





Hospital Challenge

Task Division

To show which task each student in the group is, or was, working on, we made an Excel table in which every student entered their participation in a certain task. Please note that this table is updated often.

If there is a need to update the table or just to view its current state, please use the following link: Task division (https://tuenl-my.sharepoint.com/:x:/r/personal/e_szucs_student_tue_nl/Documents/Task_division_group3.xlsx?d=w7ac7c99aa4f348e989fda57515ec0225&csf=1&web=1&e=ecSmeb).

Everyone in this group has editing rights to the Excel file under the link above; everyone else has viewing rights only.