Mobile Robot Control 2023 Group 15

From Control Systems Technology Group

Group members:

Name            Student ID
Tobias Berg     1607359
Guido Wolfs     1439537
Tim de Keijzer  1422987


Practical exercises week 1:

1. On the robot-laptop open rviz. Observe the laser data. How much noise is there on the laser? What objects can be seen by the laser? Which objects cannot? What do your own legs look like to the robot?

The laser data shows, with point clouds highlighted in red, where objects are. Even in a static environment the laser data fluctuates noticeably, which indicates that there is significant noise on the laser measurements. As for our own legs, they appear to the robot as two small rectangles.

2. Go to your folder on the robot and pull your software.

This has been done.

3. Take your example of don't crash and test it on the robot. Does it work like in simulation?

After significant tweaking and updates, yes, the don't-crash code works on the robot like it does in simulation.

4. Take a video of the working robot and post it on your wiki.

Video: https://youtu.be/FXOC3232ob8

Navigation Assignment 1:

Figure 1. Updated maze

To improve the efficiency of finding the shortest path through the maze, it is suggested to decrease the number of nodes. By reducing the number of nodes, the algorithm will have to evaluate fewer nodes, which will make it more efficient.

The number of nodes can be reduced by removing the intermediate nodes that lie along a straight section of the maze, as illustrated in Figure 1. This method results in a maze with 19 nodes, and the connections between them are indicated by red dotted lines. It is important to note that the cost of each merged straight path should be updated to match the original cost of the same path. For example, the path between nodes 1 and 2 in the maze of Figure 1 would have a cost of 6.
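The effect of merging the intermediate nodes can be illustrated with a short sketch. The graph below is a hypothetical toy version of a straight corridor between nodes 1 and 2 (the real node numbering comes from Figure 1): the full graph keeps every intermediate cell at cost 1, while the reduced graph replaces the whole straight segment with a single edge whose cost equals the sum of the removed steps.

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path cost from start to every node; graph maps node -> {neighbour: cost}."""
    dist = {start: 0}
    queue = [(0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u].items():
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(queue, (d + w, v))
    return dist

# Full corridor: node 1 -> a -> b -> c -> d -> e -> node 2, one cell per step.
full = {
    1: {"a": 1}, "a": {1: 1, "b": 1}, "b": {"a": 1, "c": 1},
    "c": {"b": 1, "d": 1}, "d": {"c": 1, "e": 1},
    "e": {"d": 1, 2: 1}, 2: {"e": 1},
}
# Reduced graph: the five intermediate nodes are removed and the edge cost
# is updated to the original total, as described above.
reduced = {1: {2: 6}, 2: {1: 6}}

print(dijkstra(full, 1)[2], dijkstra(reduced, 1)[2])  # both give cost 6
```

Both graphs yield the same shortest-path cost, but the reduced one makes the search evaluate 2 nodes instead of 7, which is exactly the efficiency gain described above.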






Navigation Assignment 2:

The code for the Navigation-2 assignment can be found in the group repository on GitLab, within the folder "Navigation-2".

The idea behind the used approach is based on the Artificial Potential Field Algorithm. The laser-data of the robot is used to detect objects. When these objects are within a specified range of the robot, a vector from the object to the robot is created. This vector is inversely scaled based on the distance of the object to the robot. In this approach local coordinates are used with the robot at the center of the coordinate frame. All the calculated vectors are stored in a list.

Coordinates for a goal are also defined. These, however, are defined globally and are converted to local coordinates using the odometry data of the robot. A vector from the robot to the goal is also drawn; this vector is scaled to a fixed length rather than depending on the distance of the robot to the goal.
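The global-to-local conversion mentioned above can be sketched as follows. The pose layout (x, y, theta from odometry) and the fixed attraction length are assumptions for illustration, not the exact code from the repository.

```python
import math

def goal_to_local(goal, pose):
    """Express a global goal (x, y) in the robot's local frame,
    given the robot pose (x, y, theta) from odometry."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    th = pose[2]
    # Rotate the world-frame offset by -theta into the robot frame.
    return (math.cos(th) * dx + math.sin(th) * dy,
            -math.sin(th) * dx + math.cos(th) * dy)

def attractive_vector(local_goal, length=0.5):
    """Scale the goal vector to a fixed length, independent of the distance."""
    d = math.hypot(*local_goal)
    return (local_goal[0] / d * length, local_goal[1] / d * length)

# Robot at (1, 1) facing +y; a goal at (1, 3) then lies straight ahead.
print(goal_to_local((1, 3), (1, 1, math.pi / 2)))  # ~ (2.0, 0.0)
```

A goal straight ahead maps to a positive local x with zero local y, so the attractive vector always has the same magnitude regardless of how far away the goal is.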

The list of vectors is summed to create a resulting force vector. Based on the orientation of this vector, the angular velocity of the robot is adjusted. The forward velocity is normally kept constant; however, when the robot needs to make a very sharp turn, it decreases the forward velocity, thereby also decreasing its turn radius.
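Putting the pieces together, a minimal sketch of the resulting-force step might look like this. The influence range, gains, and sharp-turn threshold are made-up values for illustration; the repository code will differ in detail.

```python
import math

def repulsive_vector(obstacle, influence=1.0):
    """Vector from an obstacle (robot-local x, y) toward the robot at the origin,
    inversely scaled with distance; None when outside the influence range."""
    d = math.hypot(*obstacle)
    if d == 0.0 or d > influence:
        return None
    scale = (1.0 / d - 1.0 / influence) / d  # grows as the obstacle gets closer
    return (-obstacle[0] * scale, -obstacle[1] * scale)

def command_velocity(vectors, v_max=0.5, w_gain=1.5, sharp_turn=1.0):
    """Sum all vectors into one resulting force and derive (v, w) from it."""
    fx = sum(v[0] for v in vectors)
    fy = sum(v[1] for v in vectors)
    heading = math.atan2(fy, fx)  # angle of the resulting force, robot frame
    w = w_gain * heading
    # Keep v constant for gentle turns; slow down for sharp ones,
    # which also shrinks the turn radius.
    v = v_max if abs(heading) < sharp_turn else v_max * sharp_turn / abs(heading)
    return v, w

# A resulting force pointing straight ahead: full speed, no rotation.
print(command_velocity([(1.0, 0.0)]))  # (0.5, 0.0)
```

A force pointing sideways instead produces a large heading angle, so the sketch reduces the forward velocity exactly as described above.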


With this approach the robot is able to solve the corridor maze in the simulator:

Video: https://youtu.be/9UnAluLiUOM

The robot is also able to solve the corridor maze in real life during experimentation:

Video: https://youtu.be/Q__zXivK0Xg


Localisation Assignment 1:

Assignment 2:

Figure 2. Simulation experiment map

In order to determine the reliability of the odometry data in simulation, a map with distance markers was created as depicted in Figure 2. The map was used for a simulation in which the robot moved to four consecutive goals positioned on the map, forming a 5x5 meter box, and then returned to the starting point. The robot relied solely on the odometry data, employing an attractive force method derived from the artificial repulsive force code used in the navigation assignments. The code for the Localisation-2 assignment can be found in the group repository on GitLab, within the folder "Localisation-2".

In the first simulation experiment, where the uncertain_odom option was not enabled, the robot successfully reached the goals and returned to the starting point, indicating that the odometry data was dependable for navigation.

Link to the video of simulation: https://youtu.be/e5O7nT2eaoo

However, in the second simulation experiment with the uncertain_odom option enabled, the odometry data proved to be highly unreliable. The robot failed to reach the designated goals on the map and deviated from the starting point by more than two meters.

Link to the video of the simulation with the uncertain_odom option: https://youtu.be/j7_fsJzvNnI


If the odometry data in real life were as reliable as it was in the first simulated experiment, navigation could rely solely on this data. However, we are aware that this is not the case, and it would likely resemble the odometry data observed with the uncertain_odom option turned on, due to sensor drift, wheel slip, and other sources of error. Therefore, relying exclusively on odometry data for navigation would not be a suitable approach for the final challenge.


Assignment 3:

Figure 3. Experiment result

The experiment was replicated in real life using the COCO robot. Although the disparity between the goals and the position obtained from the odometry data was not as significant as in the simulation with the uncertain_odom option, it still demonstrated a lack of reliability. The robot exhibited a deviation of approximately 40 cm, whereas based on the odometry data it should have stopped within 15 cm of the origin, as illustrated in Figure 3.

Link to the video of the experiment: https://youtu.be/PfWP8JML4XE

Additionally, during the experiments, we encountered a case of robot kidnapping. When manually repositioning the robot to the origin, we noticed that the odometry data was not reset, resulting in further deviation from the expected path and goals.

Link to the video of the experiment started after manual repositioning: https://youtu.be/pCS6CVaZp_k

Despite the smaller deviation observed, this experiment reinforces the conclusion that the odometry data cannot be considered sufficiently reliable for exclusive reliance in navigation.


Localisation Assignment 2:

Assignment 0:

  • What is the difference between the ParticleFilter and ParticleFilterBase classes, and how are they related to each other?

There are three classes present in the ParticleFilter and ParticleFilterBase files. The first two match the file names; the third is the Resampler class. The ParticleFilter class is a specialised form of the ParticleFilterBase class focused on localisation. The ParticleFilterBase class thus provides the standard functionality of particle filters; for example, it includes common functionalities like setting weights and propagating particles. The Resampler class handles, as the name suggests, the resampling. As shown in the lecture, this resampling ensures that the particles stay concentrated in high-likelihood areas. The ParticleFilter class can be initialised either with uniformly distributed particles or with a Gaussian distribution.
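The described class layout can be summarised in a compact sketch. This is a Python mock-up of the C++ classes; the method names and fields are illustrative, not the exact framework API.

```python
import random

class Particle:
    """One hypothesis of the robot pose, with an importance weight."""
    def __init__(self, x, y, theta, weight):
        self.x, self.y, self.theta, self.weight = x, y, theta, weight

class ParticleFilterBase:
    """Generic particle-filter functionality: holds the particles and
    provides common operations such as setting weights."""
    def __init__(self):
        self.particles = []
    def set_uniform_weights(self):
        w = 1.0 / len(self.particles)
        for p in self.particles:
            p.weight = w

class Resampler:
    """Draws a new particle set, keeping particles concentrated
    in high-likelihood areas."""
    def resample(self, particles):
        weights = [p.weight for p in particles]
        return random.choices(particles, weights=weights, k=len(particles))

class ParticleFilter(ParticleFilterBase):
    """Specialisation of the base class focused on localisation;
    shown here with a uniform initialisation over the world."""
    def __init__(self, world, n):
        super().__init__()
        xmax, ymax = world
        self.particles = [
            Particle(random.uniform(0, xmax), random.uniform(0, ymax),
                     random.uniform(-3.14, 3.14), 1.0 / n)
            for _ in range(n)
        ]
```

The inheritance mirrors the relation in the answer above: ParticleFilter reuses the particle storage and weight handling of ParticleFilterBase, while Resampler is a separate helper.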

  • How are the ParticleFilter and Particle class related to each other?

A Particle represents a single hypothesis of the robot's pose. The ParticleFilter class, through the functionality provided by ParticleFilterBase, maintains the set of these Particle objects and implements the particle filtering algorithm on them: it sets their weights, propagates them, and resamples them.

  • Both the ParticleFilter and Particle classes implement a propagation method. What is the difference between the methods?

The propagation method in the ParticleFilter class is responsible for propagating all the particles in the filter based on the odometry data. The propagation method in the Particle class is used to update the state of a single particle based on a given motion and an offset angle, taking the process noise into account.
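The two propagation levels can be contrasted in a short sketch. The function names and noise parameters are illustrative, not the framework's actual signatures.

```python
import math
import random

def propagate_particle(pose, distance, angle, noise=(0.02, 0.01)):
    """Particle-level propagation: update one pose hypothesis by a forward
    distance and a rotation, adding Gaussian process noise."""
    x, y, theta = pose
    d = random.gauss(distance, noise[0])  # noisy translation
    a = random.gauss(angle, noise[1])     # noisy rotation
    theta += a
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

def propagate_all(poses, distance, angle, noise=(0.02, 0.01)):
    """Filter-level propagation: apply the single-particle update, driven by
    the odometry data, to every particle in the filter."""
    return [propagate_particle(p, distance, angle, noise) for p in poses]
```

With the noise set to zero the update is deterministic, which makes the geometry easy to check: driving 1 m straight ahead from the origin lands each particle at x = 1.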


Assignment 1:

  • What is the difference between the two constructors?

The difference between the two constructors is the distribution of the particles. In the first constructor, the 'N' particles are distributed uniformly over the 'World' environment. In the second constructor, the 'N' particles are distributed over the 'World' environment according to a Gaussian distribution with the specified 'mean' and 'sigma' as the mean and standard deviation of the distribution.
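The two initialisation strategies can be sketched as follows. This is a Python illustration; the parameter names follow the description above, not the exact C++ signatures.

```python
import math
import random

def init_uniform(world, n):
    """First constructor: N particles spread uniformly over the World."""
    xmax, ymax = world
    return [(random.uniform(0, xmax), random.uniform(0, ymax),
             random.uniform(-math.pi, math.pi)) for _ in range(n)]

def init_gaussian(n, mean, sigma):
    """Second constructor: N particles drawn from a Gaussian around a known
    pose; 'mean' and 'sigma' are per-component (x, y, theta)."""
    return [tuple(random.gauss(m, s) for m, s in zip(mean, sigma))
            for _ in range(n)]
```

The uniform version suits a fully unknown initial pose (global localisation), while the Gaussian version suits a roughly known starting pose, with sigma expressing how certain that initial guess is.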