Embedded Motion Control 2013 Group 2


Group Members

Name: Student id: Email:
Joep Alleleijn 0760626 j.h.h.m.alleleijn@student.tue.nl
E. Romero Sahagun 0827538 e.romero.sahagun@student.tue.nl
L. Galindez Olascoaga 0867854 l.i.galindez.olascoaga@student.tue.nl
Koen Meeusen 0657087 k.a.meeusen@student.tue.nl

Planning

Week: Activities:
Week 1: Sep 2 - Sep 8
Start ROS & C++ tutorials
Prepare software (Ubuntu, ROS, Gazebo, etc..)
Project planning & brainstorming
Week 2: Sep 9 - Sep 15
Conclude software troubleshooting.
Start simulations in Gazebo for sensor and actuator identification.
Week 3: Sep 16 - Sep 22
Code development, module based.
Code implementation for Corridor Competition, tests on simulation.
First real robot trial (Sep 20, 13.00 - 14:00 hrs)
Week 4: Sep 23 - Sep 29
Last-minute preparations for Corridor Competition
Corridor Competition (Sep 25)
Second real robot test (Sep 26, 11.00 - 12:00 hrs)
Week 5: Sep 30 - Oct 6
3rd Real robot test (Oct 3, 11.00 - 12:00 hrs)
Week 6: Oct 7 - Oct 13
4th Real robot test (Oct 10, 11.00 - 12:00 hrs)
Week 7: Oct 14 - Oct 20
5th Real robot test (Oct 17, 11.00 - 12:00 hrs)
Week 8: Oct 21 - Oct 27
Maze Competition (Oct 23)

Current Work

Team Member: Working on:
Joep Alleleijn System architecture and message structure between nodes: which information is communicated and how. Determining the location within the environment based on laser data, with built-in functionality so it still works when there is an opening in the maze.
E. Romero Sahagun Movement module/functions (move forward, backward, turn left/right); see the command sketch after this list.
L. Galindez Olascoaga System architecture, interfaces and integration/path planning algorithm
Koen Meeusen Wall and corner detection, area mapping.
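
As a starting point for the movement functions, the sketch below shows how velocity commands could be sent using a geometry_msgs/Twist message. The topic name /pico/cmd_vel and the velocity values are assumptions on our side and still have to be checked against the simulator; this is an illustration, not our final movement module.

// Minimal sketch of a "move forward" command via geometry_msgs/Twist.
// The topic name "/pico/cmd_vel" is an assumption; verify with "rostopic list".
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "move_example");
    ros::NodeHandle n;
    ros::Publisher pub = n.advertise<geometry_msgs::Twist>("/pico/cmd_vel", 1);

    ros::Rate rate(10);                 // publish commands at 10 Hz
    while (ros::ok())
    {
        geometry_msgs::Twist cmd;
        cmd.linear.x  = 0.2;            // forward speed [m/s]
        cmd.angular.z = 0.0;            // set nonzero to turn left/right [rad/s]
        pub.publish(cmd);
        rate.sleep();
    }
    return 0;
}

Moving backward or turning only changes the sign of linear.x or the value of angular.z, so the same structure can back the move forward/backward and turn left/right functions.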

Progress

Week 1: September 2 - September 8

Software installation
The goal of the first week was to install all the necessary software. The installation of Ubuntu went well in general. In one case (on the 2013 TU/e laptop) Ubuntu reported that it had installed correctly, but when it was started the Ubuntu desktop was not loaded. Instead, a command-prompt-like screen was displayed with messages about missing files. The problem was eventually solved as follows:

If the laptop is fitted with a small SSD parallel to the main hard disk (like the 2013 TU/e laptop), Ubuntu will not install properly. Because the SSD and the hard disk are placed in parallel, the laptop starts faster: the SSD provides a fast start-up. When Ubuntu starts it requires files that are not present on the SSD, which causes Ubuntu to fail. The solution is to disable the RAID configuration of the laptop. This disables the SSD and its advantages, but Ubuntu will now start since all the required files are read from the hard disk. In some cases the RAID is called Intel RST (Rapid Storage Technology). Switching off the RAID system in the BIOS might result in losing Windows and all the data on the disk, so it is not recommended (we have never tried it). To avoid this risk, log in to Windows, open the Intel Rapid Storage Technology program and disable RAID support there in a less drastic way.

The other required software installed well, except Qt. For a few group members Qt did not install, so the choice was made to use Eclipse to write the C++ code. The disadvantage is that in Eclipse the "cmake" and project files have to be rebuilt every time something in the code changes, which requires a restart of Eclipse; Qt does not have this problem. An advantage of Eclipse over Qt is that Eclipse handles vector programming more easily than Qt.

Problem investigation
In order to solve the maze problem some important questions had to be answered, namely:

- Is the maze unique? (In other words, is there only one solution?)

- Are there islands in the maze? (walls that are not connected to the outside of the maze)

The answers to these questions are yes, the maze is unique, and no, there are no islands. With these questions answered, a simple strategy was devised to solve the maze:

If the maze contained islands the solution would not be unique, because there would be multiple ways to solve the maze. With islands it is even possible to get stuck in a loop around an island. With only one correct path (a unique solution) and no islands, the maze can be solved by following the wall on the robot's right-hand side. In the case of the corridor challenge the solution is not unique, since there are two exits (a correct one and a false one); nevertheless, the right-hand wall-following strategy will still give the correct solution there.

Compared with more advanced maze-solving techniques, this solution can easily be programmed and can be used for testing the simulator. The goal is to have a more advanced maze-solving algorithm for the corridor test; however, this still has to be developed. A minimal sketch of the right-hand rule is given below.
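
The sketch below only illustrates the decision rule; the three distance inputs and the clearance threshold are placeholders for whatever the laser processing will eventually provide, not our final implementation.

// Sketch of the right-hand wall-following rule. The distances are assumed to
// be extracted from the laser data (right, front and left of the robot); the
// clearance threshold is an arbitrary placeholder value.
#include <iostream>

enum Action { TURN_RIGHT, GO_STRAIGHT, TURN_LEFT };

Action rightHandRule(double distRight, double distFront, double distLeft)
{
    (void)distLeft;                 // not needed for the decision, kept for symmetry
    const double clearance = 0.5;   // assumed minimum free distance [m]

    if (distRight > clearance)      // opening on the right: always take it
        return TURN_RIGHT;
    if (distFront > clearance)      // otherwise keep following the wall
        return GO_STRAIGHT;
    return TURN_LEFT;               // wall ahead and to the right: turn left
}

int main()
{
    // Example: wall on the right (0.3 m), free space ahead (2.0 m).
    std::cout << "action = " << rightHandRule(0.3, 2.0, 1.5) << std::endl;
    return 0;
}
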

Week 2: September 9 - September 15

Testing laser data acquisition
Simulated laser data (Laura and Joep)
Structure of the message from the laser (sensor_msgs/LaserScan):
Header header            # timestamp in the header is the acquisition time of
                         # the first ray in the scan.
                         #
                         # in frame frame_id, angles are measured around
                         # the positive Z axis (counterclockwise, if Z is up)
                         # with zero angle being forward along the x axis
float32 angle_min        # start angle of the scan [rad]
float32 angle_max        # end angle of the scan [rad]
float32 angle_increment  # angular distance between measurements [rad]
float32 time_increment   # time between measurements [seconds] - if your scanner
                         # is moving, this will be used in interpolating position
                         # of 3d points
float32 scan_time        # time between scans [seconds]
float32 range_min        # minimum range value [m]
float32 range_max        # maximum range value [m]
float32[] ranges         # range data [m] (Note: values < range_min or > range_max should be discarded)
float32[] intensities    # intensity data [device-specific units]. If your
                         # device does not provide intensities, please leave
                         # the array empty.
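
To make these fields concrete, below is a minimal sketch (not our final implementation) of a node that subscribes to the laser topic and scans the ranges array for the closest valid reading. The node and callback names are placeholders; /pico/laser is the topic provided by the simulator.

// Minimal sketch of reading the LaserScan fields listed above.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>

void laserCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
    // Find the closest valid reading; the angle of beam i is
    // angle_min + i * angle_increment [rad].
    float best_range = scan->range_max;
    float best_angle = 0.0f;
    for (size_t i = 0; i < scan->ranges.size(); ++i)
    {
        float r = scan->ranges[i];
        if (r < scan->range_min || r > scan->range_max)
            continue;   // discard invalid readings, as noted above
        if (r < best_range)
        {
            best_range = r;
            best_angle = scan->angle_min + i * scan->angle_increment;
        }
    }
    ROS_INFO("closest obstacle: %.2f m at %.2f rad", best_range, best_angle);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "laser_listener");
    ros::NodeHandle n;
    ros::Subscriber sub = n.subscribe("/pico/laser", 1, laserCallback);
    ros::spin();
    return 0;
}
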

Week 3: September 16 - September 22

Testing 3D simulation and visualization
The goal for the third week was to ensure the complete functionality of the working environment. After applying the changes included in the updated wiki, the complete functionality of Gazebo was achieved. The maze was successfully spawned and the urdf file of the robot was displayed correctly.
[Image: Group2 Gazebo jazz.png]
The visualization of the robot in rviz was also achieved on all computers, and the laser data was displayed as well. The topic to which we have to subscribe in order to visualize this data is /pico/laser, which is of type [sensor_msgs/LaserScan]. The contents of this message are shown in last week's post.
[Image: Group2 Rviz jazz.png]
We could also identify that reference frame transformation data as well as odometry data are already provided in the general repository. The following screenshot shows the reference frame transformation tree of our system. If we wanted to have a fixed reference frame we would have to add it before the odometry one.
[Image: Group2 Tf view frames.png]
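
Since the odometry is already provided, a minimal sketch of reading it is shown below. The topic name /pico/odom is an assumption on our side and should be verified with rostopic list; only the message type nav_msgs/Odometry and the field access are standard.

// Minimal sketch of reading the provided odometry.
// The topic name "/pico/odom" is an assumption; verify with "rostopic list".
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <tf/tf.h>

void odomCallback(const nav_msgs::Odometry::ConstPtr& odom)
{
    double x = odom->pose.pose.position.x;                    // [m]
    double y = odom->pose.pose.position.y;                    // [m]
    double theta = tf::getYaw(odom->pose.pose.orientation);   // [rad]
    ROS_INFO("odom: x=%.2f y=%.2f theta=%.2f", x, y, theta);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "odom_listener");
    ros::NodeHandle n;
    ros::Subscriber sub = n.subscribe("/pico/odom", 1, odomCallback);
    ros::spin();
    return 0;
}
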
Defining our system's architecture
During this week we also decided on our system's architecture, which we defined as a modular one. This enables us to work simultaneously on several algorithms, delegate work, and use ROS's functionality and communication infrastructure. We decided to create our own .msg files and headers in order to structure the data comprehensibly. The following image shows a screenshot of our architecture as displayed by rxgraph. We programmed nodes for each package that subscribe to and publish the topics we will later use; in this way, we only have to edit the nodes' source code and add the algorithms for each module. We have already uploaded this set of packages to our svn repository.

We chose to divide the main functionality of the robot within the ROS environment into different modules that are coupled sequentially. The system is divided into 5 main nodes: sensors, location, map, trajectory planner and robot movement. The nodes communicate through a structure whose main name is "Data"; each node adds a sub-cell to the structure beginning with the name of the node, followed by the variable that is added, for example "Data.location.theta_wall". Furthermore, we use SI units.
[Image: Group2 Rxgraph architecture.png]
We can see in this graph that we have, up to now, a communication structure with 8 nodes:
/rosout: ROS's standard logging node, started together with the ROS master
/rviz, /pico_state_publisher and /GazeboRosLaser_node: were provided in the general repository and contain the simulation and visualization functionality as well as the robot's sensor readings and the tf broadcasting
/path_prediction, /map and /localization: nodes that will be written by us and will contain the algorithms needed for the challenges
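
As an illustration of the naming convention described above, the .msg file for the location part of the "Data" structure could look roughly as follows; every field name except theta_wall is a placeholder for this sketch, not our final message definition.

# Location.msg (sketch; only theta_wall is taken from the convention above)
# Published by the /localization node; appears in the structure as Data.location
float64 x           # estimated position in the maze [m]
float64 y           # estimated position in the maze [m]
float64 theta_wall  # orientation relative to the wall being followed [rad]
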
Test plan for September 20
1. Follow the wiki's instructions on using Pico to set it up.
2. Be able to read and interpret sensor data (laser and odometry).
3. If any of the algorithms (mapping, wall detection, localization, trajectory planning...) has already been tested in simulation, we could try implementing it on the robot.

Week 4: September 23 - September 29

...

Week 5: September 30 - October 6

...

Week 6: October 7 - October 13

...

Doubts & Questions

Week: Doubts:
Sep 2 - Sep 8
Issues related to software installation
Sep 9 - Sep 15
Issues related to software installation
More details about the project
Sep 16 - Sep 22
What is the sample frequency and scan methodology of the system/laser? - We have to figure that out.
Orientation of the coordinate system? - We can define that ourselves.
What is the operating frequency of the robot? - We have to figure this out.
Are there obstacles in the hall competition/maze? - No, just the walls.
Is it possible to get more days available every week to work on the robot? - No; we should make a test plan, limited time is part of the challenge.
Is it possible to test again on the real robot early next week? - No.
Is there a time limit to complete the maze? - No.
How reliable is the simulation? If we manage to communicate properly with the robot on Friday, and we test our solution later on the simulation, can we trust it? - We have to run tests both in simulation and on the actual robot in order to determine how reliable the simulation is.
Are we allowed to use motion planning algorithms available in ROS repositories? - Yes, but it is not advisable; we should be able to generate our own solutions to the problem.
Sep 23 - Sep 29
...
Sep 30 - Oct 6
...
Oct 7 - Oct 13
...
Oct 14 - Oct 20
...
Oct 21 - Oct 27
...
