[[File:Drone Ref.png|thumb|right|600px|Illustration by Peter van Dooren, BSc student at Mechanical Engineering, TU Eindhoven, November 2016.]]
<div align="left">
<font size="5">Autonomous Referee System</font><br />
<font size="4">'An objective referee for robot soccer'</font>
</div>


=Introduction=
<div STYLE="float: left; width:80%">
</div><div style="width: 35%; float: right;"><center>{{:Content_MSD16_large}}</center></div>
__NOTOC__
 
 
 
A football referee can hardly ever make "the correct decision", at least not in the eyes of the thousands or sometimes millions of fans watching the game. When a decision benefits one team, there will always be complaints from the other side. It is often forgotten that the referee is merely human. To make the game fairer, the use of technology to support the referee is increasing. Nowadays, several stadiums are already equipped with [https://en.wikipedia.org/wiki/Goal-line_technology goal line technology] and referees can be assisted by a [http://quality.fifa.com/en/var/ Video Assistant Referee (VAR)]. If the use of technology keeps increasing, a human referee might one day become entirely obsolete. The proceedings of a match could be measured and evaluated by a system of sensors. With enough (correct) data, such a system would be able to recognize certain events and make decisions based on these events.
 
 
The aim of this project is to do just that: to build a system which can evaluate a soccer match, detect events and make decisions accordingly. Making a functioning system which could actually replace the human referee would probably take a couple of years, which we don't have. This project therefore focuses on creating a high-level system architecture and giving a proof of concept by refereeing a robot-soccer match, where the refereeing is currently also still done by a human. This project builds upon the [[Robotic_Drone_Referee|Robotic Drone Referee]] project executed by the first generation of Mechatronics System Design trainees.
 
 
To navigate through this wiki, the internal navigation box on the right side of the page can be used.
 
 
<center>[[File:tumbnail_test_video.png|center|750px|link=https://www.youtube.com/embed/XyRR3rPQ4R0?autoplay=1]]</center>
 


=Team=
This project was carried out for the second module of the 2016 MSD PDEng program. The team consisted of the following members:
* Akarsh Sinha
* Farzad Mobini
* Joep Wolken
* Jordy Senden
* Sa Wang
* Tim Verdonschot
* Tuncay Uğurlu Ölçer
 
 


=Project Definition=
<center>[[File:Drone Ref.png|thumb|center|1000px|Illustration by Peter van Dooren, BSc student at Mechanical Engineering, TU Eindhoven, November 2016.]]</center>
<p>
As described in <ref>D. Antunes and R. Molengraft, Drone Referee, Control Systems Technology group, Mechanical Engineering Department,
TU Eindhoven, November 2016.</ref>, the goal of the present project is to contribute to this vision and create an autonomous robot referee system using drones. The first generation of MSD PDEng students created a system architecture <ref> [http://cstwiki.wtb.tue.nl/index.php?title=Robotic_Drone_Referee "Robotic Drone Referee"] </ref> to be used with a single drone. This architecture provides the basis for the present project. In particular, some of the modules of this architecture, such as out-of-bounds ball detection and an indoor positioning system using ultra-wideband technology, were implemented and tested. The overall goal of this project is to extend this system architecture and implement more modules.
</p>


=Background=
<p>
[http://www.robocup.org/ RoboCup] is an international initiative to promote and advance research in robotics and artificial intelligence. Founded in 1997, its main goal is to 'develop a team of fully autonomous humanoid robot soccer players which is able to win against the winner of the most recent World Cup, complying with the official rules of FIFA, by the middle of the 21st century'. In the Middle Size League [http://wiki.robocup.org (MSL)], two teams of five autonomous robots play a soccer match on an artificial field. These robots are able to drive around while using several on-board cameras to position themselves on the field. Moreover, they can determine the position of the ball, opponents and teammates. Through radio signals they can communicate with each other and decide upon a strategy. With a ball-handling system the ball can be captured and controlled, and a shooting mechanism is able to shoot the ball over the ground or through the air.


As discussed in [http://wiki.robocup.org/images/7/72/RequirementsforMSL_2016.pdf Tutorial: Requirements for RoboCup MSL], a standard RoboCup field measures 12x18 meters. During a match, there are ten robots on this field, driving around with velocities up to 5 m/s and possibly even higher. These robots are all competing for the same thing: scoring goals. This means that getting possession of the ball is a primary objective. When several robots are competing for the ball, collisions, pushing and scrummages are nearly inevitable. To make sure the match is played in a fair way, a human referee keeps a keen eye on the events on the field from the sideline. This human referee is backed up by an auxiliary referee standing next to the field on the opposite side. Both can decide on stopping the game due to a committed foul, a scored goal, a ball out of bounds or any other event. The rules for MSL are based on the official FIFA rules, but adapted to robot soccer where necessary ([http://wiki.robocup.org/images/0/0f/Robocup-msl-rules-2016.pdf rulebook 2016]). However, the large set of rules and the interpretation thereof can often lead to situations where one referee might decide to continue the game, while another might decide to interrupt. This can and will often lead to frustration in the aggrieved team. Moreover, a decision made by a referee can affect the outcome of a game and even an entire championship.


An example of this is the final match of the [http://www.robocup2016.org/en/ world championship 2016 in Leipzig, Germany] ([https://www.youtube.com/watch?v=f7Y6QLYVhSs&feature=youtu.be&t=6h17m38s full match], [https://www.youtube.com/watch?v=2JxNjgKE8HQ highlights]). The final was played between team [http://www.techunited.nl/ TechUnited] from the Netherlands and team [http://blog.sina.com.cn/s/articlelist_2532664717_0_1.html%E2%80%8D WATER] from China; the winner would become world champion of robot soccer in the MSL. At the end of regular time the scoreboard showed 2-2. As in human soccer, this means extra time to decide on the winner. During the match, team WATER had some trouble with their ball handling, preventing the ball from rotating in a 'natural' way over the field. When the ball does not rotate in the direction it is being moved, this is considered clamping and regarded as a foul in favor of the other team. In the last couple of minutes the score was 3-3 when WATER turned towards the TechUnited goal, shot and scored the winning goal. While the Chinese team was already celebrating their victory, the auxiliary referee decided that the scoring robot had been clamping the ball before scoring the goal. After a discussion with the main referee, it was decided to declare the goal invalid. Since the extra time also ended in a draw, penalties were needed to decide who would become the new world champion. After all penalties of the Chinese team were stopped by the Dutch keeper, the first shot of the TechUnited robot went into the net. The Dutch team won the penalty series 1-0 and thus TechUnited became the world champion of 2016. This example shows how important the decisions of the human referee team can be in shaping the course of a match or even a tournament. Rules are always prone to interpretation, and a team which is disadvantaged by this will always complain. The referee has no means to justify his decision other than his own intuition and interpretation of the rules. This led to the question of whether it would be possible to develop a system which can support the human referee team in making decisions. Such a system might even become fully autonomous and could replace the human factor in refereeing entirely.
</p>


=Acknowledgements=
<p>
A project like this is never done alone. We would like to express our gratitude to the following parties for their support and input to this project.
<center>[[File:logoAcknowledgements.png|center|1000px]]</center>
</p>


=Project Objectives=
<p>
* System architecture of the proposed solution by January 31st, along with a time plan, a risk assessment of the choices made, and a task distribution among the group members.
* Software of the proposed solutions including:
** Out-of-bounds ball detection by the ground robot, including both the motion algorithm and the camera processing. Suggested: end of January.
** Detection of a fault/foul, including movement. Suggested: end of February.


* Software with the interaction between the two robots. Suggested: end of March.
* Demo to be scheduled by the end of March or beginning of April.
* A Wiki-page documenting the project and providing a repository for the software developed, similar to the one obtained from the first generation of MSD students.
* A one-minute video, to be used in presentations, illustrating the work.


</p>


=System=
In this section, the autonomous referee system is explained. First, an overview of the system architecture is given. Next, the sub-parts of this architecture are explained in more detail.


==System Overview==
[[File:System_Overview_Presentation1.png|center|600px|Overview of the system architecture.]]


<!--


==Ground Robot==
** The GR should be able to keep the ball in sight of its Kinect camera. If the ball is lost, the GR should try to find it again with the Kinect.
** Since the ball is best tracked with the Kinect, the omni-vision camera can be used to keep track of the players.
** To accommodate the ball and player tracking, the GR needs to be able to drive next to the field at x = -w/2 + Δw, y = [-l/2, 0], θ = [0, -π] during gameplay (see the sketch below).
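As a minimal sketch of how such a sideline reference pose could be generated, the snippet below follows the ball along the allowed stretch of sideline while facing it. It assumes a field of width w and length l centred at the origin; the function name and the clamping strategy are illustrative and not part of the implemented software.
<pre>
import math

def gr_reference_pose(ball_x, ball_y, w=12.0, l=18.0, dw=0.5):
    """Reference pose for the ground referee (GR) on the sideline.

    Assumptions (not from the implemented software): field of size w x l [m]
    centred at the origin, GR restricted to x = -w/2 + dw, y in [-l/2, 0].
    """
    x_ref = -w / 2.0 + dw
    # Follow the ball along the sideline, but stay in the allowed half.
    y_ref = min(max(ball_y, -l / 2.0), 0.0)
    # Face the ball so that the Kinect keeps it in view.
    theta_ref = math.atan2(ball_y - y_ref, ball_x - x_ref)
    return x_ref, y_ref, theta_ref

# Example: ball at (2.0, -3.5) m
print(gr_reference_pose(2.0, -3.5))
</pre>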


<br>


==Drone==
*Parrot AR.Drone 2.0 Elite Edition
*19 min. flight time (with the extended battery)
*720p camera (used at 360p)
*~70° diagonal FOV (measured)
*Image aspect ratio 16:9
===Drone control===
*Has its own software & controller
*Can be flown from MATLAB using the arrow keys
*Flying via position commands, and the format of the input data, still has to be worked out
*x, y, θ position feedback via the top camera and/or the UWB system
*z position will be kept constant and chosen according to the FOV (see the sketch below)
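A rough sketch of how the constant flying height could be chosen from the measured FOV is given below. It assumes a pinhole camera looking straight down with the measured ~70° diagonal FOV and a 16:9 image; the helper name and the target coverage are illustrative assumptions, not the implemented system.
<pre>
import math

def altitude_for_coverage(target_width, diag_fov_deg=70.0, aspect=(16, 9)):
    """Flying height z [m] at which the horizontal footprint of a
    downward-looking pinhole camera spans `target_width` metres.

    Assumption: ~70 deg diagonal FOV and a 16:9 image, as measured for
    the AR.Drone camera; the camera is assumed to point straight down.
    """
    ax, ay = aspect
    diag = math.hypot(ax, ay)
    # Horizontal FOV derived from the diagonal FOV and the aspect ratio.
    h_fov = 2.0 * math.atan(math.tan(math.radians(diag_fov_deg) / 2.0) * ax / diag)
    return target_width / (2.0 * math.tan(h_fov / 2.0))

# Example: cover a 6 m wide strip of the field.
print(round(altitude_for_coverage(6.0), 2))  # approx. 4.9 m
</pre>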


==Positioning==
The positioning block is responsible for creating the reference positions of the drone and the ground robot referee, based on the information about the players and the ball. The low-level controller of each system uses the reference position as the desired state for tracking purposes.
[[File:Positioning.png|thumb|right|400px|Depiction of the positioning subsystem.]]
Currently:
*Ground referee (Turtle) focuses on the ball
*Drone focuses on collisions/players (see the sketch below)
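A minimal sketch of how the drone reference position could be derived from the tracked players and ball is given below. Hovering above the two players closest to the ball is only one possible heuristic; the names and logic are illustrative, not the implemented positioning module.
<pre>
import math

def drone_reference(ball, players):
    """Reference (x, y) for the drone: hover above the two players that
    are closest to the ball, where a collision is most likely.

    `ball` is an (x, y) tuple, `players` a list of (x, y) tuples.
    Illustrative only; the actual positioning logic may differ.
    """
    dist = lambda p: math.hypot(p[0] - ball[0], p[1] - ball[1])
    p1, p2 = sorted(players, key=dist)[:2]
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)

# Example with a ball and four players
print(drone_reference((0.0, 1.0), [(1.0, 1.0), (-0.5, 2.0), (4.0, -3.0), (-4.0, 5.0)]))
</pre>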


==Detection==


<p>
===Definition of fault/foul===
The definition of a foul/fault or offence is based on the RoboCup MSL Rule Book <ref> [http://wiki.robocup.org/Middle_Size_League#Rules "Middle Size Robot League Rules and Regulations"] </ref>. Simple physical contact does not represent an offence; the speed and impact of the physical contact shall be used to define an offence or a foul. There are two cases in which foul detection should be formulated.
*'''Case 1: One of the robots is in possession of the ball'''
***Continuous application of momentum
***#Detect if the defaulter changes its direction of movement within t seconds (see the sketch below)
</p>
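A hedged sketch of how the "change of direction within t seconds" check could be implemented on tracked robot positions is given below. The sampling interval, the angle threshold and the helper name are illustrative assumptions, not the implemented detection module.
<pre>
import math

def changes_direction(positions, dt, t_window=1.0, angle_thresh_deg=90.0):
    """Return True if the robot's direction of movement changes by more
    than `angle_thresh_deg` within any `t_window` seconds.

    `positions` is a list of (x, y) samples taken every `dt` seconds.
    Thresholds are illustrative assumptions.
    """
    # Heading at each sample from finite differences.
    headings = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        if math.hypot(x1 - x0, y1 - y0) > 1e-3:  # ignore standstill
            headings.append(math.atan2(y1 - y0, x1 - x0))
        else:
            headings.append(None)
    n = max(1, int(round(t_window / dt)))
    for i, h0 in enumerate(headings):
        if h0 is None:
            continue
        for h1 in headings[i + 1:i + 1 + n]:
            if h1 is None:
                continue
            diff = abs((h1 - h0 + math.pi) % (2 * math.pi) - math.pi)
            if math.degrees(diff) > angle_thresh_deg:
                return True
    return False

# Example: a robot driving forward, then sharply reversing within 1 s.
track = [(0, 0), (0.2, 0), (0.4, 0), (0.3, 0), (0.1, 0)]
print(changes_direction(track, dt=0.25))  # True
</pre>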
==Image processing==
===Capturing images===
'''Objective''': Capturing images from the (front) camera of the drone.


'''Method''':
*MATLAB
** ffmpeg
** ipcam
** gigecam
** hebicam
* C/C++/Java/Python
** opencv
No method has been chosen yet, but ipcam, gigecam and hebicam have been tested and do not work with the camera of the drone. FFmpeg has also been tested and does work, but capturing a single image takes 2.2 s, which is far too slow. Therefore, it might be better to use software written in C/C++ instead of MATLAB.
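For the OpenCV route listed above, a minimal Python sketch is given below. It assumes that the AR.Drone 2.0 exposes its video stream at tcp://192.168.1.1:5555 and that the local OpenCV build can decode that stream via its FFmpeg backend; neither has been verified here and both should be treated as assumptions.
<pre>
import cv2  # OpenCV, built with the FFmpeg backend

# Assumption: the AR.Drone 2.0 serves its video stream on this address.
STREAM_URL = "tcp://192.168.1.1:5555"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open the drone video stream")

for i in range(100):            # grab 100 frames as a test
    ok, frame = cap.read()      # frame is a BGR image (numpy array)
    if not ok:
        break
    cv2.imwrite("frame_%03d.png" % i, frame)

cap.release()
</pre>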
===Processing images===
'''Objective''': Estimating the player (and possibly the ball) positions from the captured images.
'''Method''': Detect the ball position (if it is in the image) based on its (orange/yellow) color, and detect the player positions based on their shape/color (to be decided). A hedged color-segmentation sketch is given below.
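The sketch below segments the ball by its orange/yellow color (Python + OpenCV, one of the candidate tools listed above). The HSV bounds and the minimum blob size are illustrative assumptions that would need tuning to the actual ball and lighting.
<pre>
import cv2
import numpy as np

def find_ball(frame_bgr):
    """Rough ball detection by orange/yellow color thresholding in HSV.

    The HSV bounds and size threshold are illustrative assumptions.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([10, 100, 100])   # assumed orange/yellow hue range
    upper = np.array([35, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    # OpenCV 4 signature: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 50:  # ignore small specks
        return None
    (x, y), radius = cv2.minEnclosingCircle(largest)
    return (int(x), int(y), int(radius))  # image coordinates and size [px]
</pre>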
== Top Camera ==
The topcam is a camera that is fixed above the playing field. This camera is used to estimate the location and orientation of the drone, and this estimate is used as feedback for positioning the drone at a desired location.
The topcam can stream images to the laptop at a frame rate of 30 Hz, but searching the image for the drone (i.e. the image processing) might be slower. This is not a problem, since the positioning of the drone is neither perfect nor critical: as long as the target of interest (ball, players) is within the field of view of the drone, the result is acceptable.
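As an illustration only, the sketch below converts the pixel positions of two assumed markers on the drone (front and rear) into a field-frame pose estimate. The marker setup and the calibration constants are hypothetical, not the implemented topcam processing.
<pre>
import math

# Assumed topcam calibration: metres per pixel and the pixel that maps to
# the field centre. These numbers are placeholders, not a real calibration.
M_PER_PX = 0.01
CENTER_PX = (640, 480)

def drone_pose_from_topcam(front_px, rear_px):
    """Estimate the drone's (x, y, theta) in field coordinates from the
    pixel positions of a front and a rear marker seen by the topcam.

    Markers and calibration are illustrative assumptions.
    """
    to_field = lambda p: ((p[0] - CENTER_PX[0]) * M_PER_PX,
                          (CENTER_PX[1] - p[1]) * M_PER_PX)  # image y points down
    fx, fy = to_field(front_px)
    rx, ry = to_field(rear_px)
    x, y = (fx + rx) / 2.0, (fy + ry) / 2.0
    theta = math.atan2(fy - ry, fx - rx)
    return x, y, theta

print(drone_pose_from_topcam((700, 400), (660, 430)))
</pre>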


</p>


=References=
<references/>
-->
