Proof of Concept Robotic Drone Referee

Use Case - Referee Ball Crossing Pitch Border Line

As a proof of concept, part of the system architecture is developed for a specific use case: refereeing a ball crossing the pitch border lines. Given the limited hardware and time available for development, the requirements are relaxed as follows:

  • There is no distinction between goal lines and the back and front lines. For the demo, the developed system does not distinguish between a goal score and a ball crossing the back line. A ball crossing the back line is therefore considered a candidate for both ball out of pitch and goal score; additional screening is required to distinguish between these.
  • The demo does not aim for a real-time refereeing solution. Using limited processing power (a PC), the aim is rather to prove the concept.
  • A ball crossing a line is only detected for:

    • Speeds lower than 1.3 [m/s]. Design choices:
      • Relaxed accuracy of 0.13 [m]. In the worst case, the ball crosses the pitch line at maximum speed, only just entirely, and is kicked/returned back into the field immediately after. To observe this, ball detection should be possible both before and after the line crossing. Therefore, an accuracy of at least one ball diameter is necessary, i.e. 0.13 [m].
      • Camera rate up to 10 [fps]: to improve simulation runtime, the camera frame rate is limited to 10 [fps].
        An accuracy of 0.13 [m] at 10 [fps] results in an allowed ball speed of 0.13 [m] * 10 [fps] = 1.3 [m/s] (see the short calculation after this list). Ball speed is typically higher than 1.3 [m/s]; in RoboCup matches, ball speeds of up to 10 [m/s] are very common, and in actual matches ball speeds reach up to 35 [m/s]. For more accurate ball detection:
        • A higher frame rate is required, i.e. higher quality cameras.
        • More processing power is necessary.

    • Ball lower than 0.1 [m] w.r.t. the ground. Design choices:
      • To prevent misdetection (of circular objects smaller or bigger than the ball), a specific aspect ratio is set to search for the ball.
      • The drone height w.r.t. the ground is kept constant at 2 [m]. The aim is to keep the drone as high as possible to increase its field of vision and thereby be less demanding on ball tracking and thus motion control. In the test field a net is hung at a height of 3 [m]. To stay clear of this net, a safety margin of 1 [m] is chosen.
        With a specific, constant, tight aspect ratio and a preset constant drone height, detection of the ball is achievable for ball heights up to 0.1 [m]. As a result, the system does not aim to detect/referee a ball bouncing across the pitch line.

    • Occlusion less than 50%. Design choices:
      • The color detection algorithm used for ball detection detects the ball for occlusion up to 50%.
        The designed system only allows for occlusion under 50%. This is considered sufficient because, in case a better view is required, the drone has the option to move around to reduce occlusion.

  • The developed system is tested without robots on the field. Robustness against such disturbances should be investigated, and possibly improved upon, in the future.
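
The arithmetic behind the relaxed speed requirement can be summarized in a few lines. This is a minimal illustration of the numbers from the text; the variable names are illustrative, not project code.

    # Relaxed speed requirement: accuracy of one ball diameter at 10 fps.
    BALL_DIAMETER_M = 0.13   # required accuracy: one ball diameter [m]
    CAMERA_FPS = 10          # frame rate, limited to improve simulation runtime

    # Between two consecutive frames the ball may travel at most one ball
    # diameter, otherwise a "just crossed and kicked back" event can be missed.
    max_ball_speed = BALL_DIAMETER_M * CAMERA_FPS   # 0.13 * 10 = 1.3 [m/s]
    print(f"Maximum detectable ball speed: {max_ball_speed:.1f} m/s")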

Scope of the realized System Architecture for the POC

Considering the time available for realization to prove the concept, the scope is defined as follows:

  1. WM – Trilateration
  2. WM – Computer Vision
  3. WM – Field Line Estimator
  4. AS – Rule Evaluation
  5. ES – Detection


These are the blocks that have actually been realized, integrated, and implemented to test and prove the concept according to the defined use case.
In addition to these blocks, some others have been researched extensively:

  1. WM – Grid based coordinates
  2. WM – Sensor Fusion


Additionally, some blocks have only been researched roughly:

  1. AS – Search for ball
  2. ES – Positioning
  3. ES – Trajectory planning
  4. ES – Motion Control

Specification Interfaces

As the proof of concept already combines many different functionalities, it is important that the interfaces are clearly defined. For this reason, the following overview was created before development and kept updated during it. It specifies for each block the inputs and outputs, as well as the source and destination of those inputs and outputs. For an overview of the system architecture and the blocks referred to, see the System Architecture section.

Display/Sound:

  • Output: This block produces a visual and/or audio message when the ball is out of pitch.
  • Input: This block requires an event id {0, 1, -1}: 0 means ball in pitch, 1 means ball out of pitch, and -1 means no evaluation is possible at the moment. This signal is retrieved from the WM - Game State block.
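
As a sketch of how this interface could be consumed, the handler below maps the event id to a message. Only the id convention {0, 1, -1} comes from the text; the function name and mapping are hypothetical.

    # Hypothetical consumer of the event id interface described above.
    EVENT_MESSAGES = {
        0: None,                    # ball in pitch: no alert
        1: "Ball out of pitch!",    # visual and/or audio alert
        -1: None,                   # no evaluation possible at the moment
    }

    def display_sound(event_id: int) -> None:
        message = EVENT_MESSAGES.get(event_id)
        if message is not None:
            print(message)          # stand-in for the actual display/sound output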


WM - Game State:

  • Output: This block produces an event id {0, 1, -1} as required by the Display/Sound block.
  • Input: This block requires the event id {0, 1, -1} from the Rule Evaluation block.
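
In this proof of concept, the WM - Game State block essentially relays the event id between Rule Evaluation and Display/Sound. A minimal sketch, with hypothetical class and method names:

    # Hypothetical sketch of WM - Game State as a relay/latch between
    # Rule Evaluation (producer) and Display/Sound (consumer).
    class GameState:
        def __init__(self) -> None:
            self.event_id = -1            # no evaluation available yet

        def update(self, event_id: int) -> None:
            self.event_id = event_id      # written by the Rule Evaluation block

        def read(self) -> int:
            return self.event_id          # read by the Display/Sound block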


Rule Evaluation:

  • Output: This block produces an event id {0, 1, -1} as required by the WM - Game State block.
  • Input: This block requires:

    • Pitch line id within the field of vision: side lines are numbered 1 and 3; goal lines are numbered 2 and 4. See FIGURE for the numbering w.r.t. the origin and the configuration of the coordinate frame. This data is retrieved from the WM - Field Line Estimator block.
    • Expected theta and rho of lines w.r.t. the top-left corner of the field of vision:

      • Rho: (Rho > 0 [m]) ± (0.2 [m]): the allowed error is taken small enough to still distinguish between two parallel lines present at the same time in the field of vision. In the MSL field, the closest distance between two parallel lines is 0.5 [m]. The margin is taken as 40% of this distance, i.e. 0.2 [m]. This data is retrieved from the WM - Field Line Estimator block.
      • Theta: (-pi/2 < theta < pi/2) ± (pi/6). All (straight) lines in the soccer field are either parallel or perpendicular to each other. Parallel lines are distinguished based on rho. Because of the rather large fixed angle of pi/2 between two perpendicular lines, a relatively high tolerance of ± pi/6 is allowed. This data is retrieved from the WM - Field Line Estimator block.

    • Measured theta and rho of lines w.r.t. the top-left corner of the field of vision: measured theta and rho require higher accuracy. This is achieved because image processing, i.e. the Hough transform, is used for this. This higher-accuracy data is retrieved from the Ball/Line Detection block. It is considered sufficient because the performance of the system is determined by the relaxed requirements for the ball-crossing-line evaluation.
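
A sketch of how the Rule Evaluation block could match a measured Hough line against the expected lines within the tolerances stated above. Only the tolerances (± 0.2 [m], ± pi/6) come from the text; the function and data layout are assumptions.

    import math

    RHO_TOL_M = 0.2              # allowed rho error from the text [m]
    THETA_TOL_RAD = math.pi / 6  # allowed theta error from the text [rad]

    def match_line(measured, expected_lines):
        """Return the pitch line id (1..4) whose expected (rho, theta) matches
        the measured line within tolerance, or None if nothing matches."""
        rho_m, theta_m = measured
        for line_id, (rho_e, theta_e) in expected_lines.items():
            if (abs(rho_m - rho_e) <= RHO_TOL_M
                    and abs(theta_m - theta_e) <= THETA_TOL_RAD):
                return line_id
        return None   # no match: no evaluation possible (event id -1)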


WM - Field Line Estimator:

  • Output: This block produces:

    • Pitch line id within field of vision
    • Estimated rho: (Rho > 0 [m]) ± (0.1 [m]). The accuracy of this prediction is directly related (equal) to the error in measured drone position (input for this block). This error is 0.1 [m] << 0.2 [m]. This output is thus accurate enough as an input for the Rule Evaluation block.
    • Estimated theta: (-pi/2 < theta < pi/2) ± pi/18. The accuracy of this prediction is directly related (equal) to the error in measured yaw angle of the drone (input for this block). This error is pi/18 << pi/6. This output is thus accurate enough as an input for the Rule Evaluation block.

  • Input: This block requires:

    • Drone position w.r.t. the center of the field: (0 [m] < x,y < 12 [m]), and the error should be less than 0.2 [m]. This data is retrieved from the WM - UWBS - Trilateration block.
    • Drone height: a constant value for z.
    • Drone yaw angle: (-2pi < yaw < 2pi), and the error should be less than pi/6. This data is retrieved from the WM - Top View Camera block.
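
The estimator's task, predicting rho and theta of a field line from the drone pose, can be sketched as a frame transformation. The sketch below assumes field lines stored in Hesse normal form (rho, theta) in a world frame; the projection from the drone frame into image coordinates (top-left origin) is deliberately left out.

    import math

    def expected_line_in_drone_frame(rho_w, theta_w, drone_x, drone_y, yaw):
        """Transform a field line from world Hesse normal form (rho_w, theta_w)
        into a drone-centered frame rotated by the drone's yaw angle."""
        # Translation: subtract the projection of the drone position onto the
        # line normal. Rotation: subtract the yaw angle from theta.
        rho_d = rho_w - (drone_x * math.cos(theta_w) + drone_y * math.sin(theta_w))
        theta_d = theta_w - yaw
        # Normalize so rho stays non-negative and theta lies in [-pi/2, pi/2)
        if rho_d < 0:
            rho_d, theta_d = -rho_d, theta_d + math.pi
        theta_d = (theta_d + math.pi / 2) % math.pi - math.pi / 2
        return rho_d, theta_d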


Ball/Line Detection:

  • Output:

    • Rho: in the worst case, the line is detected on the border of the line instead of at its center. The allowed error is ± 0.08 [m] (w: line width).
    • Theta: in the worst case, the detected line is oriented diagonally within the line (width). The field of vision is large enough to reduce the error in the angle to an allowable ± 10 [°].

These are sufficiently accurate for use as input for the Rule Evaluation block.
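
The text names the Hough transform for line measurement and a color-based algorithm for ball detection without giving the implementation. The sketch below assumes OpenCV, with illustrative thresholds and color bounds; note that OpenCV's HoughLines returns (rho, theta) w.r.t. the top-left image corner, matching the convention used above (the pixel-to-meter conversion is assumed to happen elsewhere).

    import cv2
    import numpy as np

    def detect_lines(frame_bgr):
        """Measure lines as (rho, theta) pairs with the standard Hough
        transform (rho in pixels, theta in radians, top-left origin)."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)               # illustrative thresholds
        lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)
        return [] if lines is None else [tuple(line[0]) for line in lines]

    def detect_ball(frame_bgr):
        """Color-based ball detection: threshold on an (illustrative) yellow
        HSV range and return the center of the largest blob, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (20, 100, 100), (35, 255, 255))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        return (x + w / 2, y + h / 2)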


WM - UWBS - Trilateration:

  • Output: Drone position: (0 [m] < x,y < 12 [m]) ± (0.1 [m]). With the error 0.1 [m] << 0.2 [m], this output is a sufficient input for the WM – Field Line Estimator block.
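
The UWBS trilateration itself is not detailed in this section. As a sketch, a 2-D position can be estimated from ranges to at least three beacons at known positions by linearizing the circle equations and solving a least-squares problem; the beacon layout and names are assumptions.

    import numpy as np

    def trilaterate(beacons, ranges):
        """Least-squares 2-D position from ranges to known beacons.
        beacons: (n, 2) array of beacon positions [m], n >= 3
        ranges:  (n,)   array of measured distances [m]"""
        beacons = np.asarray(beacons, dtype=float)
        d = np.asarray(ranges, dtype=float)
        # Subtracting the first beacon's circle equation from the others
        # removes the quadratic terms, leaving a linear system A p = b.
        A = 2.0 * (beacons[1:] - beacons[0])
        b = (np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2)
             + d[0] ** 2 - d[1:] ** 2)
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position   # estimated (x, y) of the drone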


WM - TVC (Top View Camera):

  • Output: Drone yaw angle: (-2pi < yaw < 2pi) ± pi/18. With the error pi/18 << pi/6, this output is a sufficient input for the WM - Field Line Estimator block.

Developed Blocks

The development of the different blocks is described separately in Developed Blocks.