WF Wolves


Team website
Qualification video
Team short paper
Hardware specifications

Software description


Please give a brief summary of your walking algorithm (max. 1000 characters).

We rewrote our body code into several standard ROS drivers and nodes, and added weight cells. As the Hamburg Bit-Bots had already experimented with weight cells to measure foot pressure, we adopted this approach last year; the pressure readings allow us to stabilize the walking. With the update of the MX Dynamixels to the new firmware, effort values based on the motor currents became available. We use these values as additional information to determine the positions of the joints and the foot plates.
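As an illustration of how foot-pressure readings can be used to stabilize walking, the sketch below computes the center of pressure (CoP) from four weight cells and derives a proportional ankle correction. The cell positions and the gain are assumed example values, not the team's actual parameters:

```python
# Sketch: CoP-based walking stabilization from four weight cells per foot.
# Cell corner positions on the foot plate (x forward, y left), in meters.
# These coordinates and the gain below are illustrative assumptions.
CELL_POSITIONS = [(0.08, 0.04), (0.08, -0.04), (-0.08, 0.04), (-0.08, -0.04)]

def center_of_pressure(forces):
    """Compute the CoP from four weight-cell readings (N),
    ordered like CELL_POSITIONS. Returns None if the foot is unloaded."""
    total = sum(forces)
    if total <= 0.0:
        return None
    x = sum(f * px for f, (px, _) in zip(forces, CELL_POSITIONS)) / total
    y = sum(f * py for f, (_, py) in zip(forces, CELL_POSITIONS)) / total
    return x, y

def ankle_correction(forces, gain=0.5):
    """P-control sketch: shift the ankle target to pull the CoP
    back toward the center of the support foot."""
    cop = center_of_pressure(forces)
    if cop is None:
        return 0.0, 0.0
    return -gain * cop[0], -gain * cop[1]
```

If, for example, only the two front cells are loaded, the CoP moves forward and the correction tilts the ankle backward to compensate.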


Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

For two years we have been working with TensorFlow and neural networks to detect balls, field lines and goal posts. This is important because the size of the soccer field grows every year, so objects must be detected at longer distances. We were able to expand our skills in this area, especially in cooperation with the Hamburg Bit-Bots. For labeling the training data, the Bit-Bots ImageTagger is used. The built-in Jetson lets us run inference and classify the recordings quickly, and object recognition in general works much better than with the cascade classifier we used before.
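Once the network reports a ball bounding box in image coordinates, the pixel position still has to be converted into a distance on the field before the robot can act on it. A minimal pinhole-camera sketch of that step follows; the focal length, camera height and pitch are assumed example values, not calibrated parameters:

```python
import math

def ball_distance(v_px, img_height=480, focal_px=550.0,
                  cam_height=0.55, cam_pitch=0.3):
    """Estimate the ground distance (m) to a ball from the vertical
    pixel coordinate v_px of its bounding-box bottom edge.
    focal_px, cam_height and cam_pitch (rad, downward) are
    illustrative values for a small humanoid, not calibration data."""
    # Angle of the pixel ray relative to the optical axis.
    ray = math.atan2(v_px - img_height / 2.0, focal_px)
    angle_below_horizon = cam_pitch + ray
    if angle_below_horizon <= 0.0:
        return None  # ray points above the horizon, no ground hit
    return cam_height / math.tan(angle_below_horizon)
```

Pixels lower in the image intersect the ground closer to the robot, so nearer balls yield smaller distances, as expected.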


Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

Our first approaches using a visual compass were not as robust and computationally light as necessary, and competition difficulty increased further when natural-light scenarios were added last year. With the introduction of natural light, reflections on the artificial grass increased. We therefore developed approaches for finding the correct field color and the field lines for basic localization [7]. To make better use of ROS capabilities, we built early prototypes based on particle filters, which normally rely on odometry and laser-scan data, and adapted them to work with visual features such as detected lines. These approaches can be extended with other features, but have hardly been evaluated in play yet due to the lack of a basic odometry. In addition, we use goal detection so that the robot kicks the ball in the intended direction.
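The particle-filter adaptation described above can be illustrated with a minimal sketch: particles are weighted by how well an observed line distance matches the distance each particle would expect, then resampled. The one-line field model, the noise parameter and the helper names are simplified assumptions, not the actual implementation:

```python
import math
import random

def measurement_weight(particle_y, measured_dist, sigma=0.2):
    """Gaussian likelihood of observing a field line at measured_dist (m),
    given the particle's perpendicular distance to a line modeled at y=0."""
    err = measured_dist - abs(particle_y)
    return math.exp(-(err * err) / (2.0 * sigma * sigma))

def resample(particles, weights, rng):
    """Draw a new particle set with probability proportional to weight."""
    total = sum(weights)
    probs = [w / total for w in weights]
    return rng.choices(particles, weights=probs, k=len(particles))

rng = random.Random(42)
# Particles: (x, y, theta); the true robot stands 1 m from the line.
particles = [(0.0, y, 0.0) for y in (0.2, 0.5, 1.0, 1.5, 2.0)]
weights = [measurement_weight(p[1], 1.0) for p in particles]
survivors = resample(particles, weights, rng)
```

After one update, particles consistent with the measured line distance dominate the set; with real odometry between updates, the cloud would track the robot's pose.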


Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

Our behavior is implemented as a state machine. Depending on its role, a robot fulfills different tasks in different situations: the state machine defines how to search for the ball, how and whether to approach it once detected, how and where to kick it, or how to defend. The main roles are striker, defender and keeper. For more flexibility we have started evaluating FlexBE, a framework for designing complex robot workflows that has also been used in the DARPA and ARGOS challenges. It allows us to build and analyze our robot behavior more easily.
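A role's decision process can be sketched as a small state machine; the states and inputs below (for a striker) are illustrative, not the team's actual implementation:

```python
# Sketch of a striker state machine with boolean perception inputs.
# State names and transition conditions are illustrative assumptions.
SEARCH, GO_TO_BALL, KICK = "search", "go_to_ball", "kick"

def next_state(state, ball_seen, ball_close):
    """One transition step: lose the ball -> search again;
    see it far away -> walk to it; close enough -> kick."""
    if not ball_seen:
        return SEARCH
    if ball_close:
        return KICK
    return GO_TO_BALL
```

A defender or keeper would reuse the same skeleton with different states (e.g. positioning between ball and goal instead of kicking).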


List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

- Participation in the KidSize and TeenSize leagues
- Development of a common ROS architecture used by the Hamburg Bit-Bots, Rhoban and us
- Cooperation with the Taura Bots and the Hamburg Bit-Bots
- Several papers by A. Gabel


Please list RoboCup-related papers your team published in 2019.