Bold Hearts


Please give a brief summary of your walking algorithm (max. 1000 characters).

Since 2013 the team has developed and adopted a customised framework, retaining only a few modules provided as open source by the Darwin-OP platform, including the walking algorithm. Following the required hardware changes and the adoption of ROS 2 as middleware, we needed a more stable walking algorithm that allows our robots to walk on artificial grass. We decided to integrate into our ROS 2 system the open-loop walk engine for humanoid robots developed by the RoboCup team Rhoban.
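An open-loop walk engine of this kind generates periodic joint targets purely from a phase variable, with no sensor feedback. The sketch below is a minimal illustration of that idea only; the function name, joints, and oscillator parameters are hypothetical and do not reflect Rhoban's actual implementation.

```python
import math

def open_loop_step(phase, step_amplitude=0.3, lift_amplitude=0.15):
    """Toy open-loop gait: map a phase in [0, 1) to joint targets (radians).

    Purely time-driven -- no feedback -- which is what makes the
    engine "open loop". The two legs run half a cycle out of phase.
    """
    left = 2.0 * math.pi * phase
    right = left + math.pi  # opposite leg, half a period apart
    return {
        "left_hip_pitch": step_amplitude * math.sin(left),
        "right_hip_pitch": step_amplitude * math.sin(right),
        # knees only lift (never hyper-extend), so clamp at zero
        "left_knee": lift_amplitude * max(0.0, math.sin(left)),
        "right_knee": lift_amplitude * max(0.0, math.sin(right)),
    }

# A controller would advance the phase at a fixed rate and send the
# resulting targets to the servos each tick.
targets = open_loop_step(0.25)
```

In a real engine the sinusoids are replaced by tuned splines and the targets are fed through inverse kinematics, but the phase-driven, feedback-free structure is the same.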


Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

In previous years our vision system relied on simple pixel classification based on thresholding in HSV colour space. In 2018 we developed a Convolutional Neural Network (CNN) based segmentation approach as a drop-in replacement for this outdated method, and in 2019 we integrated this CNN framework into ROS 2. We have open-sourced this implementation, which is based on TensorFlow Lite, and are currently working to release it into the ROS 2 build farm and repositories. Our current focus is on further reducing the runtime overhead of this system, to enable the use of more complex models. This is based on utilising TensorFlow Lite's model quantisation methods to boost performance on our robots' ARM-based CPUs, and on applying further optimised operators that obtain good performance at lower computational cost.
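Quantisation of the kind TensorFlow Lite supports replaces floating-point weights with small integers plus a scale and offset, trading a little accuracy for much cheaper integer arithmetic on ARM CPUs. A minimal sketch of affine 8-bit quantisation, purely to illustrate the idea (this is not the TFLite implementation):

```python
def quantise(weights, num_bits=8):
    """Affine quantisation: map floats to integers in [0, 2**num_bits - 1]."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2 ** num_bits - 1) or 1.0  # avoid zero scale
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantise(q, scale, lo):
    """Recover approximate float weights from the integer representation."""
    return [v * scale + lo for v in q]

q, scale, lo = quantise([-1.0, 0.0, 0.5, 1.0])
approx = dequantise(q, scale, lo)  # each value within one scale step
```

The reconstruction error is bounded by one quantisation step (`scale`), which is why accuracy typically degrades only slightly while memory and compute shrink substantially.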


Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

Developing a new framework from scratch requires time and dedication, so in order to participate in the RoboCup competition we first developed the key behaviours that allow us to play. Several approaches to localisation on the RoboCup field are already in use, and we are investigating which is the most robust and reliable at runtime. We are particularly interested in a localisation mechanism that combines team communication with a vision-based approach to determine which direction to play in. To this end, we are looking at a vision-based Monte-Carlo localisation that uses the goalie's and other players' relative positions, together with knowledge of the goalposts' and ball's positions, to build a global model of the environment and enhance decision making.
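Monte-Carlo localisation maintains a set of pose hypotheses (particles), weights each one by how well an observed landmark (such as a goalpost) matches that hypothesis, and resamples. The sketch below shows one such update step on a simplified 1-D field; all names and parameters are illustrative, not the team's implementation.

```python
import math
import random

def mcl_update(particles, observed_dist, landmark_x, noise=0.5):
    """One Monte-Carlo localisation step on a 1-D field.

    particles: list of x positions (pose hypotheses).
    observed_dist: measured distance to a known landmark (e.g. a goalpost).
    Weights each particle by a Gaussian measurement model, then resamples.
    """
    weights = []
    for x in particles:
        expected = abs(landmark_x - x)
        err = observed_dist - expected
        weights.append(math.exp(-err * err / (2 * noise * noise)))
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample with replacement, proportionally to weight
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 9.0) for _ in range(200)]
for _ in range(10):
    particles = mcl_update(particles, observed_dist=3.0, landmark_x=9.0)
estimate = sum(particles) / len(particles)  # converges near x = 6
```

A full 2-D version adds a motion model between updates and fuses several landmarks (both goalposts, the ball, teammates' reported positions) into the weight of each particle.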


Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

For RoboCup 2019 we started to move to ROS 2, which is based on the Data Distribution Service (DDS) for real-time systems. This connectivity framework aims to enable scalability and real-time data exchange using a publisher-subscriber architecture. ROS 2 sits on top of DDS, providing standard messages and tools that adapt it to robotic needs. Our team has focused most of its efforts on this new framework; as a result, we have so far developed the basic behaviours required for the competition. More detail can be found in the attachment.
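The publisher-subscriber pattern underlying DDS and ROS 2 decouples the producers of data from its consumers through named topics. A minimal pure-Python sketch of the pattern (illustrative only; real ROS 2 nodes would use rclpy and typed messages):

```python
class TopicBus:
    """Toy publish-subscribe bus: many publishers, many subscribers per topic."""

    def __init__(self):
        self._subscribers = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver to every subscriber of the topic; the publisher never
        # needs to know who (if anyone) is listening.
        for callback in self._subscribers.get(topic, []):
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("/ball_position", received.append)
bus.publish("/ball_position", {"x": 1.2, "y": -0.4})
```

This decoupling is what lets a vision node, a localisation node, and a behaviour node be developed and restarted independently while exchanging data over shared topics.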


List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

Bold Hearts' first RoboCup competition was in 2003, in the Simulation league. In 2013 the team took on a new challenge by moving to the Humanoid KidSize league. The team is quite active within the RoboCup community. More detail can be found in the attachment.


Please list RoboCup-related papers your team published in 2019.

Publications are listed in the document attached.