
Software description


Please give a brief summary of your walking algorithm (max. 1000 characters).

We use a gait generator based on the Robotis Darwin-OP software, which is an open-loop walk generator.
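An open-loop walk generator of this kind emits periodic joint targets as a function of time only, with no sensory feedback. The sketch below is a toy sinusoidal pattern in the spirit of the Darwin-OP walk engine; the function name, amplitudes, and period are illustrative placeholders, not the team's tuned values.

```python
import math

def hip_knee_targets(t, period=0.6, hip_amp=0.35, knee_amp=0.6):
    """Open-loop joint targets (radians) for left/right legs at time t.

    Illustrative only: a real walk engine also drives ankle and hip
    roll/yaw joints and uses carefully tuned trajectories.
    """
    phase = 2.0 * math.pi * (t % period) / period
    left_hip = hip_amp * math.sin(phase)
    right_hip = hip_amp * math.sin(phase + math.pi)      # legs move in anti-phase
    left_knee = knee_amp * max(0.0, math.sin(phase))     # knee bends only during swing
    right_knee = knee_amp * max(0.0, math.sin(phase + math.pi))
    return left_hip, right_hip, left_knee, right_knee
```

Because the pattern ignores feedback, disturbances (pushes, uneven ground) are not corrected, which is the main limitation of open-loop walking.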


Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

At the moment we only detect the ball, using a MobileNet-based detector trained on thousands of ball images.
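A detector like this typically outputs a set of candidate detections per frame, which are then filtered by confidence. The helper below is a hedged sketch of that post-processing step; the `pick_ball` interface and the 0.5 threshold are assumptions for illustration, not the team's actual pipeline.

```python
def pick_ball(detections, threshold=0.5):
    """Select the most confident ball candidate from detector output.

    detections: list of (confidence, (cx, cy)) pairs in image coordinates,
    as a MobileNet-style detector might produce after decoding.
    Returns the centre of the best candidate, or None if no ball is seen.
    """
    best = max(detections, key=lambda d: d[0], default=None)
    if best is None or best[0] < threshold:
        return None
    return best[1]
```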


Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

The robots do not localize during play; localization is performed only at the start of the game, by dead reckoning.
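Dead reckoning integrates the robot's own motion commands (or odometry) from a known starting pose. A minimal sketch of one integration step, assuming a planar (x, y, theta) pose and robot-frame displacements (names are hypothetical):

```python
import math

def dead_reckon(pose, d_forward, d_left, d_theta):
    """Advance an (x, y, theta) pose estimate by one odometry step.

    d_forward and d_left are displacements in the robot frame;
    d_theta is the change in heading.  Errors accumulate over time,
    which is why dead reckoning alone degrades during a long game.
    """
    x, y, theta = pose
    x += d_forward * math.cos(theta) - d_left * math.sin(theta)
    y += d_forward * math.sin(theta) + d_left * math.cos(theta)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi  # wrap to (-pi, pi]
    return (x, y, theta)
```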


Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

The decision process is based on state machines, one for each player role. State transitions are driven by the position of the ball in the image, and actions are chosen to reach the ball and kick it. The goalkeeper's policy was trained with deep reinforcement learning.
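A state machine keyed on the ball's image position can be sketched as follows. The state names and the 0.85 threshold are illustrative assumptions, not the team's actual values.

```python
# Minimal striker state machine driven by the ball's position in the image.
SEARCH, APPROACH, KICK = "search", "approach", "kick"

def next_state(state, ball):
    """Return the next behaviour state.

    ball is None when the ball is not detected, otherwise (x, y)
    normalised image coordinates in [0, 1], with y = 1 at the bottom.
    """
    if ball is None:
        return SEARCH          # no ball in view: scan for it
    _, y = ball
    if y > 0.85:               # ball near the bottom of the image: at our feet
        return KICK
    return APPROACH            # ball seen but distant: walk towards it
```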


List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

Our team has competed in the Humanoid League since 2014, finishing among the eight best teams in 2016. Our team was one of the first to use NUC computing units, carbon fiber in the structural components of its robots, and large 3D-printed parts, among other small contributions to the league.


Please list RoboCup-related papers your team published in 2019.

Object detection under constrained hardware scenarios: a comparative study of reduced convolutional network architectures. IEEE Latin American Robotics Symposium 2019.