Electric Sheep



Software description

walking

Please give a brief summary of your walking algorithm (max. 1000 characters).

The walking algorithm is based on the open-loop walk engine written and published by team Rhoban in 2015 [1]. As our platform's lower-body kinematic structure is unusual, we had to adapt the kinematic chain to account for significant joint offsets. We attempt to keep the torso upright and elevated from the ground in order to reduce the load torque on the pitch motors. Balance is achieved by measuring the orientation of the torso and adjusting the foot-offset parameters passed to the walk engine to compensate appropriately. Our intention over the next six months is to increase leg motor torque and to generate a new walking pattern better suited to larger robot structures. [1] https://github.com/Rhoban/IKWalk
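A minimal sketch of this compensation idea, assuming a hypothetical interface in which the walk engine accepts sagittal and lateral foot-offset parameters and the torso IMU reports roll and pitch in radians (function and parameter names, and the gains, are illustrative, not our actual code):

```python
# Illustrative sketch only -- not our controller. Assumes a walk engine
# with "foot offset" parameters and a torso IMU reporting roll/pitch.

def balance_compensation(imu_roll, imu_pitch, gain_roll=0.3, gain_pitch=0.3):
    """Map torso orientation error to corrective foot offsets.

    The walk engine itself is open loop (Rhoban IKWalk); balance is
    layered on top by nudging the foot-offset parameters so the torso
    stays upright.
    """
    target_roll, target_pitch = 0.0, 0.0  # upright torso
    offset_y = gain_roll * (target_roll - imu_roll)     # lateral offset
    offset_x = gain_pitch * (target_pitch - imu_pitch)  # sagittal offset
    return offset_x, offset_y

# Example: torso pitched 0.1 rad forward -> feet shift to compensate.
print(balance_compensation(imu_roll=0.0, imu_pitch=0.1))
```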

vision

Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

The vision system is a custom CNN, based on YOLO and implemented in the Darknet framework, which we named xYOLO [1]. For detecting balls and goals, it achieves approximately 10 frames per second on a single core of the Raspberry Pi 3 B. This was done by reducing the size of the network and strategically using XNOR layers, which slightly reduces overall accuracy but significantly increases speed. In the next six months we are upgrading our computation unit and intend to increase the size of the network, as well as to detect other robots and field lines. [1] Barry, Daniel, et al. "xYOLO: A Model for Real-Time Object Detection in Humanoid Soccer on Low-End Hardware", IVCNZ, 2019.
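The XNOR trick is easiest to see on a single dot product. The sketch below is illustrative only, not xYOLO's implementation: it shows that for weights and activations binarized to {-1, +1}, a multiply-accumulate reduces to an XNOR plus a popcount, which is the source of the speed gain on low-end hardware.

```python
import numpy as np

def binarize(x):
    """Binarize real values to {-1, +1}."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def xnor_dot(a_bits, b_bits):
    """Dot product of two {-1,+1} vectors via XNOR/popcount.

    Encoding +1 as bit 1 and -1 as bit 0, matching positions are the
    popcount of XNOR(a, b), so dot = matches - mismatches = 2*matches - n.
    """
    n = a_bits.size
    matches = np.count_nonzero((a_bits > 0) == (b_bits > 0))
    return 2 * matches - n

rng = np.random.default_rng(0)
w, x = binarize(rng.standard_normal(64)), binarize(rng.standard_normal(64))
assert xnor_dot(w, x) == int(np.dot(w, x))  # same result, cheaper bit ops
```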

localization

Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

Localization will combine inertial measurement unit (IMU) based odometry with vision-based position and heading corrections in an extended Kalman filter. IMUs are located in the torso and the feet. Foot IMU data will provide step-length and heading estimates using zero-velocity updates (ZUPT) [1], applied during the stable phase of the walk cycle. Heading will be estimated from the foot IMUs' gyroscope data, with each step accumulating a position and orientation estimate. Accumulated IMU drift is accounted for by the corresponding variance terms in the Kalman filter. Goal posts will serve as absolute reference points for the vision system, and, as the playing field is inherently symmetric, team communication will be used to break the symmetry. [1] Xiaofang, Li, et al. "Applications of zero-velocity detector and Kalman filter in zero velocity update for inertial navigation system", Proc. Chinese Guidance, Navigation & Control Conference, IEEE, 2014.
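The sketch below illustrates the intended fusion with a minimal planar EKF over (x, y, heading). The per-step increments, noise matrices and goal-post measurement model are assumptions for illustration, not our implementation:

```python
import numpy as np

def predict(mu, P, d, dtheta, Q):
    """Propagate (x, y, theta) by one ZUPT-segmented step.

    d and dtheta are the step length and heading change estimated from
    the foot IMUs; Q captures the accumulated drift variance.
    """
    x, y, th = mu
    mu = np.array([x + d * np.cos(th), y + d * np.sin(th), th + dtheta])
    F = np.array([[1, 0, -d * np.sin(th)],
                  [0, 1,  d * np.cos(th)],
                  [0, 0,  1]])
    return mu, F @ P @ F.T + Q

def update_goal_post(mu, P, z, post, R):
    """Correct the pose with z = measured (range, bearing) to a goal
    post at known field position `post`."""
    dx, dy = post[0] - mu[0], post[1] - mu[1]
    q = dx**2 + dy**2
    h = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu[2]])  # expected z
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0],
                  [dy / q, -dx / q, -1]])
    r = z - h
    r[1] = (r[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return mu + K @ r, (np.eye(3) - K @ H) @ P
```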

behavior

Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

Our behaviour is a "fall-through" state machine, split into shared behaviour and role-specific behaviour. In the shared behaviour, safety-critical functionality is addressed first (e.g. buttons, overheating motors), followed by basic game behaviour (e.g. game-controller state, getting up after a fall). Role-specific behaviour then makes use of several inputs, including networking (game controller and mitecom team protocol), object detection and localization. The striker behaviour's priorities are to locate the ball, walk to the ball, find the goal and kick the ball towards the goal. These priorities can be satisfied out of order; for example, if the correct goal has already been located, a goal search is not required.
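A minimal sketch of the fall-through pattern, using hypothetical condition and action names rather than our actual API: conditions are evaluated top to bottom and the first match wins, so safety-critical checks always pre-empt game and role behaviour.

```python
# Illustrative fall-through decision list (names are hypothetical).
def striker_step(robot):
    behaviours = [
        # shared, safety-critical checks first
        (robot.button_pressed,  robot.safe_stop),
        (robot.motors_overheat, robot.relax_motors),
        # shared game behaviour
        (robot.game_paused,     robot.stand_still),
        (robot.has_fallen,      robot.get_up),
        # role-specific (striker); satisfied out of order when possible,
        # e.g. no goal search if the goal is already located
        (lambda: not robot.ball_located(), robot.search_ball),
        (lambda: not robot.at_ball(),      robot.walk_to_ball),
        (lambda: not robot.goal_located(), robot.search_goal),
        (lambda: True,                     robot.kick_towards_goal),
    ]
    for condition, action in behaviours:
        if condition():  # first condition that fires wins
            action()
            return
```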

contributions

List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

2019 was our team's first competition and we had difficulty qualifying due to various hardware issues. Despite this, we attended every drop-in game and every league match, and fulfilled our referee duty (including the finals); one of our team members was awarded best referee. After the world cup competition in 2019, we open-sourced our entire platform, including electronics, hardware and software, under an MIT license (unless otherwise stated in directories/files) [1]. We intend to open-source the platform used in the 2020 world cup competition as well. The training images and annotations used for the neural network were also uploaded to the BitBots Image Tagger [2]. [1] https://github.com/electric-sheep-uc/black-sheep [2] https://imagetagger.bit-bots.de

publications

Please list RoboCup-related papers your team published in 2019.

Attached file →