NUbots


Team website
Qualification video
Team short paper
Hardware specifications
Software description
Global description file

walking

Please give a brief summary of your walking algorithm (max. 1000 characters).

The current walk implementation uses an open-loop hybrid walk controller that dynamically switches between an analytically solved centre-of-mass trajectory based on the linear inverted pendulum model and a Zero Moment Point preview controller, based on the work of Yi et al. and Song. To add local feedback, positions for the ankle pitch and roll joints are prescribed based on information from the Inertial Measurement Unit (IMU) to counteract small imbalances. Our current walk is an old walk used on the Darwin-OP robots and has major issues, so we are investigating other walks for the 2020 competition. These include the Bit-Bots' Quintic Walk, which is based on Rhoban's Quintic Walk, and Rhoban's IK Walk. A quasi-static walk, developed as a mechatronics final-year project using a non-linear optimisation method to generate the gait, is due to be completed by the 2020 competition.
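
As an illustration, the minimal sketch below shows the closed-form centre-of-mass solution of the linear inverted pendulum model together with a simple IMU-based ankle correction of the kind described above. The CoM height, gains, and function names are assumptions for illustration only, not the values or interfaces used on the robot.

```python
import numpy as np

G = 9.81           # gravity (m/s^2)
COM_HEIGHT = 0.22  # assumed constant CoM height (m)
TC = np.sqrt(COM_HEIGHT / G)  # LIPM time constant

def lipm_com(x0, v0, t):
    """Closed-form LIPM CoM position and velocity at time t, relative to the support foot."""
    x = x0 * np.cosh(t / TC) + TC * v0 * np.sinh(t / TC)
    v = (x0 / TC) * np.sinh(t / TC) + v0 * np.cosh(t / TC)
    return x, v

def ankle_feedback(imu_roll, imu_pitch, k_roll=0.3, k_pitch=0.3):
    """Prescribe small ankle roll/pitch offsets from IMU orientation to counteract imbalance."""
    return -k_roll * imu_roll, -k_pitch * imu_pitch

# Example: CoM state 50 ms into a step, starting 2 cm behind the support foot.
print(lipm_com(x0=-0.02, v0=0.1, t=0.05))
print(ankle_feedback(imu_roll=0.02, imu_pitch=-0.01))
```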

vision

Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

The Visual Mesh underpins the vision system and is used for sparse detection of balls, points on the field, field lines, goal posts and other robots. From the Visual Mesh output, a series of specialised detectors is employed to detect field edges, balls, and goal posts. All points that the Visual Mesh has identified as either field points or field line points are clustered into connected regions, and each cluster is then either merged or discarded using heuristics until a single cluster remains, allowing an upper convex hull to be fitted. The ball detector finds all clusters which are below the upper convex hull. A circular cone is then fitted to each cluster. Heuristics, such as the degree of circle fit and different distance metrics, are then used to discard cones. The goal post detector finds all clusters which intersect the upper convex hull. Clusters are formed from goal post edge points, and the bottom mid-point of each post is determined.
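
The sketch below illustrates the idea of fitting an upper convex hull to field points and testing whether a cluster lies below it, as a stand-in for the ball/goal-post candidate selection described above. The function and variable names are assumptions, not our actual code, and coordinates are standard Cartesian (y up) rather than image rows.

```python
import numpy as np

def upper_convex_hull(points):
    """Andrew's monotone chain, upper hull only; points are (x, y) pairs."""
    pts = sorted(map(tuple, points))
    hull = []
    for p in pts:
        while len(hull) >= 2:
            ox, oy = hull[-2]
            ax, ay = hull[-1]
            cross = (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox)
            if cross >= 0:   # non-right turn: middle point is not on the upper hull
                hull.pop()
            else:
                break
        hull.append(p)
    return np.array(hull)

def below_hull(point, hull):
    """True if the point lies below the piecewise-linear upper hull (ball candidate)."""
    hull_y = np.interp(point[0], hull[:, 0], hull[:, 1])
    return point[1] < hull_y

hull = upper_convex_hull([(0.0, 1.0), (1.0, 1.6), (2.0, 1.5), (3.0, 1.1)])
print(below_hull((1.5, 1.2), hull))  # -> True: cluster centre below the field hull
```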

localization

Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

Localisation on board the robot is performed using a particle filter. This allows us to maintain multiple hypotheses about the current pose of the robot, providing robustness against the mirrored-field problem and against the ambiguity of multiple possible initial positions when entering the field. The filter relies on measurement updates from the vision module and on IMU data for time updates. The measurement update previously tracked only the four goal post locations; however, we are working on introducing the tracking of field lines during 2020. In particular, we will focus on distinctive field features such as corners and the centre circle.
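
As a minimal sketch of this kind of filter, the code below propagates particles with an odometry/IMU delta and reweights them against a goal-post range observation. The field dimensions, noise parameters, and data formats are illustrative assumptions, not the team's actual interfaces.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
# Particle state: (x, y, heading) on an assumed 9 m x 6 m field.
particles = rng.uniform([-4.5, -3.0, -np.pi], [4.5, 3.0, np.pi], size=(N, 3))
weights = np.full(N, 1.0 / N)

GOAL_POSTS = np.array([[4.5, 1.3], [4.5, -1.3], [-4.5, 1.3], [-4.5, -1.3]])

def time_update(particles, d_forward, d_left, d_turn, noise=(0.02, 0.02, 0.01)):
    """Propagate each particle by the odometry/IMU delta plus Gaussian noise."""
    c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += c * d_forward - s * d_left + rng.normal(0, noise[0], N)
    particles[:, 1] += s * d_forward + c * d_left + rng.normal(0, noise[1], N)
    particles[:, 2] += d_turn + rng.normal(0, noise[2], N)

def measurement_update(particles, weights, post_idx, measured_range, sigma=0.5):
    """Reweight particles by how well the predicted goal-post range matches the observation."""
    predicted = np.linalg.norm(GOAL_POSTS[post_idx] - particles[:, :2], axis=1)
    weights *= np.exp(-0.5 * ((predicted - measured_range) / sigma) ** 2)
    weights /= weights.sum()

def resample(particles, weights):
    """Resample to concentrate particles on likely poses."""
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

time_update(particles, d_forward=0.05, d_left=0.0, d_turn=0.01)
measurement_update(particles, weights, post_idx=0, measured_range=3.2)
particles, weights = resample(particles, weights)
print(np.average(particles, axis=0, weights=weights))  # naive pose estimate (ignores heading wrap)
```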

behavior

Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

The robot's behaviour is a basic state machine: the robot looks for the ball and goals, positions itself to kick the ball toward the goals, and then kicks. The vision system determines which foot the robot will kick with.
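
A minimal sketch of this kind of state machine is shown below; the state names, transition conditions, and observation inputs are assumptions for illustration, not the behaviour module's actual structure.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()    # look for the ball and goals
    POSITION = auto()  # line up behind the ball, facing the goals
    KICK = auto()      # execute the kick with the foot chosen by vision

def step(state, ball_seen, goal_seen, aligned):
    if state is State.SEARCH:
        return State.POSITION if (ball_seen and goal_seen) else State.SEARCH
    if state is State.POSITION:
        if not ball_seen:
            return State.SEARCH
        return State.KICK if aligned else State.POSITION
    if state is State.KICK:
        return State.SEARCH  # after kicking, start searching again

state = State.SEARCH
for obs in [(True, True, False), (True, True, True), (True, True, True)]:
    state = step(state, *obs)
    print(state)
```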

contributions

List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

The NUbots team participated in the 2019 Humanoid Teen-Size League and finished as quarter-finalists. All of the team's RoboCup code, hardware designs, and debugging tools are open-sourced on GitHub. The team has developed a Blender plugin to generate semi-synthetic images with fully annotated ground-truth segmentation maps. NUbots have been active participants in the Humanoid League rules discussions and proposed a new standard communication protocol, based on Protobuf messages, to the Technical Committee. A prototype tool, based on the NUsight debugging utility, for monitoring network traffic and displaying robot communications in a meaningful manner is currently being developed. A new development this year, which we hope will benefit not only NUbots but also other teams interested in any of our systems, is a comprehensive documentation resource in the form of a public website providing detailed information about our hardware and software systems as well as current and future projects.

publications

Please list RoboCup-related papers your team published in 2019.

Zahn, B., Fountain, J., Houliston, T., Biddulph, A., Stephan, C., Mendes, A. (2019). Optimization of robot movements using genetic algorithms and simulation.

Ginn, D., Mendes, A., Chalup, S., Fountain, J. (2018). Monocular ORB-SLAM on a Humanoid Robot for Localization Purposes. In: Mitrovic, T., Xue, B., Li, X. (eds) AI 2018: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 11320. Springer, Cham. https://doi.org/10.1007/978-3-030-03991-2_8