ZSTT-NTNU


Team website
Qualification video
Team short paper
Hardware specifications

Software description

walking

Please give a brief summary of your walking algorithm (max. 1000 characters).

Using the IMU, we found an appropriate swing value for a stable walking gait. Our humanoid robot uses a basic walking gait based on inverse and forward kinematics, and our motion software generates the basic walking motion by computing these kinematics. The walking gait is adjusted by analyzing the IMU sensor values with our algorithm, and the motion program visualizes an inverted-pendulum simulation together with the sensor output. The robot can walk stably in any direction (omnidirectional walking). In the motion software we can also create motions such as kicking the ball, moving the arms, and moving the head. The motion software communicates with the humanoid over a serial connection.
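The inverse kinematics underlying the basic gait can be illustrated with a minimal planar two-link leg model (hip pitch and knee only). This is a hedged sketch, not the team's actual kinematics code; the link lengths and angle conventions are assumptions.

```python
import math

def leg_ik(x, z, l1=0.1, l2=0.1):
    """Planar two-link IK: hip and knee angles for a foot target (x, z)
    relative to the hip, with z measured downward. l1/l2 are illustrative
    thigh/shank lengths in metres."""
    d2 = x * x + z * z
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Knee joint angle from the law of cosines (0 = leg fully extended)
    cos_knee = (l1 * l1 + l2 * l2 - d2) / (2 * l1 * l2)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to the foot minus the triangle angle at the hip
    alpha = math.atan2(x, z)
    cos_beta = (l1 * l1 + d2 - l2 * l2) / (2 * l1 * d)
    hip = alpha - math.acos(max(-1.0, min(1.0, cos_beta)))
    return hip, knee

def leg_fk(hip, knee, l1=0.1, l2=0.1):
    """Forward kinematics for the same chain, used to verify the IK."""
    x = l1 * math.sin(hip) + l2 * math.sin(hip + knee)
    z = l1 * math.cos(hip) + l2 * math.cos(hip + knee)
    return x, z
```

Running the forward kinematics on the IK result recovers the original foot target, which is the consistency check the "inverse and forward kinematics" pairing makes possible.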


vision

Please give a brief summary of your vision algorithm, i.e. how your robots detect balls, field borders, field lines, goalposts and other robots (max. 1000 characters).

Our software system is implemented in Python with OpenCV. We detect objects such as the ball, the goalposts, the field, and opponent robots using the Hough circle transform and colour-based methods.


localization

Please give a brief summary of how your robots localize themselves on the field (max. 1000 characters).

The localization system combines vision data and IMU data. We calculate the distances to the ball, the opponents, and the goalposts from the head (camera) tilt angle, the vertical position of each object in the image, and the height of the camera above the ground. After detecting these object positions, the robot estimates its own position on the field by fusing the IMU data with the image-processing results.
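The distance estimate described above reduces to simple ground-plane geometry: the ray through the object's pixel is intersected with the ground, using the camera height and tilt. This is a minimal pinhole-model sketch; the field of view, image size, and function name are assumptions.

```python
import math

def object_distance(camera_height, head_pitch, pixel_y,
                    image_height=480, vertical_fov=math.radians(45)):
    """Estimate horizontal ground distance to an object.

    camera_height: height of the camera above the ground (metres).
    head_pitch:    downward tilt of the camera from horizontal (radians).
    pixel_y:       vertical pixel row of the object's base (0 = image top).
    """
    # Angular offset of the pixel ray from the optical axis
    offset = (pixel_y - image_height / 2) / image_height * vertical_fov
    angle_below_horizon = head_pitch + offset
    if angle_below_horizon <= 0:
        return None  # ray does not intersect the ground plane
    return camera_height / math.tan(angle_below_horizon)
```

For example, with the camera 0.8 m above the ground and tilted 45° down, an object at the image centre lies 0.8 m ahead.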

behavior

Please give a brief summary of your behavioral architecture and the decision processes of your robots (max. 1000 characters).

The game controller server broadcasts the game status, such as the current state (initial, ready, set, etc.). When the perception controller receives this information, it sends a heartbeat back to the game controller server to signal that we are connected. The robot periodically sends IMU data (roll, pitch, and yaw) to the perception controller. The perception controller grabs frames from the camera and attempts to detect objects such as the field, the ball, the opponents, and the opponent's goalpost. Based on these results, the perception controller determines a command and sends it to the robot.
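The decision step at the end of this loop can be sketched as a small state-driven mapping from game state and perception results to a robot command. The state constants mirror the GameController states; the command names and thresholds are illustrative assumptions.

```python
# Game states broadcast by the RoboCup GameController
STATE_INITIAL, STATE_READY, STATE_SET, STATE_PLAYING, STATE_FINISHED = range(5)

def decide_command(game_state, ball_visible, ball_x, image_width=640):
    """Map the current game state and ball detection to a motion command
    (simplified decision logic; command names are hypothetical)."""
    if game_state in (STATE_INITIAL, STATE_SET, STATE_FINISHED):
        return "stand"
    if game_state == STATE_READY:
        return "walk_to_start_position"
    # STATE_PLAYING: search for, align with, and approach the ball
    if not ball_visible:
        return "search_ball"
    center = image_width / 2
    if ball_x < center - 50:
        return "turn_left"
    if ball_x > center + 50:
        return "turn_right"
    return "walk_forward"
```

In the architecture described above, the returned command string would be what the perception controller sends to the robot over the serial link.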

contributions

List your previous participation (including rank) and contribution to the RoboCup community (release of open-source software, datasets, tools etc.)

RoboCup 2019, Humanoid League AdultSize, Technical Challenge: 3rd place
RoboCup 2019, Humanoid League AdultSize, Main game: 4th place
RoboCup 2018, Humanoid League AdultSize, Technical Challenge: 3rd place
RoboCup 2017, Humanoid League AdultSize, Main game: 4th place

publications

Please list RoboCup-related papers your team published in 2019.