# HULKs NAO Calibration
When using Twix with the `Parameter` or `Manual Calibration` panel, there are two options: `Set` changes the values on the robot, while `Save to Disk` stores them locally in the robot's head file. This file looks like `etc/configuration/head.P0000074A04S8C700011.json` and will contain all the changes. After saving to disk, push them to a separate branch, or to main if no calibration has been done yet.

**MAKE SURE TO SAVE THEM TO DISK OR YOU WILL LOSE THEM!**
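As a purely hypothetical example (the exact keys and values depend on which parameters you changed), a head file with one tweaked segmenter threshold might contain something like:

```json
{
  "image_segmenter": {
    "vision_top": {
      "vertical_edge_threshold": 12
    }
  }
}
```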
- Put the robot into the `Initial` state.
- Hold the front head button and press the chest button for a solid second. The robot should center its head, and its chest button will turn purple if done correctly. The robot is now in the `Calibration` state.
- Place the robot in the center of the field looking at one of the penalty markers.
- Open Twix and open the `Manual Calibration` and `Image` panels side by side.
- In the `Image` panel, enable the `Penalty Boxes` overlay in the `Overlay` dropdown menu.
- Check the console where Twix was run for errors; if there are any, restart Twix (the panels should be saved automatically).
- Tweak the sliders in the `Manual Calibration` panel to fit the overlay to the actual field's penalty box.
- Open up the `Image Segments` panel and a parameter panel with `image_segmenter`.
- Mainly tweak the `vertical_edge_threshold` attribute of the bottom and top camera. Do this until you have decent segments up close and in the horizontal center of the image. Make sure that each reasonably visible line has a clear segment.
| Good Threshold | Bad (Noisy) Threshold |
| --- | --- |
Prevent as much noise in the image as possible without influencing the line detection, especially in the center of the image.
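The threshold can be thought of as a cutoff on the luminance difference between vertically neighboring samples. This is only a sketch of the idea, not the actual `image_segmenter` code:

```rust
// Sketch of the idea behind vertical_edge_threshold (not the real
// implementation): an edge is declared where the luminance difference
// between vertically neighboring samples exceeds the threshold.
fn is_vertical_edge(previous_luminance: i16, current_luminance: i16, threshold: i16) -> bool {
    (current_luminance - previous_luminance).abs() > threshold
}

fn main() {
    let threshold = 16;
    // A strong dark-to-bright transition (e.g. field to line) is an edge...
    assert!(is_vertical_edge(80, 200, threshold));
    // ...while a small brightness fluctuation (noise) is not.
    assert!(!is_vertical_edge(80, 90, threshold));
    println!("edge checks passed");
}
```

Lowering the threshold produces more segments (and more noise); raising it suppresses noise but may also swallow faint, distant lines.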
The vertical stride is the number of pixels skipped after analyzing a pixel. A larger stride speeds up the process but makes it considerably less accurate, especially for lines that are further away and therefore smaller on screen. Going from `"vertical_stride": 2` to `"vertical_stride": 1` will double the execution time of this fairly large part of the vision pipeline. Keep the value at 2, tweak everything according to that value, and only set it to 1 if truly necessary.
| Stride: 1 | Stride: 2 |
| --- | --- |

`vertical_stride`: keep at 2; go to 1 only if necessary.
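The cost effect of the stride can be illustrated with a small sketch (not the actual segmenter code): halving the stride doubles the number of rows sampled per column, and therefore roughly doubles the work.

```rust
// Sketch: vertical_stride determines how many rows are sampled per column.
fn sampled_rows(image_height: usize, vertical_stride: usize) -> usize {
    (0..image_height).step_by(vertical_stride).count()
}

fn main() {
    let image_height = 480;
    // Stride 2 samples half as many rows as stride 1, so halving the stride
    // roughly doubles the execution time of this pipeline stage.
    assert_eq!(sampled_rows(image_height, 2), 240);
    assert_eq!(sampled_rows(image_height, 1), 480);
    println!(
        "stride 2 -> {} rows, stride 1 -> {} rows",
        sampled_rows(image_height, 2),
        sampled_rows(image_height, 1)
    );
}
```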
With the `FieldColor` option in the `ColorMode` dropdown, segments are colored by how well they match the field color parameters. Open the `field_color_detection` parameters panel. The colors mean:
- Yellow: Completely confident
- White: Pretty sure
- Original Color: Not field
To get a closer look at the algorithm behind this filtering, look at `field_color.rs`. For a mostly well-lit field, about 90-95% of the segments should be yellow, with the rest mainly white. Worse conditions may be hard to tweak for.
The following properties should be changed in order.
First look at the colors other than green. Hovering the cursor over segments in the `Image Segments` panel shows their RGB values, which can help when changing `blue_chromaticity_threshold` and `red_chromaticity_threshold`.
Then tweak `lower_green_chromaticity_threshold`, `upper_green_chromaticity_threshold`, and `green_luminance_threshold` by changing them slightly and observing the effect.
```
"vision_top": {
    "blue_chromaticity_threshold": how much blue is filtered,
    "red_chromaticity_threshold": how much red is filtered,
    "lower_green_chromaticity_threshold": lower green threshold,
    "upper_green_chromaticity_threshold": upper green threshold,
    "green_luminance_threshold": green luminance
}
```
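The general idea of chromaticity-based classification can be sketched as follows. The threshold values, the exact comparisons, and the confidence mapping here are assumptions for illustration only; the real logic lives in `field_color.rs`:

```rust
// Illustrative sketch of chromaticity-based field-color classification.
// Thresholds are made up; the actual comparisons are in field_color.rs.
#[derive(Debug, PartialEq)]
enum Confidence {
    Field,    // drawn yellow: completely confident
    Maybe,    // drawn white: pretty sure
    NotField, // drawn in the original color
}

fn classify(red: f32, green: f32, blue: f32) -> Confidence {
    // Assumed thresholds, in the spirit of the parameters above.
    let red_chromaticity_threshold = 0.37;
    let blue_chromaticity_threshold = 0.38;
    let lower_green_chromaticity_threshold = 0.40;
    let upper_green_chromaticity_threshold = 0.43;
    let green_luminance_threshold = 40.0;

    // Chromaticity: each channel's share of the total brightness.
    let sum = red + green + blue;
    let (r, g, b) = (red / sum, green / sum, blue / sum);

    if r > red_chromaticity_threshold
        || b > blue_chromaticity_threshold
        || green < green_luminance_threshold
    {
        return Confidence::NotField;
    }
    if g > upper_green_chromaticity_threshold {
        Confidence::Field
    } else if g > lower_green_chromaticity_threshold {
        Confidence::Maybe
    } else {
        Confidence::NotField
    }
}

fn main() {
    // A strongly green pixel should be classified as field...
    assert_eq!(classify(60.0, 140.0, 70.0), Confidence::Field);
    // ...while a gray-white pixel (e.g. a line) should not.
    assert_eq!(classify(200.0, 200.0, 200.0), Confidence::NotField);
    println!("classification checks passed");
}
```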
| Bad Filtering | Good Filtering |
| --- | --- |
To see the result of the filtering, enable the `Filtered Segments` option in the `Image Segments` panel.
The input for the line detection is the filtered-segments view of the `Image Segments` panel in Twix; look for good lines there. Sometimes it flickers, and the cause is still unknown. The segments are then fed into a RANSAC algorithm, which is non-deterministic, meaning it won't produce the exact same results every time. The parameters to tweak can be found by searching for `line_detection`. The process of line detection is quite complex.
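The RANSAC step described above can be sketched like this (not the actual `line_detection` code; a tiny hand-rolled generator stands in for a proper RNG so the sketch has no dependencies):

```rust
// Minimal RANSAC line-fitting sketch: repeatedly pick two points, fit a line
// through them, and count the inliers closer than a maximum fit distance;
// the model with the most inliers wins. Because the sampling is random,
// two runs can find different lines.
struct Lcg(u64); // tiny stand-in RNG, not suitable for real use
impl Lcg {
    fn next_below(&mut self, bound: usize) -> usize {
        self.0 = self.0.wrapping_mul(6364136223846793005).wrapping_add(1);
        ((self.0 >> 33) as usize) % bound
    }
}

// Perpendicular distance from point p to the line through a and b.
fn distance_to_line(p: (f32, f32), a: (f32, f32), b: (f32, f32)) -> f32 {
    let (dx, dy) = (b.0 - a.0, b.1 - a.1);
    ((p.0 - a.0) * dy - (p.1 - a.1) * dx).abs() / (dx * dx + dy * dy).sqrt()
}

fn best_line_inliers(points: &[(f32, f32)], iterations: usize, maximum_fit_distance: f32) -> usize {
    let mut rng = Lcg(42);
    let mut best = 0;
    for _ in 0..iterations {
        let a = points[rng.next_below(points.len())];
        let b = points[rng.next_below(points.len())];
        if a == b {
            continue; // degenerate sample, no line defined
        }
        let inliers = points
            .iter()
            .filter(|&&p| distance_to_line(p, a, b) <= maximum_fit_distance)
            .count();
        best = best.max(inliers);
    }
    best
}

fn main() {
    // Five collinear points plus one outlier: RANSAC should find the line
    // through the five and reject the outlier.
    let points = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (4.0, 4.0), (10.0, 0.0)];
    assert_eq!(best_line_inliers(&points, 200, 0.5), 5);
    println!("best line has 5 inliers");
}
```

The `maximum_fit_distance_in_pixels` parameter plays the role of `maximum_fit_distance` here: widening it accepts noisier points onto a line, narrowing it splits lines apart.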
- Start by disabling all `check_*` booleans. These are meant to filter out some lines, which is important to watch for.
- Tweak the other parameters to get good lines.
- Turn the checks back on to verify they don't filter out the wrong lines.
- To better understand what happens behind the scenes, look at how each variable is used in the code by searching for it.
```json
"vision_top": {
    "allowed_line_length_in_field": {
        "end": 4.0,
        "start": 0.15000000596046448
    },
    "check_line_distance": true,
    "check_line_length": true,
    "check_line_segments_projection": true,
    "gradient_alignment": -0.949999988079071,
    "maximum_distance_to_robot": 6.0,
    "maximum_fit_distance_in_pixels": 3.0,
    "maximum_gap_on_line": 30.0,
    "maximum_number_of_lines": 10,
    "maximum_projected_segment_length": 0.30000001192092896,
    "minimum_number_of_points_on_line": 5
}
```
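As an illustration of what the `check_*` flags do, a length check could look roughly like the following. This is a sketch under assumed semantics, not the real implementation:

```rust
// Sketch of a check_line_length-style filter (assumed semantics): a detected
// line is kept only when its length in field coordinates falls inside
// allowed_line_length_in_field. With the check disabled, everything passes.
fn passes_line_length_check(length_in_field: f32, start: f32, end: f32, check_enabled: bool) -> bool {
    !check_enabled || (length_in_field >= start && length_in_field <= end)
}

fn main() {
    let (start, end) = (0.15, 4.0); // from allowed_line_length_in_field
    // A 2 m line is plausible; a 9 m line is likely a misdetection.
    assert!(passes_line_length_check(2.0, start, end, true));
    assert!(!passes_line_length_check(9.0, start, end, true));
    // With the check disabled (as in the first tuning step), nothing is filtered.
    assert!(passes_line_length_check(9.0, start, end, false));
    println!("length check behaves as expected");
}
```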
Odometry refers to the process of estimating the robot's position and orientation using data from its leg motions and other sensors. By analyzing the leg movements and sensor feedback, the robot can estimate how it has moved and turned on the field. This information is crucial for navigation and decision-making during the game.
To tweak this, you can let the robot walk to its starting position while localization by lines is disabled. This can be done by setting `localization.use_line_measurements` to `false`. Then you can tweak the `odometry` parameters, which are (probably) in the format `[x, y]`.
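If the `odometry` parameters are indeed per-axis scale factors `[x, y]` (an assumption; verify against the code before relying on it), they would correct the raw displacement estimated from the leg motions roughly like this:

```rust
// Hypothetical sketch: apply per-axis odometry scale factors [x, y] to the
// raw displacement estimated from leg motions. The real parameter semantics
// may differ; check how `odometry` is used in the code.
fn corrected_displacement(raw: (f32, f32), scale: (f32, f32)) -> (f32, f32) {
    (raw.0 * scale.0, raw.1 * scale.1)
}

fn main() {
    // A scale factor stretches or shrinks the estimated displacement per axis,
    // compensating for a robot that systematically over- or under-reports motion.
    let corrected = corrected_displacement((1.0, 0.2), (1.1, 1.0));
    assert!((corrected.0 - 1.1).abs() < 1e-6);
    assert!((corrected.1 - 0.2).abs() < 1e-6);
    println!("corrected displacement: {:?}", corrected);
}
```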