Merge pull request #5 from OCRTOC/patch
release v1.1
rar-lw authored Aug 28, 2020
2 parents 8aaf1c9 + ab7f03b · commit ab1bfce
Showing 2,594 changed files with 3,548,940 additions and 71 deletions.
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,12 @@
## [V1.1] release date: 2020-08-28
**Changes in v1.1**
- Added 20 object models and 100 trial scenes (available in ocrtoc_materials_patch). You can use the script ocrtoc_materials_patch/patch.sh to copy the new models and scenes into your docker image (a usage sketch follows this list). The new object models will be reused in the simulation contest and in the real robot stage.
- Added the task evaluation module (search "Trigger task and evaluation" in the README). Your solution will be evaluated by the same software module in the simulation contest.
- Changed ambient light strength from 0.3 to 0.7 in all trial scenes.
- Fixed camera_info of the Kinect camera for the Sapien simulator.
- Fixed gripper issues for the Sapien simulator. The robotiq 2f-85 gripper model now uses an explicitly built prismatic joint. Before the fix, the gripper model relied on the kinematic constraints of three revolute joints, which sometimes failed to maintain the mimic-joint status under certain external forces.
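
As referenced above, a minimal sketch of applying the patch. The container name follows the README examples, while the in-container destination path /root is an assumption; adjust both to your setup:

```bash
# Copy the patch folder into the running container, then run the patch script.
# Container name as in the README examples; /root is an assumed destination.
sudo docker cp ocrtoc_materials_patch ocrtoc_container:/root/
sudo docker exec -it ocrtoc_container bash -c "cd /root/ocrtoc_materials_patch && bash patch.sh"
```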

**Known issues in v1.1**
- The gripper model in the Gazebo simulator has not been fixed yet. If you experience any strange problems while using Gazebo, please use Sapien instead.


15 changes: 6 additions & 9 deletions README.md
@@ -3,7 +3,7 @@
This is the OCRTOC software package. For more information on OCRTOC please visit

To take part in OCRTOC, you need to develop your own solution using this software package. After you upload your solution to the competition platform, its performance will be evaluated.

- For the simulation stage, we support two simulators: Gazebo and Sapien. You can choose either of them to test your solution on your local machine. On the competition platform we use both simulators to evaluate your solution. Your final score will be the maximum score of the two simulators. As long as your solution works fine with either of the two simulators, it is good enough to qualify for the real robot stage.
+ For the simulation stage, we support two simulators: Gazebo and Sapien. You can choose either of them to test your solution on your local machine. On the competition platform we use both simulators to evaluate your solution. As long as your solution works fine with either of the two simulators, it is good enough to qualify for the real robot stage.

For the real robot stage, hardware drivers will be provided in this software package (around the end of August). Your solution will be tested on real robot hardware. The software interfaces for sensor readings and robot control are the same in simulation and on the real robot hardware, so you will not encounter interface issues when transferring your solution from simulation to the real robot.

@@ -108,7 +108,7 @@
After you have obtained the task information, you need to implement the core function

**Evaluation**

- After you have finished the task, you need to publish the actionlib result topic. The result is formatted as a string. We do not parse the content of this string; it is only used to activate our callback function for evaluation. So you can write anything reasonable into this string, such as "done", "finished", and so on. If you do not publish the actionlib result at all, your solution will be terminated after a predefined timeout (e.g. 10 minutes), and the evaluation will then start automatically. We highly recommend publishing the actionlib result topic once your solution has finished execution. This helps us compute the execution time of your solution. If two teams have the same score, the team with the shorter execution time will be ranked higher.
+ After you have finished the task, you need to publish the actionlib result topic. The result is formatted as a string. We do not parse the content of this string; it is only used to activate our callback function for evaluation. So you can write anything reasonable into this string, such as "done", "finished", and so on. If you do not publish the actionlib result at all, your solution will be terminated after a predefined timeout (e.g. 10 minutes), and the evaluation will then start automatically. We highly recommend publishing the actionlib result topic once your solution has finished execution. This helps us compute the execution time of your solution. If two teams have the same performance, the team with the shorter execution time will be ranked higher.
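
For illustration, a result can also be published from the command line once your solution finishes. The topic name and message type below are hypothetical placeholders, not names taken from this package; substitute whatever your solution's action server actually uses:

```bash
# List candidate result topics first (names vary per setup).
rostopic list | grep result
# Publish one result message; /your_solution/result and your_pkg/YourActionResult
# are placeholders, and the string content is not parsed by the evaluator anyway.
rostopic pub -1 /your_solution/result your_pkg/YourActionResult "{result: {message: 'done'}}"
```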


## Use the OCRTOC software package on your local machine
@@ -150,16 +150,13 @@
```bash
sudo docker exec -it ocrtoc_container bash
roslaunch ocrtoc_solution commit_solution.launch
```

- 5. **Trigger task and scoring**
-
-   **The scoring module** will be released by the end of August. Before the release, you will not see score output by running the following script.
+ 5. **Trigger task and evaluation**
```bash
sudo docker exec -it ocrtoc_container bash
# For gazebo
- roslaunch ocrtoc_task trigger_and_score.launch simulator:=gazebo scene:=1-1
+ roslaunch ocrtoc_task trigger_and_evaluation.launch simulator:=gazebo scene:=1-1
# For sapien
- roslaunch ocrtoc_task trigger_and_score.launch simulator:=sapien scene:=1-1
+ roslaunch ocrtoc_task trigger_and_evaluation.launch simulator:=sapien scene:=1-1
```

## Submit your solution for the simulation contest
@@ -195,7 +192,7 @@
```bash
roslaunch ocrtoc_solution commit_solution.launch
# In terminal 3
sudo docker exec -it ocrtoc_container bash
source /root/catkin_ws/install/setup.bash
- roslaunch ocrtoc_task trigger_and_score.launch simulator:=gazebo scene:=1-1
+ roslaunch ocrtoc_task trigger_and_evaluation.launch simulator:=gazebo scene:=1-1
# 4. Export a docker image from the docker container.
sudo docker commit ocrtoc_container your_submission_docker_image_name
```
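
As a quick sanity check (not part of the official steps), you can confirm the committed image exists locally before uploading:

```bash
# The image name matches the placeholder used in the commit command above.
sudo docker images | grep your_submission_docker_image_name
```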
@@ -0,0 +1,189 @@
o convex_0
v -0.029227 -0.004752 -0.073315
v 0.033142 0.002542 0.078614
v 0.033142 -0.001433 0.078614
v -0.032540 -0.000108 0.079283
v 0.002620 0.017138 -0.076641
v 0.008596 -0.017358 -0.077294
v -0.010644 -0.019349 0.078614
v -0.016620 0.017808 0.076625
v 0.025847 0.001217 -0.076625
v 0.018546 0.017805 0.076625
v -0.024576 0.009840 -0.075957
v 0.023191 -0.014704 0.076625
v -0.020595 -0.014038 -0.076625
v -0.026570 -0.012051 0.079267
v 0.020534 0.011169 -0.071978
v 0.023853 -0.009393 -0.062701
v -0.031215 0.007190 0.040787
v 0.005276 -0.020011 0.078630
v -0.009319 -0.018020 -0.075957
v 0.030492 0.008515 0.037477
v -0.010644 0.015813 -0.077946
v -0.004012 0.019796 0.044113
v -0.000693 0.009178 0.081288
v 0.032479 -0.004748 0.043445
v 0.018546 -0.008068 -0.081288
v -0.023914 -0.006740 -0.080604
v -0.030552 -0.008731 0.038798
v -0.025908 0.012497 0.078614
v 0.013902 0.009178 -0.081288
v -0.019270 -0.016691 0.040135
v -0.032540 0.001879 0.011585
v 0.007264 0.019796 0.075973
v 0.025847 0.013822 0.077962
v 0.014565 0.014489 -0.074652
v -0.029227 0.002542 -0.075304
v 0.015890 -0.018020 0.078630
v 0.033142 0.003208 0.042124
v -0.018607 0.013822 -0.074636
v 0.024516 -0.013375 0.038145
v -0.025245 0.013160 0.041456
v 0.003282 -0.007402 0.081288
v 0.019872 -0.012047 -0.076625
v 0.026510 -0.004086 -0.068000
v -0.017282 0.007190 -0.081288
v -0.010644 0.016480 -0.073983
v 0.025847 0.005862 -0.064021
v -0.033202 -0.003424 0.046755
v 0.023853 0.014485 0.042776
v -0.023914 -0.012051 -0.076625
v -0.026570 -0.012713 0.040135
v 0.007927 0.015813 0.080620
v -0.006662 -0.015366 -0.080620
v 0.010583 0.019133 0.044113
v 0.029166 -0.010059 0.077962
v 0.005939 -0.018020 -0.075973
v -0.002680 -0.020011 0.062016
v 0.015227 -0.015366 -0.062016
v -0.013300 -0.017358 -0.074636
v 0.029166 -0.009393 0.036156
v -0.019270 0.013160 0.080620
v -0.007325 0.019796 0.075973
v -0.013963 -0.004748 0.081288
v 0.031154 0.008515 0.077962
v -0.031215 0.006524 0.077294
f 17 31 64
f 15 9 29
f 21 5 29
f 9 25 29
f 7 14 30
f 15 29 34
f 29 5 34
f 17 11 35
f 26 1 35
f 1 31 35
f 31 17 35
f 2 3 37
f 3 24 37
f 21 11 38
f 12 36 39
f 11 17 40
f 28 8 40
f 17 28 40
f 8 38 40
f 38 11 40
f 3 2 41
f 7 18 41
f 36 3 41
f 18 36 41
f 25 9 42
f 6 25 42
f 16 39 42
f 24 16 43
f 9 37 43
f 37 24 43
f 42 9 43
f 16 42 43
f 11 21 44
f 21 29 44
f 29 25 44
f 35 11 44
f 26 35 44
f 5 21 45
f 22 5 45
f 38 8 45
f 21 38 45
f 9 15 46
f 15 20 46
f 20 37 46
f 37 9 46
f 14 4 47
f 1 27 47
f 27 14 47
f 31 1 47
f 4 31 47
f 20 15 48
f 10 33 48
f 33 20 48
f 34 10 48
f 15 34 48
f 1 26 49
f 26 13 49
f 27 1 49
f 14 27 50
f 30 14 50
f 13 30 50
f 49 13 50
f 27 49 50
f 10 32 51
f 33 10 51
f 41 2 51
f 23 41 51
f 6 19 52
f 19 13 52
f 25 6 52
f 13 26 52
f 44 25 52
f 26 44 52
f 5 22 53
f 32 10 53
f 22 32 53
f 34 5 53
f 10 34 53
f 24 3 54
f 36 12 54
f 3 36 54
f 12 39 54
f 19 6 55
f 6 36 55
f 36 18 55
f 18 7 56
f 7 19 56
f 19 55 56
f 55 18 56
f 36 6 57
f 39 36 57
f 6 42 57
f 42 39 57
f 13 19 58
f 19 7 58
f 7 30 58
f 30 13 58
f 16 24 59
f 39 16 59
f 24 54 59
f 54 39 59
f 8 28 60
f 28 4 60
f 23 51 60
f 60 51 61
f 32 22 61
f 45 8 61
f 22 45 61
f 51 32 61
f 8 60 61
f 4 14 62
f 14 7 62
f 7 41 62
f 41 23 62
f 60 4 62
f 23 60 62
f 20 33 63
f 2 37 63
f 37 20 63
f 33 51 63
f 51 2 63
f 4 28 64
f 28 17 64
f 31 4 64
15 changes: 15 additions & 0 deletions ocrtoc_materials_patch/models/conditioner/meshes/textured.mtl
@@ -0,0 +1,15 @@
# 3ds Max Wavefront OBJ Exporter v0.97b - (c)2007 guruware
# File Created: 19.08.2020 10:57:47

newmtl _texture
Ns 30.0000
Ni 1.5000
d 1.0000
Tr 0.0000
Tf 1.0000 1.0000 1.0000
illum 2
Ka 0.0000 0.0000 0.0000
Kd 1.0000 1.0000 1.0000
Ks 0.0000 0.0000 0.0000
Ke 0.0000 0.0000 0.0000
map_Kd textured_map.jpg
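
For context, a textured Wavefront OBJ binds this material by name. The two directives below are standard OBJ syntax, shown only as an illustration; the matching textured.obj from this commit is not reproduced in this excerpt:

```
mtllib textured.mtl   # load the material library defined above
usemtl _texture       # apply the material; map_Kd supplies textured_map.jpg
```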