To use this package, first set up your catkin workspace according to the instructions at
https://github.com/utexas-bwi/bwi
Next, navigate to catkin_ws/src/bwi_common/ and check out the branch with the static fact query service implemented:
$ cd catkin_ws/src/bwi_common/
$ git checkout jesse/static_fact_query
Next, set up the nlu_pipeline by navigating to catkin_ws/src and cloning this repository:
$ cd ../
$ git clone https://github.com/thomason-jesse/nlu_pipeline.git
Finally, return to the workspace root, source the workspace, and build the package:
$ cd ../
$ source devel/setup.bash
$ catkin_make
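(Optional) To check that ROS can see the new package once the build finishes, you can run the standard rospack lookup; this is just a sanity check, not a required step:
$ rospack find nlu_pipeline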
Running the pipeline differs slightly between simulation and the segbot.
SIMULATION:
Launch the segbot simulator.
$ roslaunch bwi_launch simulation.launch
The simulator has to be launched so that the robot can execute actions given to it in dialogue. It also
launches the ASP knowledge base used to ground predicates such as person(ray).
SEGBOT:
Turn on the base and launch the v2 platform:
$ roslaunch bwi_launch segbot_v2.launch
Localize the robot, then launch the knowledge representation and execution node so that KB facts can be grounded:
$ roslaunch bwi_kr_execution bwi_kr_execution.launch
TESTING PIPELINE:
In a new terminal tab, from catkin_ws/, you can test the pipeline with
$ rosrun nlu_pipeline test_pipeline.py [ontology text file] [lexicon text file] [training text file]
For example:
$ rosrun nlu_pipeline test_pipeline.py src/nlu_pipeline/src/resources/fake_to_real/ont.txt src/nlu_pipeline/src/resources/fake_to_real/lex.txt src/nlu_pipeline/src/resources/fake_to_real/utterance_sem_forms.txt
TESTING POMDP:
$ rosrun nlu_pipeline test_pomdp_pipeline.py src/nlu_pipeline/src/resources/fake_to_real/ont.txt src/nlu_pipeline/src/resources/fake_to_real/lex.txt src/nlu_pipeline/src/resources/fake_to_real/utterance_sem_forms.txt
POMDP AMT EXPERIMENT:
$ rosrun nlu_pipeline DialogueServer.py src/nlu_pipeline/src/resources/fake_to_real/ont.txt src/nlu_pipeline/src/resources/fake_to_real/lex.txt src/nlu_pipeline/src/resources/fake_to_real/utterance_sem_forms.txt true
TRAINING POMDP:
$ rosrun nlu_pipeline train_pomdp_pipeline.py src/nlu_pipeline/src/resources/fake_to_real/ont.txt src/nlu_pipeline/src/resources/fake_to_real/lex.txt src/nlu_pipeline/src/resources/fake_to_real/utterance_sem_forms.txt
ROSBRIDGE SERVER:
$ roslaunch rosbridge_server rosbridge_websocket.launch
TRAIN PARSER:
From catkin_ws/src/nlu_pipeline/src/bootstrapping, run
$ python train_parser_iterative.py ../resources/fake_to_real/ont.txt ../resources/fake_to_real/lex.txt ../resources/fake_to_real/utterance_sem_forms.txt ../tests/templated_examples/1000.txt
NOTE:
Right now there is an initial "testing Generator" prompt. You can skip it by typing "stop".
This is because the latest component has not yet been branched.
FORMAT OF VARIOUS LOGS:
1. executed_actions - Each file is named user_id.txt and contains a single line with the final action of the dialogue.
2. log - The text of the dialogue, named user_id.txt.
3. log_special - Files that contain aggregate data for an experiment, for example the list of user ids.
4. object_log - Contains a pickle named user_id.pkl for each dialogue (see the sketch after this list). The pickle is a tuple of
   - A string : one of 'only_parser_learning', 'only_dialog_learning', 'both_parser_and_dialog_learning'
   - A list of lists, each of which corresponds to a turn. Each turn is a list of
     - The system action, of type SystemAction, or the string 'take_action'
     - The text of the user response (None if the previous system action is of action_type 'take_action')
     - The final action understood, of type Action
     - Training data for the parser. This is a dict with keys
       - full - Text corresponding to the complete action
       - <param_name> - Value for the parameter <param_name>. Currently the parameter names in use are patient, recipient and location.
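The following is a minimal sketch (not part of the pipeline) of reading one object_log pickle, assuming the layout described above; the file name here is hypothetical, and unpickling requires the pipeline's SystemAction and Action classes to be importable on your path:

import pickle

# Hypothetical path; actual files are named after the user id.
with open('object_log/example_user_id.pkl', 'rb') as f:
    learning_mode, turns = pickle.load(f)

print(learning_mode)                  # one of the three learning-mode strings above
for system_action, user_text, final_action, train_data in turns:
    print(system_action, user_text, final_action)
    print(train_data['full'])         # text corresponding to the complete action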