Code for Generating Chinese Classical Poems with RNN Encoder-Decoder (CCL 2017). More related resources are available at THUAIPoet.
All rights reserved.
We built our poetry generator on GroundHog. For more details of the gated-unit RNN model, please refer to https://github.com/lisa-groundhog/GroundHog.
All required source code is in the directory GroundHogAttention. We made a few changes to the source code, so it is not identical to the original GroundHog.
Please add GroundHogAttention to your Python path, then preprocess your poetry corpus with the tools in GroundHogAttention/experiments/nmt/preprocess. For more details, please refer to https://github.com/lisa-groundhog/GroundHog.
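For example, a minimal sketch of putting the modified GroundHog on the Python path (the checkout location used below is an assumption; adjust it to your setup):

```python
# Minimal sketch: make the modified GroundHog importable from your scripts.
# The path below is an assumption; point it at your local GroundHogAttention checkout.
import os
import sys

GROUNDHOG_ATTENTION_DIR = os.path.abspath("GroundHogAttention")
if GROUNDHOG_ATTENTION_DIR not in sys.path:
    sys.path.insert(0, GROUNDHOG_ATTENTION_DIR)

# Equivalently, set PYTHONPATH in the shell before running the scripts:
#   export PYTHONPATH=/path/to/GroundHogAttention:$PYTHONPATH
```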
Put your training data (mainly the vocabulary), the model file (e.g., search_model.npz), and the config file (e.g., search_state.pkl) in GroundHogAttention/experiments/nmt/SPB/, then run generate.py to generate poetry lines.
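Before running generate.py, you may want to check that these files are in place and readable. A small sanity-check sketch (the directory and file names follow the examples above and are assumptions; the actual generation is done by generate.py):

```python
# Sanity check only: verify that the config and model files can be loaded.
# Directory and file names follow the examples above and are assumptions.
import pickle
import numpy

MODEL_DIR = "GroundHogAttention/experiments/nmt/SPB/"

with open(MODEL_DIR + "search_state.pkl", "rb") as f:
    # Under Python 3 you may need pickle.load(f, encoding="latin1"),
    # since GroundHog was written for Python 2.
    state = pickle.load(f)                            # configuration dictionary
params = numpy.load(MODEL_DIR + "search_model.npz")   # trained parameters

print("config entries:", len(state))
print("parameter arrays:", len(params.files))
```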
The other blocks described in our paper, such as CPB and WPB, can be obtained simply by changing the training data.
This work has been integrated into the automatic poetry generation system **THUAIPoet (Jiuge, 九歌)**, which is available at https://jiuge.thunlp.cn. The system is developed by the Research Center for Natural Language Processing, Computational Humanities and Social Sciences, Tsinghua University (清华大学人工智能研究院与社会人文计算研究中心). Please refer to THUAIPoet, THUNLP, and the THUNLP Lab for more information.
Xiaoyuan Yi, Ruoyu Li, and Maosong Sun. 2017. Generating Chinese Classical Poems with RNN Encoder-Decoder. In Proceedings of the Sixteenth Chinese Computational Linguistics Conference (CCL 2017), pages 211–223, Nanjing, China.
If you have any questions, suggestions, or bug reports, please email [email protected] or [email protected].