Poetry and couplets, as a valuable part of the human cultural heritage, carry traditional Chinese culture, and their automatic generation is a challenging task for NLP. This paper proposes a new multi-task neural network model for the automatic generation of poetry and couplets. The model uses a seq2seq encoder-decoder structure that combines an attention mechanism, a self-attention mechanism, and multi-task parameter sharing. The encoder consists of two shared BiLSTM networks that learn the characteristics common to ancient poems and couplets: one encodes the keywords and the other encodes the poem or couplet sentences generated so far. The decoder parameters are not shared: two separate LSTM networks decode the output for ancient poems and for couplets, respectively, in order to preserve their distinct semantic and grammatical features. Because poetry and couplets share many characteristics, multi-task learning can extract more features from the related tasks and make the model generalize better; accordingly, our multi-task model generates poems and couplets significantly better than a single-task model. The model also introduces a self-attention mechanism to learn the dependencies and internal structure of words within sentences. Finally, the effectiveness of the method is verified by both automatic and manual evaluations.
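The multi-task structure described above (a shared BiLSTM encoder with two task-specific LSTM decoders) can be sketched as follows. This is an illustrative NumPy sketch of hard parameter sharing, not the authors' implementation: all dimensions, the random initialization, and the simplified dot-product attention are assumptions, and the self-attention layer and training loop are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative only; the paper does not report these).
VOCAB, EMB, HID = 50, 8, 16

def lstm_step(x, h, c, W):
    """One LSTM step; W packs the four gate weight matrices."""
    z = W @ np.concatenate([x, h])
    i, f, o, g = np.split(z, 4)
    i, f, o = 1 / (1 + np.exp(-i)), 1 / (1 + np.exp(-f)), 1 / (1 + np.exp(-o))
    c = f * c + i * np.tanh(g)
    h = o * np.tanh(c)
    return h, c

def init_lstm(in_dim, hid=HID):
    # Bias-free gates for brevity; shape (4*hid, in_dim + hid).
    return rng.normal(scale=0.1, size=(4 * hid, in_dim + hid))

def bilstm_encode(tokens, emb, W_fwd, W_bwd, hid=HID):
    """Shared BiLSTM encoder: forward/backward states concatenated per token."""
    xs = [emb[t] for t in tokens]
    def run(seq, W):
        h, c, out = np.zeros(hid), np.zeros(hid), []
        for x in seq:
            h, c = lstm_step(x, h, c, W)
            out.append(h)
        return out
    fwd = run(xs, W_fwd)
    bwd = run(xs[::-1], W_bwd)[::-1]
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

def decode(enc_states, W_dec, steps=4, hid=HID):
    """Task-specific LSTM decoder with a toy dot-product attention."""
    h, c, outs = np.zeros(hid), np.zeros(hid), []
    H = np.stack(enc_states)                    # (T, 2*HID)
    for _ in range(steps):
        scores = H @ np.concatenate([h, h])     # simplified attention query
        w = np.exp(scores - scores.max())
        w /= w.sum()
        context = w @ H                         # attention-weighted context
        h, c = lstm_step(context, h, c, W_dec)
        outs.append(h)
    return outs

# Hard parameter sharing: one shared encoder, two task-specific decoders.
emb = rng.normal(size=(VOCAB, EMB))
enc_fwd, enc_bwd = init_lstm(EMB), init_lstm(EMB)   # shared by both tasks
dec_poem = init_lstm(2 * HID)                       # poem decoder only
dec_couplet = init_lstm(2 * HID)                    # couplet decoder only

keywords = [3, 7, 11]                               # toy keyword token ids
enc = bilstm_encode(keywords, emb, enc_fwd, enc_bwd)
poem_out = decode(enc, dec_poem)
couplet_out = decode(enc, dec_couplet)
```

Both decoders read the same shared encoding of the keywords, but their separate weights let each task keep its own semantic and grammatical regularities, which is the point of not sharing the decoder parameters.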