Attention-based bidirectional long short-term memory networks have attracted increasing interest and are widely used in Natural Language Processing tasks. Motivated by the performance of the attention mechanism, various attentive models have been proposed to improve the effectiveness of question answering. However, little research has focused on the impact of positional information on question answering, even though it has proved effective in information retrieval. In this paper, we assume that if a word appears in both the question sentence and the answer sentence, words close to it should receive more attention, since they are more likely to contain valuable information for answering the question. Moreover, little research has incorporated part-of-speech information into question answering. We argue that words other than nouns, verbs, and pronouns tend to carry less useful information, so their positional impact can be neglected. Based on these two assumptions, we propose a part-of-speech and position attention mechanism based bidirectional long short-term memory network for question answering, abbreviated as DPOS-ATT-BLSTM, which cooperates with the traditional attention mechanism to obtain attentive answer representations. We experiment on a Chinese medical dataset collected from http://www.xywy.com/ and http://www.haodf.com/, and compare our model against methods based on the traditional attention mechanism. The experimental results demonstrate the good performance and efficiency of our proposed model.
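To make the two assumptions concrete, the following is a minimal sketch of how answer words might be weighted by their distance to words shared with the question, while non-noun/verb/pronoun words keep a neutral weight. The exponential distance decay, the `(token, POS)` input format, and the tag set are illustrative assumptions, not the paper's exact weighting function.

```python
# Illustrative sketch of the position- and POS-based weighting idea.
# The decay function and POS tag names are assumptions for illustration.

CONTENT_POS = {"NOUN", "VERB", "PRON"}  # assumed tag set for "useful" words

def position_weights(question_tokens, answer_tagged, decay=0.5):
    """Weight each answer word by proximity to words shared with the question.

    answer_tagged: list of (token, pos) pairs for the answer sentence.
    Words whose POS is not a noun, verb, or pronoun get a neutral weight 1.0,
    reflecting the assumption that their positional impact can be neglected.
    """
    overlap = {tok for tok, _ in answer_tagged} & set(question_tokens)
    anchors = [i for i, (tok, _) in enumerate(answer_tagged) if tok in overlap]
    weights = []
    for i, (tok, pos) in enumerate(answer_tagged):
        if pos not in CONTENT_POS or not anchors:
            weights.append(1.0)
        else:
            d = min(abs(i - j) for j in anchors)
            weights.append(1.0 + decay ** d)  # closer to a shared word -> larger weight
    return weights
```

Such weights could then be combined multiplicatively with standard attention scores before normalization; the exact combination used in DPOS-ATT-BLSTM is defined in the body of the paper.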