Sentiment analysis is an important task in natural language processing (NLP), and acquiring high-quality word representations is key to performing it well. In particular, the same word can carry different meanings in different sentences, a distinction that a model should capture but that traditional static word embeddings handle poorly. In this paper, we propose a BERT (Bidirectional Encoder Representations from Transformers) + BiGRU (Bidirectional Gated Recurrent Unit) model: words are first mapped to vectors by BERT, yielding contextualized embeddings, and sentiment analysis is then performed by a BiGRU. Experimental results show that our model achieves the best performance among the compared methods.
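The pipeline described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the random matrix `X` stands in for BERT's contextualized token embeddings (768 dimensions, matching BERT-base), and all parameter names and sizes are illustrative assumptions. The BiGRU reads the sequence in both directions, and the concatenated final hidden states feed a linear sentiment head.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    # Standard GRU update: update gate z, reset gate r, candidate state h_tilde.
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h + p["bz"])
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h + p["br"])
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h) + p["bh"])
    return (1 - z) * h + z * h_tilde

def init_params(d_in, d_h, rng):
    # Small random weights for one GRU direction (illustrative, untrained).
    p = {}
    for g in ("z", "r", "h"):
        p["W" + g] = rng.standard_normal((d_h, d_in)) * 0.1
        p["U" + g] = rng.standard_normal((d_h, d_h)) * 0.1
        p["b" + g] = np.zeros(d_h)
    return p

def bigru_encode(X, fwd, bwd, d_h):
    # Forward direction over the token sequence.
    h = np.zeros(d_h)
    for x in X:
        h = gru_step(x, h, fwd)
    h_f = h
    # Backward direction over the reversed sequence.
    h = np.zeros(d_h)
    for x in X[::-1]:
        h = gru_step(x, h, bwd)
    h_b = h
    # Concatenate the two final states as the sentence representation.
    return np.concatenate([h_f, h_b])

rng = np.random.default_rng(0)
d_emb, d_h, seq_len = 768, 64, 12     # 768 matches BERT-base hidden size
X = rng.standard_normal((seq_len, d_emb))  # stand-in for BERT output
fwd, bwd = init_params(d_emb, d_h, rng), init_params(d_emb, d_h, rng)
W_out = rng.standard_normal((2, 2 * d_h)) * 0.1  # binary sentiment head
logits = W_out @ bigru_encode(X, fwd, bwd, d_h)
probs = np.exp(logits - logits.max())
probs /= probs.sum()                  # softmax over {negative, positive}
print(probs.shape)
```

In a real system, `X` would come from a pretrained BERT encoder (e.g. via the `transformers` library) and all weights would be learned end-to-end; the sketch only shows how contextual embeddings flow into a bidirectional GRU and a classification layer.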