In this article we will discuss a very interesting topic in natural language processing (NLP): Neural Machine Translation (NMT) using an attention model. Machine translation is simply the automatic translation of text from one language to another.
Here we will learn how to use the sequence-to-sequence (seq2seq) architecture with Bahdanau's attention mechanism for NMT.
This article assumes that you understand the following:
Before going through the code, we will briefly discuss Bidirectional LSTMs and the attention mechanism.
If you understand LSTMs, then a Bidirectional LSTM is simple: it runs two LSTMs over the input sequence, one left to right and one right to left, and concatenates their hidden states at each time step, so every position sees both past and future context.
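To make the idea concrete, here is a minimal sketch of a bidirectional pass. It uses a simple tanh RNN cell instead of a full LSTM purely for brevity; the function and weight names (`rnn_pass`, `W_x`, `W_h`) are made up for this illustration, not taken from any library.

```python
import numpy as np

def rnn_pass(inputs, W_x, W_h):
    """Run a simple tanh RNN over `inputs`, returning the hidden state
    at every time step (stands in for one direction of an LSTM)."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(W_x @ x + W_h @ h)
        states.append(h)
    return states

rng = np.random.default_rng(0)
seq = [rng.standard_normal(4) for _ in range(3)]   # 3 time steps, input dim 4
W_x = rng.standard_normal((5, 4))                  # hidden size 5
W_h = rng.standard_normal((5, 5))

forward = rnn_pass(seq, W_x, W_h)                  # left-to-right pass
backward = rnn_pass(seq[::-1], W_x, W_h)[::-1]     # right-to-left, re-aligned

# The bidirectional state at each step concatenates both directions.
bi_states = [np.concatenate([f, b]) for f, b in zip(forward, backward)]
print(bi_states[0].shape)  # (10,) — forward (5) + backward (5)
```

In a real model you would use two LSTM layers (e.g. Keras's `Bidirectional` wrapper) rather than this toy cell, but the concatenation idea is the same.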
Machine translation is one of the earliest and most challenging tasks for computers, due to the fluidity of human language.
In this article we will briefly discuss the encoder-decoder architecture, and then walk through the code for neural machine translation. This is going to be a fun ride.
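Before the full walkthrough, the core of Bahdanau's attention can be sketched in a few lines. This is a minimal illustration, assuming hypothetical weight matrices `W_enc`, `W_dec` and vector `v` (randomly initialized here, not trained, and not tied to any library's API): the decoder state is scored against every encoder state, the scores are softmaxed into attention weights, and the weighted sum of encoder states becomes the context vector.

```python
import numpy as np

def bahdanau_attention(dec_state, enc_states, W_enc, W_dec, v):
    # Additive score for each encoder state h_i:
    #   score_i = v . tanh(W_dec @ s + W_enc @ h_i)
    scores = np.array([v @ np.tanh(W_dec @ dec_state + W_enc @ h)
                       for h in enc_states])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over encoder steps
    context = (weights[:, None] * enc_states).sum(axis=0)  # weighted sum
    return context, weights

rng = np.random.default_rng(1)
enc_states = rng.standard_normal((4, 6))  # 4 encoder steps, hidden size 6
dec_state = rng.standard_normal(6)        # current decoder hidden state
W_enc = rng.standard_normal((8, 6))       # attention layer size 8 (made up)
W_dec = rng.standard_normal((8, 6))
v = rng.standard_normal(8)

context, weights = bahdanau_attention(dec_state, enc_states, W_enc, W_dec, v)
print(weights.sum())   # the weights form a probability distribution
```

At each decoding step the context vector is fed into the decoder alongside the previous output, which lets the model "look back" at the most relevant source words.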
Before we go ahead, you should know about the following: