Natural Language Processing with Attention Models

By: Coursera

  • Neural Machine Translation
    • Discover some of the shortcomings of a traditional seq2seq model and how to solve them by adding an attention mechanism, then build a Neural Machine Translation model with attention that translates English sentences into German.
  • Text Summarization
    • Compare RNNs and other sequential models to the more modern Transformer architecture, then create a tool that generates text summaries.
  • Question Answering
    • Explore transfer learning with state-of-the-art models like T5 and BERT, then build a model that can answer questions.
  • Chatbot
    • Examine some unique challenges Transformer models face and their solutions, then build a chatbot using a Reformer model.
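The attention mechanism referenced across these modules can be illustrated with scaled dot-product attention, the variant at the core of the Transformer. Below is a minimal NumPy sketch; the function names and toy tensor shapes are illustrative, not taken from the course materials:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row is a distribution over keys
    return weights @ V, weights

# Toy example: 2 decoder queries attending over 3 encoder states of width 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
context, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row of `context` is a weighted mix of the value vectors, which is how an attention-based translator lets every decoding step look back at all encoder states instead of a single fixed summary vector.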