Implementation of abstractive summarization using an LSTM-based encoder-decoder architecture with local attention.
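To illustrate the local-attention step of the decoder, here is a minimal NumPy sketch of Luong-style local-p attention: the decoder state predicts an alignment position, attention is restricted to a window around it, and the scores are reweighted by a Gaussian centered on the predicted position. All names (`local_attention`, `W_p`, `v_p`, the window size) are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def local_attention(dec_state, enc_states, W_p, v_p, window=2):
    """Luong-style local-p attention (sketch, not the repo's exact code).

    dec_state:  (dim,) current decoder hidden state
    enc_states: (S, dim) encoder hidden states
    W_p, v_p:   learned parameters predicting the alignment position
    """
    S = enc_states.shape[0]
    # Predicted alignment position p_t in [0, S), via sigmoid
    p_t = S / (1.0 + np.exp(-float(v_p @ np.tanh(W_p @ dec_state))))
    lo = max(0, int(p_t) - window)
    hi = min(S, int(p_t) + window + 1)
    # Dot-product scores only inside the local window
    scores = enc_states[lo:hi] @ dec_state
    align = softmax(scores)
    # Gaussian reweighting centered at p_t, sigma = window / 2
    positions = np.arange(lo, hi)
    align = align * np.exp(-((positions - p_t) ** 2) / (2 * (window / 2) ** 2))
    align = align / align.sum()
    context = align @ enc_states[lo:hi]  # (dim,) context vector
    return context, align

# Toy usage with random states (dimensions are illustrative)
np.random.seed(0)
enc = np.random.randn(10, 8)
dec = np.random.randn(8)
W_p = np.random.randn(8, 8)
v_p = np.random.randn(8)
ctx, align = local_attention(dec, enc, W_p, v_p)
```

The context vector `ctx` is then concatenated with the decoder state to predict the next summary token; restricting attention to a window makes the mechanism cheaper than global attention on long inputs.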
Exploration of BERT-BiLSTM models with Layer Aggregation and Hidden-State Aggregation, each in attention-based and capsule-routing-based variants.
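As a sketch of the attention-based Layer Aggregation idea: instead of using only BERT's final layer, learn scalar attention weights over all transformer layers and feed their weighted sum to the BiLSTM. The function name, shapes, and parameters below are assumptions for illustration; the capsule-routing variant would replace the learned softmax weights with iterative routing-by-agreement.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_layer_aggregation(layer_states, w):
    """Attention-based layer aggregation (sketch).

    layer_states: (L, seq, dim) hidden states from the L transformer layers
    w:            (L,) learnable logits scoring each layer's usefulness
    Returns a (seq, dim) aggregated representation for the downstream BiLSTM.
    """
    alpha = softmax(w)  # (L,) normalized weights over layers
    # Weighted sum over the layer axis
    return np.tensordot(alpha, layer_states, axes=(0, 0))

# Toy usage: 12 layers, sequence length 6, hidden size 16 (illustrative sizes)
np.random.seed(1)
states = np.random.randn(12, 6, 16)  # stand-in for BERT's per-layer outputs
w = np.random.randn(12)              # learnable layer-attention logits
agg = attention_layer_aggregation(states, w)
```

Hidden-State Aggregation applies the same idea along the sequence axis instead, pooling the per-token states of one layer (or of the BiLSTM output) into a single fixed-size vector.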