PyTorch-Batch-Attention-Seq2seq

AuCson

PyTorch implementation of batched bi-RNN encoder and attention-decoder.

275 stars · 49 forks · Python

How to clone PyTorch-Batch-Attention-Seq2seq

Clone via HTTPS

git clone https://github.com/AuCson/PyTorch-Batch-Attention-Seq2seq.git

Clone via SSH

git clone git@github.com:AuCson/PyTorch-Batch-Attention-Seq2seq.git

Download ZIP

Download master.zip: https://github.com/AuCson/PyTorch-Batch-Attention-Seq2seq/archive/master.zip

Found an issue?

Report bugs or request features on the PyTorch-Batch-Attention-Seq2seq issue tracker:

Open GitHub Issues: https://github.com/AuCson/PyTorch-Batch-Attention-Seq2seq/issues