One million English sentences, each split into two sentences that together preserve the original meaning, extracted from Wikipedia edits.
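Since WikiSplit pairs each original sentence with its two-sentence rewrite, a record can be parsed into a (complex sentence, list of simpler sentences) pair. A minimal sketch, assuming the released TSV format where the complex sentence is followed by a tab and the two split sentences joined by a `<::::>` separator (verify against the actual data files; the helper name is illustrative):

```python
# Parse a WikiSplit-style TSV line into the original sentence
# and its two simpler split sentences.
# Assumption: fields are "<complex>\t<split1> <::::> <split2>",
# matching the separator used in the WikiSplit release.

def parse_wiki_split_line(line: str) -> tuple[str, list[str]]:
    complex_sent, simple = line.rstrip("\n").split("\t")
    splits = [s.strip() for s in simple.split("<::::>")]
    return complex_sent, splits

example = ("The couch is big , and it is red .\t"
           "The couch is big . <::::> It is red .")
original, parts = parse_wiki_split_line(example)
print(original)  # → The couch is big , and it is red .
print(parts)     # → ['The couch is big .', 'It is red .']
```

Because the split sentences together preserve the original meaning, pairs like this are commonly used to train and evaluate sentence-splitting (split-and-rephrase) models.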
Report bugs or request features on the wiki-split GitHub issue tracker.