[–] pinkapple@lemmy.ml 2 points 12 hours ago (4 children)

Nobody is scraping Wikipedia over and over to build AI training datasets; there are already open datasets and API deals. And Wikipedia in particular has always published a full database dump, regenerated twice a month.

https://dumps.wikimedia.org/
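
For anyone curious, here's a minimal sketch of pulling one of those dump files directly. The exact filename is just an example; check the directory listing (e.g. https://dumps.wikimedia.org/enwiki/latest/) for whatever is current.

```python
# Stream a Wikipedia dump file to disk without loading it all into memory.
# The filename below is an example; browse the dumps site for the current one.
import requests

DUMP_URL = "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"

with requests.get(DUMP_URL, stream=True) as resp:
    resp.raise_for_status()
    with open("enwiki-latest-pages-articles.xml.bz2", "wb") as out:
        for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
            out.write(chunk)
```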