
I'm looking for the Wikipedia (enwiki-latest-pages-articles-multistream.xml.bz2) and Wikidata (latest-all.json.bz2) dumps. The files are quite large (the latter is ~47 GB) and my internet connection tends to drop while downloading.

I'm therefore looking for torrent files. I found one for Wikipedia, but I cannot find one for Wikidata. Does one exist?

Patrick Hoefler
dzieciou
    "Try downloading from the command line using wget --continue url. Creating a torrent is difficult because the file needs to be updated often." From https://t.me/joinchat/AZriqUj5UagVMHXYzfZFvA – Stanislav Kralin Mar 17 '20 at 13:47
  •
    These answers may be old (2013), but take a look: https://opendata.stackexchange.com/q/107/1511 – philshem Mar 17 '20 at 20:26
  •
    I think it would be worth creating a torrent, e.g. quarterly. For some, like me, a 3-month-old dump would be almost as useful as the latest one, as long as the data structure hasn't changed. If I can prove that my concept works – for me, finding the relationships (parent-child, teacher-pupil) between composers – I can download the latest version after that. – Arpad Horvath Feb 03 '22 at 09:00
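The `wget --continue` suggestion from the first comment can be wrapped in a retry loop so the download survives repeated connection drops. A minimal sketch; the URL shown is the standard Wikidata dump location mentioned in the question, but verify the current path on dumps.wikimedia.org before use:

```shell
# Sketch of a resumable download loop.
# --continue (-c) makes wget resume a partial file instead of restarting;
# the until-loop re-invokes wget whenever the connection drops.

download_with_resume() {
    # $1: URL to fetch; retries until wget exits successfully
    until wget --continue --tries=0 --timeout=60 "$1"; do
        echo "Connection dropped, retrying in 10s..." >&2
        sleep 10
    done
}

# Usage (not run here; the file is tens of gigabytes):
# download_with_resume "https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.json.bz2"
```

Because wget keeps the partial file on disk, the loop only re-downloads the missing tail of the file, not the whole dump.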

1 Answer


They may have changed their site, but the link you shared now has magnet links to torrents near the bottom.

James Risner