Hi. I'm considering generating a graph of all references in Wikipedia using WBI.
I want to generate it on disk first and then upload it to Wikibase using WBI.
I'm thinking the following algorithm would work:
Then loop over all references and, for each unique reference, upload its JSON to Wikibase and store the resulting wcdqid in Redis (key=unihash, value=wcdqid).
Then loop over all articles and finish generating each item using its unihash list, fetching the wcdqid for each hash from Redis. Upload up to 500 references on an article in one go, and add any surplus references with addClaim.
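Here's a minimal sketch of the two upload passes, assuming WBI v0.12's entity API (`wbi.item.new()` / `item.write()`) and the redis-py client. The `unihash()` helper, the on-disk JSON layout, and the endpoint/credentials are placeholders I made up for illustration, not part of the plan above:

```python
import json
import hashlib
from pathlib import Path

import redis
from wikibaseintegrator import WikibaseIntegrator, wbi_login
from wikibaseintegrator.wbi_config import config as wbi_config

wbi_config['MEDIAWIKI_API_URL'] = 'https://example-wikibase.org/w/api.php'  # assumed endpoint
login = wbi_login.Login(user='bot-user', password='bot-password')  # placeholder credentials
wbi = WikibaseIntegrator(login=login)
r = redis.Redis()

def unihash(reference: dict) -> str:
    """Hypothetical helper: a stable hash over the reference's JSON."""
    return hashlib.sha256(json.dumps(reference, sort_keys=True).encode()).hexdigest()

# Pass 1: upload each unique reference once, remember its wcdqid in Redis.
for path in Path('references/').glob('*.json'):  # assumed on-disk layout
    reference = json.loads(path.read_text())
    key = unihash(reference)
    if not r.exists(key):                        # skip references already uploaded
        item = wbi.item.new()
        item.labels.set(language='en', value=reference['title'])  # assumed field
        wcdqid = item.write().id
        r.set(key, wcdqid)

# Pass 2: build each article item, resolving reference hashes via Redis.
for path in Path('articles/').glob('*.json'):    # assumed on-disk layout
    article = json.loads(path.read_text())
    wcdqids = [r.get(h).decode() for h in article['unihashes']]
    # ... attach up to 500 reference claims in the initial write,
    # then add any surplus with separate addClaim calls ...
```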
Does this seem like an efficient way to pre-generate a graph?
Any ideas for improvement?
We could store everything in Redis, but I'm unsure how big the database would get. Since SSDB exists for Redis-compatible workloads that exceed memory size by caching to disk, we could fall back to it if Redis fails because of the total size.
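To get a feel for whether the whole unihash → wcdqid map fits in memory, here is a rough back-of-envelope estimate; every number is an assumption to be replaced with real counts, not a measurement:

```python
# Rough size estimate for the unihash -> wcdqid map in Redis.
refs = 200_000_000   # assumed number of unique references across Wikipedia
key_bytes = 64       # a SHA-256 unihash stored as a hex string
value_bytes = 12     # a wcdqid such as "Q123456789"
overhead = 50        # rough per-entry Redis overhead (dict entry, string headers)

total_gib = refs * (key_bytes + value_bytes + overhead) / 2**30
print(f"~{total_gib:.0f} GiB")  # ~23 GiB under these assumptions
```

If the real number lands well above available RAM, the SSDB fallback (or shorter binary keys instead of hex strings) would make sense.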