I'm unsure whether this belongs here or in a feature request issue on the Philomena repo.
Implement a bulk endpoint for faves & upvotes (ID & hash only)
Rationale
To improve data portability by re-implementing the booru sync script, which
- seems to have been abandoned, and
- only has its thread still up on Ponerpics? I swear its author posted his scripts everywhere.
The script still somewhat works; however, it chokes completely when it hits an error while fetching the list of images to sync. That makes it useless for those of us with thousands of favorite images or updoots: it'll error out on, say, page 45 (at a rate of 50 images/page) and stop, instead of retrying or skipping the errored page.
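The fix the script needs is straightforward in principle. A minimal sketch of fault-tolerant pagination, assuming nothing about the real API: `fetch_page` is a hypothetical callable standing in for one HTTP request per page.

```python
def fetch_all(fetch_page, last_page, retries=3):
    """Collect every page of results, retrying each failed page a few
    times and then skipping it, instead of aborting the whole sync.

    fetch_page(page) -> list of items; raises on HTTP/network errors.
    Returns (items, skipped_pages)."""
    results, skipped = [], []
    for page in range(last_page + 1):
        for _attempt in range(retries):
            try:
                results.extend(fetch_page(page))
                break  # page fetched, move on
            except Exception:
                continue  # retry this page
        else:
            skipped.append(page)  # all retries failed: record and move on
    return results, skipped
```

Skipped pages can be reported at the end so the user knows the sync was partial rather than silently incomplete.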
Pseudocode of endpoint
```elixir
# for GET requests
def bulk_list(
  api_key,                       # whose lists to grab?
  list_type,                     # either :faves, :upvotes, :hidden, or :downvotes
  page \\ 0,                     # pagination
  apply_current_filter \\ false  # apply the user's current filter when retrieving results, or grab everything?
) do ...

# for PUT requests
def bulk_update(
  api_key, list_type,  # same as above
  hashes               # TODO: exact formatting
) do ...
```
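To make the intended contract concrete, here is a toy in-memory model of the two endpoints, in Python for readability. The data shapes, `PAGE_SIZE`, and the `"not found"` error string are all illustrative assumptions, not anything Philomena actually implements.

```python
PAGE_SIZE = 50  # assumed page size, matching the 50 images/page above

def bulk_list(user_lists, api_key, list_type, page=0):
    """GET: return one page of {image_id: hash} from the user's list."""
    entries = sorted(user_lists[api_key][list_type].items())
    start = page * PAGE_SIZE
    return dict(entries[start:start + PAGE_SIZE])

def bulk_update(site_images, user_list, hashes):
    """PUT: apply the operation to every image whose hash exists on this
    booru; report an error for each hash that doesn't resolve."""
    by_hash = {h: image_id for image_id, h in site_images.items()}
    errors = {}
    for h in hashes:
        if h in by_hash:
            user_list[by_hash[h]] = h  # e.g. mark the image as faved
        else:
            errors[h] = "not found"
    return errors
```

The point of the sketch: `bulk_list` keys by the source booru's image IDs, but `bulk_update` matches purely on hash, since IDs differ between boorus.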
bulk_list would return something like

```elixir
%{
  8675309 => "123fa69420abcde",
  69420 => "f16c98e2848c2f1bfff3985e8f1a54375cc49f78125391aeb80534ce011ead14e3e452a5c4bc98a66f56bdfcd07ef7800663b994f3f343c572da5ecc22a9660f"
}
```
as a minimal return. For a more comprehensive response, the entry for each image could be expanded to include the hash, source URLs, and tags (names only, no IDs).
bulk_update takes the output from bulk_list [or just the list of hashes?] and applies the appropriate operation to every image on the destination booru matching the supplied hashes [after resolving merged duplicates]. The response for bulk_update is a list of the hashes that errored, each with the corresponding error [deleted, totally not found, something else, etc.].
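A client consuming that error response would likely want to triage it rather than retry everything blindly. A minimal sketch, assuming hypothetical reason strings like `"deleted"` and `"not found"`:

```python
def triage_errors(errors):
    """Split bulk_update's per-hash error map into hashes that are
    permanently gone (drop them) and ones worth retrying later.
    The reason strings here are assumptions for illustration."""
    drop, retry = [], []
    for h, reason in errors.items():
        if reason in ("deleted", "not found"):
            drop.append(h)   # image will never exist here; stop trying
        else:
            retry.append(h)  # transient or unknown error; try again later
    return drop, retry
```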