Expand README, check if URLs are found before trying to iterate

Alex Thomassen 2023-07-01 23:00:52 +00:00
parent d6d1d7ac96
commit 80073d5830
Signed by: Alex
GPG Key ID: 10BD786B5F6FF5DE
2 changed files with 8 additions and 1 deletion


@@ -11,6 +11,11 @@ fi
 for json in $(ls *.json);
 do
     urls="$(cat $json | jq -r '.gfycats[] | .mp4Url')";
+    if [[ -z "${urls}" ]]; then
+        echo "No urls found in ${json}";
+        continue;
+    fi
+
     for url in $urls;
     do
         echo "Downloading $url";

README.md

@@ -1,6 +1,8 @@
 # GfycatGrab
-Budget way of grabbing all Gfycats from your account.
+Budget way of grabbing all Gfycats from your account.
+I haven't tested this very much. Worked fine with my account, since I only have 83 GIFs saved total, so I only had to save one request from my browser.
+In theory the script will scan every JSON file you have in the directory you run it in, and download every GIF in each JSON file (assuming you follow the steps).
 
 ## Requirements
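
Since the README points users at a request saved from their browser, here is a hypothetical illustration of the input shape the script expects, derived only from the jq filter in the script (.gfycats[] | .mp4Url). The file name, URLs, and every other detail below are made up.

    # Hypothetical example.json matching the shape the jq filter expects.
    cat > example.json <<'EOF'
    {
      "gfycats": [
        { "mp4Url": "https://example.com/FirstClip.mp4" },
        { "mp4Url": "https://example.com/SecondClip.mp4" }
      ]
    }
    EOF

    # Extract the download URLs the same way the script does:
    jq -r '.gfycats[] | .mp4Url' example.json
    # https://example.com/FirstClip.mp4
    # https://example.com/SecondClip.mp4

If a file's gfycats array is empty, the filter prints nothing, which is exactly the case the new -z check catches.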