Expand README, check if URLs are found before trying to iterate
parent d6d1d7ac96
commit 80073d5830
@@ -11,6 +11,11 @@ fi
 for json in $(ls *.json);
 do
     urls="$(cat $json | jq -r '.gfycats[] | .mp4Url')";
+    if [[ -z "${urls}" ]]; then
+        echo "No urls found in ${json}";
+        continue;
+    fi
+
     for url in $urls;
     do
         echo "Downloading $url";
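The new guard works because `jq -r` prints nothing when the `gfycats` array is empty, so the command substitution leaves `${urls}` empty and the `[[ -z ... ]]` test fires, skipping to the next file. A minimal sketch of the input shape the jq filter implies; any field beyond `.gfycats[].mp4Url` is an assumption, not the documented Gfycat API response:

    # Hypothetical example file matching the filter above.
    cat > example.json <<'EOF'
    {
      "gfycats": [
        { "mp4Url": "https://giant.gfycat.com/Example.mp4" }
      ]
    }
    EOF

    # Prints one URL per entry; prints nothing for "gfycats": [],
    # which is exactly the empty string the new -z check catches.
    jq -r '.gfycats[] | .mp4Url' example.json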
@@ -1,6 +1,8 @@
 # GfycatGrab
 
 Budget way of grabbing all Gfycats from your account.
 
+I haven't tested this very much. It worked fine with my account, since I only have 83 GIFs saved in total, so I only had to save one request from my browser.
+In theory the script will scan every JSON file in the directory you run it from, and download every GIF listed in each JSON file (assuming you follow the steps).
 
 ## Requirements
 
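Given the loop above and the README's description, a run would look roughly like this; the script filename here is hypothetical, since it isn't visible in the diff:

    # Save the JSON response(s) from your browser session into the same
    # directory, then run the script; it picks up every *.json file there.
    ls
    # gfycatgrab.sh  request.json
    bash gfycatgrab.sh
    # Downloading https://giant.gfycat.com/Example.mp4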