Description
When the GitHub API rate limit is exceeded, the response body is an error message rather than an archive, but the script still pipes it to tar and tries to decompress it.
Current code that fails to extract:
curl --silent --location --max-time "${TIMEOUT}" "${LINK}" | tar zxf - || {
    echo "Error downloading"
    exit 2
}
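
For context, a minimal way to reproduce the misleading failure; the JSON body below is only an illustration of what a rate-limited API response might look like, not the exact payload:

    # Piping a non-gzip body (e.g. a rate-limit error message) into tar fails with
    # something like "gzip: stdin: not in gzip format", and the script then prints
    # "Error downloading", hiding the real cause.
    printf '%s' '{"message": "API rate limit exceeded"}' | tar zxf - || echo "Error downloading"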
Suggestion:
# --fail makes curl exit non-zero on HTTP errors such as a rate-limit response,
# so the failure is caught before anything is handed to tar
curl --silent --location --fail --max-time "${TIMEOUT}" "${LINK}" -o "$GH_REPO_BIN" || {
    echo "Error downloading"
    exit 2
}
tar zxf "$GH_REPO_BIN" || exit 2
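
A slightly more defensive variant of the same idea, kept as a sketch: the temporary file, trap, and error messages below are illustrative assumptions, not part of the original script.

    GH_REPO_BIN="$(mktemp)" || exit 2     # assumed temp file; the real script may define GH_REPO_BIN elsewhere
    trap 'rm -f "$GH_REPO_BIN"' EXIT      # remove the downloaded archive on exit
    curl --silent --show-error --location --fail --max-time "${TIMEOUT}" "${LINK}" -o "$GH_REPO_BIN" || {
        echo "Error downloading ${LINK} (possibly GitHub API rate limit exceeded)"
        exit 2
    }
    tar zxf "$GH_REPO_BIN" || {
        echo "Error extracting downloaded archive"
        exit 2
    }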