I'm trying to check each URL before downloading it. I'm not even sure I have the right approach, so I was hoping someone might shed some light.
I can print it all, I can grep, and I can download; I just can't get it all to come together. I'm new to this and still learning. For example:
````
grep -o '$i' $url
````
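I suspect the single quotes are part of the problem there, since they stop `$i` from expanding; with double quotes the variable at least gets through (whether pointing grep at `$url` makes sense is exactly what I can't work out):

````
# double quotes so $i expands; single quotes would pass the literal string $i
grep -o -- "$i" "$url"
````

And here is the loop I have so far: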
````file="/home/pi/blah"````
````n=1````
````while IFS= read -r i; do````
````if [[ "${i##*/}" = "\#*" ]]; then````
````printf '%s\n' "$n: #'${i##*/}': scheme missing" && continue````
````elif [[ "${i##*/}" = "${url##*/}" ]]; then````
````printf '%s\n' "$n: File '${i##*/}' already there; not retrieving" && continue````
````elif [[ ! -e "${i##*/}" ]]; then````
````printf '%s\n' "$n: '${i##*/}': No such file or directory" && continue````
````else````
````cat blah | while read wget -q --show-progress $url; done; fi````
````printf '%s\n' "$n: Downloading '${i##*/}' to ${file##*/}: Done"````
````n=$((n+1))````
````done < "$file"````
What I want is for each line either to download the file or just to state which condition stopped it.
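To make that concrete, here is a rough sketch of the behaviour I'm after, assuming /home/pi/blah holds one URL per line, that skipping blank/# lines and checking for an http(s) scheme are roughly the right conditions, and that wget is new enough (1.16+) for --show-progress:

````
#!/usr/bin/env bash
# List of URLs to fetch, one per line.
file="/home/pi/blah"

n=1
while IFS= read -r i; do
    name="${i##*/}"                      # file name part of the URL
    if [[ -z "$i" || "$i" == "#"* ]]; then
        printf '%s\n' "$n: skipping blank or comment line"
    elif [[ ! "$i" =~ ^https?:// ]]; then
        printf '%s\n' "$n: '$i': scheme missing"
    elif [[ -e "$name" ]]; then
        printf '%s\n' "$n: File '$name' already there; not retrieving"
    else
        printf '%s\n' "$n: Downloading '$name'"
        wget -q --show-progress "$i"
    fi
    n=$((n+1))
done < "$file"
````

That way every line either triggers a download or just prints which check stopped it, with $n as the line counter.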