
3 Practical Examples of Using Bash While and Until Loops


We'll go over using curl to poll a site's HTTP status code, checking if a process is running and waiting until an S3 bucket is available.


In this video we'll go over mostly one-liners that you can run on the command line to solve a few specific problems. Once you get comfy using loops you'll be able to quickly solve your own problems with them too.

Demo Video

Timestamps

  • 0:19 – Using curl to get the HTTP status code of a response
  • 1:30 – Executing the curl command every 100ms in a while loop
  • 2:27 – Alternatively we can use while true and run commands in the loop
  • 2:54 – Using pgrep to see if unattended-upgrade is finished running
  • 5:19 – Using a while loop and pgrep to see if a process is done running
  • 6:23 – Using an until loop to check if an S3 bucket was created
  • 8:44 – wait-until is a generic version of an until loop with a timeout
  • 9:35 – Refactoring the S3 until loop to use the wait-until script
  • 10:24 – Recap

Code snippets

Here are the commands and code shown in the video.

Getting the HTTP status code of a response every 100ms with curl using a while loop:
# Return just the status code of the response:
curl -sI -o /dev/null -w "%{http_code}\n" https://example.com

# Run our curl command every 100ms in a while loop:
while curl -sI -o /dev/null -w "%{http_code}\n" https://example.com; do sleep 0.1; done;

# Alternatively this works too:
while true; do echo hello && sleep 1; done;
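A variation that isn't in the video: you can flip the logic around with an until loop so it keeps polling until the site responds with a 200:
# Keep polling until the site responds with a 200 status code:
until [ "$(curl -sI -o /dev/null -w "%{http_code}" https://example.com)" = "200" ]; do sleep 0.1; done;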
Using pgrep and a while loop to monitor a process:
# See if unattended-upgrade is running:
pgrep -af unattended-upgrade

# Run our pgrep command every 1 second in a while loop:
while pgrep -af unattended-upgrade; do sleep 1; done;
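Since the loop only exits once pgrep stops finding a match, you can chain whatever should happen next onto the end. For example (the apt command here is just a hypothetical follow-up, it's not in the video):
# Wait for unattended-upgrade to finish, then run our own apt command:
while pgrep -af unattended-upgrade; do sleep 1; done; sudo apt-get install -y htop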
Checking if an S3 bucket is created before uploading files to it:
#!/usr/bin/env bash

readonly bucket="example"
readonly timeout="3"

echo "Waiting until S3 bucket is created..."

i=1
until aws s3api head-bucket --bucket "${bucket}" 2>/dev/null
do
  ((i++))

  if [ "${i}" -gt "${timeout}" ]; then
      echo "${bucket} s3 bucket was never created, aborting due to ${timeout}s timeout!"
      exit 1
  fi

  sleep 1
done

aws s3 sync --exclude="*" --include="*.jpg" . s3://"${bucket}"

Refactoring the above script to use the wait-until script:

#!/usr/bin/env bash

# I forgot to add this which is why the AccessDenied error came up in the end.
#
# It didn't happen in the original version because that script did an exit 1 in
# the script itself so it never got a chance to run the sync command.
#
# Whereas in this version wait-until is a separate script that's running, so
# we need to make this script exit on any error, which is what set -e does.
set -e

readonly bucket="example"
readonly timeout="3"

echo "Waiting until S3 bucket is created..."

wait-until "aws s3api head-bucket --bucket "${bucket}" 2>/dev/null" "${timeout}"

aws s3 sync --exclude="*" --include="*.jpg" . s3://"${bucket}"
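
The wait-until script itself isn't included in this post. Based on how it's described in the video (a generic until loop that retries a command until it succeeds or a timeout passes, exiting non-zero on timeout), a minimal sketch could look something like this. Treat it as an assumption of what the real script does rather than a copy of it:
#!/usr/bin/env bash

# Minimal sketch of a generic "retry until this command succeeds" helper.
# Usage: wait-until "command to run" [timeout in seconds]

command="${1}"
timeout="${2:-30}"

i=1
until eval "${command}"
do
  ((i++))

  if [ "${i}" -gt "${timeout}" ]; then
      echo "command was never successful, aborting due to ${timeout}s timeout!"
      exit 1
  fi

  sleep 1
done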

What have you used while and until loops for? Let me know below.
