AWS SSO credentials don't work when mounted into Docker containers
I mounted ~/.aws into a Docker container while using AWS SSO (IAM Identity Center) and hit a token resolution failure.
docker run -v "$HOME/.aws:/root/.aws:ro" my-image:latest
# Error when retrieving token from sso: Token has expired and refresh failed

The SSO token cache and its refresh flow belong to the host-side tooling (renewing the token means re-running aws sso login), so the container cannot resolve or refresh it on its own. Extracting temporary credentials from boto3 and passing them as environment variables works reliably.
CREDS=$(python3 -c "
import json, boto3
creds = boto3.Session().get_credentials().get_frozen_credentials()
print(json.dumps({'AK': creds.access_key, 'SK': creds.secret_key, 'ST': creds.token}))
")
docker run -p 8080:8080 \
-e AWS_ACCESS_KEY_ID=$(echo "$CREDS" | python3 -c "import sys,json; print(json.load(sys.stdin)['AK'])") \
-e AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | python3 -c "import sys,json; print(json.load(sys.stdin)['SK'])") \
-e AWS_SESSION_TOKEN=$(echo "$CREDS" | python3 -c "import sys,json; print(json.load(sys.stdin)['ST'])") \
my-image:latest

In production, ECS task roles or EC2 instance profiles eliminate credential management entirely. Covered in detail in deploy series part 1.
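For local development there is also a shorter route than the boto3 one-liner: AWS CLI v2 ships `aws configure export-credentials`, which resolves the active profile (including SSO) and, with `--format env`, prints ready-to-eval export statements. A sketch, assuming AWS CLI v2.9 or later on the host:

```shell
# Let the AWS CLI resolve the SSO profile and emit the temporary
# credentials as export statements, then load them into this shell.
eval "$(aws configure export-credentials --format env)"

# -e VAR with no value forwards the variable from the host environment,
# so the credential values never appear in the command line or shell history.
docker run -p 8080:8080 \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN \
  my-image:latest
```

The value-less `-e` form is also slightly safer than inline `-e VAR=value`, since the secrets stay out of `ps` output and shell history.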
