If you have a json file that needs formatting, you can use a built-in tool in Python.
python -m json.tool sample.json
Output will be printed on screen. You can also send the output to a file.
python -m json.tool sample.json > sample2.json
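Under the hood, json.tool simply round-trips the text through Python's json module. A minimal sketch of the same formatting step, using an inline string in place of sample.json:

```python
import json

# Compact JSON, as you might find in a minified sample.json.
raw = '{"name":"sample","tags":["a","b"]}'

# json.tool does essentially this: parse, then re-serialize with indentation.
formatted = json.dumps(json.loads(raw), indent=4)
print(formatted)
```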
How to activate a GCP service account for other users in Linux.
First, generate a key for the service account and save it as key.json.
Log in to the server as that user, copy the key there, and activate the service account.
$ gcloud auth activate-service-account [ACCOUNT] --key-file=key.json
Once authenticated, you should be able to verify that the service account is active.
$ gcloud config list
A better option that doesn't require a key:
gcloud config set core/account service-account@project-id.iam.gserviceaccount.com
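For reference, the key file itself is JSON, and the [ACCOUNT] argument is its client_email field. A small sketch to pull it out, using a trimmed-down, made-up key:

```python
import json

# Trimmed-down, made-up service-account key; a real key.json also carries
# "private_key", "private_key_id", and other fields.
key = json.loads("""
{
  "type": "service_account",
  "project_id": "project-id",
  "client_email": "service-account@project-id.iam.gserviceaccount.com"
}
""")

account = key["client_email"]
print(f"gcloud auth activate-service-account {account} --key-file=key.json")
```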
jq is a JSON processor for slicing and dicing data.
To install it on Git Bash, unzip the download and copy the executable to /usr/bin.
mv jq-win64.exe /usr/bin/jq.exe
Create an alias. Edit ~/.bashrc.
alias jq='/usr/bin/jq.exe'
Using jq:
jq --version
jq '.selfLink' vm.json
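The '.selfLink' filter just plucks one top-level key out of the document. The Python equivalent, with a made-up stand-in for the contents of vm.json:

```python
import json

# Made-up stand-in for vm.json; selfLink is the field a GCP VM describe
# output carries, but this particular URL is invented for the example.
raw = ('{"name": "my-vm", "selfLink": "https://www.googleapis.com/compute/v1/'
       'projects/project-id/zones/us-central1-a/instances/my-vm"}')

vm = json.loads(raw)
print(vm["selfLink"])  # the same field the '.selfLink' jq filter selects
```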
How to change DNS records in Route 53 via AWS CLI.
cd /path/to/scripts/
# the command to switch to the ELB
aws route53 change-resource-record-sets --hosted-zone-id xxxxxxxxxxxxxx --change-batch file://elb.json
# the command to switch to the standard site
aws route53 change-resource-record-sets --hosted-zone-id xxxxxxxxxxxxxx --change-batch file://live.json
elb.json points to an AWS ELB (Elastic Load Balancer).
{ "Comment": "back to elb", "Changes": [ { "Action": "UPSERT", "ResourceRecordSet": { "Name": "yourdomain.com", "Type": "A", "AliasTarget": { "HostedZoneId": "xxxxxxxxxxxxxx", "EvaluateTargetHealth": false, "DNSName": "xxxxxxxxxxxxx.us-east-1.elb.amazonaws.com." } } } ] } |
{ "Comment": "back to elb", "Changes": [ { "Action": "UPSERT", "ResourceRecordSet": { "Name": "yourdomain.com", "Type": "A", "AliasTarget": { "HostedZoneId": "xxxxxxxxxxxxxx", "EvaluateTargetHealth": false, "DNSName": "xxxxxxxxxxxxx.us-east-1.elb.amazonaws.com." } } } ] }
live.json points to your standard site; the value is your IP address.
{ "Comment": "back to live", "Changes": [ { "Action": "UPSERT", "ResourceRecordSet": { "Name": "yourdomain.com", "Type": "A", "TTL": 60, "ResourceRecords": [ { "Value": "xxx.xxx.xxx.xxx" } ] } } ] } |
{ "Comment": "back to live", "Changes": [ { "Action": "UPSERT", "ResourceRecordSet": { "Name": "yourdomain.com", "Type": "A", "TTL": 60, "ResourceRecords": [ { "Value": "xxx.xxx.xxx.xxx" } ] } } ] }
I was banging my head against the wall (not quite literally, but it was close) trying to come up with an S3 lifecycle configuration from scratch. I needed a JSON file I could use in a bash script to apply the lifecycle rules to a few dozen S3 buckets. Obviously, trying to create one from scratch wasn't the wisest choice on my part. As it turns out, you can set up a temporary bucket, use the console to recreate the lifecycle, then export the lifecycle JSON file. Duh! Genius! Here's the command to export the lifecycle configuration.
aws s3api get-bucket-lifecycle-configuration \
    --bucket bucket-name
The output will be JSON similar to this:
{ "Rules": [ { "Status": "Enabled", "NoncurrentVersionExpiration": { "NoncurrentDays": 90 }, "NoncurrentVersionTransitions": [ { "NoncurrentDays": 7, "StorageClass": "INTELLIGENT_TIERING" } ], "Filter": { "Prefix": "" }, "Expiration": { "ExpiredObjectDeleteMarker": true }, "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }, "ID": "EFS S3 Backup Lifecycle Rules" } ] } |
{ "Rules": [ { "Status": "Enabled", "NoncurrentVersionExpiration": { "NoncurrentDays": 90 }, "NoncurrentVersionTransitions": [ { "NoncurrentDays": 7, "StorageClass": "INTELLIGENT_TIERING" } ], "Filter": { "Prefix": "" }, "Expiration": { "ExpiredObjectDeleteMarker": true }, "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }, "ID": "EFS S3 Backup Lifecycle Rules" } ] }
JSON, or JavaScript Object Notation, is a minimal, readable format for structuring data. It is used primarily to transmit data between a server and a web application, as an alternative to XML. YAML is a flexible, human-readable file format that is ideal for storing object trees. YAML stands for "YAML Ain't Markup Language". It is easier to read than JSON, and can contain richer metadata. If you would like to switch formats, for example from JSON to YAML or vice versa, there are numerous websites out there that can convert your code from one format to another. Check out this json2yaml converter.
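To see what the conversion amounts to, here is a toy JSON-to-YAML emitter in stdlib Python. It only handles nested dicts and lists of scalars, so reach for PyYAML or a converter site for anything real:

```python
import json

def to_yaml_lines(obj, indent=0):
    """Toy converter: supports nested dicts and lists of scalars only."""
    pad = "  " * indent
    lines = []
    if isinstance(obj, dict):
        for key, value in obj.items():
            if isinstance(value, (dict, list)):
                lines.append(f"{pad}{key}:")
                lines.extend(to_yaml_lines(value, indent + 1))
            else:
                # json.dumps gives us safe scalar quoting for free.
                lines.append(f"{pad}{key}: {json.dumps(value)}")
    elif isinstance(obj, list):
        lines.extend(f"{pad}- {json.dumps(item)}" for item in obj)
    return lines

doc = json.loads('{"name": "demo", "replicas": 2, "ports": [80, 443]}')
print("\n".join(to_yaml_lines(doc)))
```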