# Hari Sekhon - DevOps Bash Tools
[![GitHub stars](https://img.shields.io/github/stars/harisekhon/devops-bash-tools?logo=github)](https://github.com/HariSekhon/DevOps-Bash-tools/stargazers)
[![GitHub forks](https://img.shields.io/github/forks/harisekhon/devops-bash-tools?logo=github)](https://github.com/HariSekhon/DevOps-Bash-tools/network)
[![Lines of Code](https://img.shields.io/badge/lines%20of%20code-96k-lightgrey?logo=codecademy)](https://github.com/HariSekhon/DevOps-Bash-tools#hari-sekhon---devops-bash-tools)
[![License](https://img.shields.io/badge/license-MIT-green)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/LICENSE)
[![My LinkedIn](https://img.shields.io/badge/LinkedIn%20Profile-HariSekhon-blue?logo=data:image/svg%2bxml;base64,PHN2ZyByb2xlPSJpbWciIGZpbGw9IiNmZmZmZmYiIHZpZXdCb3g9IjAgMCAyNCAyNCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48dGl0bGU+TGlua2VkSW48L3RpdGxlPjxwYXRoIGQ9Ik0yMC40NDcgMjAuNDUyaC0zLjU1NHYtNS41NjljMC0xLjMyOC0uMDI3LTMuMDM3LTEuODUyLTMuMDM3LTEuODUzIDAtMi4xMzYgMS40NDUtMi4xMzYgMi45Mzl2NS42NjdIOS4zNTFWOWgzLjQxNHYxLjU2MWguMDQ2Yy40NzctLjkgMS42MzctMS44NSAzLjM3LTEuODUgMy42MDEgMCA0LjI2NyAyLjM3IDQuMjY3IDUuNDU1djYuMjg2ek01LjMzNyA3LjQzM2MtMS4xNDQgMC0yLjA2My0uOTI2LTIuMDYzLTIuMDY1IDAtMS4xMzguOTItMi4wNjMgMi4wNjMtMi4wNjMgMS4xNCAwIDIuMDY0LjkyNSAyLjA2NCAyLjA2MyAwIDEuMTM5LS45MjUgMi4wNjUtMi4wNjQgMi4wNjV6bTEuNzgyIDEzLjAxOUgzLjU1NVY5aDMuNTY0djExLjQ1MnpNMjIuMjI1IDBIMS43NzFDLjc5MiAwIDAgLjc3NCAwIDEuNzI5djIwLjU0MkMwIDIzLjIyNy43OTIgMjQgMS43NzEgMjRoMjAuNDUxQzIzLjIgMjQgMjQgMjMuMjI3IDI0IDIyLjI3MVYxLjcyOUMyNCAuNzc0IDIzLjIgMCAyMi4yMjIgMGguMDAzeiIvPjwvc3ZnPgo=)](https://www.linkedin.com/in/HariSekhon/)
[![GitHub Last Commit](https://img.shields.io/github/last-commit/HariSekhon/DevOps-Bash-tools?logo=github)](https://github.com/HariSekhon/DevOps-Bash-tools/commits/master)
[![Codacy](https://app.codacy.com/project/badge/Grade/dffc1bfd13404c95b5a0ab97fd47974e)](https://www.codacy.com/gh/HariSekhon/DevOps-Bash-tools/dashboard)
[![CodeFactor](https://www.codefactor.io/repository/github/harisekhon/devops-bash-tools/badge)](https://www.codefactor.io/repository/github/harisekhon/devops-bash-tools)
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=alert_status)](https://sonarcloud.io/dashboard?id=HariSekhon_DevOps-Bash-tools)
[![Maintainability Rating](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=sqale_rating)](https://sonarcloud.io/dashboard?id=HariSekhon_DevOps-Bash-tools)
[![Reliability Rating](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=reliability_rating)](https://sonarcloud.io/dashboard?id=HariSekhon_DevOps-Bash-tools)
[![Security Rating](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=security_rating)](https://sonarcloud.io/dashboard?id=HariSekhon_DevOps-Bash-tools)
[![Vulnerabilities](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=vulnerabilities)](https://sonarcloud.io/summary/new_code?id=HariSekhon_DevOps-Bash-tools)
<!--
BitBucket exposes HTML comments - open issue - works properly on GitHub/GitLab
doesn't detect shell code properly
[![Lines of Code](https://sonarcloud.io/api/project_badges/measure?project=HariSekhon_DevOps-Bash-tools&metric=ncloc)](https://sonarcloud.io/dashboard?id=HariSekhon_DevOps-Bash-tools)
-->
[![Linux](https://img.shields.io/badge/OS-Linux-blue?logo=linux)](https://github.com/HariSekhon/DevOps-Bash-tools#hari-sekhon---devops-bash-tools)
[![Mac](https://img.shields.io/badge/OS-Mac-blue?logo=apple)](https://github.com/HariSekhon/DevOps-Bash-tools#hari-sekhon---devops-bash-tools)
[![Docker](https://img.shields.io/badge/container-Docker-blue?logo=docker&logoColor=white)](https://hub.docker.com/r/harisekhon/bash-tools)
[![Dockerfile](https://img.shields.io/badge/repo-Dockerfiles-blue?logo=docker&logoColor=white)](https://github.com/HariSekhon/Dockerfiles)
[![DockerHub Pulls](https://img.shields.io/docker/pulls/harisekhon/bash-tools?label=DockerHub%20pulls&logo=docker&logoColor=white)](https://hub.docker.com/r/harisekhon/bash-tools)
[![StarTrack](https://img.shields.io/badge/Star-Track-blue?logo=github)](https://seladb.github.io/StarTrack-js/#/preload?r=HariSekhon,Nagios-Plugins&r=HariSekhon,Dockerfiles&r=HariSekhon,DevOps-Python-tools&r=HariSekhon,DevOps-Perl-tools&r=HariSekhon,DevOps-Bash-tools&r=HariSekhon,HAProxy-configs&r=HariSekhon,SQL-scripts)
[![StarCharts](https://img.shields.io/badge/Star-Charts-blue?logo=github)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/STARCHARTS.md)
[![Mac Homebrew](https://img.shields.io/badge/Mac-Homebrew-999999?logo=apple&logoColor=white)](https://brew.sh/)
[![Alpine](https://img.shields.io/badge/Linux-Alpine-0D597F?logo=alpine%20linux)](https://alpinelinux.org/)
[![CentOS](https://img.shields.io/badge/Linux-CentOS-262577?logo=centos&logoColor=white)](https://www.centos.org/)
[![Debian](https://img.shields.io/badge/Linux-Debian-A81D33?logo=debian)](https://www.debian.org/)
[![Fedora](https://img.shields.io/badge/Linux-Fedora-294172?logo=fedora&logoColor=white)](https://getfedora.org/)
[![Redhat](https://img.shields.io/badge/Linux-Redhat-EE0000?logo=red%20hat)](https://www.redhat.com/en)
[![Rocky](https://img.shields.io/badge/Linux-Rocky-10B981?logo=rockylinux&logoColor=white)](https://rockylinux.org/)
[![Ubuntu](https://img.shields.io/badge/Linux-Ubuntu-E95420?logo=ubuntu&logoColor=white)](https://ubuntu.com/)
<!-- TODO: fix
[![DockerHub Build Automated](https://img.shields.io/docker/automated/harisekhon/bash-tools?logo=docker&logoColor=white)](https://hub.docker.com/r/harisekhon/bash-tools)
[![Docker Build Status](https://img.shields.io/docker/cloud/build/harisekhon/bash-tools?logo=docker&logoColor=white)](https://hub.docker.com/r/harisekhon/bash-tools/builds)
-->
<!--
official badges without logos to differentiate them
this one I don't trust it'll stick around so using shields version instead
[![Build Status](https://badges.herokuapp.com/travis/HariSekhon/DevOps-Bash-tools?label=Travis%20CI)](https://travis-ci.org/HariSekhon/DevOps-Bash-tools)
awkward URLs more nicely replaced with shields.io
[![AppVeyor](https://ci.appveyor.com/api/projects/status/u6f97cskcgb30sce/branch/master?svg=true)](https://ci.appveyor.com/project/HariSekhon/devops-bash-tools/branch/master)
[![Drone](https://cloud.drone.io/api/badges/HariSekhon/DevOps-Bash-tools/status.svg)](https://cloud.drone.io/HariSekhon/DevOps-Bash-tools)
-->
[![CI Builds Overview](https://img.shields.io/badge/CI%20Builds-Overview%20Page-blue?logo=circleci)](https://harisekhon.github.io/CI-CD/)
[![Jenkins](https://img.shields.io/badge/Jenkins-ready-blue?logo=jenkins&logoColor=white)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/Jenkinsfile)
[![Concourse](https://img.shields.io/badge/Concourse-ready-blue?logo=concourse&logoColor=white)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/cicd/.concourse.yml)
[![GoCD](https://img.shields.io/badge/GoCD-ready-blue?logo=go&logoColor=white)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/cicd/.gocd.yml)
[![TeamCity](https://img.shields.io/badge/TeamCity-ready-blue?logo=teamcity)](https://github.com/HariSekhon/TeamCity-CI)
[![CircleCI](https://circleci.com/gh/HariSekhon/DevOps-Bash-tools.svg?style=svg)](https://circleci.com/gh/HariSekhon/DevOps-Bash-tools)
[![BuildKite](https://img.shields.io/buildkite/f11bdd9690a9bac9a8edc6094dc2f2b9af3218a7a15d4ec17d/master?label=BuildKite&logo=buildkite)](https://buildkite.com/hari-sekhon/devops-bash-tools)
[![AppVeyor](https://img.shields.io/appveyor/build/harisekhon/devops-bash-tools/master?logo=appveyor&label=AppVeyor)](https://ci.appveyor.com/project/HariSekhon/devops-bash-tools/branch/master)
[![Drone](https://img.shields.io/drone/build/HariSekhon/DevOps-Bash-tools/master?logo=drone&label=Drone)](https://cloud.drone.io/HariSekhon/DevOps-Bash-tools)
[![Codefresh](https://g.codefresh.io/api/badges/pipeline/harisekhon/GitHub%2FDevOps-Bash-tools?branch=master&key=eyJhbGciOiJIUzI1NiJ9.NWU1MmM5OGNiM2FiOWUzM2Y3ZDZmYjM3.O69674cW7vYom3v5JOGKXDbYgCVIJU9EWhXUMHl3zwA&type=cf-1)](https://g.codefresh.io/pipelines/edit/new/builds?id=5e53eaeea284e010982eaa6e&pipeline=DevOps-Bash-tools&projects=GitHub&projectId=5e52ca8ea284e00f882ea992&context=github&filter=page:1;pageSize:10;timeFrameStart:week)
[![Cirrus CI](https://img.shields.io/cirrus/github/HariSekhon/DevOps-Bash-tools/master?logo=Cirrus%20CI&label=Cirrus%20CI)](https://cirrus-ci.com/github/HariSekhon/DevOps-Bash-tools)
[![Semaphore](https://harisekhon.semaphoreci.com/badges/DevOps-Bash-tools.svg)](https://harisekhon.semaphoreci.com/projects/DevOps-Bash-tools)
[![Buddy](https://img.shields.io/badge/Buddy-ready-1A86FD?logo=buddy)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/buddy.yml)
[![Shippable](https://img.shields.io/badge/Shippable-legacy-lightgrey?logo=jfrog&label=Shippable)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/shippable.yml)
[![Travis CI](https://img.shields.io/badge/TravisCI-ready-blue?logo=travis&label=Travis%20CI)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/travis/.travis.yml)
[![Reviewed by Hound](https://img.shields.io/badge/Reviewed%20by-Hound-8E64B0.svg)](https://houndci.com)
[![Repo on GitHub](https://img.shields.io/badge/repo-GitHub-2088FF?logo=github)](https://github.com/HariSekhon/DevOps-Bash-tools)
[![Repo on GitLab](https://img.shields.io/badge/repo-GitLab-FCA121?logo=gitlab)](https://gitlab.com/HariSekhon/DevOps-Bash-tools)
[![Repo on Azure DevOps](https://img.shields.io/badge/repo-Azure%20DevOps-0078D7?logo=azure%20devops)](https://dev.azure.com/harisekhon/GitHub/_git/DevOps-Bash-tools)
[![Repo on BitBucket](https://img.shields.io/badge/repo-BitBucket-0052CC?logo=bitbucket)](https://bitbucket.org/HariSekhon/DevOps-Bash-tools)
[![Azure DevOps Pipeline](https://dev.azure.com/harisekhon/GitHub/_apis/build/status/HariSekhon.DevOps-Bash-tools?branchName=master)](https://dev.azure.com/harisekhon/GitHub/_build/latest?definitionId=1&branchName=master)
[![GitLab Pipeline](https://img.shields.io/badge/GitLab%20CI-legacy-lightgrey?logo=gitlab)](https://gitlab.com/HariSekhon/DevOps-Bash-tools/pipelines)
[![BitBucket Pipeline](https://img.shields.io/badge/Bitbucket%20CI-legacy-lightgrey?logo=bitbucket)](https://bitbucket.org/harisekhon/devops-bash-tools/addon/pipelines/home#!/)
[![AWS CodeBuild](https://img.shields.io/badge/AWS%20CodeBuild-ready-blue?logo=amazon%20aws)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/cicd/buildspec.yml)
[![GCP Cloud Build](https://img.shields.io/badge/GCP%20Cloud%20Build-ready-blue?logo=google%20cloud&logoColor=white)](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/cicd/cloudbuild.yaml)
[![ShellCheck](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/shellcheck.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/shellcheck.yaml)
[![JSON](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/json.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/json.yaml)
[![YAML](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/yaml.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/yaml.yaml)
[![XML](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/xml.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/xml.yaml)
[![Validation](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/validate.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/validate.yaml)
[![Kics](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/kics.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/kics.yaml)
[![Grype](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/grype.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/grype.yaml)
[![Semgrep](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/semgrep.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/semgrep.yaml)
[![Semgrep Cloud](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/semgrep-cloud.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/semgrep-cloud.yaml)
[![Trivy](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/trivy.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/trivy.yaml)
[![Docker Build (Alpine)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_alpine.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_alpine.yaml)
[![Docker Build (Debian)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_debian.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_debian.yaml)
[![Docker Build (Fedora)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_fedora.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_fedora.yaml)
[![Docker Build (Ubuntu)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_ubuntu.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/docker_bash_ubuntu.yaml)
[![GitHub Actions Ubuntu](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/GitHub%20Actions%20Ubuntu/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22GitHub+Actions+Ubuntu%22)
[![Mac](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac.yaml)
[![Mac 11](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac_11.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac_11.yaml)
[![Mac 12](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac_12.yaml/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions/workflows/mac_12.yaml)
[![Ubuntu](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Ubuntu/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Ubuntu%22)
[![Ubuntu 20.04](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Ubuntu%2020.04/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Ubuntu+20.04%22)
[![Ubuntu 22.04](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Ubuntu%2022.04/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Ubuntu+22.04%22)
[![Debian](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Debian/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Debian%22)
[![Debian 10](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Debian%2010/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Debian+10%22)
[![Debian 11](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Debian%2011/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Debian+11%22)
[![Debian 12](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Debian%2012/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Debian+12%22)
[![Fedora](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Fedora/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Fedora%22)
[![Alpine](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Alpine/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Alpine%22)
[![Alpine 3](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Alpine%203/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Alpine+3%22)
[![Python 3.7](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Python%203.7/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Python+3.7%22)
[![Python 3.8](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Python%203.8/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Python+3.8%22)
[![Python 3.9](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Python%203.9/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Python+3.9%22)
[![Python 3.10](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Python%203.10/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Python+3.10%22)
[![Python 3.11](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Python%203.11/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Python+3.11%22)
<!--
[![Self Hosted](https://github.com/HariSekhon/DevOps-Bash-tools/workflows/Self%20Hosted/badge.svg)](https://github.com/HariSekhon/DevOps-Bash-tools/actions?query=workflow%3A%22Self+Hosted%22)
-->
<!-- TODO: https://codecov.io, https://coveralls.io -->
[git.io/bash-tools](https://git.io/bash-tools)
1000+ DevOps Shell Scripts and Advanced Bash environment.
Fast, Advanced Systems Engineering, Automation, APIs, shorter CLIs, etc.
Heavily used in many [GitHub repos](https://github.com/search?o=desc&q=user%3Aharisekhon+type%3Arepository&type=Repositories), dozens of [DockerHub builds](https://hub.docker.com/r/harisekhon) ([Dockerfiles](https://github.com/HariSekhon/Dockerfiles)) and 600+ [CI builds](https://harisekhon.github.io/CI-CD/).
## Summary
- Scripts for many popular DevOps technologies, see [Inventory](https://github.com/HariSekhon/DevOps-Bash-tools#inventory) below for more details
- Advanced configs for common tools like [Git](https://git-scm.com/), [vim](https://www.vim.org/), [screen](https://www.gnu.org/software/screen/), [tmux](https://github.com/tmux/tmux/wiki), [PostgreSQL psql](https://www.postgresql.org/) etc...
- CI configs for most major Continuous Integration products (see [CI builds](https://harisekhon.github.io/CI-CD/) page)
- CI scripts for a drop-in framework of standard checks to run in all [CI builds](https://harisekhon.github.io/CI-CD/), CI detection, accounting for installation differences across CI environments, root vs user, virtualenvs etc.
- API scripts auto-handling authentication, tokens and other details to quickly query popular APIs with a few keystrokes just supplying the `/path/endpoint`
- Advanced Bash environment - `.bashrc` + `.bash.d/*.sh` - aliases, functions, colouring, dynamic Git & shell behaviour enhancements, automatic pathing for installations and major languages like Python, Perl, Ruby, NodeJS, Golang across Linux distributions and Mac. See [.bash.d/README.md](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/.bash.d/README.md) (a minimal sketch of the drop-in sourcing pattern is shown just after this list)
- Installs the best systems packages -
[AWS CLI](https://aws.amazon.com/cli/),
[Azure CLI](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest),
[GCloud SDK](https://cloud.google.com/sdk),
[Digital Ocean CLI](https://docs.digitalocean.com/reference/doctl/),
[Terraform](https://www.terraform.io/),
[Terragrunt](https://terragrunt.gruntwork.io/),
[GitHub CLI](https://github.com/cli/cli),
[Kubernetes](https://kubernetes.io/)
[kubectl](https://kubernetes.io/docs/reference/kubectl/overview/) &
[kustomize](https://kustomize.io/),
[Helm](https://helm.sh/),
[eksctl](https://eksctl.io/),
[Docker-Compose](https://docs.docker.com/compose/),
[jq](https://stedolan.github.io/jq/)
and many others... extensive package lists for servers and desktops covering the package managers of most major Linux distributions and Mac
- `install/` - contains many installation scripts for popular open source software and direct binary downloads from GitHub releases
- `configs/` - contains many dot configs for common technologies like ViM, top, Screen, Tmux, MySQL, PostgreSQL etc.
- `setup/` - contains setup scripts, package lists, extra configs, Mac OS X settings etc.
- Utility Libraries used by many hundreds of scripts and [builds](https://harisekhon.github.io/CI-CD/) across [repos](https://github.com/search?o=desc&q=user%3Aharisekhon+type%3Arepository&type=Repositories):
- `.bash.d/` - interactive library
- `lib/` - scripting and CI library
- [SQL Scripts](https://github.com/HariSekhon/SQL-scripts) - 100+ scripts for [PostgreSQL](https://www.postgresql.org/), [MySQL](https://www.mysql.com/), [AWS Athena](https://aws.amazon.com/athena/) + [CloudTrail](https://aws.amazon.com/cloudtrail/), [Google BigQuery](https://cloud.google.com/bigquery)
- [Templates](https://github.com/HariSekhon/Templates) - templates for common programming languages and build configs
- [Kubernetes Configs](https://github.com/HariSekhon/Kubernetes-configs) - Kubernetes YAML configs for most common scenarios, including Production Best Practices, Tips & Tricks
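
The `.bash.d/` environment above works as a drop-in directory: `.bashrc` sources every `*.sh` file it finds there, so new aliases, functions and tech-specific enhancements can be added without touching `.bashrc` itself. A minimal sketch of that sourcing pattern (the actual `.bashrc` in this repo does more, eg. ordering and guarding against re-sourcing):

```bash
# ~/.bashrc (sketch) - source every drop-in snippet from ~/.bash.d/
if [ -d "$HOME/.bash.d" ]; then
    for bash_d_script in "$HOME"/.bash.d/*.sh; do
        # skip unreadable files and the unexpanded glob if the directory is empty
        [ -r "$bash_d_script" ] && . "$bash_d_script"
    done
    unset bash_d_script
fi
```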
See Also: [similar DevOps repos](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/README.md#see-also) in other languages
Hari Sekhon
Cloud & Big Data Contractor, United Kingdom
(ex-Cloudera, former Hortonworks Consultant)
[![My LinkedIn](https://img.shields.io/badge/LinkedIn%20Profile-HariSekhon-blue?logo=data:image/svg%2bxml;base64,PHN2ZyByb2xlPSJpbWciIGZpbGw9IiNmZmZmZmYiIHZpZXdCb3g9IjAgMCAyNCAyNCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48dGl0bGU+TGlua2VkSW48L3RpdGxlPjxwYXRoIGQ9Ik0yMC40NDcgMjAuNDUyaC0zLjU1NHYtNS41NjljMC0xLjMyOC0uMDI3LTMuMDM3LTEuODUyLTMuMDM3LTEuODUzIDAtMi4xMzYgMS40NDUtMi4xMzYgMi45Mzl2NS42NjdIOS4zNTFWOWgzLjQxNHYxLjU2MWguMDQ2Yy40NzctLjkgMS42MzctMS44NSAzLjM3LTEuODUgMy42MDEgMCA0LjI2NyAyLjM3IDQuMjY3IDUuNDU1djYuMjg2ek01LjMzNyA3LjQzM2MtMS4xNDQgMC0yLjA2My0uOTI2LTIuMDYzLTIuMDY1IDAtMS4xMzguOTItMi4wNjMgMi4wNjMtMi4wNjMgMS4xNCAwIDIuMDY0LjkyNSAyLjA2NCAyLjA2MyAwIDEuMTM5LS45MjUgMi4wNjUtMi4wNjQgMi4wNjV6bTEuNzgyIDEzLjAxOUgzLjU1NVY5aDMuNTY0djExLjQ1MnpNMjIuMjI1IDBIMS43NzFDLjc5MiAwIDAgLjc3NCAwIDEuNzI5djIwLjU0MkMwIDIzLjIyNy43OTIgMjQgMS43NzEgMjRoMjAuNDUxQzIzLjIgMjQgMjQgMjMuMjI3IDI0IDIyLjI3MVYxLjcyOUMyNCAuNzc0IDIzLjIgMCAyMi4yMjIgMGguMDAzeiIvPjwvc3ZnPgo=)](https://www.linkedin.com/in/HariSekhon/)
<br>*(you're welcome to connect with me on LinkedIn)*
### Quick Setup
To bootstrap, install packages and link in to your shell profile to inherit all configs, do:
```bash
curl -L https://git.io/bash-bootstrap | sh
```
- Adds sourcing to `.bashrc`/`.bash_profile` to automatically inherit all `.bash.d/*.sh` environment enhancements for all technologies (see [Inventory](https://github.com/HariSekhon/DevOps-Bash-tools#inventory) below)
- Symlinks `.*` config dotfiles to `$HOME` for [git](https://git-scm.com/), [vim](https://www.vim.org/), top, [htop](https://hisham.hm/htop/), [screen](https://www.gnu.org/software/screen/), [tmux](https://github.com/tmux/tmux/wiki), [editorconfig](https://editorconfig.org/), [Ansible](https://www.ansible.com/), [PostgreSQL](https://www.postgresql.org/) `.psqlrc` etc. (only when they don't already exist so there is no conflict with your own configs)
- Installs OS package dependencies for all scripts (detects the OS and installs the right RPMs, Debs, Apk or Mac HomeBrew packages)
- Installs Python packages
- Installs [AWS CLI](https://aws.amazon.com/cli/)
To only install package dependencies to run scripts, simply `cd` to the git clone directory and run `make`:
```shell
git clone https://github.com/HariSekhon/DevOps-Bash-tools bash-tools
cd bash-tools
make
```
`make install` sets your shell profile to source this repo. See [Individual Setup Parts](https://github.com/HariSekhon/DevOps-Bash-tools#individual-setup-parts) below for more install/uninstall options.
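
If you'd rather not pipe the bootstrap through `sh`, the same end state can be reached from a local clone; a minimal sketch, assuming the `make install` target behaves as described above:

```bash
git clone https://github.com/HariSekhon/DevOps-Bash-tools bash-tools
cd bash-tools
make install       # install dependencies, link configs and source this repo from your shell profile
exec "$SHELL" -l   # start a fresh login shell to pick up the new environment
```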
## Index
- [Linux & Mac](#linux--mac) - curl OAuth / JWT, LDAP, find duplicate files, SSL certificate get/validate, URL encoding/decoding, Vagrant, advanced configurations:
- `.bashrc`, `.bash.d/*.sh`, `.gitconfig`, `.vimrc`, `.screenrc`, `.tmux.conf`, `.toprc`, `.gitignore`...
- [AWS - Amazon Web Services](#aws---amazon-web-services) - AWS account summary, lots of IAM reports, CIS Benchmark config hardening, EC2, ECR, EKS, Spot termination, S3 access logging, KMS key rotation info, SSM, CloudTrail, CloudWatch billing alarm with SNS notification topic and subscription for email alerts
- [GCP - Google Cloud Platform](#gcp---google-cloud-platform) - massive GCP auto-inventory, scripts for GCE, GKE, GCR, Secret Manager, BigQuery, Cloud SQL, Cloud Scheduler, Terraform service account creation
- [Kubernetes](#kubernetes) - massive Kubernetes auto-inventory, cluster management scripts & tricks
- [Docker](#docker) - Docker API, Dockerhub API, Quay.io API scripts
- [Databases](#databases) - fast CLI wrappers, instant Docker sandboxes (PostgreSQL, MySQL, MariaDB, SQLite), [SQL scripts](https://github.com/HariSekhon/SQL-scripts), SQL script testers against all versions of a DB, advanced `.psqlrc`
- [Data](#data) - data tools, converters and format validators for Avro, Parquet, CSV, JSON, INI / Properties files (Java), LDAP LDIF, XML, YAML
- [Big Data & NoSQL](#big-data--nosql) - Kafka, Hadoop, HDFS, Hive, Impala, ZooKeeper, Cloudera Manager API & Cloudera Navigator API scripts
- [Git - GitHub, GitLab, Bitbucket, Azure DevOps](#git---github-gitlab-bitbucket-azure-devops) - scripts for Git local & mirror management, GitHub, GitLab & BitBucket APIs
- [CI/CD - Continuous Integration / Continuous Delivery](#cicd---continuous-integration--continuous-deployment) - API scripts & build pipeline configs for most major CI systems:
- Jenkins, Concourse, GoCD, TeamCity - one-touch boot & build
- Azure DevOps Pipelines, GitHub Actions Workflows, GitLab CI, BitBucket Pipelines, AppVeyor, BuildKite, Travis CI, Circle CI, Codefresh, CodeShip, Drone.io, Semaphore CI, Shippable ...
- Terraform Cloud, Octopus Deploy
- Checkov / Bridgecrew Cloud
- [AI & IPaaS](#ai--ipaas) - OpenAI (ChatGPT), Make.com
- [Internet Services](#internet-services) - Cloudflare, DataDog, Digital Ocean, Kong API Gateway, GitGuardian, Jira, NGrok, Traefik, Pingdom, Wordpress
- [Java](#java) - Java utilities to debug running Java programs or decompile Java JAR code for deeper debugging
- [Python](#python) - Python utilities & library management
- [Perl](#perl) - Perl utilities & library management
- [Golang](#golang) - Golang utilities
- [Media](#media) - MP3 metadata editing, grouping and ordering of albums and audiobooks, mkv/avi to mp4 converters, YouTube channel download
- [Spotify](#spotify) - 40+ Spotify API scripts for backups, managing playlists, track deduplication, URI conversion, search, add/delete, liked tracks, followed artists, top artists, top tracks etc.
- [More Linux & Mac](#more-linux--mac) - more systems administration scripts, package installation automation
- [Builds, Languages & Linting](#builds-languages--linting) - programming language, build system & CI linting
- [Templates](https://github.com/HariSekhon/Templates) - Templates for AWS, GCP, Terraform, Docker, Jenkins, Cloud Build, Vagrant, Puppet, Python, Bash, Go, Perl, Java, Scala, Groovy, Maven, SBT, Gradle, Make, GitHub Actions, CircleCI, Jenkinsfile, Makefile, Dockerfile, docker-compose.yml etc.
- [Kubernetes Configs](https://github.com/HariSekhon/Kubernetes-configs) - Kubernetes YAML configs for most common scenarios, including Production Best Practices, Tips & Tricks
### Linux & Mac
Top-level `.bashrc`, `bin/`, `.bash.d/` and `configs/` directories:
- `.*` - dot conf files for lots of common software eg. advanced `.vimrc`, `.gitconfig`, massive `.gitignore`, `.editorconfig`, `.screenrc`, `.tmux.conf` etc.
- `.vimrc` - contains many awesome [vim](https://www.vim.org/) tweaks, plus hotkeys for linting lots of different file types in place, including Python, Perl, Bash / Shell, Dockerfiles, JSON, YAML, XML, CSV, INI / Properties files, LDAP LDIF etc without leaving the editor!
- `.screenrc` - fancy [screen](https://www.gnu.org/software/screen/) configuration including advanced colour bar, large history, hotkey reloading, auto-blanking etc.
- `.tmux.conf` - fancy [tmux](https://github.com/tmux/tmux/wiki) configuration including advanced colour bar and plugins, settings, hotkey reloading etc.
- [Git](https://git-scm.com/):
- `.gitconfig` - advanced Git configuration
- `.gitignore` - extensive Git ignore of trivial files you shouldn't commit
- enhanced Git diffs
- protections against committing AWS secret keys or merge conflict unresolved files
- `.bashrc` - shell tuning and sourcing of `.bash.d/*.sh`
- `.bash.d/*.sh` - thousands of lines of advanced bashrc code, aliases, functions and environment variables for:
- [Linux](https://en.wikipedia.org/wiki/Linux) & [Mac](https://en.wikipedia.org/wiki/MacOS)
- SCM - [Git](https://git-scm.com/), [Mercurial](https://www.mercurial-scm.org/), [Svn](https://subversion.apache.org)
- [AWS](https://aws.amazon.com/)
- [GCP](https://cloud.google.com/)
- [Docker](https://www.docker.com/)
- [Kubernetes](https://kubernetes.io/)
- [Kafka](http://kafka.apache.org/)
- [Vagrant](https://www.vagrantup.com/)
- automatic GPG and SSH agent handling of encrypted private keys without re-entering passwords, with lazy evaluation so you're only prompted to load keys the first time SSH is called
- and lots more - see [.bash.d/README](https://github.com/HariSekhon/DevOps-Bash-tools/blob/master/.bash.d/README.md) for a more detailed list
- run `make bash` to link `.bashrc`/`.bash_profile` and the `.*` dot config files to your `$HOME` directory to auto-inherit everything
- `lib/*.sh` - Bash utility libraries full of functions for
[Docker](https://www.docker.com/),
environment,
CI detection ([Travis CI](https://travis-ci.org/), [Jenkins](https://jenkins.io/) etc),
port and HTTP URL availability / content checks etc.
Sourced from all my other [GitHub repos](https://github.com/harisekhon) to make setting up Dockerized tests easier.
- `install/install_*.sh` - various simple to use installation scripts for common technologies like
[AWS CLI](https://aws.amazon.com/cli/),
[Azure CLI](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest),
[GCloud SDK](https://cloud.google.com/sdk),
[Terraform](https://www.terraform.io/),
[Ansible](https://www.ansible.com/),
[MiniKube](https://kubernetes.io/docs/setup/learning-environment/minikube/),
[MiniShift](https://www.okd.io/minishift/)
(Kubernetes / [Redhat OpenShift](https://www.openshift.com/)/[OKD](https://www.okd.io/) dev VMs),
[Maven](https://maven.apache.org/),
[Gradle](https://gradle.org/),
[SBT](https://www.scala-sbt.org/),
[EPEL](https://fedoraproject.org/wiki/EPEL),
[RPMforge](http://repoforge.org/),
[Homebrew](https://brew.sh/),
[Travis CI](https://travis-ci.org/),
[Circle CI](https://circleci.com/),
[AppVeyor](https://www.appveyor.com/),
[BuildKite](https://buildkite.com),
[Parquet Tools](https://github.com/apache/parquet-mr/tree/master/parquet-tools)
etc.
- `login.sh` - logs in to major Cloud platforms if their credentials are found in the environment, CLIs such as AWS, GCP, Azure, GitHub... Docker registries: DockerHub, GHCR, ECR, GCR, GAR, ACR, Gitlab, Quay...
- `clean_caches.sh` - cleans out OS package and programming language caches - useful to save space or reduce Docker image size
- `delete_duplicate_files.sh` - deletes duplicate files with (N) suffixes, commonly caused by web browser downloads, in the given or current directory. Checks they're exact duplicates of a matching basename file without the (N) suffix with the exact same checksum for safety. Prompts to delete per file. To auto-accept deletions, do `yes | delete_duplicate_files.sh`. This is a fast way of cleaning up your `~/Downloads` directory and can be put in your user crontab
- `download_url_file.sh` - downloads a file from a URL using wget with no clobber and continue support, or curl with atomic replacement to avoid race conditions. Used by `github/github_download_release_file.sh`, `github_download_release_jar.sh`, and `install/download_*_jar.sh`
- `dump_stats.sh` - dumps common command outputs to text files in a local tarball. Useful to collect support information for vendor support cases
- `curl_auth.sh` - shortens the `curl` command by auto-loading your OAuth2 / JWT API token or username & password from environment variables or an interactive starred password prompt through a ram file descriptor to avoid placing them on the command line (which would expose your credentials in the process list or OS audit log files). Used by many other adjacent API querying scripts (see the usage sketch at the end of this section)
- `find_duplicate_files*.sh` - finds duplicate files by size and/or checksum in given directory trees. Checksums are only done on files that already have matching byte counts for efficiency
- `find_broken_links.sh` - find broken links with delays to avoid tripping defenses
- `find_broken_symlinks.sh` - find broken symlinks pointing to non-existent files/directories
- `http_duplicate_urls.sh` - find duplicate URLs in a given web page
- `image_join_stack.sh` - stack joins two images after matching their widths so they align correctly
- `ldapsearch.sh` - shortens `ldapsearch` command by inferring switches from environment variables
- `ldap_user_recurse.sh` / `ldap_group_recurse.sh` - recurse Active Directory LDAP users upwards to find all parent groups, or groups downwards to find all nested users (useful for debugging LDAP integration and group-based permissions)
- `log_timestamp_large_intervals.sh` - finds log lines whose timestamp intervals exceed the given number of seconds and outputs those log lines with the difference between the last and current timestamps. Useful to find actions that are taking a long time from log files such as CI/CD logs
- `diff_line_threshold.sh` - compares two files vs a line count diff threshold to determine if they are radically different. Used to avoid overwriting files which are not mere updates but completely different files
- `mac_diff_settings.sh` - takes before and after snapshots of UI setting changes and diffs them to make it easy to find `defaults` keys to add to `setup/mac_settings.sh` to save settings
- `mac_iso_to_usb.sh` - converts a given ISO file to a USB bootable image and burns it onto a given or detected inserted USB drive
- `organize_downloads.sh` - moves files of well-known extensions in the `$HOME/Downloads` directory older than 1 week to capitalized subdirectories of their type to keep the `$HOME/Downloads/` directory tidy
- `copy_to_clipboard.sh` - copies stdin or string arg to system clipboard on Linux or Mac
- `paste_from_clipboard.sh` - pastes from system clipboard to stdout on Linux or Mac
- `paste_diff_settings.sh` - takes before and after snapshots of the clipboard contents and diffs them to show config changes
- `pldd.sh` - parses `/proc` on Linux to show the runtime `.so` loaded dynamic shared libraries a program pid is using. Runtime equivalent of the classic static `ldd` command, useful because the system `pldd` command often fails to attach to a process
- `random_select.sh` - selects one of given args at random. Useful for sampling, running randomized subsets of large test suites etc.
- `shields_embed_logo.sh` - base64 encodes a given icon file or url and prints the `logo=...` url parameter you need to add to the [shields.io](https://shields.io/) badge url
- `shred_file.sh` - overwrites a file 7 times to DoD standards before deleting it to prevent recovery of sensitive information
- `shred_free_space.sh` - overwrites free space to prevent recovery of sensitive information for files that have already been deleted
- `split.sh` - split large files into N parts (defaults to the number of your CPU cores) to parallelize operations on them
- `ssh_dump_stats.sh` - uses SSH and `dump_stats.sh` to dump common command outputs from remote servers to a local tarball. Useful for vendor support cases
- `ssh_dump_logs.sh` - Uses SSH to dump logs from server to local text files for uploading to vendor support cases
- `ssl_get_cert.sh` - gets a remote `host:port` server's SSL cert in a format you can pipe, save and use locally, for example in Java truststores
- `ssl_verify_cert.sh` - verifies a remote SSL certificate (battle tested more feature-rich version `check_ssl_cert.pl` exists in the [Advanced Nagios Plugins](https://github.com/HariSekhon/Nagios-Plugins) repo)
- `ssl_verify_cert_by_ip.sh` - verifies SSL certificates on specific IP addresses, useful to test SSL source addresses for CDNs, such as Cloudflare Proxied sources before enabling SSL Full-Strict Mode for end-to-end, or Kubernetes ingresses (see also `curl_k8s_ingress.sh`)
- `urlencode.sh` / `urldecode.sh` - URL encode/decode quickly on the command line, in pipes etc.
- `urlopen.sh` - opens the given URL from first arg or stdin, or first URL found in a given file. Uses the system's default browser
- `vagrant_hosts.sh` - generate `/etc/hosts` output from a `Vagrantfile`
- `vagrant_total_mb.sh` - calculate the RAM committed to VMs in a `Vagrantfile`
See also [Knowledge Base notes for Linux](https://github.com/HariSekhon/Knowledge-Base/blob/main/linux.md)
and [Mac](https://github.com/HariSekhon/Knowledge-Base/blob/main/mac.md).
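
As a usage sketch for the `curl_auth.sh` wrapper described above - the endpoint and the `$TOKEN` variable name here are illustrative assumptions, check the script's `--help` for the exact environment variables it reads:

```bash
# export a token once (assumed variable name) rather than passing it on every command line
export TOKEN="<your API token>"

# curl_auth.sh feeds the credential to curl via a ram file descriptor instead of argv,
# so it never appears in the process list or OS audit logs
curl_auth.sh https://api.github.com/user/repos | jq -r '.[].full_name'
```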
### Databases
`mysql/`, `postgres/`, `sql/` and `bin/` directories:
- [sql/](https://github.com/HariSekhon/SQL-scripts) - 100+ SQL scripts for [PostgreSQL](https://www.postgresql.org/), [MySQL](https://www.mysql.com/), [Google BigQuery](https://cloud.google.com/bigquery) and [AWS Athena](https://aws.amazon.com/athena/) [CloudTrail](https://aws.amazon.com/cloudtrail/) logs integration
- `sqlite.sh` - one-touch [SQLite](https://www.sqlite.org/index.html), starts sqlite3 shell with sample 'chinook' database loaded
- `mysql*.sh` - [MySQL](https://www.mysql.com/) scripts:
- `mysql.sh` - shortens `mysql` command to connect to [MySQL](https://www.mysql.com/) by auto-populating switches from both standard environment variables like `$MYSQL_TCP_PORT`, `$DBI_USER`, `$MYSQL_PWD` (see [doc](https://dev.mysql.com/doc/refman/8.0/en/environment-variables.html)) and other common environment variables like `$MYSQL_HOST` / `$HOST`, `$MYSQL_USER` / `$USER`, `$MYSQL_PASSWORD` / `$PASSWORD`, `$MYSQL_DATABASE` / `$DATABASE`
- `mysql_foreach_table.sh` - executes a SQL query against every table, replacing `{db}` and `{table}` in each iteration eg. `select count(*) from {table}`
- `mysql_*.sh` - various scripts using `mysql.sh` for row counts, iterating each table, or outputting clean lists of databases and tables for quick scripting
- `mysqld.sh` - one-touch [MySQL](https://www.mysql.com/), boots docker container + drops in to `mysql` shell, with `/sql` scripts mounted in container for easy sourcing eg. `source /sql/<name>.sql`. Optionally loads sample 'chinook' database
- see also the [SQL Scripts](https://github.com/HariSekhon/SQL-scripts) repo for many more straight MySQL SQL scripts
- `mariadb.sh` - one-touch [MariaDB](https://mariadb.org/), boots docker container + drops in to `mysql` shell, with `/sql` scripts mounted in container for easy sourcing eg. `source /sql/<name>.sql`. Optionally loads sample 'chinook' database
- `postgres*.sh` / `psql.sh` - [PostgreSQL](https://www.postgresql.org/) scripts:
- `postgres.sh` - one-touch [PostgreSQL](https://www.postgresql.org/), boots docker container + drops in to `psql` shell, with `/sql` scripts mounted in container for easy sourcing eg. `\i /sql/<name>.sql`. Optionally loads sample 'chinook' database
- `psql.sh` - shortens `psql` command to connect to [PostgreSQL](https://www.postgresql.org/) by auto-populating switches from environment variables, using both standard postgres supported environment variables like `$PG*` (see [doc](https://www.postgresql.org/docs/12/libpq-envars.html)) as well as other common environment variables like `$POSTGRESQL_HOST` / `$POSTGRES_HOST` / `$HOST`, `$POSTGRESQL_USER` / `$POSTGRES_USER` / `$USER`, `$POSTGRESQL_PASSWORD` / `$POSTGRES_PASSWORD` / `$PASSWORD`, `$POSTGRESQL_DATABASE` / `$POSTGRES_DATABASE` / `$DATABASE` (see the usage sketch at the end of this section)
- `postgres_foreach_table.sh` - executes a SQL query against every table, replacing `{db}`, `{schema}` and `{table}` in each iteration eg. `select count(*) from {table}`
- `postgres_*.sh` - various scripts using `psql.sh` for row counts, iterating each table, or outputting clean lists of databases, schemas and tables for quick scripting
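
As a usage sketch for the database wrappers above, combining `psql.sh` with `postgres_foreach_table.sh` - the environment variable names come from the descriptions above and the query-as-argument form follows the `select count(*) from {table}` example, but check each script's `--help` for the authoritative interface:

```bash
# connection settings picked up from common environment variables (see psql.sh description above)
export POSTGRES_HOST=localhost
export POSTGRES_USER=myuser
export POSTGRES_PASSWORD=mypassword    # or use the standard PG* variables / ~/.pgpass instead
export POSTGRES_DATABASE=mydb

# drop straight into an interactive psql session without retyping switches
psql.sh

# run a templated query against every table - {db}, {schema} and {table} are substituted per iteration
postgres_foreach_table.sh 'select count(*) from {table}'
```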
### AWS - Amazon Web Services
`aws/` directory:
- [AWS](https://aws.amazon.com/) scripts - `aws_*.sh`:
- `aws_cli_create_credential.sh` - creates an AWS service account user for CI/CD or CLI with Admin permissions (or other group or policy), creates an AWS Access Key, saves a credentials CSV and even prints the shell export commands and aws credentials file config to configure your environment to start using it. Useful trick to avoid having to re-authenticate the CLI with `aws sso login` every day.
- `aws_terraform_create_credential.sh` - creates an AWS Terraform service account with Administrator permissions for Terraform Cloud or other CI/CD systems to run Terraform plan and apply, since no CI/CD systems can work with AWS SSO workflows. Saves the access key as a credentials CSV and prints the shell export commands and credentials file config as above
- `.envrc-aws` - copy to `.envrc` for [direnv](https://direnv.net/) to auto-load AWS configuration settings such as AWS Profile, Compute Region, EKS cluster kubectl context etc.
- calls `.envrc-kubernetes` to set the `kubectl` context isolated to current shell to prevent race conditions between shells and scripts caused by otherwise naively changing the global `~/.kube/config` context
- `aws_terraform_create_s3_bucket.sh` - creates a Terraform S3 bucket for storing the backend state, locks out public access, enables versioning, encryption, and locks out Power Users role and optionally any given user/group/role ARNs via a bucket policy for safety
- `aws_terraform_create_dynamodb_table.sh` - creates a Terraform locking table in DynamoDB for use with the S3 backend, plus custom IAM policy which can be applied to less privileged accounts
- `aws_terraform_create_all.sh` - runs all of the above, plus also applies the custom DynamoDB IAM policy to the user to ensure if the account is less privileged it can still get the Terraform lock (useful for GitHub Actions environment secret for a read only user to generate Terraform Plans in Pull Request without needing approval)
- `aws_terraform_iam_grant_s3_dynamodb.sh` - creates IAM policies to access any S3 buckets and DynamoDB tables with `terraform-state` or `tf-state` in their names, and attaches them to the given user. Useful for limited permissions CI/CD accounts that run Terraform Plan eg. in GitHub Actions pull requests
- `aws_account_summary.sh` - prints AWS account summary in `key = value` pairs for easy viewing / grepping of things like `AccountMFAEnabled`, `AccountAccessKeysPresent`, useful for checking whether the root account has MFA enabled and no access keys, comparing number of users vs number of MFA devices etc. (see also `check_aws_root_account.py` in [Advanced Nagios Plugins](https://github.com/HariSekhon/Nagios-Plugins))
- `aws_billing_alarm.sh` - creates a [CloudWatch](https://aws.amazon.com/cloudwatch/) billing alarm and [SNS](https://aws.amazon.com/sns/) topic with subscription to email you when you incur charges above a given threshold. This is often the first thing you want to do on an account
- `aws_budget_alarm.sh` - creates an [AWS Budgets](https://aws.amazon.com/cloudwatch/) billing alarm and [SNS](https://aws.amazon.com/sns/) topic with subscription to email you both when forecasted charges exceed 80% of your budget and when actual usage exceeds 90%. This is often the first thing you want to do on an account
- `aws_batch_stale_jobs.sh` - lists [AWS Batch](https://aws.amazon.com/batch/) jobs that are older than N hours in a given queue
- `aws_batch_kill_stale_jobs.sh` - finds and kills [AWS Batch](https://aws.amazon.com/batch/) jobs that are older than N hours in a given queue
- `aws_cloudtrails_cloudwatch.sh` - lists [Cloud Trails](https://aws.amazon.com/cloudtrail/) and their last delivery to [CloudWatch](https://aws.amazon.com/cloudwatch/features/) Logs (should be recent)
- `aws_cloudtrails_event_selectors.sh` - lists [Cloud Trails](https://aws.amazon.com/cloudtrail/) and their event selectors to check each one has at least one event selector
- `aws_cloudtrails_s3_accesslogging.sh` - lists [Cloud Trails](https://aws.amazon.com/cloudtrail/) buckets and their Access Logging prefix and target bucket. Checks [S3 access logging](https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html) is enabled
- `aws_cloudtrails_s3_kms.sh` - lists [Cloud Trails](https://aws.amazon.com/cloudtrail/) and whether their [S3](https://aws.amazon.com/s3/) buckets are [KMS](https://aws.amazon.com/kms/) secured
- `aws_cloudtrails_status.sh` - lists [Cloud Trails](https://aws.amazon.com/cloudtrail/) status - if logging, multi-region and log file validation enabled
- `aws_config_all_types.sh` - lists [AWS Config](https://aws.amazon.com/config/) recorders, checking all resource types are supported (should be true) and includes global resources (should be true)
- `aws_config_recording.sh` - lists [AWS Config](https://aws.amazon.com/config/) recorders, their recording status (should be true) and their last status (should be success)
- `aws_csv_creds.sh` - prints AWS credentials from a CSV file as shell export statements. Useful to quickly switch your shell to some exported credentials from a service account for testing permissions or pipe to upload to a CI/CD system via an API (eg. `jenkins_cred_add*.sh`, `github_actions_repo*_set_secret.sh`, `gitlab_*_set_env_vars.sh`, `circleci_*_set_env_vars.sh`, `bitbucket_*_set_env_vars.sh`, `terraform_cloud_*_set_vars.sh`, `kubectl_kv_to_secret.sh`). Supports new user and new access key csv file formats.
- `aws_codecommit_csv_creds.sh` - prints AWS [CodeCommit](https://aws.amazon.com/codecommit/) Git credentials from a CSV file as shell export statements. Similar use case and chaining as above
- `aws_ec2_list_instance_states.sh` - quickly list AWS EC2 instances, their DNS names and States in an easy to read table output
- `aws_ec2_terminate_instance_by_name.sh` - terminate an AWS EC2 instance by name
- `aws_ec2_ebs_*.sh` - AWS EC2 [EBS](https://aws.amazon.com/ebs/) scripts:
- `aws_ec2_ebs_volumes.sh` - list EC2 instances and their EBS volumes in the current region
- `aws_ec2_ebs_create_snapshot_and_wait.sh` - creates a snapshot of a given EBS volume ID and waits for it to complete with exponential backoff
- `aws_ec2_ebs_resize_and_wait.sh` - resizes an EBS volume and waits for it to complete modifying and optionally optimizing with exponential backoff
- `aws_ec2_ebs_volumes_unattached.sh` - lists unattached EBS volumes in a table format
- `aws_ecr_*.sh` - AWS [ECR](https://aws.amazon.com/ecr/) docker image management scripts:
- `aws_ecr_docker_build_push.sh` - builds a docker image and pushes it to ECR with not just the `latest` docker tag but also the current Git hashref and Git tags
- `aws_ecr_list_repos.sh` - lists ECR repos, and their docker image mutability and whether image scanning is enabled
- `aws_ecr_list_tags.sh` - lists all the tags for a given ECR docker image
- `aws_ecr_newest_image_tags.sh` - lists the tags for the given ECR docker image with the newest creation date (can use this to determine which image version to tag as `latest`)
- `aws_ecr_alternate_tags.sh` - lists all the tags for a given ECR docker `image:tag` (use arg `<image>:latest` to see what version / build hashref / date tag has been tagged as `latest`)
- `aws_ecr_tag_image.sh` - tags an ECR image with another tag without pulling and pushing it
- `aws_ecr_tag_image_by_digest.sh` - same as above but tags an ECR image found via digest (more accurate as reference by existing tag can be a moving target). Useful to recover images that have become untagged
- `aws_ecr_tag_latest.sh` - tags a given ECR docker `image:tag` as `latest` without pulling or pushing the docker image
- `aws_ecr_tag_branch.sh` - tags a given ECR `image:tag` with the current Git branch without pulling or pushing the docker image
- `aws_ecr_tag_datetime.sh` - tags a given ECR docker image with its creation date and UTC timestamp (when it was uploaded to ECR) without pulling or pushing the docker image
- `aws_ecr_tag_newest_image_as_latest.sh` - finds and tags the newest build of a given ECR docker image as `latest` without pulling or pushing the docker image
- `aws_ecr_tags_timestamps.sh` - lists all the tags and their timestamps for a given ECR docker image
- `aws_ecr_tags_old.sh` - lists tags older than N days for a given ECR docker image
- `aws_ecr_delete_old_tags.sh` - deletes tags older than N days for a given ECR docker image. Lists the image:tags to be deleted and prompts for confirmation safety
- `aws_foreach_profile.sh` - executes a templated command across all AWS named profiles configured in AWS CLIv2, replacing `{profile}` in each iteration. Combine with other scripts for powerful functionality, auditing, setup etc. eg. `aws_kube_creds.sh` to configure `kubectl` config to all EKS clusters in all environments
- `aws_foreach_region.sh` - executes a templated command against each AWS region enabled for the current account, replacing `{region}` in each iteration. Combine with AWS CLI or scripts to find resources across regions (see the sketch at the end of this section)
- `aws_iam_*.sh` - AWS [IAM](https://aws.amazon.com/iam/) scripts:
- `aws_iam_password_policy.sh` - prints [AWS password policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_passwords_account-policy.html) in `key = value` pairs for easy viewing / grepping (used by `aws_harden_password_policy.sh` before and after to show the differences)
- `aws_iam_harden_password_policy.sh` - strengthens [AWS password policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_passwords_account-policy.html) according to [CIS Foundations Benchmark](https://d1.awsstatic.com/whitepapers/compliance/AWS_CIS_Foundations_Benchmark.pdf) recommendations
- `aws_iam_replace_access_key.sh` - replaces the non-current IAM access key (Inactive, Not Used, longer time since used, or an explicitly given key), outputting the new key as shell export statements (useful for piping to the same tools listed for `aws_csv_creds.sh` above)
- `aws_iam_policies_attached_to_users.sh` - finds [AWS IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage.html) directly attached to users (anti-best practice) instead of groups
- `aws_iam_policies_granting_full_access.sh` - finds [AWS IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage.html) granting full access (anti-best practice)
- `aws_iam_policies_unattached.sh` - lists unattached [AWS IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage.html)
- `aws_iam_policy_attachments.sh` - finds all users, groups and roles where a given IAM policy is attached, so that you can remove all these references in your Terraform code and avoid this error `Error: error deleting IAM policy arn:aws:iam::***:policy/mypolicy: DeleteConflict: Cannot delete a policy attached to entities.`
- `aws_iam_policy_delete.sh` - deletes an IAM policy, first handling the prerequisite steps of deleting all prior versions and detaching it from all users, groups and roles
- `aws_iam_generate_credentials_report_wait.sh` - generates an AWS IAM [credentials report](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_getting-report.html)
- `aws_iam_users.sh` - list your IAM users
- `aws_iam_users_access_key_age.sh` - prints AWS users [access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html) status and age (see also `aws_users_access_key_age.py` in [DevOps Python tools](https://github.com/HariSekhon/DevOps-Python-tools) which can filter by age and status)
- `aws_iam_users_access_key_age_report.sh` - prints AWS users [access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html) status and age using a bulk credentials report (faster for many users)
- `aws_iam_users_access_key_last_used.sh` - prints AWS users [access keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html) last used date
- `aws_iam_users_access_key_last_used_report.sh` - same as above using bulk credentials report (faster for many users)
- `aws_iam_users_last_used_report.sh` - lists AWS users password/access keys last used dates
- `aws_iam_users_mfa_active_report.sh` - lists AWS users password enabled and [MFA](https://aws.amazon.com/iam/features/mfa/) enabled status
- `aws_iam_users_without_mfa.sh` - lists AWS users with password enabled but no MFA
- `aws_iam_users_mfa_serials.sh` - lists AWS users [MFA](https://aws.amazon.com/iam/features/mfa/) serial numbers (differentiates Virtual vs Hardware MFAs)
- `aws_iam_users_pw_last_used.sh` - lists AWS users and their password last used date
- `aws_ip_ranges.sh` - get all AWS IP ranges for a given Region and/or Service using the IP range API
- `aws_kms_key_rotation_enabled.sh` - lists [AWS KMS](https://aws.amazon.com/kms/) keys and whether they have key rotation enabled
- `aws_kube_creds.sh` - auto-loads all [AWS EKS](https://aws.amazon.com/eks/) clusters credentials in the current --profile and --region so your kubectl is ready to rock on AWS
- `aws_kubectl.sh` - runs kubectl commands safely fixed to a given [AWS EKS](https://aws.amazon.com/eks/) cluster using config isolation to avoid concurrency race conditions
- `aws_logs_*.sh` - some useful log queries in last N hours (24 hours by default):
- `aws_logs_batch_jobs.sh` - lists AWS Batch job submission requests and their callers
- `aws_logs_ec2_spot.sh` - lists AWS EC2 Spot fleet creation requests, their caller and first tag value for origin hint
- `aws_logs_ecs_tasks.sh` - lists AWS ECS task run requests, their callers and job definitions
- `aws_meta.sh` - AWS [EC2 Metadata API](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-instance-metadata.html) query shortcut. See also the official [ec2-metadata](https://aws.amazon.com/code/ec2-instance-metadata-query-tool/) shell script with more features
- `aws_nat_gateways_public_ips.sh` - lists the public IPs of all NAT gateways. Useful to give to clients to permit through firewalls for webhooks or similar calls
- `aws_rds_open_port_to_my_ip.sh` - adds a security group to an RDS DB instance to open its native database SQL port to your public IP address
- `aws_rds_get_version.sh` - quickly retrieve the version of an RDS database to know which JDBC jar version to download using `install/download_*_jdbc.sh` when setting up connections
- `aws_route53_check_ns_records.sh` - checks AWS [Route 53](https://aws.amazon.com/route53/) public hosted zones' NS servers are delegated in the public DNS hierarchy and that there are no rogue NS servers delegated which don't match the Route 53 zone configuration
- `aws_sso_env_creds.sh` - retrieves AWS SSO session credentials in the format of environment export commands for copying to other systems like Terraform Cloud
- `aws_s3_bucket.sh` - creates an S3 bucket, blocks public access, enables versioning, encryption, and optionally locks out any given user/group/role ARNs via a bucket policy for safety (eg. to stop Power Users accessing a sensitive bucket like Terraform state)
- `aws_s3_buckets_block_public_access.sh` - blocks public access to one or more given S3 buckets or files containing bucket names, one per line
- `aws_s3_account_block_public_access.sh` - blocks S3 public access at the AWS account level
- `aws_s3_check_buckets_public_blocked.sh` - iterates each S3 bucket and checks it has public access fully blocked via policy. Parallelized for speedup
- `aws_s3_check_account_public_blocked.sh` - checks S3 public access is blocked at the AWS account level
- `aws_s3_sync.sh` - syncs multiple AWS S3 URLs from file lists. Validates S3 URLs, that source and destination list lengths match, and optionally that path suffixes match, to prevent off-by-one human errors spraying data all over the wrong destination paths
- `aws_s3_access_logging.sh` - lists [AWS S3](https://aws.amazon.com/s3/) buckets and their [access logging](https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html) status
- `aws_spot_when_terminated.sh` - executes commands when the [AWS EC2](https://aws.amazon.com/ec2/) instance running this script is notified of [Spot Termination](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-spot-instances.html), acts as a latch mechanism that can be set any time after boot
- `aws_sqs_check.sh` - sends a test message to an [AWS SQS](https://aws.amazon.com/sqs/) queue, retrieves it to check and then deletes it via the receipt handle id
- `aws_sqs_delete_message.sh` - deletes 1-10 messages from a given [AWS SQS](https://aws.amazon.com/sqs/) queue (to help clear out test messages)
- `aws_ssm_put_param.sh` - reads a value from a command line argument or non-echo prompt and saves it to AWS [Systems Manager Parameter Store](https://docs.aws.amazon.com/systems-manager/latest/userguide/what-is-systems-manager.html). Useful for uploading a password without exposing it on your screen
- `aws_secret*.sh` - AWS [Secrets Manager](https://aws.amazon.com/secrets-manager/) scripts:
- `aws_secret_list.sh` - returns the list of secrets, one per line
- `aws_secret_add.sh` - reads a value from a command line argument or non-echo prompt and saves it to Secrets Manager. Useful for uploading a password without exposing it on your screen
- `aws_secret_add_binary.sh` - base64 encodes a given file's contents and saves it to Secrets Manager as a binary secret. Useful for uploading things like QR code screenshots for sharing MFA to recovery admin accounts
- `aws_secret_update.sh` - reads a value from a command line argument or non-echo prompt and updates a given Secrets Manager secret. Useful for updating a password without exposing it on your screen
- `aws_secret_update_binary.sh` - base64 encodes a given file's contents and updates a given Secrets Manager secret. Useful for updating a QR code screenshot for a root account
- `aws_secret_get.sh` - gets a secret value for a given secret from Secrets Manager, retrieving either a secure string or secure binary depending on which is available
- `eksctl_cluster.sh` - downloads [eksctl](https://eksctl.io/) and creates an [AWS EKS](https://aws.amazon.com/eks/) Kubernetes cluster
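
As an illustration of the S3 public-access blocking that the `aws_s3_*block_public_access.sh` scripts above automate, here is a minimal sketch of the underlying AWS CLI calls (a sketch only, not the scripts themselves; the bucket name is a placeholder):

```shell
# Minimal sketch of the AWS CLI calls behind the aws_s3_*block_public_access.sh scripts above
# (not the scripts themselves) - the bucket name is a placeholder
account_id="$(aws sts get-caller-identity --query Account --output text)"

# block public access account-wide
aws s3control put-public-access-block \
    --account-id "$account_id" \
    --public-access-block-configuration \
    'BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true'

# block public access on a single bucket
aws s3api put-public-access-block \
    --bucket "my-example-bucket" \
    --public-access-block-configuration \
    'BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true'
```
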
See also [Knowledge Base notes for AWS](https://github.com/HariSekhon/Knowledge-Base/blob/main/aws.md).
### GCP - Google Cloud Platform
`gcp/` directory:
- [Google Cloud](https://cloud.google.com/) scripts - `gcp_*.sh` / `gce_*.sh` / `gke_*.sh` / `gcr_*.sh` / `bigquery_*.sh`:
- `.envrc-gcp` - copy to `.envrc` for [direnv](https://direnv.net/) to auto-load GCP configuration settings such as Project, Region, Zone, GKE cluster kubectl context or any other GCloud SDK settings to shorten `gcloud` commands. Applies to the local shell environment only to avoid race conditions caused by naively changing the global gcloud config at `~/.config/gcloud/active_config`
- calls `.envrc-kubernetes` to set the `kubectl` context isolated to current shell to prevent race conditions between shells and scripts caused by otherwise naively changing the global `~/.kube/config` context
- `gcp_terraform_create_credential.sh` - creates a service account for [Terraform](https://www.terraform.io/) with full permissions, creates and downloads a credential key json and even prints the `export GOOGLE_CREDENTIALS` command to configure your environment to start using Terraform immediately. Run once for each project and combine with [direnv](https://direnv.net/) for fast easy management of multiple GCP projects
- `gcp_ansible_create_credential.sh` - creates an [Ansible](https://www.ansible.com/) service account with permissions on the current project, creates and downloads a credential key json and prints the environment variable to immediately use it
- `gcp_cli_create_credential.sh` - creates a GCloud SDK CLI service account with full owner permissions to all projects, creates and downloads a credential key json and even prints the `export GOOGLE_CREDENTIALS` command to configure your environment to start using it. Avoids having to re-authenticate with `gcloud auth login` every day.
- `gcp_spinnaker_create_credential.sh` - creates a [Spinnaker](https://spinnaker.io/) service account with permissions on the current project, creates and downloads a credential key json and even prints the Halyard CLI configuration commands to use it
- `gcp_info.sh` - huge [Google Cloud](https://cloud.google.com/) inventory of deployed resources within the current project - Cloud SDK info plus all of the following (detects which services are enabled to query):
- `gcp_info_compute.sh` - [GCE](https://cloud.google.com/compute/) Virtual Machine instances, [App Engine](https://cloud.google.com/appengine) instances, [Cloud Functions](https://cloud.google.com/functions), [GKE](https://cloud.google.com/kubernetes-engine) clusters, all [Kubernetes](https://kubernetes.io/) objects across all GKE clusters (see `kubernetes_info.sh` below for more details)
- `gcp_info_storage.sh` - [Cloud SQL](https://cloud.google.com/sql) info below, plus: [Cloud Storage](https://cloud.google.com/storage) Buckets, [Cloud Filestore](https://cloud.google.com/filestore), [Cloud Memorystore Redis](https://cloud.google.com/memorystore), [BigTable](https://cloud.google.com/bigtable) clusters and instances, [Datastore](https://cloud.google.com/datastore) indexes
- `gcp_info_cloud_sql.sh` - [Cloud SQL](https://cloud.google.com/sql) instances, whether their backups are enabled, and all databases on each instance
- `gcp_info_cloud_sql_databases.sh` - lists databases inside each [Cloud SQL](https://cloud.google.com/sql) instance. Included in `gcp_info_cloud_sql.sh`
- `gcp_info_cloud_sql_backups.sh` - lists backups for each [Cloud SQL](https://cloud.google.com/sql) instance with their dates and status. Not included in `gcp_info_cloud_sql.sh` for brevity. See also `gcp_sql_export.sh` further down for more durable backups to [GCS](https://cloud.google.com/storage)
- `gcp_info_cloud_sql_users.sh` - lists users for each running [Cloud SQL](https://cloud.google.com/sql) instance. Not included in `gcp_info_cloud_sql.sh` for brevity but useful to audit users
- `gcp_info_networking.sh` - VPC Networks, Addresses, Proxies, Subnets, Routers, Routes, VPN Gateways, VPN Tunnels, Reservations, Firewall rules, Forwarding rules, [Cloud DNS](https://cloud.google.com/dns) managed zones and verified domains
- `gcp_info_bigdata.sh` - [Dataproc](https://cloud.google.com/dataproc) clusters and jobs in all regions, [Dataflow](https://cloud.google.com/dataflow) jobs in all regions, [PubSub](https://cloud.google.com/pubsub) messaging topics, [Cloud IOT](https://cloud.google.com/iot-core) registries in all regions
- `gcp_info_tools.sh` - [Cloud Source Repositories](https://cloud.google.com/source-repositories), [Cloud Builds](https://cloud.google.com/cloud-build), [Container Registry](https://cloud.google.com/container-registry) images across all major repos (`gcr.io`, `us.gcr.io`, `eu.gcr.io`, `asia.gcr.io`), [Deployment Manager](https://cloud.google.com/deployment-manager) deployments
- `gcp_info_auth_config.sh` - Auth Configurations, Organizations & Current Config
- `gcp_info_projects.sh` - Projects names and IDs
- `gcp_info_services.sh` - Services & APIs enabled
- `gcp_service_apis.sh` - lists all available [GCP](https://cloud.google.com/) Services, APIs and their states (enabled/disabled), and provides `is_service_enabled()` function used throughout the adjacent scripts to avoid errors and only show relevant enabled services
- `gcp_info_accounts_secrets.sh` - [IAM](https://cloud.google.com/iam) Service Accounts, [Secret Manager](https://cloud.google.com/secret-manager) secrets
- `gcp_info_all_projects.sh` - same as above but for all detected projects
- `gcp_foreach_project.sh` - executes a templated command across all GCP projects, replacing `{project_id}` and `{project_name}` in each iteration (used by `gcp_info_all_projects.sh` to call `gcp_info.sh`) - see the sketch at the end of this section
- `gcp_find_orphaned_disks.sh` - lists orphaned disks across one or more GCP projects (not attached to any compute instance)
- `gcp_secret*.sh` - Google [Secret Manager](https://cloud.google.com/secret-manager) scripts:
- `gcp_secret_add.sh` - reads a value from a command line argument or non-echo prompt and saves it to GCP Secrets Manager. Useful for uploading a password without exposing it on your screen
- `gcp_secret_add_binary.sh` - uploads a binary file to GCP Secrets Manager by base64 encoding it first. Useful for uploading things like QR code screenshots for sharing MFA to recovery admin accounts
- `gcp_secret_update.sh` - reads a value from a command line argument or non-echo prompt and updates a given GCP Secrets Manager secret. Useful for updating a password without exposing it on your screen
- `gcp_secret_get.sh` - finds the latest version of a given GCP Secret Manager secret and returns its value. Used by adjacent scripts
- `gcp_secret_label_k8s.sh` - labels a given existing GCP secret with the current kubectl cluster name and namespace for later use by `gcp_secrets_to_kubernetes.sh`
- `gcp_secrets_to_kubernetes.sh` - loads GCP secrets to Kubernetes secrets in a 1-to-1 mapping. Can specify a list of secrets or auto-loads all GCP secrets with labels `kubernetes-cluster` and `kubernetes-namespace` matching the current `kubectl` context (`kcd` to the right namespace first, see `.bash.d/kubernetes`). See also `kubernetes_get_secret_values.sh` to debug the actual values that got loaded. See also [Sealed Secrets](https://github.com/bitnami-labs/sealed-secrets) / [External Secrets](https://external-secrets.io/) in my [Kubernetes repo](https://github.com/HariSekhon/Kubernetes-configs)
- `gcp_secrets_to_kubernetes_multipart.sh` - creates a Kubernetes secret from multiple GCP secrets (used to put `private.pem` and `public.pem` into the same secret to appear as files on volume mounts for apps in pods to use). See also [Sealed Secrets](https://github.com/bitnami-labs/sealed-secrets) / [External Secrets](https://external-secrets.io/) in my [Kubernetes repo](https://github.com/HariSekhon/Kubernetes-configs)
- `gcp_secrets_labels.sh` - lists GCP Secrets and their labels, one per line suitable for quick views or shell pipelines
- `gcp_secrets_update_lable.sh` - updates all GCP secrets in current project matching label key=value with a new label value
- `gcp_service_account_credential_to_secret.sh` - creates GCP service account and exports a credential key to GCP Secret Manager (useful to stage or combine with `gcp_secrets_to_kubernetes.sh`)
- `gke_*.sh` - Google [Kubernetes Engine](https://cloud.google.com/kubernetes-engine) scripts
- `gke_kube_creds.sh` - auto-loads all GKE clusters credentials in the current / given / all projects so your kubectl is ready to rock on GCP
- `gke_kubectl.sh` - runs kubectl commands safely fixed to a given GKE cluster using config isolation to avoid concurrency race conditions
- `gke_firewall_rule_cert_manager.sh` - creates a GCP firewall rule for a given GKE cluster's masters to access [Cert Manager](https://cert-manager.io/) admission webhook (auto-determines the master cidr, network and target tags)
- `gke_firewall_rule_kubeseal.sh` - creates a GCP firewall rule for a given GKE cluster's masters to access [Sealed Secrets](https://github.com/bitnami-labs/sealed-secrets) controller for `kubeseal` to work (auto-determines the master cidr, network and target tags)
- `gke_nodepool_nodes.sh` - lists all nodes in a given nodepool on the current GKE cluster via kubectl labels (fast)
- `gke_nodepool_nodes2.sh` - same as above via GCloud SDK (slow, iterates instance groups)
- `gke_nodepool_taint.sh` - taints/untaints all nodes in a given GKE nodepool on the current cluster (see `kubectl_node_taints.sh` for a quick way to see taints)
- `gke_nodepool_drain.sh` - drains all nodes in a given nodepool (to decommission or rebuild the node pool, for example with different taints)
- `gke_persistent_volumes_disk_mappings.sh` - lists GKE kubernetes persistent volumes to GCP persistent disk names, along with PVC and namespace, useful when investigating, resizing PVs etc.
- `gcr_*.sh` - Google [Container Registry](https://cloud.google.com/container-registry) scripts:
- `gcr_list_tags.sh` - lists all the tags for a given GCR docker image
- `gcr_newest_image_tags.sh` - lists the tags for the given GCR docker image with the newest creation date (can use this to determine which image version to tag as `latest`)
- `gcr_alternate_tags.sh` - lists all the tags for a given GCR docker `image:tag` (use arg `<image>:latest` to see what version / build hashref / date tag has been tagged as `latest`)
- `gcr_tag_latest.sh` - tags a given GCR docker `image:tag` as `latest` without pulling or pushing the docker image
- `gcr_tag_branch.sh` - tags a given GCR docker `image:tag` with the current Git branch without pulling or pushing the docker image
- `gcr_tag_datetime.sh` - tags a given GCR docker image with its creation date and UTC timestamp (when it was uploaded or created by [Google Cloud Build](https://cloud.google.com/cloud-build)) without pulling or pushing the docker image
- `gcr_tag_newest_image_as_latest.sh` - finds and tags the newest build of a given GCR docker image as `latest` without pulling or pushing the docker image
- `gcr_tags_timestamps.sh` - lists all the tags and their timestamps for a given GCR docker image
- `gcr_tags_old.sh` - lists tags older than N days for a given GCR docker image
- `gcr_delete_old_tags.sh` - deletes tags older than N days for a given GCR docker image. Lists the image:tags to be deleted and prompts for confirmation as a safety check
- see also [cloudbuild.yaml](https://github.com/HariSekhon/Templates/blob/master/cloudbuild.yaml) in the [Templates](https://github.com/HariSekhon/Templates) repo
- CI/CD on GCP - trigger Google Cloud Build and GKE Kubernetes deployments from orthogonal CI/CD systems like Jenkins / TeamCity:
- `gcp_ci_build.sh` - script template for CI/CD to trigger Google Cloud Build to build docker container image with extra datetime and latest tagging
- `gcp_ci_deploy_k8s.sh` - script template for CI/CD to deploy GCR docker image to GKE Kubernetes using Kustomize
- `gce_*.sh` - Google [Compute Engine](https://cloud.google.com/compute/) scripts:
- `gce_foreach_vm.sh` - run a command for each GCP VM instance matching the given name/ip regex in the current GCP project
- `gce_host_ips.sh` - prints the IPs and hostnames of all or a regex match of GCE VMs for use in /etc/hosts
- `gce_ssh.sh` - Runs `gcloud compute ssh` to a VM while auto-determining its zone first to override any inherited zone config and make it easier to script iterating through VMs
- `gcs_ssh_keyscan.sh` - SSH keyscans all the GCE VMs returned from the above `gce_host_ips.sh` script and adds them to `~/.ssh/known_hosts`
- `gce_meta.sh` - simple script to query the GCE metadata API from within Virtual Machines
- `gce_when_preempted.sh` - GCE VM preemption latch script - can be executed any time to set one or more commands to execute upon preemption
- `gce_is_preempted.sh` - GCE VM return true/false if preempted, callable from other scripts
- `gce_instance_service_accounts.sh` - lists GCE VM instance names and their service accounts
- `gcp_firewall_disable_default_rules.sh` - disables those lax GCP default network "allow all" firewall rules
- `gcp_firewall_risky_rules.sh` - lists risky GCP firewall rules that are enabled and allow traffic from 0.0.0.0/0
- `gcp_sql_*.sh` - [Cloud SQL](https://cloud.google.com/sql) scripts:
- `gcp_sql_backup.sh` - creates Cloud SQL backups
- `gcp_sql_export.sh` - creates Cloud SQL exports to [GCS](https://cloud.google.com/storage)
- `gcp_sql_enable_automated_backups.sh` - enable automated daily Cloud SQL backups
- `gcp_sql_enable_point_in_time_recovery.sh` - enable point-in-time recovery with write-ahead logs
- `gcp_sql_proxy.sh` - boots a [Cloud SQL Proxy](https://cloud.google.com/sql/docs/postgres/sql-proxy) to all Cloud SQL instances for fast convenient direct `psql` / `mysql` access via local sockets. Installs Cloud SQL Proxy if necessary
- `gcp_sql_running_primaries.sh` - lists primary running Cloud SQL instances
- `gcp_sql_service_accounts.sh` - lists Cloud SQL instance service accounts. Useful for copying to [IAM](https://cloud.google.com/iam) to grant permissions (eg. Storage Object Creator for SQL export backups to GCS)
- `gcp_sql_create_readonly_service_account.sh` - creates a service account with read-only permissions to Cloud SQL eg. to run export backups to GCS
- `gcp_sql_grant_instances_gcs_object_creator.sh` - grants minimal GCS objectCreator permission on a bucket to primary Cloud SQL instances for exports
- `gcp_cloud_schedule_sql_exports.sh` - creates Google [Cloud Scheduler](https://cloud.google.com/scheduler) jobs to trigger a [Cloud Function](https://cloud.google.com/functions) via [PubSub](https://cloud.google.com/pubsub) to run [Cloud SQL](https://cloud.google.com/sql) exports to [GCS](https://cloud.google.com/storage) for all [Cloud SQL](https://cloud.google.com/sql) instances in the current GCP project
- the Python [GCF](https://cloud.google.com/functions) function is in the [DevOps Python tools](https://github.com/HariSekhon/DevOps-Python-tools) repo
- `bigquery_*.sh` - [BigQuery](https://cloud.google.com/bigquery) scripts:
- `bigquery_list_datasets.sh` - lists BigQuery datasets in the current GCP project
- `bigquery_list_tables.sh` - lists BigQuery tables in a given dataset
- `bigquery_list_tables_all_datasets.sh` - lists tables for all datasets in the current GCP project
- `bigquery_foreach_dataset.sh` - executes a templated command for each dataset
- `bigquery_foreach_table.sh` - executes a templated command for each table in a given dataset
- `bigquery_foreach_table_all_datasets.sh` - executes a templated command for each table in each dataset in the current GCP project
- `bigquery_table_row_count.sh` - gets the row count for a given table
- `bigquery_tables_row_counts.sh` - gets the row counts for all tables in a given dataset
- `bigquery_tables_row_counts_all_datasets.sh` - gets the row counts for all tables in all datasets in the current GCP project
- `bigquery_generate_query_biggest_tables_across_datasets_by_row_count.sh` - generates a BigQuery SQL query to find the top 10 biggest tables by row count
- `bigquery_generate_query_biggest_tables_across_datasets_by_size.sh` - generates a BigQuery SQL query to find the top 10 biggest tables by size
- see also the [SQL Scripts](https://github.com/HariSekhon/SQL-scripts) repo for many more straight BigQuery SQL scripts
- GCP [IAM](https://cloud.google.com/iam) scripts:
- `gcp_service_account*.sh`:
- `gcp_service_account_credential_to_secret.sh` - creates GCP service account and exports a credential key to GCP Secret Manager (useful to stage or combine with `gcp_secrets_to_kubernetes.sh`)
- `gcp_service_accounts_credential_keys.sh` - lists all service account credential keys and expiry dates, can `grep 9999-12-31T23:59:59Z` to find non-expiring keys
- `gcp_service_accounts_credential_keys_age.sh` - lists all service account credential keys age in days
- `gcp_service_accounts_credential_keys_expired.sh` - lists expired service account credential keys that should be removed and recreated if needed
- `gcp_service_account_members.sh` - lists all members and roles authorized to use any service accounts. Useful for finding GKE Workload Identity mappings
- `gcp_iam_*.sh`:
- `gcp_iam_roles_in_use.sh` - lists GCP IAM roles in use in the current or all projects
- `gcp_iam_identities_in_use.sh` - lists GCP IAM identities (users/groups/serviceAccounts) in use in the current or all projects
- `gcp_iam_roles_granted_to_identity.sh` - lists GCP IAM roles granted to identities matching the regex (users/groups/serviceAccounts) in the current or all projects
- `gcp_iam_roles_granted_too_widely.sh` - lists GCP IAM roles which have been granted to allAuthenticatedUsers or even worse allUsers (unauthenticated) in one or all projects
- `gcp_iam_roles_with_direct_user_grants.sh` - lists GCP IAM roles which have been granted directly to users in violation of best-practice group-based management
- `gcp_iam_serviceaccount_members.sh` - lists members with permissions to use each GCP service account
- `gcp_iam_serviceaccounts_without_permissions.sh` - finds service accounts without IAM permissions, useful to detect obsolete service accounts after a 90 day unused permissions clean out
- `gcp_iam_workload_identities.sh` - lists GKE Workload Identity integrations, uses `gcp_iam_serviceaccount_members.sh`
- `gcp_iam_users_granted_directly.sh` - lists GCP IAM users which have been granted roles directly in violation of best-practice group-based management
- `gcs_bucket_project.sh` - finds the GCP project that a given bucket belongs to using the GCP Storage API
- `gcs_curl_file.sh` - retrieves a GCS file's contents from a given bucket and path using the GCP Storage API. Useful for starting shell pipelines or being called from other scripts
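
To illustrate the `{project_id}` templating used by `gcp_foreach_project.sh` and the other `*_foreach_*` scripts above, here is a minimal sketch of the idea (not the actual script; the command template is just an example):

```shell
# Minimal sketch of the {project_id} templating idea behind gcp_foreach_project.sh
# (not the actual script) - the command template below is just an example
cmd_template='gcloud compute instances list --project {project_id}'

gcloud projects list --format='value(projectId)' |
while read -r project_id; do
    echo "# project: $project_id" >&2
    eval "${cmd_template//\{project_id\}/$project_id}"
done
```
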
See also [Knowledge Base notes for GCP](https://github.com/HariSekhon/Knowledge-Base/blob/main/gcp.md).
### Kubernetes
`kubernetes/` directory:
- `.envrc-kubernetes` - copy to `.envrc` for [direnv](https://direnv.net/) to auto-load the right Kubernetes `kubectl` context isolated to current shell to prevent race conditions between shells and scripts caused by otherwise naively changing the global `~/.kube/config` context
- `aws/eksctl_cluster.sh` - quickly spins up an [AWS EKS](https://aws.amazon.com/eks/) cluster using `eksctl` with some sensible defaults
- `kubernetes_info.sh` - huge [Kubernetes](https://kubernetes.io/) inventory listing of deployed resources across all namespaces in the current cluster / kube context:
- cluster-info
- master component statuses
- nodes
- namespaces
- deployments, replicasets, replication controllers, statefulsets, daemonsets, horizontal pod autoscalers
- storage classes, persistent volumes, persistent volume claims
- service accounts, resource quotas, network policies, pod security policies
- container images running
- container images running counts descending
- pods (might be too much detail if you have high replica counts, so done last, comment out if you're sure nobody has deployed pods outside deployments)
- `kubectl.sh` - runs kubectl commands safely fixed to a given context using config isolation to avoid concurrency race conditions
- `kubectl_diff_apply.sh` - generates a kubectl diff and prompts to apply
- `kustomize_diff_apply.sh` - runs Kustomize build, precreates any namespaces, shows a kubectl diff of the proposed changes, and prompts to apply
- `kustomize_diff_branch.sh` - runs Kustomize build against the current and target base branch for current or all given directories, then shows the diff for each directory. Useful to detect differences when refactoring, such as switching to tagged bases
- `kubectl_create_namespaces.sh` - creates any namespaces in yaml files or stdin, a prerequisite for a diff on a blank install, used by adjacent scripts for safety
- `kubernetes_check_objects_namespaced.sh` - checks Kubernetes yaml(s) for objects which aren't explicitly namespaced, which can easily result in deployments to the wrong namespace. Reads the API resources from your current Kubernetes cluster and if successful excludes cluster-wide objects
- `kustomize_check_objects_namespaced.sh` - checks Kustomize build yaml output for objects which aren't explicitly namespaced (uses above script)
- `kubectl_deployment_pods.sh` - gets the pod names with their unpredictable suffixes for a given deployment by querying the deployment's selector labels and then querying pods that match those labels
- `kubectl_get_all.sh` - finds all namespaced Kubernetes objects and requests them for the current or given namespace. Useful because `kubectl get all` misses a lot of object types
- `kubectl_get_annotation.sh` - finds a given type of object with a given annotation
- `kubectl_restart.sh` - restarts all or filtered deployments/statefulsets in the current or given namespace. Useful when debugging or clearing application problems
- `kubectl_logs.sh` - tails all containers in all pods or filtered pods in the current or given namespace. Useful when debugging a distributed set of pods in live testing
- `kubectl_kv_to_secret.sh` - creates a Kubernetes secret from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `kubectl_secret_values.sh` - prints the keys and base64 decoded values within a given Kubernetes secret for quick debugging of Kubernetes secrets. See also: `gcp_secrets_to_kubernetes.sh`
- `kubectl_secrets_download.sh` - downloads all secrets in current or given namespace to local files of the same name, useful as a backup before migrating to Sealed Secrets
- `kubernetes_secrets_compare_gcp_secret_manager.sh` - compares each Kubernetes secret to the corresponding secret in GCP Secret Manager. Useful to safety check GCP Secret Manager values align before enabling [External Secrets](https://external-secrets.io/latest/) to replace them
- `kubernetes_secret_to_external_secret.sh` - generates an [External Secret](https://external-secrets.io/latest/) from an existing Kubernetes secret
- `kubernetes_secrets_to_external_secrets.sh` - generates [External Secrets](https://external-secrets.io/latest/) from all existing Kubernetes secrets found in the current or given namespace
- `kubernetes_secret_to_sealed_secret.sh` - generates a [Bitnami Sealed Secret](https://github.com/bitnami-labs/sealed-secrets) from an existing Kubernetes secret
- `kubernetes_secrets_to_sealed_secrets.sh` - generates [Bitnami Sealed Secrets](https://github.com/bitnami-labs/sealed-secrets) from all existing Kubernetes secrets found in the current or given namespace
- `kubectl_secrets_annotate_to_be_sealed.sh` - annotates secrets in current or given namespace to allow being overwritten by Sealed Secrets (useful to sync ArgoCD health)
- `kubectl_secrets_not_sealed.sh` - finds secrets with no SealedSecret ownerReferences
- `kubectl_secrets_to_be_sealed.sh` - finds secrets pending overwrite by Sealed Secrets with the managed annotation
- `kubernetes_foreach_context.sh` - executes a command across all kubectl contexts, replacing `{context}` in each iteration (skips lab contexts `docker` / `minikube` / `minishift` to avoid hangs since they're often offline)
- `kubernetes_foreach_namespace.sh` - executes a command across all kubernetes namespaces in the current cluster context, replacing `{namespace}` in each iteration
- Can be chained with `kubernetes_foreach_context.sh` and useful when combined with `gcp_secrets_to_kubernetes.sh` to load all secrets from GCP to Kubernetes for the current cluster, or combined with `gke_kube_creds.sh` and `kubernetes_foreach_context.sh` for all clusters!
- `kubernetes_api.sh` - finds Kubernetes API and runs your curl arguments against it, auto-getting authorization token and auto-populating OAuth authentication header
- `kubernetes_autoscaler_release.sh` - finds the latest Kubernetes Autoscaler release that matches your local Kubernetes cluster version using kubectl and the GitHub API. Useful for quickly finding the image override version for `eks-cluster-autoscaler-kustomization.yaml` in the [Kubernetes configs](https://github.com/HariSekhon/Kubernetes-configs) repo
- `kubernetes_etcd_backup.sh` - creates a timestamped backup of the Kubernetes Etcd database for a kubeadm cluster
- `kubernetes_delete_stuck_namespace.sh` - to forcibly delete those pesky kubernetes namespaces of 3rd party apps like Knative that get stuck and hang indefinitely on the finalizers during deletion
- `kubeadm_join_cmd.sh` - outputs `kubeadm join` command (generates new token) to join an existing Kubernetes cluster (used in [vagrant kubernetes](https://github.com/HariSekhon/DevOps-Bash-tools/tree/master/vagrant/kubernetes) provisioning scripts)
- `kubeadm_join_cmd2.sh` - outputs `kubeadm join` command manually (calculates cert hash + generates new token) to join an existing Kubernetes cluster
- `kubectl_exec.sh` - finds and execs to the first Kubernetes pod matching the given name regex, optionally specifying the container name regex to exec to, and shows the full generated `kubectl exec` command line for clarity (see the sketch at the end of this section)
- `kubectl_exec2.sh` - finds and execs to the first Kubernetes pod matching given pod filters, optionally specifying the container to exec to, and shows the full generated `kubectl exec` command line for clarity
- `kubectl_pods_per_node.sh` - lists number of pods per node sorted descending
- `kubectl_pods_important.sh` - lists important pods and their nodes to check on scheduling
- `kubectl_pods_colocated.sh` - lists pods from deployments/statefulsets that are colocated on the same node
- `kubectl_node_labels.sh` - lists nodes and their labels, one per line, easier to read visually or pipe in scripting
- `kubectl_pods_running_with_labels.sh` - lists running pods with labels matching key=value pair arguments
- `kubectl_node_taints.sh` - lists nodes and their taints
- `kubectl_jobs_stuck.sh` - finds Kubernetes jobs stuck for hours or days with no completions
- `kubectl_jobs_delete_stuck.sh` - prompts for confirmation to delete stuck Kubernetes jobs found by script above
- `kubectl_images.sh` - lists Kubernetes container images running on the current cluster
- `kubectl_image_counts.sh` - lists Kubernetes container images running counts sorted descending
- `kubectl_image_deployments.sh` - lists which deployments, statefulsets or daemonsets container images belong to. Useful to find which deployment, statefulset or daemonset to upgrade to replace a container image eg. when replacing the deprecated k8s.gcr.io registry with registry.k8s.io
- `kubectl_pod_count.sh` - lists Kubernetes pods total running count
- `kubectl_pod_labels.sh` - lists Kubernetes pods and their labels, one label per line for easier shell script piping for further actions
- `kubectl_pod_ips.sh` - lists Kubernetes pods and their pod IP addresses
- `kubectl_container_count.sh` - lists Kubernetes containers total running count
- `kubectl_container_counts.sh` - lists Kubernetes containers running counts by name sorted descending
- `kubectl_pods_dump_*.sh` - dump stats / logs / jstacks from all pods matching a given regex and namespace to txt files for support debugging
- `kubectl_pods_dump_stats.sh` - dump stats
- `kubectl_pods_dump_logs.sh` - dump logs
- `kubectl_pods_dump_jstacks.sh` - dump Java jstacks
- `kubectl_pods_dump_all.sh` - calls the above `kubectl_pods_dump_*.sh` scripts for N iterations with a given interval
- `kubectl_empty_namespaces.sh` - finds namespaces without any of the usual objects using `kubectl get all`
- `kubectl_delete_empty_namespaces.sh` - removes empty namespaces, uses `kubectl_empty_namespaces.sh`
- `kubectl_<image>.sh` - quick launch one-off pods for interactive debugging in Kubernetes
- `kubectl_alpine.sh`
- `kubectl_busybox.sh`
- `kubectl_curl.sh`
- `kubectl_dnsutils.sh`
- `kubectl_gcloud_sdk.sh`
- `kubectl_run_sa.sh` - launch a quick pod with the given service account to test private repo pull & other permissions
- `kubectl_port_forward.sh` - launches `kubectl port-forward` to a given pod's port with an optional label or name filter. If more than one pod is found, prompts with an interactive dialogue to choose one. Optionally automatically opens the forwarded localhost URL in the default browser
- `kubectl_port_forward_spark.sh` - does the above for Spark UI
- `helm_template.sh` - templates a Helm chart for Kustomize deployments
- `kustomize_parse_helm_charts.sh` - parses the [Helm](https://helm.sh/) charts from one or more `kustomization.yaml` files into TSV format for further shell pipe processing
- `kustomize_install_helm_charts.sh` - installs the [Helm](https://helm.sh/) charts from one or more `kustomization.yaml` files the old fashioned Helm CLI way so that tools like [Nova](https://github.com/FairwindsOps/nova) can be used to detect outdated charts (used in [Kubernetes-configs](https://github.com/HariSekhon/Kubernetes-configs) repo's [CI](https://github.com/HariSekhon/Kubernetes-configs/actions/workflows/nova.yaml))
- `kustomize_update_helm_chart_versions.sh` - updates one or more `kustomization.yaml` files to the latest versions of any charts they contain
- `kustomize_materialize.sh` - recursively materializes all `kustomization.yaml` to `kustomization.materialized.yaml` in the same directories for scanning with tools like [Pluto](https://github.com/FairwindsOps/pluto) to detect deprecated API objects inherited from embedded Helm charts. Parallelized for performance
- ArgoCD:
- `argocd_auto_sync.sh` - toggles Auto-sync on/off for a given app to allow repairs and maintenance operations, and also disables / re-enables the App-of-Apps base apps to stop them re-enabling the app
- `argocd_apps_sync.sh` - syncs all [ArgoCD](https://argo-cd.readthedocs.io/en/stable/) apps matching an optional ERE regex filter on their names using the ArgoCD CLI
- `argocd_apps_wait_sync.sh` - syncs all [ArgoCD](https://argo-cd.readthedocs.io/en/stable/) apps matching an optional ERE regex filter on their names using the ArgoCD CLI while also checking their health and operation
- `argocd_generate_resource_whitelist.sh` - generates a yaml cluster and namespace resource whitelist for ArgoCD project config. If given an existing yaml, will merge in its original whitelists, dedupe, and write them back into the file using an in-place edit. Useful because ArgoCD 2.2+ doesn't show resources that aren't explicitly allowed, such as ReplicaSets and Pods
- Pluto:
- `pluto_detect_helm_materialize.sh` - recursively materializes all helm `Chart.yaml` and runs [Pluto](https://github.com/FairwindsOps/pluto) on each directory to work around [this issue](https://github.com/FairwindsOps/pluto/issues/444)
- `pluto_detect_kustomize_materialize.sh` - recursively materializes all `kustomization.yaml` and runs [Pluto](https://github.com/FairwindsOps/pluto) on each directory to work around [this issue](https://github.com/FairwindsOps/pluto/issues/444)
- `pluto_detect_kubectl_dump_objects.sh` - dumps all live Kubernetes objects to /tmp and runs [Pluto](https://github.com/FairwindsOps/pluto) on them to detect deprecated API objects on the cluster from any source
- Rancher:
- `rancher_api.sh` - queries the Rancher API with authentication
- `rancher_kube_creds.sh` - downloads all Rancher clusters credentials into subdirectories matching cluster names, with `.envrc` in each, so a quick `cd` into one and your kubectl is ready to rock
- see also Google Kubernetes Engine scripts in the [GCP - Google Cloud Platform](https://github.com/HariSekhon/DevOps-Bash-tools/#gcp---google-cloud-platform) section above
- see also the [Kubernetes configs](https://github.com/HariSekhon/Kubernetes-configs) repo
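
As a rough illustration of the pod-matching approach described for `kubectl_exec.sh` above, here is a minimal sketch (not the actual script): exec into the first pod whose name matches a regex, printing the generated command first:

```shell
# Minimal sketch of the idea behind kubectl_exec.sh (not the actual script):
# exec into the first pod in the current namespace whose name matches a regex
pod_regex="${1:-nginx}"   # example default regex - placeholder

pod="$(kubectl get pods -o name | grep -E "$pod_regex" | head -n 1)"

if [ -z "$pod" ]; then
    echo "No pod found matching regex: $pod_regex" >&2
    exit 1
fi

# show the generated command for clarity, then run it
echo "+ kubectl exec -ti $pod -- /bin/sh" >&2
kubectl exec -ti "$pod" -- /bin/sh
```
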
See also [Knowledge Base notes for Kubernetes](https://github.com/HariSekhon/Knowledge-Base/blob/main/kubernetes.md).
### Docker
`docker/` directory:
- `docker_*.sh` / `dockerhub_*.sh` - [Docker](https://www.docker.com/) / [DockerHub](https://hub.docker.com/) API scripts:
- `dockerhub_api.sh` - queries DockerHub API v2 with or without authentication (`$DOCKERHUB_USER` & `$DOCKERHUB_PASSWORD` / `$DOCKERHUB_TOKEN`)
- `docker_api.sh` - queries a Docker Registry with optional basic authentication if `$DOCKER_USER` & `$DOCKER_PASSWORD` are set
- `docker_build_hashref.sh` - runs `docker build` and auto-generates docker image name and tag from relative Git path and commit short SHA hashref and a dirty sha suffix if git contents are modified. Useful to compare docker image sizes between your clean and modified versions of `Dockerfile` or contents
- `docker_package_check.sh` - runs package installs on all the major versions of a given docker image to check given packages are available before adding them and breaking builds across distro versions
- `docker_registry_list_images.sh` - lists images in a given private Docker Registry
- `docker_registry_list_tags.sh` - lists tags for a given image in a private Docker Registry
- `docker_registry_get_image_manifest.sh` - gets a given image:tag manifest from a private Docker Registry
- `docker_registry_tag_image.sh` - tags a given image with a new tag in a private Docker Registry via the API without pulling and pushing the image data (much faster and more efficient)
- `dockerhub_list_tags.sh` - lists tags for a given DockerHub repo. See also [dockerhub_show_tags.py](https://github.com/HariSekhon/DevOps-Python-tools/blob/master/dockerhub_show_tags.py) in the [DevOps Python tools](https://github.com/HariSekhon/DevOps-Python-tools) repo.
- `dockerhub_list_tags_by_last_updated.sh` - lists tags for a given DockerHub repo sorted by last updated timestamp descending
- `dockerhub_search.sh` - searches with a configurable number of returned items (older docker cli was limited to 25 results)
- `clean_caches.sh` - cleans out OS package and programming language caches, call near end of `Dockerfile` to reduce Docker image size
- see also the [Dockerfiles](https://github.com/HariSekhon/Dockerfiles) repo
- `quay_api.sh` - queries the [Quay.io](https://quay.io/) API with OAuth2 authentication token `$QUAY_TOKEN`
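
For reference, listing the tags of an image in a private registry, as `docker_registry_list_tags.sh` above is described as doing, boils down to a single Docker Registry v2 API call. A rough sketch (not the script itself; the registry host, image name and use of `jq` are assumptions):

```shell
# Rough sketch of the Docker Registry v2 API call behind docker_registry_list_tags.sh
# (not the script itself) - registry host and image name are placeholders
registry="registry.example.com"
image="myteam/myapp"

curl_opts=(-sS)
if [ -n "${DOCKER_USER:-}" ]; then
    # optional basic auth, mirroring the $DOCKER_USER / $DOCKER_PASSWORD convention above
    curl_opts+=(-u "$DOCKER_USER:$DOCKER_PASSWORD")
fi

curl "${curl_opts[@]}" "https://$registry/v2/$image/tags/list" | jq -r '.tags[]'
```
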
See also [Knowledge Base notes for Docker](https://github.com/HariSekhon/Knowledge-Base/blob/main/docker.md).
### Data
`data/` directory:
- `avro_tools.sh` - runs Avro Tools jar, downloading it if not already present (determines latest version when
downloading)
- `parquet_tools.sh` - runs Parquet Tools jar, downloading it if not already present (determines latest version
when downloading)
- `csv_header_indices.sh` - list CSV headers with their zero indexed numbers, useful reference when coding against
column positions
- Data format validation `validate_*.py` from [DevOps Python Tools repo](https://github.com/HariSekhon/DevOps-Python-tools):
- CSV
- JSON
- [Avro](https://avro.apache.org/)
- [Parquet](https://parquet.apache.org/)
- INI / Properties files (Java)
- LDAP LDIF
- XML
- YAML
- `json2yaml.sh` - converts JSON to YAML
- `yaml2json.sh` - converts YAML to JSON - needed for some APIs like GitLab CI linting (see [Gitlab](https://github.com/HariSekhon/DevOps-Bash-tools#git---github-gitlab-bitbucket-azure-devops) section above)
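
For a rough idea of what JSON ⇄ YAML conversion along the lines of `json2yaml.sh` / `yaml2json.sh` involves (this is not the actual scripts, and it assumes Python 3 with PyYAML is installed):

```shell
# Rough equivalents of the json2yaml.sh / yaml2json.sh idea (not the actual scripts),
# assuming Python 3 with the PyYAML module is installed
json2yaml(){ python3 -c 'import sys, json, yaml; print(yaml.safe_dump(json.load(sys.stdin), default_flow_style=False), end="")'; }
yaml2json(){ python3 -c 'import sys, json, yaml; print(json.dumps(yaml.safe_load(sys.stdin), indent=2))'; }

echo '{"name": "test", "enabled": true}' | json2yaml
```
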
### Big Data & NoSQL
`bigdata/` and `kafka/` directories:
- `kafka_*.sh` - scripts to make [Kafka](http://kafka.apache.org/) CLI usage easier including auto-setting Kerberos to source TGT from environment and auto-populating broker and zookeeper addresses. These are auto-added to the `$PATH` when `.bashrc` is sourced. For something similar for [Solr](https://lucene.apache.org/solr/), see `solr_cli.pl` in the [DevOps Perl Tools](https://github.com/HariSekhon/DevOps-Perl-tools) repo.
- `zookeeper*.sh` - [Apache ZooKeeper](https://zookeeper.apache.org/) scripts:
- `zookeeper_client.sh` - shortens `zookeeper-client` command by auto-populating the zookeeper quorum from the environment variable `$ZOOKEEPERS` or else parsing the zookeeper quorum from `/etc/**/*-site.xml` to make it faster and easier to connect
- `zookeeper_shell.sh` - shortens Kafka's `zookeeper-shell` command by auto-populating the zookeeper quorum from the environment variable `$KAFKA_ZOOKEEPERS` and optionally `$KAFKA_ZOOKEEPER_ROOT` to make it faster and easier to connect
- `hive_*.sh` / `beeline*.sh` - [Apache Hive](https://hive.apache.org/) scripts:
- `beeline.sh` - shortens `beeline` command to connect to [HiveServer2](https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Overview) by auto-populating Kerberos and SSL settings, zookeepers for HiveServer2 HA discovery if the environment variable `$HIVE_HA` is set or using the `$HIVESERVER_HOST` environment variable so you can connect with no arguments (prompts for HiveServer2 address if you haven't set `$HIVESERVER_HOST` or `$HIVE_HA`)
- `beeline_zk.sh` - same as above for [HiveServer2](https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Overview) HA by auto-populating SSL and ZooKeeper service discovery settings (specify `$HIVE_ZOOKEEPERS` environment variable to override). Automatically called by `beeline.sh` if either `$HIVE_ZOOKEEPERS` or `$HIVE_HA` is set (the latter parses `hive-site.xml` for the ZooKeeper addresses)
- `hive_foreach_table.sh` - executes a SQL query against every table, replacing `{db}` and `{table}` in each iteration eg. `select count(*) from {table}` (see the sketch at the end of this section)
- `hive_list_databases.sh` - list Hive databases, one per line, suitable for scripting pipelines
- `hive_list_tables.sh` - list Hive tables, one per line, suitable for scripting pipelines
- `hive_tables_metadata.sh` - lists a given DDL metadata field for each Hive table (to compare tables)
- `hive_tables_location.sh` - lists the data location per Hive table (eg. compare external table locations)
- `hive_tables_row_counts.sh` - lists the row count per Hive table
- `hive_tables_column_counts.sh` - lists the column count per Hive table
- `impala*.sh` - [Apache Impala](https://impala.apache.org/) scripts:
- `impala_shell.sh` - shortens `impala-shell` command to connect to [Impala](https://impala.apache.org/) by parsing the Hadoop topology map and selecting a random datanode to connect to its Impalad, acting as a cheap CLI load balancer. For a real load balancer see [HAProxy config for Impala](https://github.com/HariSekhon/HAProxy-configs) (and many other Big Data & NoSQL technologies). Optional environment variables `$IMPALA_HOST` (eg. point to an explicit node or an HAProxy load balancer) and `IMPALA_SSL=1` (or use regular impala-shell `--ssl` argument pass through)
- `impala_foreach_table.sh` - executes a SQL query against every table, replacing `{db}` and `{table}` in each iteration eg. `select count(*) from {table}`
- `impala_list_databases.sh` - list Impala databases, one per line, suitable for scripting pipelines
- `impala_list_tables.sh` - list Impala tables, one per line, suitable for scripting pipelines
- `impala_tables_metadata.sh` - lists a given DDL metadata field for each Impala table (to compare tables)
- `impala_tables_location.sh` - lists the data location per Impala table (eg. compare external table locations)
- `impala_tables_row_counts.sh` - lists the row count per Impala table
- `impala_tables_column_counts.sh` - lists the column count per Impala table
- `hdfs_*.sh` - Hadoop [HDFS](https://en.wikipedia.org/wiki/Apache_Hadoop#Hadoop_distributed_file_system) scripts:
- `hdfs_checksum*.sh` - walks an HDFS directory tree and outputs HDFS native checksums (faster) or portable externally comparable CRC32, in serial or in parallel to save time
- `hdfs_find_replication_factor_1.sh` / `hdfs_set_replication_factor_3.sh` - finds HDFS files with replication factor 1 / sets HDFS files with replication factor <=2 to replication factor 3 to repair replication safety and avoid no replica alarms during maintenance operations (see also Python API version in the [DevOps Python Tools](https://github.com/HariSekhon/DevOps-Python-tools) repo)
- `hdfs_file_size.sh` / `hdfs_file_size_including_replicas.sh` - quickly differentiate HDFS files raw size vs total replicated size
- `hadoop_random_node.sh` - picks a random Hadoop cluster worker node, like a cheap CLI load balancer, useful in scripts when you want to connect to any worker etc. See also the real [HAProxy Load Balancer configurations](https://github.com/HariSekhon/HAProxy-configs) which focus on master nodes
- `cloudera_*.sh` - [Cloudera](https://www.cloudera.com/) scripts:
- `cloudera_manager_api.sh` - script to simplify querying [Cloudera Manager](https://www.cloudera.com/products/product-components/cloudera-manager.html) API using environment variables, prompts, authentication and sensible defaults. Built on top of `curl_auth.sh`
- `cloudera_manager_impala_queries*.sh` - queries [Cloudera Manager](https://www.cloudera.com/products/product-components/cloudera-manager.html) for recent [Impala](https://impala.apache.org/) queries, failed queries, exceptions, DDL statements, metadata stale errors, metadata refresh calls etc. Built on top of `cloudera_manager_api.sh`
- `cloudera_manager_yarn_apps.sh` - queries [Cloudera Manager](https://www.cloudera.com/products/product-components/cloudera-manager.html) for recent [Yarn](https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/YARN.html) apps. Built on top of `cloudera_manager_api.sh`
- `cloudera_navigator_api.sh` - script to simplify querying [Cloudera Navigator](https://www.cloudera.com/products/product-components/cloudera-navigator.html) API using environment variables, prompts, authentication and sensible defaults. Built on top of `curl_auth.sh`
- `cloudera_navigator_audit_logs.sh` - fetches [Cloudera Navigator](https://www.cloudera.com/products/product-components/cloudera-navigator.html) audit logs for given service eg. hive/impala/hdfs via the API, simplifying date handling, authentication and common settings. Built on top of `cloudera_navigator_api.sh`
- `cloudera_navigator_audit_logs_download.sh` - downloads [Cloudera Navigator](https://www.cloudera.com/products/product-components/cloudera-navigator.html) audit logs for each service by year. Skips existing logs, deletes partially downloaded logs on failure, generally retry safe (while true, Control-C, not `kill -9` obviously). Built on top of `cloudera_navigator_audit_logs.sh`
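
To make the `{db}` / `{table}` templating described for `hive_foreach_table.sh` above concrete, here is a simplified sketch of the idea (not the actual script; the connection URL, `beeline` options and query template are placeholder assumptions):

```shell
# Simplified sketch of the {db}/{table} templating idea behind hive_foreach_table.sh
# (not the actual script) - connection URL and query template are placeholder assumptions
beeline_opts=(-u "jdbc:hive2://${HIVESERVER_HOST:-localhost}:10000/default"
              --silent=true --showHeader=false --outputformat=tsv2)
query_template='SELECT COUNT(*) FROM {db}.{table}'

beeline "${beeline_opts[@]}" -e 'SHOW DATABASES' |
while read -r db; do
    beeline "${beeline_opts[@]}" -e "SHOW TABLES IN $db" |
    while read -r table; do
        query="${query_template//\{db\}/$db}"
        query="${query//\{table\}/$table}"
        echo "# $db.$table" >&2
        beeline "${beeline_opts[@]}" -e "$query"
    done
done
```
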
See also [Knowledge Base notes for Hadoop](https://github.com/HariSekhon/Knowledge-Base/blob/main/hadoop.md).
### Git - GitHub, GitLab, Bitbucket, Azure DevOps
`git/`, `github/`, `gitlab/`, `bitbucket/` and `azure_devops/` directories:
- `git/*.sh` - [Git](https://git-scm.com/) scripts:
- `precommit_run_changed_files.sh` - runs pre-commit on all files changed on the current branch vs the default branch. Useful to reproduce `pre-commit` checks that are failing in pull requests to get your PRs to pass
- `git_diff_commit.sh` - runs git diff and commit with a generic `"updated $filename"` commit message for each file in the files or directories given if they have changed, or all committed files under `$PWD` if no args are given. Super convenient for fast commits on the command line, and in vim and IDEs via hotkeys
- `git_review_push.sh` - shows diff of what would be pushed upstream and prompts to push. Convenient for fast reviewed pushes via vim or IDEs hotkeys
- `git_branch_delete_squash_merged.sh` - carefully checks that a squash-merged branch you want to delete has no changes compared to the default trunk branch before deleting it.
See [Squash Merges](https://github.com/HariSekhon/Knowledge-Base/blob/main/git.md#squash-merges-require-force-deleting-branches) in knowledge-base about why this is necessary.
- `git_foreach_branch.sh` - executes a command on all branches (useful in heavily version branched repos like in my [Dockerfiles](https://github.com/HariSekhon/Dockerfiles) repo)
- `git_foreach_repo.sh` - executes a command against all adjacent repos from a given repolist (used heavily by many adjacent scripts)
- `git_foreach_modified.sh` - executes a command against each file with git modified status
- `git_merge_all.sh` / `git_merge_master.sh` / `git_merge_master_pull.sh` - merges updates from master branch to all other branches to avoid drift on longer lived feature branches / version branches (eg. [Dockerfiles](https://github.com/HariSekhon/Dockerfiles) repo)
- `git_remotes_add_origin_providers.sh` - auto-creates remotes for the 4 major public repositories ([GitHub](https://github.com/)/[GitLab](https://gitlab.com/)/[Bitbucket](https://bitbucket.org)/[Azure DevOps](https://dev.azure.com/)), useful for `git pull --all` to fetch and merge updates from all providers in one command
- `git_remotes_set_multi_origin.sh` - sets up multi-remote origin for unified push to automatically keep the 4 major public repositories in sync (especially useful for [Bitbucket](https://bitbucket.org) and [Azure DevOps](https://dev.azure.com/) which don't have [GitLab](https://gitlab.com/)'s auto-mirroring from [GitHub](https://github.com/) feature)
- `git_remotes_set_https_to_ssh.sh` - converts local repo's remote URLs from https to ssh (more convenient with SSH keys instead of https auth tokens, especially since Azure DevOps expires personal access tokens every year)
- `git_remotes_set_ssh_to_https.sh` - converts local repo's remote URLs from ssh to https (to get through corporate firewalls or hotels if you travel a lot)
- `git_remotes_set_https_creds_helpers.sh` - adds Git credential helpers configuration to the local git repo to use http API tokens dynamically from environment variables if they're set
- `git_repos_pull.sh` - pull multiple repos based on a source file mapping list - useful for easily sync'ing lots of Git repos among computers
- `git_repos_update.sh` - same as above but also runs the `make update` build to install the latest dependencies, leverages the above script
- `git_grep_env_vars.sh` - find environment variables in the current git repo's code base in the format `SOME_VAR` (useful to find undocumented environment variables in internal or open source projects such as ArgoCD eg. [argoproj/argo-cd #8680](https://github.com/argoproj/argo-cd/pull/8680))
- `git_log_empty_commits.sh` - find empty commits in git history (eg. if a `git filter-branch` was run but `--prune-empty` was forgotten, leaking metadata like subjects containing file names or other sensitive info)
- `git_files_in_history.sh` - finds all filename / file paths in the git log history, useful for prepping for `git filter-branch`
- `git_filter_branch_fix_author.sh` - rewrites Git history to replace author/committer name & email references (useful to replace default account commits). Powerful, read `--help` and `man git-filter-branch` carefully. Should only be used by Git Experts
- `git_filter_repo_replace_text.sh` - rewrites Git history to replace a given text to scrub a credential or other sensitive token from history. Refuses to operate on tokens less than 8 chars for safety
- `git_tag_release.sh` - creates a Git tag, auto-incrementing a `.N` suffix on the year/month/day date format if no exact version given
- `git_submodules_update_repos.sh` - updates submodules (pulls and commits latest upstream github repo submodules) - used to cascade submodule updates throughout all my repos
- `git_askpass.sh` - credential helper script to use environment variables for git authentication
- `markdown_generate_index.sh` - generates a markdown index list from the headings in a given markdown file such as README.md
- `markdown_replace_index.sh` - replaces a markdown index section in a given markdown file using `markdown_generate_index.sh`
- `github/*.sh` - [GitHub](https://github.com/) API / CLI scripts:
- `github_api.sh` - queries the GitHub [API](https://docs.github.com/en/rest/reference). Can infer GitHub user, repo and authentication token from local checkout or environment (`$GITHUB_USER`, `$GITHUB_TOKEN`)
- `github_install_binary.sh` - installs a binary from GitHub releases into $HOME/bin or /usr/local/bin. Auto-determines the latest release if no version specified, detects and unpacks any tarball or zip files
- `github_foreach_repo.sh` - executes a templated command for each non-fork GitHub repo, replacing the `{owner}`/`{name}` or `{repo}` placeholders in each iteration
- `github_clone_or_pull_all_repos.sh` - git clones or pulls all repos for a user or organization into directories of the same name under the current directory
- `github_download_release_file.sh` - downloads a file from GitHub Releases, optionally determining the latest version, uses `bin/download_url_file.sh`
- `github_download_release_jar.sh` - downloads a JAR file from GitHub Releases (used by `install/download_*_jar.sh` for things like [JDBC](https://github.com/HariSekhon/Knowledge-Base/blob/main/jdbc.md) drivers or [Java](#java) [decompilers](https://github.com/HariSekhon/Knowledge-Base/blob/main/java.md#java-decompilers)), optionally determines latest version to download, and finally validates the downloaded file's format
- `github_invitations.sh` - lists / accepts repo invitations. Useful to accept a large number of invites to repos generated by automation
- `github_mirror_repos_to_gitlab.sh` - creates/syncs GitHub repos to GitLab for migrations or to cron fast free Disaster Recovery, including all branches and tags, plus the repo descriptions. Note this doesn't include PRs/wikis/releases
- `github_mirror_repos_to_bitbucket.sh` - creates/syncs GitHub repos to BitBucket for migrations or to cron fast free Disaster Recovery, including all branches and tags, plus the repo descriptions. Note this doesn't include PRs/wikis/releases
- `github_mirror_repos_to_aws_codecommit.sh` - creates/syncs GitHub repos to AWS CodeCommit for migrations or to cron fast almost free Disaster Recovery (close to $0 compared to $100-400+ per month for [Rewind BackHub](https://rewind.com/products/backups/github/)), including all branches and tags, plus the repo descriptions. Note this doesn't include PRs/wikis/releases
- `github_mirror_repos_to_gcp_source_repos.sh` - creates/syncs GitHub repos to GCP Source Repos for migrations or to cron fast almost free Disaster Recovery (close to $0 compared to $100-400+ per month for [Rewind BackHub](https://rewind.com/products/backups/github/)), including all branches and tags. Note this doesn't include repo description/PRs/wikis/releases
- `github_pull_request_create.sh` - creates a Pull Request idempotently by first checking for an existing PR between the branches, and also checking if there are the necessary commits between the branches, to avoid common errors from blindly raising PRs. Useful to automate code promotion across environment branches. Also works across repo forks and is used by `github_repo_fork_update.sh`. Even populates the GitHub pull request template and does Jira ticket number replacement from the branch prefix
- `github_pull_request_preview.sh` - opens a GitHub Pull Request preview page from the current local branch to the given or default branch
- `github_push_pr_preview.sh` - pushes to GitHub origin, sets upstream branch, then opens a Pull Request preview from current branch to the given or default trunk branch in your browser
- `github_push_pr.sh` - pushes to GitHub origin, sets upstream branch, then idempotently creates a Pull Request from current branch to the given or default trunk branch and opens the generated PR in your browser for review
- `github_merge_branch.sh` - merges one branch into another branch via a Pull Request for full audit tracking all changes. Useful to automate feature PRs, code promotion across environment branches, or backport hotfixes from Production or Staging to trunk branches such as master, main, dev or develop
- `github_remote_set_upstream.sh` - in a forked GitHub repo's checkout, determine the origin of the fork using GitHub CLI and configure a git remote to the upstream. Useful to be able to easily pull updates from the original source repo
- `github_pull_merge_trunk.sh` - pulls the origin or fork upstream repo's trunk branch and merges it into the local branch. In a forked GitHub repo's checkout, it determines the origin of the fork using GitHub CLI, configures a git remote to the upstream, pulls the default branch and, if on a branch other than the default, merges the default branch into the local current branch. Simplifies and automates keeping your checkout or forked repo up to date with the original source repo to quickly resolve merge conflicts locally and submit updated Pull Requests
- `github_forked_add_remote.sh` - quickly adds a forked repo as a remote from an interactive menu list of forked repos
- `github_forked_checkout_branch.sh` - quickly checks out a forked repo's branch from an interactive menu list of forked repos and their branches
- `github_actions_foreach_workflow.sh` - executes a templated command for each workflow in a given GitHub repo, replacing `{name}`, `{id}` and `{state}` in each iteration
- `github_actions_aws_create_load_credential.sh` - creates an AWS user with group/policy, generates and downloads access keys, and uploads them to the given repo
- `github_actions_in_use.sh` - lists GitHub Actions directly referenced in the .github/workflows in the current local repo checkout
- `github_actions_in_use_repo.sh` - lists GitHub Actions for a given repo via the API, including following imported reusable workflows
- `github_actions_in_use_across_repos.sh` - lists GitHub Actions in use across all your repos
- `github_actions_repos_lockdown.sh` - secures GitHub Actions settings across all user repos to only GitHub, verified partners and selected 3rd party actions
- `github_actions_repo_set_secret.sh` - sets a secret in the given repo from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `github_actions_repo_env_set_secret.sh` - sets a secret in the given repo and environment from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `github_actions_repo_secrets_overriding_org.sh` - finds any secrets for a repo that are overriding organization level secrets. Useful to combine with `github_foreach_repo.sh` for auditing
- `github_actions_repo_restrict_actions.sh` - restricts GitHub Actions in the given repo to only running actions from GitHub and verified partner companies (eg. AWS, Docker)
- `github_actions_repo_actions_allow.sh` - allows select 3rd party GitHub Actions in the given repo
- `github_actions_runner.sh` - generates a [GitHub Actions](https://github.com/features/actions) self-hosted runner token for a given Repo or Organization via the GitHub API and then runs a dockerized GitHub Actions runner with the appropriate configuration
- `github_actions_runner_local.sh` - downloads, configures and runs a local GitHub Actions Runner for Linux or Mac
- `github_actions_runner_token.sh` - generates a GitHub Actions runner token to register a new self-hosted runner
- `github_actions_runners.sh` - lists GitHub Actions self-hosted runners for a given Repo or Organization
- `github_actions_delete_offline_runners.sh` - deletes offline GitHub Actions self-hosted runners. Useful to clean up short-lived runners eg. Docker, Kubernetes
- `github_actions_workflows.sh` - lists GitHub Actions workflows for a given repo (or auto-infers local repository)
- `github_actions_workflow_runs.sh` - lists GitHub Actions workflow runs for a given workflow id or name
- `github_actions_workflows_status.sh` - lists all GitHub Actions workflows and their statuses for a given repo
- `github_actions_workflows_state.sh` - lists GitHub Actions workflows enabled/disabled states (GitHub now disables workflows after 6 months without a commit)
- `github_actions_workflows_disabled.sh` - lists GitHub Actions workflows that are disabled. Combine with `github_foreach_repo.sh` to scan all repos to find disabled workflows
- `github_actions_workflow_enable.sh` - enables a given GitHub Actions workflow
- `github_actions_workflows_enable_all.sh` - enables all GitHub Actions workflows in a given repo. Useful to undo GitHub disabling all workflows in a repo after 6 months without a commit
- `github_actions_workflows_trigger_all.sh` - triggers all workflows for the given repo
- `github_actions_workflows_cancel_all_runs.sh` - cancels all workflow runs for the given repo
- `github_actions_workflows_cancel_waiting_runs.sh` - cancels workflow runs that are in waiting state, eg. waiting for old deployment approvals
- `github_ssh_get_user_public_keys.sh` - fetches a given GitHub user's public SSH keys via the API for piping to `~/.ssh/authorized_keys` or adjacent tools
- `github_ssh_get_public_keys.sh` - fetches the currently authenticated GitHub user's public SSH keys via the API, similar to above but authenticated to get identifying key comments
- `github_ssh_add_public_keys.sh` - uploads SSH keys from local files or standard input to the currently authenticated GitHub account. Specify pubkey files (default: `~/.ssh/id_rsa.pub`) or read from standard input for piping from adjacent tools
- `github_ssh_delete_public_keys.sh` - deletes given SSH keys from the currently authenticated GitHub account by key id or title regex match
- `github_gpg_get_user_public_keys.sh` - fetches a given GitHub user's public GPG keys via the API
- `github_generate_status_page.sh` - generates a [STATUS.md](https://harisekhon.github.io/CI-CD/) page by merging all the README.md headers for all of a user's non-forked GitHub repos or a given list of any repos etc.
- `github_purge_camo_cache.sh` - sends HTTP Purge requests to all camo URLs (badge caches) for the current or given GitHub repo's landing/README.md page
- `github_ip_ranges.sh` - returns GitHub's IP ranges, either all by default or for a select given service such as hooks or actions
- `github_sync_repo_descriptions.sh` - syncs GitHub repo descriptions to GitLab & BitBucket repos
- `github_release.sh` - creates a GitHub Release, auto-incrementing a `.N` suffix on the year/month/day date format if no exact version given
- `github_repo_description.sh` - fetches the given repo's description (used by `github_sync_repo_descriptions.sh`)
- `github_repo_find_files.sh` - finds files matching a regex in the current or given GitHub repo via the GitHub API
- `github_repo_latest_release.sh` - returns the latest release tag for a given GitHub repo via the GitHub API
- `github_repo_latest_release_filter.sh` - returns the latest release tag matching a given regex filter for a given GitHub repo via the GitHub API. Useful for getting the latest version of things like Kustomize which has other releases for kyaml
- `github_repo_stars.sh` - fetches the stars, forks and watcher counts for a given repo
- `github_repo_teams.sh` - fetches the GitHub Enterprise teams and their role permissions for a given repo. Combine with `github_foreach_repo.sh` to audit all your personal or GitHub organization's repos
- `github_repo_collaborators.sh` - fetches a repo's granted users and outside invited collaborators as well as their role permissions for a given repo. Combine with `github_foreach_repo.sh` to audit all your personal or GitHub organization's repos (see the audit sketch after this list)
- `github_repo_protect_branches.sh` - enables branch protections on the given repo. Can specify one or more branches to protect, otherwise finds and applies to any of `master`, `main`, `develop`, `dev`, `staging`, `production`
- `github_repos_find_files.sh` - finds files matching a regex across all repos in the current GitHub organization or user account
- `github_repo_fork_sync.sh` - syncs the current or given fork, then runs `github_repo_fork_update.sh` to cascade changes to major branches via Pull Requests for auditability
- `github_repo_fork_update.sh` - updates a forked repo by creating pull requests for full audit tracking and auto-merges PRs for non-production branches
- `github_repos_public.sh` - lists public repos for a user or organization. Useful to periodically scan and account for any public repos
- `github_repos_disable_wiki.sh` - disables the Wiki on one or more given repos to prevent documentation fragmentation and make people use the centralized documentation tool eg. Confluence or Slite
- `github_repos_with_few_users.sh` - finds repos with few or no users (default: 1), which in Enterprises is a sign that a user has created a repo without assigning team privileges
- `github_repos_with_few_teams.sh` - finds repos with few or no teams (default: 0), which in Enterprises is a sign that a user has created a repo without assigning team privileges
- `github_repos_without_branch_protections.sh` - finds repos without any branch protection rules (use `github_repo_protect_branches.sh` on such repos)
- `github_repos_not_in_terraform.sh` - finds all non-fork repos for current or given user/organization which are not found in `$PWD/*.tf` Terraform code
- `github_teams_not_in_terraform.sh` - finds all teams for given organization which are not found in `$PWD/*.tf` Terraform code
- `github_repos_sync_status.sh` - determines whether each GitHub repo's mirrors on GitLab / BitBucket / Azure DevOps are up to date with the latest commits, by querying all 3 APIs and comparing master branch hashrefs
- `github_teams_not_idp_synced.sh` - finds GitHub teams that aren't sync'd from an IdP like Azure AD. These should usually be migrated or removed
- `github_user_repos_stars.sh` - fetches the total number of stars for all original source public repos for a given user
- `github_user_repos_forks.sh` - fetches the total number of forks for all original source public repos for a given user
- `github_user_repos_count.sh` - fetches the total number of original source public repos for a given username
- `github_user_followers.sh` - fetches the number of followers for a given username
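Several of the GitHub scripts above are designed to be chained with `github_foreach_repo.sh` for organization-wide audits. A minimal sketch, assuming a `{repo}` placeholder like the one documented for the adjacent `*_foreach_repo.sh` scripts (check each script's `--help` for the exact placeholders and arguments):

```shell
# audit who has access to every repo (output filename is illustrative)
github_foreach_repo.sh 'github_repo_collaborators.sh {repo}' | tee github_collaborators_audit.txt

# find repos whose GitHub Actions workflows have been auto-disabled
github_foreach_repo.sh 'github_actions_workflows_disabled.sh {repo}'
```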
- `gitlab/*.sh` - [GitLab](https://gitlab.com/) API scripts:
- `gitlab_api.sh` - queries the GitLab [API](https://docs.gitlab.com/ee/api/api_resources.html). Can infer GitLab user, repo and authentication token from local checkout or environment (`$GITLAB_USER`, `$GITLAB_TOKEN`)
- `gitlab_install_binary.sh` - installs a binary from GitLab releases into $HOME/bin or /usr/local/bin. Auto-determines the latest release if no version specified, detects and unpacks any tarball or zip files
- `gitlab_push_mr_preview.sh` - pushes to GitLab origin, sets upstream branch, then opens a Merge Request preview from the current branch to the default branch
- `gitlab_push_mr.sh` - pushes to GitLab origin, sets upstream branch, then idempotently creates a Merge Request from the current branch to the given or default trunk branch and opens the generated MR in your browser for review
- `gitlab_foreach_repo.sh` - executes a templated command for each GitLab project/repo, replacing the `{user}` and `{project}` in each iteration
- `gitlab_project_latest_release.sh` - returns the latest release tag for a given GitLab project (repo) via the GitLab API
- `gitlab_project_set_description.sh` - sets the description for one or more projects using the GitLab API
- `gitlab_project_set_env_vars.sh` - adds / updates GitLab project-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `gitlab_group_set_env_vars.sh` - adds / updates GitLab group-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `gitlab_project_create_import.sh` - creates a GitLab repo as an import from a given URL, and mirrors if on GitLab Premium (can only manually configure for public repos on free tier, API doesn't support configuring even public repos on free)
- `gitlab_project_protect_branches.sh` - enables branch protections on the given project. Can specify one or more branches to protect, otherwise finds and applies to any of `master`, `main`, `develop`, `dev`, `staging`, `production`
- `gitlab_project_mirrors.sh` - lists each GitLab repo and whether it is a mirror or not
- `gitlab_pull_mirror.sh` - triggers GitLab pull mirroring for a given project's repo, or auto-infers the project name from the local git repo
- `gitlab_ssh_get_user_public_keys.sh` - fetches a given GitLab user's public SSH keys via the API, with identifying comments, for piping to `~/.ssh/authorized_keys` or adjacent tools
- `gitlab_ssh_get_public_keys.sh` - fetches the currently authenticated GitLab user's public SSH keys via the API
- `gitlab_ssh_add_public_keys.sh` - uploads SSH keys from local files or standard input to the currently authenticated GitLab account. Specify pubkey files (default: `~/.ssh/id_rsa.pub`) or read from standard input for piping from adjacent tools
- `gitlab_ssh_delete_public_keys.sh` - deletes given SSH keys from the currently authenticated GitLab account by key id or title regex match
- `gitlab_validate_ci_yaml.sh` - validates a `.gitlab-ci.yml` file via the GitLab API
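The `*_set_env_vars.sh` scripts above (and their BitBucket / CircleCI / Terraform Cloud equivalents below) all accept the same `key=value` or shell `export` input format, as arguments or on stdin. A hedged illustration - variable values are placeholders, and the project/group selection arguments are omitted (check each script's `--help`):

```shell
# arguments in key=value form
gitlab_project_set_env_vars.sh MY_TOKEN=xyz123 DEPLOY_ENV=staging

# or piped on stdin in shell export format, eg. from aws_csv_creds.sh
# (the CSV filename argument here is illustrative)
aws_csv_creds.sh my_aws_creds.csv | gitlab_project_set_env_vars.sh
```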
- `bitbucket/*.sh` - [BitBucket](https://bitbucket.org/) API scripts:
- `bitbucket_api.sh` - queries the BitBucket [API](https://developer.atlassian.com/bitbucket/api/2/reference/resource/). Can infer BitBucket user, repo and authentication token from local checkout or environment (`$BITBUCKET_USER`, `$BITBUCKET_TOKEN`)
- `bitbucket_foreach_repo.sh` - executes a templated command for each BitBucket repo, replacing the `{user}` and `{repo}` in each iteration
- `bitbucket_workspace_set_env_vars.sh` - adds / updates Bitbucket workspace-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `bitbucket_repo_set_env_vars.sh` - adds / updates Bitbucket repo-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `bitbucket_repo_set_description.sh` - sets the description for one or more repos using the BitBucket API
- `bitbucket_enable_pipelines.sh` - enables the CI/CD pipelines for all repos
- `bitbucket_disable_pipelines.sh` - disables the CI/CD pipelines for all repos
- `bitbucket_repo_enable_pipeline.sh` - enables the CI/CD pipeline for a given repo
- `bitbucket_repo_disable_pipeline.sh` - disables the CI/CD pipeline for a given repo
- `bitbucket_ssh_get_public_keys.sh` - fetches the currently authenticated BitBucket user's public SSH keys via the API for piping to `~/.ssh/authorized_keys` or adjacent tools
- `bitbucket_ssh_add_public_keys.sh` - uploads SSH keys from local files or standard input to the currently authenticated BitBucket account. Specify pubkey files (default: `~/.ssh/id_rsa.pub`) or read from standard input for piping from adjacent tools
- `bitbucket_ssh_delete_public_keys.sh` - deletes given SSH keys from the currently authenticated BitBucket account by key id or title regex match
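The `*_ssh_get_*_public_keys.sh` scripts across GitHub / GitLab / BitBucket above all emit standard public key lines, so they can be piped straight into an `authorized_keys` file or into the corresponding `*_ssh_add_public_keys.sh` script on another service. A hedged sketch (the username is illustrative, and whether stdin needs an explicit argument may vary - check `--help`):

```shell
# grant a GitHub user's keys SSH access to this machine
github_ssh_get_user_public_keys.sh harisekhon >> ~/.ssh/authorized_keys

# copy your GitHub keys to your GitLab account
github_ssh_get_public_keys.sh | gitlab_ssh_add_public_keys.sh
```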
See also [Knowledge Base notes for Git](https://github.com/HariSekhon/Knowledge-Base/blob/main/git.md).
### CI/CD - Continuous Integration / Continuous Deployment
`jenkins/`, `terraform/`, `teamcity/`, `buildkite/`, `circleci/`, `travis/`, `azure_devops/`, ..., `cicd/` directories:
- `appveyor_api.sh` - queries [AppVeyor](https://www.appveyor.com/)'s API with authentication
- `azure_devops/*.sh` - [Azure DevOps](https://dev.azure.com/) scripts:
- `azure_devops_api.sh` - queries Azure DevOps's API with authentication
- `azure_devops_foreach_repo.sh` - executes a templated command for each Azure DevOps repo, replacing `{user}`, `{org}`, `{project}` and `{repo}` in each iteration
- `azure_devops_to_github_migration.sh` - migrates one or all Azure DevOps git repos to GitHub, including all branches and sets the default branch to match via the APIs to maintain the same checkout behaviour
- `azure_devops_disable_repos.sh` - disables one or more given Azure DevOps repos (to prevent further pushes to them after migration to GitHub)
- `circleci/*.sh` - [CircleCI](https://circleci.com/) scripts:
- `circleci_api.sh` - queries CircleCI's API with authentication
- `circleci_project_set_env_vars.sh` - adds / updates CircleCI project-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `circleci_context_set_env_vars.sh` - adds / updates CircleCI context-level environment variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `circleci_project_delete_env_vars.sh` - deletes CircleCI project-level environment variable(s) via the API
- `circleci_context_delete_env_vars.sh` - deletes CircleCI context-level environment variable(s) via the API
- `circleci_local_execute.sh` - installs CircleCI CLI and executes `.circleci/config.yml` locally
- `circleci_public_ips.sh` - lists [CircleCI](https://circleci.com) public IP addresses via dnsjson.com
- `codeship_api.sh` - queries [CodeShip](https://codeship.com/)'s API with authentication
- `drone_api.sh` - queries [Drone.io](https://drone.io/)'s API with authentication
- `shippable_api.sh` - queries [Shippable](https://www.shippable.com/)'s API with authentication
- `wercker_app_api.sh` - queries [Wercker](https://app.wercker.com/)'s Applications API with authentication
- `gocd_api.sh` - queries [GoCD](https://www.gocd.org/)'s API
- `gocd.sh` - one-touch [GoCD CI](https://www.gocd.org/):
- launches in Docker
- (re)creates config repo (`$PWD/setup/gocd_config_repo.json`) from which to source pipeline(s) (`.gocd.yml`)
- detects and enables agent(s) to start building
- call from any repo top level directory with a `.gocd.yml` config (all mine have it), mimicking structure of fully managed CI systems
- `concourse.sh` - one-touch [Concourse CI](https://concourse-ci.org/):
- launches in Docker
- configures pipeline from `$PWD/.concourse.yml`
- triggers build
- tails results in terminal
- prints recent build statuses at end
- call from any repo top level directory with a `.concourse.yml` config (all mine have it), mimicking structure of fully managed CI systems
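As with `gocd.sh` above, the intended usage is a one-liner from a repo checkout. A minimal sketch, assuming the repo already contains a `.concourse.yml`:

```shell
cd some-repo-with-a-concourse-yml   # any repo top-level directory containing a .concourse.yml
concourse.sh                        # boots Concourse in Docker, loads the pipeline,
                                    # triggers a build and tails the results in the terminal
```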
- `fly.sh` - shortens [Concourse](https://concourse-ci.org/) `fly` command to not have to specify target all the time
- `jenkins/*.sh` - [Jenkins CI](https://jenkins.io/) scripts:
- `jenkins.sh` - one-touch [Jenkins CI](https://jenkins.io/):
- launches Docker container
- installs plugins
- validates `Jenkinsfile`
- configures job from `$PWD/setup/jenkins-job.xml`
- sets Pipeline to git remote origin's `Jenkinsfile`
- triggers build
- tails results in terminal
- call from any repo top level directory with a `Jenkinsfile` pipeline and `setup/jenkins-job.xml` (all mine have it)
- `jenkins_api.sh` - queries the Jenkins Rest API, handles authentication, pre-fetches the CSRF protection token crumb, supports many environment variables such as `$JENKINS_URL` for ease of use (see the usage sketch at the end of this Jenkins list)
- `jenkins_jobs.sh` - lists Jenkins jobs (pipelines)
- `jenkins_foreach_job.sh` - runs a templated command for each Jenkins job
- `jenkins_jobs_download_configs.sh` - downloads all Jenkins job configs to xml files of the same name
- `jenkins_job_config.sh` - gets or sets a Jenkins job's config
- `jenkins_job_description.sh` - gets or sets a Jenkins job's description
- `jenkins_job_enable.sh` - enables a Jenkins job by name
- `jenkins_job_disable.sh` - disables a Jenkins job by name
- `jenkins_job_trigger.sh` - triggers a Jenkins job by name
- `jenkins_job_trigger_with_params.sh` - triggers a Jenkins job with parameters which can be passed as `--data KEY=VALUE`
- `jenkins_jobs_enable.sh` - enables all Jenkins jobs/pipelines with names matching a given regex
- `jenkins_jobs_disable.sh` - disables all Jenkins jobs/pipelines with names matching a given regex
- `jenkins_builds.sh` - lists Jenkins latest builds for every job
- `jenkins_cred_add_cert.sh` - creates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_add_kubernetes_sa.sh` - creates a Jenkins Kubernetes service account credential
- `jenkins_cred_add_secret_file.sh` - creates a Jenkins secret file credential from a file
- `jenkins_cred_add_secret_text.sh` - creates a Jenkins secret string credential from a string or a file
- `jenkins_cred_add_ssh_key.sh` - creates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_add_user_pass.sh` - creates a Jenkins username/password credential
- `jenkins_cred_delete.sh` - deletes a given Jenkins credential by id
- `jenkins_cred_list.sh` - lists Jenkins credentials IDs and Names
- `jenkins_cred_update_cert.sh` - updates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_update_kubernetes_sa.sh` - updates a Jenkins Kubernetes service account credential
- `jenkins_cred_update_secret_file.sh` - updates a Jenkins secret file credential from a file
- `jenkins_cred_update_secret_text.sh` - updates a Jenkins secret string credential from a string or a file
- `jenkins_cred_update_ssh_key.sh` - updates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_update_user_pass.sh` - updates a Jenkins username/password credential
- `jenkins_cred_set_cert.sh` - creates or updates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_set_kubernetes_sa.sh` - creates or updates a Jenkins Kubernetes service account credential
- `jenkins_cred_set_secret_file.sh` - creates or updates a Jenkins secret file credential from a file
- `jenkins_cred_set_secret_text.sh` - creates or updates a Jenkins secret string credential from a string or a file
- `jenkins_cred_set_ssh_key.sh` - creates or updates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_set_user_pass.sh` - creates or updates a Jenkins username/password credential
- `jenkins_cli.sh` - shortens the `jenkins-cli.jar` command by auto-inferring basic configurations, auto-downloading the CLI if absent, inferring a bunch of Jenkins-related variables like `$JENKINS_URL`, `$JENKINS_CLI_ARGS` and authentication using `$JENKINS_USER`/`$JENKINS_PASSWORD`, or finding the admin password from inside the local docker container. Used heavily by the `jenkins.sh` one-shot setup and the following scripts:
- `jenkins_foreach_job_cli.sh` - runs a templated command for each Jenkins job
- `jenkins_create_job_parallel_test_runs.sh` - creates a freestyle parameterized test sleep job and launches N parallel runs of it to test scaling and parallelization of [Jenkins on Kubernetes](https://github.com/HariSekhon/Kubernetes-configs#jenkins-on-kubernetes) agents
- `jenkins_create_job_check_gcp_serviceaccount.sh` - creates a freestyle test job which runs a GCP Metadata query to determine the GCP serviceaccount the agent pod is operating under to check GKE Workload Identity integration
- `jenkins_jobs_download_configs_cli.sh` - downloads all Jenkins job configs to xml files of the same name
- `jenkins_cred_cli_add_cert.sh` - creates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_cli_add_kubernetes_sa.sh` - creates a Jenkins Kubernetes service account credential
- `jenkins_cred_cli_add_secret_file.sh` - creates a Jenkins secret file credential from a file
- `jenkins_cred_cli_add_secret_text.sh` - creates a Jenkins secret string credential from a string or a file
- `jenkins_cred_cli_add_ssh_key.sh` - creates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_cli_add_user_pass.sh` - creates a Jenkins username/password credential
- `jenkins_cred_cli_delete.sh` - deletes a given Jenkins credential by id
- `jenkins_cred_cli_list.sh` - lists Jenkins credentials IDs and Names
- `jenkins_cred_cli_update_cert.sh` - updates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_cli_update_kubernetes_sa.sh` - updates a Jenkins Kubernetes service account credential
- `jenkins_cred_cli_update_secret_file.sh` - updates a Jenkins secret file credential from a file
- `jenkins_cred_cli_update_secret_text.sh` - updates a Jenkins secret string credential from a string or a file
- `jenkins_cred_cli_update_ssh_key.sh` - updates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_cli_update_user_pass.sh` - updates a Jenkins username/password credential
- `jenkins_cred_cli_set_cert.sh` - creates or updates a Jenkins certificate credential from a PKCS#12 keystore
- `jenkins_cred_cli_set_kubernetes_sa.sh` - creates or updates a Jenkins Kubernetes service account credential
- `jenkins_cred_cli_set_secret_file.sh` - creates or updates a Jenkins secret file credential from a file
- `jenkins_cred_cli_set_secret_text.sh` - creates or updates a Jenkins secret string credential from a string or a file
- `jenkins_cred_cli_set_ssh_key.sh` - creates or updates a Jenkins SSH key credential from a string or an SSH private key file
- `jenkins_cred_cli_set_user_pass.sh` - creates or updates a Jenkins username/password credential
- `jenkins_password.sh` - gets Jenkins admin password from local docker container. Used by `jenkins_cli.sh`
- `jenkins_plugins_latest_versions.sh` - finds the latest versions of given Jenkins plugins. Useful to programmatically upgrade your Jenkins on Kubernetes plugins defined in [values.yaml](https://github.com/HariSekhon/Kubernetes-configs/blob/6d9e34b74d3fa8f353b0fe56e74cea3af439e01a/jenkins/base/values.yaml#L145)
- `check_jenkinsfiles.sh` - validates all `*Jenkinsfile*` files in the given directory trees using the online Jenkins validator
- See also [Knowledge Base notes for Jenkins](https://github.com/HariSekhon/Knowledge-Base/blob/main/jenkins.md).
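A hedged usage sketch tying a few of the Jenkins scripts above together (the environment variables are those documented above; the job name and parameters are illustrative):

```shell
export JENKINS_URL=http://localhost:8080
export JENKINS_USER=admin
export JENKINS_PASSWORD="$(jenkins_password.sh)"   # admin password from the local docker container

jenkins_jobs.sh                                    # list all jobs (pipelines)
jenkins_job_trigger_with_params.sh my-job --data ENVIRONMENT=staging --data VERSION=1.2.3
jenkins_jobs_download_configs.sh                   # back up every job config to local XML files
```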
- `teamcity/*.sh` - [TeamCity CI](https://www.jetbrains.com/teamcity/) scripts:
- `teamcity.sh` - one-touch [TeamCity CI](https://www.jetbrains.com/teamcity/) cluster:
- launches Docker containers with server and 1 agent
- click proceed and accept the EULA
- waits for server to initialize
- waits for agent to register
- authorizes agent
- creates a VCS Root if `$PWD` has a `.teamcity.vcs.json` / `.teamcity.vcs.ssh.json` / `.teamcity.vcs.oauth.json` and corresponding `$TEAMCITY_SSH_KEY` or `$TEAMCITY_GITHUB_CLIENT_ID`+`$TEAMCITY_GITHUB_CLIENT_SECRET` environment variables
- creates a Project and imports all settings and builds from the VCS Root
- creates an admin user and an API token for you
- see also: [TeamCity CI](https://github.com/HariSekhon/TeamCity-CI) config repo for importing pipelines
- `teamcity_api.sh` - queries TeamCity's API, auto-handling authentication and other quirks of the API
- `teamcity_create_project.sh` - creates a TeamCity project using the API
- `teamcity_create_github_oauth_connection.sh` - creates a TeamCity GitHub OAuth VCS connection in the Root project, useful for bootstrapping projects from VCS configs
- `teamcity_create_vcs_root.sh` - creates a TeamCity VCS root from a saved configuration (XML or JSON), as downloaded by `teamcity_export_vcs_roots.sh`
- `teamcity_upload_ssh_key.sh` - uploads an SSH private key to a TeamCity project (for use in VCS root connections)
- `teamcity_agents.sh` - lists TeamCity agents, their connected state, authorized state, whether enabled and up to date
- `teamcity_builds.sh` - lists the last 100 TeamCity builds along with their state (eg. `finished`) and status (eg. `SUCCESS`/`FAILURE`)
- `teamcity_buildtypes.sh` - lists TeamCity buildTypes (pipelines) along with their project and IDs
- `teamcity_buildtype_create.sh` - creates a TeamCity buildType from a local JSON configuration (see `teamcity_buildtypes_download.sh`)
- `teamcity_buildtype_set_description_from_github.sh` - syncs a TeamCity buildType's description from its GitHub repo description
- `teamcity_buildtypes_set_description_from_github.sh` - syncs all TeamCity buildType descriptions from their GitHub repos where available
- `teamcity_export.sh` - downloads TeamCity configs to local JSON files in per-project directories mimicking native TeamCity directory structure and file naming
- `teamcity_export_project_config.sh` - downloads TeamCity project config to local JSON files
- `teamcity_export_buildtypes.sh` - downloads TeamCity buildType config to local JSON files
- `teamcity_export_vcs_roots.sh` - downloads TeamCity VCS root config to local JSON files
- `teamcity_projects.sh` - lists TeamCity project IDs and Names
- `teamcity_project_set_versioned_settings.sh` - configures a project to track all changes to a VCS (eg. GitHub)
- `teamcity_project_vcs_versioning.sh` - quickly toggle VCS versioning on/off for a given TeamCity project (useful for testing without auto-committing)
- `teamcity_vcs_roots.sh` - lists TeamCity VCS root IDs and Names
- `travis/*.sh` - [Travis CI](https://travis-ci.org/) API scripts (one of my all-time favourite CI systems):
- `travis_api.sh` - queries the Travis CI API with authentication using `$TRAVIS_TOKEN`
- `travis_repos.sh` - lists Travis CI repos
- `travis_foreach_repo.sh` - executes a templated command against all Travis CI repos
- `travis_repo_build.sh` - triggers a build for the given repo
- `travis_repo_caches.sh` - lists caches for a given repo
- `travis_repo_crons.sh` - lists crons for a given repo
- `travis_repo_env_vars.sh` - lists environment variables for a given repo
- `travis_repo_settings.sh` - lists settings for a given repo
- `travis_repo_create_cron.sh` - creates a cron for a given repo and branch
- `travis_repo_delete_crons.sh` - deletes all crons for a given repo
- `travis_repo_delete_caches.sh` - deletes all caches for a given repo (sometimes clears build problems)
- `travis_delete_cron.sh` - deletes a Travis CI cron by ID
- `travis_repos_settings.sh` - lists settings for all repos
- `travis_repos_caches.sh` - lists caches for all repos
- `travis_repos_crons.sh` - lists crons for all repos
- `travis_repos_create_cron.sh` - creates a cron for all repos
- `travis_repos_delete_crons.sh` - deletes all crons for all repos
- `travis_repos_delete_caches.sh` - deletes all caches for all repos
- `travis_lint.sh` - lints a given `.travis.yml` using the API
- `buildkite/*.sh` - [BuildKite](https://buildkite.com/) API scripts:
- `buildkite_api.sh` - queries the BuildKite API, handling authentication using `$BUILDKITE_TOKEN`
- `buildkite_pipelines.sh` - lists Buildkite pipelines for your `$BUILDKITE_ORGANIZATION` / `$BUILDKITE_USER`
- `buildkite_foreach_pipeline.sh` - executes a templated command for each Buildkite pipeline, replacing the `{user}` and `{pipeline}` in each iteration
- `buildkite_agent.sh` - runs a buildkite agent locally on Linux or Mac, or in Docker with choice of Linux distros
- `buildkite_agents.sh` - lists the Buildkite agents connected along with their hostname, IP, started date and agent details
- `buildkite_create_pipeline.sh` - creates a Buildkite pipeline from a JSON configuration (like from `buildkite_get_pipeline.sh` or `buildkite_save_pipelines.sh`)
- `buildkite_get_pipeline.sh` - gets details for a specific Buildkite pipeline in JSON format
- `buildkite_update_pipeline.sh` - updates a BuildKite pipeline from a configuration provided via stdin or from a file saved via `buildkite_get_pipeline.sh`
- `buildkite_patch_pipeline.sh` - updates a BuildKite pipeline from a partial configuration provided as an arg, via stdin, or from a file saved via `buildkite_get_pipeline.sh`
- `buildkite_pipeline_skip_settings.sh` - lists the skip intermediate build settings for one or more given BuildKite pipelines
- `buildkite_pipeline_set_skip_settings.sh` - configures given or all BuildKite pipelines to skip intermediate builds and cancel running builds in favour of latest build
- `buildkite_cancel_scheduled_builds.sh` - cancels BuildKite scheduled builds (to clear a backlog due to offline agents and just focus on new builds)
- `buildkite_cancel_running_builds.sh` - cancels BuildKite running builds (to clear them and restart new later eg. after agent / environment change / fix)
- `buildkite_pipeline_disable_forked_pull_requests.sh` - disables forked pull request builds on a BuildKite pipeline to protect your build environment from arbitrary code execution security vulnerabilities
- `buildkite_pipelines_vulnerable_forked_pull_requests.sh` - prints each pipeline's forked pull request build setting - these should all return false, otherwise run the above script to close the vulnerability
- `buildkite_rebuild_cancelled_builds.sh` - triggers rebuilds of last N cancelled builds in current pipeline
- `buildkite_rebuild_failed_builds.sh` - triggers rebuilds of last N failed builds in current pipeline (eg. after agent restart / environment change / fix)
- `buildkite_rebuild_all_pipelines_last_cancelled.sh` - triggers rebuilds of the last cancelled build in each pipeline in the organization
- `buildkite_rebuild_all_pipelines_last_failed.sh` - triggers rebuilds of the last failed build in each pipeline in the organization
- `buildkite_retry_jobs_dead_agents.sh` - triggers job retries where jobs failed due to killed agents, continuing builds from that point and replacing their false negative failed status with the real final status, slightly better than rebuilding entire jobs which happen under a new build
- `buildkite_recreate_pipeline.sh` - recreates a pipeline to wipe out all stats (see url and badge caveats in `--help`)
- `buildkite_running_builds.sh` - lists running builds and the agent they're running on
- `buildkite_save_pipelines.sh` - saves all BuildKite pipelines in your `$BUILDKITE_ORGANIZATION` to local JSON files in `$PWD/.buildkite-pipelines/`
- `buildkite_set_pipeline_description.sh` - sets the description of one or more pipelines using the BuildKite API
- `buildkite_set_pipeline_description_from_github.sh` - sets a Buildkite pipeline description to match its source GitHub repo
- `buildkite_sync_pipeline_descriptions_from_github.sh` - for all BuildKite pipelines sets each description to match its source GitHub repo
- `buildkite_trigger.sh` - triggers BuildKite build job for a given pipeline
- `buildkite_trigger_all.sh` - same as above but for all pipelines
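A hedged sketch of backing up and recreating pipelines with the scripts above (organization and token come from the documented environment variables; the saved JSON filename is illustrative, and whether `buildkite_create_pipeline.sh` takes the config as a file argument or on stdin should be checked via `--help`):

```shell
export BUILDKITE_TOKEN=xxxx
export BUILDKITE_ORGANIZATION=my-org

buildkite_save_pipelines.sh                                          # saves to $PWD/.buildkite-pipelines/
buildkite_create_pipeline.sh .buildkite-pipelines/my-pipeline.json   # recreate one from its saved JSON
```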
- `terraform_cloud_*.sh` - [Terraform Cloud](https://www.terraform.io/cloud) API scripts:
- `terraform_cloud_api.sh` - queries the Terraform Cloud API, handling authentication using `$TERRAFORM_TOKEN`
- `terraform_cloud_ip_ranges.sh` - returns the list of IP ranges for Terraform Cloud
- `terraform_cloud_organizations.sh` - lists Terraform Cloud organizations
- `terraform_cloud_workspaces.sh` - lists Terraform Cloud workspaces
- `terraform_cloud_workspace_vars.sh` - lists Terraform Cloud workspace variables
- `terraform_cloud_workspace_set_vars.sh` - adds / updates Terraform workspace-level sensitive environment/terraform variable(s) via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `terraform_cloud_workspace_delete_vars.sh` - deletes one or more Terraform workspace-level variables
- `terraform_cloud_varsets.sh` - lists Terraform Cloud variable sets
- `terraform_cloud_varset_vars.sh` - lists Terraform Cloud variables in one or all variable sets for the given organization
- `terraform_cloud_varset_set_vars.sh` - adds / updates Terraform sensitive environment/terraform variable(s) in a given variable set via the API from `key=value` or shell export format, as args or via stdin (eg. piped from `aws_csv_creds.sh`)
- `terraform_cloud_varset_delete_vars.sh` - deletes one or more Terraform variables in a given variable set
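A hedged sketch of pushing sensitive credentials into a Terraform Cloud workspace using the documented `key=value` / stdin behaviour (the workspace selection argument and CSV filename are illustrative - check `--help`):

```shell
export TERRAFORM_TOKEN=xxxx

# load AWS access keys as sensitive workspace variables, piped from aws_csv_creds.sh
aws_csv_creds.sh my_aws_creds.csv | terraform_cloud_workspace_set_vars.sh my-workspace
```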
- `terraform_*.sh` - [Terraform](https://www.terraform.io/) scripts:
- `terraform_gcs_backend_version.sh` - determines the Terraform state version from the tfstate file in a GCS bucket found in a local given `backend.tf`
- `terraform_gitlab_download_backend_variable.sh` - downloads backend.tf from a GitLab CI/CD variable to be able to quickly iterate plans locally
- `terraform_import.sh` - finds resources of a given type in `./*.tf` code or Terraform plan output that are not in Terraform state and imports them
- `terraform_import_aws_iam_users.sh` - parses Terraform plan output to import new `aws_iam_user` additions into Terraform state
- `terraform_import_aws_iam_groups.sh` - parses Terraform plan output to import new `aws_iam_group` additions into Terraform state
- `terraform_import_aws_iam_policies.sh` - parses Terraform plan output to import new `aws_iam_policy` additions, resolves their ARNs and imports them into Terraform state
- `terraform_import_aws_sso_permission_sets.sh` - finds all `aws_ssoadmin_permission_set` in `./*.tf` code, resolves the ARNs and imports them to Terraform state
- `terraform_import_aws_sso_account_assignments.sh` - parses Terraform plan output to import new `aws_ssoadmin_account_assignment` additions into Terraform state
- `terraform_import_aws_sso_managed_policy_attachments.sh` - parses Terraform plan output to import new `aws_ssoadmin_managed_policy_attachment` additions into Terraform state
- `terraform_import_aws_sso_permission_set_inline_policies.sh` - parses Terraform plan output to import new `aws_ssoadmin_permission_set_inline_policy` additions into Terraform state
- `terraform_import_github_repos.sh` - finds all `github_repository` in `./*.tf` code or Terraform plan output that are not in Terraform state and imports them. See also `github_repos_not_in_terraform.sh`
- `terraform_import_github_team.sh` - imports a given GitHub team into a given Terraform state resource, by first querying the GitHub API for the team ID needed to import into Terraform
- `terraform_import_github_teams.sh` - finds all `github_team` in `./*.tf` code or Terraform plan output that are not in Terraform state, then queries the GitHub API for their IDs and imports them. See also `github_teams_not_in_terraform.sh`
- `terraform_import_github_team_repos.sh` - finds all `github_team_repository` in Terraform plan that would be added, then queries the GitHub API for the repos and team IDs and if they both exist then imports them to Terraform state
- `terraform_resources.sh` - external program to get all resource IDs and attributes for a given resource type, to work around the Terraform splat expression limitation ([#19931](https://github.com/hashicorp/terraform/issues/19931))
- `terraform_managed_resource_types.sh` - quick parse of what Terraform resource types are found in `*.tf` files under the current or given directory tree. Useful to give you a quick glance of what services you are managing
- See also [Knowledge Base notes for Terraform](https://github.com/HariSekhon/Knowledge-Base/blob/main/terraform.md).
- `checkov_resource_*.sh` - [Checkov](https://www.checkov.io/) resource counts - useful to estimate [Bridgecrew Cloud](https://www.bridgecrew.cloud/) costs which are charged per resource:
- `checkov_resource_count.sh` - counts the number of resources Checkov is scanning in the current or given directory
- `checkov_resource_count_all.sh` - counts the total number of resources Checkov is scanning across all given repo checkouts
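A minimal sketch combining the sizing-oriented scripts above on a checkout (both default to the current directory per their descriptions; the directory name is illustrative):

```shell
cd my-terraform-repo
terraform_managed_resource_types.sh   # which resource types this code manages
checkov_resource_count.sh             # how many resources Checkov would scan (Bridgecrew cost sizing)
```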
- `octopus_api.sh` - queries the [Octopus Deploy](https://octopus.com/) API
See also [Knowledge Base notes for CI/CD](https://github.com/HariSekhon/Knowledge-Base/blob/main/ci-cd.md).
### AI & IPaaS
`ai/` and `ipaas/` directories:
- `openai_api.sh` - queries the [OpenAI](https://openai.com/) (ChatGPT) API with authentication
- `make_api.sh` - queries the [Make.com](https://www.make.com) API with authentication
### Internet Services
`bin/`, `pingdom/`, `terraform/` directories:
- `digital_ocean_api.sh` / `doapi.sh` - queries the [Digital Ocean](https://www.digitalocean.com/) API with authentication
- see also the Digital Ocean CLI `doctl` (`install/install_doctl.sh`)
- `atlassian_ip_ranges.sh` - lists [Atlassian](https://www.atlassian.com/)'s IPv4 and/or IPv6 cidr ranges via its API
- `circleci_public_ips.sh` - lists [CircleCI](https://circleci.com) public IP addresses via dnsjson.com
- `cloudflare_*.sh` - [Cloudflare](https://www.cloudflare.com/) API queries and reports:
- `cloudflare_api.sh` - queries the Cloudflare API with authentication
- `cloudflare_ip_ranges.sh` - lists Cloudflare's IPv4 and/or IPv6 cidr ranges via its API
- `cloudflare_custom_certificates.sh` - lists any custom SSL certificates in a given Cloudflare zone along with their status and expiry date
- `cloudflare_dns_records.sh` - lists any Cloudflare DNS records for a zone, including the type and ttl
- `cloudflare_dns_records_all_zones.sh` - same as above but for all zones
- `cloudflare_dns_record_create.sh` - creates a DNS record in the given domain
- `cloudflare_dns_record_update.sh` - updates a DNS record in the given domain
- `cloudflare_dns_record_delete.sh` - deletes a DNS record in the given domain
- `cloudflare_dns_record_details.sh` - lists the details for a DNS record in the given domain in JSON format for further pipe processing
- `cloudflare_dnssec.sh` - lists the Cloudflare DNSSec status for all zones
- `cloudflare_firewall_rules.sh` - lists Cloudflare Firewall rules, optionally with filter expression
- `cloudflare_firewall_access_rules.sh` - lists Cloudflare Firewall Access rules, optionally with filter expression
- `cloudflare_foreach_account.sh` - executes a templated command for each Cloudflare account, replacing the `{account_id}` and `{account_name}` in each iteration (useful for chaining with `cloudflare_api.sh`)
- `cloudflare_foreach_zone.sh` - executes a templated command for each Cloudflare zone, replacing the `{zone_id}` and `{zone_name}` in each iteration (useful for chaining with `cloudflare_api.sh`, used by adjacent `cloudflare_*_all_zones.sh` scripts)
- `cloudflare_purge_cache.sh` - purges the entire Cloudflare cache
- `cloudflare_ssl_verified.sh` - gets the Cloudflare zone SSL verification status for a given zone
- `cloudflare_ssl_verified_all_zones.sh` - same as above for all zones
- `cloudflare_zones.sh` - lists Cloudflare zone names and IDs (needed for writing Terraform Cloudflare code)
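A hedged sketch of the documented `{zone_id}` / `{zone_name}` templating, chaining `cloudflare_foreach_zone.sh` with `cloudflare_api.sh` (the API path argument convention for `cloudflare_api.sh` is an assumption - check `--help`):

```shell
# print each zone's DNS records as JSON, one zone at a time
cloudflare_foreach_zone.sh 'cloudflare_api.sh /zones/{zone_id}/dns_records'
```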
- `datadog_api.sh` - queries the [DataDog](https://www.datadoghq.com/) API with authentication
- `gitguardian_api.sh` - queries the [GitGuardian](https://www.gitguardian.com/) API with authentication
- `jira_api.sh` - queries [Jira](https://www.atlassian.com/software/jira) API with authentication
- `kong_api.sh` - queries the [Kong API Gateway](https://docs.konghq.com/gateway/latest/)'s Admin API, handling authentication if enabled
- `traefik_api.sh` - queries the [Traefik](https://traefik.io/) API, handling authentication if enabled
- `ngrok_api.sh` - queries the [NGrok](https://ngrok.com/) API with authentication
- `pingdom_*.sh` - [Pingdom](https://www.pingdom.com/) API queries and reports for status, latency, average response times, latency averages by hour, SMS credits, outages periods and durations over the last year etc.
- `pingdom_api.sh` - queries the Solarwinds [Pingdom](https://www.pingdom.com/) API with authentication
- `pingdom_foreach_check.sh` - executes a templated command against each Pingdom check, replacing the `{check_id}` and `{check_name}` in each iteration
- `pingdom_checks.sh` - shows all Pingdom checks, status and latencies
- `pingdom_check_outages.sh` / `pingdom_checks_outages.sh` - shows one or all Pingdom checks' outage histories for the last year
- `pingdom_checks_average_response_times.sh` - shows the average response times for all Pingdom checks for the last week
- `pingdom_check_latency_by_hour.sh` / `pingdom_checks_latency_by_hour.sh` - shows the average latency for one or all Pingdom checks broken down by hour of the day, over the last week
- `pingdom_sms_credits.sh` - gets the remaining number of Pingdom SMS credits
- `terraform_cloud_api.sh` - queries [Terraform Cloud](https://www.terraform.io/cloud) API with authentication
- `terraform_cloud_ip_ranges.sh` - returns the list of IP ranges for [Terraform Cloud](https://www.terraform.io/cloud) via the API, or optionally one or more of the ranges used by different functions
- `wordpress.sh` - boots WordPress in Docker with a MySQL backend, and increases the `upload_max_filesize` to be able to restore a real-world-sized export backup
- `wordpress_api.sh` - queries the WordPress API with authentication
- `wordpress_posts_without_category_tags.sh` - checks posts (articles) for categories without corresponding tags and prints the posts and their missing tags
### Java
`java/` directory:
- `java_show_classpath.sh` - shows Java classpaths, one per line, of currently running Java programs
- `jvm_heaps*.sh` - shows the Java heap sizes of all running Java processes, and their total in MB (for performance tuning and sizing)
- Java Decompilers:
- `java_decompile_jar.sh` - decompiles a Java JAR in /tmp, finds the main class and runs a Java decompiler on its main .class file using `jd_gui.sh`
- `jd_gui.sh` - runs Java Decompiler JD GUI, downloading its jar the first time if it's not already present
- `bytecode_viewer.sh` - runs Bytecode-Viewer GUI Java decompiler, downloading its jar the first time if it's not already present
- `cfr.sh` - runs CFR command line Java decompiler, downloading its jar the first time if it's not already present
- `procyon.sh` - runs Procyon command line Java decompiler, downloading its jar the first time if it's not already present
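A hedged one-liner, assuming `java_decompile_jar.sh` takes the path to a JAR as its argument (the jar path is illustrative):

```shell
java_decompile_jar.sh target/myapp-1.0.jar   # unpacks in /tmp, finds the main class and opens it in JD-GUI
```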
See also [Knowledge Base notes for Java](https://github.com/HariSekhon/Knowledge-Base/blob/main/java.md)
and [JVM Performance Tuning](https://github.com/HariSekhon/Knowledge-Base/blob/main/java-jvm-performance-tuning.md).
### Python
`python/` directory:
- `python_compile.sh` - byte-compiles Python scripts and libraries into `.pyo` optimized files
- `python_pip_install.sh` - bulk installs PyPI modules from a mix of arguments / file lists / stdin, accounting for User vs System installs, root vs user sudo, VirtualEnvs / Anaconda / GitHub Workflows / Google Cloud Shell, Mac vs Linux library paths, and an ignore-failure option
- `python_pip_install_if_absent.sh` - installs PyPI modules not already in the Python library path (OS or pip installed) for faster installations only where OS packages are already providing some of the modules, reducing time and failure rates in CI builds
- `python_pip_install_for_script.sh` - installs PyPI modules for given script(s) if not already installed. Used for dynamic individual script dependency installation in the [DevOps Python tools](https://github.com/HariSekhon/DevOps-Python-tools) repo
- `python_pip_reinstall_all_modules.sh` - reinstalls all PyPI modules which can fix some issues
- `pythonpath.sh` - prints all Python library search paths, one per line
- `python_find_library_path.sh` - finds directory where a PyPI module is installed - without args finds the Python library base
- `python_find_library_executable.sh` - finds directory where a PyPI module's CLI program is installed (system vs user, useful when it gets installed to a place that isn't in your `$PATH`, where `which` won't help)
- `python_find_unused_pip_modules.sh` - finds PyPI modules that aren't used by any programs in the current directory tree
- `python_find_duplicate_pip_requirements.sh` - finds duplicate PyPI modules listed for install under the directory tree (useful for deduping module installs in a project and across submodules)
- `python_translate_import_module.sh` - converts Python import modules to PyPI module names, used by `python_pip_install_for_script.sh`
- `python_translate_module_to_import.sh` - converts PyPI module names to Python import names, used by `python_pip_install_if_absent.sh` and `python_find_unused_pip_modules.sh`
- `python_pyinstaller.sh` - creates [PyInstaller](https://pypi.org/project/pyinstaller/) self-contained Python programs with Python interpreter and all PyPI modules included
- `python_pypi_versions.sh` - prints all available versions of a given PyPI module using the API
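A hedged sketch of the documented arguments / stdin behaviour (module names are illustrative):

```shell
python_pip_install_if_absent.sh requests PyYAML   # only installs whatever isn't already importable
echo boto3 | python_pip_install_if_absent.sh      # module names can also be piped on stdin
python_find_library_path.sh requests              # where did it end up?
```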
See also [Knowledge Base notes for Python](https://github.com/HariSekhon/Knowledge-Base/blob/main/python.md).
### Perl
`perl/` directory:
- `perl_cpanm_install.sh` - bulk installs CPAN modules from a mix of arguments / file lists / stdin, accounting for User vs System installs, root vs user sudo, [Perlbrew](https://perlbrew.pl/) / Google Cloud Shell environments, Mac vs Linux library paths, an ignore-failure option, and auto-finds and reads the build failure log for quicker debugging, surfacing the root cause error in CI build logs
- `perl_cpanm_install_if_absent.sh` - installs CPAN modules not already in the Perl library path (OS or CPAN installed) for faster installations only where OS packages are already providing some of the modules, reducing time and failure rates in CI builds
- `perl_cpanm_reinstall_all.sh` - re-installs all CPAN modules. Useful for trying to recompile XS modules on Macs after migration assistant from an Intel Mac to an ARM Silicon Mac leaves your home XS libraries broken as they're built for the wrong architecture
- `perlpath.sh` - prints all Perl library search paths, one per line
- `perl_find_library_path.sh` - finds directory where a CPAN module is installed - without args finds the Perl library base
- `perl_find_library_executable.sh` - finds directory where a CPAN module's CLI program is installed (system vs user, useful when it gets installed to a place that isn't in your `$PATH`, where `which` won't help)
- `perl_find_unused_cpan_modules.sh` - finds CPAN modules that aren't used by any programs in the current directory tree
- `perl_find_duplicate_cpan_requirements.sh` - finds duplicate CPAN modules listed for install more than once under the directory tree (useful for deduping module installs in a project and across submodules)
- `perl_generate_fatpacks.sh` - creates [Fatpacks](https://metacpan.org/pod/App::FatPacker) - self-contained Perl programs with all CPAN modules built-in
See also [Knowledge Base notes for Perl](https://github.com/HariSekhon/Knowledge-Base/blob/main/perl.md).
### Golang
`packages/` directory:
- `golang_install.sh` - bulk installs Golang modules from a mix of arguments / file lists / stdin
- `golang_install_if_absent.sh` - same as above but only if the package binary isn't already available in `$PATH`
- `golang_rm_binaries.sh` - deletes binaries of the same name adjacent to `.go` files. Doesn't delete your `bin/` directories etc. as these are often real deployed applications rather than development binaries
### Media
`media/` directory:
- `mp3_set_artist.sh` / `mp3_set_album.sh` - sets the artist / album tag for all mp3 files under given directories. Useful for grouping artists/albums and audiobook author/books (eg. for correct importing into Mac's Books.app)
- `mp3_set_track_name.sh` - sets the track name metadata for mp3 files under given directories to follow their filenames. Useful for correctly displaying audiobook progress / chapters etc.
- `mp3_set_track_order.sh` - sets the track order metadata for mp3 files under given directories to follow the lexical file naming order. Useful for correctly ordering album songs and audiobook chapters (eg. for Mac's Books.app). Especially useful for enforcing global ordering on multi-CD audiobooks after grouping into a single audiobook using `mp3_set_album.sh` (otherwise default track numbers in each CD interleave in Mac's Books.app)
- `avi_to_mp4.sh` - converts avi files to mp4 using ffmpeg. Useful to be able to play videos on devices like smart TVs that may not recognize newer codecs otherwise
- `mkv_to_mp4.sh` - converts mkv files to mp4 using ffmpeg. Same use case as above
- `youtube_download_channel.sh` - downloads all videos from a given YouTube channel URL
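A hedged sketch of preparing a multi-CD audiobook for Mac's Books.app with the scripts above (the argument order of tag value then directory is an assumption - check each script's `--help`; paths are illustrative):

```shell
mp3_set_artist.sh "Some Author" ~/Audiobooks/SomeBook   # group under one author
mp3_set_album.sh  "Some Book"   ~/Audiobooks/SomeBook   # merge all CDs into one audiobook
mp3_set_track_order.sh          ~/Audiobooks/SomeBook   # enforce global lexical track ordering
```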
See also [Knowledge Base notes for MultiMedia](https://github.com/HariSekhon/Knowledge-Base/blob/main/multimedia.md).
### Spotify
40+ [Spotify](https://www.spotify.com/) API scripts (used extensively to manage my [Spotify-Playlists](https://github.com/HariSekhon/Spotify-Playlists) repo).
`spotify/` directory:
- `spotify_playlists*.sh` - list playlists in either `<id> <name>` or JSON format
- `spotify_playlist_tracks*.sh` - gets playlist contents as track URIs / `Artists - Track` / CSV format - useful for backups or exports between music systems
- `spotify_backup.sh` - backup all Spotify playlists as well as the ordered list of playlists
- `spotify_backup_playlist*.sh` - backup Spotify playlists to local files in both human readable `Artist - Track` format and Spotify URI format for easy restores or adding to new playlists
- `spotify_search*.sh` - search Spotify's library for tracks / albums / artists getting results in human readable format, JSON, or URI formats for easy loading to Spotify playlists
- `spotify_release_year.sh` - searches for a given track or album and finds the original release year
- `spotify_uri_to_name.sh` - convert Spotify track / album / artist URIs to human readable `Artist - Track` / CSV format. Takes Spotify URIs, URL links or just IDs. Reads URIs from files or standard input
- `spotify_create_playlist.sh` - creates a Spotify playlist, either public or private
- `spotify_rename_playlist.sh` - renames a Spotify playlist
- `spotify_set_playlists_public.sh` / `spotify_set_playlists_private.sh` - sets one or more given Spotify playlists to public / private
- `spotify_add_to_playlist.sh` - adds tracks to a given playlist. Takes a playlist name or ID and Spotify URIs in any form from files or standard input. Can be combined with many other tools listed here which output Spotify URIs, or appended from other playlists. Can also be used to restore a Spotify playlist from backups
- `spotify_delete_from_playlist.sh` - deletes tracks from a given playlist. Takes a playlist name or ID and Spotify URIs in any form from files or standard input, optionally prefixed with a track position to remove only specific occurrences (useful for removing duplicates from playlists)
- `spotify_delete_from_playlist_if_in_other_playlists.sh` - deletes tracks from a given playlist if their URIs are found in the subsequently given playlists
- `spotify_delete_from_playlist_if_track_in_other_playlists.sh` - deletes tracks from a given playlist if their 'Artist - Track' name match are found in the subsequently given playlists (less accurate than exact URI deletion above)
- `spotify_duplicate_uri_in_playlist.sh` - finds duplicate Spotify URIs in a given playlist (these are guaranteed exact duplicate matches), returns all but the first occurrence and optionally their track positions (zero-indexed to align with the Spotify API for easy chaining with other tools)
- `spotify_duplicate_tracks_in_playlist.sh` - finds duplicate Spotify tracks in a given playlist (these are identical `Artist - Track` name matches, which may be from different albums / singles)
- `spotify_delete_duplicates_in_playlist.sh` - deletes duplicate Spotify URI tracks (identical) in a given playlist using `spotify_duplicate_uri_in_playlist.sh` and `spotify_delete_from_playlist.sh`
- `spotify_delete_duplicate_tracks_in_playlist.sh` - deletes duplicate Spotify tracks (name matched) in a given playlist using `spotify_duplicate_tracks_in_playlist.sh` and `spotify_delete_from_playlist.sh`
- `spotify_delete_any_duplicates_in_playlist.sh` - calls both of the above scripts to first get rid of duplicate URIs and then remove any other duplicates by track name matches
- `spotify_playlist_tracks_uri_in_year.sh` - finds track URIs in a playlist where their original release date is in a given year or decade (by regex match)
- `spotify_playlist_uri_offset.sh` - finds the offset of a given track URI in a given playlist, useful to find positions to resume processing a large playlist
- `spotify_top_artists*.sh` - lists your top artists in URI or human readable format
- `spotify_top_tracks*.sh` - lists top tracks in URI or human readable format
- `spotify_liked_tracks*.sh` - lists your `Liked Songs` in URI or human readable formats
- `spotify_liked_artists*.sh` - list artists from `Liked Songs` in URI or human readable formats
- `spotify_artists_followed*.sh` - lists all followed artists in URI or human readable formats
- `spotify_artist_tracks.sh` - gets all track URIs for a given artist, from both albums and singles, for chain loading into playlists
- `spotify_follow_artists.sh` - follows artists for the given URIs from files or standard input
- `spotify_follow_top_artists.sh` - follows all artists in your current Spotify top artists list
- `spotify_follow_liked_artists.sh` - follows artists with N or more tracks in your `Liked Songs`
- `spotify_set_tracks_uri_to_liked.sh` - sets a list of Spotify track URIs to 'Liked' so they appear in the `Liked Songs` playlist. Useful for marking all the tracks in your best playlists as favourite tracks, or for porting historical `Starred` tracks to the newer `Liked Songs`
- `spotify_foreach_playlist.sh` - executes a templated command against all playlists, replacing `{playlist}` and `{playlist_id}` in each iteration
- `spotify_playlist_name_to_id.sh` / `spotify_playlist_id_to_name.sh` - convert playlist names <=> IDs
- `spotify_api_token.sh` - gets a Spotify authentication token using either [Client Credentials](https://developer.spotify.com/documentation/general/guides/authorization-guide/#client-credentials-flow) or [Authorization Code](https://developer.spotify.com/documentation/general/guides/authorization-guide/#authorization-code-flow) authentication flows, the latter being able to read/modify private user data, automatically used by `spotify_api.sh`
- `spotify_api.sh` - query any Spotify [API](https://developer.spotify.com/documentation/web-api/reference/) endpoint with authentication, used by adjacent spotify scripts
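A hedged sketch of a backup-and-restore round trip using the scripts above (playlist names and the backup file path are illustrative; check each script's `--help` for exact argument forms):

```shell
spotify_backup.sh                                 # back up all playlists plus the ordered playlist list

spotify_create_playlist.sh "Restored Playlist"    # create a fresh playlist to restore into
spotify_add_to_playlist.sh "Restored Playlist" < backups/my_playlist_uris.txt
```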
### More Linux & Mac
`bin/`, `install/`, `packages/`, `setup/` directories:
- [Linux](https://en.wikipedia.org/wiki/Linux) / [Mac](https://en.wikipedia.org/wiki/MacOS) systems administration scripts:
- `install/` - installation scripts for various OS packages (RPM, Deb, Apk) for various Linux distros ([Redhat RHEL](https://www.redhat.com/en/technologies/linux-platforms/enterprise-linux) / [CentOS](https://www.centos.org/) / [Fedora](https://getfedora.org/), [Debian](https://www.debian.org/) / [Ubuntu](https://ubuntu.com/), [Alpine](https://alpinelinux.org/))
- install if absent scripts for Python, Perl, Ruby, NodeJS and Golang packages - good for minimizing the number of source code installs by first running the OS install scripts and then only building modules which aren't already detected as installed (provided by system packages), speeding up builds and reducing the likelihood of compile failures
- install scripts for tarballs, Golang binaries, random 3rd party installers, [Jython](https://www.jython.org/) and build tools like [Gradle](https://gradle.org/) and [SBT](https://www.scala-sbt.org/) for when Linux distros don't provide packaged versions or where the packaged versions are too old
- `packages/` - OS / Distro Package Management:
- `install_packages.sh` - installs package lists from arguments, files or stdin on major Linux distros and Mac, detecting the package manager and invoking the right install commands, with `sudo` if not root. Works on [RHEL](https://www.redhat.com/en) / [CentOS](https://www.centos.org/) / [Fedora](https://getfedora.org/), [Debian](https://www.debian.org/) / [Ubuntu](https://ubuntu.com/), [Alpine](https://alpinelinux.org/), and [Mac Homebrew](https://brew.sh/). Leverages and supports all features of the distro / OS specific install scripts listed below (see the usage sketch after this list)
- `install_packages_if_absent.sh` - installs package lists if they're not already installed, saving time and minimizing install logs / CI logs, same support list as above
- Redhat RHEL / CentOS:
- `yum_install_packages.sh` / `yum_remove_packages.sh` - installs RPM lists from arguments, files or stdin. Handles Yum + Dnf behavioural differences, calls `sudo` if not root, and auto-attempts python/python2/python3 package name variations. Avoids Yum slowness by first checking via `rpm` whether a package is already installed before attempting to install it. Accepts the `NO_FAIL=1` env var to ignore unavailable / changed package names (useful for optional packages or for trying different package names across RHEL/CentOS/Fedora versions)
- `yum_install_packages_if_absent.sh` - installs RPMs only if not already installed and not a metapackage provided by other packages (eg. `vim` metapackage provided by `vim-enhanced`), saving time and minimizing install logs / CI logs, plus all the features of `yum_install_packages.sh` above
- `rpms_filter_installed.sh` / `rpms_filter_not_installed.sh` - pipe filter packages that are / are not installed for easy script piping
- Debian / Ubuntu:
- `apt_install_packages.sh` / `apt_remove_packages.sh` - installs Deb package lists from arguments, files or stdin. Auto calls `sudo` if not root, accepts `NO_FAIL=1` env var to ignore unavailable / changed package names (useful for optional packages or attempts for different package names across Debian/Ubuntu distros/versions)
- `apt_install_packages_if_absent.sh` - installs Deb packages only if not already installed, saving time and minimizing install logs / CI logs, plus all the features of `apt_install_packages.sh` above
- `apt_wait.sh` - blocking wait on concurrent apt locks to avoid failures and continue when the lock becomes available, mimicking yum's waiting behaviour rather than erroring out
- `debs_filter_installed.sh` / `debs_filter_not_installed.sh` - pipe filter packages that are / are not installed for easy script piping
- Alpine:
- `apk_install_packages.sh` / `apk_remove_packages.sh` - installs Alpine apk package lists from arguments, files or stdin. Auto calls `sudo` if not root, accepts `NO_FAIL=1` env var to ignore unavailable / changed package names (useful for optional packages or attempts for different package names across Alpine versions)
- `apk_install_packages_if_absent.sh` - installs Alpine apk packages only if not already installed, saving time and minimizing install logs / CI logs, plus all the features of `apk_install_packages.sh` above
- `apk_filter_installed.sh` / `apk_filter_not_installed.sh` - pipe filter packages that are / are not installed for easy script piping
- Mac:
- `brew_install_packages.sh` / `brew_remove_packages.sh` - installs Mac Homebrew package lists from arguments, files or stdin. Accepts `NO_FAIL=1` env var to ignore unavailable / changed package names (useful for optional packages or for trying different package names across versions)
- `brew_install_packages_if_absent.sh` - installs Mac Homebrew packages only if not already installed, saving time and minimizing install logs / CI logs, plus all the features of `brew_install_packages.sh` above
- `brew_filter_installed.sh` / `brew_filter_not_installed.sh` - pipe filter packages that are / are not installed for easy script piping
- `brew_package_owns.sh` - finds which brew package owns a given filename argument
- all builds across all my GitHub repos now run `make system-packages` before `make pip` / `make cpan` to reduce the number of packages that need installing from source, speeding up builds and reducing the chances of build failures
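A short usage sketch of the package scripts above - hypothetical invocations, assuming the scripts are on `$PATH` (eg. after `make link`):
```shell
# Hypothetical sketch - assumes the scripts are on $PATH

# install packages, auto-detecting yum/dnf/apt/apk/brew and using sudo if not root
install_packages.sh curl wget jq

# only install what isn't already present, keeping install / CI logs short
install_packages_if_absent.sh git make

# tolerate package names that don't exist on this distro / version
NO_FAIL=1 apt_install_packages.sh python3-pip python-pip

# pipe filter: print only the packages that are not yet installed
echo jq | debs_filter_not_installed.sh
```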
### Builds, Languages & Linting
`bin/`, `checks/`, `cicd/` or language specific directories:
- `lint.sh` - lints one or more files, auto-determines the file types, parses lint headers and calls the appropriate scripts and tools. Integrated with my custom `.vimrc` (see the usage sketch after this list)
- `run.sh` - runs one or more files, auto-determines the file types, any run or arg headers and executes each file using the appropriate script or CLI tool. Integrated with my custom `.vimrc`
- `check_*.sh` - extensive collection of generalized tests - these run against all my GitHub repos via [CI](https://harisekhon.github.io/CI-CD/). Some examples:
- Programming language linting:
- [Python](https://www.python.org/) (syntax, pep8, byte-compiling, reliance on asserts which can be disabled at runtime, except/pass etc.)
- [Perl](https://www.perl.org/)
- [Java](https://www.java.com/en/)
- [Scala](https://www.scala-lang.org/)
- [Ruby](https://www.ruby-lang.org/en/)
- [Bash](https://www.gnu.org/software/bash/) / Shell
- Misc (whitespace, custom code checks etc.)
- Build System, Docker & CI linting:
- [Make](https://www.gnu.org/software/make/)
- [Maven](https://maven.apache.org/)
- [SBT](https://www.scala-sbt.org/)
- [Gradle](https://gradle.org/)
- [Travis CI](https://travis-ci.org/)
- [Circle CI](https://circleci.com/)
- [GitLab CI](https://docs.gitlab.com/ee/ci/)
- [Concourse CI](https://concourse-ci.org/)
- [Codefresh CI](https://codefresh.io/)
- [Dockerfiles](https://docs.docker.com/engine/reference/builder/)
- [Docker Compose](https://docs.docker.com/compose/)
- [Vagrantfiles](https://www.vagrantup.com/docs/vagrantfile)
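A minimal usage sketch for `lint.sh` and `run.sh` above - hypothetical invocations and file names, assuming the scripts are on `$PATH`:
```shell
# Hypothetical sketch - file types are auto-detected from each argument

# lint a mix of file types in one go
lint.sh setup.py Dockerfile docker-compose.yml

# run a script with the interpreter / tool inferred from its type and any run/arg headers
run.sh myscript.py
```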
## Individual Setup Parts
Optional, only if you don't do the full `make install`.
Install only OS system package dependencies and [AWS CLI](https://aws.amazon.com/cli/) via Python Pip (doesn't symlink anything to `$HOME`):
```shell
make
```
Adds sourcing to `.bashrc` and `.bash_profile` and symlinks dot config files to `$HOME` (doesn't install OS system package dependencies):
```shell
make link
```
Undo via:
```shell
make unlink
```
Install only OS system package dependencies (doesn't include [AWS CLI](https://aws.amazon.com/cli/) or Python packages):
```shell
make system-packages
```
Install [AWS CLI](https://aws.amazon.com/cli/):
```shell
make aws
```
Install [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/):
```shell
make azure
```
Install [GCP GCloud SDK](https://cloud.google.com/sdk) (includes CLI):
```shell
make gcp
```
Install [GCP GCloud Shell](https://cloud.google.com/shell) environment (sets up persistent OS packages and all home directory configs):
```shell
make gcp-shell
```
Install generically useful Python CLI tools and modules (includes [AWS CLI](https://aws.amazon.com/cli/), autopep8 etc):
```shell
make python
```
### Full Help
```shell
> make help
Usage:
Common Options:
make help show this message
make build installs all dependencies - OS packages and any language libraries via native tools eg. pip, cpanm, gem, go etc that are not available via OS packages
make build-retry retries 'make build' x 3 until success to try to mitigate temporary upstream repo failures triggering false alerts in CI systems
make ci prints env, then runs 'build-retry' for more resilient CI builds with debugging
make printenv prints environment variables, CPU cores, OS release, $PWD, Git branch, hashref etc. Useful for CI debugging
make system-packages installs OS packages only (detects OS via whichever package manager is available)
make test run tests
make clean removes compiled / generated files, downloaded tarballs, temporary files etc.
make submodules initialize and update submodules to the right release (done automatically by build / system-packages)
make init same as above, often useful to do in CI systems to get access to additional submodule provided targets such as 'make ci'
make cpan install any modules listed in any cpan-requirements.txt files if not already installed
make pip install any modules listed in any requirements.txt files if not already installed
make python-compile compile any python files found in the current directory and 1 level of subdirectory
make pycompile
make github open browser at github project
make readme open browser at github's README
make github-url print github url and copy to clipboard
make status open browser at Github CI Builds overview Status page for all projects
make ls print list of code files in project
make wc show counts of files and lines
Repo specific options:
make install builds all script dependencies, installs AWS CLI, symlinks all config files to $HOME and adds sourcing of bash profile
make link symlinks all config files to $HOME and adds sourcing of bash profile
make unlink removes all symlinks pointing to this repo's config files and removes the sourcing lines from .bashrc and .bash_profile
make python-desktop installs all Python Pip packages for desktop workstation listed in setup/pip-packages-desktop.txt
make perl-desktop installs all Perl CPAN packages for desktop workstation listed in setup/cpan-packages-desktop.txt
make ruby-desktop installs all Ruby Gem packages for desktop workstation listed in setup/gem-packages-desktop.txt
make golang-desktop installs all Golang packages for desktop workstation listed in setup/go-packages-desktop.txt
make nodejs-desktop installs all NodeJS packages for desktop workstation listed in setup/npm-packages-desktop.txt
make desktop installs all of the above + many desktop OS packages listed in setup/
make mac-desktop all of the above + installs a bunch of major common workstation software packages like Ansible, Terraform, MiniKube, MiniShift, SDKman, Travis CI, CCMenu, Parquet tools etc.
make linux-desktop
make ls-scripts print list of scripts in this project, ignoring code libraries in lib/ and .bash.d/
make github-cli installs GitHub CLI
make kubernetes installs Kubernetes kubectl and kustomize to ~/bin/
make terraform installs Terraform to ~/bin/
make vim installs Vundle and plugins
make tmux installs TMUX TPM and plugin for kubernetes context
make ccmenu installs and (re)configures CCMenu to watch this and all other major HariSekhon GitHub repos
make status open the Github Status page of all my repos build statuses across all CI platforms
make aws installs AWS CLI tools
make azure installs Azure CLI
make gcp installs Google Cloud SDK
make digital-ocean installs Digital Ocean CLI
make aws-shell sets up AWS Cloud Shell: installs core packages and links configs
(maintains itself across future Cloud Shells via .aws_customize_environment hook)
make gcp-shell sets up GCP Cloud Shell: installs core packages and links configs
(maintains itself across future Cloud Shells via .customize_environment hook)
make azure-shell sets up Azure Cloud Shell (limited compared to gcp-shell, doesn't install OS packages since there is no sudo)
Now exiting usage help with status code 3 to explicitly prevent silent build failures from stray 'help' arguments
make: *** [help] Error 3
```
(`make help` exits with error code 3, like most of my programs, to differentiate it from build success and ensure a stray `help` argument can't silently skip the real build while returning exit code 0)
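For example, a guard like the following - a hedged illustration of that behaviour, not a new feature - surfaces a stray `help` invocation as a failure:
```shell
# the help target deliberately fails ("Error 3" above), so a stray 'help'
# argument shows up as a non-zero exit in scripts and CI rather than a silent pass
if ! make help > /dev/null 2>&1; then
    echo "make help exited non-zero as intended"
fi
```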
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=HariSekhon/DevOps-Bash-tools&type=Date)](https://star-history.com/#HariSekhon/DevOps-Bash-tools&Date)
[git.io/bash-tools](https://git.io/bash-tools)
## More Core Repos
<!-- OTHER_REPOS_START -->
### Knowledge
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Knowledge-Base&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Knowledge-Base)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Diagrams-as-Code&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Diagrams-as-Code)
<!--
Not supported in GitHub Markdown:
<iframe src="https://raw.githubusercontent.com/HariSekhon/HariSekhon/main/knowledge.md" width="100%" height="500px"></iframe>
Does nothing:
<embed src="https://raw.githubusercontent.com/HariSekhon/HariSekhon/main/knowledge.md" width="100%" height="500px" />
-->
### DevOps Code
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=DevOps-Bash-tools&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/DevOps-Bash-tools)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=DevOps-Python-tools&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/DevOps-Python-tools)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=DevOps-Perl-tools&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/DevOps-Perl-tools)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=DevOps-Golang-tools&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/DevOps-Golang-tools)
<!--
[![Gist Card](https://github-readme-stats.vercel.app/api/gist?id=f8f551332440f1ca8897ff010e363e03)](https://gist.github.com/HariSekhon/f8f551332440f1ca8897ff010e363e03)
-->
### Containerization
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Kubernetes-configs&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Kubernetes-configs)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Dockerfiles&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Dockerfiles)
### CI/CD
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=GitHub-Actions&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/GitHub-Actions)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Jenkins&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Jenkins)
### DBA - SQL
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=SQL-scripts&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/SQL-scripts)
### DevOps Reloaded
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Nagios-Plugins&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Nagios-Plugins)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=HAProxy-configs&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/HAProxy-configs)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Terraform&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Terraform)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Packer-templates&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Packer-templates)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Nagios-Plugin-Kafka&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Nagios-Plugin-Kafka)
### Templates
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Templates&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Templates)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Template-repo&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Template-repo)
### Misc
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Spotify-tools&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Spotify-tools)
[![Readme Card](https://github-readme-stats.vercel.app/api/pin/?username=HariSekhon&repo=Spotify-playlists&theme=ambient_gradient&description_lines_count=3)](https://github.com/HariSekhon/Spotify-playlists)
The rest of my original source repos are
[here](https://github.com/HariSekhon?tab=repositories&q=&type=source&language=&sort=stargazers).
Pre-built Docker images are available on my [DockerHub](https://hub.docker.com/u/harisekhon/).
<!-- 1x1 pixel counter to record hits -->
![](https://hit.yhype.me/github/profile?user_id=2211051)
<!-- OTHER_REPOS_END -->