updated URLS

pull/2/head
Hari Sekhon 2 years ago
parent 1224e1731e
commit 124c769e6c

@@ -321,7 +321,7 @@ etc.
- `aws_terraform_create_credential.sh` - creates an AWS Terraform service account with Administrator permissions for Terraform Cloud or other CI/CD systems to run Terraform plan and apply, since no CI/CD system can work with AWS SSO workflows. Stores the access key as CSV and prints both shell export commands and credentials file config as above
- `.envrc-aws` - copy to `.envrc` for `direnv` to auto-load AWS configuration settings such as AWS Profile, Compute Region, EKS cluster kubectl context etc.
- calls `.envrc-kubernetes` to set the `kubectl` context isolated to current shell to prevent race conditions between shells and scripts caused by otherwise naively changing the global `~/.kube/config` context
-- `aws_account_summary.sh` - prints AWS account summary in `key = value` pairs for easy viewing / grepping of things like `AccountMFAEnabled`, `AccountAccessKeysPresent`, useful for checking whether the root account has MFA enabled and no access keys, comparing number of users vs number of MFA devices etc. (see also `check_aws_root_account.py` in [Advanced Nagios Plugins](https://github.com/harisekhon/nagios-plugins))
+- `aws_account_summary.sh` - prints AWS account summary in `key = value` pairs for easy viewing / grepping of things like `AccountMFAEnabled`, `AccountAccessKeysPresent`, useful for checking whether the root account has MFA enabled and no access keys, comparing number of users vs number of MFA devices etc. (see also `check_aws_root_account.py` in [Advanced Nagios Plugins](https://github.com/HariSekhon/Nagios-Plugins))
- `aws_billing_alarm.sh` - creates a [CloudWatch](https://aws.amazon.com/cloudwatch/) billing alarm and [SNS](https://aws.amazon.com/sns/) topic with subscription to email you when you incur charges above a given threshold. This is often the first thing you want to do on an account
- `aws_budget_alarm.sh` - creates an [AWS Budgets](https://aws.amazon.com/aws-cost-management/aws-budgets/) billing alarm and [SNS](https://aws.amazon.com/sns/) topic with subscription to email you both when forecasted charges exceed 80% of your budget and when actual usage exceeds 90% of it. This is often the first thing you want to do on an account
- `aws_batch_stale_jobs.sh` - lists [AWS Batch](https://aws.amazon.com/batch/) jobs that are older than N hours in a given queue
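The per-shell context isolation that `.envrc-kubernetes` provides can be sketched roughly as follows (an illustrative sketch of the idea only, not the actual script; the direnv hook and file layout are assumptions):

```shell
# Point KUBECONFIG at a private per-shell copy of the kube config, so
# 'kubectl config use-context' here cannot race with other shells and
# scripts that would otherwise all mutate the global ~/.kube/config
kubeconfig_isolated="$(mktemp)"
if [ -f "$HOME/.kube/config" ]; then
    # seed the private copy from the global config if one exists
    cp "$HOME/.kube/config" "$kubeconfig_isolated"
fi
export KUBECONFIG="$kubeconfig_isolated"
# context switches from here on are visible only to this shell
echo "KUBECONFIG=$KUBECONFIG"
```

Because `KUBECONFIG` is an environment variable, each shell that sources such a snippet gets its own private copy, which is what prevents the race conditions mentioned above.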
@@ -570,7 +570,7 @@ etc.
#### Big Data & NoSQL
-- `kafka_*.sh` - scripts to make [Kafka](http://kafka.apache.org/) CLI usage easier including auto-setting Kerberos to source TGT from environment and auto-populating broker and zookeeper addresses. These are auto-added to the `$PATH` when `.bashrc` is sourced. For something similar for [Solr](https://lucene.apache.org/solr/), see `solr_cli.pl` in the [DevOps Perl Tools](https://github.com/harisekhon/devops-perl-tools) repo.
+- `kafka_*.sh` - scripts to make [Kafka](http://kafka.apache.org/) CLI usage easier including auto-setting Kerberos to source TGT from environment and auto-populating broker and zookeeper addresses. These are auto-added to the `$PATH` when `.bashrc` is sourced. For something similar for [Solr](https://lucene.apache.org/solr/), see `solr_cli.pl` in the [DevOps Perl Tools](https://github.com/HariSekhon/DevOps-Perl-tools) repo.
- `zookeeper*.sh` - [Apache ZooKeeper](https://zookeeper.apache.org/) scripts:
- `zookeeper_client.sh` - shortens `zookeeper-client` command by auto-populating the zookeeper quorum from the environment variable `$ZOOKEEPERS` or else parsing the zookeeper quorum from `/etc/**/*-site.xml` to make it faster and easier to connect
- `zookeeper_shell.sh` - shortens Kafka's `zookeeper-shell` command by auto-populating the zookeeper quorum from the environment variable `$KAFKA_ZOOKEEPERS` and optionally `$KAFKA_ZOOKEEPER_ROOT` to make it faster and easier to connect
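The quorum auto-population described for `zookeeper_client.sh` amounts to logic like this (a hedged sketch; the function name, property matching, and config glob are illustrative assumptions, not the script's actual code):

```shell
# Sketch: prefer an explicit $ZOOKEEPERS environment variable, otherwise
# scrape the first zookeeper-related <value> from Hadoop-style *-site.xml
get_zookeepers() {
    if [ -n "${ZOOKEEPERS:-}" ]; then
        # explicit environment variable wins
        echo "$ZOOKEEPERS"
        return
    fi
    # crude fallback: first <value>host:2181,...</value> after a line
    # mentioning zookeeper in any site config (glob is an assumption)
    grep -h -A1 'zookeeper' /etc/*/conf/*-site.xml 2>/dev/null |
        sed -n 's|.*<value>\(.*\)</value>.*|\1|p' |
        head -n1
}

ZOOKEEPERS="zk1:2181,zk2:2181,zk3:2181"
get_zookeepers   # prints the quorum from the environment
```

With `$ZOOKEEPERS` set, the XML parsing is never reached, which keeps connections fast on hosts without Hadoop configs.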
@@ -595,7 +595,7 @@ etc.
- `impala_tables_column_counts.sh` - lists the column count per Impala table
- `hdfs_*.sh` - Hadoop [HDFS](https://en.wikipedia.org/wiki/Apache_Hadoop#Hadoop_distributed_file_system) scripts:
- `hdfs_checksum*.sh` - walks an HDFS directory tree and outputs HDFS native checksums (faster) or portable externally comparable CRC32, in serial or in parallel to save time
-- `hdfs_find_replication_factor_1.sh` / `hdfs_set_replication_factor_3.sh` - finds HDFS files with replication factor 1 / sets HDFS files with replication factor <=2 to replication factor 3 to repair replication safety and avoid no replica alarms during maintenance operations (see also Python API version in the [DevOps Python Tools](https://github.com/harisekhon/devops-python-tools) repo)
+- `hdfs_find_replication_factor_1.sh` / `hdfs_set_replication_factor_3.sh` - finds HDFS files with replication factor 1 / sets HDFS files with replication factor <=2 to replication factor 3 to repair replication safety and avoid no replica alarms during maintenance operations (see also Python API version in the [DevOps Python Tools](https://github.com/HariSekhon/DevOps-Python-tools) repo)
- `hdfs_file_size.sh` / `hdfs_file_size_including_replicas.sh` - quickly differentiate HDFS files raw size vs total replicated size
- `hadoop_random_node.sh` - picks a random Hadoop cluster worker node, like a cheap CLI load balancer, useful in scripts when you want to connect to any worker etc. See also the [HAProxy Load Balancer configurations](https://github.com/HariSekhon/HAProxy-configs) which focuses on master nodes
- `cloudera_*.sh` - [Cloudera](https://www.cloudera.com/) scripts:
@@ -998,7 +998,7 @@ etc.
#### Data Format Conversion & Validation
- `csv_header_indices.sh` - list CSV headers with their zero-indexed numbers, a useful reference when coding against column positions
-- Data format validation `validate_*.py` from [DevOps Python Tools repo](https://github.com/harisekhon/devops-python-tools):
+- Data format validation `validate_*.py` from [DevOps Python Tools repo](https://github.com/HariSekhon/DevOps-Python-tools):
- CSV
- JSON
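The behaviour of `csv_header_indices.sh` can be reproduced with a short pipeline (a sketch using a throwaway sample file; the real script presumably takes the CSV path as an argument):

```shell
# Sketch: print each CSV header with its zero-based column index
tmpcsv="$(mktemp)"
printf 'name,age,city\nalice,30,london\n' > "$tmpcsv"  # illustrative sample
head -n1 "$tmpcsv" |        # take only the header row
    tr ',' '\n' |           # one column name per line
    awk '{print NR-1, $0}'  # NR is 1-based, so NR-1 gives the zero-based index
```

This prints `0 name`, `1 age`, `2 city`, matching how you would then reference columns in `awk` or `cut` (which are 1-based, hence the usefulness of a quick index reference).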
@@ -1014,7 +1014,7 @@ etc.
### See Also
-- [DevOps Python Tools](https://github.com/harisekhon/devops-python-tools) - 80+ DevOps CLI tools for AWS, GCP, Hadoop, HBase, Spark, Log Anonymizer, Ambari Blueprints, AWS CloudFormation, Linux, Docker, Spark Data Converters & Validators (Avro / Parquet / JSON / CSV / INI / XML / YAML), Elasticsearch, Solr, Travis CI, Pig, IPython
+- [DevOps Python Tools](https://github.com/HariSekhon/DevOps-Python-tools) - 80+ DevOps CLI tools for AWS, GCP, Hadoop, HBase, Spark, Log Anonymizer, Ambari Blueprints, AWS CloudFormation, Linux, Docker, Spark Data Converters & Validators (Avro / Parquet / JSON / CSV / INI / XML / YAML), Elasticsearch, Solr, Travis CI, Pig, IPython
- [SQL Scripts](https://github.com/HariSekhon/SQL-scripts) - 100+ SQL Scripts - PostgreSQL, MySQL, AWS Athena, Google BigQuery
@@ -1022,7 +1022,7 @@ etc.
- [Kubernetes configs](https://github.com/HariSekhon/Kubernetes-configs) - Kubernetes YAML configs - Best Practices, Tips & Tricks are baked right into the templates for future deployments
-- [The Advanced Nagios Plugins Collection](https://github.com/harisekhon/nagios-plugins) - 450+ programs for Nagios monitoring your Hadoop & NoSQL clusters. Covers every Hadoop vendor's management API and every major NoSQL technology (HBase, Cassandra, MongoDB, Elasticsearch, Solr, Riak, Redis etc.) as well as message queues (Kafka, RabbitMQ), continuous integration (Jenkins, Travis CI) and traditional infrastructure (SSL, Whois, DNS, Linux)
+- [The Advanced Nagios Plugins Collection](https://github.com/HariSekhon/Nagios-Plugins) - 450+ programs for Nagios monitoring your Hadoop & NoSQL clusters. Covers every Hadoop vendor's management API and every major NoSQL technology (HBase, Cassandra, MongoDB, Elasticsearch, Solr, Riak, Redis etc.) as well as message queues (Kafka, RabbitMQ), continuous integration (Jenkins, Travis CI) and traditional infrastructure (SSL, Whois, DNS, Linux)
- [DevOps Perl Tools](https://github.com/harisekhon/perl-tools) - 25+ DevOps CLI tools for Hadoop, HDFS, Hive, Solr/SolrCloud CLI, Log Anonymizer, Nginx stats & HTTP(S) URL watchers for load balanced web farms, Dockerfiles & SQL ReCaser (MySQL, PostgreSQL, AWS Redshift, Snowflake, Apache Drill, Hive, Impala, Cassandra CQL, Microsoft SQL Server, Oracle, Couchbase N1QL, Dockerfiles, Pig Latin, Neo4j, InfluxDB), Ambari FreeIPA Kerberos, Datameer, Linux...

@@ -48,7 +48,7 @@ This can be safely ignored, the rest of the IAM account summary info containing
See Also:
aws_iam_users_mfa_active_report.sh (adjacent)
-check_aws_root_account.py - in The Advanced Nagios Plugins collection (https://github.com/harisekhon/nagios-plugins)
+check_aws_root_account.py - in The Advanced Nagios Plugins collection (https://github.com/HariSekhon/Nagios-Plugins)
$usage_aws_cli_required

@@ -35,8 +35,8 @@ See Also:
more AWS tools in the DevOps Python Tools repo and The Advanced Nagios Plugins Collection:
-- https://github.com/harisekhon/devops-python-tools
-- https://github.com/harisekhon/nagios-plugins
+- https://github.com/HariSekhon/DevOps-Python-tools
+- https://github.com/HariSekhon/Nagios-Plugins
$usage_aws_cli_required

@@ -31,7 +31,7 @@ See Also:
aws_users_access_key_age.py - in DevOps Python Tools which is able to filter by age and status
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
awless list accesskeys --format tsv | grep 'years[[:space:]]*$'

@@ -36,7 +36,7 @@ See Also:
aws_users_access_key_age.py - in DevOps Python Tools which is able to filter by age and status
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
awless list accesskeys --format tsv | grep 'years[[:space:]]*$'

@@ -36,7 +36,7 @@ See Also:
See similar tools in DevOps Python Tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
$usage_aws_cli_required

@@ -31,7 +31,7 @@ user,access_key_1_active,access_key_1_last_used_date,access_key_2_active,access_
See similar tools in DevOps Python Tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
$usage_aws_cli_required

@@ -38,8 +38,8 @@ to check your root account isn't being used
See similar tools in the DevOps Python Tools repo and The Advanced Nagios Plugins Collection:
-- https://github.com/harisekhon/devops-python-tools
-- https://github.com/harisekhon/nagios-plugins
+- https://github.com/HariSekhon/DevOps-Python-tools
+- https://github.com/HariSekhon/Nagios-Plugins
$usage_aws_cli_required

@@ -38,8 +38,8 @@ to check your root account isn't being used
See similar tools in the DevOps Python Tools repo and The Advanced Nagios Plugins Collection:
-- https://github.com/harisekhon/devops-python-tools
-- https://github.com/harisekhon/nagios-plugins
+- https://github.com/HariSekhon/DevOps-Python-tools
+- https://github.com/HariSekhon/Nagios-Plugins
$usage_aws_cli_required

@@ -28,7 +28,7 @@ See Also:
- check_aws_users_password_last_used.py in the Advanced Nagios Plugins collection
-https://github.com/harisekhon/nagios-plugins
+https://github.com/HariSekhon/Nagios-Plugins
awless list users

@@ -59,7 +59,7 @@ See also:
https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Usinghive-site.xmltoautomaticallyconnecttoHiveServer2
-hive_foreach_table.py / impala_foreach_table.py and similar tools in DevOps Python Tools repo - https://github.com/harisekhon/devops-python-tools
+hive_foreach_table.py / impala_foreach_table.py and similar tools in DevOps Python Tools repo - https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -19,7 +19,7 @@
# This is only for local use, there is a much better Python version in my DevOps Python Tools repo:
#
-# https://github.com/harisekhon/devops-python-tools
+# https://github.com/HariSekhon/DevOps-Python-tools
set -euo pipefail

@@ -35,7 +35,7 @@ find_duplicate_files.py
in the DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -32,7 +32,7 @@ find_duplicate_files.py
in the DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -19,9 +19,9 @@
#
# See also:
#
-# find_active_*.py - https://github.com/harisekhon/devops-python-tools
+# find_active_*.py - https://github.com/HariSekhon/DevOps-Python-tools
#
-# HAProxy Configs for many Hadoop and other technologies - https://github.com/harisekhon/haproxy-configs
+# HAProxy Configs for many Hadoop and other technologies - https://github.com/HariSekhon/HAProxy-configs
#
set -euo pipefail

@@ -37,7 +37,7 @@ See also:
hdfs_find_replication_factor_1.py in DevOps Python tools repo which can
also reset these found files back to replication factor 3 to fix the issue
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
usage: ${0##*/} <file_or_directory_paths>

@@ -40,7 +40,7 @@ and
hdfs_find_replication_factor_1.py in DevOps Python tools repo which can
also reset these found files back to replication factor 3 to fix the issue
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
usage: ${0##*/} <file_or_directory_paths>

@@ -32,7 +32,7 @@ Tested on Hive 1.1.0 on CDH 5.10, 5.16
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
you will need to comment out / remove the 'set -o pipefail' to skip errors if you aren't authorized to use
any of the databases to avoid the script exiting early upon encountering any authorization error such as:

@@ -37,7 +37,7 @@ Tested on Hive 1.1.0 on CDH 5.10, 5.16
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
you will need to comment out / remove '-o pipefail' below to skip errors if you aren't authorized to use
any of the databases to avoid the script exiting early upon encountering any authorization error such as:

@@ -44,7 +44,7 @@ For Hive < 3.0 - consider using adjacent impala_list_tables.sh instead as it is
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
Hive doesn't suffer from db authz issue listing metadata like Impala, which gets:

@@ -40,7 +40,7 @@ For more documentation see the comments at the top of beeline.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -40,7 +40,7 @@ For more documentation see the comments at the top of beeline.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -40,7 +40,7 @@ For more documentation see the comments at the top of beeline.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -37,7 +37,7 @@ Tested on Hive 1.1.0 on CDH 5.10, 5.16
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
you will need to comment out / remove '-o pipefail' below to skip errors if you aren't authorized to use
any of the databases to avoid the script exiting early upon encountering any authorization error such as:

@@ -37,7 +37,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
'set -o pipefail' is not enabled in order to skip authorization errors such as that documented in impala_list_tables.sh
and also ignore errors from the 'select count(*)' in the loop as Impala often has metadata errors such as:

@@ -35,7 +35,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -39,7 +39,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
you will need to comment out / remove 'set -o pipefail' below to skip errors if you aren't authorized to use
any of the databases to avoid the script exiting early upon encountering any authorization error such as:

@@ -35,9 +35,9 @@ If using dedicated coordinators then consider setting IMPALA_HOST to one of thos
See also:
-find_active_impalad.py - https://github.com/harisekhon/devops-python-tools
+find_active_impalad.py - https://github.com/HariSekhon/DevOps-Python-tools
-HAProxy Configs for Impala and many other technologies - https://github.com/harisekhon/haproxy-configs
+HAProxy Configs for Impala and many other technologies - https://github.com/HariSekhon/HAProxy-configs
If you get an error such as:
@@ -108,7 +108,7 @@ if [ -n "${IMPALA_HOST:-}" ]; then
elif [ -f "$topology_map" ]; then
#echo "picking random impala from hadoop topology map" >&2
# nodes in the topology map that aren't masters, namenodes, controlnodes etc probably have impalad running on them, so pick one at random to connect to
-# or alternatively use HAProxy config for load balanced impala clusters - see https://github.com/harisekhon/haproxy-configs
+# or alternatively use HAProxy config for load balanced impala clusters - see https://github.com/HariSekhon/HAProxy-configs
impalad="$(
awk -F'"' '/<node name="[A-Za-z]/{print $2}' "$topology_map" |
grep -Ev '^[^.]*(name|master|control)' |
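The hunk above is cut off mid-pipeline by the diff boundary; a self-contained sketch of the same random-worker selection idea, reusing the source's `awk`/`grep` filters with an illustrative sample topology map and an assumed `shuf` final step:

```shell
# Sample topology map (illustrative; real clusters have many more nodes)
topology_map="$(mktemp)"
cat > "$topology_map" <<'EOF'
<topology>
  <node name="master1.example.com"/>
  <node name="worker1.example.com"/>
  <node name="worker2.example.com"/>
</topology>
EOF

impalad="$(
    # extract node names, drop likely master/control nodes, pick one at random
    awk -F'"' '/<node name="[A-Za-z]/{print $2}' "$topology_map" |
    grep -Ev '^[^.]*(name|master|control)' |
    shuf -n 1
)"
echo "$impalad"    # one of the worker hostnames
```

`shuf -n 1` is one way to finish the pipeline; `sort -R | head -n1` is a rougher portable alternative on systems without GNU coreutils.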

@@ -48,7 +48,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -48,7 +48,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -48,7 +48,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
"
# used by usage() in lib/utils.sh

@@ -40,7 +40,7 @@ For more documentation see the comments at the top of impala_shell.sh
For a better version written in Python see DevOps Python tools repo:
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
'set -o pipefail' is commented out to skip authorization errors such as that documented in impala_list_tables.sh
and also ignore errors from the 'select count(*)' in the loop as Impala often has metadata errors such as:

@@ -32,7 +32,7 @@ Uses Microsoft Active Directory LDAP extension, so is not portable to other LDAP
See the python version in the DevOps Python Tools repo for a more generalized version with nicer control and output
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
usage: ${0##*/} <group_dn> [<attribute_filter>]

@@ -34,7 +34,7 @@ Uses Microsoft Active Directory LDAP extension, so is not portable to other LDAP
See the python version in the DevOps Python Tools repo for a more generalized version with nicer control and output
-https://github.com/harisekhon/devops-python-tools
+https://github.com/HariSekhon/DevOps-Python-tools
usage: $script <user_dn> [<attribute_filter>]

@@ -5,7 +5,7 @@
# Author: Hari Sekhon
# Date: 2015-05-25 01:38:24 +0100 (Mon, 25 May 2015)
#
-# https://github.com/harisekhon/devops-python-tools
+# https://github.com/HariSekhon/DevOps-Python-tools
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -114,7 +114,7 @@ mybranch(){
git rev-parse --abbrev-ref HEAD
}
-# shouldn't need to use this any more, git_check_branches_upstream.py from DevOps Python Tools repo has a --fix flag which will do this for all branches if they have no upstream set - https://github.com/harisekhon/devops-python-tools
+# shouldn't need to use this any more, git_check_branches_upstream.py from DevOps Python Tools repo has a --fix flag which will do this for all branches if they have no upstream set - https://github.com/HariSekhon/DevOps-Python-tools
set_upstream(){
git branch --set-upstream-to "origin/$(mybranch)" "$(mybranch)"
}

@@ -14,7 +14,7 @@
#
# putting the $TRAVIS_TOKEN in your environment is useful for the travis tools available in
#
-# https://github.com/harisekhon/devops-python-tools
+# https://github.com/HariSekhon/DevOps-Python-tools
set -euo pipefail
[ -n "${DEBUG:-}" ] && set -x

@@ -4,7 +4,7 @@
# Author: Hari Sekhon
# Date: 2020-07-23 18:02:26 +0100 (Thu, 23 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -4,7 +4,7 @@
# Author: Hari Sekhon
# Date: 2020-07-23 22:11:57 +0100 (Thu, 23 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-24 19:05:25 +0100 (Fri, 24 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-24 19:05:25 +0100 (Fri, 24 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -4,7 +4,7 @@
# Author: Hari Sekhon
# Date: 2020-07-24 19:05:25 +0100 (Fri, 24 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-24 19:05:25 +0100 (Fri, 24 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-24 19:05:25 +0100 (Fri, 24 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -4,7 +4,7 @@
# Author: Hari Sekhon
# Date: 2020-07-23 23:26:15 +0100 (Thu, 23 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-23 23:26:15 +0100 (Thu, 23 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -6,7 +6,7 @@
# Author: Hari Sekhon
# Date: 2020-07-23 23:26:15 +0100 (Thu, 23 Jul 2020)
#
-# https://github.com/harisekhon/spotify-playlists
+# https://github.com/HariSekhon/Spotify-Playlists
#
# License: see accompanying Hari Sekhon LICENSE file
#

@@ -32,7 +32,7 @@ For a much better version of this see check_ssl_cert.pl in the Advanced Nagios P
check_ssl_cert.pl - checks Expiry days remaining, Domain, Subject Alternative Names, SNI
-https://github.com/harisekhon/nagios-plugins
+https://github.com/HariSekhon/Nagios-Plugins
"
# used by usage() in lib/utils.sh
