IT Russian Roulette

Posted: 2016-03-21 21:12:43 by Alasdair Keyes

Direct Link | RSS feed


Living dangerously... http://www.commitstrip.com/en/2014/05/16/russian-roulette/


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Return of the Gopher Server

Posted: 2016-03-17 19:07:01 by Alasdair Keyes

Direct Link | RSS feed


After being bombarded with literally no requests as to where my Gopher server went after my server move, it's back!!

gopher://gopher.akeyes.co.uk

(You'll have to hunt out your own Gopher client; Firefox doesn't support the protocol anymore and I doubt any of the other browsers do either.)


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Puppet Nagios Plugin

Posted: 2016-03-14 10:39:58 by Alasdair Keyes

Direct Link | RSS feed


Over the weekend I finished migrating all my servers to being managed by a PuppetMaster. Some of these servers are quite old (one is over 5 years old), so forcing their management into Puppet when they have lots of customisation and idiosyncrasies was a little nerve-wracking... thankfully all went well!

Now that everything is under control, I wanted to ensure that Puppet was working correctly, nothing was getting left behind and errors weren't silently going unnoticed. There are a number of Puppet-centric tools that do this, but I didn't really want the extra Puppet functionality. Since I already monitor my systems with Nagios, I thought a simple plugin would be useful.

With that, I've just released the first version of nagios-plugin-check_puppet_run

The core functionality is simply to report the time of the last run, the number of resources, the number of changes and any errors generated on the last run.

OK: Successes:0 Failures:0 Last Run:Mon Mar 14 10:10:41 2016 Version:3.7.2 Changes:0 Resources:44
WARNING: Successes:0 Failures:1 Last Run:Mon Mar 14 10:10:41 2016 Version:3.7.2 Changes:0 Resources:44

Unlike other plugins that check whether the Puppet service is running, the script checks Puppet's last run summary file /var/lib/puppet/state/last_run_summary.yaml. This way, you can execute Puppet via cron or as a daemon and still get a valid result for your Puppet install.
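
For anyone curious how this style of check works without installing the plugin, here's a minimal sketch of the same approach. It is an illustration only, not the plugin itself; it assumes a Puppet 3.x agent writing a time: last_run: epoch timestamp into the summary file, and simply warns if the last run looks stale.

#!/bin/bash
# Sketch of the approach: read the summary file the Puppet agent writes
# after each run and warn if the last run looks stale.
# (Illustration only; key names assumed from a Puppet 3.x agent.)

SUMMARY=/var/lib/puppet/state/last_run_summary.yaml
MAX_AGE=7200  # seconds; warn if the last run is older than this

[ -r "$SUMMARY" ] || { echo "CRITICAL: cannot read $SUMMARY"; exit 2; }

# 'last_run' is an epoch timestamp in the 'time:' section of the YAML
LAST_RUN=$(awk '/last_run:/ { print $2; exit }' "$SUMMARY")
[ -n "$LAST_RUN" ] || { echo "CRITICAL: could not parse last_run"; exit 2; }

AGE=$(( $(date +%s) - LAST_RUN ))

if [ "$AGE" -gt "$MAX_AGE" ]; then
    echo "WARNING: last Puppet run was ${AGE}s ago"
    exit 1
fi

echo "OK: last Puppet run was ${AGE}s ago"
exit 0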


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Inbox Zero: Update

Posted: 2016-03-10 07:25:53 by Alasdair Keyes

Direct Link | RSS feed


It's been almost 6 months since I started using Inbox Zero (see original post), so I thought I would provide an update...

Overall it's been a success: at present my personal inbox has 1 item in it and my work inbox has 3 outstanding items.

It's not easy to measure its value as I've been using email pretty successfully for the past 20 years, but it has been a great success...

Overall, if you can start using it, I think you'll really feel the benefits.


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Munin Automatic plugin addition with Puppet

Posted: 2016-03-06 14:40:25 by Alasdair Keyes

Direct Link | RSS feed


I manage the core of my servers with Puppet and also use Munin for graphing system resources and metrics.

Munin has a lot of built-in plugins to record system metrics, and on installation it will auto-detect what is available on the system to monitor, but it doesn't activate new plugins automatically. For example, I installed Munin Node on a server and then installed NTP; Munin didn't know to monitor NTP metrics until I updated the plugins with the munin-node-configure command.

To combat this I wrote the following Puppet stanza. In essence, it checks whether munin-node-configure has detected any new plugins; if so, it activates them and notifies the munin-node service to reload.

exec { 'add_suggested_munin_checks': 
    path    => [ "/usr/bin", "/usr/sbin", "/sbin", "/usr/local/sbin", "/bin" ],
    command => "munin-node-configure --suggest --shell | grep 'ln -s' | bash",
    onlyif  => "munin-node-configure --suggest --shell | grep 'ln -s'",
    notify  => Service['munin-node'],
}

service { 'munin-node':
    ensure  => 'running',
    enable  => true,
}
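
For reference, the --shell output that the exec greps for is a list of ln -s commands that create plugin symlinks, roughly of the following form (plugin names, paths and quoting will vary by distribution and Munin version):

ln -s /usr/share/munin/plugins/ntp_kernel_err /etc/munin/plugins/ntp_kernel_err
ln -s /usr/share/munin/plugins/if_ /etc/munin/plugins/if_eth0

Piping the matching lines to bash creates the symlinks, and the notify then refreshes munin-node so the new plugins are picked up.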

If anyone else manages Munin nodes via Puppet, this could well help you speed up your Munin updates across large estates.


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Worst passwords of 2015

Posted: 2016-02-15 20:42:23 by Alasdair Keyes

Direct Link | RSS feed


For any sysadmin, the use of weak passwords and the havoc it can wreak across your infrastructure and data can keep you up at night if you think about it too hard. A good password policy with correct enforcement can really help, but people will use the easiest password they can get away with.

See the following list of the worst passwords of 2015:

https://www.teamsid.com/worst-passwords-2015/


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

rsync Puppet YUM repository

Posted: 2016-01-16 11:50:37 by Alasdair Keyes

Direct Link | RSS feed


I noticed that the Puppetlabs APT repository README has information on rsync'ing a local copy, but the YUM repository doesn't. So for anyone who wants to do it, the following will help...

YUM

rsync -av --stats --progress --copy-links --del rsync://yum.puppetlabs.com/packages/yum/ /home/repos/yum/puppet

APT

rsync -av --stats --progress --copy-links --del rsync://apt.puppetlabs.com/packages/apt/ /home/repos/apt/puppet


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Document versioning with GIT

Posted: 2016-01-05 23:43:55 by Alasdair Keyes

Direct Link | RSS feed


I've been at my new job for 3 months now and my home folder is slowly growing in size. Like most people, I often have files that are updated frequently (spreadsheets, build specs etc.) and I thought it would be quite nice to have a quick and dirty versioning system for my documents.

I didn't want to get too in-depth with log-structured filesystems such as NILFS, or have to use a new FS or FUSE-based FS, as that seemed an unnecessary length to go to. I don't have any requirement to store every single version; I just want snapshots I can refer back to in future, or use to easily restore mistakenly deleted files.

I decided that git would be a suitable base for this.

I started by initializing my docs

$ cd /home/akeyes/mydocs
$ git init
Initialised empty Git repository in /home/akeyes/mydocs/.git
$ git add .
$ git commit -m "Initial commit"
[master (root-commit) 4cbb3d8] Initial commit
 152 files changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 xxxxxx.docx
 create mode 100644 xxxxxx.odt
 ...
 create mode 100644 xxxxxx.txt

Now that I have my initial commit, I wrote the following script to add to cron:

#!/bin/bash

# Commit a snapshot of the docs folder if anything has changed
FOLDER=/home/akeyes/mydocs
cd "$FOLDER" || exit 1

# Any output from --porcelain means there is something to snapshot
if [ "$(git status --porcelain | wc -l)" -gt 0 ]; then
    git add --all .
    git commit -m "Checkpoint"
fi

I was unaware of the --porcelain argument to git status, but it essentially gives a simple, machine-parseable view of the output. For my usage, if there's any output at all there's a change and a snapshot needs to be taken.

$ git status --porcelain
?? newfile
?? newfile1

The --all switch allows git add to successfully stage file deletions as well, without requiring git rm.

To start with I just added this into my crontab

* * * * *    /home/akeyes/checkpoint_docs.sh

And now git log shows my versioning working

$ git log --name-only
commit d0544e40b856eb98cc2129f98383895708083deb
Author: Alasdair Keyes <x@x.com>
Date:   Tue Jan 5 19:47:00 2016 +0000

    Checkpoint

newfile2

commit 139fd9ff825232e0551bf570e4ac8957bc93c8b1
Author: Alasdair Keyes <x@x.com>
Date:   Tue Jan 5 19:46:00 2016 +0000

    Checkpoint

newfile
newfile2
newfile3

commit 8539658cd62581c3a514081c30d3c30a2a0a7ac9
Author: Alasdair Keyes <x@x.com>
Date:   Tue Jan 5 19:45:00 2016 +0000

    Checkpoint

newfile
newfile1

commit ba6ec4f43aecb81db0709ea18418d94d24cbb3d8
Author: Alasdair Keyes <x@x.com>
Date:   Tue Jan 5 19:42:33 2016 +0000

    Initial commit

xxxxxx.docx
xxxxxx.odt
xxxxxx.txt

It seems to work very well. On top of this I've created a .gitignore:

# Libreoffice temp files
**/.~*#
# VIM swap files
**/.*.swp
# Keepass lock files
**/*.lock

This stops LibreOffice temp files, Vim swap files and KeePass lock files from being committed.


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Putt's Law

Posted: 2015-12-19 12:12:32 by Alasdair Keyes

Direct Link | RSS feed


I've just become acquainted with this law today:

Technology is dominated by two types of people:  those who understand what they do not manage and those who manage what they do not understand.

It's pretty much on the money.


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz

Let's Encrypt webroot on NGINX

Posted: 2015-12-06 14:09:42 by Alasdair Keyes

Direct Link | RSS feed


Like a number of people I've been looking forward to the release of Let's Encrypt, the free system to allow everyone to get an SSL certificate. It's now in open beta and can be used by all.

You can read how it works and how to get it set up here.

After some playing about I found the following setup good for my needs. My system is NGINX running on Debian Jessie.

Run the following with superuser access.

mkdir /var/le_root
chown www-data: /var/le_root
chmod 700 /var/le_root

Create /etc/nginx/snippets/lets_encrypt.conf with the following text

location /.well-known/acme-challenge/ {
    allow all;
    auth_basic off;
    root /var/le_root;
}

The allow all; and auth_basic off; are there because some of my sites have IP or basic-auth restrictions, which I don't want taking effect on this folder as that would stop Let's Encrypt validating the site.

In each website virtualhost config add the line

include snippets/lets_encrypt.conf;

This snippet maps /.well-known/acme-challenge/ on any hosting space to /var/le_root. We can then tell Let's Encrypt to use /var/le_root for all its validation files and, with one command, create certs for any site I have on my server.
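
For illustration, a minimal server block using the snippet might look like the following; the server_name and root here are placeholders, not a real site config.

server {
    listen 80;
    server_name mydomain.com;        # placeholder domain
    root /var/www/mydomain.com;      # placeholder docroot

    # Serve ACME challenges from /var/le_root, bypassing any IP/auth restrictions
    include snippets/lets_encrypt.conf;
}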

Then just run

./letsencrypt-auto certonly --webroot -w /var/le_root -d mydomain.com

And your cert/key will be available in /etc/letsencrypt/live/mydomain.com/

The Let's Encrypt certs only last 90 days (though this may well increase in future). I've added an expiry check to my Nagios monitoring, but you can also use the following bash script in a cron job to check the expiry dates of your certificates. It's easily amended to auto-renew certificates if you wish; I'll update it to auto-renew once I've had to renew one of my own certs.

https://gitlab.com/snippets/1731323/raw
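
The linked script isn't reproduced here, but a minimal sketch of the same idea, assuming openssl is installed and certs live under the default /etc/letsencrypt/live/<domain>/cert.pem paths, might look like this:

#!/bin/bash
# Sketch of a certificate expiry check (illustration, not the linked script).
# Warn about any Let's Encrypt cert that expires within the next 14 days.

WARN_SECONDS=$(( 14 * 24 * 3600 ))

for CERT in /etc/letsencrypt/live/*/cert.pem; do
    # 'openssl x509 -checkend N' exits non-zero if the cert expires within N seconds
    if ! openssl x509 -noout -checkend "$WARN_SECONDS" -in "$CERT" > /dev/null; then
        echo "WARNING: $CERT expires within 14 days"
    fi
done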


If you found this useful, please feel free to donate via bitcoin to 1NT2ErDzLDBPB8CDLk6j1qUdT6FmxkMmNz
