Atom editor is worth checking out

As I write this entry in markdown, I see the live results in a separate pane to the right. It’s only been a few minutes but so far I’m very impressed with Atom. You should go check it out if you haven’t already. Here’s what I like so far.

Don’t call them plugins, they’re ‘packages’

Shrug. Whatever you call them, their plugin/package system is great. It’s easy to install packages and the ones I’ve installed start working immediately without restart. Smells like there’s some good design under the hood.

Command Interface Autocomplete

Packages can provide various ‘commands’ you can run. Running these commands is like Spotlight on OS X. You hit cmd + shift + P, then as you start typing it searches the available commands. Very nice.

Git integration with git-plus

In particular the git integration seems interesting. When you edit a file, it indicates which lines have changed, and which are new. With the git-plus package installed it’s easy to run git commands without ever leaving your editor. Normally I’m not a big fan of IDE integration with git. With the command searching interface, it’s fun.

It’s built with Electron

It’s built with Electron, which is also terrific. I think that deserves a separate post. But you should check it out if you want to build desktop apps using Node.js + HTML5.

It’s FLOSS!

It’s FLOSS. Turns out it’s made by GitHub. Lots of great FLOSS coming out of GitHub. Thanks!

Update (2016-02-28): The goto declaration ability of WebStorm is FAR superior
to any Atom packages I’ve tried. This is the only thing keeping me from using Atom
as my daily driver for development.

Buh-bye Jekyll, Hello Hexo

Buh-bye Jekyll

I tried to get Jekyll working again after 3 years of inactivity. It was painful enough that I said eff it. My tolerance for Ruby related pain is extremely low these days.

Hello Hexo

Since we use a lot of Node.js at Kash, I decided to switch to Hexo. It seems to be a polished Jekyll clone. Really, the most important thing for me is to keep my actual content in markdown. I expect to move again, and markdown does the job.
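For anyone curious, getting a Hexo site off the ground is only a few commands. A quick sketch, assuming npm is installed and using a site directory name of my choosing:

```shell
# Install the Hexo command line tool globally
npm install -g hexo-cli

# Scaffold a new site and install its dependencies
hexo init blog
cd blog
npm install

# Create a post (lands in source/_posts/) and preview locally
hexo new "hello-hexo"
hexo server
```

Posts are plain markdown files, which is exactly the portability I was after.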

Blogging again

While Kash is keeping me insanely busy, I want to start blogging again. I’m going to try to write once a week but keep it very MVBP (Minimum Viable Blog Post) ;)

Secure Replication With Postgres 9.1

Having been the MySQL DBA-By-Default (DBA-B-D) in another life, I must admit to being much happier with postgres, despite what I consider to be documentation holes. As a DBA-B-D (aka DevOps, aka Co-Founder), I find postgres lacking concise, up-to-date documentation for getting specific tasks done quickly, or howtos. Replication is one such task. I had to merge bits and pieces from a number of sources, including mailing list posts, to get what I wanted. I’m not complaining though; rather, this is my contribution to improving the situation.

Why Secure Replication

The Cloud, aka outsourced VPS hosting with an API. Most of the documentation seems to expect you to be running this in your own private, secure, network-partitioned data center.

High Level Overview

TODO

Get Yourself A Cert

You’ll probably want to generate one yourself. There’s not much point paying for a new one since you can easily distribute your own CA cert. Google it; there’s lots of info out there.

On The Master

Update the postgresql.conf on your master to enable WAL support for replication:



wal_level = hot_standby
max_wal_senders = 3

Add the following to pg_hba.conf to authorize the slave to replicate against the db. Note that we’re only authorizing SSL replication connections, from any user at $SLAVE_IP, with password based authentication (md5).



hostssl replication all $SLAVE_IP/32 md5

Note: You’ll need to restart your postgres server for the WAL related settings to take effect.
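On Ubuntu/Debian that restart looks something like the following (a sketch; your init system or cluster name may differ):

```shell
# Restart all postgres clusters managed by the init script
sudo service postgresql restart

# Or restart just the 9.1 "main" cluster (Debian/Ubuntu pg_ctlcluster wrapper)
sudo pg_ctlcluster 9.1 main restart
```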

The Postgres data dir for Ubuntu 12.04 is /var/lib/postgresql/9.1/main

You’ll need an SSL key, cert, and root cert (CA). You can generate your own CA and self-signed cert if you want. To do so, see the Keys and Certs section of this article.

On The Slave

If postgres is running on the SLAVE, bring it down. You’re going to wipe out whatever is there and create a backup from the master.

Switch to the postgres user from here on.

First, delete the contents of $PG_DATA ( /var/lib/postgresql/9.1/main/ on Ubuntu/Debian ).



sudo su - postgres
rm -rf /var/lib/postgresql/9.1/main/

Now use pg_basebackup to create the backup we’re going to start replication from. You’ll be prompted for the postgres user password ($PG_PASS).



pg_basebackup -D /var/lib/postgresql/9.1/main/ -x -h $MASTER_IP

As root, create links to the certs, including your CA/root cert.



sudo su
cd /var/lib/postgresql/9.1/main/
ln -s /etc/ssl/certs/yourcert.crt server.crt
ln -s /etc/ssl/private/yourkey.key server.key
ln -s /etc/ssl/certs/root.crt root.cert

Once complete, create a file called recovery.conf inside your postgres data dir on the slave, with the following contents.



standby_mode = 'on'
# 'touch' the file below to initiate fail over ( break replication, become read-write )
trigger_file = '$PG_DATA/failover'
primary_conninfo = 'host=$MASTER_IP port=5432 sslmode=verify-ca password=$PG_PASS'
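Later, if you ever need to promote the slave to a read-write master, failover is as simple as creating the trigger file named above (assuming the Ubuntu/Debian data dir):

```shell
# As the postgres user: break replication, become read-write
sudo -u postgres touch /var/lib/postgresql/9.1/main/failover
```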

Add the following to the postgresql.conf file:



hot_standby = on

Link to the root cert used to verify the master:



ln -s /etc/ssl/

Start postgres and tail the log; you should see replication starting. On Ubuntu:



service postgresql start
tail -f /var/log/postgresql/postgresql-9.1-main.log

SSH Agent Forwarding

I was having some issues getting my SSH Agent to forward. Turns out my understanding was completely inadequate. Here’s a quick run through for anyone else who might benefit.

High Level Concept

Say you have hosts A, B, and C and you want to connect like so: A->B->C. SSH allows you to forward your ‘Agent’ such that your credentials for host A can be used on host C as if B wasn’t even involved. It does so by forwarding a unix domain socket provided by A’s agent to B (usually in /tmp/ssh-??) when you connect from A->B. Then when you connect B->C, instead of C interacting with B’s agent, it interacts with the forwarded Agent provided by A.
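You can see this mechanism for yourself. After connecting A->B with forwarding enabled, B has a socket and an environment variable pointing back at A’s agent (paths below are illustrative):

```shell
# On B, after connecting from A with forwarding enabled (ssh -A user@B):
echo $SSH_AUTH_SOCK   # something like /tmp/ssh-XXXXXXXX/agent.12345
ssh-add -l            # lists the identities held by A's agent, not B's

# Now hop to C; authentication runs against A's agent via the socket
ssh user@C
```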

Configuration

  • Make sure ‘AllowAgentForwarding no’ is not set in /etc/ssh/sshd_config on B (it defaults to yes if it’s not explicitly set to no).
  • Make sure your client config has ‘ForwardAgent yes’. You’ll likely want to do this only for specific hosts you trust/control, as a program on B will be able to log in as you (only while you’re connected) if it wants. Edit ~/.ssh/config and add something similar to the following:

    Host examplehost.com
    ForwardAgent yes

  • Tell your ssh-agent on A that you want to make an identity available through it. Run the following from a terminal:

    ssh-add


    That’s it. You should be able to login to C from B using your credentials securely stored on A.

Client SSL Auth

The HTTPS system allows you to communicate securely with a server and trust its identity. This is how it’s generally used. However, it’s also possible for the server to verify the identity of the client as well.

It’s fairly straightforward to take advantage of this with Node.js. Below you’ll find a simple tutorial for doing just that.

Keys and Certs

First, you’re going to need all your certs and keys. Follow along with the instructions below to do so. It’s adapted from this article.

Notice the ‘365’. If you want your keys to be valid for longer than a year, change this.


# Create the CA Key and Certificate for signing Client Certs
openssl genrsa -des3 -out ca.key 4096
openssl req -new -x509 -days 365 -key ca.key -out ca.crt

# Create the Server Key, CSR (the signing request the CA is given)
openssl genrsa -des3 -out server.key 1024
openssl req -new -key server.key -out server.csr

# You likely want a server key without a passphrase (put the passphrase protected one in your private git repo)
openssl rsa -in server.key -out server.key.pem 

# We're self signing our own server cert here.  This is a no-no in production.
openssl x509 -req -days 365 -in server.csr -CA ca.crt -CAkey ca.key -set_serial 01 -out server.crt

# Create the Client Key and CSR
openssl genrsa -des3 -out client.key 1024
openssl req -new -key client.key -out client.csr

# You likely want a client key without a passphrase for deployment (put the passphrase protected one in your private git repo)
openssl rsa -in client.key -out client.key.pem 

# Sign the client certificate with our CA cert.  Unlike signing our own server cert, this is what we want to do.
openssl x509 -req -days 365 -in client.csr -CA ca.crt -CAkey ca.key -set_serial 01 -out client.crt

Server:


var https = require('https');
var fs = require('fs');

var options = {
    key: fs.readFileSync('server.key.pem'),
    cert: fs.readFileSync('server.crt'),

    //for client certs, this validates the client                                                                                                                                                           
    ca : [ fs.readFileSync('ca.crt') ],
    requestCert : true,
    rejectUnauthorized: true
};

https.createServer(options, function (req, res) {
    res.writeHead(200);
    res.end("hello world\n");
}).listen(4443);

Client:


var https = require('https');
var fs = require('fs');

var options = {
    host: 'localhost',
    port: 4443,
    path: '/',
    method: 'GET',
    key: fs.readFileSync('client.key.pem'),
    cert: fs.readFileSync('client.crt'),
    ca: [ fs.readFileSync('ca.crt') ],
    requestCert: true,
    rejectUnauthorized: true
};
options.agent = new https.Agent(options);

var req = https.request(options, function(res) {
    console.log('STATUS: ' + res.statusCode);
    console.log('HEADERS: ' + JSON.stringify(res.headers));
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
        console.log('BODY: ' + chunk);
    });
});

req.end();
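With the server running, you can also sanity-check client auth from the command line (assuming the filenames generated in the Keys and Certs section):

```shell
# Presenting the client cert and key should get back "hello world";
# leaving off --cert/--key should cause the handshake to be rejected.
curl --cacert ca.crt --cert client.crt --key client.key.pem https://localhost:4443/
```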

Finding Quality Node Modules

Last night we had a great second Node.js Toronto meetup. There was some frustration directed towards the problem of knowing which modules are of high quality. This is an acknowledged pain point in the node.js community, but it’ll get better soon.

It’s worth noting this problem has been discussed a couple of times by the current node Project Manager isaacs on the NodeUp* podcast, though I’m not exactly sure which episodes. Apparently they’re working on a new version of the npm website right now.

In the meantime, finding quality node modules isn’t all that hard. With a bit of experience you’ll start to recognize names and learn to judge module quality quite quickly. Here are some tips:

  • the following gives you the most depended on modules http://search.npmjs.org/#/_browse/deps
  • take a look at this list first
  • take note of the names of these authors
  • you’ll notice certain people or organizations publish a lot (substack and NodeJitsu for instance )
  • also take a look at the modules wiki at https://github.com/joyent/node/wiki/modules
  • otherwise search http://search.npmjs.org/
  • visit the github page for the module, it’s usually next to the name
  • there’s no real excuse not to have a github page (or google code, I suppose); I avoid modules without one
  • look for a clean well written README
  • if someone really wants you to use their module they’ll make it easy to do so with a synopsis/quick start/tutorial
  • does the module have watchers?
  • tests?
  • how does the code look?
  • are there alternatives?
  • how hard is it to write your own?
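Some of the metadata above is available right from the command line too. A quick sketch using npm itself (express is just an example module):

```shell
# Show a module's description and where its source lives
npm view express description repository.url

# Check what you'd be pulling in before committing to it
npm view express dependencies
```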

* The NodeUp podcast is great. I highly recommend it.