host@home: Choose A Virtualization Platform

Use Virtualization

If I could offer you only one tip for the future, virtualization would be it. The long-term benefits of virtualization have been proved by scientists, whereas the rest of my advice has no basis more reliable than my own meandering experience.

OK, so which server OS virtualization platform are we going to use in our home data center? We've got a few choices:

VMware vSphere Hypervisor

VMware shows little regard for non-Windows folk, requiring us to run the vSphere Client under a virtual machine running Windows. VMware can fuck right off.

Xen

I love Xen and am pleased to see Debian is supporting it again. Amazon use Xen for their EC2 service and, the last time I looked, the majority of big virtual hosting providers were using it. A couple of years back I wrote a web interface for Xen called Xenium.

After all that gushing you're probably going to think I'm recommending you use Xen right? Nope!

Oracle VirtualBox

I think my love for Xen caused me to ignore this option until recently.

VirtualBox is cross-platform, open source (GPLv2) virtualization software for Linux, OS X, Windows and Solaris. It's easy to install, and someone has created a web frontend (phpvirtualbox) for it that looks just like the native GUI.

I've only just started exploring VirtualBox but for ease of installation, freedom and functionality it looks like a winner for hosting @ home.

Things I like about VirtualBox

  • sound from a Windows guest plays on my Linux host
  • the mouse moves smoothly, with no need for a key combo to release it
  • it's easy to share a directory from host to guest (well... Windows needed a reboot)
  • VRDP gives you RDP even when your guest OS doesn't (see the sketch below)
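
For headless hosting @ home, everything the GUI does can be driven from the shell with VBoxManage. Here's a minimal sketch - the VM name, OS type, memory size and network interface are examples only, and older releases spell some flags differently:

# Create and register a VM (name, OS type and sizes are examples only)
VBoxManage createvm --name ubuntu-guest --ostype Ubuntu_64 --register
VBoxManage modifyvm ubuntu-guest --memory 1024 --nic1 bridged --bridgeadapter1 eth0

# Start it headless; VRDP means you still get a console over RDP
VBoxManage startvm ubuntu-guest --type headless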

host@home: Setting Up Your Network

[2013-12-19 I'm not currently hosting @ home but Snowden leaks have made me rethink]

My Billion 7300 modem/router does a simple task well

My first step on the path to hosting @ home was to get a good Net connection. I selected a Naked ADSL plan from Australian provider Internode. Australians pay crazy high prices for Internet but $60 a month for 150GB (combined download/upload) doesn't seem too steep. In order to host @ home I needed to disable the Internode network firewall to enable incoming traffic.

Internode gives me a dynamic IP (which I actually prefer). Most DSL routers come with support for dynamic DNS built in, and mine does a great job of updating my DNS entry within seconds of my IP changing. All other domains hosted here will have CNAMEs pointing to this host, so only a single hostname needs to be updated when my IP changes.
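
In zone-file terms it ends up looking something like this (hypothetical IP and hostnames - only the A record ever changes):

; home.failmode.com is the dynamic entry the router keeps updated
home.failmode.com.    60    IN  A      203.0.113.42
; everything else just points at it
www.failmode.com.           IN  CNAME  home.failmode.com.
blog.failmode.com.          IN  CNAME  home.failmode.com.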

When my modem light glowed steady on Friday I knew I could get started configuring the router. I'm fond of Billion modem/routers which sell for around $60.

Configuring the router

ADSL routers tend to be pretty easy to configure via their web interface, providing you remember the admin password or have something to poke in the hole to reset it to factory defaults. The other thing you have to work out is the IP address the modem is running on. For some reason 192.168.0.1 is not the universal standard - my modem was on 192.168.1.254. Go figure. Here's what I do when setting up a new host@home network.

  • Change the root password from the factory default
  • Configure DHCP to hand out my ISP's nameservers and my own domain
  • Configure DHCP to hand out IPs from 100-200 (I reserve the others for manual addressing)
  • Forward incoming connections to a gateway IP (which forwards traffic using HAProxy)
  • Configure dynamic DNS

ADSL router updates dynamic DNS entry when IP changes

I installed apache2 on my gateway host to test external access. You should be able to access it here: home.failmode.com
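
If you're playing along at home, that test amounts to something like this (package name is for Debian/Ubuntu):

# On the gateway host
sudo apt-get install apache2

# From outside your network (or ask a friend to try)
curl -I http://home.failmode.com/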

In the next installment...

The next post will cover setting up HAProxy on the gateway host so that incoming requests can be routed to the correct internal servers.
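
As a taste of what's coming, the gateway ends up with something like this in /etc/haproxy/haproxy.cfg (hostnames and internal IPs are made up):

frontend http-in
    bind *:80
    acl is_blog hdr(host) -i blog.failmode.com
    use_backend blog if is_blog
    default_backend www

backend blog
    server blog1 192.168.1.10:80

backend www
    server www1 192.168.1.11:80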

SSH compression can slow you down

Repost from my old blog

When copying some large files between hosts at our rack I noticed some pretty poor transfer speeds. Having recently forked out $70 for a rackmount gigabit switch I wondered what was slowing things down.

It seems ssh was trying to help out by compressing everything; however, compressing the data took more than twice as long as transferring the uncompressed data. I proved this by disabling compression at the command line and seeing the transfer speed more than triple.

mbailey@modu1:~/vm$ scp -r r03 modu4:vm/
Ubuntu 64-bit-f001.vmdk          7%  147MB   8.1MB/s   03:54 ETA

mbailey@modu1:~/vm$ scp -r -o 'Compression no'  r03 modu4:vm/
Ubuntu 64-bit-f001.vmdk        100% 2048MB  28.1MB/s   01:13 ETA

So how can we put this into practice everywhere?

Setting 'Compression no' in /etc/ssh/ssh_config will do the trick for me. There's no need for my VMware hosts to be compressing files I copy between them. I do want compression when copying files over shared/slower links, but I can achieve that by initiating connections that benefit from compression on boxes that are configured to use it.
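
A minimal sketch of that setup (the Host patterns are examples; in ssh_config the first match wins, so per-host stanzas must come before the catch-all):

# Compress only when talking to hosts on the far side of a slow link
Host *.example.com
    Compression yes

# Everything else (including the gigabit LAN) goes uncompressed
Host *
    Compression no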

If I do want compression for a particular transfer, I can always use the '-C' switch.

Wired to Care - Will Neuroscience Prove Jesus Right?

Humans, like the apes we evolved from, are social creatures. Cooperation has given survival advantages to our species. One interesting aspect of our wiring is empathy. It's been argued that this involves mirror neurons, which fire both when an animal acts and when the animal observes the same action performed by another. Whatever the cause, we feel other people's feelings and therefore share in the benefits when we make someone feel better. Conversely, causing someone distress should give us unpleasant feelings. This seems like a simple yet effective system for social regulation that doesn't require modern (human) innovations like verbal language, law or religion.

So how can people still be cruel to other people if they're going to feel their emotions? Did we evolve a way to selectively shut down this mechanism? If your tribe was under attack, surely it would be advantageous if your warriors could suppress empathic responses just long enough to beat your enemies to a pulp. You would want them re-enabled once the threat has gone, though!

I think we have built-in empathy circuit breakers that get activated when we see people we dislike. There may be other ways to suppress empathy, such as "just following orders", but it's the identification of people as bad or an enemy that has me curious. Is this a major contributor to cruelty and callousness from otherwise kind individuals? Do we become 'selectively psychopathic' by shutting down an important part of our 'humanity'?

Some advice I always had trouble comprehending was "love your enemies". It may turn out to be another way of saying, "don't disable your empathic mechanisms". While this may not be helpful advice for a soldier on a battlefield, it's probably good advice for us in our schools, workplaces and social gatherings.

I suspect a way to achieve this is to not view anyone you deal with as a 'bad guy', which reminds me of another quote attributed to Jesus: "Do not judge others".

There seems to be a fair bit of research going into modulation of empathic responses. I suspect there's a significant difference between the response of males and females to seeing a person being punished for wrongdoing. In the TV show The Shield, it's made very clear to you who the bad guy is in each episode and he usually gets a pounding by the end of it. My dad used to enjoy shows like The Professionals when I was a kid but my mum found the violence distressing. I wonder whether males, being the ones who probably had to defend the camp, developed the ability to 'reverse the polarity' on their empathic circuits and actually gain pleasure from seeing 'bad guys' suffer.

Perhaps avoiding the human tendency to judge others unfavourably can help us avoid triggering a survival mechanism which doesn't have a place in modern society and in doing so, allow us to act better toward one another.

Jesus must have given this a lot of thought.

How To Loose Your Customer Base

Yes, the misspelling in the title is deliberate. See image below.

If you're building expensive tools for managing server infrastructure, please don't assume your users are idiots. Warning me about the possible consequences of my action is fine. Preventing me from doing what I want to do won't endear me to your product, and poor spelling won't increase my confidence. I'm LOOSING confidence in you, VMware.

You can't go there. Sorry, it's the rules.

Rob Postill had a little rant about VMware last night.

JSON in your Web Browser

Web browsers don't tend to display JSON very nicely. See how yours does with this example. Here are some handy plugins that help with this. While they currently only allow viewing, how long before we see some PUT/POST action?

JSONView for Firefox

Normally when encountering a JSON document (content type "application/json"), Firefox simply prompts you to download the file. With the JSONView extension, JSON documents are shown in the browser similar to how XML documents are shown. The document is formatted, highlighted, and arrays and objects can be collapsed. Even if the JSON document contains errors, JSONView will still show the raw text.

There is also an unofficial JSONView port for Chrome and the unfortunately named "XML View" for Safari.

When installed, your JSON will look more like this:

JSONView browser plugins prettify


JSON with Ruby and Rails

JSON is a beautiful format for storing objects as human readable text. It’s succeeded where XML has failed. Not only is it not shit, it’s actually quite good! But don’t just take my word for it, have a look at some of the “cool” ways you can generate and consume JSON.

Ruby support for JSON

Ruby's JSON library makes parsing and generating JSON simple.

Converting between a hash and JSON in Ruby

$ irb
>> require 'json'
=> true
>> json_text = { :name => 'Mike', :age => 70 }.to_json
=> "{\"name\":\"Mike\",\"age\":70}"
>> JSON.parse(json_text)
=> {"name"=>"Mike", "age"=>70}

HTTParty simplifies communicating with RESTful services

Here we grab a record from Facebook.

Retrieve a JSON Resource

$ irb
>> require 'awesome_print'
=> true
>> require 'json'
=> true
>> require 'httparty'
=> true
>> ap JSON.parse HTTParty.get('https://graph.facebook.com/Stoptheclock').response.body
{
                  "about" => "Abolish the 28 Day Rule for Victorian Shelters\n\nhttp://stoptheclock.com.au\n\ninfo@stoptheclock.com.au",
               "category" => "Community",
                "founded" => "2010",
           "is_published" => true,
                "mission" => "To bring an end to the law requiring Victorian shelters to kill healthy adoptable cats and dogs after four weeks.",
    "talking_about_count" => 3,
               "username" => "Stoptheclock",
                "website" => "http://stoptheclock.com.au",
        "were_here_count" => 0,
                     "id" => "167163086642552",
                   "name" => "Stop The Clock",
                   "link" => "http://www.facebook.com/Stoptheclock",
                  "likes" => 5517
}
=> nil

HTTParty gets Classy

Creating a simple class allows you to DRY things up a bit.

$ irb
>> require 'httparty'
=> true
>> class Facebook
>>   include HTTParty
>>   base_uri 'https://graph.facebook.com/'
>>   # default_params :output => 'json'
?>   format :json
>>
?>   def self.object(id)
>>     get "/#{id}"
>>   end
>> end
=> nil
>>
>> require 'awesome_print'
>> ap Facebook.object('Stoptheclock').parsed_response
{
                  "about" => "Abolish the 28 Day Rule for Victorian Shelters\n\nhttp://stoptheclock.com.au\n\ninfo@stoptheclock.com.au",
               "category" => "Community",
                "founded" => "2010",
           "is_published" => true,
                "mission" => "To bring an end to the law requiring Victorian shelters to kill healthy adoptable cats and dogs after four weeks.",
    "talking_about_count" => 3,
               "username" => "Stoptheclock",
                "website" => "http://stoptheclock.com.au",
        "were_here_count" => 0,
                     "id" => "167163086642552",
                   "name" => "Stop The Clock",
                   "link" => "http://www.facebook.com/Stoptheclock",
                  "likes" => 5517
}
=> nil

Rails support for JSON

ActiveSupport::JSON knows how to convert ActiveRecord objects (and more) to JSON. Simone Carletti explains how this differs from the standard lib.

## Encode
json = ActiveSupport::JSON.encode(object) # extra methods like :include
json = Offering.first.to_json(:include => :outlet, :methods => [:days_waiting])

## Decode
ActiveSupport::JSON.decode(json)

Rails3 niceness

Adding JSON to your Rails3 app doesn't require a lot of extra code. You can specify method calls and associated objects to include as well as restrict the attributes returned. Simple eh?

class PostController < ApplicationController
  respond_to :json, :html, :jpg, :xml

  def index
    respond_with(@posts = Post.all,
                 :methods => [:average_rating],
                 :include => :comments)
  end

  def show
    respond_with(@post = Post.find(params[:id]), :only => [:name, :body])
  end

end


JSON from Javascript

JSON is a beautiful format for storing objects as human readable text. It’s succeeded where XML has failed. Not only is it not shit, it’s actually quite good! But don’t just take my word for it, have a look at some of the “cool” ways you can generate and consume JSON.

JSON support added to Javascript

JSON is an acronym for Javascript Object Notation and, while it's designed for data interchange, it's a string containing valid Javascript. While you could instantiate the object using eval, executing data is like eating food off the ground - unhygienic and with unknown side effects. Fortunately, support for JSON was added to ECMAScript 5, so you can generate and parse JSON without fear.

Someone on StackOverflow reckons Internet Explorer 8, Firefox 3.5+, Safari 4+, Chrome, and Opera 10+ support native JSON parsing. Douglas Crockford's json2.js library adds the standard JSON methods to browsers that lack them. jQuery's JSON parser makes use of the browser's native implementation where there is one.

Enough yakking, time for a demo

This is using native Javascript to generate and consume JSON. You can run this stuff in Firebug. Sorry, I don't have any CSS to make it look all "firebuggy". Does anyone know of a Javascript interpreter I can run in a shell?

// Basic Javascript
myJSONtext = '{"name":"mike","species":"human"}';

// Parse JSON
reviver = null;
myObject = JSON.parse(myJSONtext, reviver);
myObject.species // => 'human'

// Generate JSON
replacer = null;
myNewJSONtext = JSON.stringify(myObject, replacer);
myJSONtext == myNewJSONtext // They should be the same

jQuery support for JSON

Open Firefox to any webpage that loads jQuery (e.g. jquery.com) and paste this into your Firebug console.

// JSONP gets around 'same origin policy'
// jQuery generates randomly named callback function
var returnVal = '';
var ajaxUrl = 'https://graph.facebook.com/goodfordogs?callback=?';
// var ajaxUrl = 'http://localhost:5984/facebook/goodfordogs?callback=?';
$.getJSON(ajaxUrl, null, function(data) {
  alert(data.likes + ' people like '+ data.name);
});

Web browsers prevent Javascript from sending requests to domains other than the originating one, so you can't simply have your AJAX request fetch a JSON object from Facebook. A neat way around this is to write script tags to the document that load a remote Javascript file. JSONP lets us specify a callback that will be called with the resultant JSON. It's easier to show you:

$ curl graph.facebook.com/goodfordogs?callback=blah
blah({
   "id": "171644807551",
   "name": "GoodForDogs",
   "picture": "http://profile.ak.fbcdn.net/hprofile-ak-snc4/50503_171644807551_2269219_s.jpg",
   "link": "http://www.facebook.com/Goodfordogs",
   "category": "Local business",
   "website": "http://Goodfordogs.org",
   "username": "Goodfordogs",
   "likes": 902
});

jQuery picks a random name for the callback to make things easier for you. In fact it does a pretty good job of hiding the inner workings of JSONP from you.


JSON from the Command Line

JSON is a beautiful format for storing objects as human readable text. It's succeeded where XML has failed. Not only is it not shit, it's actually quite good! But don't just take my word for it, have a look at some of the "cool" ways you can generate and consume JSON.

Syntax Highlighting for Vim

Useful when writing or editing JSON. Grab it here and drop it into ~/.vim/plugin/

Consuming with curl

Disable curl's progress bar and enable compression

$ echo "silent=true" >> ~/.curlrc
$ echo "compressed=true" >> ~/.curlrc

Install a Node.js package that will help us prettify output

npm install -g jsontool

Now let's grab a record from Facebook

$ curl graph.facebook.com/stoptheclock | json
{
  "about": "Abolish the 28 Day Rule for Victorian Shelters\n\nhttp://stoptheclock.com.au\n\ninfo@stoptheclock.com.au",
  "category": "Community",
  "founded": "2010",
  "is_published": true,
  "mission": "To bring an end to the law requiring Victorian shelters to kill healthy adoptable cats and dogs after four weeks.",
  "talking_about_count": 2,
  "username": "Stoptheclock",
  "website": "http://stoptheclock.com.au",
  "were_here_count": 0,
  "id": "167163086642552",
  "name": "Stop The Clock",
  "link": "http://www.facebook.com/Stoptheclock",
  "likes": 5515
}

Storing our JSON

The simplest solution is to save it to a file...

$ curl -s graph.facebook.com/stoptheclock | json > fb_stc.json

...but other applications, such as CouchDB, can accept JSON directly.

Here we'll create a new database, store our JSON to it and then retrieve it.

# Create a database on CouchDB
$ curl -X PUT localhost:5984/facebook
{"ok":true}

# Check it's there
$ curl localhost:5984/_all_dbs
["facebook","test"]

# Save our document to it
$ curl -X PUT localhost:5984/facebook/stoptheclock -d @fb_stc.json
{"ok":true,"id":"stoptheclock","rev":"1-f0422f8044e911b2f97c6ad71136eda1"}

# Check it's there
$ curl localhost:5984/facebook/stoptheclock | json
{
  "about": "Abolish the 28 Day Rule for Victorian Shelters\n\nhttp://stoptheclock.com.au\n\ninfo@stoptheclock.com.au",
  "category": "Community",
  "_rev": "1-f0422f8044e911b2f97c6ad71136eda1",
  "_id": "stoptheclock",
  "founded": "2010",
  "is_published": true,
  "mission": "To bring an end to the law requiring Victorian shelters to kill healthy adoptable cats and dogs after four weeks.",
  "talking_about_count": 2,
  "username": "Stoptheclock",
  "website": "http://stoptheclock.com.au",
  "were_here_count": 0,
  "id": "167163086642552",
  "name": "Stop The Clock",
  "link": "http://www.facebook.com/Stoptheclock",
  "likes": 5515
}

Querying CouchDB is outside the scope of this tutorial

How do I query the data now it's in CouchDB?


Fixing Chef’s “Attribute hotspot is not defined” error

I was trying to install Opscode's Chef using the "Bootstrap Chef Rubygems Installation" method. It was failing hard with something about a 'hotspot'.

[mbailey@island chef]$ sudo chef-solo -c /etc/chef/solo.rb -j ~/chef.json -r http://s3.amazonaws.com/chef-solo/bootstrap-latest.tar.gz
[sudo] password for mbailey:
[Mon, 14 Feb 2011 15:20:34 +1100] INFO: Setting the run_list to ["recipe[chef::bootstrap_server]"] from JSON
[Mon, 14 Feb 2011 15:20:34 +1100] INFO: Starting Chef Run (Version 0.9.12.1)
[Mon, 14 Feb 2011 15:20:34 +1100] ERROR: Running exception handlers
[Mon, 14 Feb 2011 15:20:34 +1100] ERROR: Exception handlers complete
/usr/local/lib/ruby/gems/1.9.1/gems/mbailey-chef-0.9.12.1/lib/chef/node/attribute.rb:428:in `method_missing': Attribute hotspot is not defined! (ArgumentError)

There's a bug in ohai-0.5.8 (19 Oct 2010) that is fixed in HEAD. These simple steps fixed my problem and allowed me to install Chef.

$ git clone git://github.com/opscode/ohai.git
$ cd ohai
$ sudo rake install
gem install pkg/ohai-0.5.8 # but I wanted sudo gem install
Successfully installed ohai-0.5.8
1 gem installed
$ sudo gem install pkg/ohai-0.5.8.gem
Successfully installed ohai-0.5.8
1 gem installed
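
To double-check that the rebuilt gem is the one being picked up (a quick sanity check, not from the original post):

# Which ohai versions are installed?
$ gem list ohai

# Does it run cleanly now?
$ ohai | head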

I hope this saves you some time and frustration.