Using Meraki to deploy SMB networks

Posted by & filed under Networking.

I’ve recently become quite enamoured with Meraki’s line of routing, switching and wireless products. Recently acquired by Cisco, Meraki products are built around an extremely neat cloud-based management service that comes with every Meraki device licence.

It’s funny how my colleagues have reacted to the tight coupling between Meraki products and their cloud service: some love the idea and some hate it. Personally, I see this kind of solution becoming the default for small and medium businesses in the near future, with CLI-based network solutions (and the expensive engineers and management they require) dying out. Which is, I suppose, why Cisco bought Meraki for some $1.2 billion.

I’m not going to run a marketing speech for these guys, but my elevator pitch is this: Meraki products are in the same price range as Cisco products, yet the time saved in initial deployment and ongoing management, plus the additional features, makes them a much cheaper option overall.

The key points I typically harp on about are:

  • All Meraki products have a single licence tier, with all software features unlocked.
  • All Meraki products come with software-based layer 7 firewalls, user traffic tracking and management, WAN optimisation, VPNs, and more. Outside of the enterprise and the datacentre you don’t need to buy any additional appliances; no need for Palo Alto or F5 boxes.
  • The cloud management service is genuinely nice: updates just happen. As the consultant, you’re much more likely to be able to simply walk away after deployment.
  • Configuration of some of the more complex features (VPNs, for example) is actually very simple. In my experience, deployment time-frames are a fraction of the time usually spent on our favourite CLI-based products.

I have produced a solution template for SMBs that can help consultants quickly price up site solutions. You can download the PDF here or go to my GitHub page here for the Visio diagram.


Installing Mono3 on Ubuntu 12.04

Posted by & filed under Uncategorized.

Unfortunately, Ubuntu 12.04 does not have any packages for Mono 3, so we must compile and build it ourselves. Below are the instructions on how to do so. This article is a slightly modified version from

Installing Dependencies

First, we need to install all the dependencies we will need.

sudo apt-get install git autoconf automake libtool g++ gettext mono-gmcs
sudo apt-get install libglib2.0-dev libpng12-dev libx11-dev
sudo apt-get install libfreetype6-dev libfontconfig1-dev libcairo2-dev
sudo apt-get install libtiff4-dev libjpeg8-dev libgif-dev libexif-dev

Cloning source code repositories

Now, let’s get all the code we’ll need. The Mono project hosts all of its source code on its GitHub account. Create a folder in /opt or in your home folder to contain your working source code repositories, and clone the repositories while inside that folder.

git clone git://

libgdiplus is needed by Mono:

git clone git://

Compiling libgdiplus

Navigate to the root folder of the libgdiplus repository you cloned previously, and then run the following commands.

./autogen.sh --prefix=/usr/local

This will configure the compilation process and check that your computer has all the required libraries and dependencies installed. If you ran everything above, you should be good to go. If it fails with an error indicating you are missing a library or package, you may need to install the development version of that package (usually the package name with a -dev suffix), and then run the command again.
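As an illustrative example (the package name here is just one possibility), if the configure step complained about a missing libexif, on Ubuntu you could search the package index for the matching development package, install it, and then re-run the configure step above:

```shell
# Search the package index for the -dev package that provides the missing library
apt-cache search libexif | grep -- -dev

# Install the development package, then re-run the configure step above
sudo apt-get install libexif-dev
```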

Once this completes successfully and you see the message “Now type ‘make’ to compile”, you can run the following commands.

make
sudo make install

make will compile the package, and sudo make install will install it into your system.

Compiling Mono

First, navigate to the directory where you cloned the Mono source code, and fetch all of Mono’s submodules:

git submodule init
git submodule update --recursive

Then run the compilation setup script, as in the previous step.

./autogen.sh --prefix=/usr/local

Once this completes successfully, you should again see the message “Now type ‘make’ to compile”. This article assumes you do not already have Mono installed, so you must first build monolite, a minimal bootstrap compiler, which is then used to build the full Mono.

make get-monolite-latest

Now you can build Mono:

make EXTERNAL_MCS=${PWD}/mcs/class/lib/monolite/gmcs.exe

If you already have a working Mono installation, or you want to update an existing one, you can just run a regular make without having to build monolite first.

Now you can install it:

sudo make install

You can optionally run make check to run the mono and mcs test suites.

That’s it! You should now have a working Mono installation on your system. You can now run:

mono -V

You should see some information on Mono and the version number. As of the time of this writing, that is 3.8.1.
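To sanity-check the new toolchain end to end, you can compile and run a trivial C# program. This is just a sketch: the file and class names are arbitrary, and it assumes the freshly installed mcs compiler is on your PATH.

```shell
# Write a minimal C# program
cat > hello.cs <<'EOF'
class Hello {
    static void Main() {
        System.Console.WriteLine("Hello from Mono");
    }
}
EOF

# Compile it with Mono's C# compiler, then run the resulting assembly
mcs hello.cs
mono hello.exe   # prints: Hello from Mono
```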

Edit and Apply registry settings via PowerShell

Posted by & filed under Programming.

The video game Diablo 3 is a great game; however, due to its extremely vibrant and chaotic particle effects during fights, it can become very difficult to see the mouse cursor.

So I’ve written a quick script for Windows users to turn on mouse cursor trails in Diablo 3. It will automatically disable the trails when you exit the game. I’ve found it helps *a lot* in being able to keep track of the cursor.
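For context, the underlying Windows setting for cursor trails is the MouseTrails value under HKEY_CURRENT_USER\Control Panel\Mouse: 0 (or 1) disables trails, while values of 2 and above draw a trail of that many cursors. In .reg file form it looks like this (the trail length of 7 is just an example; note that running applications may only pick up the change after being notified of it or after you log off and on):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Control Panel\Mouse]
"MouseTrails"="7"
```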

No need to download anything, so there are no keyloggers or malware here. And it doesn’t touch Diablo whatsoever, so there is no chance of being banned.

Read more »

Using Git with 3D Games

Posted by & filed under Game Development, Programming, Unity3D.

Git can work fine with 3D games out of the box. The main caveat is that versioning large (>5 MB) media files can become a problem over the long term as your commit history bloats. We have solved this potential issue in our projects by only versioning a binary asset once it is considered final. Our 3D artists use Dropbox to work on WIP assets, both for the reason above and because it’s much faster and simpler (not many artists will actively want to use Git!).
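As a sketch of that policy in repository-config form (the folder names are purely illustrative), the WIP asset area can simply be excluded so that only finalised binaries are ever committed:

```
# .gitignore – keep work-in-progress art out of the repository
Assets/Art/WIP/

# Finalised binary assets (e.g. under Assets/Art/Final/) are committed
# normally, but only once they are considered done.
```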

Read more »

Performing a Clean Install of F5 BIG-IP software

Posted by & filed under Networking.

I was recently tasked with replacing a dead F5 BIG-IP 1600 at our datacentre. A seemingly typical piece of work; however, I ran into some huge issues along the way, largely because the replacement F5 device shipped with an early version of the OS, v9.4.6. F5 switched from a partition-based storage framework to an LVM volume-based one in the transition from v9.x to v10.x, and also quite heavily changed the OS upgrade process (for the better).

What this means in practice is that upgrading from v9.x -> v10.x -> v11.x is quite a complicated process. The guides F5 provides span a number of knowledgebase articles, can be fairly difficult to follow properly, and missing one step could mean wiping or bricking the F5.

Additionally, the official F5 guides on performing a clean install of an OS image require either a Linux PC or a pre-existing, working F5 to create the requisite bootable media to install from.

In this article I present a simple and effective way to upgrade from any OS version straight to the latest v11.x release, from any PC (Windows, Linux or Mac).

Read more »

Collaborative Code Design

Posted by & filed under Game Development, Programming.

When working within a team of programmers, I have always found that maximizing collaboration between programmers in the design phase has great benefits down the road. Even if a programmer isn’t expected to work on the implementation of a specific design, I find they still get great value out of knowing, at least at a high level, how each area of the codebase is meant to flow.

To that end, I’m always on the hunt for better tools to help achieve this. I’ve yet to find the perfect combination of non-invasive tools across the entire programming workflow end-to-end, as unfortunately many of the most powerful ones tend to dictate which other tools you must use to keep compatibility.

However, I believe I have found a good piece of the puzzle, at least for the design phase of the programming workflow.

Read more »