April 30, 2014

Connected Government – Cloud Enabling Government

Filed under: Uncategorized — mifan @ 7:41 pm

Cloud adoption has had a tremendous impact on e-Government or digital government, leading to reduced operational costs, a smaller IT footprint, shared knowledge and infrastructure, and overall more effective public services. National and local governments capitalizing on cloud infrastructure are starting to reap benefits through improved government-to-government (G2G), government-to-business (G2B) and government-to-citizen (G2C) services.

As applications and their architectures become more complex by the day, and as more and more applications move towards the cloud, non-functional requirements (NFRs) become the key set of requirements that must be satisfied to ensure a successful implementation. NFRs have become the key set of issues that enterprise architects, devops engineers, and implementers face in the move towards cloud-based systems.

This whitepaper on Connected Government – Cloud enabling public services provides insight into the challenges of building an eGovernment solution, and how an enterprise-grade Platform as a Service can help achieve the key NFRs.

February 12, 2014

Operational Business Intelligence for the Reactive Enterprise

We are in a day and age where infrastructure, and to some extent businesses, are moving towards reactive IT instead of traditional proactive IT. Why reactive IT? Think auto-scaling vs traditional capacity planning, for instance. Or the move towards schema-less NoSQL instead of traditional RDBMS. All in all, reactive IT enables an enterprise and its infrastructure to react to its internal and external environment. In order to achieve this, a real-time, or near real-time, event-driven model is a must – agents to push/pull streams of events, systems to process these events quickly and efficiently, and the ability to use these processed events to react.

Operational Business Intelligence (known as OBI or operational BI) or Operational Intelligence (OI) is defined as the analysis of operational data and information in an enterprise – the real-time or low-latency analysis of streaming events or batched enterprise data, feeding the results back into the enterprise as input. OI provides organisations with real-time insights into enterprise operations and the ability to act upon, or in other words react to, events – enabling organisations to ‘listen’ to and process events as they come in, detect anomalies and patterns, and take reactive measures.

Of course, with large amounts of data come big challenges, and with big challenges comes big data. With the possibility of dealing with massive data sets within OI, big data concepts have become a mainstay in organisational OI. Modern OI solutions have started focusing on large NoSQL data stores (possibly processed in batch mode) alongside streaming events.

OI provides a unified, correlated view of streaming big data, processed big data, complex events and processes, with the ability to analyse, mine and process data and information – a prerequisite for building a reactive enterprise. Enterprise users and devops have come to expect the following kinds of information from a unified OI view:

  • Analysis of information, and mining for patterns
  • Adhoc, real time dashboards
  • Adhoc search of patterns across the enterprise
  • Alerting based on event occurrences
  • Monitoring of system health, load, KPIs


OI has many technology components, often with shared feature sets. Some of the notable solutions are

  • Business Activity Monitoring (BAM) – Monitoring of activities and events, usually batch processed and provided via dashboards. Used to track KPIs related to activities and performance.
  • Complex Event Processing (CEP) – Processing of a continuous stream of events, usually in-memory. High performance with the ability to detect certain patterns and anomalies in the incoming streams.
  • Business Process Management (BPM) – Model driven execution of processes and policies, including business workflows with human intervention.

Independently, each of the above components handles a very specific area of OI, and together they provide the building blocks for a comprehensive, unified OI solution.
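The ‘filter a stream, react to matches’ core of CEP can be sketched with nothing more than a Unix pipeline – a toy illustration of the idea, not WSO2 CEP syntax (the event format and the threshold here are made up for the example):

```shell
# Each line is an event: "<timestamp> <sensor> <value>".
# Emit an alert for any reading above a threshold, as the events stream past.
printf '%s\n' "1 s1 20" "2 s1 21" "3 s1 95" "4 s2 19" "5 s2 97" |
awk '$3 > 90 { print "ALERT: sensor " $2 " read " $3 " at t=" $1 }'
# prints:
# ALERT: sensor s1 read 95 at t=3
# ALERT: sensor s2 read 97 at t=5
```

A real CEP engine adds windowing, correlation across multiple streams, and in-memory throughput of tens of thousands of events per second, but the shape of the problem is the same.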

In a later post, we will explore how the WSO2 stack, a comprehensive Open Source middleware stack, can be utilised for Operational Intelligence in an enterprise.

May 28, 2013

SSH must-haves for Productivity

Filed under: Development,FOSS,Linux,Uncategorized — mifan @ 9:21 am

Devops work with many different remote systems at once. For the lazy bunch (a.k.a. the truly productive bunch who look for ways to make their work easy – a.k.a. the really bright ones) there are ways to make life easier.

So instead of typing

ssh whitewizard@<dev-server-ip>

(or wait, was it some other IP?) and then entering the password that no one can remember, given to us by the sysadmin, I’d now say

ssh devserver

and voila.

The steps to get there are:

SSH Passwordless Entry

Step one is to get rid of those pesky passwords required each time you SSH into a host. This can be achieved by setting up an RSA public/private key pair. And besides, who’d say no to added security?

To generate an SSH key if you don’t have one already, run

mkdir ~/.ssh
chmod 700 ~/.ssh
ssh-keygen -t rsa

You will be prompted for a location to save the keys, and a passphrase for the keys. This passphrase will protect your private key while it’s stored on the system.

Generating public/private rsa key pair.
Enter file in which to save the key (/home/b/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/b/.ssh/id_rsa.
Your public key has been saved in /home/b/.ssh/id_rsa.pub.

Your public key is now available as .ssh/id_rsa.pub in your home folder.
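If you’re scripting this, ssh-keygen can also run non-interactively – a sketch, noting that an empty passphrase (-N "") trades the passphrase protection above for convenience, and that the file path here is a throwaway location chosen for illustration:

```shell
# Start clean, then generate a 4096-bit RSA key pair with no prompts, no passphrase
rm -f /tmp/demo_id_rsa /tmp/demo_id_rsa.pub
ssh-keygen -t rsa -b 4096 -N "" -f /tmp/demo_id_rsa -q

ls /tmp/demo_id_rsa /tmp/demo_id_rsa.pub   # private and public key
```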

Next, the little bit of magic where you copy your key over to the remote host machine. If you can SSH into the machine, this should be no problem. Run

ssh-copy-id whitewizard@<dev-server-ip>

Done. You can now login by saying

ssh whitewizard@<dev-server-ip>

And the system shouldn’t ask you for a password (if you set a passphrase on the key, you’ll be asked for that instead, unless you use an ssh-agent).
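If ssh-copy-id isn’t available on your machine, the manual equivalent is just an append to the remote authorized_keys file – a sketch, with devserver as a placeholder host and an obviously fake key line standing in for a real public key:

```shell
# What ssh-copy-id does, roughly (don't run as-is -- placeholder host):
#   cat ~/.ssh/id_rsa.pub | ssh whitewizard@devserver \
#     'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'

# Locally, the effect is one line appended per key, with tight permissions:
echo "ssh-rsa AAAAB3Nza...FAKE... whitewizard@laptop" >> /tmp/authorized_keys.demo
chmod 600 /tmp/authorized_keys.demo
```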

SSH Hostname Shortcuts

But wait – we want more (or less, actually). The next step is to shorten this further. For this, create a file ~/.ssh/config with entries of the form:

Host [shortcut]     
 Hostname [full-hostname]    
 User [username]

I’d use

Host devserver
     Hostname <dev-server-ip>
     User whitewizard
And voila, now running the following will log you into the system

ssh devserver

Of course, replace the above hostnames and usernames with your own – else you’d be logging into Saruman’s own server!
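The config file supports quite a bit more than a hostname and a user – here is a sketch of a fuller ~/.ssh/config, where every hostname, user and key path below is a made-up placeholder:

```
Host devserver
    Hostname devserver.example.com
    User whitewizard
    Port 22
    IdentityFile ~/.ssh/id_rsa

# One stanza can cover a whole fleet via wildcards
Host *.staging.example.com
    User deploy
```

scp and rsync go through ssh, so they honour these entries too – scp somefile devserver:~/ just works.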

May 2, 2013

Real time Complex Analytics for Football

Filed under: Development,WSO2 — mifan @ 6:54 pm

Srinath recently wrote about the use of WSO2 CEP for the ACM DEBS 2013 challenge on real-time complex analytics, and it was a real eye-opener. The challenge, enabled by sensor networks on the players’ shoes, the goalkeeper and the football, was to conduct real-time analytics on a football game and provide useful analytics and real-time reports to the managers.

The post describes the usage of WSO2’s Complex Event Processor (CEP), which was used to implement the use cases of the challenge – namely running analysis of the players, ball possession analysis, a heatmap of player locations at various times, and shots-on-goal analysis. CEP is a high-performance, scalable event processor that can read streams of ‘data’, extract meaningful events and process them in real time, in memory – this means the ability to process large amounts of events, fast! In this case, continuous streams of data and events were fed into the system via the various sensors at a rate of 15,000 position events per second, with the player sensors and the ball sensor emitting events at 200Hz and 2000Hz respectively. According to the blog, WSO2 CEP processed 50,000 events per second, which is quite impressive.

Just imagine the possibilities such an implementation could bring to the game. I’m awaiting the day the TV alerts me, possibly 2 seconds before the event, that Robin van Persie’s shot has a 99% chance of finding the back of the net past Petr Cech – based on analytics of the kick (curvature, wind speed, rotation) and of the defence (the goalkeeper’s distance and the probability of him reaching the ball based on historical data, the spacing between defenders). Or that, based on Theo Walcott’s speedy run and Cazorla’s immaculate pass, as well as the positions of the defence and the keeper, Walcott will end up with the ball behind the defence, onside and in a goal-scoring position, within the next 3 seconds – the mini siren on the TV goes off, telling me to watch the screen for the next 5 seconds (of course, that is assuming my reaction times are good, let alone the goalkeeper’s). And I can hear us hard-core fans of football saying “who would take their eyes off a football game anyway?” – but what if!

Minority Report for football, anyone?

Future (Source: XKCD)

April 17, 2013

Clearer Active Tab in Gnome 3 Terminal

Filed under: Uncategorized — mifan @ 7:56 am

I use Gnome on Ubuntu 12.10 with Gnome Terminal as my main terminal. A common issue I faced is that the active/open tab is hard to distinguish amongst the many tabs open in a terminal window. I use the Ambiance theme that comes as default.

A simple fix for this is to create a style override in Gnome 3:
Create a gtk.css file as follows, if it doesn’t already exist

vim ~/.config/gtk-3.0/gtk.css

Add the following to the file – you can change the color value to any color as required.

TerminalWindow .notebook tab:active {
    background-color: #CC6600;
}
And now you’ve got an active tab that is much more visible.
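If you’d also like the inactive tabs toned down, the same selector without :active can be styled too – a sketch; the colour here is an arbitrary choice of mine, and the selector syntax matches the GTK 3 versions of that era:

```css
/* Dim the inactive tabs (arbitrary example colour) */
TerminalWindow .notebook tab {
    background-color: #300A24;
}

/* Highlight the active tab */
TerminalWindow .notebook tab:active {
    background-color: #CC6600;
}
```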

Gnome terminal with custom active tab

March 19, 2013

What Your Favorite Map Projection Says About You!

Filed under: Uncategorized — mifan @ 4:54 am

Yet another brilliant XKCD comic – and this time for the geography/neogeography folk – and true to some ‘extent’
Map Projections XKCD comic

February 7, 2012

Creating Patches with SVN Diff

Filed under: Development,FOSS,Linux — mifan @ 12:33 am

I recently came across a situation where I had to generate a patch from SVN, in a multi-developer environment, where my code was committed in 2 batches and there were changes from other developers in between. Thanks to the power of SVN, this is a breeze (of course, I’m not even going to start the version control debate here!).
The format for svn diff is as follows:
The format for SVN diff is as follows:

svn diff [-r N[:M]] [--old OLD-TGT] [--new NEW-TGT] [PATH...]

In this case, generating a patch covering revisions 1000 to 1020, only for the files file1.php and file2.php, would be as follows:

svn diff -r 1000:1020 dir1/file1.php dir2/file2.php > mypatch.patch

For conventional patches, you would check out the code, make your changes, and run svn diff from the root location:

svn diff > mypatch.patch
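The patch file svn diff emits is plain unified-diff format, so the whole round trip can be seen with nothing but diff and patch – a self-contained sketch using throwaway files in /tmp:

```shell
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'hello\nworld\n' > old.txt
printf 'hello\nbrave world\n' > new.txt

diff -u old.txt new.txt > mypatch.patch || true   # diff exits 1 when files differ
patch old.txt < mypatch.patch                     # apply: old.txt now matches new.txt
cat old.txt
# prints:
# hello
# brave world
```

Applying an SVN-generated patch works the same way: run patch -p0 < mypatch.patch from the checkout root.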

October 5, 2011

Download Directories with wget

Filed under: Development,FOSS,Linux — mifan @ 6:53 am

wget is a non-interactive download utility available on *nix systems, which can be used to download files remotely, amongst other things.

I’ve looked for the correct wget command to download a directory remotely, but couldn’t find the right one. This one, however, worked for me:

wget -r --level=3 -np -nH "<URL of Directory>"


  • -r – recursive download (recursively loop through subdirectories)
  • -np – no parent directories (without this, the recursion pulls in the parent directory as well, with some weird results)
  • -nH – no host directories (without this, your folder will be created with a weird name that includes the host/URL)
  • --level=3 – up to what level/depth you want to recurse; change to an appropriate number

Check the man page of wget for more information

September 16, 2011

Multiple Terminals in the same Session with Screen

Filed under: Development,FOSS,Linux — mifan @ 8:30 am

As an administrator or unix user who has just SSH’d into a remote server, have you ever needed to open multiple remote terminals without re-logging in multiple times? I’ve hit this one too many times, where I needed to run multiple apps, view logs, etc. in separate terminals – which can be made easy using GNU Screen.
GNU Screen is a terminal multiplexer, which can be installed via a simple apt-get install screen on your Linux server (it may be available by default as well). Once that is done, the following steps should suffice:

  • Login to your remote terminal (e.g: via SSH)
  • Open Screen
 user@remote> screen
  • Use the following commands to create and navigate terminals. CTRL+A is Screen’s command (prefix) key by default on my Ubuntu server, and this is likely the case for you as well. Thus when I refer to CTRL A p, I mean: press CTRL+A, release, then press the ‘p’ key
  • Create a new terminal:
user@remote> CTRL A c
  • Move to the next terminal:
user@remote> CTRL A n
  • Move to the previous terminal:
user@remote> CTRL A p

Etc. Some more commands are as follows:

  • Move to a specific terminal: CTRL A “
  • Move to the last used terminal: CTRL A CTRL A
  • Move to terminal number 0 (0-9) : CTRL A 0

These commands were enough to get me going. For even further commands and to explore the power of Screen in depth, see the GNU Screen manual (man screen).
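One addition that makes the windows above much easier to keep track of is a status line listing them – a minimal ~/.screenrc sketch, where the colour escapes and the scrollback size are just my preferences:

```
# Skip the startup splash screen
startup_message off

# Show the window list in the terminal's last line, current window highlighted
hardstatus alwayslastline
hardstatus string '%{= kw}[%H] %-w%{= bw}%n %t%{-}%+w'

# Keep more scrollback per window
defscrollback 5000
```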

August 16, 2011

Espatialy for You!

Filed under: Uncategorized — mifan @ 9:16 am

Espatialy for You! is my attempt to jot down my notes, thoughts, research, my ultimate goal to advocate GIS catalog world dominance, and what not, from a neogeographer’s point of view. Head over to Espatialy for You! for more.
