Sunday 22 December 2013

A Flimsy Foundation

Today, I felt a strong desire to respond to the BBC’s recent point of view article titled “A long winter for Christians in the Middle East”, for the sake of offering an alternative angle on an otherwise interesting piece, IMHO. The reader should be warned that we are delving into a very deep and passionate mire of social science and theology with a dash of Middle Eastern dynamics, all subjects worthy of knowledge harvesting. If any or all of these subjects give you indigestion, you are free to move on to other articles on this blog.

In a recent BBC point of view article, William Dalrymple (a Scottish Catholic travel writer) talks about how the “Arab Spring” has given rise to a winter for the Middle East’s 14-million-strong Christian population, forcing mass emigrations on a scale not seen since the Iraq war of 2003. As protests morphed into armed struggles and opposition forces became more radicalised in different states, the spirit of communal activity and togetherness has seemingly been torn apart (maybe forever) in many Arab and Middle Eastern states, particularly in Syria.

Mr. Dalrymple goes on to illustrate the dimensions of this communal spirit in these areas, mentioning Mughal art sporting apparently Christian scriptural quotations (which he rightly points out are probably from non-canonical texts that have been rejected by mainstream Christians but have somehow been incorporated into Islamic thought).

An example of Mughal art depicting the crowning of a new ruler. All in all, they are quite beautiful. (Commons)
It is here that the article finally reaches a crescendo of reasoning; Mr. Dalrymple talks about a 16th-century Mughal manuscript, now residing in the British Library, which contains an illustration of the Nativity but with features that are undeniably Qur’anic in nature (an oasis and palm trees instead of a stable). With this, the author reasons that the picture shows how much devotion Muslim rulers apparently showed to the Christians’ Lord, and that this rises above the simplistic thinking associated with the ‘clash of civilisations’ (he mentions Samuel Huntington’s book and theory of the same name).

In short, ladies and gentlemen: Christians and Muslims both believe in the Nativity, and therefore Christmas is Mr. Dalrymple’s solution for rebuilding a brotherly foundation between these two communities. He delivers the coup de grâce at the end of the article, where he describes what these groups have in common: “... to gather around the Christ child and pray for peace.”

This is a dramatic piece of theological mediation with a major flaw, a flaw that has cost Syrians dearly once and is now being put forward to put them at risk again (this time by a writer rather than a diplomat or politician). There was a time when people hoped that the Sunni rebel fighters would identify with national unity, commonality within the diversity of their country and other secular ideals to win the fight against tyranny. Instead, we have a rebel force that is being further assimilated into radical Islamism, fuelled by the socio-religious and cultural differences between communities as well as by better organisation and funding. Now Christmas is being touted (I sincerely hope not by politicians eventually) as a foundation for peace, in the hope that the rebel Islamist fighters will identify with the Nativity and stop fighting. If even Ramadhan could not stop the fighting, what hope is there in relying on a tradition that is more prominently associated with Christians than with Muslims? We cannot afford to force-build on flimsy foundations.

Instead, I offer my own, alternative theological argument: both sides view Jesus as a figure of authority, and he once said we should love our enemies. This can therefore be used as a basis for peace because it is more of a command than a theological tradition. The catch: it too is more closely identified with Christians than with Muslims (perhaps the moderates might accept it, but they are rapidly being assimilated or swept away by radicals). Nevertheless, instilling this fundamental commandment in both sides’ psyche might be the only way to build a more solid foundation for peace than Mr. Dalrymple’s tradition argument.

Finally, with regard to the clash of civilisations, I have this to say: neighbours are different, just as civilisations are different. Despite those differences, we tend to coexist, not only because it is beneficial to communal peace and prosperity but also because it is morally more praiseworthy. Clashes do occur, in the form of friendly arguments at the marketplace or bazaar (this is from experience). We can prevent those clashes from becoming more serious by nurturing trust, freedom of thought and respect for life (with a scriptural base to boot). These are the keys to a solid foundation that will prevent things like this from happening.


Friday 13 December 2013

The Death of the Universe

In 1999, I begged my mom to buy me a book on board the M.V. Doulos, the famous floating library. It was an introductory college astronomy textbook and I loved it (I was 8 years old at the time).

One of the most fascinating concepts that I discovered while reading the chapters on stellar workings and cosmology (the study of the evolution of the universe) was the idea that not only do stars die (including our own), but the universe too will die. How it dies, of course, depends on the universe's present properties, which had yet to be fully explored at the time of the book's publication.

Today, the answer to the question of how our universe will die is a little more certain, if also more mysterious, than I thought before. This TEDEd video sums it all up. Enjoy!

Monday 2 December 2013

Amazon's Prime Air service

This has nothing to do with knowledge harvesting whatsoever other than simply being cool!

Behold, Amazon's newest mode of delivery... via drones! Coming soon to your front door in 2015/16.

I wonder, though, how the company plans on tackling the possibility of 'drone bandits' who might feel the urge to make a lucky hit with their pigeon shooters...

New Explorers of the Final Frontier

The new month has begun with a bang with the launch of Chang'e 3, China's latest in a series of missions to Earth's natural satellite, carrying the surface rover Yutu.



At the same time, India's mission to Mars, Mangalyaan, has successfully cleared Earth orbit and is now heading for Mars, a journey that will take about 300 days. Space is truly becoming a busy place, as busy as any other area buzzing with human activity, and we are perhaps entering a new era of space exploration: an era where space not only becomes part of the human cosmos, but becomes essential to every nation in the world. We are seeing the beginnings of such a world in our global need for, and dependency on, environmental data and space-based telecoms capability, and maybe someday soon, raw materials.

No boost to interplanetary sojourns is complete without a look back home; here's a parting shot of the Earth taken by Mangalyaan almost 10 days ago. A sort of visual salute to the land of its creators, no doubt.
Earth by Mangalyaan, taken on November 20, 2013 (ISRO)
You can read details on the Indian Mars Orbiter Mission here and about Chang'e 3 here.

Tuesday 19 November 2013

Teaching with Robots

Last week, while perusing the net for news, I happened to chance upon this BBC article and I have to say, it grabbed my attention completely. The article talks about a recent Kickstarter project that successfully achieved its funding goal to design and produce two robots called Bo and Yana.

At first glance these rather cute robots may look like ordinary children's playthings, but they serve a serious purpose of far-reaching importance. Though made for children aged 5 and above, these robots can be programmed by the child to do all sorts of things (like acting out a story). The child essentially becomes a hacker while playing, learning how to conceptualise programming with languages like Scratch. Awesome!
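To give a flavour of what 'programming a story' means, here is a toy sketch in Python. The class and method names are my own invention for illustration; Play-i's actual interface is visual and Scratch-like rather than textual.

```python
# A toy, Scratch-like story script for a robot. All names here
# (ToyRobot, say, move) are invented; this is not Play-i's real API.
class ToyRobot:
    def __init__(self, name):
        self.name = name

    def say(self, line):
        print(f"{self.name}: {line}")

    def move(self, steps):
        print(f"{self.name} rolls forward {steps} steps")

# 'Acting out a story' is really just sequencing commands:
bo = ToyRobot("Bo")
bo.say("Once upon a time...")
bo.move(3)
bo.say("The end!")
```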

The robots reflect the philosophy of their maker, Play-i, and indeed of many education experts in the modern world today: in the post-computer era, must it not be a priority to educate children not only to manage computing devices but also to program them? Indeed, this may be no more far-fetched than teaching children not just to read other people's books and papers, but also how to pick up a pen and trace letters, words and finally ideas on a piece of paper. Such knowledge requires mastering language, and the same goes for computing.

Some may dismiss this idea as irrelevant to the modern child, but perhaps they wouldn't dismiss it if they understood what actually drives computers. It is a language that, unfortunately, bodes evil for many children, and it is called mathematics. Therefore learning more about computers and programming should make you a better math student, right? The idea is tantalising and is endorsed by many luminaries in the mathematical field. For good or ill, our thinking is being shaped by these thinking machines just as much as we shape their creation. Play-i's Bo and Yana are a testament to that end.

Monday 28 October 2013

Weekend Review: Machine Learning and its Possibilities

Because I have now started clinical rotations for this academic year, the number of posts I can write will consequently be curtailed and haphazard in frequency. I thank my readers for their support in making this project worth the effort. Please continue to visit; you can keep track by bookmarking or using your favourite RSS application.

While going through my weekend net readings, I was simply delighted by this very engaging article from the Verge featuring Microsoft cofounder Paul Allen's thoughts on machine learning and its potential for impacting our lives. But to truly understand the nature of what exactly we're dealing with here, I wish to touch on a couple of other articles that serve to augment Paul Allen's thoughts.

Machine learning has much to do with the famous English mathematician Alan Turing. His seminal 1950 paper titled 'Computing Machinery and Intelligence' is still cited today. In this BBC news feature from earlier this month, I discovered to my surprise (surprise because I am, of course, NOT a learned computer scientist, only a humble knowledge harvester/doctor) that Turing had tried to refute some of the claims put forth by an equally famous predecessor, Ada Lovelace, regarded by some as history's first computer programmer. It appears Ms. Lovelace was of the opinion that computing machines can never give us surprising insights; they can only put forth what we expect them to. While this seems straightforward to many of us, it didn't ring true to Mr. Turing. He proposed that if the computational power of computers continues to increase with time, what's to stop them from becoming as sophisticated as the human brain and (like the aforementioned organ) coming up with some surprising ideas of their own?

It is argued that even Google's autocomplete function can sometimes suggest insightful queries that we, the searchers, might never have thought to ask. All this brings us back to Paul Allen; although the man is a supporter of artificial intelligence development and has a good number of institutions under his name doing just that, he denies the idea that computers will soon (as in less than a century from now) match or even outstrip the computing power of their creators' brains, the so-called 'Singularity'. He offers several points to support his rather surprising stance.

In good academic fashion, Ray Kurzweil, the populariser of the term 'Singularity', offers a response to Paul's refutation, citing several counterpoints as well as the possibility that his opponent may have misunderstood the crux of the problem. It is not my job to determine who is right and who is wrong; I'll leave that to you to decide. But what is agreed upon is that computers are indeed getting more powerful every day. We are promised opportunities (like intelligent space probes that will explore the galaxy for us) and threats (like autonomous drones that will decide for themselves whether to kill or not). As a result, the future seems more murky and wonderful than ever before.

Tuesday 15 October 2013

Celebrating Ada Lovelace

Many people might know Charles Babbage, the designer of the famed difference and analytical engines (common examples of early mechanical computers that could have worked had they actually been built). But have you ever heard of a woman named Ada Lovelace?
Portrait of Ada Lovelace (Commons)

Ada Lovelace is an important historical figure in the early days of computing in two fundamental ways. First, she was the first person to write and describe what could simply be called a program (a set of instructions telling the computer what to do) that could be run through Babbage's analytical engine; in this case it was an algorithm that computes a sequence of rational numbers called the Bernoulli numbers.
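As an aside, here is a minimal modern sketch in Python of one classical recurrence for the Bernoulli numbers. It is my own illustration of the kind of computation her program specified, not a transcription of her original method.

```python
# Bernoulli numbers from the classical recurrence
#   sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1,  with B_0 = 1.
from fractions import Fraction
from math import comb  # Python 3.8+

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```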

While this proves her prowess as a mathematician, her truest (and undisputed) contribution came in the form of a leap of thought while observing Babbage's engines. While Babbage (being a number geek) was only interested in his engines being proficient handlers of big numbers, Lovelace thought beyond this; she envisioned a computing device that could process, say, sounds of varying characteristics. This was a gigantic leap of thought, because it allows you to invent something like a music note processing system. Outstanding!

Today, this exemplary act of feminine ingenuity and scholarship is celebrated every year in mid-October in the form of Ada Lovelace Day, which celebrates women's contributions to the advancement of human knowledge. I only discovered this today through Google Plus's hashtag trends (#AdaLovelaceDay). This is quite an important event and I encourage you to talk about it amongst your social media peers, family and friends. It's important that when we engage in knowledge harvesting, EVERYONE participates. That way we bring in different ways of looking at problems and finding solutions to them. Oh, and while you're doing that, spare a minute to think of at least one famous female scientist. If you can, keep searching for more. If you can't, now's the time to take a plunge into discovery!

Sunday 13 October 2013

The Unreasonable Unknown

Is such a voyage unreasonable? Artist's depiction of the Voyager spacecraft. (NASA/JPL)
In late August 2012, after 35 years of travelling through space, NASA's veteran spacecraft Voyager 1 opened a new chapter in our species' long story of exploration by crossing, and consequently mapping for the first time, what is believed to mark the end of the sun's sphere of magnetic influence: the heliopause.

However, due to the unknown nature of the region the spacecraft is surveying, it took the mission team almost a year after seeing the first suggestive signs of the crossing to study the data before they could confidently announce to the public that humanity has become an interstellar faring race. Undoubtedly, this is a remarkable achievement, not just because we have proven that it is possible to send a craft to the distant reaches of our sun's domain but because it awakens that deep sense of the unknown inside all of us. Whenever we progress into unfamiliar territory, be it worlds, continents, life stages or situations, there is that deep, exotic feeling that one gets: a mixture of hope, awe and apprehension. We have reached the edge of what we know; now we venture into uncharted waters.

This drive to reach the edge of knowledge has, for good or for ill, driven a lot of humanity's doings; from the discovery of the New World by the Europeans, to the exploration of the inner space under the seas, to the venturing of humans and mechanical emissaries into space, to our peering into the distances of the sky to fathom the heavenly domains. What we gain from doing all this is nothing short of meaningful progress; the very same progress that has allowed us to tame nature (somewhat) and allowed us and our children to thrive and live more comfortably. As the famous playwright and political activist George Bernard Shaw once put it so eloquently, "The reasonable man adapts himself to the conditions that surround him... The unreasonable man adapts surrounding conditions to himself... All progress depends on the unreasonable man". The Voyager mission, itself an unreasonable mission, is truly a work of unreasonable people!

So we are left with the question: do we owe our comfort today to the people who accepted things as the will of the universe/deities, or to the people who went about asking seemingly unreasonable questions and then ventured out to the edge of the known to find the answers? I believe the answer goes without saying.

Meanwhile, Voyager 1 will continue gathering data on this unexplored region of the sun's domain (it can still be regarded as the sun's domain despite the crossing, because the sun's gravitational influence extends farther outwards, up to the Oort cloud where the majority of our solar system's cometary bodies reside; Voyager has yet to leave that area of influence, as illustrated below) until its radioisotope thermoelectric generator stops producing power sometime in the mid-2020s. From then on, she will continue drifting farther from us, a silent emissary to the stars.

Learn more about Voyager 1 and her sister craft Voyager 2 here.
Where Voyager 1 is as of 2013. (NASA/JPL)

Friday 11 October 2013

Will October 17 be the day?

Reading history can be quite fun, if only for leisure. One of the most fascinating lessons that anyone can draw from reading history books is that there isn't a single national or state entity that will not go over the dreaded 'rise and fall' hill. In fact, I dare say one could easily summarise human history as a sequence of civilisation 'bubbles' that eventually burst into ever smaller bubbles or disappear into nothingness altogether.

We're all descendants of other, sometimes bigger, historical bubbles. The birth of these smaller bubbles, however, usually involves tumultuous changes that can result in human misery, bloodshed not excluded. Isaac Asimov's wonderful Foundation series illustrates the process beautifully in a fictional setting, while offering us a scenario of just how we might be able to shorten the periods of anarchy.

Without divulging too much of the plot for those who have not yet read the aforementioned series, one of the signs given by one of Asimov's characters that points to the fall of the story's primary state entity is the lack of significant scientific progress in the realm, which could further be attributed to economic decline. Sounds familiar?
The resignation of Romulus Augustus to the Germanic soldier Odoacer traditionally marks the end of the Western Roman Empire. (Commons)

Although it is obviously difficult to determine when nations officially go kaput (the Roman empire is thought to have declined over a period of four centuries, though historians take the resignation of Romulus Augustus on 4 September 476 as the moment of the western empire's fall), what is easy to ascertain is that when they do, things can go sour very quickly. Take the U.S. right now with its looming debt crisis; if it indeed fails to resolve the debt-ceiling standoff by October 17 this year (a little more than a week from now) and ends up defaulting, it will leave a lot of creditors hanging. And in this globalised world, the knock-on effect of an unprecedented event like a U.S. debt default would leave the entire world devastated. Not to mention the fact that a default would make it more difficult for America to borrow money in the future, stifling the country's already sluggish growth rate.

Can this be defined as the beginning of America's end (if it hasn't begun already)? And what will happen to all of us in the meantime? I wish we had Hari Seldon's psychohistory to figure that one out! But if it happens, October 17 will most decidedly be the date that future historians choose to mark the beginning of the end of the U.S.A. They might very well also scour the ancient archives for insightful analyses written during this time period, and they might probably find this one and judge it either as useful speculation or as prophetic writing, depending on what happens in the next few days.

Saturday 14 September 2013

BIG data: its gains, losses and absurdities

In my first post ever on this blog I talked about something called the 'knowledge economy'. This is one of the most common memes of our time (if you have no idea what a 'meme' is, click here). Like other things in this world, memes tend to cluster around other memes, and one of those clustering around 'knowledge economy' is something called 'big data'.

Big data is just what it sounds like: enormous amounts of raw, unprocessed information that cannot be sorted or dealt with using traditional, hands-on means. No human can look at big data and derive any real meaning from it; big data is usually handled with the help of computers. Examples of big data being used today abound in the world (meteorological data being used to forecast the weather; the richer the data set, the more accurate the forecast) and the problem has also given birth to many companies, the most commonly cited being Google (whose very name is derived from the term 'googol', which is 10^100). The company's computers crawl through the web, indexing trillions of pages to make them searchable for all of us. Despite that, the internet remains far from being entirely indexed, for two reasons: primo, it's growing every day, and secundo, it's so damn big. As a side note, check out this nifty website to see how big the internet probably is (I say probably because the map is obviously incomplete).

Big data exists from the moment something exists in this universe, and the universe and all its contents have existed for 13.7 billion years. The trick is capturing that data, storing it and analysing it for useful patterns. The tools for performing these actions have only arrived in the last two centuries, in the form of cheap, compact storage devices, powerful processors, networking technologies and, most importantly perhaps, necessity.
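To give a trivial taste of what 'analysing for useful patterns' means in practice, here is a toy Python sketch (my own example, with invented data); real big-data systems distribute this same count-and-aggregate idea across thousands of machines.

```python
# The simplest possible 'pattern finder': count term frequencies
# in a stream of raw text. The data here is invented for illustration.
from collections import Counter

raw_stream = [
    "traffic jam on main road",
    "heavy traffic near the market",
    "main road clear this morning",
]

counts = Counter(word for line in raw_stream for word in line.split())
print(counts.most_common(3))
# [('traffic', 2), ('main', 2), ('road', 2)]
```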

To illustrate the importance of necessity's role in spurring the development of tools for handling big data, take a look at this TEDEd video:


Since IT first became a ubiquitous part of our lives, big data has become relatively easy to collect. Mapping has enjoyed the fruits of big data; the maps of today are no longer the physical pieces of paper that my parents and I used extensively on our excursions around the city of London in 2002. Nowadays people have access to compact, electronic maps on their smartphones which are rich in social data, never become outdated and can even feature live updates on such important things as traffic information (the company Waze, which was purchased by Google in June 2013, allows you to access social info on traffic with GPS-enabled phones).

However, what good technology has given us can also be used against us. Politics aside, big data technology has also allowed security agencies to look into our activities with unprecedented impunity. Privacy policies and laws may forever be left in the dust as the pace of technological development outstrips the ability of lawmakers to protect our privacy online (if you believe in such a thing anyway). Though the aim is purely (I think) to catch the bad guy, I guess we are in real danger of being wrongly accused by overzealous/over-legislated security agencies or caught by nervous, tyrannical regimes seeking to protect their illegitimate hold on power. All thanks to big data and the technologies it has spawned.

But the ultimate problem of big data (especially of the social media variety) is its terrible need to be verified or curated (which is why crowd-sourced projects like Wikipedia still need editors, or there would be chaos) to ensure we don't end up turning noise into conclusions. Taking unverified data as real information is the biggest absurdity of the internet today. Perhaps that is the reason why teachers hate students referring to Wikipedia for information. Though I disagree with shunning Wikipedia completely, CITING it in presentations and essays is another thing entirely. It's just not there yet; give the technology time to mature and maybe...

Some also say that social media (think Twitter and Facebook, which also deal with big data produced by millions of narcissistic humans) could be the new source for ALL news. That may hold if you have friends who love tweeting the news accurately (otherwise you could end up with a skewed picture of events), but you really should verify any raw piece of data against multiple sources. Big data is nothing if there is no useful way of deriving valuable and realistic knowledge from it; that is, if there is any useful thing to be derived from it at all, which is another problem entirely.

The one thing that I find most amazing about big data is how much we humans have produced in our existence; more than 90% of it has been produced in the last several years alone. By the time my grandchildren are born, we're going to have a real problem storing that data, let alone finding ways of scrutinising it for useful patterns. But as usual, necessity will always mother an invention for the job.

Saturday 31 August 2013

The Best Analogy for the 'God Particle'

Last year there was great excitement within the circles of theoretical and particle physicists concerning the possibility that the Large Hadron Collider, the world's biggest and most powerful particle accelerator, might have given scientists enough data to clinch the formal discovery of something called the Higgs boson, otherwise known as the god particle (I always get goosebumps when I say that). Recent reviews of the data collected so far seem to indicate without a doubt that the god particle is real.

Although this is being described as probably the most sensational discovery yet in modern physics, many of us mere mortals are still unable to understand what's gotten the physics nerds at CERN (the European Organisation for Nuclear Research) all pumped up with excitement. I'm sure that if it weren't for the admittedly scary nickname we wouldn't be talking much about this particle in popular channels. Here's a quick summary:

Simply put, the god particle is so named because it is theorised that all the particles in this universe that have mass (like the protons and neutrons in atomic nuclei) owe that mass to the field associated with this one particle, dating back to when the universe formed in a hot big bang almost 14 billion years ago. It was first theorised by a man named Peter Higgs in 1964 as part of a solution for the Standard Model of particle physics, but we needed something like the Large Hadron Collider to observe the particle in the flesh.

As important as it is, however, one can't seem to find a good explanation out there of how on earth one measly particle can do all that it is theorised to do. This recent video from TEDEd provides the best analogy I have seen yet that tries to do that in simple, layman's terms. Enjoy!

Sunday 25 August 2013

Agriculture and Climate change: the *other* Inconvenient Truth

Last week I watched a very interesting talk on the TEDEd YouTube channel given by a relatively obscure chap (in the popular mind at least) called Jonathan Foley. He is the director of the Institute on the Environment at the University of Minnesota in St. Paul, MN, USA, and in this video he talks about the issue of agriculture and its effects on the environment and climate change. The title of the talk is taken from the documentary An Inconvenient Truth, which features former US vice president Al Gore talking about climate change.

I believe Foley's talk is very important for two reasons. First, in this highly misinformed age of ours, the media tends to focus on the hype of climate change. This is unfortunate because it is such a technical problem (yes deniers, I'm talking to you) that the populace fails to follow the arguments and research, and consequently ends up disputing findings by applying common sense where, scientifically speaking, one shouldn't. Climatology is a complicated thing and not something that should be abused by naysayers.

The second reason follows from the first: if you want to talk with the populace about climate change, start with the fundamentals. In this case that means the environment and agricultural practices, which affect and are in turn affected by the environment. No climate change is apparent at first, but it is still there; it's just being explained in the context of familiar human problems.

Here's the video. Enjoy!

Friday 23 August 2013

The Process of Innovation: not as straightforward as you think

Did you think all swans are white? If you did, you've just experienced the devastating black swan effect! (Commons)
The long vacation from medical school has allowed me to catch up on some good old reading. This month I'm trying to finish a beautiful philosophical work titled The Black Swan by Nassim Nicholas Taleb, a prominent author and professor of epistemology (which is simply the study of knowledge), whose witty style of discussing unintuitive philosophical concepts and passionate arguments leaves me absolutely refreshed after every reading session. In the book, Taleb explores what he calls 'the Black Swan theory', which describes probabilistic events with the following characteristics:


  1. They are rare enough to surprise us.
  2. They have drastic consequences.
  3. They are prone to erroneous retrospective rationalisation i.e. we try to explain them away after they have happened.
A good example is the 2008 financial meltdown, which resulted in a global recession that some regions, like Europe, are still trying to recover from. Nobody expected it, it hurt us, and arrogant economists are trying to explain it away (in some cases with no idea what they're talking about).

Without going into too much detail about the book, the point is that Mr. Taleb hits on an interesting idea that I had never before imagined. In chapter eleven he talks about the role of randomness in fostering scientific and technological progress in the world today. He posits that most designed technologies today are simply toys waiting for an application to appear; as he aptly puts it, 'solutions waiting for problems'.

We tend to apply retrospective historicism to explain that invention x was made to solve problem y, and so on in a neat chain towards today's progress. Taleb tries to squash that version of history and replaces it with something less romantic but probably more grounded in reality. In fact, according to him, there might be no difference between biological evolution and our technological progress.

Evolutionary theorists posit that innovations in biological design are actually the product of random mutations filtered by environmental and internal pressures such as food availability and selection during mating. He considers the laser (whose inventor was in fact ridiculed by colleagues) and Viagra (which started out as a drug for hypertension). Today we can't live without the laser, and its field of uses is still expanding; as for Viagra, it has uplifted the lives of so many men living with impotence. Or consider the internet, for that matter: it was originally designed for military purposes, and now we all live online.

Some companies even capitalise on such a process, according to Taleb; scientists can sit down in such companies and simply tinker just for the fun of it, with commercialisation coming later. Whether such a thing is feasible in the long term is unknown to me, but it certainly is tantalising to see a business model that doesn't force a scientist to become a full-time businessman.

Now, considering this seemingly random process of innovation, what do we make of something like 3D printing? Some describe it as the biggest thing since the Industrial Revolution, and it honestly looks like it. You can print anything you might fancy in 3D: your face, prototypes, shoes, designs and whatnot. Sceptics call it a gimmick, a toy with no future, a fad. In short, the solution is so elegant, but what's the use of a layman having it in his house?
An example of a 3D printer: the ORDbot Quantum printer (Commons)
Recently, a new desktop scanner has gone on sale that allows small items to be scanned by a laser in 3D so that a computer can print a 3D copy. So we now have a desktop scanner and a home-ready printer that anyone can use. Now what? Is the technology going to be for fun, or art, or what? Some analysts suggest that 3D printing is only for industry, and they may be right, but then again, who really knows? It seems to me like we're living in the age of another growing bubble: the 3D bubble. Like the dot-com bubble of the late 1990s and early 2000s, this one might burst and make a lot of investors up on Wall Street lose their breakfast.

The moral of the story is that it's very hard to predict these kinds of things. Randomness has gotten the better of us, no amount of risk management will ever remove it, and it is ludicrous to imagine that we will ever master risk management. The least we can do is enjoy the ride. At least, that's what Nassim Taleb says.

Tuesday 20 August 2013

Celebrating 136 years of Mars moon discovery

Asaph Hall (Commons)
It's been 136 years since the American astronomer Asaph Hall discovered the Martian moons Phobos and Deimos in August 1877! Today we can see the two moons from Earth, from Mars orbit and from the surface of the planet itself!

We have come a long way and we've still got loads of places to go. Like to my Mars Science Lab journal page where I've written a piece today all about the moons of Mars!

Tuesday 6 August 2013

Justifying Space Exploration

A self-portrait of the Curiosity rover on Mars. How do we justify this? (NASA/JPL/MSSS)
I remember being on a family visit at one time when I was still in my early teens. We were settling down for lunch and the table discussions inexplicably veered towards the financial justification of conducting space missions by America and other countries.

The debate split the table into two camps: for and against. Naturally, being citizens of a third world country, the latter camp was full to breaking point. Guess who was all by his lonesome self in the for camp. Moi!

I don't remember many details, but what I do know is that for the first time in my life, I felt completely hopeless in the face of open scrutiny. How could an ignorant young chap justify such expensive endeavours, ones he doesn't immediately benefit from and yet loves as much as a young person would love, say, a rock star or a professional wrestler?

Time has passed, and today America is running a one-year-old Mars rover. They're still spending on space exploration, are still a superpower by any objective standard, and we're still, *ahem*, stuck in the mud. I should feel vindicated (and indeed I do), but an explanation is in order.

To that end, I have written a short reflection on this other blog of mine as a sort of delayed response to that old debate that I couldn't have hoped to win at the time. But we must remind ourselves that in the art of important debates, it isn't a question of winning or losing but a question of clearing the air and revealing truth. That is the sign of a true knowledge gatherer!

Saturday 3 August 2013

THE IT SOCIETY: In databases we trust!

Today I wish to explore the extent to which our IT-enabled society can go in making a seemingly straightforward situation more complicated, and how we can make our technological expertise more grounded in reality.

Our modern society is what one would call a complex system rather than a linear system. In linear systems, outcomes between different variables are predictable with a clockwork regularity that can be described neatly with mathematical and physical principles; examples include classical mechanics, such as a swinging pendulum. Complex systems, on the other hand, comprise thousands if not millions of variables which interact in non-straightforward ways, with outcomes so unpredictable that the simplistic mathematical theorems you learned in class simply fall apart. Unless you develop new ways of describing these non-linear systems (like chaos theory, which I will not bother talking about here), you haven't a hope of deducing outcomes, let alone understanding the system in the first place. A classic example is the weather; forecasting the next day's weather only became easier with the advent of sophisticated computers and, more importantly for today's topic, better meteorological databases.
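To make the linear/complex distinction concrete, here is a minimal sketch in Python of the logistic map, a textbook non-linear system (my own illustration, not something from the HMIS story below): two starting values that differ by one part in a billion soon produce completely different trajectories, which is exactly why clockwork prediction fails in such systems.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n); with r = 4 it is
# a standard example of a chaotic, non-linear system.
def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.300000000)
b = trajectory(0.300000001)  # differs by one part in a billion

for n in (0, 10, 30, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}")
# After a few dozen steps the two runs bear no resemblance to each
# other: a tiny input error swamps any 'clockwork' prediction.
```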


For many scientific disciplines, you need lots of up-to-date data. Research and survey projects are a dime a dozen nowadays, but such data is useless if you don't have an efficient way to store it, clean it (removing input errors) and retrieve it easily. We usually rely on IT to do most of these things nowadays, and we are so used to hearing from manufacturers about the benefits of these machines that we might be forgiven for believing that there aren't any catches at all with regard to IT.
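As a toy illustration of that 'clean' step, here is a short Python sketch; the record fields and validity rules are invented for illustration and are not taken from any real health database.

```python
# Toy data-cleaning pass: drop records with obvious input errors
# before they pollute any downstream analysis. Fields are invented.
raw_records = [
    {"patient_id": "A102", "age": 34,  "diagnosis": "malaria"},
    {"patient_id": "",     "age": 29,  "diagnosis": "typhoid"},  # missing ID
    {"patient_id": "A103", "age": 340, "diagnosis": "malaria"},  # age typo
]

def is_valid(record):
    """Reject records with a missing ID or an implausible age."""
    return bool(record["patient_id"]) and 0 <= record["age"] <= 120

clean = [r for r in raw_records if is_valid(r)]
print(f"kept {len(clean)} of {len(raw_records)} records")  # kept 1 of 3
```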

Two months ago, on the 15th of June, I gave a review talk at my university on the subject of health information technology, or HIT. I described its usefulness and the need for its implementation in the African health arena. My talk focused heavily on the history of one particular attempt in my home country of Tanzania. Many countries require all health institutions to record outpatient and inpatient details so that abnormalities in outcomes can be tracked on a long-term basis. Typical systems employed include a centralised health database, most of which are digital in nature. In Tanzania the database (referred to as the Health Management Information System, HMIS, or MTUHA in the local language) was mostly paper-based for a good number of years after its inception in 1993, and still is at the district level. Digitisation of the HMIS records only began at the turn of the new millennium, and instead of improving data quality and efficiency of collection, IT augmentation has seemingly lowered the already poor quality of the data and has even made the system more difficult for its users, i.e. the health professionals. Researchers complain that far from being a source of life-changing knowledge, harvesting HMIS data is like looking for crops in a field of weeds. Useless!

What’s most disturbing, however, is the lacklustre enthusiasm that came out of the HMIS managing team based in the country’s ministry of health, in the form of the following quote from a 2002 report:
“Contract out the review of the current HMIS software in the light of alternative packages available, with a view to recommending the best option for the national [health] system” (MoH, 2002:16).
Less than two years after digitisation was complete at the upper levels of governance, the team overseeing HMIS operations was already contemplating changing the system’s software. More than a million U.S. dollars had been used to build a broken system!

So what went wrong? The explanation I offered in the talk was that the stakeholders involved in the design and execution of the new system failed to come together in the open. Apparently the donors funding the programme decided that it was better to give money directly to software vendors rather than risk losing it to the depths of government bureaucracy. But I insist that the more important reason is that these stakeholders failed to design the system to serve the health system effectively. IT is useless if the customer does not know how to use it, hence Samsung’s catchy phrase ‘designed for humans’ used in advertisements for the Galaxy series of smartphones. Although steps are now being taken to ensure that the next generation of Tanzanian health professionals are IT literate by introducing dedicated IT studies in the university curriculum, I think it would be better if the whole country were to prioritise IT in its national development goals, just like its neighbour Kenya, which has recently seen a surge in investments from big names like IBM, Google and Hewlett-Packard.

What we see here is a case of fundamental system design failure. The stakeholders forgot about the complexities of the African health system, hence the seemingly inexplicable nature of the digital HMIS’ outcome. Health workers are either too busy or too unfamiliar with the gadgets to work them. The solution is elegant but not necessarily simple: build a strong culture of IT, make people more aware of it, then design the system to fit the people and not the other way around. This bottom-up approach will help us avoid building loaded but useless systems that inhibit rather than enable the harvesting of life-saving health knowledge.


You can read my original paper here, where you will also find a list of good references.

Monday 29 July 2013

The Era of Knowledge Harvesters

In this present day and age, intellectual capital has increasingly taken centre stage in the global economic theatre (Commons, University of Erfurt)
There was a time when progress in the world was measured by how much output countries could produce, be it cotton, cars, raw minerals, gross domestic output, et cetera. I think this philosophy was most pronounced immediately after the industrial revolution took off in Great Britain in the 19th century, so much so that it really didn't matter whether you were a communist or an uncaring capitalist: if you didn’t make your quota, someone suffered the consequences (be it dismissal or worse).

Countries strove for better production in larger quantities for commerce and trade while seeking means to control as many natural resources as possible through sheer military and political might. But while all this was happening, something else was occurring in the background: an exponential increase in the rate of discovery. It would have been impossible to power the industrial revolution without a knowledge base in steam engines and thermodynamics, and the same goes for all the engines that came afterwards. Similarly, the material sciences gave birth to new metal alloys, including steel, which was used to make bigger and stronger structures like bridges and skyscrapers. A combination of the two, plus an improved understanding of timekeeping, allowed the creation of the railway and steam-powered ships, which revolutionised commerce and trade throughout the world.

Other discoveries like the telegraph and, eventually, the telephone gave rise to the still powerful modern north Atlantic trading bloc. The practical advantages of instantaneous communications continue to be seen today in the internet, satellite and telecom industries. Thus the knowledge market was born. Knowledge is being created every day, and bought, sold, transmitted, gathered and organised every second. New methods of recalling, organising and manipulating data, for whatever means and purposes, are required every day by individuals and institutions, and this is exactly what powers Silicon Valley and the like. Our brains are designed to comprehend things in the now, with a limited number of variables. But unfortunately for us, the world is composed of thousands of variables that act in a non-linear manner. It’s a complex system we’re dealing with here, and our heads are simply unable to take it all in without committing seriously asinine acts like the 2008 global financial meltdown. Even with the assistance of humongous computing power, we are about as graceful as a bull in a china shop. Wisdom is required to navigate this complex world of ours.

Wisdom and knowledge in their many forms can be found in the most unlikely of places. Most assuredly our ancestors were as smart as we are today; they could observe, formulate hypotheses and test them for predicted outcomes. The only difference is that with time, we have made the process into an art. Whatever our forefathers could do, we can now do better: a perfection of thought and action that has allowed our race to prosper and grow at rates never before experienced.

It has also brought forth a paradigm shift. As I said in the beginning of this essay, progress was once measured by the output of heavy industries. Now, however, heavy industries do not necessarily imply progress (to the delight of green activists I’m sure!). What we now have today is a knowledge economy that now measures progress primarily by how much intellectual capital you have. The rest can be outsourced thanks to a highly networked economy. Such a system has allowed even small countries like Israel, Rwanda and Switzerland to become highly developed (or fast growing) economies. With limited resources, they as well as many other institutions including companies are left sinking their money into people’s education. The knowledge that grows from such investments is harvested and used again for more growth. This time and era clearly belongs to the harvesters of knowledge.