|An example of Mughal art depicting the crowning of a|
new ruler. All in all, they are quite beautiful.
Sunday, 22 December 2013
Friday, 13 December 2013
Monday, 2 December 2013
This has nothing to do with knowledge harvesting whatsoever other than simply being cool!
Behold, Amazon's newest mode of delivery... via drones! Coming soon to your front door in 2015/16.
I wonder, though, how the company plans to tackle the possibility of 'drone bandits' who might feel the urge to take a lucky shot with their pigeon shooters...
At the same time, India's mission to Mars, Mangalyaan, has successfully cleared Earth orbit and is now heading for Mars, a journey that will take it about 300 days. Space is truly becoming a busy place, as busy as any other area buzzing with human activity, and we are perhaps entering a new era of space exploration: an era in which space not only becomes part of the human cosmos but an essential concern for every nation in the world. We are already seeing the beginnings of such a world in our global need for, and dependency on, environmental data and space-based telecoms capability, and maybe someday soon, raw materials.
No boost to interplanetary sojourns is complete without a look back home; here's a parting shot of the Earth taken by Mangalyaan almost ten days ago, a sort of visual salute to the land of its creators, no doubt.
|Earth by Mangalyaan, taken on November 20, 2013 (ISRO)|
Tuesday, 19 November 2013
At first glance these rather cute robots may look like ordinary children's playthings, but they actually serve a serious and far-reaching purpose. Designed for children aged five and above, these robots can be programmed by the child to do all sorts of things (like acting out a story). The child essentially becomes a hacker while playing, learning how to conceptualise programming with languages like Scratch. Awesome!
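To give a flavour of what 'programming by playing' might look like, here is a toy sketch in Python rather than Scratch. The command names and robot behaviours here are entirely my own invention for illustration, not Play-i's actual API:

```python
# A toy "robot" interpreter: the kind of command sequence a child might
# assemble in a block language like Scratch (hypothetical commands, not
# Play-i's real programming interface).
def run(robot_name, commands):
    story = []
    for cmd in commands:
        if cmd == "forward":
            story.append(f"{robot_name} rolls forward")
        elif cmd == "beep":
            story.append(f"{robot_name} beeps happily")
        elif cmd == "spin":
            story.append(f"{robot_name} spins around")
    return story

# The child "acts out a story" by ordering simple commands.
print(run("Bo", ["forward", "beep", "spin"]))
```

The point is not the code itself but the habit of mind: decomposing a story into a sequence of unambiguous instructions.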
The robots reflect the philosophy of their maker, Play-i, and indeed of many education experts today: in the post-computer era, should it not be a priority to educate children not only to use computing devices but also to program them? This is no more far-fetched than teaching children not just to read other people's books and papers, but also to pick up a pen and trace letters, words and finally ideas on a piece of paper. Such knowledge requires mastering a language, and the same goes for computing.
Some may dismiss this idea as irrelevant to the modern child, but perhaps they wouldn't if they understood what actually drives computers. It is a language that, unfortunately, bodes ill for many children, and it is called mathematics. Learning more about computers and programming should therefore make you a better maths student, right? The idea is tantalising and is endorsed by many luminaries in the mathematical field. For good or ill, our thinking is being shaped by these thinking machines just as much as we shape their creation, and Play-i's Bo and Yana are a testament to that end.
Monday, 28 October 2013
Because I have now started clinical rotations for this academic year, my posts will consequently be fewer and more haphazard in frequency. I thank my readers for their support in making this project worth the effort. Please continue to visit; you can keep track by bookmarking the blog or using your favourite RSS application.
While going through my weekend net readings, I was simply delighted by this very engaging article from The Verge featuring Microsoft cofounder Paul Allen's thoughts on machine learning and its potential to impact our lives. But to truly understand the nature of what we're dealing with here, I wish to touch on a couple of other articles that serve to augment his thoughts.
Tuesday, 15 October 2013
|Portrait of Ada Lovelace (Commons)|
Ada Lovelace is an important historical figure from the early days of computing in two fundamental ways. First, she was the first person to write and describe what could simply be called a program (a set of instructions telling the computer what to do) that could be run on Babbage's Analytical Engine; in this case it was an algorithm that computes a sequence of rational numbers called the Bernoulli numbers.
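As an aside, the Bernoulli numbers that Lovelace's program targeted can be generated today in a few lines of Python. This is a modern sketch using their standard recurrence, not a transcription of her actual Note G diagram:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (using the B_1 = -1/2 convention)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for B_m.
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B[m] = -acc / (m + 1)
    return B

# B_2 = 1/6, B_4 = -1/30; odd-indexed values beyond B_1 are zero.
print(bernoulli(6))
```

A few lines on a modern machine, versus pages of painstaking tables for the Analytical Engine; the underlying algorithmic idea, however, is the same.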
While this proves her prowess as a mathematician, her truest (and undisputed) contribution came in the form of a leap of thought while observing Babbage's engines. While Babbage (being a number geek) was interested in his engines being nothing but proficient handlers of big numbers, Lovelace thought beyond this; she envisioned a computing device which could process, say, sounds of varying characteristics. This was a gigantic leap of thought, because it allows you to invent something like a music-processing system. Outstanding!
Today, this exemplary act of feminine ingenuity and scholarship is celebrated every year in mid-October as Ada Lovelace Day, which honours women's contributions to the advancement of human knowledge. I only discovered this today through Google+'s trending hashtags (#AdaLovelaceDay). This is quite an important event and I encourage you to talk about it amongst your social media peers, family and friends. It's important that when we engage in knowledge harvesting, EVERYONE participates; that way we bring in different, important ways of looking at problems and finding solutions to them. Oh, and while you're doing that, spare a minute to think of at least one famous female scientist. If you can, keep searching for more. If you can't, now's the time to take a plunge into discovery!
Sunday, 13 October 2013
|Is such a voyage unreasonable?|
Artist's depiction of the Voyager spacecraft.
However, due to the unknown nature of the region the spacecraft is surveying, it took the mission team almost a year of studying the data after the first suggestive signs of the crossing before they could confidently announce to the public that humanity has become an interstellar-faring race. Undoubtedly, this is a remarkable achievement, not just because we have proven that it is possible to send a craft to the distant reaches of our sun's domain, but because it awakens that deep sense of the unknown inside all of us. Whenever we progress into unfamiliar territory, be it worlds, continents, life stages or situations, there is that deep, exotic feeling one gets: a mixture of hope, awe and apprehension. We have reached the edge of what we know; now we venture into uncharted waters.
This drive to reach the edge of knowledge has, for good or for ill, driven a lot of humanity's doings: from the European discovery of the New World, to the exploration of the inner space beneath the seas, to the venturing of humans and mechanical emissaries into space, to our peering into the distant sky to fathom the heavenly domains. What we gain from all this is nothing short of meaningful progress, the very same progress that has allowed us to tame nature (somewhat) and allowed us and our children to thrive and live more comfortably. As the famous playwright and political activist George Bernard Shaw once put it so eloquently, "The reasonable man adapts himself to the conditions that surround him... The unreasonable man adapts surrounding conditions to himself... All progress depends on the unreasonable man." The Voyager mission, itself an unreasonable mission, is truly a work of unreasonable people!
So, we are left with the question; do we owe our comfort today to the people who accepted things as the will of the universe/deities or to the people who went about asking seemingly unreasonable questions, and then ventured out to the edge of the known to find the answer? I believe the answer goes without saying.
Meanwhile, Voyager 1 will continue gathering data on this unexplored region of the sun's domain (it can still be regarded as the sun's domain despite the crossing, because the sun's gravitational influence extends much farther out, up to the Oort cloud where the majority of our solar system's cometary bodies reside; Voyager has yet to leave that area of influence, as illustrated below) until its radioisotope thermoelectric generator stops producing power sometime in the mid-2020s. From then on, she will continue drifting away from us, a silent emissary to the stars.
Learn more about Voyager 1 and her sister craft Voyager 2 here.
|Where Voyager 1 is as of 2013. (NASA/JPL)|
Friday, 11 October 2013
|The resignation of Romulus|
Augustus to the Germanic soldier Odoacer
traditionally marks the end of the Western
Roman Empire. (Commons)
Saturday, 14 September 2013
Big data is just what it is: enormous amounts of raw, unprocessed information that cannot be sorted or dealt with using traditional, hands-on means. No human can look at big data and derive any real meaning from it; big data is usually handled with the help of computers. Examples of big data being used today abound in the world (meteorological data being used to forecast the weather; the richer the data set, the more accurate the forecast), and the problem has also given birth to many companies, the most commonly cited being Google (whose very name is derived from the term 'googol', which is 10^100). The company's computers crawl through the web, indexing trillions of pages to make them searchable for all of us. Despite that, the internet remains far from being entirely indexed, for two reasons: primo, it's growing every day, and secundo, it's so damn big. As a side note, check out this nifty website to see how big the internet probably is (I say probably because the map is obviously incomplete).
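The indexing idea mentioned above can be sketched as a toy 'inverted index', the classic data structure behind searchable text. This is a minimal illustration with made-up page contents, nothing like Google's real pipeline:

```python
from collections import defaultdict

# Two made-up "web pages" (hypothetical ids and contents, for illustration).
pages = {
    "p1": "big data needs computers",
    "p2": "computers index the web",
}

# Build a toy inverted index: each word maps to the set of pages containing it.
index = defaultdict(set)
for page_id, text in pages.items():
    for word in text.split():
        index[word].add(page_id)

# A "search" is now just a dictionary lookup instead of rescanning every page.
print(sorted(index["computers"]))  # ['p1', 'p2']
```

The crawler's job, at heart, is building something like this at planetary scale, which is why keeping it current against a web that grows daily is such a struggle.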
Big data has existed ever since anything existed in this universe; the universe and all its contents have been around for 13.7 billion years. The trick is capturing that data, then storing and analysing it for useful patterns. The tools for doing so have arrived only recently, in the form of cheap, compact storage devices, powerful processors, networking technologies and, most importantly perhaps, necessity.
To illustrate the importance of necessity's role in spurring the development of tools for handling big data, take a look at this TEDEd video:
Ever since IT became a ubiquitous part of our lives, big data has become relatively easy to collect. Mapping has enjoyed the fruits of big data: the maps of today are no longer the physical pieces of paper that my parents and I used extensively on our excursions around London in 2002. Nowadays people have access to compact, electronic maps on their smartphones which are rich in social data, never become outdated and can even feature live updates on important things like traffic (Waze, which was purchased by Google in June 2013, lets you access crowd-sourced traffic information on GPS-enabled phones).
However, what good technology has given us can also be used against us. Politics aside, big data technology has allowed security agencies to look into our activities with unprecedented impunity. Privacy policies and laws may forever be left in the dust as the pace of technological development outstrips lawmakers' ability to protect our digital privacy online (if you believe in such a thing anyway). Though the aim is purely (I think) to catch the bad guy, we are in real danger of being wrongly accused by overzealous or over-legislated security agencies, or caught by nervous, tyrannical regimes seeking to protect their illegitimate hold on power. All thanks to big data and the technologies it has spawned.
But the ultimate problem of big data (especially of the social media variety) is its pressing need to be verified and curated (that's why crowd-sourced projects like Wikipedia still need editors, or there would be chaos) to ensure we don't end up turning noise into conclusions. Taking unverified data as real information is the biggest absurdity of the internet today. Perhaps that is the reason why teachers hate students referring to Wikipedia for information. Though I disagree with shunning Wikipedia completely, CITING it in presentations and essays is another thing entirely; it's just not there yet. Give the technology time to mature and maybe...
Some also say that social media (think Twitter and Facebook, which also deal with big data produced by millions of narcissistic humans) could become the new source for ALL news. That may work if you have friends who love tweeting the news accurately, but you could just as easily end up with a skewed picture, so you really should verify each raw piece of data against multiple sources. Big data is nothing if there is no useful way of deriving valuable and realistic knowledge from it; whether there is anything useful to be derived at all is another problem entirely.
The one thing I find most amazing about big data is how much of it we humans have produced in our existence: more than 90% of it in just the last several years. By the time my grandchildren are born, we're going to have a real problem simply storing that data, let alone finding ways to scrutinise it for useful patterns. But as usual, necessity will always mother an invention for the job.
Saturday, 31 August 2013
Although this is being described as probably the most sensational discovery yet in modern physics, many of us mere mortals are still unable to understand what has gotten the physics nerds at CERN (the European Organisation for Nuclear Research) all pumped up with excitement. I'm sure that if it weren't for the admittedly scary nickname, we wouldn't be talking much about this particle in popular channels. Here's a quick summary:
Simply put, the god particle is so named because it is theorised that all the particles in this universe that have mass (like the constituents of the protons and neutrons in atomic nuclei) owe that mass to the field associated with this one particle, dating back to the universe's formation in a hot big bang almost 14 billion years ago. It was first theorised by a man named Peter Higgs in 1964 as part of a solution within the standard model of particle physics, but we needed something like the Large Hadron Collider to observe the particle in the flesh.
As important as it is, however, one can't seem to find a good explanation out there of how on earth one measly particle can do all that it is theorised to do. This recent video from TED-Ed provides the best analogy I have yet seen that attempts to do so in simple, layman's terms. Enjoy!
Sunday, 25 August 2013
I believe Foley's talk is very important for two reasons. First, in this highly misinformed age of ours, the media tends to focus on the hype of climate change, which is unfortunate because it is such a technical problem (yes deniers, I'm talking to you) that the populace fails to follow the arguments and research, and consequently ends up disputing findings because they applied common sense where, scientifically speaking, one shouldn't. Climatology is a complicated thing and not something that should be abused by naysayers.
The second reason follows from the first: if you want to talk to the populace about climate change, start with the fundamentals. In this case that means the environment and the agricultural practices which affect, and are in turn affected by, it. No climate change is apparent at first, but it is still there; it's just being explained in the context of familiar human problems.
Here's the video. Enjoy!
Friday, 23 August 2013
|Did you think all swans are white?|
If you did, you've just experienced the
devastating black swan effect! (Commons)
- They are rare enough to surprise us.
- They have drastic consequences.
- They are prone to erroneous retrospective rationalisation, i.e. we try to explain them away after they have happened.
Without going into too much detail about the book, the point is that Mr. Taleb hits on an interesting idea that I had never before imagined. In chapter eleven he talks about the role of randomness in fostering scientific and technological progress in the world today. He posits that most technologies designed today are simply toys waiting for an application to appear; as he aptly puts it, 'solutions waiting for problems'.
We tend to apply retrospective historicism to explain that invention x was made to solve problem y, and so on in a neat chain towards today's progress. Taleb tries to squash that version of history and replaces it with something less romantic but probably more grounded in reality. In fact, according to him, there might be no difference between evolution and our progress.
Evolutionary theorists posit that innovations in biological design are actually the product of random mutations filtered by environmental and internal pressures, such as food availability and selection during mating. Taleb considers the laser (whose inventor was in fact ridiculed by colleagues) and Viagra (which started out as a drug for hypertension). Today we can't live without lasers, and their field of uses is ever expanding; as for Viagra, it has uplifted the lives of so many men living with impotence. Or consider the internet, for that matter: it was originally designed for military purposes, and now we all live online.
Some companies even capitalise on such a process, according to Taleb; their scientists can sit down and simply tinker just for the fun of it, with commercialisation coming later. Whether such a thing is feasible in the long term is unknown to me, but it certainly is tantalising to see a business model that doesn't force a scientist to become a full-time businessman.
Now, considering this seemingly random process of innovation, what do we make of something like 3D printing? Some describe it as the biggest thing since the Industrial Revolution, and it honestly looks like it: you can print anything you might fancy in 3D, your face, prototypes, shoes, designs and whatnot. Sceptics call it a gimmick, a toy with no future, a fad. In short, the solution is so elegant, but what's the use of a layman having it in his house?
|An example of a 3D printer: the ORDbot Quantum printer|
The moral of the story is that it's very hard to predict these kinds of things. Randomness has gotten the better of us; no amount of risk management will ever remove it, and it is ludicrous to imagine that we will ever master it. The least we can do is enjoy the ride. At least, that's what Nassim Taleb says.
Tuesday, 20 August 2013
|Asaph Hall (Commons)|
We have come a long way and we've still got loads of places to go. Like to my Mars Science Lab journal page where I've written a piece today all about the moons of Mars!
Tuesday, 6 August 2013
|A self-portrait of the Curiosity rover|
on Mars. How do we justify this?
The debate split the table into two camps, for and against. Naturally, being citizens of a third-world country, the latter camp was full to breaking point. Guess who was all by his lonesome self in the 'for' camp. Moi!
I don't remember many details, but what I do know is that for the first time in my life, I felt completely hopeless in the face of open scrutiny. How could an ignorant young chap justify such expensive endeavours, ones he doesn't immediately benefit from and yet loves as much as a young person would love, say, a rock star or a professional wrestler?
Time has passed, and today America is running a year-old Mars rover. They're still spending on space exploration and still a superpower by any objective standard, while we're still, *ahem*, stuck in the mud. I should feel vindicated (and indeed I do), but an explanation is in order.
To that end, I have written a short reflection on my other blog as a sort of delayed response to that old debate, which I couldn't have hoped to win at the time. But we must remind ourselves that in the art of important debates, it isn't a question of winning or losing but of clearing the air and revealing the truth. That is the sign of a true knowledge gatherer!
Saturday, 3 August 2013
Today I wish to explore the extent to which our IT-enabled society can make a seemingly straightforward situation more complicated, and how we can make our technological expertise more grounded in reality.

Our modern society is what one would call a complex system rather than a linear system. In linear systems, the outcomes of interactions between variables are predictable with a clockwork regularity that can be described neatly with mathematical and physical principles; examples include classical mechanics, such as a swinging pendulum. Complex systems, on the other hand, comprise thousands if not millions of variables which interact in non-straightforward ways, with outcomes so unpredictable that the simplistic mathematical theorems you learned in class simply fall apart. Unless you develop new ways of describing these non-linear systems (like chaos theory, which I will not bother discussing here), you haven't a hope of deducing outcomes, let alone understanding the system in the first place. A classic example is the weather: forecasting the next day's weather only became easier with the advent of sophisticated computers and, more importantly for today's topic, better meteorological databases.
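To make the linear/complex distinction concrete, here is a tiny Python sketch of my own (not from any talk or source I'm citing): the logistic map, a one-line equation from chaos theory whose outcomes become unpredictable because minuscule differences in starting conditions explode over time:

```python
def logistic_trajectory(x0, r=4.0, steps=40):
    """Iterate the logistic map x -> r * x * (1 - x), a textbook chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by only one part in a million...
a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)

# ...soon produce wildly different trajectories.
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.1e}, largest gap later on: {max(gaps):.3f}")
```

A pendulum with a slightly nudged starting angle swings almost identically; this system does not. That sensitivity, multiplied across millions of interacting variables, is roughly why weather and societies resist the tidy equations of the classroom.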
Less than two years after digitisation was completed at the upper levels of governance, the team overseeing HMIS operations was already contemplating changing the system's software. More than a million U.S. dollars had been used to build a broken system! "Contract out the review of the current HMIS software in the light of alternative packages available, with a view to recommending the best option for the national [health] system" (MoH, 2002:16).
So what went wrong? The explanation I offered in the talk was that the stakeholders involved in the design and execution of the new system failed to come together in the open. Apparently the donors funding the programme decided that it was better to give money directly to software vendors rather than risk losing it to the depths of government bureaucracy. But I insist that the more important reason is that these stakeholders failed to design the system to serve the health system effectively. IT is useless if the customer does not know how to use it; hence Samsung's catchy phrase 'designed for humans' in its advertisements for the Galaxy series of smartphones. Although steps are now being taken to ensure that the next generation of Tanzanian health professionals is IT literate, by introducing dedicated IT studies in the university curriculum, I think it would be better if the whole country were to prioritise IT in its national development goals, just like its neighbour Kenya, which has recently seen a surge in investments from big names like IBM, Google and Hewlett-Packard.
Monday, 29 July 2013
|In this present day and age, intellectual capital|
has increasingly taken centre stage
in the global economic theatre
(Commons, University of Erfurt)