Showing posts with label Science. Show all posts

Tuesday, 3 October 2017

The End of Cassini: Why We Must Keep Exploring

An illustration of Cassini’s “Grand Finale” at Saturn (NASA/JPL)

Semper Exploro. Always exploring. The spirit of this motto filled my heart as I watched the planned disintegration of the 20-year-old Saturn orbiter Cassini unfold on social media. When the spacecraft finally stopped signalling the homeworld at 11:55 UTC on 15 September, I realised that, although the mission was a resounding success, the spirit of Semper Exploro demands that we send another probe in its place. After all, so much remains undiscovered in the Saturnian system: what lies under the icy crust of the geyser-spewing moon Enceladus? What do the seas of Titan actually look like? Cassini has left us with even more questions than it carried when it entered orbit in 2004 after a seven-year journey across the solar system. And those questions should be answered not merely because they are scientifically relevant (that much is a given), but because the spirit of exploration which Cassini embodied, and which Semper Exploro captures so well, has the potential to unite humanity and bring out the best in us in more ways than any formal ideological framework can. And we need a better alternative today, now more than ever.

Exploring can be a risky venture, but it's a worthy risk. Indeed, an explorer's death may be the only kind of death worthy of glorification. We stand today only because a few men and women risked it all stepping into the unknown, in all kinds of fields. With exploration comes advancement. With curiosity come the gifts of innovation. With ventures comes prosperity. With each expedition into the unknown comes priceless knowledge that uplifts us all as a species. How much more prosperous would we be today had we redirected all the effort we spend killing and dominating one another towards settling space, curing illnesses and so forth? The most logical answer: many times over, perhaps a thousandfold.

Film works like Star Trek that portray the future as an advanced utopia are sometimes criticised for being overoptimistic, naive and ignorant of 'reality on the ground'. But reality is only what we allow it to be. If we want, we could have a Star Trek kind of future right now. If we want, we could have institutions whose only purpose is to explore, discover and advance peaceful prosperity across the stars. Yes, even with our severely flawed humanity, we can still have our cake and eat it IF we believe we can. After all, this flawed humanity is the same humanity that created the better present we see today. And experience can serve as a catalyst; people today are more motivated than ever to improve not just their own circumstances but those of everyone around them, simply because of inspired hope. The momentum created by this hopeful belief means that, for the most part, we have nowhere to go but up.

We cannot hope for a smooth ride to the future. But fantastic endeavours like the Cassini mission can help remind us of, and solidify, our shared desire as a species: to see what's on the other side of the distant horizon. To learn and grow wiser. And to do it together with everyone around us.

If we can just keep exploring, perhaps one day we might discover an even better version of ourselves than we could ever have imagined. But we must keep exploring. Semper exploro. Farewell, Cassini.

Wednesday, 12 April 2017

Our World is Changing: How Might We Master Change?

The Libyan Revolution, an example of change gone wrong. Image shows a tank outside the city of Misrata (Wikimedia Commons)
Change is guaranteed in Nature. Matter undergoes state transitions, ecosystems suffer violent episodes and recover, and your body turns over its red blood cells roughly every 120 days.
Even aspects of our personhood are in a constant state of flux; from childhood to puberty, from adulthood to old age, what we might consider as ‘us’ never stays quite the same. We always seem to find an alien of sorts to cringe at when staring at records of our past selves.

Far from being an existential conundrum, change in the self is evidently our most precious characteristic, imparting a degree of adaptability that has allowed us to survive a world where Change rules supreme. Besides, in a world where humanity was immutable in belief and structure, would we discover anything new? Would we discover that tenacious form of 'human spirit'?

And yet, despite all this, we also happen to live in a world where change receives a very mixed reception. Few of us embrace change as we would embrace life itself. Even when most people would prefer change over the status quo, there are always those who would prefer the past over the present and its futures, no matter what.

The concepts of 'tradition', 'custom' and perhaps even animism evolved partly out of our natural desire to stay connected to the past. This is something to celebrate, but tradition that fails to acknowledge change tends to breed instability in the long run, as pressure inevitably builds between the forces for change and the forces of the status quo.

Therefore, in a world gripped by ever-morphing new challenges, it is imperative that we seek to master the science of change through the creation of "tools for mastering Change". Our present tools and structures for managing ourselves (i.e. governments, institutions, policies, etc.) thrive on creating artificial stability. They are fragile. A single parameter change can quickly and violently end things (the global debt market is but one example), leaving us helpless and unprepared, ripe for extinction.
The astute reader will note, however, that I draw a distinction between tools for creating change and tools for mastering it. Any fool can create change, and the tools for manifesting change already exist; the internet is the tamer kind and modern weaponry the deadliest. What we lack are the tools to manage and understand change. Consequently, while we can set change in motion amongst ourselves (revolutions, protests for policy change, lobbying), we are COMPLETELY blind to the question of what those changes might entail and how to implement them without killing ourselves in the end.

To put it simply, while it is easy for us to set change in motion, it is difficult to manage that change once it is out in the wild. We are fragile glass figures that shatter easily, and shatter each other in the process, all in the name of unquestioned, unmanaged, undisciplined change.

Mastering change is about more than merely predicting what lies ahead (an unreliable science in the hands of fools); it's about embracing change the way we would want to live a good life without being hurt in the process. It is about us, as a species, learning to drive confidently into this brave new world of ours while not under the influence of our naive selves.

Wednesday, 1 January 2014

Calendars, Clocks and the joys of Time keeping

A sundial (Photo taken by Alexandre Mirgorodski)
I'd like to wish all the readers of this blog a very happy and prosperous New Year, and to indulge a little of your time to talk about the very thing that allows us to have these celebrations in the first place: the science of chronometry, the science of timekeeping. This is not to be confused with horology, which is the study of the instruments used to measure time.

Timekeeping is one of the most fastidious activities you can ever engage in. This is because time is a continuous quantity; you can keep on dividing it until you end up with extremely small measurements like microseconds (millionths of a second), nanoseconds (billionths) or even picoseconds (trillionths), depending on the level of accuracy you want.

What's worse, we have come to realise that time is not an absolute dimension but a relative one, i.e. its rate of progression changes with the distortion of the space-time continuum (imagine reality as a fabric with time as one of its threads; gravitational distortion changes the appearance of reality as well as the flow of time within it). These distortions are not noticeable on your dependable Rolex for the simple reason that it isn't accurate enough: you need an extremely precise measurement to detect the subtle changes in the flow of time due to the gravitational distortion brought about by the Earth. The Global Positioning System satellites carry such exquisitely accurate clocks that they have to take this discrepancy into account when calculating positions on the ground, where time moves slightly slower.
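To get a feel for the size of this discrepancy, here is a rough back-of-the-envelope sketch in Python (the orbital radius and constants are approximate textbook values, not mission data): it estimates how many microseconds a GPS satellite clock gains per day relative to a clock on the ground, combining the gravitational (general-relativistic) speed-up with the orbital-velocity (special-relativistic) slow-down.

```python
# Approximate physical constants (SI units)
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0     # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_sat = 2.6571e7      # GPS orbital radius (~20,200 km altitude), m

# Gravitational term: a clock higher in the gravity well ticks faster
grav = (GM / c**2) * (1 / R_earth - 1 / r_sat)

# Velocity term: a moving clock ticks slower (v^2 = GM/r for a circular orbit)
kinematic = (GM / r_sat) / (2 * c**2)

net_fractional = grav - kinematic       # net fractional rate difference
gain_per_day = net_fractional * 86_400  # seconds gained per day

print(f"GPS clock gains ~{gain_per_day * 1e6:.1f} microseconds per day")
# → roughly 38 microseconds per day, which the GPS system must correct for
```

Tiny as that sounds, an uncorrected 38 microseconds per day would translate into kilometres of positioning error, since light covers about 11 km in that time.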

Also, just to muddy the waters a bit, the standards we use to measure time and to set our various clocks by tend to gain or lose time as they keep on ticking. These errors accumulate and can cause us to become hopelessly disconnected from reality.
A chip-sized atomic clock; these are some of the most powerful timekeepers around. (Commons)

Speaking of reality, we do like these tick-tocking standards to march to the tune of our everyday lives: we wish to wake up when the sun is actually up and not before, we would like the clock to be accurate enough to tell our computers to start trading with the stock markets the moment the opening bell rings, and astronomers would like that little star in the sky to be right where they expect it to be. It is incredible to realise how dependent humans have become on clocks to measure and predict so many regular aspects of our lives. With time, our clocks have become more elaborate (think computers), but we are still vexed by the need to accurately match this regularity to human nuances (think time zones, calendar types and the bizarre daylight saving time). Our desire to mark out cyclical events in our lives (the calendar, or almanac) sometimes forces us to reconcile natural temporal variations with our far more accurate timekeeping. This brings out very interesting problems.
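One of those interesting problems is easy to demonstrate. Here is a small Python sketch (using the standard-library zoneinfo module, available since Python 3.9) of a classic daylight-saving gotcha: subtracting two local timestamps that straddle a spring-forward transition gives a wall-clock answer, not the real elapsed time.

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")

# Midnight and 6 a.m. local time on 9 March 2014 --
# the night New York's clocks jumped from 02:00 straight to 03:00.
start = datetime(2014, 3, 9, 0, 0, tzinfo=ny)
end = datetime(2014, 3, 9, 6, 0, tzinfo=ny)

# Both datetimes share the same tzinfo, so Python subtracts
# wall-clock readings and reports six hours...
print(end - start)  # 6:00:00

# ...but converting through UTC reveals only five hours really passed.
elapsed = end.astimezone(timezone.utc) - start.astimezone(timezone.utc)
print(elapsed)  # 5:00:00
```

The same calendar quirks bite scheduling software, billing systems and anything else that assumes every local day is 24 hours long.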

Consider, for example, a software developer trying to write an app that needs to do some accurate international timekeeping. The result is illustrated very well in this video. Happy 2014, everybody!


P.S. That little reference about Google's way of dealing with some of the issues mentioned can be found here.

Friday, 13 December 2013

The Death of the Universe

In 1999, I begged my mom to buy me a book on the M.V. Doulos, the famous floating library. It was an introductory astronomy textbook for college students, and I loved it (I was 8 years old at the time).

One of the most fascinating concepts I discovered while reading the chapters on stellar workings and cosmology (the study of the evolution of the universe) was the idea that not only do stars die (including our own), but that the universe too will die. This, of course, depended on the universe's present properties, which had yet to be fully explored at the time of the book's publication.

Today, the answer to the question of how our universe will die is a little more certain, if no less mysterious, than I thought before. This TEDEd video sums it all up. Enjoy!

Monday, 2 December 2013

New Explorers of the Final Frontier

The new month has begun with a bang with the launch of China's latest in a series of missions to Earth's natural satellite, Chang'e 3, carrying its surface rover, Yutu.



At the same time, India's mission to Mars, Mangalyaan, has successfully cleared Earth orbit and is now heading for Mars, a journey that will take some 300 days. Space is truly becoming a busy place, as busy as any other area buzzing with human activity, and we are perhaps entering a new era of space exploration: an era where space not only becomes part of the human cosmos but an essential part of life for every nation in the world. We are seeing the beginnings of such a world in our global need for, and dependency on, environmental data and space-based telecoms capability, and maybe someday soon, raw materials.

No boost to interplanetary sojourns is complete without a look back home; here's a parting shot of the Earth taken by Mangalyaan about 10 days ago. A visual salute to the land of its creators, no doubt.
Earth by Mangalyaan, taken on November 20, 2013 (ISRO)
You can read details on the Indian Mars Orbiter Mission here and about Chang'e 3 here.

Monday, 28 October 2013

Weekend Review: Machine Learning and its Possibilities

Because I have now started clinical rotations for this academic year, the number of posts I can write will be curtailed and their frequency haphazard. I thank my readers for their support in making this project worth the effort. Please continue to visit; you can keep track by bookmarking or using your favourite RSS application.

While going through my weekend net readings, I was simply delighted by this very engaging article from The Verge featuring Microsoft cofounder Paul Allen's thoughts on machine learning and its potential to impact our lives. But to truly understand the nature of what we're dealing with here, I wish to touch on a couple of other articles that augment Paul Allen's thoughts.

Machine learning has much to do with the famous English mathematician Alan Turing. His seminal 1950 paper, 'Computing Machinery and Intelligence', is still cited today. In this BBC news feature from earlier this month, I discovered to my surprise (surprise because I am, of course, NOT a learned computer scientist, only a humble knowledge harvester/doctor) that Turing had tried to refute some of the claims put forth by an equally famous predecessor, Ada Lovelace, regarded by some as history's first computer programmer. It appears Ms. Lovelace was of the opinion that computing machines can never give us surprising insights; they can only put forth what we expect of them. While this seems straightforward to many of us, it didn't ring true to Mr. Turing. He proposed that if the computational power of computers continues to increase with time, what's to stop them from becoming as sophisticated as the human brain and (like the aforementioned organ) coming up with some surprising ideas of their own?

Even today, it is argued that Google's autocomplete function can sometimes suggest insightful queries that we, the searchers, might never have thought to ask. All this brings us back to Paul Allen; although the man is a supporter of artificial intelligence development and has a good number of institutions under his name doing just that, he rejects the idea that computers will soon (as in less than a century from now) match or even outstrip the computing power of their creators' brains, the so-called 'Singularity' event. He offers several points to support his rather surprising stance.

In good academic fashion, Ray Kurzweil, the originator of the term 'Singularity', offers a response to Allen's refutation, citing several counterpoints and suggesting that his opponent may have misunderstood the crux of the problem. It is not my job to determine who is right and who is wrong; I'll leave that to you. But what is agreed upon is that computers are indeed getting more powerful every day. We are promised opportunities (like intelligent space probes that will explore the galaxy for us) and threats (like autonomous drones that will decide for themselves whether or not to kill). As a result, the future seems murkier and more wonderful than ever before.

Tuesday, 15 October 2013

Celebrating Ada Lovelace

Many people might know Charles Babbage, the designer of the famed difference and analytical engines (common examples of the first mechanical computers, which could have worked had they actually been built). But have you ever heard of a woman named Ada Lovelace?
Portrait of Ada Lovelace (Commons)

Ada Lovelace is an important historical figure in the early days of computing in two fundamental ways. First, she was the first person to write and describe what could simply be called a program (a set of instructions telling the computer what to do) that could be run on Babbage's Analytical Engine; in this case, an algorithm that computes a sequence of rational numbers called the Bernoulli numbers.
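For the curious, the numbers Lovelace's program targeted can be computed in a few lines of modern Python (a loose illustration of what her famous Note G was after, not a reconstruction of her actual method), using the standard recurrence that defines the Bernoulli numbers:

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> Fraction:
    """Return the nth Bernoulli number as an exact fraction.

    Uses the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1,
    which lets each B_m be solved from all the ones before it.
    """
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B[n]

print(bernoulli(2))  # 1/6
print(bernoulli(4))  # -1/30
```

(This follows the convention where B_1 = -1/2; the odd-indexed numbers beyond that are all zero.) That a general-purpose machine could grind through such a recurrence mechanically is precisely what Lovelace demonstrated on paper.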

While this proves her prowess as a mathematician, her truest (and undisputed) contribution came in the form of a leap of thought while observing Babbage's engines. While Babbage (being a number geek) was interested in his engines being nothing more than proficient handlers of big numbers, Lovelace thought beyond this: she envisioned a computing device that could process, say, sounds of varying characteristics. This was a gigantic leap of thought, because it allows you to invent something like a music-note processing system. Outstanding!

Today, this exemplary act of ingenuity and scholarship is celebrated every year in mid-October as Ada Lovelace Day, which honours women's contributions to the advancement of human knowledge. I only discovered this today through Google+'s trending hashtags (#AdaLovelaceDay). This is quite an important event, and I encourage you to talk about it amongst your social media peers, family and friends. It's important that when we engage in knowledge harvesting, EVERYONE participates. That way we bring in importantly different ways of looking at problems and finding solutions to them. Oh, and while you're doing that, spare a minute to think of at least one famous female scientist. If you can, keep searching for more. If you can't, now's the time to take a plunge into discovery!

Sunday, 13 October 2013

The Unreasonable Unknown

Is such a voyage unreasonable? Artist's depiction of the Voyager spacecraft. (NASA/JPL)
In late August 2012, after 35 years of travelling through space, NASA's veteran spacecraft Voyager 1 opened a new chapter in our species' long story of exploration by crossing, and consequently mapping for the first time, what is believed to mark the end of the sun's sphere of magnetic influence: the heliopause.

However, due to the unknown nature of the region the spacecraft is surveying, it took the mission team almost a year from seeing the first suggestive signs of the crossing to study the data before they could confidently announce to the public that humanity has become an interstellar-faring race. Undoubtedly, this is a remarkable achievement, not just because we have proven it possible to send a craft to the distant reaches of our sun's domain, but because it awakens that deep sense of the unknown inside all of us. Whenever we progress into unfamiliar territory, be it worlds, continents, life stages or situations, there is that deep, exotic feeling one gets: a mixture of hope, awe and apprehension. We have reached the edge of what we know; now we venture into uncharted waters.

This drive to reach the edge of knowledge has, for good or for ill, driven a lot of humanity's doings: from the European discovery of the New World, to the exploration of the inner space under the seas, to the venturing of humans and mechanical emissaries into space, to our peering into the distant sky to fathom the heavenly domains. What we gain from all this is nothing short of meaningful progress, the very same progress that has allowed us to tame nature (somewhat) and allowed us and our children to thrive and live more comfortably. As the famous playwright and political activist George Bernard Shaw once put it so eloquently, "The reasonable man adapts himself to the conditions that surround him... The unreasonable man adapts surrounding conditions to himself... All progress depends on the unreasonable man". The Voyager mission, itself an unreasonable mission, is truly a work of unreasonable people!

So we are left with the question: do we owe our comfort today to the people who accepted things as the will of the universe or the deities, or to the people who went about asking seemingly unreasonable questions and then ventured out to the edge of the known to find the answers? I believe the answer goes without saying.

Meanwhile, Voyager 1 will continue gathering data on this unexplored region of the sun's domain (it can still be regarded as the sun's domain despite the crossing, because the sun's gravitational influence extends much farther out, to the Oort cloud where the majority of our solar system's cometary bodies reside; Voyager has yet to leave that area of influence, as illustrated below) until its radioisotope thermoelectric generator stops producing enough power sometime in the mid-2020s. From then on, she will continue drifting away from us, a silent emissary to the stars.

Learn more about Voyager 1 and her sister craft Voyager 2 here.
Where Voyager 1 is as of 2013. (NASA/JPL)

Saturday, 14 September 2013

BIG data: its gains, losses and absurdities

In my first post ever on this blog, I talked about something called the 'knowledge economy'. This is one of the most common memes of our time (if you have no idea what a 'meme' is, click here). Like other things in this world, memes tend to cluster around other memes, and one that clusters around 'knowledge economy' is 'big data'.

Big data is just what it sounds like: enormous amounts of raw, unprocessed information that cannot be sorted or dealt with using traditional, hands-on means. No human can look at big data and derive any real meaning from it; big data is usually handled with the help of computers. Examples of big data in use abound in the world (meteorological data being used to forecast the weather; the richer the data set, the more accurate the forecast), and the problem has also given birth to many companies, the most commonly cited being Google (whose very name is derived from the term 'googol', a 1 followed by 100 zeros). The company's computers crawl through the web, indexing trillions of pages to make them searchable for all of us. Despite that, the internet remains far from entirely indexed, for two reasons: primo, it's growing every day, and secundo, it's so damn big. As a side note, check out this nifty website to see how big the internet probably is (I say probably because the map is obviously incomplete).

Big data has existed for as long as anything has existed in this universe; the universe and all its contents are some 13.7 billion years old. The trick is capturing that data, storing it and analysing it for useful patterns. The tools for doing so have only arrived in the last few decades, in the form of cheap, compact storage devices, powerful processors, networking technologies and, perhaps most importantly, necessity.

To illustrate the importance of necessity's role in spurring the development of tools for handling big data, take a look at this TEDEd video:


Since IT first became a ubiquitous part of our lives, big data has become relatively easy to collect. Mapping has enjoyed the fruits of this; the maps of today are no longer the physical pieces of paper that my parents and I used extensively on our excursions around the city of London in 2002. Nowadays people have access to compact, electronic maps on their smartphones, which are rich in social data, never become outdated and can even feature live updates on such important things as traffic (Waze, purchased by Google in June 2013, lets you access social traffic information on GPS-enabled phones).

However, what good technology has given us can also be used against us. Politics aside, big data technology has allowed security agencies to look into our activities with unprecedented impunity. Privacy policies and laws may forever be left in the dust as the pace of technological development outstrips lawmakers' ability to protect our digital privacy online (if you believe in such a thing anyway). Though the aim is purely (I think) to catch the bad guy, we are in real danger of being wrongly accused by overzealous, over-empowered security agencies, or caught up by nervous, tyrannical regimes seeking to protect their illegitimate hold on power. All thanks to big data and the technologies it has spawned.

But the ultimate problem of big data (especially of the social media variety) is its terrible need to be verified and curated (which is why crowd-sourced projects like Wikipedia still need editors, or there would be chaos) to ensure we don't end up turning noise into conclusions. Taking unverified data as real information is the biggest absurdity of the internet today. Perhaps that is why teachers hate students referring to Wikipedia for information. Though I disagree with shunning Wikipedia completely, CITING it in presentations and essays is another thing entirely; it's just not there yet. Give the technology time to mature, and maybe...

Some also say that social media (think Twitter and Facebook, which also deal in big data produced by millions of narcissistic humans) could become the source for ALL news. That may work if you have friends who love tweeting the news, and tweeting it accurately, but you could just as easily end up with a skewed picture of events, so you really should verify any raw piece of data against multiple sources. Big data is nothing if there is no useful way of deriving valuable, realistic knowledge from it; that is, if there is anything useful to be derived from it at all, which is another problem entirely.

The one thing I find most amazing about big data is how much of it we humans have produced in our existence; more than 90% of it has been generated in the last several years alone. By the time my grandchildren are born, we're going to have a real problem storing that data, let alone finding ways of scrutinizing it for useful patterns. But as usual, necessity will always mother an invention for the job.

Saturday, 31 August 2013

The Best Analogy for the 'God Particle'

Last year there was great excitement within the circles of theoretical and particle physicists over the possibility that the Large Hadron Collider, the world's biggest and most powerful particle accelerator, might have given scientists enough data to clinch the formal discovery of something called the Higgs boson, otherwise known as the 'god particle' (I always get goosebumps when I say that). Recent reviews of the data collected so far seem to indicate beyond doubt that the god particle is real.

Although this is being described as perhaps the most sensational discovery yet in modern physics, many of us mere mortals are still unable to understand what's got the physics nerds at CERN (the European Organisation for Nuclear Research) all pumped up with excitement. I'm sure that if it weren't for the admittedly scary nickname, we wouldn't be talking much about this particle in popular channels. Here's a quick summary:

Simply put, the god particle is so named because it is theorised that all the fundamental particles in this universe that have mass (like the quarks and electrons that make up atoms) owe their massive existence to it, acquired when the universe formed in a hot big bang almost 14 billion years ago. It was first proposed by a man named Peter Higgs in 1964 as part of a solution for the Standard Model of particle physics, but we needed something like the Large Hadron Collider to observe the particle in the flesh.

As important as it is, however, one can't seem to find a good explanation out there of how on earth one measly particle can do all that it is theorised to do. This recent video from TEDEd provides the best analogy I have seen yet, in simple layman's terms. Enjoy!

Sunday, 25 August 2013

Agriculture and Climate change: the *other* Inconvenient Truth

Last week I watched a very interesting video talk on the TEDEd youtube channel given by a relatively obscure chap (in the popular mind, at least) called Jonathan Foley. He is the director of the Institute on the Environment at the University of Minnesota in St. Paul, MN, USA, and in this video he talks about agriculture and its effects on the environment and climate change. The title of the talk is borrowed from the documentary An Inconvenient Truth, which features former US vice president Al Gore talking about climate change.

I believe Foley's talk is very important for two reasons. First, in this highly misinformed age of ours, the media tends to focus on the hype of climate change, which is unfortunate because it is such a technical problem (yes, deniers, I'm talking to you) that the populace fails to follow the arguments and research, and consequently ends up disputing findings by applying common sense where, scientifically speaking, one shouldn't. Climatology is a complicated field and not something to be abused by naysayers.

The second reason follows from the first: if you want to talk with the populace about climate change, start with the fundamentals. In this case that means the environment, and the agricultural practices which affect and are in turn affected by it. No climate change is apparent at first, but it is still there; it's just being explained in the context of familiar human problems.

Here's the video. Enjoy!

Tuesday, 20 August 2013

Celebrating 136 years of Mars moon discovery

Asaph Hall (Commons)
It's 136 years since the American astronomer Asaph Hall discovered the Martian moons Phobos and Deimos in August 1877! Today we can see the two moons from Earth, from Mars orbit and from the surface of the planet itself!

We have come a long way, and we've still got loads of places to go. Like my Mars Science Lab journal page, where I've written a piece today all about the moons of Mars!

Tuesday, 6 August 2013

Justifying Space Exploration

A self-portrait of the Curiosity rover on Mars. How do we justify this? (NASA/JPL/MSSS)
I remember being on a family visit once when I was still in my early teens. We were settling down for lunch when the table discussion inexplicably veered towards the financial justification of space missions conducted by America and other countries.

The debate split the table into two camps, for and against. Naturally, being citizens of a third-world country, the latter camp was full to breaking point. Guess who was all by his lonesome self in the for camp. Moi!

I don't remember many details, but what I do know is that, for the first time in my life, I felt completely helpless in the face of open scrutiny. How could an ignorant young chap justify such expensive endeavours, ones he doesn't immediately benefit from and yet loves as much as a young person might love, say, a rock star or a professional wrestler?

Time has passed, and today America is running a one-year-old Mars rover. They're still spending on space exploration and still a superpower by any objective standard, while we're still, *ahem*, stuck in the mud. I should feel vindicated (and indeed I do), but an explanation is in order.

To that end, I have written a short reflection on this other blog of mine as a sort of delayed response to that old debate, which I couldn't have hoped to win at the time. But we must remind ourselves that in the art of important debates, it isn't a question of winning or losing but of clearing the air and revealing truth. That is the sign of a true knowledge gatherer!