Monday, April 4, 2016

Quantum Computers -- The next 'WAVE' of Computing....

Over this weekend, please join me as I delve into the world of "Quantum Computing" and share my personal thoughts on this rather exotic and fascinating subject....

NOTE: For those who do not have time to read the full article, please have a quick glance at the attached illustration, where you will see "Raman Rays", named after the great Indian physicist Sir C V Raman... The contributions of Indian scientists to quantum physics, arguably the most complex field of science, in the first part of the 20th century are indeed remarkable: Satyendra Nath Bose, after whom the boson (as in the "Higgs boson", the so-called God particle) is named, and Sir C V Raman, who is credited with the discovery of the "Raman Effect" and the intricate theoretical physics and mathematics behind it...

We have all heard of "Quantum Computing", the next big wave in the field of computing, which is arguably going to deliver the turbo boost the current computing world badly needs...

With Moore's law already slowing down in its tracks, and microprocessor pioneers like Intel delaying the release of their next chips as they battle the engineering challenges of packing ever more transistors into ever smaller areas of silicon, alternatives to silicon chips need to be actively explored. While Intel managed to release the current generation of Xeon chips in 2015 on a 14 nm process, down from 22 nm in the earlier generation, the next generation of 10 nm chips will probably be released only towards the end of 2017.
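
As a rough illustration of what a node shrink buys (my own back-of-the-envelope arithmetic, not a figure from Intel): if transistor density scales with the inverse square of the feature size, the 22 nm to 14 nm transition yields roughly a 2.5x density gain, and 14 nm to 10 nm roughly 2x:

```python
# Rough, illustrative arithmetic: ideal transistor-density gain from a process shrink,
# assuming density scales with the inverse square of the feature size (a simplification).
def density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(f"22 nm -> 14 nm: ~{density_gain(22, 14):.1f}x density")   # ~2.5x
print(f"14 nm -> 10 nm: ~{density_gain(14, 10):.1f}x density")   # ~2.0x
```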

This challenge of diminishing returns from Moore's law will only become more pronounced in the coming years, and in my mind's eye I see the incremental gains in chip technology falling off drastically by 2025...
If we are to keep producing ever faster computing chips, together with the falling storage costs that are essential for discernible progress in newer fields of computer science such as Artificial Intelligence, Robotics and IoT, which will in turn add huge value to the human race, we desperately need rapid progress in two fields: Quantum Computing (disruptive innovation) and Nanotechnology (radical innovation).

The quantum computers that are being touted as great innovations today are in reality only at a very early proof-of-concept stage and cannot be used to do anything useful other than solve a few pre-programmed arithmetic operations. Two sub-categories are being pursued in the field of Quantum Computing:
1) "Quantum Annealers", which are more akin to the Application Specific Integrated Circuits (ASICs) of yesteryear: they implement one particular algorithm only and cannot be reprogrammed for other general purpose tasks.

2) "Universal Quantum Computers", which are general purpose computing platforms akin to the microprocessors of today: they can perform any general purpose computing task and can be programmed on the fly, like their silicon counterparts from Intel....

Despite the huge amount of hype and hoopla surrounding quantum computing today, and the claims of huge wins and achievements in the field, all the working prototypes built so far (which work only in carefully controlled laboratory conditions) belong to the category of "Quantum Annealers", capable of running only a single algorithm. The point to note, however, is that if we choose the one right algorithm, say a machine vision algorithm from Google or a natural language processing algorithm from Microsoft Skype, then that one specific algorithm is claimed to run roughly "100 million" times faster on a quantum computer than on a conventional top-of-the-line CPU available today from, say, Intel. In the world of AI and machine learning, where computing speed is the key contributor to the functionality being developed, a "100 MILLION-X" improvement in computing speed would mean wonders and the realisation of unimaginable functionalities in the world of computing....
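
To give a flavour of the kind of problem an annealer is built for, here is a minimal classical simulated-annealing sketch (my own illustration; a quantum annealer attacks the same style of optimisation problem, but uses quantum effects such as tunnelling rather than thermal noise to escape local minima):

```python
import math
import random

# A minimal classical simulated-annealing sketch (illustrative only; a quantum
# annealer solves the same style of optimisation problem in hardware).
def simulated_annealing(cost, start, steps=10_000, temp=10.0, cooling=0.999):
    x = start
    best = x
    for _ in range(steps):
        candidate = x + random.uniform(-1.0, 1.0)      # propose a random move
        delta = cost(candidate) - cost(x)
        # Always accept better moves; accept worse ones with a temperature-dependent probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if cost(x) < cost(best):
            best = x
        temp *= cooling                                 # gradually cool down
    return best

# Toy cost landscape with many local minima.
cost = lambda x: 0.1 * x * x + math.sin(5 * x)
print(round(simulated_annealing(cost, start=8.0), 3))
```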

The picture shown below is from an MIT Tech Review article (April 2016) on the latest quantum computer, showcased a few weeks ago at a top American university R&D lab. This quantum computer is only a "5-qubit" machine whose circuitry can handle just five quantum bits (qubits) at a time. The circuitry and the quantum chips operate at a temperature of about -272 degrees centigrade, in the superconducting realm where certain special quantum properties come to the fore and make superlative high-speed computing a realistic possibility. One might be tempted to compare this 5-qubit computer, which runs a single algorithm that solves certain mathematical problems such as Fourier transforms, with the Intel 4004, a 4-bit silicon integrated circuit released a whopping 45 years ago in 1971. One might also be tempted to compare it with the commonly available 64-bit computers that sit on every desktop today and confidently conclude that the latest quantum computing invention is no big deal.
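
To make the 5-qubit versus 5-bit contrast concrete (my own illustrative sketch, not from the MIT Tech Review article): five classical bits hold exactly one of 32 values at a time, whereas the state of five qubits is described by 32 complex amplitudes simultaneously, which is where the potential for massive quantum parallelism comes from:

```python
import numpy as np

# Illustrative sketch: a 5-qubit register is described by 2**5 = 32 amplitudes,
# whereas 5 classical bits hold exactly one of 32 values.
n = 5
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # single-qubit Hadamard gate

# Build the n-qubit Hadamard by Kronecker products and apply it to |00000>.
Hn = np.array([[1.0]])
for _ in range(n):
    Hn = np.kron(Hn, H)

state = np.zeros(2**n)
state[0] = 1.0                                   # the basis state |00000>
superposition = Hn @ state                       # equal superposition over all 32 basis states

print(len(superposition))                        # 32 amplitudes
print(abs(superposition[0])**2)                  # each outcome measured with probability 1/32
```

Measuring the register collapses it to a single 5-bit outcome, which is why extracting useful answers from a quantum computer requires carefully designed algorithms.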

However, it is indeed a big deal, because overall computing speed depends primarily on three factors: 1) processor word size (bits), 2) processor clock speed and 3) processor parallelism.

It is worth noting that the word size has increased only four-fold, from 16 bits in the Intel 8086 processor released in 1978, whose architecture still acts as the template even for the 64-bit Intel Xeon chips being produced today, to 64 bits now....

Among these three, the primary determinant of overall computing speed is the raw processing power of the microprocessor chip.

So the state-of-the-art "Quantum Annealer" chip showcased in March 2016, whose architecture is shown in the MIT Tech Review illustration below, may be only a 5-qubit special-purpose circuit, capable of running a single mathematical algorithm, not re-programmable, and nowhere close to a general purpose microprocessor such as the 8086 of the late 1970s; but it makes up for all its shortcomings by running at a speed claimed to be "100 Million X" that of a modern state-of-the-art silicon CPU.....

To put things in context, the human brain, the most powerful and fastest supercomputer known to us today, runs, by the same yardstick, at something of the order of a "1 Billion X" advantage over a conventional CPU, using the complex and intricate connections of its hybrid bio-chemical and electrical circuits, aka neurons, with their superbly tentacled web of connectors called synapses... And on top of this raw biological circuit speed is an incredible degree of massive parallelism that is simply unprecedented in the computing world....

Assuming Moore's law continues to deliver the same returns as it did over the last three decades, it would take another good three decades from today before silicon microprocessors catch up with the speeds theoretically possible via quantum computing, and possibly a good four decades to catch up with the processing speed of the human brain... However, as I argued earlier, Moore's law is already showing diminishing returns and will most likely hit a plateau in about a decade, rendering the above conjectures null and void...
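
A rough back-of-the-envelope check of those estimates (my own arithmetic, assuming performance doubles every 12 to 18 months): closing a 100-million-fold gap needs about 27 doublings and a billion-fold gap about 30, which lands broadly in the three-to-four-decade range mentioned above:

```python
import math

# Back-of-the-envelope: how many Moore's-law doublings close a given speed gap,
# and how long that takes at an assumed doubling period (illustrative assumption).
def years_to_close(gap, years_per_doubling):
    return math.log2(gap) * years_per_doubling

for gap, label in [(1e8, "100 million X (quantum annealer claim)"),
                   (1e9, "1 billion X (human-brain figure above)")]:
    fast = years_to_close(gap, 1.0)   # optimistic: doubling every 12 months
    slow = years_to_close(gap, 1.5)   # slower: doubling every 18 months
    print(f"{label}: ~{math.log2(gap):.0f} doublings, roughly {fast:.0f}-{slow:.0f} years")
```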

If the dream that computer scientists have nursed over the last century, of building a computer whose capability equals that of the human brain, is to be realised, the only way is through the advancement of "Quantum General Computing"....

In my mind's eye I see "Quantum General Computing" evolving to the stage where it can power robots with near-human intelligence in about a decade or so from now....
As the ancient Chinese proverb goes, "May we all live in interesting times..." 😊

Note: One might wonder why I have used Intel chips as the baseline while comparing silicon transistors with other technologies. What about the cutting-edge chips in mobile phones and tablets, which are supposedly the in-thing today? Unfortunately, the microprocessors and other computing chips used in so-called hi-tech modern gadgets such as tablets and mobiles are nowhere near the cutting edge of innovation in chip design. Intel still continues to lead the race in innovation in the realm of silicon computing chip design...😊

#Musings  #Technology  #Future  #Computing  #AI

Saturday, January 9, 2016

Today's AI -- IS IT TRULY DISRUPTIVE?

The launch of the book titled 'The Second Machine Age' by Erik Brynjolfsson and Andrew McAfee of MIT in early 2014 triggered a series of debates, heated arguments and opinions on a range of topics such as 'Robots ruling over human beings', 'Computers replacing humans in their jobs', 'Ethical, philosophical and moral implications of the various possibilities opened up by Artificial Intelligence' and so on. The year 2015 indeed witnessed what could be termed a "CAMBRIAN EXPLOSION" in the field of Artificial Intelligence. The pace of progress in this field during that one year alone astonished many AI experts and engineers, as progress of this magnitude was simply unexpected. The euphoria surrounding the pace of advancement will continue in 2016, and we will witness computer systems and robots doing things that were confined to science fiction till last year. Like all the technological revolutions that preceded it, the digital revolution will certainly disrupt business models and industry structures. Technologies like Big Data analytics and the Internet of Things (IoT) will cause huge shifts in the way business is conducted in future. Similarly, Artificial Intelligence will reduce the need to employ workers in the low-skill or low-education category, as well as thin out the middle management layer.

The interesting point to note, according to me, is that Artificial Intelligence, Big Data and IoT are being made possible today primarily because of the cheap, miniaturised electronic circuits now available, thanks to ever more silicon transistors being packed into every square millimetre of chip area, a growth rate popularly described by Moore's Law.

Relying on incremental increases in computing speed and a predictable lowering of computing costs every year for the last several decades is not something that can be called "DISRUPTIVE INNOVATION" or path-breaking in the truest sense of the word. If one really looks at the computing world, not much has changed in terms of basic technology per se: from the electromechanical and early electronic machines with which Alan Turing and his colleagues cracked German ciphers during World War 2, and the Von Neumann architecture of the mid-1940s, to the IBM mainframes of the 1960s and the DB2 databases that followed them, to the Big Data systems of today, the databases have only grown far larger, and fairly old, well-established statistical tools and techniques are applied to the processed data. IoT, in its most essential form, is a miniaturisation of the A/D and D/A converter integrated circuits that have been available for decades. And yes, the most talked-about subject, Artificial Intelligence, has not changed in a disruptive fashion either: from Alan Turing's paper on the 'Turing Machine' in the 1930s and his paper on machine intelligence in 1950, through the perceptrons of the 1960s, to the multi-layer perceptrons built on mathematical techniques such as sigmoid activation functions, convolution operations and feed-forward neural networks, good working implementations of these algorithms were already in place by the mid-1980s.
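
For readers unfamiliar with those terms, here is a minimal sketch of a feed-forward network with sigmoid activations (my own illustration, not code from any of the systems mentioned), essentially the same construction that was already well understood by the mid-1980s:

```python
import numpy as np

# Minimal feed-forward "multi-layer perceptron" sketch with sigmoid activations.
# Weights here are random for illustration; in practice they are learned (e.g. by
# backpropagation, whose well-known formulation dates to the mid-1980s).
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Propagate input x through a list of (weights, bias) layers."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

# A tiny 3-4-1 network: 3 inputs, one hidden layer of 4 units, 1 output.
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),
          (rng.normal(size=(1, 4)), rng.normal(size=1))]

print(forward(np.array([0.5, -1.0, 2.0]), layers))   # a single output in (0, 1)
```

Today's deep learning systems stack many more such layers and learn the weights from vastly more data, but the basic building block is unchanged.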


In a nutshell, the lack of a fundamental disruption on the technological front may never be able to create a fundamental disruption in business systems or macro-economies in the long run.

"This might just be an ephemeral phase in the history of technological innovation and like all those that preceded it, this too shall pass by..."

Note: 
I did quite some research on the topic of Artificial Intelligence, with specific focus on the topic of SINGULARITY... I have read arguments both for and against this eventuality, as well as arguments on whether it will happen during our lifetimes... Starting with Nick Bostrom's seminal book and articles on "Superintelligence" and the philosophical, moral, ethical and technical debates around his masterpiece, to the work by Jeff Hawkins titled "On Intelligence", to Ray Kurzweil's "How to Create a Mind", and so on.... I then ventured into neuroscience and looked at the work of top researchers such as Sebastian Seung, centred around the CONNECTOME, which explores advanced possibilities in AI and topics in medicine such as downloading one's brain and neural circuitry for posterity.... Last but not least were the papers and articles by Peter Diamandis from his "Abundance" series.... And finally, my own ground-up programming experiments in AI and Deep Learning, based on hand-written programs implementing some of the latest algorithms, just to get a first-hand feel for the topic... At this point, what I have written in this article is based on my gut feel and understanding... I could change my mind after a few months, though, based on new developments in this field...

Making a young girl fall in love with Computer Science...

I touched and felt a computer for the first time in my life in October 1992, a week after joining engineering college. I recollect that in my first year of engineering I simply loathed the subject called 'computer programming', and I vividly remember arguing for 20 minutes with my professor that I could do everything she said a computer could do using my scientific calculator. I was very frustrated that my clarity on the topic was no better at the end of the argument. I then spent the next two weeks reading the prescribed textbook, 'Programming in Fortran 77' by V Rajaraman, multiple times, which helped to some extent, but at the end of all that I still disliked computers. I strongly felt that the computer and I did not understand each other very well and could never be together. I firmly made up my mind that after completing my first year I would never again look at a computer in my life. I was in Mechanical Engineering at the time, which did not have any more courses on computers after the first year. I was also aspiring to do an MBA after my engineering, and this further added firmness to my resolve.
After finishing the first 4 months of first-year engineering, I expressed my anguish and frustration to a close friend of mine, who was regarded as the computer whiz kid of the college, and confessed that I HATED computer science. He said nothing and simply gave me a Schaum Series book, 'Introduction to Programming', which helped me a lot in understanding the real power of computer programming. I also chanced upon a book called 'Applications of computers in the business and industry' that, for the very first time, gave me a broad perspective and helped me discover the potential and possibilities of this field. I had no clue till that day that computers could be useful to mankind in so many ways. At that point I realised that computers would play an important role in the future, and I resolved to work on the subject deeply and learn as much as I could within the remaining part of the year. I was still quite firm that I would abandon computer science after my first year, and therefore it was important that I learn whatever was possible within that limited time.
During the next 7 months, unbeknownst to me, I got quite intimately involved with the subject, and before I realised it, I was madly in LOVE with computer science. So much in love that I shifted to the Computer Engineering branch after the first year, and an intense and raging affair with computer science has ensued ever since. I did finish my MBA as aspired, then chose to join the computer industry in the management stream, and have remained there. Computer Science continues to be my first love even today, and the intensity of that love remains as strong now as it was then. This is one of those decisions in life that I will never regret.
Computer Science is a philosophy and not a technology. It is a way of thinking and perhaps even a way of living. In a nutshell, Computer Science has transformed me in every possible way and has influenced varied aspects of my personality: my mindset and thinking, self-esteem and confidence, way of looking at the world, approach to problem solving, approach to technology, reasoning and higher-order thinking, and so on. Nothing in me has been left untouched by this amazing subject [it probably merits a full-fledged article to explain my fascination for it!!]
It surprises me that even after two and a half decades, the percentage of girls who pursue computers and mathematics is very low, and the reason is largely a lack of interest and awareness. There are several reasons why many youngsters might not take to the subject: it is unique and exotic in its own way and has an élan of its own. It is therefore extremely important that the right approach is adopted while introducing the subject to a novice. The right kind of appreciation needs to be developed before one can start liking the subject and discover the elegance and beauty of computer science.
The discovery of my love for computer science was purely accidental, and the probability of that happening in those days was very low. It took me one complete year, and a very frustrating one at that; there was a very high probability that I would have abandoned the subject for good. That probability of abandoning the subject, or treating it as drudgery, holds good for students at university even today. I am not talking about learning a programming language, a technology or a platform, which is comparatively easy and which one might be driven to do by the lure of a well-paid software job. That is probably easy to accomplish in developing countries, where one tends to be driven by the job market and money, as one may not be able to afford to be driven purely by interest or passion. But if we really want to increase participation in this field in a country like the U.S., where many people tend to choose to work in the area of their passion, we need to be able to generate that intense passion and fervour.
It has a lot to do with creating a mind-blowing first impression, generating a deep urge to know more, creating an enabling atmosphere for getting intimate with the subject, developing a mindset that allows one to realise and then fully absorb its awesome beauty, and finally falling in love with it. In the context of the efforts being made by the U.S. government and corporates to generate interest in the STEM area, especially among young girls, it is imperative to note that the goal is in reality all about generating passion. This needs to be seen as an endeavour that helps youngsters experience that ethereal beauty and immerses them in a magical world. The curriculum, reading materials, classroom sessions, practical sessions on the computer and so on should be carefully developed with the aim of creating 'an experience like never before' in the minds of youngsters, with their feelings, emotions and mindset as the central theme. It is all about making them fall in a love that lasts a lifetime, not about teaching them a technique or luring them with good career prospects.
What concerns me is that the interest shown in computer science by many youngsters, and especially young girls, remains lukewarm even today. I was somewhat surprised when I read somewhere that in subjects like computer science, girls constitute only 15% of the students in a class at university level. And these are numbers for the U.S., arguably the most progressive country in the world. I am aware of several initiatives by the U.S. government and large corporates over the last few years to increase awareness and ultimately the representation of women in STEM, but we have a long way to go before we reach that goal. On the contrary, countries like India beat the trend, with 37% of the students enrolled in computer science at university level today being women. It is startling to note that the corresponding number in the U.S. was 35% in 1980, dropped steadily over the years to a low of 10% in 2010, and has since risen slowly back to 15%.
I found the following observation by a researcher to be quite interesting and profound: “The omission of women from the history of Computer Science perpetuates misconceptions of women as uninterested or incapable in the field”.
Keeping this in view, it would be a good idea to familiarise our young girls with the history of computer science and the path-breaking contributions made to this field by women in the 19th and 20th centuries, which would inspire them. There is a lot of information on the Internet, and many books are available on the topic, which I am sure could generate interest amongst youngsters in this seemingly esoteric but in reality adorable subject.
Let me quickly talk about the four women who in my opinion were the pioneers in the field of computer science during their times and hopefully this will act as a starting point….
Ada Lovelace could be termed the "First Lady of Computer Science". She was a great mathematician and, as early as 1843, wrote what is regarded as the first computer program, implementing a fairly complex algorithm intended to run on Charles Babbage's Analytical Engine, the precursor of the modern-day computer (the Engine itself was never completed, so the program was never actually executed). The programming language used for developing modern 'real-time military' software systems is named Ada in her honour. Ada is a very structured and evolved programming language with a number of innovations that could be termed works of ingenuity, and its design has influenced later languages.
Grace Hopper was one of the very few computer scientists working for the U.S. Navy in the 1940s and 1950s. Grace conceptualised and designed a new programming language for use in business applications and also wrote a full-fledged compiler for it. The modern-day COBOL programming language bears a strong resemblance to the language she designed.
Edith Clarke was initially trained as a mathematician and worked in that area before pursuing her Master's in Electrical Engineering at MIT in 1918. She was the first woman to be awarded that degree there. Edith pioneered the use of advanced mathematical methods in engineering computation, leveraging mathematics to solve complex problems related to electrical power systems. This was unheard of in those times and forms a basis for the modern computer-based automation and control systems used in industry today.
Hedy Lamarr was a very popular actress of her time who also made significant contributions to the field of wireless communications. Together with the composer George Antheil, she pioneered the concept of 'frequency hopping', which is a foundation of modern-day wireless and data communication technologies. The intended application of her invention was to control and guide torpedoes while in motion and prevent the guidance signal from being intercepted or jammed by the enemy. She was far ahead of her time, both in conceiving the idea and in expecting others to comprehend what she was talking about. Those were the Second World War (1939-45) days, and she was famous for her 'stunning looks' and 'ethereal beauty'; not many knew of her intellectual abilities. Engineering was an area she pursued as a hobby in her spare time, which speaks volumes about her intellect. The invention was, however, dismissed right away by the naval authorities, who asked her to stop wasting time on areas she did not understand and instead use her popularity to raise funds for the war effort, even though it could potentially have altered the course of the war. The advanced missile and anti-missile systems of today accomplish objectives similar to those envisaged by Hedy Lamarr in the 1940s. It was only in 1997 that her contributions were finally recognised, when she was honoured with a special international award as a pioneer of wireless communications.
Better late than never, isn't it?
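
As a small technical aside on how frequency hopping works in principle (my own toy sketch in modern terms, not Lamarr and Antheil's original piano-roll mechanism): if the transmitter and receiver share a secret seed, both can derive the same pseudo-random sequence of channels and stay in sync, while an eavesdropper without the seed cannot predict which channel comes next:

```python
import random

# Toy frequency-hopping sketch (illustrative only): transmitter and receiver that
# share a secret seed derive the same pseudo-random channel sequence, so they stay
# in sync while an eavesdropper cannot predict the next channel.
CHANNELS = list(range(2_400, 2_480))    # e.g. 80 hypothetical channel numbers

def hop_sequence(seed: int, hops: int):
    rng = random.Random(seed)           # deterministic given the shared seed
    return [rng.choice(CHANNELS) for _ in range(hops)]

shared_seed = 42                        # known only to transmitter and receiver
tx = hop_sequence(shared_seed, hops=5)
rx = hop_sequence(shared_seed, hops=5)

print(tx)            # the channels used for the next 5 time slots
print(tx == rx)      # True: both ends hop identically
```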