
The launch of the book 'The Second Machine Age' by Erik Brynjolfsson and Andrew McAfee of MIT in early 2014 triggered a series of debates, heated arguments and opinions on a range of topics such as 'Robots ruling over human beings', 'Computers replacing humans in their jobs', 'Ethical, philosophical and moral implications of the possibilities opened up by Artificial Intelligence' and so on. The year 2015 has indeed witnessed what could be termed a "Cambrian explosion" in the field of Artificial Intelligence. The pace of progress in this field during this one year alone has astonished many AI experts and engineers, as progress of this magnitude was simply unexpected. The euphoria surrounding these advances will continue in 2016 as well, and we will witness computer systems and robots doing things that were confined to science fiction until last year. Like all the technological revolutions that preceded it, the digital revolution will certainly disrupt business models and industry structures. Technologies like Big Data analytics and the Internet of Things (IoT) will cause huge shifts in the way business is conducted in future. Similarly, Artificial Intelligence will reduce the need to employ workers in low-skill or low-education roles and will thin out the middle-management layer.
The interesting point to note, in my view, is that Artificial Intelligence, Big Data and IoT are being made possible today primarily because electronic circuits have become cheaper and smaller, thanks to ever more silicon transistors being packed into the same area of a chip, a rate of growth popularly described by Moore's Law.
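To put that compounding in perspective, here is a minimal sketch (in Python) of how transistor counts scale when they double on a fixed schedule. The roughly two-year doubling period and the starting point of 2,300 transistors (the Intel 4004 of 1971) are my own illustrative assumptions, not figures from this article.

```python
# A minimal sketch of Moore's Law style compounding.
# Assumptions (not from the article): a doubling period of ~2 years and an
# illustrative starting count of 2,300 transistors (Intel 4004, 1971).

def transistor_count(start_count: int, start_year: int, year: int,
                     doubling_period_years: float = 2.0) -> float:
    """Projected transistor count in `year`, doubling every `doubling_period_years`."""
    elapsed = year - start_year
    return start_count * 2 ** (elapsed / doubling_period_years)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2016):
        print(year, f"{transistor_count(2300, 1971, year):,.0f}")
```

The arithmetic alone makes the article's point: four decades of steady doubling turns thousands of transistors into billions without any single disruptive leap.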
Relying on incremental increases in computing speed and predictable reductions in computing cost, year after year for the last several decades, is not something that can be called "disruptive innovation" or path-breaking in the truest sense of the word. If one really looks at the computing world, not much has changed in terms of basic technology from the time Alan Turing and his colleagues broke the Enigma code during World War II with electromechanical machines, through the stored-program computers built on the Von Neumann architecture in the late 1940s and the IBM mainframes of the 1960s (later paired with relational databases such as DB2), to the Big Data systems of today, where the databases are simply far larger and fairly old, well-established statistical tools and techniques are applied to the processed data. IoT, in its most essential form, is a miniaturisation of the A/D and D/A converter integrated circuits that have been available for decades. And the most talked-about subject of all, Artificial Intelligence, has not changed in a disruptive fashion either: from Turing's 1936 paper describing the Turing Machine and his 1950 proposal of the Turing Test, to the perceptrons of the 1950s and 60s and the multi-layer perceptrons that followed, the underlying mathematical machinery such as sigmoid activation functions, convolution operations and feed-forward neural networks already had good working implementations in place by the late 1980s.
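To make that point concrete, the core mathematics in question really is this compact. Below is a minimal sketch (in Python with NumPy; the layer sizes and random weights are my own illustrative choices, not drawn from any particular system) of a feed-forward pass through a small multi-layer perceptron with sigmoid activations, essentially the same computation described in the 1980s literature.

```python
import numpy as np

# A minimal sketch of a feed-forward pass through a small multi-layer
# perceptron with sigmoid activations. Layer sizes and random weights are
# illustrative assumptions only.

def sigmoid(z):
    """Classic sigmoid activation: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(x, weights, biases):
    """Propagate input x through each layer: a = sigmoid(W @ a + b)."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sizes = [4, 8, 3]  # 4 inputs -> 8 hidden units -> 3 outputs
    weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
    biases = [rng.standard_normal(m) for m in sizes[1:]]
    x = rng.standard_normal(sizes[0])
    print(feed_forward(x, weights, biases))
```

What has changed since the 1980s is not this arithmetic but the sheer amount of cheap compute and data we can now throw at it.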
In a nutshell, the absence of a fundamental disruption on the technological front may mean there is never a fundamental disruption to business systems or macro-economies in the long run.
"This might just be an ephemeral phase in the history of technological innovation and like all those that preceded it, this too shall pass by..."
Note:
I did quite some research on the topic of Artificial Intelligence, with a specific focus on the singularity. I have read arguments both for and against this eventuality, as well as arguments on whether it will ever happen in our lifetimes: starting with Nick Bostrom's seminal book "Superintelligence" and the philosophical, moral, ethical and technical debates around it, to Jeff Hawkins's "On Intelligence" and Ray Kurzweil's "How to Create a Mind". I then ventured into neuroscience and looked at the work of leading researchers such as Sebastian Seung, centred on the connectome, which explores advanced possibilities in AI as well as topics in medicine such as downloading one's brain and neural circuitry for posterity. Last but not least were the papers and articles by Peter Diamandis from his "Abundance" series. And finally, my own ground-up programming experiments in AI and Deep Learning, based on hand-written programs implementing some of the latest algorithms, just to get a first-hand feel for the topic. At this point, what I have written in this article is based on my gut feel and understanding; I could change my mind after a few months based on new developments in this field.