From the Dawn of Intelligence to the Rise of Artificial Intelligence

Deepak Mehta
4 min read · Jan 4, 2024


Photo generated by DALL-E

Intelligence has always been a cornerstone in shaping human existence. Throughout our history on Earth, estimated at approximately 300,000 years (Harari, Y. N. “Sapiens: A Brief History of Humankind”), we have not only survived but thrived by continuously inventing tools to augment our intelligence. From the dawn of language to the rise of Artificial Intelligence, many tools have played a crucial role in our intellectual evolution.

Around 100,000 years ago, human development reached a significant turning point with the emergence of language (Poulos, George. “On the Origins of Human Speech and Language”). We acquired the ability to communicate via speech, allowing us to share knowledge, ideas, and experiences efficiently. Our collective intelligence grew as stories, traditions, and information were passed down through generations, laying the groundwork for further advancements in human society. However, the thirst for knowledge eventually outpaced the limits of oral transmission.

Approximately 5,000 years ago, humanity made a monumental leap forward in intelligence with the invention of writing (Robinson, A. “The Story of Writing”), later complemented by printing. The written word became a powerful tool for preserving information and ideas, enabling us to record history, laws, scientific discoveries, and philosophical insights. Writing facilitated the spread of knowledge across vast distances and time, laying the foundation for the rapid advancement of human civilisation.

Both language and writing have played pivotal roles in augmenting our intelligence. Language initially empowered us to communicate and share knowledge, while writing further enhanced our ability to preserve and distribute information and knowledge effectively. Together, these tools have been instrumental in shaping our intellectual growth and facilitating the progress of human society.

As our knowledge base broadened, so did our curiosity, prompting us to delve into more profound questions. The need for a specialised language to facilitate scientific reasoning became apparent. This necessity gave rise to the development of mathematics around 3000 BCE and, much later, the formulation of algorithms, which gained particular prominence in the 9th century (Boyer, C. B. “A History of Mathematics”). Mathematics evolved into a universal language of logic, empowering us to engage in abstract reasoning and tackle intricate problems. Concurrently, algorithms began to refine our thought processes, guiding us towards making rational decisions rooted in logical principles.

The advent of computers in the mid-20th century marked a revolutionary leap in enhancing human intelligence (Ceruzzi, P. E. “A History of Modern Computing”). Initially, tasks involving mathematical calculations, logical reasoning, and algorithms were performed manually by professionals known as computers. However, the growing complexity and volume of these tasks highlighted the need for automation. This led to the development of electromechanical and, later, electronic computers, marking a significant transformation in our ability to process large volumes of data. These early machines provided unprecedented computational power, revolutionising the landscape of information processing and decision-making. They enabled the handling of intricate tasks and large-scale computations, far exceeding the limitations of manual human effort.

In today’s rapidly evolving digital landscape, we are witnessing an unprecedented surge in data generation, presenting unique challenges and opportunities. This phenomenon is fueled by advancements in affordable, high-performance hardware and the proliferation of high-speed internet, leading to the production of vast quantities of data at an extraordinary rate. The key to leveraging this data lies in extracting meaningful insights and enhancing resource utilisation, which is becoming increasingly critical.

In response to these challenges, the role of Artificial Intelligence (AI) has become more prominent than ever. Originating in the mid-20th century, AI’s evolution has been transformative, fundamentally changing our approach to managing and interpreting vast information repositories, as highlighted by Russell, S., and Norvig, P. in “Artificial Intelligence: A Modern Approach”. AI employs advanced algorithms to process, analyse, and decipher complex data sets autonomously, offering a powerful tool in our quest for knowledge. More than just a tool, AI acts as a complementary force to human intelligence, augmenting our cognitive capacities. It is enhancing our reasoning skills and expanding our ability to navigate and interpret intricate data, revealing patterns hidden within massive data sets. The emergence of AI marks a significant milestone in our journey to fully harness the spectrum of intelligence, offering new avenues for innovation and discovery in an increasingly data-driven world.

Interestingly, the order in which we acquire these tools from birth mirrors their historical emergence: we first learn a language, then move on to writing, followed by basic mathematics, and finally, in our current digital age, we develop proficiency in using computers. This natural progression underscores the crucial role these tools play in human intellectual development.

As we look towards 2024 and beyond, the ongoing evolution of AI tools, such as large language models (LLMs), promises to augment our intelligence further. These groundbreaking models are changing how we interact with computers, bridging the communication gap between humans and machines. Tools like LLMs are poised not only to enhance individual intelligence but also to narrow the divides among people. Looking ahead, the continued advancement of AI is set to steer us towards greater wisdom and new opportunities, unveiling the infinite horizons of human potential.
