AI has made rapid progress in recent years, marking its presence in almost every field, and its advance will only accelerate as it reaches into more of our world. From self-driving cars to predictive medicine and personalized learning, AI has shaped both the practical and the intimate aspects of our everyday lives, including finding life partners for us. The rise of AI has created a demand for highly skilled professionals in the field. Young professionals can build a fruitful career in AI by completing courses such as an MSc in Data Analytics or an MS in Data Science in India from institutes like INSOFE.

Generally, when people talk about AI, the discussion centers on AI’s applications and on predictions about AI-driven products and their effect on our lives. But we often overlook the fact that today’s AI revolution is only possible because of the evolution of computing hardware and the computing ecosystem.

AI and computing: A match made in heaven

In a nutshell, AI enables computers to mimic human intelligence. Machine learning (ML) is a crucial tool within AI that helps computers acquire that intelligence: it enables machines to learn from data using statistical techniques instead of being explicitly programmed to perform specific tasks. Common ML techniques include regression models, decision trees, and neural networks; the latter are adaptive systems of interconnected nodes that model the relationship between input and output data. Multilayered neural networks come in handy for complex tasks like image recognition and text synthesis. This subset of machine learning is commonly known as deep learning (DL), because the neural networks involved have more depth. Both ML and DL techniques are data- and computing-heavy. For example, a simple deep neural network that classifies animal pictures into dogs and cats needs thousands of labeled animal pictures as training data and billions of iterative computations to match a four-year-old’s ability to tell cats from dogs. Hence, data and computing form the engine of high-performance AI systems.
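
To make this concrete, here is a minimal sketch in plain Python/NumPy of how a model "learns" from data through iterative computation rather than explicit programming. The tiny two-feature synthetic dataset and the single-neuron logistic model are illustrative assumptions, not a production recipe.

```python
import numpy as np

# Illustrative synthetic data: two features per example, binary labels
# (e.g. "cat" = 0, "dog" = 1). In a real system these would be image features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(500, 2)),
               rng.normal(+1.0, 1.0, size=(500, 2))])
y = np.concatenate([np.zeros(500), np.ones(500)])

# A single-neuron (logistic-regression-style) model: weights and bias start at zero.
w, b = np.zeros(2), 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Iterative learning: every pass nudges the parameters to reduce prediction error.
learning_rate = 0.1
for step in range(1000):
    p = sigmoid(X @ w + b)            # current predictions
    grad_w = X.T @ (p - y) / len(y)   # gradient of the loss w.r.t. the weights
    grad_b = np.mean(p - y)           # gradient of the loss w.r.t. the bias
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy after 1000 iterations: {accuracy:.2f}")
```

A real deep network for image recognition stacks many such layers and repeats this update loop over far larger datasets, which is exactly what makes it so data- and computing-hungry.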

Core computing trends that define AI

  • Moore’s law: According to Intel co-founder Gordon Moore, the number of transistors per square inch in an integrated circuit (IC) has doubled roughly every 18 months to two years since the mid-1960s. This trajectory of computing development, first traced by Moore, is famously known as Moore’s law. Today, computers are smaller and faster than ever, and computing costs have fallen by roughly 30% per year (a rough compounding sketch follows this list).
  • The internet of things (IoT): By 2020, roughly 40 billion (mini-)computers were connected to the internet. The IoT has enabled the collection of exabytes of text, voice, image, and other forms of training data that feed into ML and DL models, increasing their accuracy and precision.
  • The rise of application- (and AI-) specific computing hardware: The traditional computer comes with a CPU (central processing unit), the most standardized and general-purpose microprocessor. However, new processing units such as GPUs (graphics processing units) and TPUs (tensor processing units) have emerged in recent years. GPUs and TPUs have specialized architectures that accelerate computing-intensive deep learning applications such as image processing and voice recognition.
  • The era of exascale computing: The evolution of computing hardware has enabled computing giants like HPE to build supercomputers capable of at least one exaFLOPS, i.e., a quintillion calculations per second.
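
As a back-of-the-envelope illustration of the first trend, the short sketch below simply compounds the rates quoted above, doubling transistor counts every two years and cutting computing costs by about 30% per year, over two decades. The rates and the 20-year horizon are rough illustrative assumptions, not precise industry figures.

```python
# Rough illustration of Moore's-law-style compounding (illustrative rates only).
transistors = 1.0   # relative transistor count today
cost = 1.0          # relative cost per unit of computing today

for year in range(1, 21):
    transistors *= 2 ** 0.5   # doubling every 2 years ~= x sqrt(2) per year
    cost *= 0.70              # ~30% cheaper each year

print(f"after 20 years: ~{transistors:.0f}x the transistors "
      f"at ~{cost:.3f}x the cost per unit of computing")
```

Run over 20 years, this gives roughly a thousandfold increase in transistors while the cost per unit of computing falls to a fraction of a percent of its starting value, which is why AI workloads that were unthinkable two decades ago are routine today.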

Hurdles ahead for the computing status quo

People rarely consider computing as a scarce resource for enabling the AI revolution. However, computer systems have several limitations that can slow the development of the AI applications built on top of them.

  • The end of Moore’s law: Moore’s law is running into the physical limits of transistors. Transistors cannot shrink below a certain size, which limits our ability to keep shrinking microprocessors indefinitely.
  • Increasing data regulation: Data regulation has become mainstream globally, for example with the recent adoption of GDPR in Europe. It directly affects the development of supercomputers: data has to be stored close to the processors to ensure speedy exascale processing, yet regulation makes such centralized data placement more difficult.
  • The costs (and feasibility) of data transfer and storage: The massive volume of training data, and the bandwidth costs of moving it closer to centralized processors, require a redesign of computer memory and the underlying I/O (input/output) operations. Running centralized supercomputers carries other costs as well, such as operating expenses (OPEX) and environmental costs like power consumption and CO2 emissions.

What does the future hold?

Regulatory constraints and the energy and bandwidth costs of centralized supercomputing will drive data storage and computing to the edge. The imminent end of Moore’s law will push the emergence of more specialized computing architectures, which change the structure of processing units and individual circuits to enhance performance without shrinking transistor size.

Meanwhile, quantum computing is moving toward the mainstream. Quantum computers promise to revolutionize computing by leveraging the strange laws of quantum mechanics. Quantum computing’s ‘superposition’ and ‘entanglement’ properties can be pathbreaking: they make it possible to carry out many operation streams in parallel. Hence, a quantum computer is better suited than a traditional computer to tackle certain complex computational problems. However, building a physically ‘universal’ quantum computer remains a major engineering challenge.
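
To make ‘superposition’ slightly more concrete, the standard textbook notation below (generic, not tied to any particular machine or vendor) shows that a single qubit can sit in a weighted combination of the classical states 0 and 1, and that a register of n qubits spans all 2^n classical bit strings at once, which is where the potential for parallel operation streams comes from.

```latex
% A single qubit in superposition of the two classical states:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
% An n-qubit register is a superposition over all 2^n classical bit strings,
% so one quantum state carries amplitudes for exponentially many outcomes:
\[
  \lvert \Psi \rangle = \sum_{x \in \{0,1\}^{n}} c_{x} \lvert x \rangle,
  \qquad \sum_{x} \lvert c_{x} \rvert^{2} = 1 .
\]
```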

Careers in STEM should be encouraged to ensure that the computer manufacturing industry has the talent to disrupt itself and stay relevant to the needs of the AI revolution.

Data scientists should look for the least complex way to solve a problem when building ML models and writing algorithms. Opting for a complex neural network when a regression can deliver the needed accuracy is a classic example of wasting a scarce resource like computing. Hence, promoting computing-friendly practices within the tech ecosystem has become a necessity.
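
As a hedged illustration of that principle (assuming scikit-learn is available; the synthetic dataset and model sizes are arbitrary choices for the sketch), the snippet below checks whether a plain regression-style classifier already matches a much heavier neural network before paying the extra computing cost.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Illustrative tabular dataset; in practice this would be your own data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Cheap baseline: a simple regression-style classifier.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Much heavier alternative: a multilayer neural network.
neural_net = MLPClassifier(hidden_layer_sizes=(256, 256), max_iter=500,
                           random_state=42).fit(X_train, y_train)

print("baseline accuracy:  ", round(baseline.score(X_test, y_test), 3))
print("neural net accuracy:", round(neural_net.score(X_test, y_test), 3))
# If the cheap baseline is already accurate enough, the extra compute spent
# on the neural network buys little and is better saved.
```

If the two scores are close, the simpler model wins on cost, energy, and maintainability, which is precisely the computing-friendly habit argued for above.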