When AI Dances With Humans

There is a gargantuan appetite for data in the AI spectrum

Update: 2024-09-14 03:56 GMT

‘To watch us Dance is to hear our hearts speak’
Hopi Indian saying

Of the top 20 international companies by market capitalisation, nine are Artificial Intelligence/tech companies. The five leading companies in the 2023 listings were all tech/AI companies. As of today, four are in the same category, with Amazon, a tech and multi-domain company, being the fifth.

There are 5.45 billion internet users and 5.17 billion social media users globally. The average global time spent on social networking has risen from 90 minutes per day in 2012 to 151 minutes per day in 2023. The largest digital populations are in China (1.05 billion), India (692 million) and the USA (311.3 million).

Unsurprisingly, the most popular sites, in descending order, are Google, YouTube, Facebook and Pornhub. This huge surge in paradigm change commenced in 2014 and took a major leap from 2019 onwards.

Let us take a view of the present dynamics and shifts in this industry through the lens of a critical lead player, Nvidia. With a total market capitalisation of USD 2.629 trillion, the company has seen a meteoric rise since 2020.

While its beginnings were in the gaming industry, when the need arose to train the neural networks that form the basis of AI, Nvidia provided the critical chip. Its modified Graphics Processing Unit (GPU) trumped CPUs many times over in speed, capacity and economy. Deep learning and machine learning for neural networks now had the engine to surge, and with them Nvidia has risen to tidal proportions.

But all businesses seek to do "more with less", and market leaders attract competition. Nvidia's A100 GPUs, and its newer Blackwell chips, now face challenges from chip makers and process designers who seek to train and refine AI for targeted use.

Nvidia also reportedly faces an early-stage investigation into its domination of the AI chip market. Unsurprisingly, Nvidia has now announced its foray into establishing and running high-tech data centres and offering networked chips as a service. Google's Tensor Processing Units (TPUs), Huawei's Ascend chips, and AMD and Intel chips are a few of the growing tribe of competitors for Nvidia.

Let us also visit Nvidia's geopolitical dimension. Its chips are designed in the United States. The most advanced machines of ASML, a Dutch company, etch the patterns onto silicon. The leading cutting-edge foundries of TSMC in Taiwan, and other components from Japan and the US, complete the mix.

Consequently, this centrality permits control over the global market and facilitates both access and the denial of access. Qualitatively, in the big league there is no match for Nvidia. The company also controls the software that programs its unique chips.

AlexNet, fielded in 2012, was based on deep learning. In 2020, 11 AI models were trained using more than 10^23 FLOP (floating-point operations, a measure of computation). In 2024, the number of such AI models is estimated to exceed 81. Such enormous requirements for speed and capacity are the lifeblood of the demand for Nvidia products.
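To give a sense of scale, here is a rough back-of-the-envelope sketch of what 10^23 floating-point operations mean in GPU-time. The peak throughput and utilisation figures below are illustrative assumptions, not figures from this article:

# Back-of-the-envelope: what 10^23 FLOP of training compute means in GPU-time.
# The throughput and utilisation values are illustrative assumptions.
TRAINING_FLOP = 1e23            # total floating-point operations for one model
A100_PEAK_FLOP_PER_S = 312e12   # approx. peak FP16/BF16 throughput of one A100
UTILISATION = 0.4               # assumed fraction of peak actually sustained

gpu_seconds = TRAINING_FLOP / (A100_PEAK_FLOP_PER_S * UTILISATION)
gpu_days = gpu_seconds / 86_400
print(f"Single GPU: {gpu_days:,.0f} GPU-days; 1,024-GPU cluster: {gpu_days / 1024:,.1f} days")

Under these assumptions, a single such training run would occupy one GPU for roughly nine thousand days, or a thousand-GPU cluster for over a week, which is precisely the kind of demand that has lifted Nvidia.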

The chase now is for cheaper, more targeted chips and innovative methods of training. Innovation does not necessarily mean better for the end user. For example, improving efficiency by enhancing machine-based learning with synthetically generated data, as opposed to open data, is unlikely to improve the comprehensiveness of the model.

There is also a notable shift to cheaper, smaller models targeted specifically at defined users or domains. These are easier to train and adapt. Adding to the spectrum is the need to fine-tune models for very specific functionality.

This has led to a cheaper approach, RAG or Retrieval-Augmented Generation: a method of connecting models with external data sources that is more economical than the earlier route of fine-tuning.
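In outline, and stripped of any particular vendor's tooling, the idea looks something like the minimal sketch below. The document store, relevance scoring and prompt format are illustrative stand-ins, not any specific product's method:

# Minimal sketch of Retrieval-Augmented Generation (RAG): instead of
# fine-tuning a model on new material, retrieve the most relevant documents
# at query time and pass them to the model as context.
# The document store, scoring and prompt format are illustrative stand-ins.

def score(query: str, doc: str) -> int:
    # Toy relevance score: shared words (real systems use vector embeddings
    # and a similarity-search index).
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

documents = [
    "Nvidia designs GPUs used to train neural networks.",
    "TSMC fabricates advanced chips in Taiwan.",
    "Data centres consume large amounts of power.",
]

# The assembled prompt would then be sent to a language model of one's choice.
print(build_prompt("Who fabricates advanced chips?", documents))

Because the external documents are supplied at query time, they can be updated or swapped without retraining the model, which is where the economy over fine-tuning comes from.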

The heart of all this, of course, is data, and big data: both open, and vaulted within companies and institutions. There is serious concern that the big AI players are running out of unused data. The text and image data that has fired the AI revolution is estimated to dry up by 2028.

The appetite for data in the AI spectrum is gargantuan. There are also copyright issues, such as the New York Times suing OpenAI and Microsoft for the use of its articles and content over time. Custodians of images have also taken up the issue with AI companies.

Future data will only come at a high cost, with a cascading financial effect. Custodians of vaulted data may opt to develop their own smaller, focused AI options. Further, a great deal of data has simply been exploited and exhausted.

The other anchor in this ecosystem is data centres. The scramble to get them going, to meet the ever-expanding needs of AI, has generated its own set of anxieties. It takes up to two years to set up a large new data centre, and modern data centres need extensive planning and complex supply chain management.

Billions of US dollars are being invested in them, with fallout on real estate, power, component shortages and data connectivity. The enormous power requirements have driven the search for locations with adequate yet cheap power, such as in Africa. Innovatively, data centres are being built in containers that can be delivered and set up anywhere.

Far from delivering the knowledge of the world to all, AI companies are amassing huge profits, developing enormous dependencies and ingesting resources. On offer now is immortality through ghostbots and deadbots, all the way up to the replacement of human reasoning, logic and artistic expression. There is also the ever-dedicated world of medicine, science and consumer goods, where AI assumes a far more nuanced identity.

We need to reflect on the central issue raised by Isaac Asimov in his book 'I, Robot': that of developing AI without structured boundaries. The more granular aspects of emerging AI also need a look.

Our changed consciousness, as well as mutated human skill sets, deserves deliberation. This, and many other aspects of the dance of AI as it entices humanity, will be central to our journey in the next issues.

Lt General Sanjiv Langer PVSM, AVSM, is a former Member of the Armed Forces Tribunal and former Deputy Chief of Integrated Defence Staff. Views expressed here are the writer's own.
