4 March 2024
With its potential to support and advance life-critical industries such as medical diagnostics, drug development and green technology, Generative Artificial Intelligence (GenAI) might be considered the ultimate ‘game of life’. Which is fitting, given that its roots are firmly embedded in video gaming history.
The early days - Tic Tac Toe to Super Mario
That story began in the 1950s with the likes of Tic Tac Toe and Nim, then the Bernstein Chess Program, which broke ground with a then-giddying throughput of 42,000 instructions per second and an eight-minute response time.
Such was the pace of progress thereafter that, in 1965, early IT pioneer Gordon Moore predicted that the number of transistors on a microchip would double every two years over the decade ahead, with computing power growing exponentially even as costs fell. Astonishing as it seemed at the time, ‘Moore’s Law’ has held broadly true - some 60 years on.
The intervening decades brought innovations such as the 1980s Nintendo Entertainment System - think Super Mario - with its 8-bit graphics, colour, sound and gameplay; and in 1994, Donkey Kong Country became one of the first mainstream console games to use pre-rendered 3D graphics.
Then came the ‘GPU’ (Graphics Processing Unit), and things got really interesting.
Rise of the 'Metaverse'
Instead of processing tasks serially like Central Processing Units (CPUs), GPUs break them up and run them in parallel, completing the overall job much faster. For gamers this meant a far higher-quality, more immersive experience.
Such advances gave rise to avatar-based ‘metaverse’ games like Minecraft, Roblox and Fortnite, as well as ‘real experience’ titles like Microsoft Flight Simulator which forged new frontiers in the virtual clouds, and PlayStation’s Gran Turismo which aimed to deliver a ‘real driving performance’.
Along the way, GPUs evolved with thousands of processing cores to handle hundreds of millions of instructions per second. This catapulted gaming tech to a US$100bn global industry – and beyond the gaming bubble.
Most notably, ‘crypto miners’ began to use GPUs to solve the highly complex mathematical puzzles used to validate electronic transactions of cryptocurrency. By 2021 miners were buying some 25% of all GPUs manufactured – and disrupting the banking sector.
Now, GenAI is fast building on this legacy to shake up many more industries. According to Forbes, it will mushroom into a US$400bn+ market by 2027.
And it will need an awful lot more GPUs.
Quality control
Every revolution has a grinding reality check and, for GenAI, it’s quality control in GPU production - no easy task, given the chips’ increasing complexity as well as demand. Now comprising up to 80bn transistors spaced a hundredth of a human hair’s width apart, GPUs can suffer from overheating and ‘crosstalk’ (signal interference between neighbouring transistors), which affects the speed and clarity of data transmission.
Before shipping, microprocessors such as CPUs and GPUs are checked for imperfections by ‘test sockets’, which gauge their signal integrity, speed and accuracy. Crucially, these consist of an array of spring probes, or pins, which electrically connect the microprocessor to the test system, mimicking the effect of soldering in the finished product. The probes are typically shielded by a plastic housing, or ‘insulator case’, to prevent crosstalk between the signals being tested.
Traditional test sockets have struggled to test GPUs reliably. Most commonly, the plastic insulator could not sufficiently isolate the vast number of signals transmitted by a modern GPU, compromising transmission accuracy, consistency and quality. The result was either a falsely high ‘fail’ rate, leaving output struggling to keep up with demand, or false ‘pass’ results that generated a high level of returns.
A genius move
This threatened the growth not only of gaming but of every new sector champing at the GPU bit. But true to their pioneering heritage, Smiths Interconnect’s engineers broke the stalemate with the launch of the patented DaVinci Macro high-performance coaxial socket, designed specifically to test high-performance microprocessors like GPUs.
Instead of using a plastic insulator case, DaVinci’s probes are housed in an innovative ‘insulated metal’ developed in-house. This delivers several significant advantages: it isolates each signal to protect its integrity while ensuring full connectivity between the GPU and the test system, allows signals to transmit consistently at a manufacturer’s intended rates, and increases spring-probe durability.
Combined, this enabled complete and reliable volume testing. It was such a game-changer that ‘DaVinci’ quickly became a byword for quality control amongst GPU manufacturers. Indeed, chances are the graphics card in the device on which you’re reading this has been DaVinci-tested.
What the future holds
And in future, so might your increasingly digitised world – because our next release, DaVinci Gen V, is a GPU test socket specifically designed to handle the superfast transmission speed (224 Gbps) required for GenAI applications as well as to withstand the production pace the GenAI revolution will demand.
As it progressively integrates with our lives, GenAI will undoubtedly have many hurdles to overcome. But, thanks to its gaming heritage and pioneers like Smiths, GPU testing won’t be one of them.