Tedium - No Shortage of GPUs Here 🖥

Understanding what makes a GPU a GPU.

Hunting for the end of the long tail • March 26, 2021

Hey all, Ernie here with a piece from Andrew Egan, who has a story to share about GPUs, the chips whose economics are really annoying a lot of gamers right now.

Today in Tedium: Names change. Perhaps the most jarring element of the recent, widely reported “alien” activity isn’t so much the descriptions of boomless supersonic flight but that UFOs (unidentified flying objects) are now called UAPs (unidentified aerial phenomena). Companies rebrand; Google became Alphabet and the Washington Football Team decided that was a good idea. With technology, terminology tends to become antiquated as industries progress beyond understanding their own achievements. Today’s Tedium looks at how the GPU acronym changed meaning (it now stands for graphics processing unit), how it heralded a new era of computing applications, and how it ended up frustrating an obvious customer base. — Andrew @ Tedium

Before we get going, be sure to check out our sponsor below:


Ad from Tektronix Interactive Graphics, an alleged forebear of GPU technology. Or was it?

Most of the history of GPUs doesn’t really count

Computers compute numbers; most people probably understand that. Using computers in a way that feels “intuitive” has generally meant using some type of graphics. Or, more importantly, making the numbers look pretty. (Now you know why your math teacher wanted you to show your work. Same principle.)

How mainstream consumers came to expect graphical interfaces when working with computers is a long, fascinating history covered by many books and at least one excellent made-for-TV movie. On the backend, however, the story was just beginning.

While Windows and Apple were gaining acceptance for their point-and-click interfaces, hardcore computer users, i.e. gamers, needed those points and clicks to register a lot faster. They also wanted the graphics to look more realistic. Then they started asking for features like online multiplayer, instant chat, and a slew of other things we expect nowadays but that seemed like a lot in the early to mid-1990s.

One could attribute the rise of technically demanding gameplay to the 1993 classic Doom, as well as its successor Quake, which drove consumer interest in dedicated graphics cards. However, after talking with an expert who has watched the field develop from the beginning, the history of GPUs just isn’t that simple.

Dr. Jon Peddie first got involved in the computer graphics industry in the 1960s when he was part of a team that made 3D topographic maps from aerial photography, leading to the creation of his company, Data Graphics. By the early 1980s, he was considering retirement and a career writing sci-fi (sounds nice) when he noticed an explosion in the field that was hard to ignore. Practical applications for high performance graphics were initially driven by CAD and GIS companies, though the video game explosion of the ’80s would change that.

“Gaming was (and still is) the driver because [of] the volume of the customers,” Peddie said in an email. “The other users of 3D and GPUs were engineering (CAD, and molecular modeling), and the movies. But that market had (in the ’80s and ’90s) maybe 100k users total. Consumer 3D had millions. But, the pro market would pay more—thousands to tens of thousands, whereas the consumer would pay a few hundred. So the trick was to build enough power into a chip that could, in a final product, be sold for a few hundred.”

At this point in computing history, the acronym GPU had already been introduced into the tech lexicon. This blast-from-the-past article from a 1983 edition of Computerworld details the Tektronix line of graphics terminals. But if you look a little closer, GPU didn’t yet stand for “graphics processing unit.” Instead, this iteration stood for “graphic processor unit.” Small difference, but this is Tedium, so it must have a big impact, right? No. So is there even a difference?

“None,” Peddie explains. “Tense at best case. English is not the first language for a lot of people who write for (on) the web.”

Okay, fair enough. But this isn’t actually the problem, or even the interesting element of GPU history to consider, Peddie points out. It’s the fact that before 1997, the GPU didn’t actually exist, even if the acronym was being used. A proper GPU, it turns out, requires a transform and lighting (T&L) engine.

“Why shouldn’t, couldn’t, a graphics chip or board developed before 1997 be called a GPU?” Peddie asks. “It does graphics (albeit only in 2D space). Does it process the graphics? Sure, in a manner of speak[ing]. It draws lines and circles—that’s processing. It repositions polygons on the screen—that’s processing. So the big distinction, that is a GPU must do full 3D (and that requires a T&L).”
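For the curious, here’s roughly what that T&L distinction means in practice. The sketch below is a minimal, purely illustrative bit of Python (the function names, matrix, and values are made up for this example, not anything from 3Dlabs or Nvidia) showing the two per-vertex jobs a fixed-function transform and lighting engine performs: multiplying each vertex by a 4x4 matrix to place it on screen, and computing simple diffuse lighting from a light direction. A true GPU, by Peddie’s definition, does this math in dedicated silicon for millions of vertices per second instead of leaving it to the CPU.

```python
# Illustrative sketch only: the per-vertex math a fixed-function transform and
# lighting (T&L) engine handles. Real hardware does this in silicon, not Python.
import math

def transform_vertex(vertex, matrix):
    """The 'T': multiply a 3D vertex (as homogeneous [x, y, z, 1]) by a 4x4 matrix."""
    x, y, z = vertex
    v = [x, y, z, 1.0]
    out = [sum(matrix[row][col] * v[col] for col in range(4)) for row in range(4)]
    w = out[3] if out[3] != 0 else 1.0
    return [out[0] / w, out[1] / w, out[2] / w]  # perspective divide

def light_vertex(normal, light_dir, base_color):
    """The 'L': Lambertian diffuse shading, brightness = max(0, N . L)."""
    n_len = math.sqrt(sum(c * c for c in normal)) or 1.0
    l_len = math.sqrt(sum(c * c for c in light_dir)) or 1.0
    n = [c / n_len for c in normal]
    l = [c / l_len for c in light_dir]
    diffuse = max(0.0, sum(nc * lc for nc, lc in zip(n, l)))
    return [channel * diffuse for channel in base_color]

# One hypothetical vertex pushed through the "T" and the "L" stages.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
position = transform_vertex((1.0, 2.0, 3.0), identity)
color = light_vertex(normal=(0, 0, 1), light_dir=(0, 0, 1), base_color=(1.0, 0.2, 0.2))
print(position, color)
```

A 2D chip that just draws lines and moves polygons around skips all of this, which is the whole point of Peddie’s distinction.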

Ultimately, like much of tech history, the story quickly becomes about competing claims between an industry leader and a forgotten innovator.


The Glint 3D graphics chip by 3Dlabs, arguably the first company to produce a true GPU. Largely used for “high-end 3D CAD applications,” it was released in November 1994 (still not the first “real” GPU, but still a cool graphics chip). Though first to market, 3Dlabs would not enjoy the economies of scale available to competitors like Nvidia.

Bragging rights are claimed by the winners

Let’s get this out of the way since it’s a common mistake. The first PlayStation was not the first mass market GPU. That belief comes from the powerful marketing efforts of Sony and Toshiba. As Peddie explains, “The original PlayStation [had] a geometry transformation engine (GTE), which was a co-processor to a 2D chip that was incorrectly labeled (by marketing) as a GPU.”

Marketing is a big element in this era of GPUs, which came just before true GPUs actually arrived. The breakthrough for a true 3D GPU was on the horizon, and plenty of companies wanted to get there first. But the honor would go to a little outfit from the UK imaginatively called 3Dlabs. The specific innovation that earned 3Dlabs the title of first accurately named GPU was its development of a two-chip graphics processor that included a geometry processor, known as a transform and lighting (T&L) engine. Compared with its competitors, 3Dlabs focused on the CAD market, though it was trying to make inroads with the larger consumer market by partnering with Creative Labs.

A technology demo highlighting the 3Dlabs Glint chipset.

The smaller size and professional focus of 3Dlabs meant there were still plenty of “firsts” to be had in the consumer GPU market.

The graphics-card sector was incredibly busy during this period, with one-time big names such as Matrox, S3, and 3Dfx competing for mindshare among Quake players.

But the winners write the history books, and a dominant player emerged during this period. By late 1999, Nvidia was ready to release the first mass consumer GPU with integrated T&L, known as the GeForce 256.

“That, by Nvidia’s mythology, was the introduction of the GPU, and they claim the invention [of it],” Peddie explains. “So you can slice and dice history as you like. Nvidia is at $10 billion on its way to $50 billion, and no one remembers 3Dlabs.”

(Side note: Nvidia is, and always will be, a noun and not an acronym, despite the widespread belief that it is one.)

Pretty soon, the market would be loaded with competing GPUs, each aiming at its own particular niche. Canadian manufacturer ATI Technologies, which was later purchased by Nvidia’s biggest competitor, AMD, attempted to differentiate its entry into the market by calling its GPU a VPU, or visual processing unit, even though the two were the same thing. The effort didn’t last.

“ATI gave up, they couldn’t stand up to Nvidia’s superior (and I mean that) marketing skills, volume, sexiness, and relentless push,” Peddie says.

By the early 2000s, major players like Nvidia had come to dominate the consumer market, quickly becoming villains to gamers everywhere. Interestingly enough, that market consolidation helps explain exactly why high-end graphics cards are so hard to find nowadays.

Behold! The Nvidia RTX 3080, one of the most coveted items in the world. And it’s not even the top of the line.

So who do we blame for that GPU shortage, anyway?

If you’ve gotten this far into an article on GPU history and naming conventions, I bet you’re wondering when I’m going to get to the Great GPU Shortage of 2020 (and probably beyond).

For those who don’t know what I’m talking about, the gist is this: the price of higher-end GPUs has exploded in recent months, if you can even find them.

For example, the folks over at Nvidia have three models of graphics cards that are generally sought after by gamers:

  • RTX 3090: MSRP $1,499

  • RTX 3080: MSRP $699

  • RTX 3070: MSRP $499

The individual merits of these models can be (and very much are) debated relative to their price points and performance. However, scarcity has sent the resale market for these GPUs through the roof. Current listings price the middle-tier RTX 3080 at $1,499, while the 3090 and 3070 are nearly impossible to find. One listing for a 3090 on eBay was over $3,000 at the time of writing.

The AMD line of graphics cards also deserves a mention here. Though not as highly sought after because, traditionally, they haven’t been as powerful, AMD’s cards have nonetheless been affected by the supply chain limitations in GPU manufacturing. Like the Nvidia line, the AMD RX 6700, 6800, and 6900 models have seen similar price spikes on the secondary market, with most fetching more than twice their original prices.

(Ed note from Ernie: Some fun context here—my old Xeon has a refurbished AMD RX 570, a card I paid slightly more than $100 for on Newegg in mid-2019. That same card, which is basically a budget model and was already a little old when I purchased it, currently sells for $599 on the same site.)

Clearly there is heavy demand, and capitalism is usually pretty good at filling that gap. Like many things that went wrong in 2020, a good bit of the blame is being placed on COVID-19. Manufacturing hubs in China and Taiwan, along with much of the world, had to shut down. While much of the work in hardware manufacturing can be automated, the delicate nature of GPUs requires some degree of human interaction.


A flow chart describing the current shape of the GPU industry. (courtesy Dr. Jon Peddie)

Still, this explanation oversimplifies trends that had been building in the graphics industry long before COVID-19 hit. Again, I’ll let Dr. Peddie explain:

About 15 [plus] years ago, the manufacturing pipeline was established for GPU manufacturing (which includes sourcing the raw silicon ingots), slicing and dicing the wafers, testing, packaging, testing again and finally shipping to a customer. All the companies in the pipeline and downstream (the OEM customers who have a similar pipeline) were seeking ways to respond faster, and at the same time minimize their inventory. So, the JIT (just in time) manufacturing model was developed. This relied on everyone in the chain providing accurate forecasts and therefore orders. If one link in the chain broke everyone downstream would suffer … When governments shut down their countries all production ground to a halt – no parts shipped—the pipeline was broken. And, when and if production could be restarted, it would take months to get everyone in sync again.

At the same time people were being sent home to work, and they didn’t have the tools needed to do that. That created a demand for PCs, notebooks especially. [Thirty to forty percent] of PCs have two GPUs in them, so the demand for GPUs increased even more.

And then [crypto] coins started to inflate … Now the miners (people who use GPUs to monitor and report …) were after every and any GPU they could get their hands on. That caused speculators to buy all the graphics boards and offer them at much higher prices.

So, the supply line got hit with a 1-2-3 punch and was down for the count.

And that was him keeping a long story short. To put it plainly, companies that make GPUs were operating with a thin margin of error and no ability to predict the future. And that applies mostly to the general market for GPUs; it only tangentially addresses the higher-end customers.

Another point of frustration to add here was the unfortunate timing of the latest generation of video game consoles in 2020, which also meant a new generation of video games. The highly anticipated PlayStation 5, along with Cyberpunk 2077, was met with numerous supply and technical issues upon launch. Cyberpunk players reported inconsistent experiences, largely dependent on the hardware the game was being played on. On the differences between the game on a PS4 and a PS5, one YouTuber commented, “At least it’s playable on PS5.”

While Dr. Peddie expects the shortage to self-correct by the first quarter of 2022 (hooray …), he is not optimistic about the industry avoiding such missteps in the future.

“The [next] problem will be double-ordering that is going on now and so we have the prospect of a giant slump in the semi market due to excess inventory,” he concludes. “Yin-yang—repeat.”

There is a lot to learn from history, even if it’s fairly recent. While it might be tempting to write this off as a market failure to meet demand, the story is obviously more complicated. Though billions of people rely on GPUs on a daily basis, the highest-performance hardware is left to a few with niche interests.

Still, the larger market should pay attention to frustrated gamers, at least on this point. Their needs push the industry toward innovations that become standard in more common devices. With each iteration, devices gain a little more of that advanced graphics capability as it trickles down to people who hadn’t noticed it before but now expect it.

After all, if it doesn’t have painstakingly realistic 3D graphics, can we even call it a phone anymore?

--

Find this one an interesting read? Share it with a pal!

Oh, and a lot of appreciation to Dr. Jon Peddie for his time and his insight. Quite conveniently, he is also working on a book called “The History of GPUs,” which will go into much greater detail about this fascinating slice of computing history.

And thanks again to our sponsor.


Copyright © 2015-2021 Tedium, all rights reserved.

Disclosure: From time to time, we may use affiliate links in our content—but only when it makes sense. Promise.

