
BPF Book Review of “Chip War” by Chris Miller


‘Big data’ is the world we live in now. Powerful companies like Amazon, Google, Microsoft and Facebook are the current masters of the ‘big data’ universe. With their hyper-scale software platforms, these behemoths dominate today’s economy. But software fortunes that bloom into the trillions need cutting-edge hardware to slice, dice and process the tsunamis of data that spew in, through and out of our iPhones daily. The fount of that cutting-edge hardware is the engineering, physics, chemistry and lithography expertise housed in the clean rooms of multi-billion-dollar semiconductor fabrication plants (fabs). The top-tier fabs are now strategic assets at the highest levels of global economics and politics.

In Chip War – The Fight for the World’s Most Critical Technology, author Chris Miller masterfully recounts the epic journey of the transistor from rudimentary invention in the late 1940s to today’s system-on-a-chip with billions of transistors that are, as the book’s subtitle suggests, “the world’s most critical technology.” Chip War is a technology history book that also provides keen insight into the rising economic confrontation between America and China.

It is an extraordinary tale with many twists and turns and a cast of eclectic characters armed primarily with slide rules and pocket protectors. Included are inventors, entrepreneurs, outside-the-box thinkers, refugees fleeing communism (Morris Chang from Shanghai in 1948 and Andy Grove from Hungary in 1956), marketers, purveyors of capital and, lastly, a marketplace hungry for more speed, cheaper prices, smaller footprints and new products.

The journey began with the invention of the transistor, for which William Shockley, John Bardeen and Walter Brattain received the Nobel Prize in 1956. Shockley had rightly concluded that a semiconducting material like silicon would be an excellent substrate for a transistor. In 1958, Jack Kilby invented the integrated circuit – multiple transistors on one chip – for which he also received the Nobel Prize. The first market for the integrated circuit was replacing the expensive and clunky vacuum tubes that served as switches in the hulking, room-sized computers of the 1950s and early 60s. Over time, chips would change everything about computing – its price, its performance, its size and its ubiquity.

In 1965, Gordon Moore, co-founder of Intel, made his famous prediction of an annual doubling of transistors per chip for the decade ahead. In 1975, he revised it to a doubling every two years. Moore’s insight, which few others appreciated, was that as transistors shrank in size they would use less power, be more reliable and fall in price.
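The compounding power of that two-year doubling is easy to underestimate. A quick back-of-the-envelope sketch (my own illustration, not from the book) projects transistor counts from a roughly 2,300-transistor chip of the early-1970s era:

```python
def moores_law(start_count, start_year, year, period=2):
    """Project transistors per chip, assuming a doubling every `period` years."""
    doublings = (year - start_year) / period
    return start_count * 2 ** doublings

# Starting point: ~2,300 transistors in 1971 (about what Intel's first
# microprocessor carried); the exact figure matters less than the curve.
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{moores_law(2_300, 1971, year):,.0f}")
```

Fifty years of doublings carries that 2,300 past the tens of billions, which is why Miller can speak of today’s system-on-a-chip with billions of transistors.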

Moore’s Law was prescient, and it set the trajectory of the semiconductor industry for half a century. Such enormous productivity gains, year after year, decade after decade, have never been experienced before or since, and these gains created unique business challenges for the fledgling industry. If capacity was growing exponentially and prices fell in step, markets would need to grow rapidly to avoid collapses in revenue for the producers. New markets had to be identified to absorb the ever-expanding output.
