The chips at the heart of our digital devices are manufactured by a few large companies, but an open-source approach to design could end their dominance – with implications for everyone
ON THE one hand was a perfect storm of problems hampering supplies: a global pandemic, a trade war, a blaze at a manufacturing plant and atrocious weather including drought and snowstorms. On the other was the unprecedented demand for one of the world’s most sought-after products – a market worth $40 billion in January this year alone.
Even so, the news earlier this year that there was a global shortage of computer chips, pushing up the price of everything from laptops to fridges, took a lot of people by surprise. The crisis has seen nations and companies around the world scrambling to establish more chip-building capacity. It has also shone a light on an industry whose products have become ubiquitous and essential, leaving many to conclude that it is due a fundamental overhaul. For critics, the stranglehold that just a few firms have on the market acts as a brake on innovation and increases the industry's vulnerability to disruption.
Some big names are now backing an alternative model – taking the collaborative, open-source principles that have changed the way software is written and applying them to chips. For its proponents, it is only a matter of time before this becomes the new standard. But when the chips are down, does it have what it takes – and what does it mean for us?
Computer chips are everywhere – not just in our laptops, desktops and smartphones. They are in the myriad mysterious servers that run our webmail, online bank and other digital services we use daily. They are in many microwaves, televisions, washing machines and watches. The average new car today contains hundreds of them. …