The tech giants’ latest machine-learning system comes with both ethical and environmental costs
Unless you’ve been holidaying on Mars, or perhaps in Spain (alongside the transport secretary), you will have noticed some fuss on social media about something called GPT-3. The GPT bit stands for the “generative pre-training” of a language model that acquires knowledge of the world by “reading” enormous quantities of written text. The “3” indicates that this is the third generation of the system.
GPT-3 is a product of OpenAI, an artificial intelligence research lab based in San Francisco. In essence, it’s a machine-learning system that has been fed (trained on) 45 terabytes of text data. Given that a terabyte (TB) is a trillion bytes, that’s quite a lot. Having digested all that stuff, the system can then generate all sorts of written content – stories, code, legal jargon, poems – if you prime it with a few words or sentences.
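For readers curious what “priming” looks like in practice, here is a minimal sketch using the `openai` Python package as it existed around GPT-3’s launch; the “davinci” engine name, the `Completion.create` call and the environment-variable key are assumptions about that 2020-era API, not details taken from this article.

```python
# A hedged sketch of priming GPT-3 with a few words and asking it
# to continue the text. Requires an OpenAI account and API key.
import os

# The "prompt" is the priming text the article describes: a few
# words or sentences the model continues.
prompt = "Once upon a time, in a research lab in San Francisco,"

if os.environ.get("OPENAI_API_KEY"):
    import openai  # pip install openai

    openai.api_key = os.environ["OPENAI_API_KEY"]
    response = openai.Completion.create(
        engine="davinci",   # assumed name of the GPT-3 base model
        prompt=prompt,      # the priming text
        max_tokens=50,      # cap the length of the continuation
    )
    # Print the model's generated continuation of the prompt.
    print(response.choices[0].text)
else:
    print("Set OPENAI_API_KEY to send the completion request.")
```

The same priming idea underlies the stories, code and poems the article mentions: only the prompt changes, not the model.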