Making AI One Million Times More Efficient: Exploring the Possibility of Neuromorphic Computing

ChatGPT burns millions of dollars every day. Can computer scientists make AI one million times more efficient?

ChatGPT is expensive to run, costing millions of dollars per day. OpenAI, the company behind the natural-language artificial intelligence, has launched ChatGPT Plus, a $20/month subscription. Our brains are roughly one million times more efficient than the GPUs and CPUs in ChatGPT’s cloud hardware. Researchers in neuromorphic computing are working to bring the miracles of cloud-based server farms to small devices, such as those in our homes, hospitals and offices.
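
To see what that ratio implies, consider a back-of-envelope sketch. The ~20 W figure for the brain’s power budget is a commonly cited estimate; the one-million ratio is simply taken at face value from the claim above, not measured here:

```python
# Back-of-envelope: what "one million times more efficient" implies.
# The ~20 W brain power budget is a commonly cited estimate; the 1e6
# ratio is taken at face value from the article. Assumptions, not data.

BRAIN_POWER_WATTS = 20          # commonly cited estimate for the human brain
EFFICIENCY_RATIO = 1_000_000    # the claimed brain-vs-silicon advantage

# Power a conventional GPU/CPU system would need to match the brain,
# if the claimed ratio held for the same workload:
equivalent_silicon_power = BRAIN_POWER_WATTS * EFFICIENCY_RATIO
print(f"{equivalent_silicon_power / 1e6:.0f} MW")  # -> 20 MW
```

Twenty megawatts is on the order of a small power station, which is roughly why this class of workload lives in server farms today rather than on a desk.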

One of the keys is modeling computing hardware on the “wetware” found in the human brain.
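
What does brain-style hardware actually compute with? Most neuromorphic designs replace clocked matrix multiplies with spiking neurons that only do work when events arrive. Below is a minimal leaky integrate-and-fire (LIF) neuron, the textbook model behind many such designs; this is a generic illustration, not the scheme of any specific chip, and all constants are arbitrary choices:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a textbook building block
# for neuromorphic designs. Generic illustration only; constants are arbitrary.

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate one LIF neuron; returns the membrane trace and spike times."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates input current.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_threshold:        # threshold crossed: emit a spike
            spikes.append(step)     # event-driven hardware only "pays" here
            v = v_reset             # reset after spiking
        trace.append(v)
    return trace, spikes

# Constant drive pushes the neuron over threshold, after which it resets:
trace, spikes = simulate_lif([1.5] * 200)
print(f"spikes at steps: {spikes[:5]} ...")
```

The efficiency argument is visible in the control flow: between spikes nothing happens, so event-driven silicon can sit idle, whereas a GPU clocks through dense multiply-accumulates regardless of whether anything changed.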

Dylan Patel and Afzal Ahmed of SemiAnalysis say that “inference costs are far greater than training costs” when deploying a model at any reasonable scale. In fact, on a weekly basis, ChatGPT’s inference costs surpass its training costs. If LLMs like ChatGPT are deployed in search, “this represents a transfer of $30 billion of Google’s profits into the hands of the picks-and-shovels of the computing industry.”
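
The shape of that claim is easy to reproduce with placeholder numbers. Every figure below is a hypothetical assumption chosen only to illustrate the structure of the argument; none of them is a SemiAnalysis estimate:

```python
# Why inference dwarfs training at scale: training is a one-time cost,
# inference scales with traffic. All numbers are hypothetical placeholders.

training_cost = 10e6        # one-time training run, USD (assumed)
cost_per_query = 0.01       # serving cost per request, USD (assumed)
queries_per_day = 200e6     # daily traffic (assumed)

daily_inference = cost_per_query * queries_per_day     # $2M per day
weekly_inference = daily_inference * 7                 # $14M per week

print(f"weekly inference ${weekly_inference/1e6:.0f}M "
      f"vs one-time training ${training_cost/1e6:.0f}M")
# Under these assumptions a single week of serving already exceeds the
# entire training bill: the article's point in miniature.
```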

Run the numbers, as they did, and the implications are staggering.

They write: “To deploy current ChatGPT in every Google search, we would need 512,820 HGX A100 servers, with a total of 4,102,568 GPUs.” The total cost of those servers and their networking exceeds $100 billion, a significant portion of which would go to Nvidia.
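
Those figures can be sanity-checked. An Nvidia HGX A100 board carries 8 GPUs (a standard spec); the per-server cost below is simply reverse-engineered from the article’s $100 billion floor, not a price quote:

```python
# Sanity-checking the quoted deployment numbers. The 8-GPU HGX A100
# configuration is a known spec; per-server cost is back-calculated
# from the article's ">$100 billion" total, not a real price.

servers = 512_820
gpus_per_hgx_a100 = 8                  # standard HGX A100 configuration
total_gpus = servers * gpus_per_hgx_a100
print(f"{total_gpus:,}")               # 4,102,560 ~ the quoted 4,102,568
                                       # (small gap is the source's rounding)

total_capex = 100e9                    # article's ">$100 billion" floor
implied_cost_per_server = total_capex / servers
print(f"${implied_cost_per_server:,.0f} per server incl. networking")  # ~$195,000
```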

Source:
https://www.forbes.com/sites/johnkoetsier/2023/02/10/chatgpt-burns-millions-every-day-can-computer-scientists-make-ai-one-million-times-more-efficient/?ss=ai&sh=13383a06944e
