According to OpenAI's analysis, the amount of computational power needed to train large AI models has grown massively, doubling roughly every 3.4 months since 2012. GPT-3, whose training required about 3.14E23 FLOPs of compute, is a good case in point. Typically, to carry out high-performance computing tasks, conventional AI chips are…
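As a back-of-the-envelope sketch of what that doubling rate implies (the 3.4-month figure is from OpenAI's "AI and Compute" analysis; the yearly growth factor below is derived arithmetic, not a quoted number):

```python
# If training compute doubles every 3.4 months, how fast does it grow per year?
doubling_period_months = 3.4
doublings_per_year = 12 / doubling_period_months   # ~3.53 doublings per year
growth_per_year = 2 ** doublings_per_year          # ~11.5x compute per year

# GPT-3's reported training budget, in FLOPs.
gpt3_train_flops = 3.14e23

print(f"~{doublings_per_year:.2f} doublings/year -> ~{growth_per_year:.1f}x compute per year")
print(f"GPT-3 training compute: {gpt3_train_flops:.2e} FLOPs")
```

At roughly an order of magnitude more compute per year, it is easy to see why cost and energy, the subject of this piece, become the binding constraints.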

The post Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? appeared first on Analytics India Magazine.