No Free Lunch Theorem in Quantum Computing
Shraddha Goled | 21 Mar

In machine learning, the no-free-lunch theorem states that all optimisation algorithms perform equally well when their performance is averaged over all possible problems and training data sets.
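For reference, one common formal statement of this result is the Wolpert–Macready formulation; the notation below (the two algorithms $a_1$ and $a_2$, the objective function $f$, the number of evaluations $m$, and the observed sequence of cost values $d_m^y$) follows their paper rather than anything in this article:

\[
\sum_{f} P\!\left(d_m^y \mid f, m, a_1\right) \;=\; \sum_{f} P\!\left(d_m^y \mid f, m, a_2\right)
\]

In words: when the probability of observing any particular sequence of cost values after $m$ evaluations is summed over every possible objective function, no optimisation algorithm comes out ahead of any other.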