I doubt it. Modern computers are already pretty fast at brute-forcing all sorts of things, and the first AGI will be perfectly functional even if it takes a full day to respond. Besides, AGI research requires you to fully understand what you're creating before you create it, so full-system testing is something you'd only want to do near the end of the project anyway. At that point you basically want an ASIC.
All the sci-fi seems to think this. I have seen Intel chips from the 80s that implemented neuron-like behavior on a chip.
We are still waiting on a practical memristor.