I read here, in "What algorithms are best be cracked by GPU?", that hashing algorithms such as SHA1, SHA224, and SHA256, which do 32-bit integer arithmetic and logical operations, are better suited to GPUs than the SHA512 hashing algorithm, which works on 64-bit integers.
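To make the word-size difference concrete, here is how I understand the contrast (just an illustration of the Σ0 round function from FIPS 180-4 for each family, not an optimized implementation):

```cuda
#include <cstdint>

// SHA-256 operates on 32-bit words: its round functions are rotations,
// XORs, shifts, and additions on uint32_t values.
__host__ __device__ uint32_t rotr32(uint32_t x, uint32_t n) { return (x >> n) | (x << (32 - n)); }
__host__ __device__ uint32_t sha256_Sigma0(uint32_t x)      { return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22); }

// SHA-512 has the same structure, but every word is 64 bits, so each
// rotation/XOR/addition is a 64-bit operation.
__host__ __device__ uint64_t rotr64(uint64_t x, uint32_t n) { return (x >> n) | (x << (64 - n)); }
__host__ __device__ uint64_t sha512_Sigma0(uint64_t x)      { return rotr64(x, 28) ^ rotr64(x, 34) ^ rotr64(x, 39); }
```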
What is the reason behind this? Is it that GPUs do not provide good support for working with 64-bit integers?
I read the question "Does CUDA support long integers?" in the Nvidia CUDA Developer's FAQ.
The answer is yes; however, it says that operations on 64-bit integers compile to multiple instruction sequences. Is that the reason why GPUs can handle 32-bit integers much better than 64-bit integers?
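Here is a rough sketch of what I imagine "multiple instruction sequences" means on hardware with 32-bit integer ALUs; I'm not claiming this is exactly what nvcc emits (I believe the real lowering uses carry instructions, something like add.cc.u32 / addc.u32 in PTX), but the point is that one 64-bit add becomes several 32-bit operations:

```cuda
#include <cstdint>
#include <cstdio>
#include <cuda_runtime.h>

// Illustration only: a 64-bit add decomposed into 32-bit operations.
__device__ uint64_t add64_via_32(uint64_t a, uint64_t b)
{
    uint32_t a_lo = (uint32_t)a, a_hi = (uint32_t)(a >> 32);
    uint32_t b_lo = (uint32_t)b, b_hi = (uint32_t)(b >> 32);

    uint32_t lo    = a_lo + b_lo;             // add the low halves
    uint32_t carry = (lo < a_lo) ? 1u : 0u;   // detect overflow of the low half
    uint32_t hi    = a_hi + b_hi + carry;     // add the high halves plus the carry

    return ((uint64_t)hi << 32) | lo;         // one 64-bit add -> several 32-bit ops
}

__global__ void demo(uint64_t a, uint64_t b, uint64_t *out)
{
    out[0] = add64_via_32(a, b);  // emulated by hand
    out[1] = a + b;               // source-level 64-bit add, lowered by the compiler
}

int main()
{
    uint64_t *d_out, h_out[2];
    cudaMalloc(&d_out, 2 * sizeof(uint64_t));
    demo<<<1, 1>>>(0xFFFFFFFFULL, 1ULL, d_out);
    cudaMemcpy(h_out, d_out, 2 * sizeof(uint64_t), cudaMemcpyDeviceToHost);
    printf("emulated: %llx, native: %llx\n",
           (unsigned long long)h_out[0], (unsigned long long)h_out[1]);
    cudaFree(d_out);
    return 0;
}
```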
More information on this would be helpful. Also, if there are any references that discuss this in more depth, I would like to know about them.