Simple question: Both MD5 and SHA1 are susceptible to collision attacks. It's reasonable to assume that SHA256, the next algorithm we're switching to, has such a weakness as well, but one that stays hidden only because finding such a collision is currently computationally infeasible.
The thing is, why don't we use multiple algorithms to verify file integrity? That is, calculate checksums for the same file with several different algorithms and only declare the file acceptable if all of them match. Finding a collision for MD5 is doable on a smartphone these days, and finding one for SHA1 has been proven feasible with the SHAttered attack. However, if you had to find a simultaneous collision for both MD5 AND SHA1, wouldn't that increase the time and effort needed as well?
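For concreteness, here's a minimal sketch of what I mean in Python using the standard `hashlib` module. The file path and the expected digests are just placeholders, not values from any real download:

```python
import hashlib

def multi_checksum(path, algorithms=("md5", "sha1")):
    """Compute one digest per algorithm in a single pass over the file."""
    hashers = {name: hashlib.new(name) for name in algorithms}
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            for h in hashers.values():
                h.update(chunk)
    return {name: h.hexdigest() for name, h in hashers.items()}

def verify(path, expected):
    """Accept the file only if every recorded digest matches."""
    actual = multi_checksum(path, expected.keys())
    return all(actual[name] == digest for name, digest in expected.items())

# Hypothetical usage with made-up digest values:
# expected = {"md5": "9e107d9d372bb6826bd81d3542a419d6",
#             "sha1": "2fd4e1c67a2d28fced849ee1bb76e7391b93eb12"}
# print(verify("download.iso", expected))
```

An attacker would then need a single pair of files that collides under every algorithm in the list at once, which is the property I'm asking about.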
Clarification: While this particular scheme may actually be in use in some places, what I'm really asking is why this technique isn't commonly proposed as an alternative to upgrading to SHA256.