I'm dreaming of a service where a client locally generates a hash of a file and sends that hash to my service. The idea is that possessing such a hash is enough to show that the user actually holds the file.
My service would rely on nobody being able to create hash collisions, preferably well into the future. In other words, nobody should be able to create a slightly changed file (e.g. change a word in a PDF or alter an image) that generates the same hash as another file. That's basic hash security that any non-broken hash function provides; what I'm worried about is longevity. Would a hash generated today still be safe in 10 years?
So, my question is: would it not be a good idea to design the API so that the user has to generate and send multiple hashes computed with different hash functions? Wouldn't it be a lot more difficult (even in 10 years' time) to construct files that collide under all of those hash functions simultaneously? My gut feeling is that my thinking is wrong, but I cannot figure out why.
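For concreteness, here is a sketch of the kind of multi-hash fingerprint I have in mind, using Python's `hashlib`. The specific choice of functions (SHA-256, SHA3-256, BLAKE2b) is just an example; the point is that they come from different design families:

```python
import hashlib

def multi_hash(data: bytes) -> dict:
    """Compute several independent hashes of the same data.

    The client would send all of these digests together, and the
    server would store and later compare the full set.
    """
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "sha3_256": hashlib.sha3_256(data).hexdigest(),
        "blake2b": hashlib.blake2b(data).hexdigest(),
    }

fingerprint = multi_hash(b"example file contents")
```

The server-side check would then accept a claim only if every digest in the submitted set matches the stored one, so a future attacker would need a simultaneous collision in all of the functions.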