
I'm working on a web application that exists at two sites. Each system is hosted on its own LAMP stack, with access restricted to specific users on the network.

When content is updated via user input on one side, a file is transferred from that system to the other to update it, so both systems match.

These files are transferred by a 'guaranteed delivery' system. However, sometimes the files get bunched up and arrive in the wrong order, causing problems at the receiving end.

This is particularly noticeable with rapid user input, as the delay from the transmission system causes the files to 'bump into each other'.

My question is: How can I ensure that the files are received by the other system in the same order they were generated? Are there any standard ways of doing this that I should be aware of?

blarg
  • As suggested in the answer by SvW, you should try a different method to solve the issue, such as database replication, or, if you prefer not to use the database, a FIFO queue with an API to update the other site. If you are constrained to doing it with files, you'll need to go into more depth on how the generation process works; you'll still need to create a queue somewhere, whether it be numbered symlinks or an application. – Regan Nov 13 '13 at 12:51
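The sequence-number idea mentioned in the comment above can be sketched roughly as follows. This assumes the sender names each file with a monotonically increasing sequence number (the `update_NNNNNN.dat` naming scheme and file layout here are hypothetical, not anything from the question):

```python
import os

def process_in_order(incoming_dir, state_file):
    """Apply transferred files strictly in sequence order.

    Files that arrive out of order are simply left on disk until
    the missing earlier file turns up, so gaps never get skipped.
    """
    # Load the next expected sequence number (0 on first run).
    next_seq = 0
    if os.path.exists(state_file):
        with open(state_file) as f:
            next_seq = int(f.read().strip())

    # Index arrived files by the sequence number in their name,
    # e.g. 'update_000042.dat' -> 42 (hypothetical naming scheme).
    arrived = {}
    for name in os.listdir(incoming_dir):
        if name.startswith("update_") and name.endswith(".dat"):
            seq = int(name[len("update_"):-len(".dat")])
            arrived[seq] = os.path.join(incoming_dir, name)

    processed = []
    # Apply files only while the next expected one is present;
    # later arrivals wait for the gap to fill.
    while next_seq in arrived:
        processed.append(arrived[next_seq])  # apply the update here
        os.remove(arrived[next_seq])
        next_seq += 1

    # Persist the high-water mark for the next invocation.
    with open(state_file, "w") as f:
        f.write(str(next_seq))
    return processed
```

Run on a schedule (or on an inotify event), this holds back file 3 until file 2 has arrived, which is the ordering guarantee the question asks for, at the cost of managing the counter state on both ends.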

1 Answer


That is entirely up to the system that handles that generation and transfer, which we don't know anything about.

However, databases have already solved this problem in the form of replication. Why not use that instead of building your own, evidently unreliable, solution?
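For context, a minimal sketch of what enabling MySQL replication on the primary looks like. The directive names are standard MySQL options, but the server ID, log path, and database name below are placeholder assumptions:

```ini
# /etc/mysql/my.cnf on the primary (values are hypothetical)
[mysqld]
server-id     = 1
log_bin       = /var/log/mysql/mysql-bin.log
binlog_do_db  = myapp_db
```

The replica gets its own `server-id` and is pointed at the primary with a `CHANGE MASTER TO` statement; from then on, updates are shipped via the binary log in commit order, so the ordering problem disappears entirely.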

Sven
  • Would database replication be suitable for such small incremental updates? – blarg Nov 13 '13 at 12:29
  • Certainly much better than transferring files to communicate state changes, which is about the least efficient method I can think of. – Sven Nov 13 '13 at 12:40