Just to let you know, this question is under Theoretical Computer Science - Distributed Systems - Synchronization & Clocks ...
> You have 5 clocks, denoted C1, C2, ..., C5, each running at a different but consistent (constant) rate ...
1. Calculate the clock-per-clock drift (difference in rate) for every pair, which gives you 10 drift rates (see the Python sketch after this list) ...
---- Dij = |rate of Ci - rate of Cj| for clocks Ci and Cj
2. Find the minimal Dij among the 10 drift rates.
3. Keep only the clocks Ci and Cj of that minimal Dij; throw away the other clocks.
:: Every hour you will still get two slightly disagreeing ticks unless the minimal Dij is exactly zero, but the error is as small as any pair of these clocks allows ...
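A minimal sketch of steps 1-3 in Python (the function name best_clock_pair and the list-of-floats input are my own illustration, not part of the question):

```python
from itertools import combinations

def best_clock_pair(rates):
    """Given per-clock rate errors (skew per second), return the indices
    of the pair with the smallest pairwise drift Dij, plus that drift."""
    # Step 1: clock-per-clock drift for every pair -> n*(n-1)/2 values
    drifts = {
        (i, j): abs(rates[i] - rates[j])
        for i, j in combinations(range(len(rates)), 2)
    }
    # Step 2: pick the pair with the minimal Dij
    (i, j), d_min = min(drifts.items(), key=lambda kv: kv[1])
    # Step 3: keep only clocks Ci and Cj (caller discards the rest)
    return i, j, d_min
```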
----------------------------------------------------------------------------------------------------
Example:
Assume each clock accumulates the following skew (difference in reading) per second, so the pairwise drift is Dij = |skew of Ci - skew of Cj|:
-- C1 = -1.05, C2 = +0.8, C3 = +0.2, C4 = -1.0, C5 = +0.35
Dij: D12 = 1.85, D13 = 1.25, D14 = 0.05, D15 = 1.40, D23 = 0.60,
-- D24 = 1.80, D25 = 0.45, D34 = 1.20, D35 = 0.15, D45 = 1.35
Choosing the minimum Dij: D14 = 0.05.
Solution: keep clocks C1 and C4; throw away the other clocks.
Result: the two remaining ticks drift apart at only 0.05 units per second, i.e. by at most 0.05 * 3600 = 180 units over each hour between syncs.
Constraint: the clocks must be resynchronized every hour, otherwise the relative drift accumulates without bound.
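Running the sketch above on the example skews reproduces the choice (list indices are 0-based, so C1 and C4 come out as indices 0 and 3):

```python
rates = [-1.05, +0.8, +0.2, -1.0, +0.35]   # skew per second for C1..C5
i, j, d_min = best_clock_pair(rates)
print(f"keep C{i+1} and C{j+1}, relative drift = {d_min:.2f} per second")
# With hourly resynchronization, the accumulated divergence is bounded by
# 0.05 * 3600 = 180 units per hour.
print(f"max divergence between hourly syncs: {d_min * 3600:.0f} units")
```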