WCarp

Verifying Digital Clock Accuracy


I'm not sure if this is the correct place to post this. On a piece of equipment I purchased new, the clock ran slow. The second clock that was installed also ran slow: it lost approximately 4.1 minutes in one month. Below is what I sent the Service Manager, who I understand thinks it is possible to test the newly installed clock immediately after installing it with a cell phone (I understand the NH Rep thinks it's possible as well). I did not measure the time initially with the precision indicated below; I just included all the digits displayed by the Windows calculator, and if you round up, the clock loses approximately 27.8 ms per hour. Even if a person had a super precise stopwatch, there would be a time lag between seeing the digits change on the clock being timed and pushing the stopwatch button. So I suggest timing a newly installed clock for a period of approximately two weeks to get an accurate assessment of the clock's accuracy.

"You mentioned something about possibly checking the accuracy of the clock with a stopwatch while up here. As I mentioned, it’s not possible to check the accuracy of the clock right away after installing the instrument panel, since the current clock loses a small but cumulative amount of time, and it is impossible to accurately time such a small interval with a stopwatch. In my email dated February 7, 2018, I wrote, “Difference between clock and watch in three weeks was 14 seconds, or 14 / 21 = 0.6666666666666667 seconds lost per day.” So, in one hour, it loses 0.6666666666666667 / 24 = 0.0277777777777778 seconds. That is 27.7 milliseconds in an hour. I don’t know how you could time it that accurately: just pushing the stopwatch button has some variability, you would have to wait at least one hour, and if you were off by just a few thousandths of a second, that would greatly affect the accuracy of your results."

 

What are your comments?

 

Thanks


Write a computer script. It should read an atomic Internet time reference (google "atomic internet clock time", e.g. https://time.is/ or "International Atomic Time") and, at the same moment, the local computer clock, and store both readings somewhere. Then wait quite a long period of time, e.g. 24 h, and repeat the readings. If the clocks have drifted apart, you will be able to calculate the loss per day, loss per hour, or loss per second.
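A minimal sketch of that two-reading procedure, assuming the readings are stored as plain seconds (Unix-style timestamps); the function name is mine, and the example readings are invented to match the 14-seconds-in-21-days figure from the thread:

```python
def drift_rates(ref_start, local_start, ref_end, local_end):
    """Compare two paired readings (trusted reference, clock under test)
    taken at the start and end of an interval, all in seconds."""
    elapsed_ref = ref_end - ref_start        # true elapsed time
    elapsed_local = local_end - local_start  # what the tested clock counted
    loss = elapsed_ref - elapsed_local       # positive -> tested clock runs slow
    per_second = loss / elapsed_ref
    return {
        "per_second": per_second,
        "per_hour": per_second * 3600,
        "per_day": per_second * 86400,
    }

# Hypothetical readings matching the thread's numbers: over 21 reference
# days the clock under test fell 14 seconds behind.
rates = drift_rates(0.0, 0.0, 21 * 86400, 21 * 86400 - 14)
```

With those readings, `rates["per_day"]` comes out to the thread's 14 / 21 ≈ 0.667 seconds lost per day.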

 

Computers/smartphones/tablets can be manually or automatically synchronized to "time servers". Google "how to synchronize time server android" (or replace Android with another OS name).

 



We don't want to be involved in any scripts. I was looking for a comment related to how long a person needs to wait to accurately see how the new clock compares with a time source such as a wristwatch or cell phone.

Thanks

 

6 hours ago, WCarp said:

We don't want to be involved in any scripts. I was looking for a comment related to how long a person needs to wait to accurately see how the new clock compares with a time source such as a wristwatch or cell phone.

Thanks

 

As you note in your OP, the precision of your measurement is related to how long you measure. If you can read the clocks to the nearest second, then after a day you can determine the rate to almost a part in 10^5 (there are 86,400 seconds in a day). In ~11.6 days (10^6 seconds) you can do a part in a million.
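That estimate is just one division: the wait time is the readout resolution divided by the fractional precision you want. A sketch (the function name is mine):

```python
def wait_for_precision(readout_resolution_s, fractional_precision):
    # You must wait long enough that a rate error of `fractional_precision`
    # accumulates to a difference you can actually read out.
    return readout_resolution_s / fractional_precision

# With a 1-second readout: a part in 10^5 needs ~1.16 days,
# a part in a million needs ~11.6 days.
days_1e5 = wait_for_precision(1.0, 1e-5) / 86400
days_1e6 = wait_for_precision(1.0, 1e-6) / 86400
```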

 

This may not be the clock's fault. A lot of plug-in clocks use the mains frequency to keep time, since time is the integral of frequency. So whatever that is (50 Hz or 60 Hz), if the local power grid is slightly off, your clock will accumulate errors. In the US they recently passed a rule freeing utilities from keeping tight control on the frequency of their electricity, and timing issues were pointed out as a consequence of this decision.

A colleague of mine participated in a study of this

Quote

The changes could be just matters of seconds and all but unnoticeable, but the time could drift by as much as seven and a half minutes between time changes in March and November, when people reset their clocks, according to a study conducted by researchers at the National Institute of Standards and Technology and the U.S. Naval Observatory.

...

“They’ll think something is wrong with their clock but they won’t know what,” said Matsakis, co-author of the study.

https://apnews.com/05e80db0279441eb8051418f1e310e35

If you had the right instruments you could measure the mains frequency and confirm this.
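The cycle-counting behavior described above can be sketched with a toy calculation; the 59.99 Hz figure below is an invented example, not a measured grid frequency:

```python
def mains_clock_error_s(nominal_hz, actual_hz, elapsed_s):
    # A cycle-counting clock calls every `nominal_hz` cycles one second,
    # so its displayed elapsed time scales with the actual grid frequency.
    displayed = elapsed_s * actual_hz / nominal_hz
    return displayed - elapsed_s  # negative -> the clock falls behind

# Hypothetical: a 60 Hz clock on a grid running 0.01 Hz low all day
# loses about 14.4 seconds in that day.
error = mains_clock_error_s(60.0, 59.99, 86400)
```

In practice grid operators historically steered the average frequency back toward nominal, which is exactly the tight control the rule change relaxed.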


"This may not be the clock's fault. A lot of plug-in clocks use the mains frequency to keep time, since time is the integral of frequency. So whatever that is (50 Hz or 60 Hz), if the local power grid is slightly off, your clock will accumulate errors. In the US they recently passed a rule freeing utilities from keeping tight control on the frequency of their electricity, and timing issues were pointed out as a consequence of this decision."

The clock is a digital clock on battery-powered equipment, not a plug-in clock, so the mains frequency has nothing to do with it and nothing to do with drift between the time changes in March and November.

Again, I am asking: how long does a person need to wait to accurately see, by visual comparison, how the new clock compares with a time source such as a wristwatch or cell phone?

18 minutes ago, WCarp said:

"This may not be the clock's fault. A lot of plug-in clocks use the mains frequency to keep time, since time is the integral of frequency. So whatever that is (50 Hz or 60 Hz), if the local power grid is slightly off, your clock will accumulate errors. In the US they recently passed a rule freeing utilities from keeping tight control on the frequency of their electricity, and timing issues were pointed out as a consequence of this decision."

The clock is a digital clock on battery-powered equipment, not a plug-in clock, so the mains frequency has nothing to do with it and nothing to do with drift between the time changes in March and November.

Again, I am asking: how long does a person need to wait to accurately see, by visual comparison, how the new clock compares with a time source such as a wristwatch or cell phone?

Until you see a deviation of one second; then you can say the device is "accurate to one second in x time".

For your reference: swansont builds atomic clocks as his day job. His clocks are accurate to 9 billionths of a second. :) 

22 minutes ago, WCarp said:

Again, I am asking,  how long a person needs to wait to accurately see how the new clock compares with a time source such as a wristwatch or cell phone by visual comparison?

It depends on the level of precision you want.

As I said above, a little over a day gets you a part in 10^5, if you can discern a 1 second difference. That's a millisecond difference for every 100 seconds elapsed. 

Your 27.7 ms per hour is 0.665 seconds a day. It will take at least a day and a half to see a one-second difference.
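The day-and-a-half figure follows directly from the thread's own numbers; a sketch (the function name is mine):

```python
def days_until_visible(drift_s_per_hour, visible_s=1.0):
    # Days for a steady drift to accumulate into a difference
    # you can actually see on a 1-second display.
    return visible_s / (drift_s_per_hour * 24)

# 27.7 ms/hour -> ~0.665 s/day -> about 1.5 days per visible second.
days = days_until_visible(0.0277)
```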

7 minutes ago, StringJunky said:

 For your reference: swansont builds atomic clocks as his day job. His clocks are accurate to 9 billionths of a second. :) 

1. You have to say over what period of time that applies. "My" clocks are precise to better than 10 ns per year.

2. Our particular devices strive for precision rather than accuracy. (i.e. they are stable, but we do not tell you the length of the second)

 

11 minutes ago, swansont said:

2. Our particular devices strive for precision rather than accuracy. (i.e. they are stable, but we do not tell you the length of the second)

 

I never thought of those terms as distinct.

6 minutes ago, StringJunky said:

I never thought of those terms as distinct.

Think of shooting arrows. Accuracy means you are hitting or close to a target, on average. Precise means your arrows are closely bunched. 

Shoot an arrow 1m to the left, 1m to the right, 1m high, and 1m low. On average, you are spot on. You are accurate. But missing by 1m every time means you are not precise.

Shooting arrows that are always within 1mm of each other is precise. But they might be hitting anywhere. You can only claim accuracy if you are near the bullseye.

Similarly, in experiments you could be getting the wrong answer because of some bias, but you have lots of zeroes after the decimal. That's being precise without being accurate.

 

24 minutes ago, swansont said:

1. You have to say over what period of time that applies. "My" clocks are precise to better than 10 ns per year.

When I said "9 billionths of a second", does that mean for every elapsed second of the device the error is one part in 9 billion?

4 minutes ago, swansont said:

Think of shooting arrows. Accuracy means you are hitting or close to a target, on average. Precise means your arrows are closely bunched. 

Shoot an arrow 1m to the left, 1m to the right, 1m high, and 1m low. On average, you are spot on. You are accurate. But missing by 1m every time means you are not precise.

Shooting arrows that are always within 1mm of each other is precise. But they might be hitting anywhere. You can only claim accuracy if you are near the bullseye.

Similarly, in experiments you could be getting the wrong answer because of some bias, but you have lots of zeroes after the decimal. That's being precise without being accurate.

 

Right. Thanks for the clarification. A clock can be perfectly stable (precise) but still show the wrong time (not accurate)?


2 hours ago, StringJunky said:

When I said "9 billionths of a second", does that mean for every elapsed second of the device the error is one part in 9 billion?

Typically you give the fractional frequency stability. So you might report that a clock is stable at 1 part in 10^15. From that you could estimate that e.g. after 10^6 seconds (just under 12 days), the clock would be off by no more than a part in 10^9, or 1 ns.  (In truth the calculation is slightly more complicated, but that gives a reasonable estimate)
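That back-of-envelope estimate (swansont notes the real calculation is more involved) amounts to multiplying the fractional stability by the elapsed time:

```python
def max_time_error_s(fractional_stability, elapsed_s):
    # Rough upper bound: the accumulated time error is at most the
    # fractional frequency stability times the elapsed time.
    return fractional_stability * elapsed_s

# Stability of 1 part in 10^15 over 10^6 s (just under 12 days) -> ~1 ns.
err = max_time_error_s(1e-15, 1e6)
```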

Quote

Right.Thanks for the clarification. A clock can be perfectly stable (precise) but still show the wrong time (not accurate)?

Yes. If it's ticking slightly fast or slow the time will be off, and get worse, but the frequency isn't changing very much, so whatever the reading is, it's precise. Running fast or slow can be corrected with calibration, if it's predictably doing so — i.e. if you know your clock is adding or losing time, you can compensate. If your clock is adding 1 ns every 10 days, you can add in a 100 ps "steer" every day (in the other direction) and keep it on time. You don't have to physically adjust the clock's frequency, either - the time can be added with an external device, or just kept track of, and the output manually computed. 
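The steering arithmetic in that example, as a one-line sketch (the function name is mine):

```python
def daily_steer_s(observed_drift_s, over_days):
    # Spread the observed drift over the period as equal daily
    # corrections of the opposite sign.
    return -observed_drift_s / over_days

# A clock adding 1 ns every 10 days needs a -100 ps steer each day.
steer = daily_steer_s(1e-9, 10)
```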

 

54 minutes ago, swansont said:

Typically you give the fractional frequency stability. So you might report that a clock is stable at 1 part in 10^15. From that you could estimate that e.g. after 10^6 seconds (just under 12 days), the clock would be off by no more than a part in 10^9, or 1 ns.  (In truth the calculation is slightly more complicated, but that gives a reasonable estimate)

Yes. If it's ticking slightly fast or slow the time will be off, and get worse, but the frequency isn't changing very much, so whatever the reading is, it's precise. Running fast or slow can be corrected with calibration, if it's predictably doing so — i.e. if you know your clock is adding or losing time, you can compensate. If your clock is adding 1 ns every 10 days, you can add in a 100 ps "steer" every day (in the other direction) and keep it on time. You don't have to physically adjust the clock's frequency, either - the time can be added with an external device, or just kept track of, and the output manually computed. 

 

Right. Thanks.


I don't have much knowledge on this, but let me see what happens when I advance the time on two digital clocks, one of which is accurate.

