Very interesting article. It is a real-world occurrence of the old joke about the clock store where they set all the clocks based on the local factory whistle, only to find out that the factory whistle operator was setting his watch by looking at the clocks in the store window when he walked by every morning.
And that story always reminds me of working with CSMP (Continuous System Modeling Program), a digital simulation of an analog computer, back in the day. To generate a sine wave, we would pretend we had a cosine wave and integrate it. Then we would take the resulting sine wave, integrate and invert it, and feed it back into the first integrator in place of our pretend cosine wave. It always seemed kind of like magic, producing a sine wave out of nothing, but apparently random noise in the system was enough to eventually get the pair of integrators oscillating.
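In modern terms, the trick looks something like this (a minimal numerical sketch in Python, not actual CSMP code):

```python
import math

# Two chained integrators solving y'' = -y, with the second output
# inverted and fed back as the "pretend" cosine. A tiny initial
# "noise" value is enough to start the oscillation.
dt = 0.001                 # integration step, seconds
omega = 2 * math.pi        # 1 Hz natural frequency
cos_wave = 1e-6            # the pretend cosine, seeded by tiny "noise"
sin_wave = 0.0

wave = []
for _ in range(5000):
    sin_wave += omega * cos_wave * dt   # integrator 1: sin = integral of cos
    cos_wave -= omega * sin_wave * dt   # integrator 2 + invert: cos = -integral of sin
    wave.append(sin_wave)
# wave now holds about five cycles of a 1 Hz sine at the seed amplitude.
```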
He is off by 1000 on his units. AM stations typically have a frequency between 540 and 1600 kHz; 1000 MHz would put him at a higher frequency than TV (1000 MHz = 1 GHz). Your comment was right on, saying you used a station in Chicago at 1 MHz, not 1000 MHz.
Back in the '60s I would surf the AM band from Pittsburg, Kansas, finding stations from all over the country. We could pick up WBBM, WLS, WWLS, WJR, KOMA, KOA, the Mexican station on 800 kHz, and many others; it has been a long time. I had a shortwave radio to pick up WWV and others as well. I still remember the call letters of the stations and their frequencies, to some extent, 50 years later.
Using a strong 1 MHz signal makes calibration incredibly easy. Connect an antenna to the vertical input of the scope, with a tuned circuit from the input to ground. Connect the frequency standard to be calibrated to the horizontal input. Adjust the standard until the Lissajous pattern is stable and you are done!
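In numbers, the idea works out like this (a quick Python sketch with made-up figures; the pattern drifts at the difference frequency):

```python
import math

# The relative phase of the two scope inputs, and hence the shape
# of the Lissajous figure, drifts at |f_dut - f_ref|. A frozen
# pattern means the two frequencies match.
f_ref = 1_000_000.0    # the broadcast 1 MHz reference
f_dut = 1_000_002.0    # the standard being calibrated, 2 Hz high

for t in (0.0, 0.1, 0.25):
    phase_drift = 2 * math.pi * (f_dut - f_ref) * t
    print(f"t = {t:4.2f} s: pattern shifted {math.degrees(phase_drift):5.1f} deg")
# The figure cycles once every 1/|df| = 0.5 s here; adjust the
# standard until the pattern freezes and the difference is zero.
```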
Back in the days of NTSC television, the color burst was generated from a rubidium oscillator for network programs. The exact frequency was calibrated against NIST standards and published. A frequency standard, traceable to NIST, was no further than the nearest television!
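For reference, the burst frequency itself is exact by definition; a tiny Python sketch using the defined 315/88 MHz value:

```python
from fractions import Fraction

# The NTSC color subcarrier is defined as exactly 315/88 MHz
# (about 3.579545 MHz); with the network feed locked to a rubidium
# standard, that burst frequency became a usable reference.
f_burst = Fraction(315, 88) * 1_000_000   # Hz, exact by definition
print(f_burst)          # 39375000/11 Hz
print(float(f_burst))   # 3579545.4545... Hz
```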
This is indeed an interesting article, and the part about the meter reading the deviation brought up an issue that I have had to deal with, which, to summarize, was "How do you tell the difference between zero and nothing?" Part of the answer explains one reason for the use of 4 to 20 mA analog signal loops: it is quite simple to detect the difference between "zero" = 4 mA and "nothing" = 0 mA. So an open data pair would be quite obvious.
I never thought about the concerns of remotely reading a frequency-deviation display, but a 4-20 mA loop would have solved at least part of the problem.
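Something like this toy Python sketch (hypothetical scaling, not from the article) shows the "live zero" idea:

```python
# On a 4-20 mA loop, a genuine zero reads 4 mA, so a reading near
# 0 mA can only mean a broken loop or a dead transmitter.
def decode_loop(current_ma, span_lo=0.0, span_hi=100.0):
    if current_ma < 3.5:   # well below the 4 mA live zero
        raise ValueError("loop fault: open pair or dead transmitter")
    # Map 4-20 mA linearly onto the engineering span.
    return span_lo + (current_ma - 4.0) / 16.0 * (span_hi - span_lo)

print(decode_loop(4.0))    # 0.0  -> a genuine zero reading
print(decode_loop(12.0))   # 50.0 -> mid-scale
# decode_loop(0.0) raises  -> "nothing": the data pair is open
```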
Similarly, telephone audio test sets default to 1004 Hz instead of 1.000 kHz, because 1.000 kHz is an exact submultiple of the 8.000 kHz audio sample rate: a 1 kHz tone hits the same few sample points on every cycle, so it doesn't properly exercise the codec.
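A quick Python sketch (my own illustration, not from any test-set spec) shows the difference:

```python
import math

# At an 8 kHz sample rate, a 1.000 kHz tone repeats the same few
# sample values on every cycle, while 1004 Hz walks through roughly
# a thousand distinct levels.
fs = 8000
for f in (1000, 1004):
    vals = {round(math.sin(2 * math.pi * f * n / fs), 9) for n in range(8000)}
    print(f"{f} Hz: {len(vals)} distinct sample values in one second")
# 1000 Hz produces only a handful of distinct values; 1004 Hz
# exercises far more of the codec's quantization levels.
```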
If what you say is correct (Army band always 2 ppm higher), maybe that was the fault of this particular radio station. Someone in the Army finally realized the reference they had been using for years was actually 2 ppm high, so rather than fix a "problem", they simply made 2 ppm high THE "standard".
I remember the first time I saw my favorite college professor, Dr. Howe. He was sitting on his desk, feet on his chair, watching a galvanometer with a stopwatch in his hand. He had just gotten a brand new frequency-synthesized signal generator, and he was checking it out by driving the galvanometer on one side with a synthesized 10 MHz and WWV on the other side. The stopwatch was to time how long it took for them to drift a cycle apart.
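The arithmetic behind the stopwatch trick, as a small Python sketch with assumed numbers:

```python
# If two nominal 10 MHz signals take T seconds to slip one full
# cycle, the frequency difference is 1/T.
f_nominal = 10e6   # Hz: the synthesizer vs. the WWV-derived signal
T_slip = 500.0     # seconds to drift one cycle apart (hypothetical)

delta_f = 1.0 / T_slip            # 0.002 Hz beat frequency
fractional = delta_f / f_nominal  # 2e-10 fractional error
print(f"offset: {delta_f} Hz ({fractional:.1e} fractional)")
```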
Nancy, you are correct that the article is very interesting and captivating, but as I understood it, when no signal was coming in, the person concerned simply set it 2 Hertz high. Secondly, I am not sure, but I believe different organizations have different bands assigned; similarly, the Army band is always 2 ppm high in frequency.
You're right, Nancy: "anyone else using it would be responsible for verifying their reference." My guess is that they knew the tolerance for the station was 10 ppm and figured that was more than accurate enough. Back then it was tough to have any lab instrument that accurate, and we did the same thing, picking up WCFL in Chicago at 1 MHz.
Very interesting article on early radio and the Gates BC-1T transmitter. I found the 2-Hertz-high adjustment an interesting and easy solution for verifying the frequency, especially since it was a very stable transmitter and was operating well within the FCC guardband. As for its use as a reference: anyone else using it would be responsible for verifying their reference source, and sometimes it is consistency that reflects proper calibration rather than the actual reading itself. Not my area of expertise, but I am guessing they were using it because of the stability of the transmitter, and the 2 ppm was either known or negligible.
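For concreteness, the 2 Hz / 2 ppm equivalence works out like this (a trivial Python sketch using the figures from the comments above):

```python
# A carrier set 2 Hz high at 1 MHz is a 2 ppm offset, well inside
# the 10 ppm tolerance mentioned for the station.
f_carrier = 1_000_000.0   # Hz
offset_hz = 2.0           # set 2 Hz high, per the article
ppm = offset_hz / f_carrier * 1e6
print(ppm)                # 2.0 ppm, vs. the 10 ppm station tolerance
```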