Re: [math-fun] Bell 202 1200 baud modem emulation
Ham radio used (and continues to use) the Bell 202 standard for 1200 baud packet radio over VHF FM channels. While the use of connected mode via AX.25 or TCP/IP has fallen over the years (particularly at 1200 baud), the Automatic Packet Reporting System (APRS) continues to use 1200 baud packet in many locations to relay GPS location and weather data throughout an interconnected network. Bell 202 signalling is still used for Caller ID over POTS lines as well. https://en.wikipedia.org/wiki/Bell_202_modem On Tue, Jan 16, 2018 at 3:58 PM, Keith F. Lynch <kfl@keithlynch.net> wrote:
Interesting, but not very physical. The frequencies present in a waveform are unchanged if you replace the waveform with its derivative. As such, it makes no difference whether you have a whole number of cycles before changing frequencies. The derivative of a sine wave that changes frequency when it crosses zero will change frequency at a different amplitude.
A sudden change of frequency will splatter all over the spectrum, even if the change happens at an instantaneous amplitude of zero. For a two-frequency modem on a standard analog telephone line, that doesn't matter, since the telephone circuit blocks extreme frequencies, and since the receiving modem is listening only for those two frequencies.
Frequency shift keying on radio is generally done by simply adding a radio frequency to the two tones. So instead of 1200 and 2200 Hz, it might be at 3501200 and 3502200 Hz. Obviously that will change whether the (unchanged) duration of one bit is a whole number of waves for one or both frequencies. Fortunately, that doesn't matter.
One way to avoid splattering all over the radio spectrum, which would of course be a bad thing, is to change the frequency gradually rather than abruptly. Slide smoothly from 1200 to 2200 or vice versa. Another way is to gradually decrease the amplitude of the 1200 tone while gradually increasing the amplitude of the 2200 tone, or vice versa. Yet another way is to just run the signal through a narrow band-pass analog filter. None of these depends on whether there is a whole number of waves, or on when the waveform crosses zero, or on what the radio frequency is.
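[A minimal sketch of the first approach, not part of the original message — the sample rate, tone frequencies, and ramp length are illustrative choices. The waveform comes from one running phase accumulator, and the instantaneous frequency glides linearly between the two tones instead of jumping:]

```python
import math

def glide_fsk(bits, f0=1200.0, f1=2200.0, rate=48000, baud=1200, ramp=8):
    """Phase-continuous FSK whose instantaneous frequency ramps
    linearly from one tone to the other over `ramp` samples instead
    of jumping, which keeps the spectrum narrow."""
    spb = rate // baud                    # samples per bit (40 here)
    # Target instantaneous frequency, sample by sample.
    target = []
    for b in bits:
        target += [f1 if b else f0] * spb
    out, phase, f = [], 0.0, target[0]
    step = (f1 - f0) / ramp               # max frequency change per sample
    for ft in target:
        # Slew the current frequency toward the target, limited per sample.
        if f < ft:
            f = min(f + step, ft)
        elif f > ft:
            f = max(f - step, ft)
        phase += 2.0 * math.pi * f / rate  # one accumulator: no phase jumps
        out.append(math.sin(phase))
    return out

wave = glide_fsk([1, 0, 1])
```

[Because the phase accumulator never resets, adjacent samples can never differ by more than the largest per-sample phase step, so the waveform has no discontinuities to splatter.]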
Similarly with the even older and simpler method of on-off keying (e.g. Morse code). The amplitude should gradually increase and decrease, since suddenly starting or stopping a radio signal will produce "key click" noise all over the spectrum.
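[A sketch of click-free on-off keying, not from the original message — the 600 Hz tone, 8 kHz sample rate, and 5 ms ramp are arbitrary illustrative values. Each keyed element gets a raised-cosine amplitude envelope so the carrier never starts or stops abruptly:]

```python
import math

def keyed_element(n_hold, n_ramp, f=600.0, rate=8000):
    """One CW keying element: amplitude ramps up over n_ramp samples
    (raised cosine), holds at full amplitude for n_hold samples, then
    ramps back down. The smooth envelope suppresses key clicks."""
    env = []
    for i in range(n_ramp):                  # ramp up: 0 -> 1
        env.append(0.5 * (1.0 - math.cos(math.pi * i / n_ramp)))
    env += [1.0] * n_hold                    # key fully down
    for i in range(n_ramp):                  # ramp down: 1 -> 0
        env.append(0.5 * (1.0 + math.cos(math.pi * i / n_ramp)))
    return [a * math.sin(2.0 * math.pi * f * i / rate)
            for i, a in enumerate(env)]

dit = keyed_element(n_hold=320, n_ramp=40)   # 40 samples = 5 ms at 8 kHz
```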
Of course by "gradually" I mean typically on a millisecond time scale. Though the time scale can be anything. The shorter it is, the more bandwidth the signal uses, and the more bits can be sent per second. The link between channel capacity (bits per second) and bandwidth (difference between the highest and lowest frequency used) is so strong that most people say "bandwidth" when they mean "channel capacity."
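[The quantitative form of that link, not stated in the message but making it precise, is the Shannon-Hartley theorem: a channel of bandwidth B hertz with signal-to-noise power ratio S/N has capacity]

```latex
C \;=\; B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second.}
```

[For a roughly 3000 Hz telephone channel at about 30 dB SNR, that is about 3000 * log2(1001), or roughly 30,000 bits per second — close to where the last analog telephone modems topped out.]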
The link between channel capacity and bandwidth is due to the Heisenberg uncertainty principle. The more precisely you know the frequency of a signal, the less precisely you can know its time of arrival, and vice versa.
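[In Fourier terms, not given in the message: for any waveform, the RMS duration sigma_t and RMS bandwidth sigma_f (in hertz) satisfy the Gabor limit]

```latex
\sigma_t \, \sigma_f \;\ge\; \frac{1}{4\pi},
```

[with equality only for a Gaussian envelope. So a tone observed for only Delta-t seconds has its frequency blurred by on the order of 1/Delta-t hertz.]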
What frequencies are present in a signal depends as strongly on the receiver as on the transmitter. Alternate once per second between 1000000-1 Hz and 1000000+1 Hz. Is there any energy at exactly 1000000 Hz? If you have a narrow band receiver, yes, nearly all the signal power is there. If you have a wide band receiver, no, none is.
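[A scaled-down numeric sketch of that receiver dependence, not part of the original message — 99 and 101 Hz at a 1 kHz sample rate stand in for 1 MHz plus or minus 1 Hz so the sample counts stay small. Two correlators ask how much power sits at exactly 100 Hz: one sees 0.25 s of signal (about 4 Hz resolution, spanning both tones) and one sees 8 s (about 0.125 Hz resolution):]

```python
import math

def power_at(samples, f, rate):
    """Power a correlator (resolution ~ rate/len(samples) Hz) reports at f."""
    re = sum(s * math.cos(2.0 * math.pi * f * i / rate)
             for i, s in enumerate(samples))
    im = sum(s * math.sin(2.0 * math.pi * f * i / rate)
             for i, s in enumerate(samples))
    return (re * re + im * im) / len(samples) ** 2

rate = 1000
# Phase-continuous signal alternating once per second between 99 and 101 Hz.
samples, phase = [], 0.0
for second in range(8):
    f = 99.0 if second % 2 == 0 else 101.0
    for _ in range(rate):
        phase += 2.0 * math.pi * f / rate
        samples.append(math.sin(phase))

coarse = power_at(samples[:rate // 4], 100.0, rate)  # 0.25 s: ~4 Hz resolution
fine = power_at(samples, 100.0, rate)                # 8 s: ~0.125 Hz resolution
```

[The coarse measurement cannot separate the two tones and lumps substantial power into the 100 Hz answer; the fine one resolves them at 99 and 101 Hz and finds essentially nothing exactly at the center. Same signal, different receiver, different answer.]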
You can see this effect for yourself at sdr.hu, a website that lets you see and hear, in real time, radio signals in the HF band and below as picked up at various places all over the world. You can select the receiver location, frequency, bandwidth, and mode. It's one of the truly wonderful websites, right up there with Wikipedia, Google, IMDB, and OEIS.
(Similarly with the signal's direction versus where on your antenna it landed. If you know very precisely where it landed, you know its position on the axis at right angles to its direction of propagation to a high precision, which means you know its momentum on that axis, i.e. the direction it came from, to a very low precision. This is why directional antennas have to be large.)
A few nitpicks:
The first digital modem in widespread use was the Bell 103 (and compatibles), dating to 1962, and still in widespread use through the 1980s. It worked only up to 300 bits per second. It wasn't as fast as the later Bell 202, but unlike the 202 it was full duplex, meaning that signals could be sent in both directions at once. In one direction it switched between 1070 and 1270 Hz; in the other, between 2025 and 2225 Hz. It was always more common than the 202, though perhaps not as common as the later Bell 212, which was a full duplex 1200 bits per second modem.
I don't think ham radio operators ever used Bell 202, at least in the US. For one thing, ASCII wasn't allowed on the air until 1980, by which time Bell 202 was long obsolete. Hams have been using the older five-bit ITA2 code (often wrongly called Baudot) since 1946. Hams typically use 45 bits per second and a 170 Hz frequency difference between mark (1 bit) and space (0 bit) when sending ITA2 ("RTTY"). I think deaf people also still use ITA2 over the telephone.
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
participants (2)
- Keith F. Lynch
- Mark VandeWettering