I was puzzled by the 27.000 MHz crystal in the Jakks game. It's almost, but not quite, 7.5 times the NTSC color carrier frequency of 3.579545 MHz, which is also the master clock frequency of the original Atari 2600. Eric Smith wrote to clear up the issue:

13.5 MHz (exact) is the most common sample rate used for standard definition digital video, based on the CCIR 601 standard. Most consumer digital video products contain one or more clocks that are integer multiples of this frequency. There are standard NTSC/PAL video encoder chips that operate on a 27 MHz clock for this reason. So it's not at all surprising that this product would contain a 27 MHz crystal, and it's completely implausible that it's a mistake of any sort.

The NTSC color carrier frequency is specified by the FCC to be 3.579545 MHz +/- 10 Hz. Although in practice a television will lock to a color carrier that is somewhat out of spec, if the crystal were really supposed to be 7.5x the color carrier and 27.0 MHz was used instead, it would be off by 0.57%, which is enough that many if not most televisions would fail to lock.
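
A quick arithmetic check (my own, not part of Eric's note) makes the point concrete: 27 MHz is an exact multiple of the 13.5 MHz sample rate, but it would miss the color carrier badly if it were meant as 7.5x that frequency.

```python
# Sanity-check the two interpretations of the 27 MHz crystal.
# (Illustrative check only; not part of Eric's note.)

FSC_NTSC = 3_579_545.0    # FCC-specified NTSC color carrier, Hz (+/- 10 Hz)
SAMPLE_RATE = 13_500_000  # CCIR 601 luma sample rate, Hz
XTAL = 27_000_000.0       # crystal in the Jakks game, Hz

print(XTAL / SAMPLE_RATE)            # exactly 2.0 -- an integer multiple

implied_fsc = XTAL / 7.5             # 3.6 MHz if 27 MHz were 7.5x the carrier
error_hz = implied_fsc - FSC_NTSC    # roughly 20,455 Hz too high
error_pct = 100 * error_hz / FSC_NTSC
print(f"{implied_fsc:.0f} Hz, off by {error_hz:.0f} Hz ({error_pct:.2f}%)")
```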

The best description of how NTSC and PAL television signals work, and why they work that way, is Charles Poynton's book "A Technical Introduction to Digital Video".

The NTSC color carrier was chosen to be 63/88 of 5.0 MHz (exact). The FCC value is rounded to the nearest Hz, which is fine since they spec the tolerance at +/- 10 Hz. Poynton's book gives some useful background on why that value was chosen. To make a long story short, it was desirable to use a subcarrier frequency with a known phase relationship to the horizontal rate, but alternating in phase on every scan line. Given the limits of bandwidth and of manufacturable technology of the day, this resulted in a multiplier of 455/2, or 227.5.

B&W television in the US used sync rates of 60 Hz (vertical) and 15750 Hz (horizontal), so it would have been logical for the color subcarrier to be 15750 Hz * 227.5, or 3.583125 MHz. However, that frequency might have caused minor problems with the sound carrier. The obvious fix was to move the sound carrier center frequency by 0.1%, which would have caused no problems with the televisions of the day. Instead, the decision was made to shift the entire video timing by 0.1%. This is why NTSC video has a field rate of 59.94 Hz rather than 60 Hz, requiring the use of drop-frame time code, and why the horizontal rate is approximately 15734.26 Hz rather than 15750 Hz.
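
Those relationships are easy to reproduce with exact fractions; here is a short sketch of my own (the 1000/1001 scale factor it prints is just the "0.1%" change expressed exactly):

```python
# Reproduce the NTSC timing relationships above with exact fractions.
# (My own illustrative sketch, not part of Eric's note.)
from fractions import Fraction

fsc = Fraction(63, 88) * 5_000_000   # color subcarrier: 63/88 of 5.0 MHz
h_rate = fsc / Fraction(455, 2)      # horizontal rate = subcarrier / 227.5
scale = h_rate / 15750               # shift relative to the old 15750 Hz rate
v_rate = 60 * scale                  # field rate shifts by the same factor

print(float(fsc))      # 3579545.4545... Hz
print(float(h_rate))   # 15734.2657... Hz
print(scale)           # 1000/1001, i.e. a shift of about 0.1%
print(float(v_rate))   # 59.9400... Hz
```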

I've seen claims that moving the video timing by 0.1% was the result of the FCC deciding that changing the audio carrier by 0.1% would have been incompatible with existing receiving equipment (which, in fact, it would not have been). I don't recall whether Poynton makes that claim. But it seems more credible to me that the decision was made for the express purpose of having a simpler ratio (63/88) between the color subcarrier frequency and a standard reference frequency (5.0 MHz). If they'd kept the original horizontal rate, the ratio of color subcarrier to a 5.0 MHz reference would have been 5733/8000.
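
Both ratios check out with exact arithmetic (again, my own quick verification, not Eric's):

```python
# Verify the two subcarrier-to-reference ratios mentioned above.
# (Illustrative check only.)
from fractions import Fraction

REF = 5_000_000  # 5.0 MHz reference, Hz

fsc_actual = Fraction(63, 88) * REF                    # chosen NTSC subcarrier
fsc_if_unchanged = Fraction(15750) * Fraction(455, 2)  # 3.583125 MHz

print(fsc_actual / REF)        # 63/88
print(fsc_if_unchanged / REF)  # 5733/8000
```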

The 13.5 MHz sample rate used by CCIR 601 works out to exactly 858 samples per scan line for NTSC, and exactly 864 samples per scan line for PAL. Systems based on 13.5 MHz sampling use a PLL to generate the color subcarrier, and may perform the color encoding function in either the digital or analog domain.
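
The samples-per-line figures also fall out exactly; in the sketch below (mine, not Eric's), the 15625 Hz PAL line rate is a standard figure I'm supplying, not something from the note above:

```python
# Check the CCIR 601 samples-per-scan-line figures.
# (Illustrative check; the 15625 Hz PAL line rate -- 625 lines x 25 frames/s --
# is a standard value supplied here, not taken from Eric's note.)
from fractions import Fraction

FS = 13_500_000  # CCIR 601 luma sample rate, Hz

h_ntsc = (Fraction(63, 88) * 5_000_000) / Fraction(455, 2)  # NTSC line rate
h_pal = Fraction(15625)                                     # PAL line rate

print(FS / h_ntsc)  # 858 (exact)
print(FS / h_pal)   # 864 (exact)
```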

Eric