Frequency modulation encoding

Source: Wikipedia, the free encyclopedia.

The Atari 810 was typical of FM-based floppy drives of the early 1980s home computer era.

Frequency modulation encoding, or simply FM, is a method of storing data that saw widespread use in early floppy disk drives and hard disk drives. When written, the data is modified using differential Manchester encoding so that a clock can be recovered on reading, addressing the timing effects known as "jitter" seen on disk media. It was introduced on IBM mainframe drives and was almost universal among early minicomputer and microcomputer floppies. In the case of floppies, FM encoding allowed about 80 kB of data to be stored on a 5¼-inch disk.

IBM began introducing the more efficient modified frequency modulation, or MFM, in 1970. They referred to this format as "double density", with the original FM retroactively becoming "single density". MFM was more difficult to implement, and it was not until the early 1980s that low-cost all-in-one MFM floppy drive controllers like the WD1770 emerged. This led to the rapid demise of FM encoding in favor of MFM by the mid-1980s.

Underlying storage mechanism

In most computer memory, data is stored directly as voltage levels. In DRAM, for instance, the presence of a voltage over a certain threshold represents a binary one, while any voltage below that value represents a zero. The letter "A" in ASCII is represented as 01000001 in binary, which might be stored in a typical late-1970s DRAM like the Mostek MK4116 as a series of 0 and 5 V charges on the individual capacitors making up the memory.[1]
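
As a small worked example of the binary representation described above, a few lines of Python (illustrative only, not part of the cited sources) show how the letter "A" maps to the bit pattern 01000001:

    # The ASCII code for "A" is 65, which is 01000001 in binary.
    bits = format(ord("A"), "08b")
    print(bits)   # -> 01000001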

In contrast, data on a magnetic disk is stored as a pattern of changes in the polarity of the magnetization of its surface, which is read back using the read/write head, a small electromagnet. When the polarity of the magnetic charge on the disk changes, a brief pulse of electricity is induced in the head, which is read as a one; any section where the polarity does not change produces a zero.[2] To encode the same letter A, assuming the previous data left the magnetization in the low (zero) state, a disk would use the pattern 01111110. The first zero-to-one transition causes a 1 to be output, the run of ones that follows produces no pulses and so reads back as zeros, and the final one-to-zero transition produces the closing 1.[3]
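
The relationship between data bits and the recorded magnetization can be sketched in a few lines of Python. This is only an illustration of the principle described above; the function names and the bit-list representation are invented here, and real controllers work on flux transitions in hardware.

    def bits_to_levels(data_bits, start_level=0):
        """Write: a 1 is recorded as a change of polarity, a 0 as no change."""
        levels, level = [], start_level
        for bit in data_bits:
            if bit == 1:
                level ^= 1            # flip the magnetization polarity
            levels.append(level)      # a zero leaves the polarity unchanged
        return levels

    def levels_to_bits(levels, start_level=0):
        """Read: a transition between cells is a 1, no transition is a 0."""
        bits, prev = [], start_level
        for level in levels:
            bits.append(1 if level != prev else 0)
            prev = level
        return bits

    data = [0, 1, 0, 0, 0, 0, 0, 1]               # the letter "A"
    levels = bits_to_levels(data)
    print(levels)                                  # [0, 1, 1, 1, 1, 1, 1, 0]
    print(levels_to_bits(levels) == data)          # True: the data is recovered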

In addition to the data being stored in patterns that require on-the-fly conversion to and from their internal format, the disk faces further problems associated with being an analog system: noise, mechanical effects and other issues. In particular, disks suffer from an effect known as jitter, small changes in timing as the media speeds up and slows down during rotation. One form of unavoidable jitter is due to the hysteresis of the magnetic media, which can lead to an effect known as bit shift that causes strings of magnetic transitions to be stretched out in time. These effects make it difficult to know which bit a particular transition belongs to.[3]

To address this problem, disks use some form of clock recovery based on additional signals written to the disk along with the data. When the data is read back, the clock signal is separated out, and the data bits can then be clearly identified and lined up into the appropriate slots in memory.[3]

Encoding

FM encoding uses a simple system to encode the original data in such a way that every bit of data is recorded with at least one transition, ensuring there are enough transitions during a given period for successful clock recovery. To do this, it operates with a basic data period twice as long as the shortest period the recording medium can reliably support. These periods are known as "clock windows", with up to one clock transition and one data transition per window. Since each bit of data occupies two of these minimum periods, FM encoding stores about half the amount that is theoretically possible on the media.[3]

FM uses an implementation of the differential Manchester encoding mentioned above. To write the letter A, 01000001, the floppy disk controller will translate it into the series 1011101010101011, inserting an additional signal in front of every bit to represent the clock. When this signal is then sent to the read/write head, the polarity is flipped every time there is a pulse. In this example, if the head was in the low state at the end of writing the previous data, the leading 1 flips it to the high state, and the following zero leaves it there, giving a single transition in that window. The next bit first flips the state back to low, and then back to high, for two transitions in the window.[3]
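
A minimal Python sketch of this clock-insertion step, in the same spirit as the example above (the bit-list representation is illustrative; an actual controller does this in hardware as the bits are serialized):

    def fm_encode(data_bits):
        """FM: insert a clock pulse (1) in front of every data bit."""
        encoded = []
        for bit in data_bits:
            encoded.append(1)       # clock position at the start of the window
            encoded.append(bit)     # data position in the second half
        return encoded

    data = [0, 1, 0, 0, 0, 0, 0, 1]                  # the letter "A"
    print("".join(map(str, fm_encode(data))))         # -> 1011101010101011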

Encoding these transitions requires the system to accept digital data from the host computer and re-code it into the underlying FM format. On reading, the system has to separate out the clock signal again and leave only the data bits. Because the FM system is so simple, it could be implemented in single-chip form using late-1970s semiconductor fabrication techniques. This greatly lowered the cost of a complete drive controller, which consisted largely of a clock, a drive controller chip, a chip to communicate with the host computer, and some buffer memory. Especially popular was the Western Digital FD1771 and its variations.[4]

Data encoding vs format

The material above refers to bytes being written to disk, but this is a simplification. In most disks, the only unit of data is the sector, and the individual bytes within it have no meaning to the controller. When data is written, the controller is handed a full sector's worth of data and told to write it out as a series of bits in a single atomic operation. The controller cannot align those bits with byte boundaries based solely on the FM information. Thus it is not only the bits within the data that have to be aligned on reading, but the starting point of the sector's data as a whole.[3]

This is not accomplished with the encoding scheme, but with the disk format instead. When the controller writes a sector of data, it adds a header section containing information about the data that follows, as well as the address of the sector so it can be found in the future. During the write process, the controller also writes out a series of special "sync bytes" before the header and the data. In the IBM format, this consists of a series of thirteen zeros followed by three hexadecimal A1's in front of the header and data areas. These are not FM encoded, so the controller can easily identify them on the fly. The controller locks onto these signals to find the start of data, which immediately follows the last sync byte. After that, it reads out each group of eight bits into subsequent bytes in the buffer.[3]
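
The following Python sketch illustrates the general idea of locking onto a marker in a raw bit stream and then grouping the bits that follow into bytes. The marker value and helper functions here are purely illustrative; they are not the actual IBM sync bytes or the logic of any real controller.

    def find_marker(bitstream, marker):
        """Return the index just past the first occurrence of the marker."""
        for i in range(len(bitstream) - len(marker) + 1):
            if bitstream[i:i + len(marker)] == marker:
                return i + len(marker)
        return None

    def bits_to_bytes(bits):
        """Group the aligned bits, eight at a time, into byte values."""
        return [int("".join(map(str, bits[i:i + 8])), 2)
                for i in range(0, len(bits) - 7, 8)]

    # Hypothetical stream: a few stray bits, a marker, then two data bytes.
    marker = [1, 0, 1, 0, 0, 0, 0, 1]                   # illustrative pattern only
    stream = [1, 1, 0] + marker + [0, 1, 0, 0, 0, 0, 0, 1,   # 0x41 "A"
                                   0, 1, 0, 0, 0, 0, 1, 0]   # 0x42 "B"
    start = find_marker(stream, marker)
    print([hex(b) for b in bits_to_bytes(stream[start:])])   # ['0x41', '0x42']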

Replacement with MFM

As each bit of data requires two transition periods in the FM system, it makes use of only half the potential storage capacity of the disk. This led to a series of more advanced encodings that make better use of the available space. The most widely used replacement was modified frequency modulation, or MFM. This system records at most one pulse per window, and the resulting pattern still carries enough transitions to recover the underlying clock. The value of a bit is encoded by the location of the pulse within the window: 1s are recorded as a pulse in the center of the window, while 0s produce a pulse at the start of the window, and then only when the preceding bit was also a 0.[3]
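
For comparison with the FM example above, a sketch of the MFM rule in the same illustrative style (not any particular controller's implementation):

    def mfm_encode(data_bits, prev_bit=0):
        """MFM: a data pulse in the window centre for a 1; a clock pulse at
        the start of the window only when both neighbouring bits are 0."""
        encoded = []
        for bit in data_bits:
            clock = 1 if (prev_bit == 0 and bit == 0) else 0
            encoded.append(clock)    # clock position (start of window)
            encoded.append(bit)      # data position (centre of window)
            prev_bit = bit
        return encoded

    data = [0, 1, 0, 0, 0, 0, 0, 1]                  # the letter "A"
    print("".join(map(str, mfm_encode(data))))         # -> 1001001010101001

Note that in the MFM stream no two pulses are ever closer together than a full window, half the maximum pulse rate of the FM result 1011101010101011, which is what allows the bit cells to be packed twice as densely on the same media.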

MFM requires a more complex solution for recovering the clock signal. Generally this takes the form of a phase-locked loop that reconstructs the clock from the transitions read off the disk. Early MFM-capable controller chips were available, including the one used in the original IBM PC, but using them required the clock recovery to be performed by external hardware, the "data separator". IC manufacturing was advancing rapidly during this period, and by the mid-1980s all-in-one MFM controllers appeared and the market rapidly moved to the double-density format.[3]
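
As a rough illustration of what such a data separator does, the following Python sketch classifies the time between successive flux transitions as a whole number of elementary clock positions, while slowly adjusting its estimate of the position length to follow the drive's actual speed. It is a toy model of the phase-locked loop idea, with invented parameter values, not a description of any real controller.

    def classify_intervals(intervals_us, position_us=2.0, gain=0.05):
        """Toy data separator: decide how many clock positions each interval
        between flux transitions spans, nudging the position-length estimate
        toward what is actually observed (the job a PLL does in hardware)."""
        counts = []
        for dt in intervals_us:
            n = max(1, round(dt / position_us))            # positions spanned
            counts.append(n)
            position_us += gain * (dt / n - position_us)   # track drive speed
        return counts

    # Intervals with some jitter, nominally 2 us or 4 us apart.
    print(classify_intervals([2.1, 1.9, 4.2, 2.0, 3.9]))   # -> [1, 1, 2, 1, 2]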

References

Citations

  1. ^ St. Michael, Stephen (1 August 2019). "Introduction to DRAM". All About Circuits.
  2. ^ Lutz, Melloni & Wakeman 1995, p. 1.
  3. ^ a b c d e f g h i Lutz, Melloni & Wakeman 1995, p. 2.
  4. .

Bibliography