Disc reader project -- quick status update
cclist at sydex.com
Tue Dec 8 23:45:00 CST 2009
On 8 Dec 2009 at 18:13, Eric Smith wrote:
> The data separator has to deal with a lot more than just the variation
> in drive speed. That variation has long-term and short-term
> components, and it also has to deal with the bit shifting that isn't
> completely avoided by write precompensation.
Indeed, as I pointed out, ISV can be much worse than CSV.
Even the lowly WD9216 manages to implement long- and short-term
timing compensation in an 8-pin late-1970s DIP.
Bit-crowding effects can vary between media vendors, all else
remaining the same.
The Catweasel utilities tend to use very simple algorithms for their
MFM and FM separation (I don't know what their algorithms are for
other formats). I would expect that one could do considerably
better, particularly when it comes to error recovery. For example, a
traditional data separator, when it hits a "glitch" in the middle of
data, is more likely than not to interchange clock and data in the
stream. A software algorithm can work through the "glitch" and
reduce data loss to a much lower level. Similarly, corrupted IDAMs
usually result in a "Sector not found" type of error on traditional
gear, whereas the placement of the IDAM and its data might be
inferred by a clever software algorithm.
In other words, instead of being "as good as" a regular floppy
controller, one could be considerably better.
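[Editor's note: to make the software-separation idea concrete, here is a minimal sketch of one possible approach -- my own illustration, not the Catweasel tools' actual code. It classifies flux-transition intervals into MFM's legal 2/3/4 half-bit-cell spacings using a slowly adapting cell-time estimate, so an out-of-range interval is clamped rather than throwing the whole stream out of sync. The interval units, the DD-MFM timing figures, and the adaptation constant are all assumptions for illustration.]

```python
# Hypothetical software MFM interval classifier with a slowly adapting
# cell clock. At DD MFM's 250 kbit/s, a half bit cell is nominally 2 us
# and legal transition spacings are 4, 6, or 8 us (2, 3, or 4 half cells).
def classify_intervals(intervals_us, half_cell_us=2.0, adapt=0.05):
    """Map flux-transition intervals (microseconds) to 2/3/4-cell buckets.

    adapt is the fraction of the observed per-cell error fed back into
    the estimate, giving the long-term speed tracking that a hardware
    PLL provides, while a single bad interval only nudges it slightly.
    """
    cell = half_cell_us
    out = []
    for t in intervals_us:
        # Round to the nearest whole number of half cells; MFM only
        # allows 2-4, so clamp instead of losing synchronization.
        n = min(max(round(t / cell), 2), 4)
        out.append(n)
        # Adapt the estimate toward the implied per-cell time, so slow
        # drive-speed drift is tracked without misreading short intervals.
        cell += adapt * (t / n - cell)
    return out
```

Run against nominal intervals, or the same intervals from a drive
running 5% fast, the classifier lands in the same buckets -- which is
exactly the long-term compensation a hardware separator needs a PLL for.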
Given that the goal of a floppy reader is recovery of recorded
information, what is an adequate sampling rate? My own guess is
that, after real-world factors have been taken into account, 8x is
probably more than safe. But I would be interested in hearing
arguments to the contrary.
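[Editor's note: a back-of-envelope check of the 8x figure, using assumed HD-MFM numbers (500 kbit/s data rate, legal transition spacings of 2, 3, and 4 us) rather than anything stated in this thread.]

```python
def sample_resolution_ns(bit_rate_hz, oversample):
    """Timing resolution of a counter clocked at oversample x the bit rate."""
    return 1e9 / (bit_rate_hz * oversample)

# Assumed HD-MFM figures: at 500 kbit/s the legal flux-transition
# spacings are 2, 3, and 4 us, so each decision threshold sits 500 ns
# from the nearest nominal spacing.
res = sample_resolution_ns(500_000, 8)   # 250.0 ns per sample at 8x
margin_samples = 500 / res               # 2.0 samples of margin per threshold
```

At 8x the counter resolves 250 ns, leaving two full sample periods
between a nominal interval and the nearest decision threshold -- which
is consistent with 8x being "more than safe" so long as peak shift
stays well inside that margin.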