Redundancy algorithm for reading noisy bitstream

Posted by Tedd Hansen on Stack Overflow on 2011-01-05

Filed under: c# | .NET

I'm reading a lossy bit stream and I need a way to recover as much usable data as possible. There can be 1s in place of 0s and 0s in place of 1s, but the read accuracy is probably over 80%.

A bonus would be if the algorithm could compensate for missing or extra bits as well.

The source I'm reading from is an analogue signal with noise (microphone input analysed via FFT), and the read timing can vary depending on computer speed.

I remember reading about error-correction algorithms used on CD-ROMs that do this in (I think) three layers, so I'm guessing using several layers is a good option. I don't remember the details though, so if anyone can share some ideas that would be great! :)
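
To make the kind of thing I mean concrete: the simplest redundancy layer I can think of is repeating each bit a few times and letting the receiver majority-vote. The sketch below is only an illustration (RepetitionCode is a made-up name, and this is nothing like the actual CD scheme), but it is the sort of building block I imagine stacking:

    using System.Collections.Generic;

    // Illustrative sketch only: triple each bit on the way out, majority-vote on the way in.
    static class RepetitionCode
    {
        public static List<bool> Encode(IEnumerable<bool> bits)
        {
            var encoded = new List<bool>();
            foreach (var bit in bits)
            {
                // Send three copies of every data bit.
                encoded.Add(bit);
                encoded.Add(bit);
                encoded.Add(bit);
            }
            return encoded;
        }

        public static List<bool> Decode(IList<bool> received)
        {
            var decoded = new List<bool>();
            // Each group of three received bits votes; a single flipped bit is outvoted.
            for (int i = 0; i + 2 < received.Count; i += 3)
            {
                int ones = (received[i] ? 1 : 0)
                         + (received[i + 1] ? 1 : 0)
                         + (received[i + 2] ? 1 : 0);
                decoded.Add(ones >= 2);
            }
            return decoded;
        }
    }

A layer like this only survives isolated bit flips, though; it does nothing for dropped or inserted bits, which is part of why I suspect several layers plus some kind of synchronisation are needed.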

Edit: Added sample data

Best case data:
 in: 0000010101000010110100101101100111000000100100101101100111000000100100001100000010000101110101001101100111000101110000001001111011001100110000001001100111011110110101011100111011000100110000001000010111
out: 0010101000010110100101101100111000000100100101101100111000000100100001100000010000101110101001101100111000101110000001001111011001100110000001001100111011110110101011100111011000100110000001000010111011

Bad case (timing is off, samples are missing):
 in: 00111101001011111110010010111111011110000010010000111000011101001101111110000110111011110111111111101
out: 00101010000101101001011011001110000001001001011011001110000001001000011000000100001011101010011011001

Edit2: I am able to control the data being sent. Currently attempting to implement simple XOR checking (though it won't be enough).
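
For illustration, this is roughly the kind of XOR check I mean: the sender appends one parity bit per fixed-size block, and the receiver recomputes it (class and method names here are made up for the example, not my actual code):

    using System.Collections.Generic;
    using System.Linq;

    // Illustrative sketch only: one parity bit per fixed-size block.
    static class XorParity
    {
        // XOR of all data bits; true means an odd number of 1s.
        public static bool ParityOf(IEnumerable<bool> dataBits)
        {
            return dataBits.Aggregate(false, (parity, bit) => parity ^ bit);
        }

        // The block's last bit is the parity the sender appended; returns false
        // when an odd number of bits in the block were flipped in transit.
        public static bool CheckBlock(bool[] blockWithParity)
        {
            var data = blockWithParity.Take(blockWithParity.Length - 1);
            return ParityOf(data) == blockWithParity[blockWithParity.Length - 1];
        }
    }

That only detects an odd number of flips within a block and can't correct or even locate them, hence "it won't be enough".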

