[aklug] Re: [OT] Re: random bits vs random Hex

From: Arthur Corliss <acorliss@nevaeh-linux.org>
Date: Wed May 29 2013 - 13:50:22 AKDT

On Wed, 29 May 2013, Doug Davey wrote:

> Bryan is right, if you have saved the binary data poorly, in an odd way,
> there is potential for fluff that would easily be trimmed by a
> compression algorithm.

Uh, if we're saying let's intentionally inflate a binary stream when saving
it so that we can "compress" it back down to a stream of the original size --
in other words, *no* compression at all -- then maybe... if I can get my
fuzzy little head around the point of this exercise at all. None of this,
however, gives you any real compression.
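(A quick sketch of that point, my own illustration rather than anything from the thread: hex-encoding is a classic "saved poorly" format. It doubles the size of random binary data, and a general-purpose compressor like zlib can squeeze most of that inflation back out -- but it can never get below the size of the original binary, so there's no real compression, just undoing the damage.)

```python
import os
import zlib

raw = os.urandom(100_000)         # 100 KB of random binary data
inflated = raw.hex().encode()     # "saved poorly" as hex text: 200 KB
squeezed = zlib.compress(inflated, level=9)

# The hex stream shrinks a lot (only 16 symbols, ~4 bits of entropy per
# character), but it cannot shrink past the entropy of the original bytes.
print(len(raw), len(inflated), len(squeezed))
```

The compressed hex lands just above len(raw) -- the Huffman stage recovers the 4-bits-per-character redundancy, but the underlying 8 bits per original byte of entropy is a hard floor.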

> As for pattern recognition, a random stream will hopefully have no
> patterns, so further compression won't work. If it does then the stream
> wasn't random.

All streams, random or not, will have repeating patterns if you're examining
small enough chunks. The problem faced by all algorithms is whether the
occurrence is frequent enough to gain you actual compression *after* you add
in the space used by the translation table. And that's where the wheels
often fall off in rudimentary algorithms.
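(You can see both halves of that in a few lines -- again my own sketch, not anything from the thread. Short patterns repeat constantly in random data simply because there are so few possible short chunks, yet a real compressor still can't profit from them, because encoding the back-references and tables costs more than the repeats save.)

```python
import collections
import os
import zlib

data = os.urandom(100_000)

# There are only 65,536 possible 2-byte chunks, so 50,000 of them
# drawn from random data must collide: plenty of "repeating patterns".
pairs = collections.Counter(data[i:i + 2] for i in range(0, len(data) - 1, 2))
print(pairs.most_common(1))   # the most common pair occurs multiple times

# But exploiting those repeats costs more than it saves, so a real
# compressor only grows the stream by its framing overhead:
compressed = zlib.compress(data, level=9)
print(len(compressed) - len(data))   # small positive number
```

zlib falls back to storing the random bytes essentially verbatim, paying a few bytes of header and block overhead -- which is exactly the "wheels fall off" point: the bookkeeping outweighs the patterns.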

         --Arthur Corliss
           Live Free or Die
---------
To unsubscribe, send email to <aklug-request@aklug.org>
with 'unsubscribe' in the message body.
Received on Wed May 29 13:50:47 2013

This archive was generated by hypermail 2.1.8 : Wed May 29 2013 - 13:50:47 AKDT