## 1/23/10

### It Doesn't Fit

Meme of today: how much information fits into a digital circuit of square shape?

If I can combine two minimal terms to derive a minimal term, that gives a bottom-up strategy for solving problem instances. How hard is it to check that you are near minimal, given the number of operations performed?

It becomes interesting when you look at the encoding of SUBSET-SUM, which I simplify by assuming the numbers and the set both have the same dimension n. In the digital encoding you end up with a 'square' term, corresponding to the space-time complexity, with width and breadth cn, and hence depth 2dn. FACTOR looks similar. (Note that I am looking at propositional formulas encoding instances.)
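As a back-of-envelope sketch of where the quadratic 'square' term comes from, assuming the usual circuit-style encoding (one selector variable per number, a chain of ripple-carry adders); the gates-per-full-adder constant is purely illustrative, not exact:

```python
# Illustrative gate count for a propositional SUBSET-SUM encoding:
# n numbers of n bits each, summed by a chain of ripple-carry adders.
# gates_per_full_adder = 5 is an assumed illustrative constant.

def encoding_size(n, gates_per_full_adder=5):
    adders = n - 1            # chain summing the n selected numbers
    bits_per_adder = n        # ignoring the O(log n) carry growth
    selectors = n             # one selection variable per number
    return selectors + adders * bits_per_adder * gates_per_full_adder

for n in (8, 16, 32):
    print(n, encoding_size(n))  # grows roughly like 5 * n**2
```

The point is only the shape of the growth: the total is dominated by the adders * bits_per_adder product, i.e. it is quadratic in n.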

Now we look at what a variable means. Say we have two, a and b; that gives 2**(2**2) = 16 different functions. A variable can be taken to encode a mapping on the state space: for example, a encodes 0011 and b encodes 0101. Similarly, (aa), writing the single operator we have by juxtaposition alone, encodes 1100, and (ab) encodes 1110. My favorite, (aa)a (see the logo/favicon), encodes a tautology. Enumerating the terms eventually gives all functions.
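These encodings can be checked mechanically. A minimal sketch, assuming the single omitted operator is NAND (the Sheffer stroke), which is consistent with every truth table given above:

```python
# Truth tables over two variables packed into 4-bit ints.
# Assumption: the single binary operator is NAND.

def nand(x, y, width=4):
    """Bitwise NAND on truth tables packed into `width`-bit ints."""
    return ~(x & y) & ((1 << width) - 1)

A = 0b0011  # a encodes 0011
B = 0b0101  # b encodes 0101

assert nand(A, A) == 0b1100           # (aa) encodes 1100, i.e. NOT a
assert nand(A, B) == 0b1110           # (ab) encodes 1110
assert nand(nand(A, A), A) == 0b1111  # (aa)a is a tautology

# Enumerate terms bottom-up until all 2**(2**2) = 16 functions appear.
seen = {A, B}
while len(seen) < 16:
    seen |= {nand(x, y) for x in seen for y in seen}
print(sorted(seen))
```

Since NAND is functionally complete, the bottom-up closure does reach all 16 functions.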

Only a quadratic number of operations is performed, and the depth is bounded too, so the maximal information in the term, and hence the minimal term size, is, as far as I can see, the Shannon density? I guess it's standard complexity theory; it's been a while.
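The Shannon-style counting argument lurking here can be sketched as follows; the term count (binary tree shapes times leaf labelings) is my reconstruction of the standard argument, not something stated in the entry:

```python
# With one binary operator, a term with k leaves over n variables is a
# binary tree shape (Catalan number) times a leaf labeling (n**k),
# while there are 2**(2**n) Boolean functions on n variables. So most
# functions need large terms.

from math import comb

def catalan(k):
    return comb(2 * k, k) // (k + 1)

def terms_up_to(s, n):
    """Upper bound on the number of terms with at most s leaves."""
    return sum(catalan(k - 1) * n**k for k in range(1, s + 1))

n = 4
functions = 2 ** (2 ** n)  # 65536 functions on 4 variables
s = 1
while terms_up_to(s, n) < functions:
    s += 1
print(s)  # smallest leaf count at which terms could cover all functions
```

Already at n = 4 the term count only catches up at s = 6 leaves, and the gap widens doubly exponentially in n.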

012610: Thought it over; I defined a tautology here. If you define the 'Shannon density' as the minimal term for a proposition, then of course everything sticks. For FACTOR, the encoding of a product is quadratic in time-space, but the minimal term size (of a concrete instance) is upper bounded by the number of divisors (up to the square root) after manipulating the term and algebraically removing one unknown.

012610: Perfect, I found the bound on the number of divisors again. The divisor bound asserts that, for every fixed $\varepsilon > 0$ and as $n$ gets large,

$d(n) \leq C_\varepsilon n^\varepsilon$

or the more precise bound

$\displaystyle d(n) \leq n^{O( 1/ \log \log n)} = \exp( O( \frac{\log n}{\log \log n} ) )$
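The bound is easy to sanity-check numerically; in this sketch the constant 1.5 in the exponent is an arbitrary illustrative choice, not the optimal one:

```python
# Compare d(n), the number of divisors of n, against
# exp(C * log n / log log n) with an illustrative constant C = 1.5.

from math import exp, log

def d(n):
    """Number of divisors of n, by trial division up to sqrt(n)."""
    count = 0
    i = 1
    while i * i <= n:
        if n % i == 0:
            count += 1 if i * i == n else 2
        i += 1
    return count

for n in (720720, 248832, 10**6):
    bound = exp(1.5 * log(n) / log(log(n)))
    print(n, d(n), round(bound, 1))
```

Even for highly composite n like 720720 = 2^4 * 3^2 * 5 * 7 * 11 * 13 (with 240 divisors), the bound comfortably dominates.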

012610: Of course, it's rather easy to describe an exponential number of bit strings with a term recognizing bit vectors. So the answer is: quadratic information, even though a term may describe an exponential number of symbols.
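A tiny sketch of the first half of that claim, with a conjunction as the recognizing term (my choice of example, not the entry's):

```python
# A term of linear size can recognize exponentially many bit vectors:
# a conjunction of the first k of n variables has term size O(k) but
# 2**(n - k) satisfying assignments.

from itertools import product

def satisfies(bits, k):
    """True iff the first k bits are all 1 (a size-k conjunction)."""
    return all(bits[:k])

n, k = 10, 3
count = sum(satisfies(bits, k) for bits in product((0, 1), repeat=n))
print(count, 2 ** (n - k))  # 128 128
```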

012710: Oh, right, a variable encodes an exponential number of bit vectors. A term encodes an exponential number of paths to a variable. Variables can occur on paths of even and odd length. Can we hope to compress along those paths? I think I tried that; it's And-Or tree compression, and close to a variable-picking heuristic from SAT solving.
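The path picture can be made concrete with terms as nested tuples; a minimal sketch (the tuple representation and path labels are my own scaffolding):

```python
# Each variable occurrence in a term sits at the end of a root-to-leaf
# path, and the same variable can occur on paths of both parities.
# Sharing such paths is the DAG / And-Or tree compression the entry asks about.

def paths(term, prefix=()):
    """Yield (variable, path) for each leaf of a nested-tuple term."""
    if isinstance(term, str):  # a leaf: a variable name
        yield term, prefix
    else:
        left, right = term
        yield from paths(left, prefix + ('L',))
        yield from paths(right, prefix + ('R',))

t = (('a', 'a'), 'a')  # the tautology (aa)a from the entry
for var, p in paths(t):
    print(var, p, len(p) % 2)  # a occurs at depths 2, 2, 1: both parities
```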

(Great, in a day I went from a logarithmic bound (which I remembered), to a linear one, to an exponential bound on the Shannon density of divisors. Messy. Ah well, it's been five years since I looked at this.)

Oi, Shannon, you home?