Andy's Math/CS page

Wednesday, September 26, 2007

In Plane Sight

Flatland is in tumult after a spate of triangle-on-triangle violence, and the prevailing attitude is one of utter paranoia. The triangles, which can see out of each of their three vertices, are so mistrustful that they will not tolerate the presence of another triangle unless they can see each of the other's vertices, and, moreover, see it with each of their own eyes. Triangles are assumed to be disjoint and opaque.

It is your job to help as many triangles as possible congregate, in a fashion acceptable to all participants, to promote dialogue and peace. Find yourself a pen, paper, and a bag of Doritos, and consider:

Problem: Can three equilateral triangles of equal size be placed in an 'acceptable' arrangement? (If yes, what is the upper limit, if any?)


Notes:

-I do know the answer, but it took me a while. I've never really understood triangles...

-I haven't read Abbott's 'Flatland' since maybe age 12, so I have no idea how faithful this scenario is, or whether related visibility issues were explored in the book. As far as I know this puzzle has not been posed elsewhere, but I do recall there being at least one interesting visibility puzzle in Engel's book 'Problem Solving Strategies'.

-Notice how I refrained from saying 'acute paranoia'... it wasn't easy, but I think I'm a better person for having done so.


Sunday, September 16, 2007

Information is not a Substance

I wanted to share a cool idea from the young field of 'network coding', which explores techniques for efficiently disseminating information over networks like the web. I believe it originated in this seminal paper, which I learned about indirectly by reading Michael Mitzenmacher's very interesting research blog My Biased Coin (which reflects his professional activity in network coding and is a good source; ref. #7 on this link is a good first article to read).

Consider the following directed network:



Here, each of the three top nodes has a stream of data it needs to communicate to the node of the corresponding color on the bottom. Each edge shown can transmit one bit per second in the direction indicated, and the purple nodes in the middle are willing to assist in communication.

The perverse structure of this network seems to pose a problem: each node on top has two pink edges which are headed towards the wrong parties on the bottom. Since bottom nodes cannot themselves communicate, the pink edges appear useless. It would seem that all information flow must happen through the black edges, whose bottleneck would allow only one party to get a bit across each second.

WRONG! (puzzle: stop reading and figure out a better strategy.)


Here's the idea: say the top nodes want to send bits x, y, z respectively on a given round. Each sends its bit on every available outgoing edge. The top purple node receives all three and computes their XOR, that is, x + y + z (mod 2). It sends this to the bottom purple node, which then broadcasts it to the three receivers.

Then, for instance, bottom-red receives y, z, and x + y + z, so it can XOR together everything it receives to get y + z + (x + y + z) = x (mod 2), just what it wanted. Similarly for the other nodes. Repeating this process, the network transmits at three times the rate we were led to expect by our faulty initial reasoning (a line of thought which restricted us to the best so-called 'multicommodity flow' on the network).
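If you like to see such things spelled out, here is a tiny Python sketch (mine, not from the paper) of one round of the scheme; the sender, relay, and receiver names just mirror the informal description above.

```python
import random

def one_round(x, y, z):
    # Each sender pushes its bit on every outgoing edge; the shared relay
    # forwards the XOR of the three bits it hears over the single black bottleneck.
    relay = x ^ y ^ z

    # Each receiver XORs the two bits it got directly (from the "wrong"
    # senders) with the relay bit, recovering the bit meant for it.
    red   = y ^ z ^ relay   # = x
    green = x ^ z ^ relay   # = y
    blue  = x ^ y ^ relay   # = z
    return red, green, blue

# Sanity check over random inputs: all three bits arrive every round,
# versus one bit per round if only the bottleneck carried information.
for _ in range(1000):
    x, y, z = (random.randint(0, 1) for _ in range(3))
    assert one_round(x, y, z) == (x, y, z)
```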

If these were roads, and we were sending formed substances to their destinations, our first approach would be valid. But information is not a substance--it obeys zany laws of its own, and can at times be merged, fractionalized, and delocalized in unexpected and useful ways (often involving that mischievous XOR). Better understanding the information capacity of networks more complex than the one above is a fascinating and important research frontier--so keep following Mitzenmacher's blog!


Friday, September 14, 2007

Be Rational

Today, enjoy a home-baked puzzle in real analysis--a somewhat old-fashioned subject that I can't help but love (much like automata theory).

We know that pi = 3.1415... is transcendental, i.e., it is not a zero of any nontrivial univariate polynomial with rational coefficients. Is it possible we could generalize this result?


Part I. Is there a continuous function f: R --> R, such that

a) f takes rational numbers to rational numbers,

b) f(x) is zero if and only if x = pi?


Part II. If you think no such f exists, then for which values (in place of pi) does such an f exist?
On the other hand, if f as in Part I does exist (I reveal nothing!), can it be made infinitely differentiable?


For those new to real analysis, 'weird' objects like the desired f above generally have to be created gradually, through an iterative process in which we take care of different requirements at different stages. We try to argue that 'in the limit', the object we converge to has the desired properties. (Does this sound like computability theory? The two should be studied together as they have strong kinships, notably the link between diagonalization/finite extension methods and the Baire Category Theorem.)

Passing to the limit can be subtler than it first appears; for example, a pointwise limit of continuous functions need not be continuous. Here is a pretty good fast online introduction to some central ideas in studying limits of functions.
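(The standard example of that first subtlety: the functions f_n(x) = x^n are each continuous on [0, 1], yet their pointwise limit is 0 for x < 1 and 1 at x = 1, which is not continuous.)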


Sunday, September 09, 2007

Correlation Reversal

In an earlier post, we discussed Kleitman's Theorem, which tells us that monotone events are nonnegatively correlated. So, for example, if we generate a random graph G, including each edge with probability 1/2, and we condition on the event that G is nonplanar, that can only make it more likely that G is Hamiltonian, not less.

There's a further wrinkle we could add to make things more interesting. By a witness for the event f(x) = 1 on a particular bitstring x, we mean a subset of the bits of x whose values alone force f(x) = 1, no matter how the remaining bits are set. (This event could have multiple, overlapping or disjoint, witnesses.)

Let f, g be monotone Boolean functions, and consider the event we'll call f^^g: the event (over uniformly chosen x) that f(x), g(x) are both 1 and that, moreover, we can find disjoint witnesses for these two conditions.

Problem: Is P(f^^g) greater or less than P(f = 1)*P(g = 1)?

On the one hand, f and g `help each other' to be 1, by Kleitman's Theorem. On the other hand, the disjoint-witness requirement seems to undercut their ability to positively influence each other. Which tendency wins out?

I'll tell you: P(f^^g) <= P(f = 1)*P(g = 1). The latter tendency prevails, although ironically enough, Kleitman's Theorem can actually be used to show this! It's a not-too-hard exercise in conditioning, which I recommend to the interested reader.
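If you'd like to see the inequality 'in action' before proving it, here's a small brute-force check in Python (my own sketch; the two monotone functions are just hypothetical examples) that computes the relevant probabilities exactly on the 3-bit cube.

```python
from itertools import product
from fractions import Fraction

n = 3

# Two hypothetical monotone functions on {0,1}^3, for illustration only.
def f(x):  # x0 OR x1
    return int(x[0] or x[1])

def g(x):  # x0 OR x2
    return int(x[0] or x[2])

def forces(h, x, S):
    # For monotone h, the bits of x on S force h = 1 exactly when
    # zeroing out every coordinate outside S still gives h = 1.
    z = tuple(x[i] if i in S else 0 for i in range(n))
    return h(z) == 1

def disjoint_witnesses(x):
    # Does x admit disjoint witness sets for f(x) = 1 and g(x) = 1?
    # Assign each coordinate to S, to T, or to neither (3^n cases).
    for labels in product((0, 1, 2), repeat=n):
        S = {i for i in range(n) if labels[i] == 1}
        T = {i for i in range(n) if labels[i] == 2}
        if forces(f, x, S) and forces(g, x, T):
            return True
    return False

cube = list(product((0, 1), repeat=n))
p_f   = Fraction(sum(f(x) for x in cube), len(cube))
p_g   = Fraction(sum(g(x) for x in cube), len(cube))
p_box = Fraction(sum(disjoint_witnesses(x) for x in cube), len(cube))

print(p_box, "<=", p_f * p_g, ":", p_box <= p_f * p_g)  # 1/2 <= 9/16 : True
```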

Here's the kicker, though: the inequality above is actually true for arbitrary f, g, not just monotone ones! This result, however, took a decade of work to prove; it was conjectured by van den Berg and Kesten, and proved by Reimer in 1994; see this article. Reimer's Inequality appears fruitful in probabilistic analysis, including in theoretical CS; see this article (which I have yet to read).


Saturday, September 08, 2007

On the Move

Time for a personal update: this week I began a stay as a visiting student at MIT's CS and AI Lab (CSAIL)--a very exciting place to be for a young would-be theorist. For this privilege I have my friend, fellow blogger, and new MIT prof Scott Aaronson to thank--thanks, Scott!

While here, I'd absolutely love to meet, get to know, trade ideas or collaborate with any Boston/Cambridge readers. Drop a line or come by my office, 32-G630. I'd also welcome advice on where to eat, how to furnish an apartment on the cheap, how to stop getting lost constantly around town and campus, and all that good stuff.

Speaking of getting lost, a small PSA: did you know that in a long random walk on a finite, connected, undirected graph, you can expect to visit any given node with frequency proportional to its degree? A node's location and centrality in the graph are ultimately irrelevant; only its degree matters. Equivalently, every edge gets traversed with the same limiting frequency, and with equal frequency in either direction.
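If you don't feel like proving it, a quick way to convince yourself is to simulate a long walk and compare visit frequencies to degrees. Here's a throwaway Python sketch of mine, on a made-up four-node graph.

```python
import random
from collections import Counter

# A small example graph: node 0 has degree 3, node 3 is a pendant of degree 1.
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}

steps = 500_000
node = 0
visits = Counter()
for _ in range(steps):
    node = random.choice(adj[node])  # step to a uniformly random neighbor
    visits[node] += 1

total_degree = sum(len(nbrs) for nbrs in adj.values())  # = 2 * number of edges
for v in sorted(adj):
    # Empirical visit frequency vs. predicted degree / (2|E|).
    print(v, round(visits[v] / steps, 3), "vs", round(len(adj[v]) / total_degree, 3))
```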

Despite its simplicity, this was an eye-popper for me as an undergrad, and my favorite result in our probability seminar, along with the theorem that the harmonic series 1 + 1/2 + 1/3 + ... converges with probability 1 when each term is instead given a random sign.
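(And if you want to 'watch' that last theorem, here's a throwaway simulation of mine: the partial sums of the randomly-signed harmonic series settle down, as almost-sure convergence predicts, while the unsigned series has grown past 14 by the millionth term.)

```python
import random

def signed_partial_sums(n_terms, checkpoints):
    # Partial sums of sum_{n >= 1} eps_n / n with independent fair random signs.
    s = 0.0
    seen = {}
    for n in range(1, n_terms + 1):
        s += random.choice((1.0, -1.0)) / n
        if n in checkpoints:
            seen[n] = round(s, 4)
    return seen

# The later partial sums barely move between checkpoints.
print(signed_partial_sums(10**6, {10**3, 10**4, 10**5, 10**6}))
```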