Thursday 14 July 2016

UCNC day 4

UCNC day 4, with an embarrassment of riches in the form of invited talks.

We kicked off with an invited talk from Friedrich Simmel on “Chemical Communication Between Cell-Sized Reaction Compartments”. This was a fascinating account of a series of experiments sending signals between cells, droplets, and “genelets” (droplets containing “naked” cellular genetic machinery), based on the ideas of quorum sensing: when a high enough concentration of a chemical signal is produced, because there are enough producers around, it invokes a response. We saw droplets signalling the chemicals, inducing bacteria to react, and that signal propagating through multiple droplets. Apparently there is a “bacterial Turing test”: can you make a droplet that a bacterium will interact with (through chemical signals) just as if it were another bacterium? These systems pass it. Through a clever use of microfluidics, we saw videos of sheets of bacteria interacting, via fluorescent protein production. The fluorescence increases both due to production being switched on by the signalling, and due to the bacteria reproducing, two processes with similar timescales. The possibilities of this approach include forming spatial and temporal patterns through reaction-diffusion systems of interacting genetically programmed droplets. If all this wasn’t enough, Simmel finished his talk with a description of using electron-beam lithography to etch chips and deposit gene-length strands of DNA in a controlled manner; these strands could then be manipulated to stick together (condense) into linear bundles. It’s early days yet; next on the agenda is using gene expression to control the condensation. Heady stuff!
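
To fix the quorum-sensing idea in my head, here’s a minimal toy model of the threshold behaviour: signal accumulates in proportion to the number of producers, and a steep Hill function switches the response on once the concentration crosses a threshold. All the parameters are illustrative inventions of mine, not numbers from Simmel’s experiments.

```python
# Toy quorum-sensing model: signal accumulates in proportion to the number
# of producers, degrades at a constant rate, and triggers a response
# (e.g. fluorescent protein expression) via a steep Hill function.
# All parameters are illustrative, not taken from Simmel's experiments.

def hill(s, k_half, n):
    """Fraction of maximal response at signal concentration s."""
    return s**n / (k_half**n + s**n)

def simulate(producers, t_max=100.0, dt=0.01):
    s = 0.0          # signal concentration
    r = 0.0          # accumulated response (reporter) level
    k_prod = 0.05    # signal production rate per producer
    k_deg = 0.1      # signal degradation rate
    k_half = 5.0     # half-activation concentration (the "quorum")
    n = 4            # Hill coefficient (steepness of the switch)
    for _ in range(int(t_max / dt)):
        s += (k_prod * producers - k_deg * s) * dt
        r += hill(s, k_half, n) * dt
    return s, r

# Few producers: signal stays below threshold, little response.
# Many producers: signal crosses threshold, response switches on.
for pop in (1, 10, 100):
    s, r = simulate(pop)
    print(f"producers={pop:4d}  steady signal={s:6.2f}  response={r:7.2f}")
```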

Next was the workshop on Physics and Computation. Gilles Dowek started with an invited talk on “Quantitative Informational Aspects in Discrete Physics”. Gandy showed that if a system (1) is homogeneous in space and time, (2) has a bounded speed of information transport, and (3) has a bounded density of information, then it can be simulated by a cellular automaton. Since physics appears to satisfy these properties, it should be so simulable. Then came a short but necessary digression on Planck’s constant. The physical constant c has the dimensions of a speed, and it is the speed of light. Planck’s constant has the dimensions of an action; what action is it? After a bit of discussion, it turns out to be (a small multiple of) the area of a bit of information (in a particular choice of units where everything has a dimension that is some power of length), as given by the Bekenstein bound. Dowek then went through how to build a CA modelling free fall (subject to some assumptions) in Newtonian physics, special relativity, and general relativity. It can’t be done in Newtonian physics, because there is no bound on speed. In SR it can be done, with particles that contain 320 bits of information (using the Planck area); in GR they need only 168 bits. This is an existence proof, but the CAs defined are not very satisfactory, for several reasons. The task is to do better! Listening to this, I recalled Randall Beer’s Game of Life talk from ALife last week: looking at a CA in terms of processes rather than cells gives a much more natural formulation. I wonder if that would work here?
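
Here’s my own back-of-the-envelope reconstruction of the “ħ is an area” claim (a hedged sketch; Dowek’s actual derivation may differ in detail):

```latex
% In geometrized units (c = G = 1), time and mass both carry the
% dimension of length, so an action is an area:
[\hbar] = \frac{[\text{mass}]\,[\text{length}]^2}{[\text{time}]}
        = \frac{L \cdot L^2}{L} = L^2 .
% Indeed, the Planck length satisfies \ell_P^2 = \hbar G / c^3,
% so with c = G = 1 we have \hbar = \ell_P^2.
% The holographic form of the Bekenstein bound caps the entropy of a
% region by its boundary area A:
S_{\max} = \frac{A}{4 \ell_P^2 \ln 2}\ \text{bits},
% i.e. one bit occupies area 4 \ln 2 \, \ell_P^2 \approx 2.77\,\hbar:
% a small multiple of Planck's constant, read as an area.
```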

Then we had a talk about “The Information Content of Systems in General Physics Theories”. The idea here is to look at a broad range of probabilistic theories, of which quantum mechanics is one instance. Investigating the computational complexity of the “advice” given by a physical system can shed light on what makes QM special, different from just a general theory.
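
A quick gloss on “advice”, for those of us not steeped in structural complexity (my gloss, not the speaker’s): advice is extra information supplied to a machine that may depend on the input’s length, but not on the input itself.

```latex
% P/poly: polynomial-time machines with polynomial-length advice.
L \in \mathrm{P/poly}
  \iff
  \exists\ \text{poly-time}\ M,\ \{a_n\}_{n \ge 0}
  \ \text{with}\ |a_n| \le \mathrm{poly}(n):\quad
  x \in L \Leftrightarrow M(x, a_{|x|})\ \text{accepts}.
```

P/log is the same with |a_n| = O(log n); the starred variant P/log* (which turns up in a later talk) additionally requires, if I recall the definitions right, that the advice remain valid for all shorter inputs too.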

After lunch Ana Belén Sainz gave an invited talk on “Postquantum Steering”. This was in the same vein as the previous talk: look at a general theory, then compare with QM. Here the idea was applied to one particular kind of system: how much can Bob “steer” distant Alice’s state, by making measurements on his own state?

Next came some more talks. The first, on “Sequent Calculus Representations for Quantum Circuits”, was an approach to making reasoning about quantum circuits look like proof-theoretic reasoning in other branches of computer science, by finding an appropriate set of axioms. Next was a talk on “Physical Computation, P/poly and P/log*”, looking at the computational complexity of physical computing used as an unconventional co-processor, in terms of its advice complexity. After coffee we had a talk on “Local Searches Using Quantum Annealers: How to Beat Noise and Take Advantage of the Classical Algorithms we Already Have, or, Modernising Quantum Annealing using Local Search”. This contrasted classical simulated annealing, including its two improvements of parallel tempering and population annealing, with the quantum version: quantum annealing. Each has its strengths and weaknesses; here was a suggestion of how to use quantum annealing as a “subroutine” inside a classical local search, getting the best of both approaches (a toy sketch of this hybrid loop appears after this paragraph). The final workshop talk was on “Quantum Probability as an Application of Data Compression Principles”, a philosophical look at probabilities in general, and branching-world probabilities in particular.
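
The promised sketch of the hybrid idea. This is my own toy rendering, with a crude random sampler standing in for the quantum annealer; the problem, parameters, and function names are all invented for illustration, not the speakers’ algorithm.

```python
import random

# Toy hybrid local search: a classical outer loop repeatedly hands a small
# sub-problem to an "annealer" subroutine, standing in for quantum hardware.

def energy(spins, J):
    """Ising-style energy: sum of J[i][j] * s_i * s_j over pairs i < j."""
    n = len(spins)
    return sum(J[i][j] * spins[i] * spins[j]
               for i in range(n) for j in range(i + 1, n))

def anneal_subproblem(spins, subset, J, tries=50):
    """Stand-in for a quantum annealer: finds low-energy values for the
    chosen variables (all others clamped) by crude random sampling."""
    best = [spins[i] for i in subset]
    best_e = energy(spins, J)
    trial = list(spins)
    for _ in range(tries):
        for i in subset:
            trial[i] = random.choice((-1, 1))
        e = energy(trial, J)
        if e < best_e:
            best_e, best = e, [trial[i] for i in subset]
    return best

def hybrid_search(n=20, outer_iters=200, subset_size=5, seed=1):
    random.seed(seed)
    J = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    spins = [random.choice((-1, 1)) for _ in range(n)]
    for _ in range(outer_iters):
        subset = random.sample(range(n), subset_size)
        for i, v in zip(subset, anneal_subproblem(spins, subset, J)):
            spins[i] = v   # accept: the subroutine never worsens the energy
    return spins, energy(spins, J)

spins, e = hybrid_search()
print("final energy:", e)
```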

The day was then completed with a further invited talk, Bob Coecke talking “In Pictures: From Quantum Foundations to Natural Language Processing”. He zipped through a beautiful, formal, diagrammatic notation for quantum systems, showing how the power of this notation makes many complicated quantum puzzles and proofs essentially vanish. There will be a book covering this, Picturing Quantum Processes, published soon by Cambridge University Press. It’s 922pp: pictures take a lot of space! After all this quantum mechanics, he went off in an unexpected direction, showing how the very same notation can be used to calculate the meaning of sentences from their underlying grammar and the meanings of the individual words. Some modern meaning systems use high-dimensional vectors to encapsulate word meanings. Adding the grammar via the diagrams improves the calculated meaning enormously. Thinking about the mathematical structures needed leads to the suggestion of using density matrices rather than vectors, to cope with ambiguous meanings. I love this kind of work: a deep piece of theory in one domain that is not only applicable in a seemingly unrelated domain, but suggests advances there, too.
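
A tiny numerical illustration of why density matrices help with ambiguity (my own toy example; the vectors and senses are made up, not taken from Coecke’s work): an ambiguous word is a mixture of its sense vectors, not their average, and the mixture can be detected by its purity.

```python
import numpy as np

def density(vectors, probs):
    """rho = sum_i p_i |v_i><v_i| over unit sense vectors."""
    dim = len(vectors[0])
    rho = np.zeros((dim, dim))
    for v, p in zip(vectors, probs):
        v = v / np.linalg.norm(v)
        rho += p * np.outer(v, v)
    return rho

# Two senses of "bank" in a 3-dimensional toy meaning space.
river_sense = np.array([1.0, 0.0, 0.2])
money_sense = np.array([0.0, 1.0, 0.1])

rho_bank = density([river_sense, money_sense], [0.5, 0.5])
rho_cash = density([money_sense], [1.0])

# Purity tr(rho^2) = 1 for an unambiguous (pure) meaning, < 1 for a mixture;
# a plain vector average would erase this distinction.
print("purity of 'bank':", np.trace(rho_bank @ rho_bank))
print("purity of 'cash':", np.trace(rho_cash @ rho_cash))
```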
