Local Scene: Leaves and Trees, Hannover

Great to be somewhere that has a local music scene. Hannover indie-folk band Leaves and Trees released their first EP on April 23rd. The release show/party at LUX was full by the time we arrived (from a concert of Max Reger's choral music at the Marktkirche, by way of the Pfannekuchen Haus), so we hung around sheltering from the wind in front of a locked door facing the Schwarzer Bär tram stop, which appeared to be next to the stage, as it transmitted the sound quite well. Only a couple of beers from a nearby kiosk would have been needed to complete the Just Kids Too Young to Get Inside picture, but we didn't bother... good sounds coming through the door, though. Nice arrangements, with good use of cello. I'll buy the EP at Bandcamp (where it can also be streamed).

The signature tune is Who Is That Man, for which a very well done video, telling a story that goes beyond the lyrics, is available on YouTube (you'll also get an accurate impression of the local woodlands from watching it):

There's also a nice video of lead singer Fabian Baumert singing another song from the EP at a singer-songwriter slam at local club Kulturpalast Linden:

I don't think every post about a band's new EP needs to be a "review", comparing it to the writer's favorite bands and the world's top artists, and opining about the band's chances of "making it" instead of just enjoying the music. Nevertheless, since this EP and the Who Is That Man? video evince very high production values that might suggest eventual goals wider than just local or regional success, a few comments along those lines. I think that's not an implausible ambition. I don't really know what the indie-folk scene or its possibilities are these days, but a little investigation suggests there are some pretty nice festivals and the like around Europe, with bands I enjoyed checking out on the web. (The opening band for L&T's LUX show was one.) Maybe there was a moment a few years back when, with Mumford and Sons and Bon Iver and such, indie-folkish singer-songwriter music was going mainstream, and maybe that moment is over, making the sort of modest success that can lead to an extended career for a band making a living from music tougher to come by.

I like all the songs on the EP, like the overall band sound, and like Fabian Baumert's singing. A little bit of gentle, almost Nordic North-German melancholy in the mix is very nice. Uncomplicated, but not completely predictable, song and chord structures, beautifully arranged. Relaxed tempos and feel in general. The sound on the EP seems very good, possibly a little crunchy in the treble, but so far I have only streamed it; the FLAC and CD may fix that. I'm reluctant to say such a thing, but I do think that for a broader---say, international---appeal, it would help if Mr. Baumert's accent when singing in English, which is generally quite good, were even more natural. Some of the lyrics are hard to understand, and in this kind of music that can be crucial. On a lighter note, it is risky to include "Whoa", let alone "Whoa-oah-oh", in lyrics, especially when you're playing acoustic guitar. It works out fine here.

If you have a local band of this quality, go to their shows, buy their music, and support them. Here's hoping Leaves and Trees get the opportunity to write and play much more and continue to grow.

Ed Dolan on the case for a universal basic income

Ed Dolan suggests that claims that social safety net programs on average don't disincentivize work much may depend on those programs' currently relatively limited coverage (especially compared to the situation before, say, the Reagan administration and the (hopefully soon to be qualified with the word "first") Clinton administration). He thinks advocates should consider replacing many of these programs with a universal basic income. There is a lot more one could say about this issue, but I think this is an important point to keep in mind.

Bach, Johannes-Passion, Bachchor und Orchester Hannover, Marktkirche

I attended a performance of J.S. Bach's Passion according to St. John (Johannespassion) by the Hannover Bach Choir and Orchestra last night at the Marktkirche in the central market square of Hannover's old town. I may or may not have listened my way through this work on LP as a youngster, and probably did overhear it on the stereo growing up, but this was probably my first careful listen to the whole piece. (About two hours, with no intermission, though there was a brief episode of tuning between the two sections.) A very rewarding if, obviously, fairly solemn two hours.

Really superb choral singing, with the different vocal parts sufficiently distinct and the words very clear (well, especially with the aid of a program, given my limited German), but the choir unified. The effect was remarkably dramatic when the choir portrayed the crowds present at the high priest's and Pilate's interrogations of Jesus, contrasting with the choir's other main role of expressing Christian sentiments from a point of view that is not necessarily within the narrative aspect of the piece (but might also be taken so, as expressing another aspect of the experience of some in the crowd). The latter role is usually filled by hymn-like chorales, but also often (as in the opening "Herr, unser Herrscher, dessen Ruhm") by more complex and extended episodes with more involvement of the orchestra.

The visible wind instruments were baroque in appearance, there was a large lute, and I suspect the string section and most or all of the rest of the orchestra were original-style instruments as well. Tempos were relatively fast, and the resulting sound was excellent, though for some reason the orchestra came across with less clarity than the singers---the relatively reverberant acoustic of the tall, relatively open North German gothic brick hall church may have had something to do with that. On balance I think the original instruments and the chosen tempos gave a somewhat rough, unprettified, but still accurate and well-played effect that worked extremely well in the piece, accentuating its seriousness. Some passages, in which the choir and orchestra engaged in extended contrapuntal reflection upon a dramatic development, or expression of the crowd's intention or reaction, with voices and instruments becoming a swirl of fast-moving harmonies and passing tones, attained an eerie and dramatic effect that reminded me of some twentieth-century postserialism, maybe Ligeti or Penderecki.

The soloists were really excellent and did everything well. Such a performance is definitely not about attention-getting individual vocals, but all the soloists, in performances that were consistent throughout, had some songs that really stood out in expressing key moments of the drama. Alto Christian Rohrbach has a beautiful clear voice and delivered "Es ist vollbracht!" perfectly; the soprano soloist (either Miriam Meyer or Nadine Dilger; two sopranos are listed in the program) was especially affecting (though never overdoing it) in "Zerfließe, mein Herze" ("Dein Jesus ist tot!"); bass Albrecht Pohl did a great job handling a variety of vocal tasks, combining the role of Pilate with many additional bass arias. Johannes Strauß was especially outstanding as the Evangelist---he has an amazingly clear and beautiful tenor voice, deployed with perfect control.

Of course an extended piece like this, with religious and dramatic aspects, is an occasion for plenty of reflection on musical aspects of the piece, but also on these in relation to the human condition. One of the more interesting aspects for me was the amount of attention given to the political and social side of the story: the interaction with Pilate (I don't fully understand what's going on here yet), the issue of Jesus being called "King of the Jews" while asserting "My kingdom is not of this world", the high priest and the servant, and later the crowd, after the exchange with Pilate ("Shall I crucify your king?" "We have no King but the Emperor"), calling for Jesus' crucifixion. (There seems to be an emphasis in this text on "the Jews" delivering Jesus to Pilate and calling for his crucifixion.)

A superb, clear, controlled and well-thought-out performance and a perfect way to get better acquainted with this serious, reflective, many-faceted masterwork of Bach's.

Deutsch, Popper, Gelman and Shalizi, with a side of Mayo, on Bayesian ideas, models and fallibilism in the philosophy of science and in statistics (I)

A few years back, when I reviewed David Deutsch's The Beginning of Infinity for Physics Today (see also my short note on the review at this blog), I ended up spending a fair amount of time revisiting an area of perennial interest to me: the philosophy of science, and the status of Popper's falsificationist and anti-inductive view of scientific reasoning. I tend to like the view that one should think of scientific reasoning in terms of coherently updating subjective probabilities, which might be thought of as Bayesian in a broad sense. (Broad because it might be more aligned with Richard Jeffrey's point of view, in which any aspect of one's probabilities might be adjusted in light of experience, rather than a more traditional view on which belief change is always and only via conditioning the probabilities of various hypotheses on newly acquired data, with one's subjective probabilities of data given the hypotheses never adjusting.) I thought Deutsch didn't give an adequate treatment of this broadly Bayesian attitude toward scientific reasoning, and wrote:

Less appealing is Deutsch and Popper’s denial of the validity of inductive reasoning; if this involves a denial that evidence can increase the probability of general statements such as scientific laws, it is deeply problematic. To appreciate the nature and proper role of induction, one should also read such Bayesian accounts as Richard Jeffrey’s (Cambridge University Press, 2004) and John Earman’s (MIT Press, 1992).

Deutsch and Popper also oppose instrumentalism and physical reductionism but strongly embrace fallibilism. An instrumentalist believes that particular statements or entities are not literally true or real, but primarily useful for deriving predictions about other matters. A reductionist believes that they have explanations couched in the terms of some other subject area, often physics. Fallibilism is the view that our best theories and explanations are or may well be false. Indeed many of the best have already proved not to be strictly true. How then does science progress? Our theories approximate truth, and science replaces falsified theories with ones closer to the truth. As Deutsch puts it, we “advance from misconception to ever better misconception.” How that works is far from settled. This seems to make premature Deutsch’s apparent dismissal of any role for instrumentalist ideas, and his neglect of pragmatist ones, according to which meaning and truth have largely to do with how statements are used and whether they are useful.

Thanks to Brad DeLong, I have been reading a very interesting paper from a few years back by Andrew Gelman and Cosma Shalizi, "Philosophy and the practice of Bayesian statistics", that critiques the Bayesian perspective on the philosophy of science from a broadly Popperian---they say "hypothetico-deductive"---point of view that embraces (as did Popper in his later years) fallibilism (in the sense of the quote from my review above). They are particularly concerned to point out that the increasing use of Bayesian methods in statistical analysis should not necessarily be interpreted as supporting a Bayesian viewpoint on the acquisition of scientific knowledge more generally. That point is well taken; indeed I take it to be similar to my point in this post that the use of classical methods in statistical analysis need not be interpreted as supporting a non-Bayesian viewpoint on the acquisition of knowledge. From this point of view, statistical analysis, whether formally Bayesian or "classical", is an input to further processes of scientific reasoning; the fact that Bayesian or classical methods may be useful at some stage of statistical analysis of the results of some study or experiment does not imply that all evaluation of the issues being investigated must be done by the same methods. While I was most concerned to point out that use of classical methods in data analysis does not invalidate a Bayesian (in the broad sense) point of view toward how the results of that analysis should be integrated with the rest of our knowledge, Gelman and Shalizi's point is the mirror image of mine. Neither of these points, of course, is decisive for the "philosophy of science" question of how that broader integration of new experience with knowledge should proceed.

Although it is primarily concerned to argue against construing the use of Bayesian methods in data analysis as supporting a Bayesian view of scientific method more generally, Gelman and Shalizi's paper does also contain some argument against Bayesian, and more broadly "inductive", accounts of scientific method, and in favor of a broadly Popperian, or what they call "hypothetico-deductive", view. (Note that they distinguish this from the "hypothetico-deductive" account of scientific method which they associate with, for instance, Carl Hempel and others, mostly in the 1950s.)

To some extent, I think this argument may be reaching a point that is often reached when smart people, indeed smart communities of people, discuss fundamental issues like this over many years, starting with strong differences of opinion: positions become more nuanced on each side, and effectively closer, but each side wants to keep the labels they started with, perhaps partly as a way of pointing to the valid or partially valid insights that have come from "their" side of the argument (even if they have come from the other side as well, in somewhat different terms), and perhaps also as a way of avoiding having to admit having been wrong in "fundamental" ways. For example, one sees insights similar to those in the work of Richard Jeffrey and others from a "broadly Bayesian" perspective, about how belief change isn't always via conditionalization using fixed likelihoods, also arising in the work of the "hypothetico-deductive" camp, where they are used against the simpler "all-conditionalization-all-the-time" Bayesianism. Similarly, Popperian ideas probably played a role in converting some relatively crude inductivists to a more sophisticated Bayesian or Jeffreyan approach. (Nelson Goodman's "Fact, Fiction, and Forecast", with its celebrated "paradox of the grue emeralds", probably played this role a generation or two later.)

Roughly speaking, the "corroboration" of hypotheses of which Popper speaks involves not just piling up observations compatible with the hypothesis (a caricature of "inductive support") but rather the passage of stringent tests. On the straight "falsification" view of Popper, these are stringent because there is a possibility they will generate results inconsistent with the hypothesis, thereby "falsifying" it; on the view which takes this as pointing toward a more Bayesian view of things (I believe I once read something by I. J. Good in which he said that this was the main thing to be gotten from Popper), the requirement might be relaxed to the statement that there are outcomes that are very unlikely if the hypothesis is true, thereby having the potential, at least, of leading to a drastic lowering of the posterior probability of the hypothesis (perhaps we can think of this as a softer version of falsification) if observed. The posterior probability given that such an outcome is observed of course does not depend only on the prior probability of the hypothesis and the probability of the data conditional on the hypothesis---it also depends on many other probabilities. So, for instance, one might also want such a test to have the property that "it would be difficult (rather than easy) to get an accordance between data x and H (as strong as the one obtained) if H were false (or specifiably flawed)". The quote is from this post on Popper's "Conjectures and Refutations" by philosopher of science D. G. Mayo, who characterizes it as part of "a modification of Popper". ("The one obtained" refers to an outcome in which the hypothesis is considered to pass the test.) I view the conjunction of these two aspects of a test of a hypothesis or theory as rather Bayesian in spirit. (I do not mean to attribute this view to Mayo.)
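To make the dependence on those "other probabilities" explicit, it may help to write Bayes' theorem with the catchall alternative spelled out (the notation here is mine, not Mayo's or Gelman and Shalizi's):

$$P(H \mid x) = \frac{P(x \mid H)\,P(H)}{P(x \mid H)\,P(H) + P(x \mid \neg H)\,P(\neg H)}$$

The extra ingredient is $P(x \mid \neg H)$, the probability of the passing outcome under the catchall of alternatives to $H$: Mayo-style stringency amounts to requiring that this be small, which is just what makes $P(H \mid x)$ rise sharply when the test is passed.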
I'll focus later---most likely in a follow-up post---on Gelman and Shalizi's direct arguments against inductivism and more broadly Bayesian approaches to scientific methodology and the philosophy of science. First I want to focus on a point that bears on these questions but arises in their discussion of Bayesian data analysis. It is that in actual Bayesian statistical data analysis "the prior distribution is one of the assumptions of the model and does not need to represent the statistician's personal degree of belief in alternative parameter values". They go on to say "the prior is connected to the data, so is potentially testable". It is presumably just this sort of testing that Matt Leifer was referring to when he wrote (commenting on my earlier blog entry on Bayesian methods in statistics):

"What I often hear from statisticians these days is that it is good to use Bayesian methods, but classical methods provide a means to check the veracity of a proposed Bayesian method. I do not quite understand what they mean by this, but I think they are talking at a much more practical level than the abstract subjective vs. frequentist debate in the foundations of probability, which obviously would not countenance such a thing.

The point Gelman and Shalizi are making is that the Bayesian prior being used for data analysis may not capture "the truth", or, more loosely, since they are taking into account the strong possibility that no model under consideration is literally true, may not adequately capture those aspects of the truth one is interested in---for example, it may not be good at predicting things one is interested in. Hence one wants to do some kind of test of whether or not the model is acceptable. This can be based on using the Bayesian posterior distribution as a model to be tested further, typically with classical tests such as "pure significance tests".
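Here is a minimal sketch, in Python, of the kind of posterior predictive check this describes; the toy model, the test statistic, and all the numbers are my own choices, not Gelman and Shalizi's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example: fit a normal(mu, 1) model with a normal(0, 10^2) prior on mu,
# then ask whether the fitted model can reproduce the spread of the data.
y = rng.standard_t(df=3, size=50)        # "real" data: heavier-tailed than the model

# Conjugate posterior for mu (known sampling sd = 1, prior mean 0, prior sd = 10):
n, prior_var = len(y), 10.0**2
post_var = 1.0 / (n / 1.0 + 1.0 / prior_var)
post_mean = post_var * y.sum()

# Posterior predictive check: simulate replicated datasets and compare a
# test statistic T (here the sample standard deviation) to the data's.
T_obs = y.std()
T_rep = []
for _ in range(4000):
    mu = rng.normal(post_mean, np.sqrt(post_var))
    y_rep = rng.normal(mu, 1.0, size=n)
    T_rep.append(y_rep.std())

p_value = np.mean(np.array(T_rep) >= T_obs)   # posterior predictive p-value
print(f"posterior predictive p = {p_value:.3f}")
```

Here the data are deliberately heavier-tailed than anything the fitted normal model can produce, so the spread of the replicated data rarely reaches the observed spread and the p-value comes out extreme, flagging the misspecification without any alternative model having been put into the prior.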
As Matthew's comment above might suggest, those of us of more Bayesian tendencies might agree that the particular family of priors---and potential posteriors---being used in data analysis (qua "parameter fitting", where perhaps we think of the prior distribution as the higher-level "parameter" being fit) might well not "contain the truth", and might be able to take these tests of the model, even if done using some classical statistic, as fodder for further, if perhaps less formal, Bayesian/Jeffreysian reasoning about which hypotheses are likely to do a good job of predicting what is of interest.

One of the most interesting things about Gelman and Shalizi's paper is that they are thinking about how to deal with "fallibilism" (Popper's term?), in particular, inference about hypotheses that are literally false but useful. This is very much in line with recent discussion at various blogs of the importance of models in economics, where it is clear that the models are so oversimplified as to be literally false, but nonetheless may prove predictively useful. (The situation is complicated, however, by the fact that the link to prediction may also be relatively loose in economics; but presumably it is intended to be there somehow.) It is not very clear how Popperian "falsificationism" is supposed to adapt to the fact that most of the hypotheses up for falsification are already known to be false. Probably I should go back and see what Popper had to say on that score later in his career, when he had embraced fallibilism. (I do recall that he tried introducing a notion of "verisimilitude", i.e. some kind of closeness to the truth, and that the consensus seems to have been---as Gelman and Shalizi point out in a footnote---that this wasn't very successful.) It seems to me that a Bayesian might want to say one is reasoning about the probability of statements like "the model is a good predictor of X in circumstances Y", "the model does a good job capturing how W relates to Z", and so forth. It is perhaps statements like these that are really being tested when one does the "pure significance tests" advocated by Gelman and Shalizi, when they write things like "In designing a good test for model checking, we are interested in finding particular errors which, if present, would mess up particular inferences, and devise a test statistic which is sensitive to this sort of mis-specification."

As I said above, I hope to take up Gelman and Shalizi's more direct arguments (in the cited paper) against "inductivism" (some of which I may agree with) and Bayesianism sensu lato as scientific methodology in a later post. I do think their point that the increasing use of Bayesian analysis in actual statistical practice, such as estimation of models by calculating a posterior distribution over model parameters beginning with some prior, via formal Bayesian conditioning, does not necessarily tell in favor of a Bayesian account of scientific reasoning generally, is important. In fact this point is important for those who do hold such a loosely Bayesian view of scientific reasoning:  most of us do not wish to get stuck with interpreting such priors as the full prior input to the scientific reasoning process.  There is always implicit the possibility that such a definite specification is wrong, or, when it is already known to be wrong but thought to be potentially useful for some purposes nonetheless, "too wrong to be useful for those purposes".

Anthony Aguirre is looking for postdocs at Santa Cruz in Physics of the Observer

Anthony Aguirre points out that UC Santa Cruz is advertising for postdocs in the "Physics of the Observer" program; and although review of applications began in December with a Dec. 15 deadline "for earliest consideration", if you apply fast you will still be considered.  He explicitly states they are looking for strong applicants from the quantum foundations community, among other things.

My take on this: The interaction of quantum and spacetime/gravitational physics is an area of great interest these days, and people doing rigorous work in quantum foundations, quantum information, and general probabilistic theories have much to contribute. It's natural to think about links with cosmology in this context. I think this is a great opportunity, foundations postdocs and students, and Anthony and Max are good people to be connected with, very proactive in seeking out sources of funding for cutting-edge research and very supportive of interdisciplinary interaction. The California coast around Santa Cruz is beautiful, SC is a nice funky town on the ocean, and you're within striking distance of the academic and venture capital powerhouses of the Bay Area. So do it!

Hannover wine roundup I: mostly French wines from Jacques'

Now that I'm living in Hannover, Germany for a while, I'm observing that the quality of life here is very high, especially so in the areas for which this blog is named.  I've already posted a little on physics so here's a bit on wine:

As I've mentioned a few times in discussing some of Trader Joe's wine offerings and elsewhere, German wine retailers seem to have a lot of good and reasonably priced Bordeaux---and French wines generally---available that we don't get in the US, possibly because of lower transportation costs and the ease of making direct business contacts with producers who may not sell enough to make it worth marketing across the Atlantic. (The TJ's connection is that, as far as I know, Trader Joe's is privately held in the same hands as the German discount supermarket chain ALDI.) Now that I'm living in Germany, I'm taking advantage of this fact.

Jacques' Wein Depot has stores all over town (and indeed all over Germany), and many bottles always open for tastings. They had an excellent, medium-full-bodied and quite fruity (strawberry especially), but not sloppy and overripe, Côtes du Rhône from the AOC Rasteau, labeled Ortas Cuvée Prestige 2012 and produced by Caves de Rasteau, for around 7 euros. It also has a hint of a typical Côtes du Rhône taste I call "leafy". An excellent deal. My top pick from Jacques' was Chateau La Croix Romane, AOC Lalande de Pomerol 2011, a right-bank Bordeaux of 80% Merlot and 10% each Cabernets Franc and Sauvignon. Elegant but still medium-bodied, with enough tannin to suggest an optimal drinking window of 4-8 years from now, but delicious now, with definite chocolatey flavors and some complexity. 18 euros and well worth it. 2009 was an excellent year in much of Europe, it seems, and two moderately priced 2009 Médocs from Jacques', Chateau Castera and Chateau Chantelys, were both excellent, the former coming in a bit spicier and fuller-bodied, the latter a bit lighter and more elegant. The Castera was about 13 euros on sale (14 normally); the Chantelys, I think, was less. I think you would have difficulty finding a Médoc of this quality in this price range in the US. There are cheaper ones at TJ's in the US that are OK, and in general among the better values for reasonably priced Bordeaux in the US, but I would much rather spend in the low teens for one of these. Besides the business advantages German distributors have, there may be an additional quality advantage to getting French wine in Europe rather than the US: it has not been shipped by boat across the ocean, a roughing-up that probably does some damage to more elegant, subtle, and ready-to-drink wines.

On a different note, for around 9 or 10 euros, a 2014 Riesling "Collection les terroirs" from Jean Geiler (Alsace) started out a bit rough and ready, but with some air during the course of a meal it got more interesting---the back of the bottle claims flavors of ripe lemon and coriander, but I found it more reminiscent of juicy starfruit. An interesting and tasty bottle in the end; I'll probably get more, though perhaps not load up on it.

Departing from the French theme for a moment, a Macedonian wine, STOBI Vranec Veritas (STOBI being the winery, Vranec the grape, Veritas their name for the wine), from the Tikveš wine region, was recommended by one of the salespeople at the Kopernikusstrasse branch. It was very good, fuller-bodied than the French reds mentioned above, closer to a "New World" style but not overripe and sloppy. I will definitely go for more of it.

I've found the recommendations of the salespeople at these stores to be reliable, their descriptions accurate.

As I've often said, I don't put much stock in numerical wine ratings, but if you want a rough idea of how these might rate on a Parker-type 100-point scale (as he used it circa the mid-1980s, say, which was the era when I paid some attention), I'd give the La Croix Romane a 90, the Castera 88, the Chantelys 87, the Rasteau 86, the Geiler Riesling 85, and the Vranec Veritas 88 or maybe 89. (My impression is that there's been some inflation in Parker's and his cohorts' ratings since then; the 85 for the Riesling is still a pretty respectable showing, and all of these wines are interesting and worth drinking.)

Martin Idel: the fixed-point sets of positive trace-preserving maps on quantum systems are Jordan algebras!

Kasia Macieszczak is visiting the ITP at Leibniz Universität Hannover (where I arrived last month, and where I'll be based for the next 7 months or so), and gave a talk on metastable manifolds of states in open quantum systems. She told me about a remarkable result in the Master's thesis of Martin Idel at Munich: the fixed-point set of any trace-preserving, positive (not necessarily completely positive) map $\mathcal{E}$ on the space of Hermitian operators of a finite-dimensional quantum system is a Euclidean Jordan algebra. It's not necessarily a Jordan subalgebra of the usual Jordan algebra associated with the quantum system (whose Jordan product is the symmetrized matrix multiplication, $a \circ b = \frac{1}{2}(ab + ba)$). We use the usual characterization of the projector onto the fixed-point space of such a linear map, $P_\infty = \lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^{N} \mathcal{E}^n$. The maximum-rank fixed point is $P_\infty(\mathbb{1})$ (where $\mathbb{1}$ is the identity matrix), which we'll call $F$, and the Jordan product on the fixed-point space is the original one "twisted" to have $F$ as its unit: for fixed points $a$ and $b$, this Jordan product, which I'll denote by $\star$, is:

$$a \star b = \frac{1}{2}\left(a F^{-1} b + b F^{-1} a\right),$$

with $F^{-1}$ the inverse taken on the support of $F$ (all fixed points are supported there), which we could also write in terms of the original Jordan product as $a \star b = \Phi\left(\Phi^{-1}(a) \circ \Phi^{-1}(b)\right)$, where $\Phi$ is the map defined by $\Phi(x) = F^{1/2} x F^{1/2}$.
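Here is a minimal numerical sketch of these definitions in Python (the example channel and all the names are mine, purely for illustration): estimate the fixed-point projector by the Cesàro average, extract $F$, and check that $F$ acts as the unit for the twisted product.

```python
import numpy as np

# Estimate the projector onto the fixed-point space of a positive
# trace-preserving map E via the Cesaro average (1/N) sum_{n=1..N} E^n,
# extract the maximal-rank fixed point F = P_inf(identity), and check
# that the twisted product a * b = (a F^{-1} b + b F^{-1} a)/2 has F as unit.

d = 2
sigma = np.diag([0.75, 0.25])            # invariant state of the example channel

def E(rho):
    """Replacement channel: E(rho) = tr(rho) * sigma (positive, trace-preserving)."""
    return np.trace(rho) * sigma

def cesaro_average(channel, X, N=500):
    """Apply (1/N) * sum_{n=1}^{N} channel^n to the matrix X."""
    acc = np.zeros_like(X, dtype=complex)
    Y = X.astype(complex)
    for _ in range(N):
        Y = channel(Y)
        acc += Y
    return acc / N

F = cesaro_average(E, np.eye(d))         # maximal-rank fixed point, here 2*sigma
Finv = np.linalg.inv(F)                  # F is full-rank in this example

def twisted_jordan(a, b):
    """Jordan product on the fixed-point space, twisted to have F as unit."""
    return 0.5 * (a @ Finv @ b + b @ Finv @ a)

a = 3.0 * sigma                          # an arbitrary fixed point of E
print(np.allclose(twisted_jordan(F, a), a))   # True: F is the unit
```

For this replacement channel the fixed points are just the multiples of $\sigma$, so the check is easy to do by hand as well: $F = 2\sigma$, and $F \star a = a$ for any fixed point $a$.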

Idel's result, Theorem 6.1 in his thesis, is stated in terms of the map on all complex matrices, not just the Hermitian ones; the fixed-point space is then the complexification of the Euclidean Jordan algebra. In the case of completely positive maps, this complexification is "roughly a $C^*$-algebra" according to Idel. (I suspect, but don't recall offhand, that it is a direct sum of full matrix algebras, i.e. isomorphic to a quantum system composed of several "superselection sectors" (the full matrix algebras in the sum), but, as in the Euclidean case, not necessarily a $*$-subalgebra of the ambient matrix algebra.)
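If that suspicion is right, the structure would be the one familiar from the fixed-point theory of quantum channels with a full-rank invariant state (this is my gloss, not a claim from Idel's thesis):

$$\mathcal{F}(\mathcal{E}) = \bigoplus_k M_{n_k} \otimes \tau_k,$$

a direct sum of full matrix algebras $M_{n_k}$, each tensored with a fixed density matrix $\tau_k$ on a multiplicity factor; the $\tau_k$ "distortion" is exactly why the fixed-point set need not be a $*$-subalgebra of the ambient matrix algebra.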

I find this a remarkable result because I'm interested in places where Euclidean Jordan algebras appear in nature, or in mathematics. One reason for this is that the finite-dimensional ones are in one-to-one correspondence with homogeneous, self-dual cones; perhaps I'll discuss this beautiful fact another time. Alex Wilce, Phillip Gaebeler and I related the property of homogeneity to "steering" (which Schrödinger considered a fundamental weirdness of the newly developed quantum theory) in this paper. I don't think I've blogged about this before, but Matthew Graydon, Alex Wilce, and I have developed ways of constructing composites of general probabilistic systems based on reversible Jordan algebras, along with some results that I interpret as no-go theorems for such composites when one of the factors is not universally reversible. The composites are still based on Jordan algebras, but are necessarily (if we wish them to remain Jordan-algebraic) not locally tomographic unless both systems are quantum. Perhaps I'll post more on this later, too. For now I just wanted to describe this cool result of Martin Idel's that I'm happy to have learned about today from Kasia.

ITFP, Perimeter: selective guide to talks. #1: Brukner on quantum theory with indefinite causal order

Excellent conference the week before last at Perimeter Institute: Information Theoretic Foundations for Physics. The talks are online; herewith a selection of some of my favorites, heavily biased towards ideas new and particularly interesting to me (so some excellent talks that might be of more interest to you may be left off the list!). Some of what would probably have been of most interest, and most novel, to me happened on Wednesday, when the topic was spacetime physics and information; I had to skip that day to work on a grant proposal, so I'll have to watch those talks online sometime. This was going to be one post with thumbnail sketches/reviews of each talk, but as usual I can't help running on, so it may end up as one post per talk.

All talks available here, so you can pick and choose. Here's #1 (order is roughly temporal, not any kind of ranking...):

Caslav Brukner kicked off with some interesting work on physical theories with indefinite causal structure. Normally, in formulating theories in an "operational" setting (in which we care primarily about the probabilities of physical processes that occur as part of a complete compatible set of possible processes), we assume a definite causal (partial) ordering, so that one process may happen "before" or "after" another, or "neither before nor after". The formulation is "operational" in that an experimenter or other agent may decide upon, or at least influence, the set of processes, out of possible compatible sets, from which the actual process will be drawn, and then nature decides (but with certain probabilities for each possible process, which form part of our theory) which one actually happens. So for instance, the experimenter decides to perform a Stern-Gerlach experiment with a particular orientation X of the magnets; then the possible processes are, roughly, "the atom was deflected in the X direction by an angle theta," for various angles theta. Choose a different orientation, Y, for your apparatus, and you choose a different set of possible compatible processes ("the atom was deflected in the Y direction by an angle theta").

We then assume that if one set of compatible processes happens after another, an agent's choice of which complete set of processes is realized later can't influence the probabilities of processes occurring in an earlier set. "No signalling from the future", I like to call this; in formalized operational theories it is sometimes called the "Pavia causality axiom". Signaling from the past to the future is fine, of course. If two complete sets of processes are incomparable with respect to causal order ("spacelike-separated"), the no-signalling constraint operates both ways: neither Alice's choice of which compatible set is realized, nor Bob's, can influence the probabilities of processes occurring at the other agent's site. (If it could, that would allow nearly-instantaneous signaling between spatially separated sites---a highly implausible phenomenon only possible in preposterous theories, such as the Bohmian version of quantum theory with "quantum disequilibrium", and Newtonian gravity.)

Anyway, Brukner looks at theories that are close to quantum, but in which this assumption doesn't necessarily apply: the probabilities exhibit "indeterminate causal structure". Since the theories are close to quantum, they can be interpreted as allowing "superpositions of different causal structures", which is just the sort of thing you might think you'd run into in, say, theories combining features of quantum physics with features of general relativistic spacetime physics. As Caslav points out, since in general relativity the causal structure is influenced by the distribution of mass and energy, you might hope to realize such indefinite causal structure by creating a quantum superposition of states in which a mass is in one place, versus being in another. (There are people who think that at some point---at some combination of spatial scale (separation of the regions in which the mass is located) and mass scale (amount of mass to be separated in "coherent" superposition)---the possibility of such superpositions breaks down. Experimentalists in Vienna (where Caslav---a theorist, but one who likes to work with experimenters to suggest experiments---is on the faculty) have created what are probably the most significant such superpositions.)

Situations with a superposition of causal orders seem to exhibit some computational advantages over standard causally-ordered quantum computation, like being able to tell in fewer queries (one?) whether a pair of unitaries commutes or anticommutes. I'm not sure whose result that was (Giulio Chiribella and others?), but Caslav presented some more recent results on query complexity in this model, extending the initial ones. I am generally wary about results on computation in theories with causal anomalies. The stuff on query complexity with closed timelike curves, e.g. by Dave Bacon and by Scott Aaronson and John Watrous, has seemed uncompelling to me---not the correctness of the mathematical results, but rather the physical relevance of the definition of computation---for reasons similar to those given by Bennett, Leung, Smith and Smolin. But I tend to suspect that Caslav and the others who have obtained these query results use a more physically compelling framework, because they are well versed in the convex operational or "general probabilistic theories" framework, which aims to make the probabilistic behavior of processes consistent under convex combination ("mixture", i.e., roughly speaking, letting somebody flip coins to decide which input to present your device with). Inconsistency with respect to such mixing is part of the Bennett/Leung/Smith/Smolin objection to the CTC complexity classes as originally defined.
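As a toy illustration of the commute-vs-anticommute test (my own sketch, not anything from the talk; I'm taking the standard "quantum switch" story at face value): put a control qubit in a superposition of the two orderings UV and VU, and measure the control in the X basis; after a single use of each unitary the outcome is deterministic, according to whether the pair commutes or anticommutes.

```python
import numpy as np

# A control qubit in (|0> + |1>)/sqrt(2) routes the target through U V
# (control 0) or V U (control 1). Measuring the control in the X basis
# gives "+" with amplitude (UV + VU)|psi>/2 and "-" with amplitude
# (UV - VU)|psi>/2, so commuting pairs always give "+" and
# anticommuting pairs always give "-".

def switch_test(U, V, psi):
    plus_branch = (U @ V + V @ U) @ psi / 2.0
    minus_branch = (U @ V - V @ U) @ psi / 2.0
    return np.linalg.norm(plus_branch) ** 2, np.linalg.norm(minus_branch) ** 2

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 0], dtype=complex)

print(switch_test(X, X, psi))  # commuting pair:     (1.0, 0.0)
print(switch_test(X, Z, psi))  # anticommuting pair: (0.0, 1.0)
```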

[Update: This article at Physics.org quotes an interview with Scott Aaronson responding to the Bennett et al. objections. Reasonably enough, he doesn't think the question of what a physically relevant definition of CTC computing is has been settled. When I try to think about this issue, I sometimes wonder whether the thorny philosophical question of whether we court inconsistency by trying to incorporate intervention ("free choice of inputs") into a physical theory is rearing its head. As often with posts here, I'm reminding myself to revisit the issue at some point... and think harder.]

Duke Ellington Sacred Concerts---Oxford University Jazz Orchestra and Schola Cantorum Oxford

Just came from an extraordinary concert at the Sheldonian Theatre in which the Oxford University Jazz Orchestra and the Schola Cantorum of Oxford performed a version of Duke Ellington's Sacred Concerts, with two pieces from composer and baritone Roderick Williams' Oxford Blues Service inserted into the Sacred Concert running order. This constituted the second half of the program; I'll perhaps write in another post about the first half, which featured many good things but a sound balance that was slightly problematic at times, with the band occasionally drowning out the excellent guest soloist, alto saxophonist Nigel Hitchcock. (I can't allude to the first half, though, without mentioning the really superb singing of first-year Olivia Williams in "Lookin' Back" and "Feelin' Good".) In the second half, the balance was suddenly almost perfect, the bass acoustic throughout, the swing consistent and unforced, and immediately, with the meditative baritone saxophone solo (originally performed by Harry Carney) that introduces "In the Beginning God", we were immersed in Duke Ellington's world of sound and his personal take on religion and spirituality. Besides the excellence of the band, choir, and soloists, the conducting and preparation of the musicians by Schola conductor James Burton was clearly crucial to the success of this performance. Nigel Hitchcock's beautiful alto playing was another crucial ingredient, but the regular band members who played key solos---the baritone sax in "In the Beginning", the clarinet in "Freedom", the plunger-muted trumpet in "The Shepherd"---did themselves and the Duke proud as well. The Roderick Williams pieces "Gray Skies Passing Over" and "The Lord's Prayer" fit in perfectly, being in a somewhat harmonically lush jazz-to-mid-twentieth-century pop vocal style very similar to parts of the Ellington vocal score, but more contrapuntal, with, I think, an echo of English, and even perhaps Renaissance, church music.

Besides getting real swing from the ensemble, Burton kept things relaxed but accurate, with a real dynamic range, the band in balance with the soloists (Ellington's writing presumably helps here too), expressive phrasing and control over the pace and development of each piece.  "Freedom" was another standout, done with intense feeling and great energy, drawing roars of approval from the audience.  But all the movements were executed superbly, and there were many such moments.  The tap-dancing of Annette Walker, in "David Danced Before The Lord" was another highlight.

This was an utterly professional-sounding performance that felt infused with the passion of people who are together reaching a level they may or may not have reached before, in the zone, giving the audience a musical experience not to be forgotten. The Sacred Concerts may be a work best experienced live---it was certainly immensely effective, enjoyable, powerful, and moving in this performance. Bass player and alto Lila Chrisp, who is in both groups, apparently had the idea that they should join forces in this piece. I'm very grateful to everyone involved for making this happen and really filling the Sheldonian with the spirit---especially the spirit of Duke Ellington and his band.