Month: October 2015

Conference Report: SOTFOM III, 21-23 September 2015

The 3rd Symposium on the Foundations of Mathematics was held in the Lecture Hall of the KGRC, in Vienna, from the 21st to the 23rd of September 2015.

The purpose of the conference was to bring together scholars who, in recent years, have contributed to the ongoing debate on the foundations of set theory, in particular on such topics as the universe/multiverse dichotomy, new axioms of set theory and their ontological and epistemological features, different forms of justification for the acceptance of new axioms, competing foundations and, finally, the Hyperuniverse Programme (HP), currently being investigated at the KGRC by S. Friedman and collaborators.

The conference was opened by Tatiana Arrigoni’s talk, which aimed to assess the current status of the HP, as recently developed by Friedman, Antos, Honzik and Ternullo. Arrigoni acknowledged that much work has been done within the programme in the direction of connecting multiverse axioms to the concept of set, but she also pointed out that further work is needed to make a full case for the `intrinsicness’ of the programme’s maximality principles. She also encouraged further work on the issue of why the programme believes that intrinsically justified axioms are particularly valuable for the fruitful (and correct) development of set theory.

In his talk, Giorgio Venturi explained how forcing may lend support to a realist conception of set theory, via what he calls trans-universe realism, whose plausibility, in turn, is grounded in some recent mathematical results due to Hamkins. Pushing this interpretation of forcing even further, Venturi suggested that the notion of arbitrary set, which is an integral part of what he sees as a feature of a realist attitude in set theory (Bernays’ quasi-combinatorialism), may find a sharper formulation in that of generic set, as used in forcing.

Finally, in the last talk of the morning of Day 1, Dan Waxman assessed the argument that the independence phenomenon in set theory leads one to view some set-theoretic statements as indeterminate. The specific argument assessed by Waxman takes determinacy to arise either from the `world’ or from our `practice’. Now, if one accepts what Waxman defines as, respectively, a metaphysical (1) and a cognitive (2) constraint on our theories, whereby (1) objects are not ineliminable in our best account of mathematics and (2) we cannot attribute non-mechanical powers to our minds, then there will, of necessity, be indeterminate statements in set theory.

In the afternoon, Matteo Viale reviewed some of the reasons why forcing axioms have proved to be `successful’ in set theory. Among others, he pointed to the following two: (1) they are equivalent to key, well-established principles, such as the Axiom of Choice, and (2) they lead to absoluteness results within the set-generic multiverse.

In the last talk of Day 1, Øystein Linnebo presented his well-known modal version of the axioms of set theory using plural quantification, showing how it fits the requirements of a potentialist conception which originates from Cantor’s work (which famously identifies proper classes such as the ordinals as inconsistent, qua incompletable, multiplicities). In Linnebo’s account, this means, in particular, that (1) not all objects in V are given immediately (rather, they are produced gradually) and (2) that truths are `created’ as the hierarchy gradually unfolds.

On Day 2, Mary Leng reviewed Penelope Maddy’s recent book `Defending the Axioms’, where Maddy advocates a form of realism labelled `thin’ realism or `arealism’. In particular, Leng described how, on this conception, the old dispute about the existence of mathematical objects is irrelevant, while it still makes sense to ask whether, e.g., CH is true or false. She then took a step further, pointing out that arealism might, in fact, be compatible with the view that there is no fact of the matter about whether CH is true or false.

Afterwards, Neil Barton showed that the HP may be compatible with several pictures of the universe of sets and, in particular, with that arising from an `absolutist’ conception of V. The bulk of Barton’s strategy consists in showing not only that forcing extensions of V can be coded into models definable in V itself, as already shown by Hamkins, but that, in general, using V-logic, this applies to all outer models of the Morse-Kelley axioms (which, in turn, best express the absolutist viewpoint). V-logic is a basic ingredient in the expression of the HP’s maximality principles and, thus, Barton’s project may open the way to an `absolutist’ construal of the HP.
The rest of Day 2 was spent on a trip to the Wienerwald. After lunch at the rural Mostalm, Dr Peter Telec kindly offered to lead a relaxing walk through the beautiful Viennese woods.


Day 3 opened with Geoffrey Hellman’s talk detailing a height-potentialist conception of V. Hellman showed us how his account, relying on a modal version of Zermelo’s second-order ZFC, is adequate to express some crucial intuitions behind set theory, including the set/class distinction as proposed by Zermelo, as well as the indefinite extendability of the universe. Moreover, he showed how his account could also more naturally justify second-order reflection, which would take us beyond the inaccessibles and would help justify all small large cardinals.
Talking after Hellman, Sam Sanders showed that, using non-standard analysis, one can successfully respond to Voevodsky’s recent questioning of the adequacy of ZFC as a foundation of mathematics, on the ground that the latter cannot be `computational’. This presupposition led Voevodsky to reject ZFC as a foundation and to advocate HoTT (Homotopy Type Theory) as a plausible alternative. As shown by Sanders, `computational’, here, does not mean `implemented by a computer’, but rather `constructive’ in the sense of Martin-Löf’s intuitionistic type-theoretic system.
Subsequently, Emil Weydert explored the intriguing topic of how the HP could be used to produce an axiom induction framework. The basic ingredient is given by formalising `multiverse reasoning’ in terms of non-monotonic reasoning, whereby the addition of new hypotheses (i.e., new axioms) leads to differing conclusions. Weydert’s project also envisages taking into account other parameters which are relevant to selecting new axioms.
In his talk, Douglas Blue started with a definition of maximality in set theory as the demand that a candidate axiom maximise the interpretative power of a theory T. He then showed us the interesting case of the axiom (*), which satisfies the aforementioned definition (that is, as shown by Woodin, it maximises the theory of H(ω_2) as far as Π_2-sentences are concerned), but which conflicts with a more intuitive version of maximality, one entailing that the power-set operation adds novel structure: if we adopt (*), then H(ω_1) is bi-interpretable with H(ω_2) and, thus, (*) fails to satisfy the second version of maximality.

Our last speaker was Sy Friedman, who described different notions of `new axiom’ and justifications thereof. Friedman reviewed the maximality principles which have been investigated within the HP and which are, arguably, related to the concept of set (and, thus, intrinsically justified). A further criterion by which to judge the value of an axiom is to see whether it is useful. Here, however, Friedman departed from the standard interpretation of usefulness, suggesting that the most useful set-theoretic axioms are those which are also useful for non-set-theoretic mathematics. Finally, he formulated the conjecture that the intrinsically justified higher-order principles of the HP will prove useful for finding first-order axioms which are useful for non-set-theoretic mathematics: such axioms will then have to be considered true axioms of set theory.

The conference had several outcomes. First, we believe it helped clarify some of the underlying assumptions of the HP, the theoretical challenges it has to face, and the different ways such challenges can be met (Arrigoni, Weydert, Barton).

The potentialist conception was reviewed in depth (Linnebo, Hellman), and its main advantages for the foundational purposes of set theory were fully described.

A description of the multiverse and its utility for set theory was carried out in several talks, and the pressing issues of truth and ontology relating to it, as arising from pluralism (Waxman) or in relation to realism (Venturi), were also examined.

Naturalism was evoked in some talks (Leng, Venturi, Arrigoni). From these talks and the ensuing discussion, it seems reasonable to assert that it is still unclear whether naturalism can properly ground set-theoretic work in a fully satisfactory way, especially if one shifts to a multiversist conception.

New set-theoretic axioms, arising from the need for `maximality’ principles, were the subject of several talks (Barton, Friedman, Blue, Viale), all of which, in our opinion, helped dispel some confusion relating to the notion of maximality, while also clearly highlighting how poorly understood the notion still is. The debate on which form of maximality is more acceptable is still open, but it seems that the HP may have good prospects of breaking new ground.

Finally, Sanders’ talk hinted at how the issue of which foundational theory is preferable, among those available, might be settled in a way alternative to those usually discussed, that is, by examining features such as `computability’.

In recognition of the joint effort of the organisers and speakers, a proposal for the proceedings of the conference, also including some of the papers discussed at previous SOTFOMs, will be submitted to Synthese.

A selection of slides for the talks is available here: Weydert, Venturi, Arrigoni, Hellman, Linnebo, Friedman, Sanders, Viale, Barton, Waxman and Warren.

We already look forward to SotFoM IV!