SEMINAR - Finished

Bruno Mendonça

In this talk, I present an overview of the recent literature on the dilemma of inference, a conflict between the following two theses: i) Non-informativeness: due to their deductive character, proofs cannot extend the mathematical information possessed beforehand; the semantic information carried by the conclusion of a proof is included in the information carried by its premises. ii) Epistemic usefulness: demonstrations improve our mathematical knowledge. This talk is divided into two parts. In the first part, we will examine Novaes' standpoint on this issue (Novaes, 2020). In her opinion, the dilemma of inference is a false dilemma: deductive reasoning is epistemically useful despite its non-informativeness. We will see that Novaes' solution is too restrictive, since it assigns only a negative utility to logical knowledge. In the second part, we will survey a tradition that analyses the informativeness of proofs in terms of a dichotomy between surface and depth information. In particular, we will consider D'Agostino and Floridi's treatment of the topic, which ascribes a positive informational value specifically to hypothetical proofs (D'Agostino and Floridi, 2009). I then argue that, although D'Agostino and Floridi successfully delimit a large set of informative proofs, driven by a veridicalist conception of semantic information they offer an implausible, hyper-realist explanation of the information displayed by such proofs. Finally, I claim that we might achieve a more parsimonious (anti-realist) account by paying attention to the pragmatic and dialogical dimensions of our use of dischargeable hypotheses.
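For orientation (an illustration of my own, not taken from the abstract), a hypothetical proof is one that uses a dischargeable hypothesis, as in the natural-deduction introduction rule for the conditional:

```latex
% Conditional proof: assume A, derive B, then discharge the assumption.
% The bracketed [A]^1 marks the hypothesis discharged at step 1.
\[
\frac{\begin{matrix}[A]^{1}\\ \vdots\\ B\end{matrix}}{A \to B}
\;{\to}\mathrm{I},\,1
\]
```

The conclusion A → B holds outright, even though the subderivation of B rests on the merely assumed A; this is the kind of proof to which D'Agostino and Floridi assign a positive informational value.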

Date
Tiago de Castro Alves

The kind of task to be carried out in this talk can be seen as belonging to, quoting Kreisel, “the sort of Kleinarbeit which is generally needed to support a genuine hypothesis (...) as opposed to a mere mathematical fancy” (Kreisel 1971, p. 114). We propose a brief analysis of an argumentative strategy, attributed by Kreisel (1971) to Barendregt, for justifying the completeness claim of the so-called normalisation thesis on identity of proofs (cf. Prawitz 1971); the strategy consists in proving the maximality of the equivalence relation corresponding to the thesis. This strategy became particularly significant once maximality results were actually obtained, e.g. by Widebäck 2001 and by Došen and Petrić 2000 and 2001. Drawing on taxonomical remarks on criteria for identity of proofs, and observing some related formal results discussed in de Castro Alves 2019, we argue that the literature has endorsed an unfairly favourable impression of how effective a justification of the normalisation thesis such maximality results can yield when used as premisses in Barendregt’s argumentative strategy.
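For reference (a standard schematic statement, not drawn verbatim from the abstract), the normalisation thesis identifies derivations that share a normal form:

```latex
% Normalisation thesis (schematically): derivations D1, D2 of the same
% formula from the same premisses represent the same proof iff they
% reduce to the same normal form.
\[
\mathcal{D}_1 \simeq \mathcal{D}_2
\iff
\mathrm{nf}(\mathcal{D}_1) = \mathrm{nf}(\mathcal{D}_2)
\]
```

Under the Curry–Howard correspondence this amounts to βη-equality of the corresponding typed terms, and the maximality results at issue (Widebäck 2001; Došen and Petrić 2000, 2001) concern the maximality of that equivalence relation among non-trivial ones.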

Date
Vincenzo Ciccarelli

In this talk I present a brief survey of certain possible understandings of the operation of “mathematical abstraction”, and I suggest that, especially when abstraction is considered as second-order, a certain principle of reification of concepts is indispensable to explicate the operation. I start by distinguishing four sorts of explications of abstraction: (1) direct eliminative; (2) conceptual by reification; (3) conceptual by order reduction; (4) by recarving the content. (1) corresponds to the classical account of abstraction, which dates back to Aristotle and may be partially found in Cantor’s definition of cardinals and in Peano’s school. (2) corresponds to Frege’s account of abstraction in the Grundlagen. (3) corresponds to Frege’s account in the Grundgesetze. (4) corresponds to the strategy sketched in §64 of the Grundlagen, which Frege quickly abandoned. I argue that, where higher-order abstraction principles are concerned, all these accounts require a principle for the reification of concepts, which may be preliminarily understood as the introduction of set-like entities. Moreover, I will show that (2) and (3) may be somewhat circular, and that (1) is less explanatory and may engender some logical difficulties. I conclude that, given the dependence of abstraction on a basic set theory, there is little chance of using this operation to provide a foundational account of arithmetic and real analysis.
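A standard example (supplied here for orientation, not taken from the abstract) of a second-order abstraction principle is Hume's Principle, which Frege discusses in the Grundlagen:

```latex
% Hume's Principle: the number of Fs equals the number of Gs
% iff the concepts F and G are equinumerous (related by a bijection).
\[
\#F = \#G \iff F \approx G
\]
```

The operator # takes a second-order entity (a concept) to an object, and this concept-to-object step is precisely where the reification at issue in the talk enters: concepts are traded for set-like entities.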

Date
Giorgio Venturi

In this talk we will briefly present the application of the notions of model completeness and model companionship to the study of set theory. We will then present the solution that the model companion of ZFC + large cardinals offers to the Continuum Hypothesis. We end the talk by discussing a new form of justification in set theory, related to the complete models under consideration.
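For reference, the standard model-theoretic definitions at play (general definitions, not specific to the talk): a theory T* is model complete when every embedding between its models is elementary, and T* is the model companion of T when T* is model complete and the two theories are mutually model-consistent:

```latex
% T* is a model companion of T iff:
%  (1) T* is model complete (every embedding between models of T*
%      is elementary), and
%  (2) every model of T embeds into a model of T*, and vice versa:
\[
\forall M \models T \;\exists N \models T^{*}\,(M \hookrightarrow N)
\quad\text{and}\quad
\forall N \models T^{*} \;\exists M \models T\,(N \hookrightarrow M)
\]
```

When a model companion exists it is unique up to logical equivalence, which is what makes asking for the model companion of ZFC + large cardinals a determinate question.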

Date
TBA
Date