Wednesday, 30 October 2013

A dialogical analysis of structural rules - Part I

(Cross-posted at NewAPPS)

As some of you may have seen, we will be hosting the workshop ‘Proof theory and philosophy’ in Groningen at the beginning of December. The idea is to focus on the philosophical significance and import of proof theory, rather than exclusively on technical aspects. An impressive team of philosophically inclined proof theorists will be joining us, so it promises to be a very exciting event (titles of talks will be made available shortly).

For my own talk, I’m planning to discuss the main structural rules as defined in sequent calculus – weakening, contraction, exchange, cut – from the point of view of the dialogical conception of deduction that I’ve been developing, inspired in particular (but not exclusively) by Aristotle’s logical texts. In this post, I'll do a bit of preparatory brainstorming, and I look forward to any comments readers may have!

In a nutshell (as previously spelled out e.g. here), the dialogical conception is based on the idea that a deductive proof is best understood as corresponding to a semi-adversarial dialogue between two fictitious characters, proponent and opponent, where proponent seeks to establish a final conclusion from given premises, and opponent seeks to block the establishment of the conclusion. Proponent puts forward statements stepwise, which she claims follow necessarily from what opponent has already granted in the course of the dialogue. Opponent can grant these statements, or else he can object that a given statement does not follow necessarily from what he has granted so far, by providing a counterexample (a situation where the premises hold but the conclusion does not). Another move available to opponent is to ask: "why does it follow?" This corresponds to an inferential step by proponent that is not sufficiently perspicuous and compelling for opponent; proponent must then break it down into smaller, individually perspicuous inferential steps. The game ends when proponent manages to compel opponent to grant her final conclusion.

The motivation behind this dialogical conceptualization is what could be described as a functionalist approach to deductive proofs: what are they good for? What is the goal or function of a deductive proof? On this conception, the main function of a deductive proof is that of persuasion: a good proof is one that convinces a fair but ‘tough’ opponent of the truth of a given statement, given the (presumed) truth of other statements (the premises).

I believe that this dialogical/functionalist approach allows for a philosophically motivated discussion of the different structural rules. This becomes important in the context of the recent surge in popularity of substructural approaches to paradoxes (just yesterday I came across the announcement for a very interesting workshop taking place in a few weeks in Barcelona precisely on this topic). A number of people have been arguing that in many of the paradoxes (the Liar, Curry), it is the availability of contraction that allows for the derivation of the paradoxical conclusion (as I discussed here and here). I have expressed my dissatisfaction with many of these approaches given the lack of an independent motivation for restricting contraction: to say that contraction must be restricted solely because it gives rise to paradox is nothing but a ‘fix-up’ which does not take us to the core of the phenomenon. What is needed is a reflection on why contraction was thought to be a legitimate principle in the first place, and arguments against this presumed original rationale for contraction (or any other rule/principle).
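To make contraction's role in Curry-style reasoning concrete, here is a schematic reconstruction of the standard derivation – my own sketch, not part of the original discussion – where C is a sentence stipulated to be equivalent to C → A for an arbitrary A, and the starred line is the contraction step that substructural approaches block:

% Schematic Curry derivation (illustrative sketch); C satisfies C <-> (C -> A).
% The starred step merges the two occurrences of C in the antecedent.
\begin{align*}
C &\Rightarrow C \to A          && \text{unfolding } C \leftrightarrow (C \to A)\\
C,\, C \to A &\Rightarrow A     && \text{from } C \Rightarrow C \text{ and } A \Rightarrow A \text{ by left-}{\to}\\
C,\, C &\Rightarrow A           && \text{cut on } C \to A\\
C &\Rightarrow A                && \text{contraction } (\ast)\\
&\Rightarrow C \to A            && \text{right-}{\to}\\
&\Rightarrow C                  && \text{folding } C \leftrightarrow (C \to A)\\
&\Rightarrow A                  && \text{cut on } C
\end{align*}

Blocking the starred step does stop the derivation of the arbitrary A; the complaint just made is that, by itself, this tells us nothing about why contraction should have seemed legitimate in the first place.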

Naturally, restrictions on structural rules are not a new idea: indeed, relevant logicians have offered arguments against the plausibility of weakening, and linear logicians impose restrictions on contraction. But in both cases, what motivates the restriction of these structural rules is not paradox-related considerations; rather, it stems from independent reflection on what the logical systems in question are good for – in other words, something resembling the functionalist approach I am defending here. Linear logic is usually described as a logic of resources, and as such it matters greatly how many copies of a given formula are used – hence the restriction on contraction. Relevant logics require a relation of relevance between premises and conclusion, which can be disrupted by the addition of an arbitrary formula – hence the restriction on weakening.
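The resource reading can be illustrated with a stock example (mine, not the post's): read the sequent euro ⇒ coffee as 'one euro buys one coffee'. Two euros then buy two coffees, but adding contraction would let a single euro buy two coffees, which is exactly what the resource interpretation is meant to rule out:

% Resource reading of contraction (illustrative sketch).
% Two euros buy two coffees, by the right tensor rule:
\frac{\text{euro} \Rightarrow \text{coffee} \qquad \text{euro} \Rightarrow \text{coffee}}
     {\text{euro},\ \text{euro} \Rightarrow \text{coffee} \otimes \text{coffee}}\ (\otimes\text{R})
% Contraction would identify the two copies of the resource:
\frac{\text{euro},\ \text{euro} \Rightarrow \text{coffee} \otimes \text{coffee}}
     {\text{euro} \Rightarrow \text{coffee} \otimes \text{coffee}}\ (\text{contraction})

The tensor step is unobjectionable on the resource reading; it is only the contraction step that loses track of how many copies of the resource are actually available.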

Now, as it turns out, the dialogical conception of deductive proofs has its own story to tell about each of these structural principles. Let us start with weakening, which is the following structural rule in its sequent calculus formulation (I will restrict myself to left-weakening, as right-weakening poses a range of other problems related to the concept of multiple conclusions):

A => C
--------------
A, B => C

At first sight, if necessary truth-preservation is the only requirement for the legitimacy of proponent’s inferential steps, then weakening may seem an entirely plausible principle: if opponent has granted A and is then compelled to grant C because C follows of necessity from A, then whatever additional B comes up in the dialogue and is granted by opponent will not invalidate the move to C. That is simply the property of monotonicity, one of the core features of deductive consequence.
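A concrete instance (my illustration, not the post's): if opponent has granted P ∧ Q and is thereby compelled to grant P, additionally granting some unrelated R does not undo that step – which is all monotonicity says:

% Instance of left-weakening: the added premise does not invalidate the
% inference, though it contributes nothing to it.
\frac{P \wedge Q \Rightarrow P}
     {P \wedge Q,\ R \Rightarrow P}\ (\text{left-weakening})

Whether the added R is merely harmless or also strategically significant is the dialogical question taken up in the next two paragraphs.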

However, there is much more to be said on weakening from a dialogical perspective, and here it becomes useful to distinguish different kinds of such dialogical interactions. In the spirit of the purely adversarial interactions described for example in Book VIII of Aristotle’s Topics, it is in fact in the interest of proponent to confuse opponent by putting forward a large number of statements, some of which will be irrelevant to her final conclusion. In this way, opponent will not ‘see it coming’ and therefore may be unable to guard himself against being forced to grant the final conclusion. So in a purely adversarial setting, weakening is strategically advantageous for proponent, as it may have a confusing effect on opponent.

By contrast, in a context where the goal is not only to beat the opponent by whatever means, but also to produce an explanatory proof – one that shows not only that the conclusion follows, but also why it follows – weakening becomes a much less plausible principle. Indeed, for didactic purposes, for example, it makes much more sense to put on the table only the information that will in fact be relevant for the derivation of the conclusion, precisely because confusing the interlocutor is now the opposite of what proponent is trying to accomplish. And indeed, as it turns out, Aristotle’s syllogistic – which, according to some scholars, was developed to a great extent so as to provide the general framework for the theory of scientific explanation of the Posterior Analytics – is a relevant system, one where weakening is restricted. That is, for explanatory purposes, weakening is a pretty bad idea. In other words, depending on the goal of a particular dialogical interaction of this kind, weakening will or will not be a legitimate principle.


(As this post has already become quite long, I will leave contraction, exchange and cut for a second installment later this week. So stay tuned!)

UPDATE: Part II here.

Tuesday, 22 October 2013

CFP: Trends in Logic XIV (Off-stream applications of formal methods)


Trends in Logic XIV
Entia et Nomina workshop
Ghent University, Belgium, July 8-11, 2014

The Road Less Travelled
Off-stream applications of formal methods

Theme

Logicians have devoted considerable effort to applying formal methods to what are now considered core disciplines of analytic philosophy: philosophy of mathematics, philosophy of language and metaphysics. Researchers in these fields have been accused of sharpening their knives without actually cutting anything of interest to those outside of philosophy. The focus of formal methods is changing, however, and our intent for this conference is to further counter this impression of idleness with respect to philosophy at large. The workshop will focus on those applications of formal methods in philosophy which might be of interest to people working on philosophical questions of more direct relevance to human life.

We plan three sessions with the following invited speakers:

Session 1 Applications of formal methods in philosophy
Diderik Batens, Centre for Logic and Philosophy of Science, Ghent University (Belgium)
Krister Segerberg, Uppsala University (Sweden)
Katie Steele, London School of Economics and Political Science (UK)

Session 2 Applications of formal methods in social philosophy
Gabriella Pigozzi, Université Paris-Dauphine (France)
Martin van Hees, University of Amsterdam (Netherlands)
John F. Horty, University of Maryland (USA)

Session 3 Applications of Bayesian methods in philosophy
Luc Bovens, London School of Economics and Political Science (UK)
Lara Buchak, University of California, Berkeley (USA)
Richard Pettigrew, Bristol University (UK)

Format

Authors of contributed papers are asked to submit extended abstracts and full papers, prepared for blind review, by January 6, 2014. Extended abstracts should be no more than 2000 words. Authors of accepted papers will have 30-60 minutes to present their work, depending on the length of their papers. Each paper will be followed by two commentaries from other participants. Accepted participants may be asked to comment on at least one talk. Commentaries of 5-10 minutes will be followed by 10-15 minutes of discussion. All accepted papers will be made available to the participants ahead of the conference.


More details

http://entiaetnomina.blogspot.be/p/trends-in-logic-xiv.html

https://docs.google.com/file/d/0B2KkZCQhKIFmM3VmeFoyV29BVmM/edit?usp=sharing

trendsinlogic2014@gmail.com




Friday, 4 October 2013

Epistemic Utility Theory project at Bristol: new webpages

There is now a website, a Facebook page, and a Google Calendar for my 'Epistemic Utility Theory: Foundations and Applications' research project at University of Bristol.  These give lots of information about events that we'll be holding as part of the project.

Website:  https://sites.google.com/site/epistemicutilitytheorybristol/home

Facebook:  https://www.facebook.com/epistemicutilitytheorybristol

Google Calendar:  iol18sr2acln8sjsmtr7ghgsu0@group.calendar.google.com

Wednesday, 2 October 2013

Operationalization 2013: An interdisciplinary workshop at the edge of experimental psychology and analytical philosophy

The workshop will take place at the FRIAS in Freiburg im Breisgau (Germany) on 15 and 16 October 2013.

The workshop brings together psychologists and philosophers in order to investigate questions of concept operationalization that are common to both disciplines. The questions we tackle combine theoretical and empirical features. Some examples: How should we devise elicitation procedures for complex and potentially ambiguous concepts? How should we work out predictions that are precise enough to be put to empirical test? How can we connect the philosophical literature to the long-standing empirical tradition in psychology?

The workshop is organized into three types of sessions: keynotes by researchers who have already worked successfully on operationalizing epistemic concepts, methodological tutorials which will present contemporary statistical procedures, and discussion sessions where planned experimental projects will be discussed.

Keynote speakers

Ulrike Hahn (Birkbeck, University of London, UK)
Edouard Machery (University of Pittsburgh, USA)
David Over (University of Durham, UK)

Methodological tutorials 

David Kellen and Henrik Singmann (University of Freiburg, Germany)

Discussion sessions 

(The discussion sessions focus on ongoing projects and not on past work.)

Matteo Colombo (Tilburg University, Netherlands); Nicole Cruz (PH Ludwigsburg, Germany); Marco Ragni and Barbara Kuhnert (University of Freiburg, Germany); Mark Siebel, Jakob Koscholke, and Michael Schippers (University of Oldenburg, Germany), and Marc Jekel (Max Planck Institute for Research on Collective Goods, Bonn, Germany); Jan Sprenger (Tilburg University, Netherlands); Tatsuji Takahashi (Tokyo Denki University, Japan); Matthias Unterhuber (University of Düsseldorf, Germany).

Organization

The workshop is organized by Henrik Singmann (University of Freiburg), Marco Ragni (University of Freiburg), Vincenzo Crupi (Università di Torino and LMU Muenchen), and Jan Sprenger (Tilburg University) and is a follow-up event to the workshop Operationalizing Epistemic Concepts that took place last year in Aachen. The workshop is co-sponsored by the DFG Priority Program New Frameworks of Rationality (SPP 1516) and the Freiburg Institute for Advanced Studies (FRIAS).

(More information can also be found here.)