Note: this question seems apt for flagging as a duplicate, but I haven't found a duplicate of it yet. Apologies in advance if I haven't done my full due diligence regarding this matter.

Now, I was reading Homotopy Type Theory: Univalent Foundations of Mathematics, and after hundreds of pages of dense symbolism§ of presumably (I presumed!) formalized text, this passage appears in an appendix:

... in this book we have developed mathematics in univalent foundations without explicitly referring to a formal system of homotopy type theory.

Much earlier in the text (near the beginning), the authors said:

Univalent foundations is closely tied to the idea of a foundation of mathematics that can be implemented in a computer proof assistant. Although such a formalization is not part of this book, much of the material presented here was actually done first in the fully formalized setting inside a proof assistant, and only later “unformalized” to arrive at the presentation you find before you — a remarkable inversion of the usual state of affairs in formalized mathematics.

So now I'm not sure what "formalization" means. I thought it meant writing things out in the way that so many things are written out in that book. Per that book, the authors seem to identify formalization in part with a translation into a computer proof system, like Coq. But didn't we talk of formalization before modern digital computing and proof systems? Is formalization a vague property of a system? But if formalization is meant to dampen ambiguity, and ambiguity is the handmaiden of vagueness, well...?


§For a not-entirely-representative sample:

[image: a formula-dense excerpt from the book]

  • The qualities that make a mathematical proof checkable by a human, largely due to Frege, are also conducive to computer-checking. See jstage.jst.go.jp/article/jafpos1956/12/1/12_1_15/_pdf/-char/en for a short holistic overview of formal languages. There are some exceptions, like infinitary logics.

4 Answers

9

You are correct in pointing out that using the term "formalisation" to refer to "implementing in a computer proof assistant" is a novel use of the term, though such use is increasingly common. At any rate, there were several levels of "formalisation" even before computer proof assistants came on the scene. "Formalisation" does not merely mean "writing everything in formulas".

For example, a typical calculus textbook certainly looks very formalized in the sense of being full of formulas, but it typically does not delve into a set-theoretic grounding of the subject. Such a foundation is assumed in the background but is usually not made explicit, analogously to the procedure in the book you cited (though "formalisation" does not mean the same thing in that case).

To take a more advanced example of "several levels of formalisation": John L. Bell develops Smooth Infinitesimal Analysis in a thoroughly formalized fashion in his books, but the foundations of the subject in topos/category-theoretic terms are only summarized briefly in an appendix. Those foundations are treated in more detail in the books by Anders Kock and others.

4

This seems to be meant in the sense used in Theoretical Computer Science (which itself can be closer to maths than to computers, depending on who is talking about it). That is, even if your book is written so rigorously that no professional mathematician has anything bad to say about it, there is an additional level of formalism if you want to actually implement it on a computer (as your quote says), or if you want to argue rigorously about the formulae in the book themselves - a kind of meta-formalism.

As a simple example from Theoretical Computer Science, you can write the rules of the Lambda Calculus (or SEP) in a way that is relatively easy for students in a CS 101 class to pick up; they would then be able to write expressions in that calculus, evaluate them correctly, and maybe even argue about some easy theorems and proofs in that language.

But then you might want to write down a different version of the lambda calculus that is extremely formal and rigorous compared to the CS 101 one: every single statement would be grounded in some previous statements you have given, and all of them ultimately in some tiny set of axioms. The "descriptive" text would itself look like one long sequence of mathematical statements, with a lot of set theory, first-order predicate logic, etc.

With the latter version you might overwhelm a new student, but you would serve somebody implementing it on a computer, or somebody trying to do formal correctness proofs: proofs that are not just words understandable by a human, but actual mathematical formulae in a meta-language describing the language of the Lambda Calculus, which a computer program can check at compile- or run-time.

TLDR: A "user" of the book might not care about every single detail being super "formal" and might be able to use the information in there just fine. Someone else, who wants to do deeper research into the topic or build a whole new formalism on top of whatever is described in your example, might be glad to have an even more rigorous presentation.

2

You ask:

What is it to write something "formally"?

In broad strokes, it means to use symbolic language, generally in the format of a formal system, often in association with formal semantics. Formal semantics has been done in bits and pieces in logic and mathematics for hundreds of years; but with the emergence of computers, which are built on the abstraction of a chip's instruction set and, at a higher level, programming languages, formal systems of all sorts are increasingly rendered into languages that express a programming-language semantics. So take "writing formally" to mean both the use of symbols to represent ideas in the broad sense (as in formal logic) and, in the narrow sense, the encoding of a formal system in a computer-based formal language. From WP's article on linguistic formal semantics:

Formal semantics is the scientific study of linguistic meaning through formal tools from logic and mathematics. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. Formal semanticists rely on diverse methods to analyze natural language. Many examine the meaning of a sentence by studying the circumstances in which it would be true. They describe these circumstances using abstract mathematical models to represent entities and their features. The principle of compositionality helps them link the meaning of expressions to abstract objects in these models. This principle asserts that the meaning of a compound expression is determined by the meanings of its parts.

This use of highly precise symbolic language with well-formed formulas is often attributed to Gottlob Frege, though George Boole and others were busy attempting to mathematize the symbolization of logic before him. The transition to symbols and formalisms arguably began when people first used number systems to represent quantities and invented algebraic and geometric notations to teach mathematics. One notable development was analytic geometry, which began using arithmetic symbols to represent geometric shapes; and by the time of Tarski and others, mathematical logic had developed its own language of model theory, which involves formalizations such as a language L, a theory T, and a model M such that M ⊨ T.
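To make that M ⊨ T notation concrete, here is a standard textbook-style instance (my own choice of example, not taken from the answer):

```latex
L = \{<\}
  \quad\text{(a language with one binary relation symbol)}
T = \{\ \forall x\,\neg(x<x),\ \ \forall x\,\forall y\,\forall z\,((x<y \wedge y<z)\to x<z)\ \}
  \quad\text{(axioms of a strict order)}
M = (\mathbb{N}, <)
  \quad\text{(the natural numbers with their usual ordering)}
M \models T
  \quad\text{(every sentence of $T$ is true in $M$)}
```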

Therefore, the syntax and semantics of formal theories developed roughly in the order of number systems, arithmetic and geometry, algebra, logic, and ultimately computer science and linguistics. Turing, Church, and Kleene were of course highly skilled logicians when they began formulating formal systems for effective computability. By the time of Zellig Harris and Noam Chomsky, linguists had entered the formalization game, providing syntactic analysis and, at first tentatively, semantic analysis as part of their research into the human capacity for language. One might see the formal languages of Richard Montague and Barbara Partee as reaching the widest sense of formal semantics, with a thrust to integrate different formal systems under one roof. In my own research and experimentation related to Aarne Ranta's (a colleague of Thierry Coquand) Grammatical Framework, his Type-Theoretical Grammar takes a particular implementation of logic, intuitionistic type theory (SEP), and advances type-theoretic semantics through the introduction of additional grammatical formalisms. But his is just one of countless systems that might be termed "higher-order formalisms". Theorem provers such as Coq, which has been renamed Rocq, are just another example of translating one formal semantics into another.

You ask:

Per that book, the authors seem to identify formalization in part with a translation into a computer proof system, like Coq. But didn't we talk of formalization before modern digital computing and proof systems? Is formalization a vague property of a system?

So it can be shown clearly that the formalization of syntax and semantics is a process that goes back to the advent of writing systems themselves, and it achieved a certain recognition of fame and success beginning with Frege, advancing through Hilbert, and arriving at Tarski with proofs, models, and the pluralization of logics. After all, the Curry-Howard correspondence was recognized in the mid-1930s and continued to develop over the next few decades, demonstrating isomorphisms between logic and programming. Today it is fair to say that many logicians and mathematicians have recognized the value of using programming-language semantics to achieve their ends, and they now build systems on top of systems (on top of systems) of formalisms to automate the systems dear to them. So it is fair to use "formalization" in both a broad sense and various narrow senses: for example, to "formalize a logic" and to "formalize a logic in a programming language". In the former case, one is moving from a natural-language metalanguage to a formalized object language; in the latter, one is building one formal system on top of another, often with the goal of automating the grammar of the first.
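As a tiny illustration of that Curry-Howard correspondence, here is a sketch in Lean 4 syntax (my own example, not drawn from the question or the book): the proposition A → (B → A) is a type, and a proof of it is literally a program of that type.

```lean
-- A proof of A → (B → A) is a function that takes a proof of A,
-- ignores a proof of B, and returns the proof of A.
theorem const_imp (A B : Prop) : A → (B → A) :=
  fun a _ => a
```

This is exactly the sense in which a proof assistant "formalizes a logic in a programming language": the logic's sentences become types, and type-checking the program checks the proof.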

-2

Formalization of a mathematical theory means framing the theory with definitions, axioms, and theorems derived step by step by logical inference.

It is not the best approach for a textbook if there is no free text in addition, because presenting a theory is always also a pedagogical enterprise.

Aside: I assume that you know all this. So I wonder a bit about your post.

  • I don't know all this, hence my question.
