Joost Joosten was born in 1972 in the Netherlands. He studied mathematics and physics at the University of Amsterdam. After one year of teaching mathematics at an international school in Spain, he returned to academia to obtain a PhD at the Department of Philosophy of Utrecht University, studying the foundations of mathematics. After his PhD he spent three years in postdoctoral positions as a lecturer and researcher at various institutes: the mathematics, philosophy, and artificial intelligence departments of Utrecht University, the University of Amsterdam, and the Academy of Sciences of the Czech Republic. He then worked for about two years at the headquarters of ABN AMRO in quant positions, first in Product Control, Foreign Exchange Options and Exotics, and later in Credit Risk Management. A very good offer made him decide to return to academia. He currently works at the Philosophy Department of the Universidad de Sevilla in Spain.
His current interest lies in linking the various guises of complexity: one can think of Hausdorff dimension, or of computational complexity in terms of time or space. Example measures of these resources include proof lengths, succinctness of representations, and much more. A secondary interest lies in a more philosophical study of complexity and how it relates to a cognitive framework. He is the father of three adorable boys; the middle one appears in the photo above.
Project: Is My Credit Card Safe? Pictures of Intractability.
Various safety protocols are based on the assumption that some encoding process is hard to decode. One such assumption is that it is hard to find the factors of a natural number. One can always find a factor by checking every number smaller than the number to be factored, but this requires a number of steps exponential in the length of the number's binary representation.
Once given the factors of the number, it is easy to check that they do indeed multiply together to yield the original number. (As an aside, this nicely illustrates the notion of NP-ness, that is, of being nondeterministically polynomial: the right answer is picked nondeterministically and can then be verified in polynomial time to indeed be the right answer.)
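The asymmetry between finding and checking factors can be illustrated with a toy sketch (not part of any actual protocol; the function names are invented for the example):

```python
# A minimal sketch contrasting the cost of finding a factor with the
# cost of checking a proposed factorization.

def find_factor(n):
    """Trial division: try every candidate from 2 up to n - 1.
    The number of candidates grows like n itself, i.e. exponentially
    in the number of binary digits of n."""
    for d in range(2, n):
        if n % d == 0:
            return d
    return None  # no proper factor: n is prime (or too small)

def check_factors(n, p, q):
    """Verification is a single multiplication: polynomial time."""
    return p * q == n

n = 91
p = find_factor(n)            # slow search: tries 2, 3, ..., until 7
q = n // p
assert check_factors(n, p, q)  # fast check: 7 * 13 == 91
```

The search loop is exponential in the bit-length of n, while the check is a single multiplication, which is the point of the NP-ness aside above.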
Many of these safety assumptions rest on so-called NP-complete problems. Without going into details, this means that solutions are hard to find but easy to check. NP is a class of problems lying in the rich landscape between trivial and universal problems.
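The "hard to find, easy to check" contrast can be made concrete with SUBSET-SUM, a classic NP-complete problem. The sketch below (helper names are illustrative) searches all subsets exhaustively, while verifying a candidate subset takes only a sum and a few membership tests:

```python
from itertools import combinations

def solve_subset_sum(numbers, target):
    """Exhaustive search over all 2^n subsets: exponential time."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

def check_subset_sum(numbers, target, candidate):
    """Checking a proposed solution: polynomial time."""
    pool = list(numbers)
    for x in candidate:
        if x not in pool:
            return False
        pool.remove(x)
    return sum(candidate) == target

nums = [3, 34, 4, 12, 5, 2]
sol = solve_subset_sum(nums, 9)      # finds [4, 5] after trying many subsets
assert check_subset_sum(nums, 9, sol)
```

No polynomial-time algorithm is known for the search step, and under P != NP none exists; the check step stays cheap regardless.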
This project is related to the P!=NP question. If P!=NP holds, there would be a function for which no speed-up is possible, even on Turing machines with more resources. The goal of the project is to study the trade-offs between descriptional complexity (i.e. program size) and computational complexity (i.e. amount of resources) by means of Turing machines. It has been found that, in general, the average speed of the machines computing a given function actually decreases when more resources (states) are available, even though a few machines compute it faster. That is, the greater the descriptional complexity of a Turing machine, the slower its average computation speed for a given function; equivalently, the probability of randomly picking a Turing machine that computes the function faster decreases as more states become available. This may bear a strong relationship to hardness and completeness in computational complexity, and it offers an experimental approach linking descriptional and computational complexity.
Many functions that are candidates for showing P!=NP are known in the literature. However, most have ghastly computational properties, as they are constructed in the familiar top-down fashion: first a mostly combinatorial problem is selected, which is then translated into a number representation, yielding a function, which is finally implemented at the machine level.
An exhaustive search of the (2,2) and (2,3) spaces of Turing machines will be performed to look for and compare minimal programs.
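One way such an exhaustive search over the (2,2) space could be set up is sketched below. The transition-table convention here (an explicit halt state, head moves of one cell, a blank tape of zeros) is an assumption chosen for illustration; the actual NKS enumeration numbers machines differently:

```python
from itertools import product

STATES, SYMBOLS, HALT = (0, 1), (0, 1), "H"
# Each of the 4 (state, symbol) pairs is assigned a triple
# (symbol to write, head move, next state); allowing a halt state
# gives (2 * 2 * 3)**4 = 20736 machines in this (2,2) convention.
triples = list(product(SYMBOLS, (-1, 1), STATES + (HALT,)))

def run(rule, max_steps=100):
    """Run a machine on a blank tape from state 0; return the number
    of steps taken if it halts within max_steps, else None."""
    tape, head, state = {}, 0, 0
    for step in range(max_steps):
        write, move, nxt = rule[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
        if nxt == HALT:
            return step + 1
        state = nxt
    return None

keys = list(product(STATES, SYMBOLS))
halting = sum(
    run(dict(zip(keys, choice))) is not None
    for choice in product(triples, repeat=len(keys)))
print(halting, "of", len(triples) ** len(keys), "machines halt within 100 steps")
```

Replacing the blank tape with encoded inputs and recording the step counts of the machines that compute a given function would yield the speed-versus-states data the project describes; the (2,3) space is searched the same way with three tape symbols.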
Turing Machine Enumeration: NKS versus Lexicographical
Favorite Three-Color Cellular Automaton