James Bailey is a medieval historian by academic background who spent his working life in the computer industry, primarily at Digital Equipment and Thinking Machines Corporation. His 1996 book After Thought (Basic Books) makes the case for biologically inspired algorithms—like neural networks—eclipsing numbers and equations as the go-to math of the 21st century, just as those numbers and equations sidelined the circles and lines of geometry in the 17th century. He now focuses on ways of making this new math accessible deep in the K–12 curriculum. He is a graduate of Brown University, a member of Phi Beta Kappa and the sponsor of the www.selfschooling.com website.
Project: Learning about Learning
The project has two goals: to show that high-school Algebra II students can, given the right support and instruction, learn to create an app that learns; and to show that the same Wolfram Language that supports very powerful deep learning implementations can also provide an instructional on-ramp to this technology for those students.
Main Results in Detail
It is indeed possible to unite Wolfram Language code that carries out learning, at both the simplest and most sophisticated levels, with visualization steps that show, step by step, how the simplest implementations lead to the most sophisticated ones. The Wolfram Language Manipulate construct is effective in allowing learners to interactively design neural networks and get hints as to what is going on inside the nodes.
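As one minimal sketch of the kind of Manipulate described above (this particular example is illustrative, not drawn from the project itself), a learner can slide the weight and bias of a single sigmoid node and watch how its output curve responds:

```wolfram
(* Interactively explore what happens inside one node: output = sigmoid(w*x + b) *)
Manipulate[
 Plot[LogisticSigmoid[w x + b], {x, -5, 5},
  PlotRange -> {0, 1},
  AxesLabel -> {"input", "node output"}],
 {{w, 1, "weight"}, -3, 3},
 {{b, 0, "bias"}, -3, 3}]
```

Dragging the sliders makes concrete how a node's weight steepens or flips its response and how its bias shifts the response left or right, before any full network is introduced.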
A full high-school-level introduction to deep learning needs to start earlier, with training in how lists, and lists of lists, work. It also needs to let the student dig deeper into the guts of individual nodes.
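The lists-of-lists groundwork can be made concrete with a small example (a hypothetical illustration, not code from the project): a layer's weights are just a list of lists, and applying the layer to an input is a dot product.

```wolfram
(* A 2x2 weight matrix is a list of lists; an input is a plain list *)
weights = {{0.2, -0.5}, {0.7, 0.1}};
input = {1.0, 2.0};

(* Applying the layer is just the dot product of the two *)
weights . input
(* -> {-0.8, 0.9} *)
```

Once students see that the matrix arithmetic inside a network is ordinary operations on lists of lists, the jump to multi-layer structures is much smaller.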