Wolfram Summer School


Joshua Pedro

Science and Technology

Class of 2019


Joshua is a Master of Science student in Mathematics at the City University of New York. His interests lie in machine learning and complex analysis and their applications to the social and natural sciences. Before moving to the United States, Joshua completed his Bachelor of Science in Economics at the University of Guyana. Mathematics was always his primary interest, and he was known by his teachers and peers for exemplary work in the subject. Joshua has done research in econometrics and graph theory, and he is currently an adjunct lecturer in mathematics.

Computational Essay: Taylor Polynomials in Two and Three Dimensions

Project: Transfer Learning with Invertible Neural Networks


The goal of this project is to use invertible neural networks for generative deep learning. Given a set of MNIST digits 0 to 8, we first train an invertible neural network to generate new digits from the same sample, then use transfer learning to generate the digit 9 given only a few examples.

Summary of Results

  • We built the basic building blocks for an invertible residual neural network.
  • We trained a neural network to generate MNIST digits by mapping data to a latent space and then inverting this map back to the data space.
  • We used transfer learning to fine-tune the trained network on just 10 examples of the digit 9, which it had never seen before; the network was then able to generate all 10 MNIST digit classes.
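The invertibility that makes this generative scheme work comes from the affine coupling layers of Real NVP: half of the input passes through unchanged, and the other half is scaled and shifted by functions of the first half, so the map can be inverted exactly without solving any equations. Below is a minimal NumPy sketch of one such layer. It is not the project's actual implementation (which used the Wolfram neural network framework), and the single linear maps `W_s` and `W_t` are placeholder assumptions standing in for the small networks used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 4                                 # toy dimensionality; first half passes through
W_s = rng.normal(size=(d // 2, d // 2)) * 0.1   # placeholder "scale" network
W_t = rng.normal(size=(d // 2, d // 2)) * 0.1   # placeholder "translation" network

def coupling_forward(x):
    """Affine coupling layer: data x -> latent z, invertible by construction."""
    x1, x2 = x[: d // 2], x[d // 2 :]
    s = np.tanh(x1 @ W_s)             # log-scale, bounded for numerical stability
    t = x1 @ W_t                      # translation
    z2 = x2 * np.exp(s) + t          # transform second half conditioned on the first
    return np.concatenate([x1, z2])

def coupling_inverse(z):
    """Exact inverse: latent z -> data x, no iterative solve needed."""
    z1, z2 = z[: d // 2], z[d // 2 :]
    s = np.tanh(z1 @ W_s)
    t = z1 @ W_t
    x2 = (z2 - t) * np.exp(-s)
    return np.concatenate([z1, x2])

x = rng.normal(size=d)
z = coupling_forward(x)
print(np.allclose(coupling_inverse(z), x))   # → True: reconstruction is exact
```

Because the first half is unchanged, successive layers alternate which half is transformed; stacking such layers gives an expressive yet exactly invertible map from data to latent space, which is what lets generation run the training map in reverse.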

Future Work

In the future, we plan to:

  • Perform more thorough hyperparameter tuning to improve the network's performance.
  • Add convolutional layers to the Real NVP network, which should substantially improve generation quality.
  • Fully implement an invertible residual network, which should yield a more expressive latent representation and more efficient generation in the data space.
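The invertible residual network mentioned above builds blocks of the form f(x) = x + g(x); the block is invertible whenever g is a contraction (Lipschitz constant below 1), and the inverse is computed by fixed-point iteration rather than in closed form. A minimal NumPy sketch of this idea follows; it is an illustration under stated assumptions, not the planned implementation, and the single `tanh` layer with a hand-rescaled weight matrix stands in for the spectrally normalized networks used in practice.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
W = rng.normal(size=(d, d))
# Rescale so the residual branch is a contraction (spectral norm 0.5 < 1);
# invertible residual networks enforce this with spectral normalization.
W *= 0.5 / np.linalg.norm(W, 2)

def g(x):
    return np.tanh(x @ W)             # contractive residual branch, Lip(g) <= 0.5

def resblock_forward(x):
    return x + g(x)                   # f(x) = x + g(x), invertible since Lip(g) < 1

def resblock_inverse(y, n_iter=100):
    """Invert by Banach fixed-point iteration: x <- y - g(x)."""
    x = y.copy()
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = rng.normal(size=d)
y = resblock_forward(x)
print(np.allclose(resblock_inverse(y), x))   # → True: iteration converges to x
```

Unlike a coupling layer, every coordinate is transformed at each block, which is why such networks can represent the latent space more accurately, at the cost of an iterative inverse.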