Wolfram Summer School

Yash Akhauri

Science and Technology

Class of 2018

Bio

Yash Akhauri is an undergraduate student at BITS Pilani, studying electronics and instrumentation. He spends most of his time researching high-performance computing and artificial intelligence. In a world of constantly evolving computing architectures, he believes that optimizing AI on heterogeneous systems will be key to working with massive amounts of data, and he hopes to build a distributed mesh computing platform optimized for AI. He also wants to leverage his earlier work on optimizing quantized neural networks, and his interest in bringing new quantized network architectures to state-of-the-art performance, to reduce the cost of industrial machine learning.

Computational Essay

Initialization Strategies for Deep Neural Networks »

Project: Introducing Hadamard Binary Neural Networks

Goal

I hope to develop a new methodology for training neural networks at low precision in order to increase speed and decrease memory requirements with minimal loss of accuracy.
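
As background, the sketch below shows a generic binarized-weight training step in NumPy, in the spirit of BinaryConnect: full-precision latent weights are kept for the update, a sign-binarized copy is used in the forward pass, and the gradient is passed straight through to the latent weights. This is only an illustrative baseline under those assumptions (the layer sizes, learning rate and loss are made up), not the HBNN training procedure itself.

    import numpy as np

    def binarize(w):
        """Sign-binarize a weight tensor, keeping a per-tensor scale factor."""
        alpha = np.mean(np.abs(w))
        return alpha * np.sign(w), alpha

    rng = np.random.default_rng(0)
    W = 0.1 * rng.standard_normal((8, 4))     # full-precision latent weights
    x = rng.standard_normal((16, 8))          # toy input batch
    y_true = rng.standard_normal((16, 4))     # toy regression targets

    for step in range(100):
        Wb, _ = binarize(W)                   # binary weights used in the forward pass
        y = x @ Wb
        grad_y = 2.0 * (y - y_true) / len(x)  # gradient of the mean-squared error
        grad_W = x.T @ grad_y                 # straight-through: treat d(Wb)/dW as identity
        W -= 0.1 * grad_W                     # update the latent full-precision weights
        np.clip(W, -1.0, 1.0, out=W)          # keep latent weights in a bounded range

The key design point this illustrates is that only the forward pass uses the low-precision weights; the accumulated updates stay in full precision so that small gradients are not lost to quantization.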

Main Results in Detail

  1. Faster convergence than vanilla binarization techniques.
  2. Matrix multiplication consistently about 10 times faster than the classic matrix multiplication algorithm.
  3. Performance similar to Intel MKL matrix multiplication, with minimal optimization.
  4. The angle of randomly initialized vectors is preserved in high-dimensional spaces (approximately 37 degrees as the vector dimension approaches infinity); see the sketch after this list.
  5. A histogram of network weights with greater granularity than that of binary neural networks.
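
One reading of result 4, assuming the angle in question is the one between a full-precision random vector and its sign-binarized counterpart (the usual setting in the binary-network literature): for a vector with independent standard Gaussian entries, the cosine of that angle concentrates around sqrt(2/pi), so the angle approaches arccos(sqrt(2/pi)) ≈ 37.3 degrees as the dimension grows. A quick numerical check:

    import numpy as np

    rng = np.random.default_rng(0)
    for d in (16, 256, 4096, 65536):
        v = rng.standard_normal(d)     # randomly initialized vector
        b = np.sign(v)                 # its sign-binarized counterpart
        cos = (v @ b) / (np.linalg.norm(v) * np.linalg.norm(b))
        print(f"d = {d:6d}   angle = {np.degrees(np.arccos(cos)):.2f} degrees")

    # Theoretical limit as the dimension goes to infinity
    print("limit =", np.degrees(np.arccos(np.sqrt(2 / np.pi))))   # about 37.27 degrees

In high dimensions the per-vector angle clusters tightly around this limit, which is why binarization can preserve direction well enough for learning to proceed.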

Future Work

  1. Conduct a study of the communication costs associated with this architecture.
  2. Test accuracy on deep convolutional and recurrent models.
  3. Develop a distributed framework optimized for AI training.
  4. Optimize convolutional kernels for the HBNN architecture.