Liubove Orlov Savko finished her bachelor’s in pure mathematics at the Universidad Nacional Autónoma de México and will continue her studies with a master’s in mathematical sciences. Her interests are computational neuroscience and artificial intelligence. She has modeled biological neural networks to account for a property observed in them: the stability of the population firing rate due to homeostasis. She has also modeled biological learning algorithms in Mathematica. The Wolfram Summer School will help her start a new project, which she hopes to continue during her master’s.
Project: Self-Normalizing Neural Networks for Medical Diagnosis
Implement self-normalizing neural networks for the classification of diseases. Obtain medical datasets and train the self-normalizing networks on them. Compare accuracies across different hyperparameters and against the various methods in Classify.
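The key ingredient of a self-normalizing network (Klambauer et al., 2017) is the SELU activation, which drives the activations of each layer toward zero mean and unit variance, replacing explicit normalization layers. The project itself was built in the Wolfram Language, so the following is only a NumPy sketch of the mechanism, not the project code:

```python
import numpy as np

# SELU constants from Klambauer et al. (2017)
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit: lambda * x for x > 0,
    lambda * alpha * (exp(x) - 1) otherwise."""
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Demonstration of the self-normalizing fixed point: a standard-normal input
# passed through a SELU layer with "lecun normal" weights (variance 1/fan_in)
# keeps its mean near 0 and its variance near 1.
rng = np.random.default_rng(0)
x = rng.standard_normal((10_000, 256))
w = rng.standard_normal((256, 256)) / np.sqrt(256)  # lecun-normal init
h = selu(x @ w)
print(h.mean(), h.var())  # both close to the (0, 1) fixed point
```

Because the fixed point is preserved layer after layer, deep SELU networks train stably without batch normalization, which is what makes them attractive for tabular medical data.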
Main Results in Detail
The trained neural network showed an impressive accuracy of more than 95% on all of the datasets, much better than the automated classification methods in Classify. It is also robust: accuracy stayed high across all of the hyperparameter settings analyzed (dropout rate, number of layers and number of neurons), as shown in the list plots.
Future Work
1. Add weighted training to improve results in cases where the amount of data is not the same in each class.
2. Train on other types of datasets, such as images, sounds, etc.
3. Write a function that processes data automatically, computes the best hyperparameters for that particular dataset and returns a trained neural network for classification.
4. Analyze the dependence of the hyperparameters on the number of attributes and classes.
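The automatic function proposed in point 3 could be structured as a held-out grid search. The sketch below is hypothetical (the names `auto_classify` and `knn_score` are not from the project); a toy k-nearest-neighbors scorer stands in for the step that would actually train a self-normalizing network with each hyperparameter combination:

```python
import itertools
import numpy as np

def knn_score(X_tr, y_tr, X_te, y_te, k=3, **_):
    """Toy stand-in scorer (k-nearest neighbors). In the proposed function this
    step would train a self-normalizing network with the given hyperparameters
    and return its test accuracy instead."""
    d = np.linalg.norm(X_te[:, None, :] - X_tr[None, :, :], axis=2)
    nn = y_tr[np.argsort(d, axis=1)[:, :k]]
    return (np.array([np.bincount(r).argmax() for r in nn]) == y_te).mean()

def auto_classify(X, y, grid, score=knn_score, seed=0):
    """Hypothetical automatic pipeline: hold out 20% of the data, evaluate
    every hyperparameter combination, return (best_params, best_accuracy)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(0.8 * len(X))
    tr, te = idx[:cut], idx[cut:]
    results = []
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid, combo))
        results.append((params, score(X[tr], y[tr], X[te], y[te], **params)))
    return max(results, key=lambda r: r[1])

# Usage on synthetic data: two well-separated Gaussian blobs, two classes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.repeat([0, 1], 50)
params, acc = auto_classify(X, y, {"k": [1, 3, 5]})
```

Swapping the scorer for an actual network trainer and widening the grid (dropout, layers, neurons) would yield the "dataset in, trained classifier out" function described above.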