My most recent research aims to bring together the fields of deep learning and theoretical ecology. I have several active projects in ecosystem modeling with deep networks, including work on dataset collection, multi-agent RL, counterfactual inference, and meta-learning. Please contact me if you are interested in collaborating in this area.
I am broadly interested in understanding how varying “what goes into” deep models affects their learning behaviour. In my research so far this has included multimodal input data, regularization, and different forms of feedback or environment, with applications to video, natural language, and climate data. I'm passionate about AI ethics and safety, and about the application of ML to environmental management, health, and social welfare.
I started post-secondary education in biology with a focus on health and neuropsychology, but transitioned to a concentration in ecology. While analyzing results for my honours research in bioremediation, I was introduced to programming for the first time and quickly realized I wanted to do machine learning. I received an NSERC scholarship to participate in a large-scale research project on climate change, and later joined a number of coding projects and discovered neural networks.
I began an MSc in computer science with Layachi Bentabet, studying biological realism in deep networks. During this time I was awarded a MITACS scholarship to be a machine learning research intern at iPerceptions, exploring semi-supervised learning in predictive models.
In November 2015 I completed my MSc, and in January 2016 began a PhD focused on deep learning research at Mila as an NSERC scholar with Christopher Pal.
My CV can be found here.
Tools for society [chapter in Tackling Climate Change with Machine Learning]. Tegan Maharaj. Edited by David Rolnick et al. 2019. [website]
Hidden incentives for self-induced distributional shift. David Krueger, Tegan Maharaj, Shane Legg, Jan Leike. SafeML@ICLR2019. [pdf]
Memorization in recurrent neural networks. Tegan Maharaj, David Krueger, Tim Cooijmans. PADL@ICML2017. [pdf]
Reserve output units for deep open set learning. David Krueger, Tegan Maharaj. COSL@CVPR2017. [pdf]
A closer look at memorization in deep networks. Devansh Arpit*, Stanislav Jastrzebski*, Nicolas Ballas*, David Krueger*, Emmanuel Bengio, Max Kanwal, Tegan Maharaj, Asja Fischer, Aaron Courville, Yoshua Bengio, Simon Lacoste-Julien. ICML2017. [pdf]
ExtremeWeather: A large-scale climate dataset for semi-supervised detection, localization, and understanding of extreme weather events. Evan Racah, Christopher Beckham, Tegan Maharaj, Prabhat, Samira Kahou, Christopher Pal. NeurIPS2017. [pdf] [code] [dataset]
A dataset and exploration of models for understanding video data through fill-in-the-blank question-answering. Tegan Maharaj, Nicolas Ballas, Anna Rohrbach, Aaron Courville, Christopher Pal. CVPR2017. [pdf]
Surprisal-Driven Zoneout. Kamil Rocki, Tomasz Kornuta, Tegan Maharaj. 2016. [pdf]
Zoneout: Regularizing RNNs by Randomly Preserving Hidden Activations. David Krueger*, Tegan Maharaj*, Janos Kramar, Mohammad Pezeshki, Nicolas Ballas, Nan Rosemary Ke, Anirudh Goyal, Yoshua Bengio, Hugo Larochelle, Aaron Courville, Chris Pal. ICLR2017. [pdf] [poster] [code]
I've co-organized several workshops:
- Climate Change: How Can AI Help? at ICML 2019
- LSMDC (Large Scale Movie Description Challenge) workshop at ECCV2016 and ICCV2017
- Joint Workshop on Storytelling with Images and Videos
I was a co-founder of the Montreal AI Ethics meetup, and a contributor to SOCML 2017 and 2018, as well as the Montreal Declaration for Responsible AI.
LSMDC2016 - Fill in the Blank Challenge. Joint 2nd Workshop on Storytelling with Images and Videos (VisStory) at ECCV. 2016/10. [slides]
Zoneout: Regularizing RNNs by randomly preserving hidden activations. Deep Learning Summer School. 2016/08. [slides]
BRAINS (anatomy, structure, function, and evolution). University of Montreal. 2016/06. [slides]
Neuroscience and biology for deep learning. University of Montreal. 2016/04. [slides]
Introducing "neurotransmitters" to an artificial neural network for modular concept learning and more accurate classification. Research week, Bishop's University, Sherbrooke, QC. (1st prize in poster competition) 2014/02.
Intelligent data analysis broadens our understanding of the world (2nd prize in oral competition) 2014/02.
I was a TA for the following classes during my PhD:
- Deep Learning
- Artificial Intelligence
- Introduction to Machine Learning
During my undergrad and master's:
- CSC211 Introduction to Programming
- CSC103 Interactive Web Page Design
- FIN218 Digital Imaging
- PHY101 Introductory Statistics
- BIO349 Invertebrate Zoology
- ESG226 Oceans I
- BIO110 Genetics
- BIO116 Diversity of Life
I also worked as a tutor at the Computer Science Help Centre at the end of my BSc/beginning of MSc, and at the ITS Helpdesk (troubleshooting and tech support) throughout my BSc.
Prediction and generation of sound with LSTMs: end-of-term project for a deep learning course. [website] (research blog) [code] (based heavily on johnarevalo's Blocks code for RNN character prediction, modified to take and generate sound)
S.E.A.N.N. (Software Engineering Artificial Neural Network) group project: draw a digit, and a trained neural network reports the probability it assigns to each digit 0–9. [website] [code]
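The per-digit probabilities in a demo like this come from a softmax over ten class scores. A minimal sketch (all names hypothetical, with a random linear layer standing in for the actual trained network) of how a flattened drawing maps to a probability for each digit:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

def predict_digit(pixels, weights, bias):
    """Return a probability for each digit 0-9 given a flattened image."""
    logits = pixels @ weights + bias  # stand-in for the trained network's forward pass
    return softmax(logits)

# Toy example: random "trained" weights and a random 28x28 drawing.
rng = np.random.default_rng(0)
weights = rng.normal(size=(784, 10))
bias = np.zeros(10)
drawing = rng.random(784)

probs = predict_digit(drawing, weights, bias)
guess = int(np.argmax(probs))  # the network's most likely digit
```

The softmax guarantees the ten outputs are non-negative and sum to 1, which is what lets the demo present them to the user as probabilities.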