Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp010z709052n
Title: Towards the Automatic Discovery of Deep Recurrent Neural Network Architectures
Authors: Agrawal, Saisha
Advisors: Jha, Niraj
Department: Electrical Engineering
Certificate Program: Applications of Computing Program
Robotics & Intelligent Systems Program
Class Year: 2021
Abstract: Recurrent Neural Networks (RNNs) have become a staple tool for solving sequential learning problems, but hand-designing deep RNN architectures remains laborious because of the enormous search space of candidate configurations. DreamCoder is an inductive programming framework that efficiently traverses a vast program search space by building hierarchical symbolic representations of knowledge. We therefore posit that DreamCoder can be applied to the generation of deep RNN architectures. This thesis takes three initial steps towards that goal. First, we extend the DreamCoder model to the vector algebra domain, culminating in the rediscovery of nonparametric models of multidimensional RNN cells and cell components from basic arithmetic and list-processing primitives. Next, we harness DreamCoder to learn parametric functions in the single-dimensional algebra domain, leading to the rediscovery of parametric models of single-dimensional RNN cell components. Finally, we present a roadmap for extending this thesis's rediscovery of single RNN cells to the discovery of deep RNN architectures, setting the stage for the generation of novel RNN architectures tailored to individual sequential learning tasks.
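For concreteness, the sketch below illustrates the kind of target such rediscovery experiments aim at: a vanilla (Elman) RNN cell expressed as a composition of elementary vector-algebra operations. This is a minimal illustrative sketch, not the thesis's actual DreamCoder primitive set or discovered program; the function and parameter names are hypothetical.

    import numpy as np

    def rnn_cell(x_t, h_prev, W_xh, W_hh, b_h):
        # Illustrative vanilla (Elman) RNN cell: two matrix-vector
        # products, elementwise addition, and a pointwise tanh
        # nonlinearity, all composed from basic vector arithmetic.
        return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

    # Hypothetical usage: a 3-dimensional hidden state driven by a
    # 2-dimensional input.
    rng = np.random.default_rng(0)
    x_t = rng.standard_normal(2)
    h_prev = np.zeros(3)
    W_xh = rng.standard_normal((3, 2))
    W_hh = rng.standard_normal((3, 3))
    b_h = np.zeros(3)
    h_t = rnn_cell(x_t, h_prev, W_xh, W_hh, b_h)

Under this framing, the nonparametric setting corresponds to recovering the structure of the cell (the composition above), while the parametric setting additionally fits the weights W_xh, W_hh, and b_h.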
URI: http://arks.princeton.edu/ark:/88435/dsp010z709052n
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Electrical and Computer Engineering, 1932-2023

Files in This Item:
AGRAWAL-SAISHA-THESIS.pdf (1.38 MB, Adobe PDF)

