Title: Using Generative Flow Networks to Model Human Causal Induction
Authors: Krauel, Alexander
Advisors: Leslie, Sarah-Jane
Department: Philosophy
Certificate Program: Center for Statistics and Machine Learning
Class Year: 2024
Abstract: Causal induction—the process through which we infer cause-and-effect relationships from observations in our environment—is fundamental to our ability to make decisions, predict the future, and understand our world. While cognitive scientists Thomas Griffiths and Joshua Tenenbaum have theorized a seemingly robust Bayesian framework for modeling human causal induction, current models struggle when dealing with high-dimensional spaces and sparse or ambiguous data. Generative Flow Networks (GFlowNets), a novel machine learning technique that learns a stochastic policy for sampling discrete objects from a hypothesis space with probability proportional to an associated reward, might hold promise for sidestepping some of the issues with current models of human causal induction. This thesis is an early investigation into the application of GFlowNets for causal induction modeling. In it, we deploy a GFlowNet to draw inferences over simple causal structures based on observational data from a set of psychological experiments and assess our findings against human judgements. Our analysis reveals that GFlowNets hold significant potential for accurately modeling human causal induction, paving the way for future research on the subject.
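To make the abstract's central idea concrete—sampling discrete objects with probability proportional to a reward—here is a minimal sketch, entirely hypothetical and not drawn from the thesis. "Causal structures" are modeled as subsets of three candidate edges built one edge at a time, the reward function is an arbitrary placeholder, and the state flows are solved exactly by dynamic programming rather than learned; a trained GFlowNet would only approximate these flows.

```python
from itertools import combinations

# Hypothetical toy setup (not from the thesis): "causal structures" are
# subsets of three candidate edges {0, 1, 2}, constructed one edge at a time.
ELEMENTS = (0, 1, 2)
STATES = [frozenset(c) for r in range(4) for c in combinations(ELEMENTS, r)]

# Arbitrary positive reward for each complete structure (assumed values).
def reward(s):
    return 1.0 + 2.0 * len(s)

# Exact state flows F(S), computed leaves-first under a uniform backward
# policy P_B(parent | child) = 1/|child| (remove any one element).
# A GFlowNet learns to approximate these flows; here we solve them exactly.
flow = {}
for s in sorted(STATES, key=len, reverse=True):
    f = reward(s)
    for e in ELEMENTS:
        if e not in s:
            child = s | {e}
            f += flow[child] * (1.0 / len(child))
    flow[s] = f

# Forward policy at state S: terminate with prob R(S)/F(S), otherwise step
# to child S∪{e} with prob F(S∪{e}) * P_B(S | S∪{e}) / F(S).
# Propagate exact visit probabilities and collect terminal probabilities.
start = frozenset()
visit = {s: 0.0 for s in STATES}
visit[start] = 1.0
terminal = {}
for s in sorted(STATES, key=len):
    p = visit[s]
    terminal[s] = p * reward(s) / flow[s]
    for e in ELEMENTS:
        if e not in s:
            child = s | {e}
            visit[child] += p * flow[child] * (1.0 / len(child)) / flow[s]

# The defining GFlowNet property: p(x) = R(x) / Z, with Z = F(start).
Z = flow[start]
for s in STATES:
    assert abs(terminal[s] - reward(s) / Z) < 1e-12
```

Because the flows satisfy the flow-matching conditions exactly, the sequential policy samples each complete structure with probability exactly proportional to its reward; training a GFlowNet amounts to driving a parameterized policy toward this property.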
URI: http://arks.princeton.edu/ark:/88435/dsp01g158bm64j
Access Restrictions: Walk-in Access. This thesis can only be viewed on computer terminals at the Mudd Manuscript Library.
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Philosophy, 1924-2024

Files in This Item:
File: KRAUEL-ALEXANDER-THESIS.pdf
Size: 3.79 MB
Format: Adobe PDF


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.