Title: Learning to Earn: An Application of Reinforcement Learning to High-Frequency Trading in the U.S. Futures Market
Authors: Liu, Kevin
Advisors: Almgren, Robert
Department: Operations Research and Financial Engineering
Certificate Program: Applications of Computing Program; Center for Statistics and Machine Learning
Class Year: 2018
Abstract: This thesis investigates the application of reinforcement learning models to high-frequency trading in the U.S. futures market. Reinforcement learning is well suited to this subfield because the iterative nature of high-frequency trading allows the policies behind many common strategies to be improved continually. We compare the effectiveness of different classes of model-free reinforcement learning algorithms, spanning both tabular methods and approximate solution methods (for example, Q-learning and Monte Carlo methods), across several sets of parameter inputs, using interval profit as the reward metric. While previous studies have applied high-frequency trading models to the foreign exchange and equities markets from the perspective of a market maker, this investigation focuses on the futures market in the classically studied case of an execution trader facing the optimal execution problem. The findings offer insight into whether reinforcement learning can be applied to financial instruments beyond the well-studied foreign exchange and equities markets, and whether the performance of reinforcement-learning-powered trading agents warrants them a place in the trader's toolkit.
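The abstract mentions tabular Q-learning with interval profit as the reward signal in an execution-trading setting. The thesis text itself is not reproduced here, so the following is only a minimal illustrative sketch of that general technique under assumed, toy dynamics: the state, action set, price process, impact model, and all parameter values are invented for illustration and are not taken from the thesis.

```python
# Minimal sketch (not from the thesis): tabular Q-learning on a toy
# execution task. State = (time step, inventory remaining), action =
# contracts sold this interval, reward = interval profit (cash received
# minus a simple linear impact penalty). All dynamics are assumptions.
import random
from collections import defaultdict

HORIZON = 10            # trading intervals before forced liquidation
START_INVENTORY = 20    # contracts to unload
ACTIONS = range(0, 5)   # contracts sold per interval (0..4)
ALPHA, GAMMA, EPS = 0.1, 1.0, 0.1

def step(t, inventory, action, price):
    """Sell `action` contracts; return (next_state, interval_profit, next_price)."""
    sold = min(action, inventory)
    impact = 0.05 * sold                  # toy linear price impact
    reward = sold * (price - impact)      # interval profit from this sale
    next_price = price + random.gauss(0, 0.1)
    return (t + 1, inventory - sold), reward, next_price

Q = defaultdict(float)  # Q[(state, action)] -> estimated return

for episode in range(5000):
    state, price = (0, START_INVENTORY), 100.0
    while state[0] < HORIZON and state[1] > 0:
        # epsilon-greedy action selection over the tabular Q estimates
        if random.random() < EPS:
            action = random.choice(list(ACTIONS))
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state, reward, price = step(*state, action, price)
        # one-step Q-learning update toward reward plus discounted best next value
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state
```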
URI: http://arks.princeton.edu/ark:/88435/dsp01ww72bf25t
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Operations Research and Financial Engineering, 2000-2023

Files in This Item:
File: LIU-KEVIN-THESIS.pdf
Size: 1.59 MB
Format: Adobe PDF


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.