Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp019w0326322
Title: QUANTUM PHOTONIC TRANSFORMER (QPT)
Authors: Kaz, Alkin
Advisors: Tureci, Hakan
Department: Electrical and Computer Engineering
Certificate Program: Center for Statistics and Machine Learning
Class Year: 2023
Abstract: In this thesis, I propose a classical-quantum hybrid machine learning model, the Quantum Photonic Transformer (QPT). The model follows the transformer architecture that has changed the landscape of Natural Language Processing (NLP) since its invention. However, instead of the dot-product and softmax in the attention weight calculation, QPT uses simple quantum photonic circuitry to obtain the query-key similarity in the form of a Gaussian Radial Basis Function (RBF). Using 2-dimensional noisy (spiral) classification as a toy problem, I show that the suggested attention mechanism in the classical regime (shots = ∞) improves on the performance of the original self-attention.
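The abstract describes replacing the dot-product-plus-softmax attention weights with a Gaussian RBF query-key similarity. As a minimal sketch of that idea in the classical limit (shots = ∞), the following NumPy snippet computes attention weights from RBF similarities; the bandwidth parameter `gamma` and the row normalization are illustrative assumptions, not details taken from the thesis:

```python
import numpy as np

def rbf_attention(Q, K, V, gamma=1.0):
    """Sketch of RBF-based attention in the classical (shots = infinity) limit.

    Q: (n, d) queries, K: (m, d) keys, V: (m, dv) values.
    gamma is a hypothetical bandwidth parameter (an assumption, not from the thesis).
    """
    # Squared Euclidean distance between every query-key pair, shape (n, m)
    sq_dists = ((Q[:, None, :] - K[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian RBF similarities replace dot-product scores
    weights = np.exp(-gamma * sq_dists)
    # Normalize each query's weights to sum to 1 (in place of softmax)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

In a photonic implementation, the RBF similarity would come from circuit measurements rather than from this explicit distance computation; the classical formula above corresponds to the infinite-shot limit the abstract refers to.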
URI: http://arks.princeton.edu/ark:/88435/dsp019w0326322
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Electrical and Computer Engineering, 1932-2023

Files in This Item:
File: KAZ-ALKIN-THESIS.pdf, 3.63 MB, Adobe PDF (request a copy)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.