Title: Programming A Poet: Poetry Text Generation Using LSTM
Abstract: I programmed Long Short-Term Memory (LSTM) models to generate poems using Walt Whitman’s poetry collection. I built two different models: a character-level model, which generates text character by character, and a word-level model, which generates text word by word. Within each model, I also experimented with different parameters. I wrote a baseline model; a wide model, which doubles the number of cells in each LSTM layer relative to the baseline; a deep model, which adds one more LSTM layer; and a wide-and-deep model, which combines both features. I used perplexity to measure the predictive ability of the generative models. By evaluating the generated poems and their perplexities, I conclude that the word-level model is far superior to the character-level model. Within the word-level model, the wide-and-deep variant produces the highest-quality poems, although its perplexity is sometimes slightly higher. After sufficient training, the poems generated by the word-level model are meaningful, expressive, and thematic.
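The abstract evaluates the generative models by perplexity. As a minimal illustrative sketch (not code from the thesis), perplexity is the exponential of the average negative log-probability the model assigns to the observed tokens; lower values indicate a better predictive fit:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    that the model assigned to each observed token.

    token_probs: probabilities (0, 1] the model gave to the tokens
    that actually occurred in the evaluation text.
    """
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# Sanity check: a model that spreads probability uniformly over a
# vocabulary of V words assigns each token probability 1/V and has
# perplexity ~= V (here V = 50, a hypothetical tiny vocabulary).
uniform_probs = [1 / 50] * 10
print(perplexity(uniform_probs))
```

This also shows why word-level and character-level perplexities are not directly comparable: they are computed over different token inventories (a full vocabulary versus a small character set), so the generated poems themselves must be judged alongside the numbers.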
Type of Material: Princeton University Senior Theses
Appears in Collections: Electrical Engineering, 1932-2020
Files in This Item:
HONG-KATHERINE-THESIS.pdf | 268.96 kB | Adobe PDF | Request a copy
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.