Title: Semantic Networks for NLP: Language, Brains, and Digital Telepathy
Authors: Berl, Ethan
Advisors: Fellbaum, Christiane
Contributors: Botvinick, Matthew
Department: Computer Science
Class Year: 2014
Abstract: Natural Language Processing (NLP) algorithms today fall short of human-level performance on many problems because they lack semantic knowledge behind the words being manipulated. This work shows how a semantic neural network topology developed by Rumelhart, modified based on deep-learning intuitions, can store semantic information and provide fast concept-similarity metrics useful for NLP problems such as PP-attachment in parsing. These similarity metrics are also naturally context-dependent (for example, how similar two objects are when we care about what they look like versus how similar they are when we care about what they are used for), which is very important for semantically parsing language. Additionally, this thesis shows how such networks can teach each other concepts that one of them is missing, as long as they share at least a few basic concepts. This type of semantic communication, which bypasses words and operates directly on world-knowledge, might one day be applied to brains with embedded computer chips to give humans a faster and more accurate way to communicate.
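The context-dependent similarity idea in the abstract can be sketched in a few lines. This is not the thesis's actual Rumelhart-style network (which would derive representations from a trained hidden layer); it is a minimal illustration, with invented toy concepts and attribute values, of how restricting a comparison to one relation ("looks like" vs. "used for") changes the similarity score.

```python
import numpy as np

# Hypothetical toy data: binary attribute vectors for a few concepts,
# partitioned by relation. All names and values here are invented for
# illustration, not taken from the thesis.
ATTRS = ["round", "red", "long", "edible", "throwable", "writes"]
LOOKS = slice(0, 3)  # appearance attributes
USES = slice(3, 6)   # function attributes

concepts = {
    #                  round red long edible throwable writes
    "apple":  np.array([1.0, 1.0, 0.0, 1.0, 1.0, 0.0]),
    "ball":   np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0]),
    "pencil": np.array([0.0, 0.0, 1.0, 0.0, 0.0, 1.0]),
}

def cosine(u, v):
    """Cosine similarity between two attribute vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def similarity(a, b, context=None):
    """Similarity of two concepts, optionally restricted to one relation."""
    u, v = concepts[a], concepts[b]
    if context is not None:
        u, v = u[context], v[context]
    return cosine(u, v)

# An apple and a ball look alike, but an apple is edible and a ball is not,
# so their similarity drops when the context is what they are used for.
print(similarity("apple", "ball", LOOKS))  # higher: same appearance features
print(similarity("apple", "ball", USES))   # lower: functions differ
```

In the thesis's setting, the vectors being compared would instead be hidden-layer activations of the network given an item-plus-relation input, but the context-slicing step plays the same role.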
Extent: 66 pages
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2016

Files in This Item:
File: berl_ethan_Thesis.pdf
Size: 1.22 MB
Format: Adobe PDF

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.