|Title:||Semantic Networks for NLP: Language, Brains, and Digital Telepathy|
|Abstract:||Natural Language Processing (NLP) algorithms struggle to reach human-level performance on many problems today because they lack semantic knowledge behind the words being manipulated. This work shows how a semantic neural network topology developed by Rumelhart, modified with intuitions from deep learning, can store semantic information and provide fast concept-similarity metrics useful for NLP problems such as PP-attachment in parsing. These concept-similarity metrics also naturally provide context dependency (for example, how similar two objects are when we care about what they look like versus when we care about what they are used for), which is very important in semantically parsing language. Additionally, this thesis shows how such networks can teach each other missing concepts, provided they have at least several basic concepts in common. This type of semantic communication, which bypasses words and goes directly from world-knowledge to world-knowledge, might one day be applied to brains with embedded computer chips to give humans a faster and more accurate way to communicate.|
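The context-dependent similarity the abstract describes can be illustrated with a minimal sketch: the concept names, context labels, and attribute weights below are hypothetical toy data, not taken from the thesis, and a plain cosine over per-context attribute vectors stands in for the network's learned representations.

```python
from math import sqrt

# Hypothetical toy data: each concept maps a context to an attribute vector.
# These names and weights are illustrative only.
CONCEPTS = {
    "orange": {
        "appearance": {"round": 1.0, "orange-colored": 1.0},
        "function": {"edible": 1.0, "juiceable": 1.0},
    },
    "basketball": {
        "appearance": {"round": 1.0, "orange-colored": 1.0},
        "function": {"bounceable": 1.0, "throwable": 1.0},
    },
}

def cosine(u, v):
    # Cosine similarity between two sparse attribute vectors (dicts).
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def similarity(a, b, context):
    # Similarity depends on which context (relation) we restrict attention to.
    return cosine(CONCEPTS[a][context], CONCEPTS[b][context])

print(similarity("orange", "basketball", "appearance"))  # high: both round, orange-colored
print(similarity("orange", "basketball", "function"))    # low: no shared uses
```

With this toy data, an orange and a basketball are maximally similar when we care about appearance and entirely dissimilar when we care about function, which is the kind of context dependency the abstract argues matters for semantic parsing.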
|Type of Material:||Princeton University Senior Theses|
|Appears in Collections:||Computer Science, 1988-2016|
Files in This Item:
|berl_ethan_Thesis.pdf||1.22 MB||Adobe PDF|
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.