Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01t435gg27v
Full metadata record

DC Field                  Value                                                 Language
dc.contributor.advisor    Verma, Naveen                                         -
dc.contributor.author     Ko, Brittany                                          -
dc.date.accessioned       2015-06-09T13:38:08Z                                  -
dc.date.available         2015-06-09T13:38:08Z                                  -
dc.date.created           2015-05-04                                            -
dc.date.issued            2015-06-09                                            -
dc.identifier.uri         http://arks.princeton.edu/ark:/88435/dsp01t435gg27v   -
dc.description.abstract
Recent technology has been gravitating towards hands-free, more intuitive human-machine interaction. Within the last few years, Professor Verma's group at Princeton University has developed a 3D gesture sensing system that interacts with the user at a 0-30 cm range, right in between the ranges of the touchscreen and the Microsoft Kinect (50-400 cm), making it complementary to existing technology. It provides not only a low-cost option, but also an easy integration process with existing monitors that are not touchscreen or gesture sensing capable. Applications already developed for this gesture sensing system include Google Maps interactions, and the library of gestures includes swipes and clicks. It also has a preliminary algorithm for XYZ position tracking of the interacting hand or user.

However, more application development can be done to exemplify the system's rich capabilities and the ease of using it with other devices. This prompted an initiative to develop an RPG memory-task game that would utilize the gesture sensing platform and an EEG headset to show the system's potential for use with other devices in game development. The gameplay consists of the user fighting Yoda, a fictional character from Star Wars. There are three different spells that Yoda can cast at the user, and the user must make the correct gesture to counter each one. In addition, the user must wear a NeuroSky MindWave Mobile EEG headset and achieve a high enough Meditation level in order for the gesture to complete, and thus for the counterattack to work.

Several methods for expanding the gesture library were explored. First, I expanded the gesture library to detect gestures of the letters 'o', 'w', and 'z' by training a support vector machine (SVM) on key features extracted from the gesture data stream. Later, I implemented another method using Google's Tesseract OCR, a platform that processes images of letters and characters. Between these two methods, tradeoffs among computational time, immunity to changes from using different monitors and settings, and accuracy had to be considered. Ultimately, the SVM method was more suited to this particular game. For the EEG headset, the provided Meditation output parameter was much more stable and consistent than the Attention parameter and the raw EEG wave outputs. The game was ultimately successful and playable with both of these components together.
en_US
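The SVM-based letter-gesture approach summarized in the abstract can be sketched in code. This is a minimal illustration rather than the thesis's actual pipeline: the synthetic 'o'/'w'/'z' trajectories, the hand-picked features (trajectory closure, extent, direction changes, path length), and the use of scikit-learn's SVC with an RBF kernel are all assumptions made for demonstration.

```python
# Illustrative sketch of letter-gesture classification with an SVM.
# The trajectories, features, and parameters below are assumptions for
# demonstration, not the thesis's actual feature set or data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_gesture(letter, noise=0.02):
    """Return a noisy synthetic (x, y) trajectory for 'o', 'w', or 'z'."""
    t = np.linspace(0.0, 1.0, 64)
    if letter == "o":                          # closed circle
        xy = np.c_[np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)]
    elif letter == "w":                        # down-up-down-up zigzag
        xy = np.c_[t, np.abs(((4 * t) % 2) - 1)]
    else:                                      # 'z': top bar, diagonal, bottom bar
        xy = np.concatenate([
            np.c_[np.linspace(0, 1, 21), np.ones(21)],
            np.c_[np.linspace(1, 0, 22), np.linspace(1, 0, 22)],
            np.c_[np.linspace(0, 1, 21), np.zeros(21)],
        ])
    return xy + rng.normal(scale=noise, size=xy.shape)

def features(xy):
    """Hand-picked key features: closure, extent, direction changes, length."""
    d = np.diff(xy, axis=0)
    path_len = np.sum(np.linalg.norm(d, axis=1))
    closure = np.linalg.norm(xy[-1] - xy[0]) / path_len   # near 0 for 'o'
    y_flips = np.sum(np.diff(np.sign(d[:, 1] + 1e-9)) != 0)
    extent = xy.max(axis=0) - xy.min(axis=0)
    return [closure, extent[0], extent[1], y_flips, path_len]

# Build a small training set of noisy examples per letter.
letters = ["o", "w", "z"]
X = [features(make_gesture(l)) for l in letters for _ in range(30)]
y = [l for l in letters for _ in range(30)]

# Standardize the features, then train an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
print("train accuracy:", clf.score(X, y))
```

In this sketch the closure and path-length features alone nearly separate the three letters, which mirrors the abstract's point that a few key features from the gesture stream can suffice; the real tradeoff against the OCR approach (speed vs. robustness to monitor changes) is discussed in the thesis itself.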
dc.format.extent          72 pages                                              *
dc.language.iso           en_US                                                 en_US
dc.title                  Developing Applications For An Extended-Range Capacitive 3D Gesture Sensing System   en_US
dc.type                   Princeton University Senior Theses                    -
pu.date.classyear         2015                                                  en_US
pu.department             Electrical Engineering                                en_US
pu.pdf.coverpage          SeniorThesisCoverPage                                 -
Appears in Collections: Electrical and Computer Engineering, 1932-2023

Files in This Item:
File                           Size      Format
PUTheses2015-Ko_Brittany.pdf   1.82 MB   Adobe PDF


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.