Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01b8515r30h
Title: Visual Analogy Extrapolation Challenge (VAEC)
Contributors: Webb, Taylor
Dulberg, Zachary
Frankland, Steven
Petrov, Alexander
O'Reilly, Randall
Cohen, Jonathan
Issue Date: 2020
Publisher: Princeton University
Related Publication: Webb, T., Dulberg, Z., Frankland, S., Petrov, A., O'Reilly, R., Cohen, J. (2020). Learning representations that support extrapolation. In Proceedings of the 37th International Conference on Machine Learning (ICML), Vienna, Austria, PMLR 119.
Abstract: Extrapolation -- the ability to make inferences that go beyond the scope of one's experiences -- is a hallmark of human intelligence. By contrast, the generalization exhibited by contemporary neural network algorithms is largely limited to interpolation between data points in their training corpora. In this paper, we consider the challenge of learning representations that support extrapolation. We introduce a novel visual analogy benchmark that allows the graded evaluation of extrapolation as a function of distance from the convex domain defined by the training data. We also introduce a simple technique, context normalization, that encourages representations that emphasize the relations between objects. We find that this technique enables a significant improvement in the ability to extrapolate, considerably outperforming a number of competitive techniques.
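The context normalization technique described in the abstract can be illustrated with a minimal sketch: each feature dimension is z-scored across the items in a task context, so that only the relations between objects (rather than their absolute values) are carried forward. This is a simplified NumPy illustration of the idea, not the paper's in-network implementation; the array shapes and function name are assumptions.

```python
import numpy as np

def context_normalize(context, eps=1e-8):
    """Z-score each feature dimension across the items in a context:
    subtract the context mean and divide by the context std, so absolute
    feature values are replaced by relative (relational) values."""
    context = np.asarray(context, dtype=float)  # shape: (n_items, n_features)
    mean = context.mean(axis=0, keepdims=True)
    std = context.std(axis=0, keepdims=True)
    return (context - mean) / (std + eps)

# Hypothetical example: three objects described by (position, size) features.
items = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
normed = context_normalize(items)
```

After normalization, contexts drawn from different regions of feature space map onto the same relational pattern, which is what allows a model trained on one region to extrapolate to another.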
Description: Download the VAEC_readme.txt file for a detailed description of this dataset's contents. The data sets themselves are available from Dropbox at https://www.dropbox.com/sh/rv5msg9ml8k0q4g/AAAxx0uU8hPrap8U8RFaaOuna?dl=0
URI: http://arks.princeton.edu/ark:/88435/dsp01b8515r30h
https://www.dropbox.com/sh/rv5msg9ml8k0q4g/AAAxx0uU8hPrap8U8RFaaOuna?dl=0
https://doi.org/10.34770/81bg-rt16
Appears in Collections:Research Data Sets

Files in This Item:
VAEC_readme.txt (Text, 5.46 kB)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.