
Word2vec


Discover seminars, jobs, and research tagged with word2vec across World Wide.
2 results
Seminar · Neuroscience · Recording

Do large language models solve verbal analogies like children do?

Claire Stevenson
University of Amsterdam
Nov 16, 2022

Analogical reasoning (learning about new things by relating them to prior knowledge) lies at the heart of human intelligence and creativity and forms the core of educational practice. Children start creating and using analogies early on, making incredible progress moving from associative processes to successful analogical reasoning. For example, if we ask a four-year-old "Horse belongs to stable like chicken belongs to …?" they may use association and reply "egg", whereas older children will likely give the intended relational response "chicken coop" (or another term for a chicken's home).

Interestingly, despite state-of-the-art AI language models having superhuman encyclopedic knowledge and superior memory and computational power, our pilot studies show that these large language models often make mistakes, providing associative rather than relational responses to verbal analogies. For example, when we asked four- to eight-year-olds to solve the analogy "body is to feet as tree is to …?" they responded "roots" without hesitation, but large language models tend to provide more associative responses such as "leaves". In this study we examine the similarities and differences between children's responses and those of six language models (Dutch/multilingual models: RobBERT, BERT-je, M-BERT, GPT-2, M-GPT, Word2Vec and Fasttext) on verbal analogies extracted from an online adaptive learning environment, in which >14,000 7-12-year-olds from the Netherlands each solved 20 or more items from a database of 900 Dutch-language verbal analogies.
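The associative-versus-relational contrast described above can be made concrete with the classic word2vec vector-offset method for analogies: to complete "a is to b as c is to …?", search for the word whose embedding is closest to b − a + c. A minimal sketch with hand-picked toy vectors (the vocabulary and vector values below are illustrative assumptions, not a trained model):

```python
import numpy as np

# Toy embedding table standing in for a trained word2vec model.
# The vectors are hand-picked so that the "home of" relation is a
# consistent offset; a real model learns this geometry from text.
emb = {
    "horse":   np.array([1.0, 0.0, 0.2]),
    "stable":  np.array([1.0, 1.0, 0.2]),
    "chicken": np.array([0.0, 0.0, 1.0]),
    "coop":    np.array([0.0, 1.0, 1.0]),   # relational answer
    "egg":     np.array([0.3, 0.1, 1.1]),   # associative distractor
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def solve_analogy(a, b, c, vocab):
    """Return the word d maximizing cos(d, b - a + c), excluding a, b, c."""
    target = emb[b] - emb[a] + emb[c]
    candidates = [w for w in vocab if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(solve_analogy("horse", "stable", "chicken", emb))  # -> coop
```

With a real trained model the same search runs over the full vocabulary, and whether the top-ranked word comes out relational ("coop") or associative ("egg") depends entirely on the geometry the embeddings have learned.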

Seminar · Neuroscience · Recording

Probabilistic Analogical Mapping with Semantic Relation Networks

Hongjing Lu
UCLA
Jun 30, 2021

Hongjing Lu will present a new computational model of Probabilistic Analogical Mapping (PAM, in collaboration with Nick Ichien and Keith Holyoak) that finds systematic correspondences between inputs generated by machine learning. The model adopts a Bayesian framework for probabilistic graph matching, operating on semantic relation networks constructed from distributed representations of individual concepts (word embeddings created by Word2vec) and of relations between concepts (created by our BART model). We have used PAM to simulate a broad range of phenomena involving analogical mapping by both adults and children. Our approach demonstrates that human-like analogical mapping can emerge from comparison mechanisms applied to rich semantic representations of individual concepts and relations. More details can be found at https://arxiv.org/ftp/arxiv/papers/2103/2103.16704.pdf
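As rough intuition for what mapping over semantic relation networks involves, the sketch below scores candidate correspondences by comparing relation vectors between concept pairs. It is only a toy: PAM uses BART-learned relation representations and Bayesian probabilistic graph matching, whereas here a plain embedding difference stands in for the relation vector, and the toy embeddings are hand-picked assumptions:

```python
import itertools
import numpy as np

# Hand-picked toy embeddings (illustrative assumptions, not Word2vec output),
# arranged so that body -> feet and tree -> roots share the same offset.
emb = {
    "body":   np.array([1.0, 1.0]),
    "feet":   np.array([1.0, 0.0]),   # body -> feet offset is [0, -1]
    "tree":   np.array([0.0, 1.0]),
    "roots":  np.array([0.0, 0.0]),   # tree -> roots offset is also [0, -1]
    "leaves": np.array([0.5, 2.0]),   # tree -> leaves points the other way
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def rel(a, b):
    # Stand-in for a learned relation representation between two concepts.
    return emb[b] - emb[a]

def best_mapping(source, targets):
    """Brute-force the target pair whose relation best matches the source's."""
    s_rel = rel(*source)
    pairs = itertools.permutations(targets, 2)
    return max(pairs, key=lambda p: cosine(s_rel, rel(*p)))

print(best_mapping(("body", "feet"), ["tree", "roots", "leaves"]))
# -> ('tree', 'roots')
```

Brute-force enumeration is only feasible for tiny analogs; the point is simply that the preferred correspondence falls out of relation similarity, echoing the body : feet :: tree : roots example from the first abstract.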