
Human Language

Discover seminars, jobs, and research tagged with human language across World Wide.
5 curated items: 4 seminars, 1 ePoster
Seminar · Neuroscience

LLMs and Human Language Processing

Mariya Toneva, Ariel Goldstein, Jean-Rémi King
Max Planck Institute for Software Systems; Hebrew University; École Normale Supérieure
Nov 28, 2024

This webinar convened researchers at the intersection of Artificial Intelligence and Neuroscience to investigate how large language models (LLMs) can serve as valuable “model organisms” for understanding human language processing. Presenters showcased evidence that brain recordings (fMRI, MEG, ECoG) acquired while participants read or listened to unconstrained speech can be predicted by representations extracted from state-of-the-art text- and speech-based LLMs. In particular, text-based LLMs tend to align better with higher-level language regions, capturing more semantic aspects, while speech-based LLMs excel at explaining early auditory cortical responses. However, purely low-level features can drive part of these alignments, complicating interpretations. New methods, including perturbation analyses, highlight which linguistic variables matter for each cortical area and time scale. Further, “brain tuning” of LLMs—fine-tuning on measured neural signals—can improve semantic representations and downstream language tasks. Despite open questions about interpretability and exact neural mechanisms, these results demonstrate that LLMs provide a promising framework for probing the computations underlying human language comprehension and production at multiple spatiotemporal scales.

Seminar · Cognition

Great ape interaction: Ladyginian but not Gricean

Thom Scott-Phillips
Institute for Logic, Cognition, Language and Information
Nov 20, 2023

Non-human great apes inform one another in ways that can seem very humanlike. Especially in the gestural domain, their behavior exhibits many similarities with human communication, meeting widely used empirical criteria for intentionality. At the same time, there remain some manifest differences. How to account for these similarities and differences in a unified way remains a major challenge. This presentation will summarise the arguments developed in a recent paper with Christophe Heintz. We make a key distinction between the expression of intentions (Ladyginian) and the expression of specifically informative intentions (Gricean), and we situate this distinction within a ‘special case of’ framework for classifying different modes of attention manipulation. The paper also argues that the attested tendencies of great ape interaction—for instance, to be dyadic rather than triadic, and to be about the here-and-now rather than ‘displaced’—are products of its Ladyginian but not Gricean character. I will reinterpret video footage of great ape gesture as Ladyginian but not Gricean, and distinguish several varieties of meaning that are continuous with one another. We conclude that the evolutionary origins of linguistic meaning lie in gradual changes not in communication systems as such, but rather in social cognition, and specifically in what modes of attention manipulation are enabled by a species’ cognitive phenotype: first Ladyginian and in turn Gricean. The second of these shifts rendered humans, and only humans, ‘language ready’.

Seminar · Neuroscience

It’s not over our heads: Why human language needs a body

Michał B. Paradowski
Institute of Applied Linguistics, University of Warsaw
May 8, 2022

In the ‘orthodox’ view, cognition has been seen as the manipulation of symbolic, mental representations, separate from the body. This dualist Cartesian approach characterised much of twentieth-century thought and is still taken for granted by many people today. Language, too, has long been treated across scientific domains as a system operating largely independently of perception, action, and the body (articulatory-perceptual organs notwithstanding). This could lead one to believe that to emulate linguistic behaviour, it would suffice to develop ‘software’ operating on abstract representations that would work on any computational machine. Yet the brain is not the sole problem-solving resource we have at our disposal. The disembodied picture is inaccurate for numerous reasons, which will be presented by addressing the indissoluble link between cognition, language, body, and environment in understanding and learning. The talk will conclude with implications and suggestions for pedagogy, relevant for disciplines as diverse as instruction in language, mathematics, and sports.

Seminar · Neuroscience · Recording

Structure-mapping in Human Learning

Dedre Gentner
Northwestern University
Apr 1, 2021

Across species, humans are uniquely able to acquire deep relational systems of the kind needed for mathematics, science, and human language. Analogical comparison processes are a major contributor to this ability. Analogical comparison engages a structure-mapping process (Gentner, 1983) that fosters learning in at least three ways: first, it highlights common relational systems and thereby promotes abstraction; second, it promotes inferences from known situations to less familiar situations; and, third, it reveals potentially important differences between examples. In short, structure-mapping is a domain-general learning process by which abstract, portable knowledge can arise from experience. It is operative from early infancy on, and is critical to the rapid learning we see in human children. Although structure-mapping processes are present pre-linguistically, their scope is greatly amplified by language. Analogical processes are instrumental in learning relational language, and the reverse is also true: relational language acts to preserve relational abstractions and render them accessible for future learning and reasoning.

ePoster

Canine white matter pathways potentially related to human language comprehension

Mélina Cordeau, Isabel Levin, Mira Sinha, Erin Hecht

FENS Forum 2024