Formal Models of Language Generation

Jon Kleinberg

The emergence of large language models has prompted a surge of interest in theoretical models that might give us insight into both their successes and their shortcomings. We will give an overview of recent work in this direction, focusing on a surprising line of positive results showing that it is possible to give guarantees for language-generation algorithms even in the absence of any probabilistic assumptions, in a framework known as "language generation in the limit". These results suggest interesting notions of "breadth" in language generation, attempting to formalize the idea that different algorithms for this problem might all meet the specification yet differ significantly in their expressiveness: in how "richly" they can generate from the underlying language. We also discuss sharp contrasts with classical results on language identification, establishing a precise sense in which language generation and language learning are fundamentally different computational problems. The talk is based on joint work with Sendhil Mullainathan and Fan Wei.

Speaker: Jon Kleinberg, Cornell University

Attend in person or watch online

Wednesday, 04/22/26

Cost: Free

Banatao Auditorium, Sutardja Dai Hall
UC Berkeley
Berkeley, CA 94720