How Language Models Learn: From Random Starts to Smart Responses | Mingrui Liu | TEDxChantilly HS

How do language models like ChatGPT learn to write essays, answer questions, or explain science? The secret lies not in magic but in optimization: the math behind learning from mistakes. In this talk, I unpack how these systems start with random responses and gradually improve through feedback and repetition. Along the way, you'll see how machines, like people, get better by practicing, adjusting, and scaling up.

Mingrui Liu is an assistant professor in the Department of Computer Science at George Mason University. His research interests include machine learning, optimization, statistical learning theory, and deep learning. He has published over 30 papers in leading AI venues such as NeurIPS, ICML, ICLR, and JMLR.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
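To make the "start random, improve through feedback" idea concrete, here is a minimal sketch (not from the talk) of gradient descent on a toy prediction task. The numbers, variable names, and learning rate are illustrative assumptions; real language models apply the same loop to billions of parameters and text data.

```python
# Toy illustration: a one-parameter "model" starts with a random guess
# and repeatedly adjusts it to reduce its error, i.e. learns from mistakes.
import random

target = 3.0                   # hypothetical "right answer" to be learned
examples = [target] * 100      # repetition: the model sees it many times

w = random.uniform(-10, 10)    # random start
learning_rate = 0.1

for x in examples:
    error = w - x                   # feedback: how far off the guess is
    gradient = 2 * error            # derivative of the squared error (w - x)^2
    w -= learning_rate * gradient   # adjust in the direction that reduces error

print(f"learned parameter: {w:.3f} (target was {target})")
```

After enough repetitions the parameter converges to the target, which is the same feedback-and-adjustment loop, scaled up, that the talk describes for language models.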
