2024-12-12, Plenary Room 'Progress'
Thomas Wolf is the co-founder and Chief Science Officer (CSO) of Hugging Face, where he has been a pivotal figure in driving the company’s open-source, educational, and research initiatives. A prominent advocate for open science, Thomas has played a crucial role in making cutting-edge AI research and technologies widely accessible. He spearheaded the development of the Hugging Face Transformers and Datasets libraries, which have become foundational tools for researchers and developers in the machine learning community.
His contributions go beyond software development; Thomas is deeply invested in bridging the gap between academic research and industrial applications through projects like the BigScience Workshop on Large Language Models (LLMs), which led to the creation of BLOOM, a large-scale open-source LLM.
With a diverse academic background spanning Physics, AI, and Intellectual Property, Thomas brings a unique interdisciplinary perspective to the field of advanced computing. He holds a Ph.D. in Statistical/Quantum Physics from Sorbonne University and has worked across both research and legal domains. Today, his research interests revolve around LLM accessibility and overcoming current limitations in AI. Outside of research, Thomas enjoys creating educational content, authoring the book Natural Language Processing with Transformers, and sharing insights on the future of AI through blogs and videos.