Shiyue Zhang
Reliable Natural Language Generation and Evaluation
Research Abstract:
Natural language generation (NLG) is an important subfield of natural language processing (NLP) and a critical problem to solve on the path toward artificial general intelligence. Over the past few years, Transformer-based large language models (LLMs) have demonstrated surprisingly strong NLG performance. Nevertheless, many problems remain unsolved. My research focuses on aligning NLG learning objectives with how we expect models to behave and on improving the reliability of NLG evaluation.
Bio:
Shiyue Zhang is a Research Engineer at Bloomberg AI, focusing on large language model research and development. She received her Ph.D. from the Department of Computer Science at the University of North Carolina (UNC) at Chapel Hill in August 2023, advised by Prof. Mohit Bansal. Her research expertise is in Natural Language Processing (NLP), with a focus on language generation, including language modeling, text summarization, and low-resource machine translation. She is a recipient of the Bloomberg Data Science Ph.D. Fellowship and has completed internships at MSR, Meta AI, and Bloomberg AI. She is dedicated to making NLP more reliable in the real world.