Title

Session 14: Fine-tuning Transformer-based Natural Language Generation Algorithms for USDA Grains Reports for Farmers, Producers, and Small Businesses

Presenter Information / Coauthors Information

Winston Zeng, University of Arizona

Presentation Type

Invited

Student

Yes

Track

Tools

Abstract

The Transformer architecture for Natural Language Generation (NLG) was nothing short of a revolution for natural language processing. Self- and multi-head attention models have proven their efficacy in a variety of textual tasks, including classification, translation, summarization, and generation. Fine-tuning Transformer-based models on summarization tasks has also proven successful with small textual datasets. In this research, we focus on fine-tuning pre-trained Transformer-based NLG algorithms on USDA grain reports to produce high-quality summaries, highlights, narratives, and Q&As that enable farmers, producers, and small businesses to make informed decisions on production, investment, expansion, and risk management. Starting from high-performing Transformer-based models pre-trained on large textual datasets, such as collections of news articles, we develop, train, and fine-tune NLG algorithms on publicly available USDA grain publications and reports. These documents have characteristics that require specific considerations in model building and training: they are topic-specific (e.g., corn or soybeans), numerical-data-intensive (e.g., supply and demand), and temporally sequential (e.g., weekly progress updates). A multi-stage, end-to-end NLG system will be developed in which learning evolves from unsupervised to semi-supervised to supervised, with the final supervised stage utilizing human-expert-produced summaries and highlights. While the research contributes to the field of applying Transformer-based models to specific domains, the practical goal is to automate these processes to efficiently inform the intended audiences with high-quality content.
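The supervised-stage data preparation described above (topic-specific, temporally sequential reports paired with human-expert summaries) might be sketched as follows. This is a minimal illustration only: the record fields and sample text are hypothetical, not actual USDA data, and the resulting pairs would feed a standard sequence-to-sequence fine-tuning loop.

```python
from datetime import date

# Hypothetical USDA report records: (topic, report_date, body, expert_summary).
# Sample text is illustrative, not actual USDA data.
reports = [
    ("corn", date(2021, 6, 14), "Corn planting reached 97 percent...",
     "Corn planting nearly complete."),
    ("soybeans", date(2021, 6, 7), "Soybean emergence at 76 percent...",
     "Soybean emergence ahead of average."),
    ("corn", date(2021, 6, 7), "Corn emergence at 90 percent...",
     "Corn emergence on pace."),
]

def build_training_pairs(records, topic):
    """Select one topic's reports, order them chronologically (weekly
    progress updates are temporally sequential), and emit (input, target)
    pairs for supervised fine-tuning of a summarization model."""
    selected = sorted((r for r in records if r[0] == topic),
                      key=lambda r: r[1])
    return [(body, summary) for _, _, body, summary in selected]

pairs = build_training_pairs(reports, "corn")
```

Keeping reports in date order preserves the temporal sequence of weekly updates, which matters if context from earlier reports is concatenated into a model's input window.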

Start Date

2-8-2022 3:30 PM

End Date

2-8-2022 4:25 PM

Location

Dakota Room 250 A/C
