Session 5: Machine Learning for Stock Return Prediction: Transformers or Simple Neural Networks

Presentation Type

Oral

Student

Yes

Track

Finance/Insurance Application

Abstract

We use the Autoformer, a Transformer-based model that couples an autocorrelation mechanism with seasonal-trend decomposition, to forecast US stock returns. The approach follows Wu et al. (2021), who demonstrate the effectiveness of decomposition Transformers for long-horizon time series forecasting.
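To sketch the mechanism: the Autoformer replaces dot-product attention with an Auto-Correlation block that scores candidate lags via FFT and aggregates time-delayed copies of the series. The NumPy toy below illustrates that idea on a synthetic monthly series; the function names, the toy series, and the choice of k are illustrative, not the model's actual implementation.

```python
import numpy as np

def autocorrelation_scores(x):
    """Estimate the autocorrelation R(tau) at every lag via FFT
    (Wiener-Khinchin), as in Autoformer's Auto-Correlation block."""
    n = len(x)
    f = np.fft.rfft(x - x.mean(), n=2 * n)   # zero-pad to avoid circular wrap
    acf = np.fft.irfft(f * np.conj(f))[:n]   # inverse transform -> R(tau)
    return acf / acf[0]                      # normalize so R(0) = 1

def aggregate_top_k_delays(x, k=3):
    """Roll the series by its k most autocorrelated lags and combine the
    copies, weighted by a softmax over the corresponding R(tau) values."""
    acf = autocorrelation_scores(x)
    lags = np.argsort(acf[1:])[::-1][:k] + 1          # top-k nonzero lags
    w = np.exp(acf[lags]) / np.exp(acf[lags]).sum()   # softmax weights
    delayed = np.stack([np.roll(x, lag) for lag in lags])
    return w @ delayed                                # weighted aggregation

rng = np.random.default_rng(0)
t = np.arange(240)
series = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(240)
print(aggregate_top_k_delays(series, k=3)[:5])
```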

Previous studies, most prominently Gu, Kelly, and Xiu (2020), find that shallow neural networks with three to four hidden layers deliver the best out-of-sample performance for stock return prediction. Our research challenges this conclusion by comparing the Autoformer against such shallow networks across all US stocks from 1957 to 2022.
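For reference, the shallow benchmark can be pictured as a small feed-forward network in the spirit of Gu, Kelly, and Xiu's three-hidden-layer specification. The PyTorch sketch below uses illustrative layer widths (32-16-8); this is an assumption about the architecture, not the exact benchmark estimated in our study.

```python
import torch
import torch.nn as nn

class ShallowNet(nn.Module):
    """Three-hidden-layer feed-forward network mapping firm
    characteristics to a one-month-ahead return forecast."""
    def __init__(self, n_features=90, hidden=(32, 16, 8)):
        super().__init__()
        layers, width = [], n_features
        for h in hidden:
            layers += [nn.Linear(width, h), nn.ReLU()]
            width = h
        layers.append(nn.Linear(width, 1))  # predicted excess return
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

model = ShallowNet()
fake_batch = torch.randn(4, 90)   # 4 stocks x 90 characteristics
print(model(fake_batch).shape)    # torch.Size([4, 1])
```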

Our study builds on evidence of both short-term and long-term dependencies in stock returns, documented by Martin (2021) and Lewellen (2022). We also consider the well-documented January effect (Thaler 1987), a recurring seasonal pattern in stock prices. Together, these regularities motivate the Autoformer, which is designed to capture autocorrelation and seasonality directly.
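To make the seasonality point concrete, the Autoformer separates trend from seasonal variation with a simple moving-average decomposition inside each block. Below is a minimal NumPy sketch of that decomposition, assuming an odd smoothing window (the window length is a hyperparameter, and this helper is illustrative rather than the model's code).

```python
import numpy as np

def series_decomp(x, window=25):
    """Split a series into (seasonal, trend) parts with a moving average,
    mirroring the decomposition block inside the Autoformer. Assumes an
    odd window so input and output lengths match."""
    pad = window // 2
    padded = np.pad(x, pad, mode="edge")   # replicate endpoints
    trend = np.convolve(padded, np.ones(window) / window, mode="valid")
    return x - trend, trend                # seasonal residual, trend

t = np.arange(120)
x = 0.01 * t + np.sin(2 * np.pi * t / 12)  # drift + annual cycle
seasonal, trend = series_decomp(x)
print(seasonal[:3], trend[:3])
```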

Our analysis draws on 90 features spanning firm characteristics, technical indicators, and macroeconomic indicators. We hypothesize that momentum, liquidity, and volatility factors will be the most predictive signals, and we conjecture that the transformer-based Autoformer will outperform the shallow neural network benchmark.
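The abstract does not state an evaluation metric; a natural choice, following Gu, Kelly, and Xiu (2020), is the out-of-sample predictive R² computed against a zero-forecast benchmark, sketched below as an assumption about how the model comparison would be scored.

```python
import numpy as np

def r2_oos(realized, predicted):
    """Out-of-sample predictive R^2 in the style of Gu, Kelly, and Xiu
    (2020): compares model errors against a naive forecast of zero, since
    the historical mean is a noisy benchmark for individual stocks."""
    realized = np.asarray(realized, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 1.0 - np.sum((realized - predicted) ** 2) / np.sum(realized ** 2)

# Example: a model that shrinks returns halfway toward zero scores 0.75.
r = np.array([0.02, -0.01, 0.03, -0.02])
print(r2_oos(r, 0.5 * r))
```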

Start Date

2-6-2024 11:00 AM

End Date

2-6-2024 12:00 PM

Location

Dakota Room 250 A/C
