Abstract
Recent innovations in transformers have shown their superior performance in natural language processing (NLP) and computer vision (CV). Their ability to capture long-range dependencies and interactions in sequential data has also sparked great interest in time series modeling, leading to the widespread use of transformers in many time series applications. However, for forecasting, the most common and crucial of these applications, the adaptation of transformers has remained limited, with both promising and inconsistent results. In contrast to NLP and CV, time series problems not only add the complexity of order and temporal dependence among input sequences but also involve trend, level, and seasonality information, much of which is valuable for decision making. Moreover, the conventional training scheme has shown deficiencies with respect to model overfitting, data scarcity, and privacy when transformers are trained for a forecasting task. In this work, we propose attentive federated transformers for time series stock forecasting that achieve better performance while preserving the privacy of participating enterprises. Empirical results on various stock data from the Yahoo! Finance website indicate the superiority of our proposed scheme in dealing with the above challenges and with data heterogeneity in federated learning.
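The mechanism itself is not spelled out in this record, but the keywords ("attentive aggregation", "federated learning") suggest a FedAtt-style server update, in which the server merges client transformer parameters using per-layer attention weights rather than a plain average. The sketch below is a minimal illustration under that assumption, not the authors' implementation; the function name `attentive_aggregate`, the `step_size` parameter, and the distance-based scoring are hypothetical.

```python
import numpy as np

def attentive_aggregate(server_params, client_params, step_size=1.0):
    """One round of attentive federated aggregation (hedged FedAtt-style sketch).

    server_params: list of np.ndarray, one entry per model layer.
    client_params: list of client models, each a list of np.ndarray
                   with the same shapes as server_params.
    """
    new_params = []
    for layer_idx, w_server in enumerate(server_params):
        # Per-client distance between the server's layer weights and the
        # client's layer weights (Frobenius norm for matrices).
        dists = np.array([
            np.linalg.norm(w_server - client[layer_idx])
            for client in client_params
        ])
        # Softmax over clients yields per-layer attention scores, so clients
        # whose weights diverge more from the server receive more attention.
        scores = np.exp(dists - dists.max())
        alpha = scores / scores.sum()
        # Move the server weights toward the attention-weighted clients.
        delta = sum(a * (client[layer_idx] - w_server)
                    for a, client in zip(alpha, client_params))
        new_params.append(w_server + step_size * delta)
    return new_params

# Toy usage: a 2-layer "model" shared by 3 clients with perturbed weights.
rng = np.random.default_rng(0)
server = [rng.normal(size=(4, 4)), rng.normal(size=(4,))]
clients = [[w + 0.1 * rng.normal(size=w.shape) for w in server]
           for _ in range(3)]
server = attentive_aggregate(server, clients)
```

Under this kind of scheme, only model parameters leave each participating enterprise, never the raw stock data, and the attention weighting is one way to cope with the data heterogeneity mentioned in the abstract.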
Original language | English |
---|---|
Title of host publication | 37th International Conference on Information Networking, ICOIN 2023 |
Publisher | IEEE Computer Society |
Pages | 499-504 |
Number of pages | 6 |
ISBN (Electronic) | 9781665462686 |
DOIs | |
Publication status | Published - 2023 |
Event | 37th International Conference on Information Networking, ICOIN 2023 - Bangkok, Thailand. Duration: 11 Jan 2023 → 14 Jan 2023 |
Publication series
Name | International Conference on Information Networking |
---|---|
Volume | 2023-January |
ISSN (Print) | 1976-7684 |
Conference
Conference | 37th International Conference on Information Networking, ICOIN 2023 |
---|---|
Country/Territory | Thailand |
City | Bangkok |
Period | 11/01/23 → 14/01/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- attentive aggregation
- federated learning
- multi-head self-attention
- time series stock forecasting
- transformer