
FECAM


This is the original PyTorch implementation of the following paper: [FECAM: Frequency Enhanced Channel Attention Mechanism for Time Series Forecasting](https://arxiv.org/abs/2212.01209).

If you find this repository useful for your research work, please consider citing it as follows:

 @misc{jiang2022fecam,
   title={FECAM: Frequency Enhanced Channel Attention Mechanism for Time Series Forecasting},
   author={Maowei Jiang and Pengyu Zeng and Kai Wang and Huan Liu and Wenbo Chen and Haoran Liu},
   year={2022},
   eprint={2212.01209},
   archivePrefix={arXiv},
   primaryClass={cs.AI}
 }

Updates

  • [2022-12-01] FECAM v1.0 is released.
  • [2023-03-23] Model/Linear means FECAM + a linear projection layer: FECAM is used for feature extraction, and the projection layer controls the output length of the prediction.

Features

  • Supports six popular time-series forecasting benchmarks, namely Electricity Transformer Temperature (ETTh1, ETTh2, ETTm1 and ETTm2), Traffic, National Illness, Electricity and Exchange Rate, spanning the power, energy, finance, health care and traffic domains.
  • We generalize FECAM into a module that can be flexibly and easily applied to any deep learning model with just a few lines of code; a minimal sketch follows below.
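
For illustration, here is a minimal PyTorch sketch of such a module: a DCT-based channel attention block in the spirit of FECAM. All names (`dct_matrix`, `FrequencyChannelAttention`) are ours and the exact architecture in the paper may differ; treat this as a sketch, not the repository's implementation.

```python
import math

import torch
import torch.nn as nn


def dct_matrix(n: int) -> torch.Tensor:
    """Orthonormal DCT-II basis of shape (n, n); rows index frequencies."""
    i = torch.arange(n, dtype=torch.float32).unsqueeze(0)  # time index
    k = torch.arange(n, dtype=torch.float32).unsqueeze(1)  # frequency index
    basis = torch.cos(math.pi / n * (i + 0.5) * k) * math.sqrt(2.0 / n)
    basis[0] /= math.sqrt(2.0)  # normalize the DC row
    return basis


class FrequencyChannelAttention(nn.Module):
    """Hypothetical sketch: re-weight channels with gates computed from
    each channel's DCT spectrum. Input/output shape: (B, seq_len, C)."""

    def __init__(self, seq_len: int, reduction: int = 4) -> None:
        super().__init__()
        self.register_buffer("dct", dct_matrix(seq_len))  # fixed basis (L, L)
        hidden = max(seq_len // reduction, 1)
        self.gate = nn.Sequential(      # squeeze-excite style bottleneck
            nn.Linear(seq_len, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        signal = x.transpose(1, 2)       # (B, C, L): one 1-D signal per channel
        spectrum = signal @ self.dct.T   # DCT along time: (B, C, L)
        weight = self.gate(spectrum)     # one gate per channel: (B, C, 1)
        return (signal * weight).transpose(1, 2)  # rescale, restore (B, L, C)
```

Dropping a `FrequencyChannelAttention(seq_len)` between two existing layers is the kind of few-line change the bullet above refers to.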

To-do items

  • Integrate FECAM into other mainstream models (e.g. Pyraformer, Bi-LSTM, etc.) for better performance and higher efficiency on real-world time series.
  • Validate FECAM on more spatial-temporal time series datasets.
  • As a sequence modelling module, we believe it can also work well on NLP tasks such as machine translation and named entity recognition. Furthermore, as a frequency enhanced module, it should theoretically work in any deep learning model, e.g. ResNet.

Stay tuned!

Get started

  1. Install the required packages first (mainly Python 3.8 and PyTorch 1.9.0):
 cd FECAM
 conda create -n fecam python=3.8
 conda activate fecam
 pip install -r requirements.txt
  2. Download data. You can obtain all the six benchmarks from Tsinghua Cloud or Google Drive. All the datasets are well pre-processed and can be used easily.
  3. Train the model. We provide the experiment scripts for all benchmarks under the folder ./scripts. You can reproduce the experiment results by:
 sh ./scripts/electricity.sh
 sh ./scripts/ettm2.sh
 sh ./scripts/exchange_rate.sh
 sh ./scripts/ill.sh
 sh ./scripts/traffic.sh
 sh ./scripts/weather.sh

SENet (channel attention)

FECAM (Frequency Enhanced Channel Attention Mechanism)

As a module to enhance the frequency-domain modeling capability of Transformers and LSTMs
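
To illustrate, here is how the block sketched above could be dropped into a small LSTM forecaster. The wiring and hyperparameters below are ours and purely illustrative, not the repository's model:

```python
import torch
import torch.nn as nn

# Assumes FrequencyChannelAttention from the sketch in the Features section.

class LSTMWithFECAM(nn.Module):
    """Toy LSTM forecaster with frequency-enhanced channel attention
    applied to the input window (placement is illustrative only)."""

    def __init__(self, seq_len: int, channels: int, pred_len: int,
                 hidden: int = 64) -> None:
        super().__init__()
        self.pred_len, self.channels = pred_len, channels
        self.att = FrequencyChannelAttention(seq_len)
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pred_len * channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, seq_len, C)
        x = self.att(x)             # re-weight channels by their spectra
        _, (h, _) = self.lstm(x)    # h: (1, B, hidden), final hidden state
        out = self.head(h[-1])      # (B, pred_len * C)
        return out.view(-1, self.pred_len, self.channels)
```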

Comparison with Transformers and other mainstream forecasting models

Multivariate Forecasting:

FECAM outperforms all transformer-based methods by a large margin.

Univariate Forecasting:

Efficiency

Compared to the vanilla models, applying our method adds only a few parameters (see Table 4), so their computational complexity is preserved.
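
As a rough sanity check on that claim (using the sketched block above rather than the repository's exact module), the added parameters are just the two small linear layers of the gate, independent of the number of channels:

```python
# FrequencyChannelAttention is the sketch from the Features section.
# For a 96-step input window: Linear(96 -> 24) + Linear(24 -> 1).
block = FrequencyChannelAttention(seq_len=96)
extra = sum(p.numel() for p in block.parameters())
print(f"extra parameters: {extra}")  # 96*24 + 24 + 24*1 + 1 = 2353
```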

Performance promotion with FECAM module

Visualization

Forecasting visualization: visualization of ETTm2 and Exchange predictions given by different models.

FECAM visualization: visualization of the frequency enhanced channel attention and of the output tensor of a Transformer encoder layer. The x-axis represents channels and the y-axis represents frequency from low to high, on the Weather and Exchange datasets.
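
A figure along these lines can be approximated with the sketched block: extract the per-channel DCT spectra and render them as a heatmap with channels on the x-axis and frequency on the y-axis. The window length of 96 and the 21 Weather variates below are our assumptions:

```python
import matplotlib.pyplot as plt
import torch

# FrequencyChannelAttention is the sketch from the Features section.
block = FrequencyChannelAttention(seq_len=96)
x = torch.randn(1, 96, 21)                       # one dummy Weather-sized window
spectrum = (x.transpose(1, 2) @ block.dct.T)[0]  # (channels, frequencies)

plt.imshow(spectrum.T.abs().numpy(), aspect="auto", origin="lower")
plt.xlabel("channel")
plt.ylabel("frequency (low to high)")
plt.colorbar(label="|DCT coefficient|")
plt.show()
```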

Used Datasets

We conduct the experiments on 6 popular time-series datasets, namely Electricity Transformer Temperature (ETTh1, ETTh2, ETTm1 and ETTm2), Traffic, Weather, Illness, Electricity and Exchange Rate, spanning the power, energy, finance, health care and traffic domains.

Overall information of the 9 real-world datasets

Datasets       Variants  Timesteps  Granularity  Start time  Task type
ETTh1          7         17,420     1 hour       7/1/2016    Multi-step
ETTh2          7         17,420     1 hour       7/1/2016    Multi-step
ETTm1          7         69,680     15 min       7/1/2016    Multi-step
ETTm2          7         69,680     15 min       7/1/2016    Multi-step & Single-step
ILI            7         966        1 week       1/1/2002    Multi-step
Exchange-Rate  8         7,588      1 day        1/1/1990    Multi-step & Single-step
Electricity    321       26,304     1 hour       1/1/2012    Multi-step
Traffic        862       17,544     1 hour       1/1/2015    Multi-step
Weather        21        52,695     10 min       1/1/2020    Multi-step

Dataset preparation

Download data. You can obtain all the six benchmarks from Tsinghua Cloud or Google Drive. All the datasets are well pre-processed and can be used easily. (We thank Haixu Wu, author of Autoformer, for curating the datasets and sharing them publicly.)

The data directory structure is shown as follows.

./
└── datasets/
    ├── electricity
    │   └── electricity.csv
    ├── ETT-small
    │   ├── ETTh1.csv
    │   ├── ETTh2.csv
    │   ├── ETTm1.csv
    │   └── ETTm2.csv
    ├── exchange_rate
    │   └── exchange_rate.csv
    ├── illness
    │   └── national_illness.csv
    ├── traffic
    │   └── traffic.csv
    └── weather
        └── weather.csv
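
Once the files are in place, a quick check can confirm that a benchmark matches the table above. This assumes the standard layout of these CSVs, i.e. a date column plus one column per variate:

```python
import pandas as pd

df = pd.read_csv("./datasets/ETT-small/ETTh1.csv")
print(df.shape)             # expect (17420, 8): 17,420 steps, date + 7 variates
print(df.columns.tolist())  # 'date' followed by the variate names
```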

Contact

If you have any questions, feel free to contact us or open a GitHub issue. Pull requests are very welcome!

Maowei Jiang: jiangmaowei@sia.cn 

Acknowledgements

Thank you all for your attention to our work!

This code uses Autoformer, Informer, Reformer, Transformer, LSTM, N-HiTS, N-BEATS, Pyraformer and ARIMA as baseline methods for comparison and further improvement.

We appreciate the following GitHub repos a lot for their valuable code bases and datasets:

https://github.com/zhouhaoyi/Informer2020

https://github.com/thuml/Autoformer

https://github.com/cure-lab/LTSF-Linear

https://github.com/zhouhaoyi/ETDataset

https://github.com/laiguokun/multivariate-time-series-data

Thank you for your attention.
