Awesome Time Series

πŸ“ Time Series Papers

A comprehensive survey of time series papers from 2018–2022 (we will keep it updated!) published at top conferences (NeurIPS, ICML, ICLR, SIGKDD, SIGIR, AAAI, IJCAI, WWW, CIKM, ICDM, WSDM, etc.).

We divide these papers into several fundamental tasks, as listed below.

Features

  • Up-to-date papers
  • Summaries of the key contributions of each paper
  • The datasets used in each paper

Update

  • [2022-05-31] Added papers published in ICML 2022
  • [2022-05-31] Added papers published in NeurIPS, ICML, ICLR, SIGKDD, SIGIR, AAAI, and IJCAI 2019!
  • [2022-05-05] Added papers published in WWW 2022!
  • [2022-04-25] TS-Paper v1.0 is released! It covers published time series papers from 2020 to 2022. Stay tuned!

TODO

  • Add papers published in 2018. (v3.0)

Survey

Paper Conference Year Code Key Contribution
Transformers in Time Series: A Survey - 2022 link 1. This work summarizes the adaptations and modifications of Transformer network structures for time series. 2. This work categorizes these methods by task, i.e., forecasting, anomaly detection, and classification.
Time series data augmentation for deep learning: a survey IJCAI 2021 - 1. This work systematically reviews and empirically compares different data augmentation methods for time series. 2. They discuss and highlight five future directions to provide useful research guidance.
Neural temporal point processes: a review IJCAI 2021 - 1. They focus on important design choices and general principles for defining neural TPP models. 2. They provide an overview of common application areas. 3. They conclude many open challenges and important directions.
Time-series forecasting with deep learning: a survey Philosophical Transactions of the Royal Society A 2021 - 1. They survey common encoder and decoder designs used in both one-step-ahead and multi-horizon time series forecasting, describing how temporal information is incorporated into predictions by each model. 2. They highlight recent developments in hybrid deep learning models, which combine well-studied statistical models with neural network components to improve pure methods in either category. 3. They outline some ways in which deep learning can also facilitate decision support with time series data.
Deep learning for time series forecasting: a survey Big Data 2021 - 1. They formulate the time series forecasting problem along with its mathematical fundamentals. 2. They discuss the advantages and limitations of feed-forward networks, recurrent neural networks (including Elman networks, long short-term memory, gated recurrent units, and bidirectional networks), and convolutional neural networks.
DL-Traff: Survey and Benchmark of Deep Learning Models for Urban Traffic Prediction CIKM 2021 graph-data, grid-data They synthetically review the deep traffic models and the widely used datasets, then build a standard benchmark to comprehensively evaluate their performances with the same settings and metrics.
Graph Neural Network for Traffic Forecasting: A Survey - 2021 - The first comprehensive survey that explores the application of graph neural networks for traffic forecasting problems (e.g. road traffic flow and speed forecasting, passenger flow forecasting in urban rail transit systems, and demand forecasting in ride-hailing platforms).
Deep learning for anomaly detection in time-series data: review, analysis, and guidelines Access 2021 - 1. This review provides a background on anomaly detection in time-series data and reviews the latest applications in the real world. 2. They comparatively analyze state-of-the-art deep-anomaly-detection models with several benchmark datasets. 3. They offer guidelines for appropriate model selection and training strategy for deep learning-based time series anomaly detection.
A review on outlier/anomaly detection in time series data ACM Computing Surveys 2021 - 1. This review provides a structured and comprehensive state-of-the-art on outlier detection techniques in the context of time series. 2. A taxonomy is presented based on the main aspects that characterize an outlier detection technique.
A unifying review of deep and shallow anomaly detection Proceedings of the IEEE 2021 - 1. This work draws connections between classic 'shallow' and novel deep approaches and shows how this relation might cross-fertilize or extend both directions. 2. They outline some critical open challenges.
Big Data for Traffic Estimation and Prediction: A Survey of Data and Tools Applied System Innovation 5 2021 - This study presents an up-to-date survey of open data and big data tools used for traffic estimation and prediction.
Fusion in stock market prediction: A decade survey on the necessity, recent developments, and potential future directions Information Fusion 2021 - 1. They survey information, feature, and model fusion from 2011–2020. 2. They discuss limitations and explore future directions for various stock applications.
Applications of deep learning in stock market prediction: Recent progress ESA 2021 - 1. Gives an up-to-date review of recent works on deep learning models for stock market prediction. 2. Discusses data sources, models, metrics, implementation, and reproducibility.
Deep Learning for Spatio-Temporal Data Mining: A Survey KDD 2020 - 1. A review of recent progress in applying deep learning techniques for STDM. 2. They classify existing literature based on the types of spatio-temporal data, the data mining tasks, and the deep learning models.
Urban flow prediction from spatiotemporal data using machine learning: A survey Information Fusion 2020 - 1. Surveys urban flow prediction from spatiotemporal data. 2. Reviews machine-learning-based methods. 3. Discusses the remaining difficulties and some ideas for future work.
An empirical survey of data augmentation for time series classification with neural networks - 2020 link 1. Proposes a taxonomy and outlines the four families of time series data augmentation, including transformation-based methods, pattern mixing, generative models, and decomposition methods (a minimal sketch of two simple augmentations follows this table). 2. Empirically evaluates 12 time series data augmentation methods on 128 time series classification datasets with six different types of neural networks. 3. Analyzes the characteristics, advantages and disadvantages, and recommendations for each data augmentation method.
Deep Learning on Traffic Prediction: Methods, Analysis and Future Directions - 2020 - They summarize the existing traffic prediction methods and widely used public datasets, and give an evaluation and analysis by conducting extensive experiments to compare the performance of different methods on a real-world public dataset.
Neural forecasting: Introduction and literature overview - 2020 - An introduction and an overview of some of the advances of neural networks in machine learning.
Financial time series forecasting with deep learning: A systematic literature review: 2005–2019 ASC 2019 - 1. They categorized the studies according to the intended forecasting implementation areas, such as index, forex, and commodity forecasting. 2. They grouped the studies based on DL model choices, such as convolutional neural networks (CNNs), deep belief networks (DBNs), and long short-term memory (LSTM).
Deep learning for time series classification: a review Data Mining and Knowledge Discovery 2019 link They implemented existing approaches by training 8,730 deep learning models on 97 time series datasets.
Natural language based financial forecasting: a survey Artificial Intelligence Review 2018 - They show the scope, progress, and hotspots of natural language based financial forecasting (NLFF).
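
As a concrete reference for the transformation-based family discussed in the augmentation surveys above, here is a minimal NumPy sketch of two common augmentations, jittering and window slicing. The function names and default parameters are illustrative choices, not taken from any of the listed papers.

```python
import numpy as np

def jitter(x, sigma=0.03):
    """Transformation-based augmentation: add Gaussian noise to a series of shape (T, C)."""
    return x + np.random.normal(loc=0.0, scale=sigma, size=x.shape)

def window_slice(x, reduce_ratio=0.9):
    """Crop a random contiguous window and linearly interpolate it back to the original length."""
    t = x.shape[0]
    target_len = max(2, int(np.ceil(reduce_ratio * t)))
    start = np.random.randint(0, t - target_len + 1)
    window = x[start:start + target_len]
    # Resample each channel back to the original length.
    old_idx = np.linspace(0, 1, num=target_len)
    new_idx = np.linspace(0, 1, num=t)
    return np.stack([np.interp(new_idx, old_idx, window[:, c]) for c in range(x.shape[1])], axis=1)

# Example: augment a toy multivariate series of length 128 with 3 channels.
series = np.random.randn(128, 3)
augmented = window_slice(jitter(series))
```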

Time Series Forecasting

Paper Conference Year Code Used Datasets Key Contribution
FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting ICML 2022 code ETT, Electricity, Exchange, Weather, ILI We propose to combine Transformer with the seasonal-trend decomposition method, in which the decomposition method captures the global profile of the time series while the Transformer captures more detailed structures. The proposed method, termed Frequency Enhanced Decomposed Transformer (FEDformer), is more efficient than the standard Transformer, with complexity linear in the sequence length.
TACTiS: Transformer-Attentional Copulas for Time Series ICML 2022 code electricity, fred-md, kdd-cup, solar-10min, traffic We propose a versatile method, based on the transformer architecture, that estimates joint distributions using an attention-based decoder that provably learns to mimic the properties of non-parametric copulas.
Domain Adaptation for Time Series Forecasting via Attention Sharing ICML 2022 code UCI, Wiki we propose a novel domain adaptation framework, Domain Adaptation Forecaster (DAF). DAF leverages statistical strengths from a relevant domain with abundant data samples (source) to improve the performance on the domain of interest with limited data (target).
Volatility Based Kernels and Moving Average Means for Accurate Forecasting with Gaussian Processes ICML 2022 code we take inspiration from well studied domains to introduce a new class of models, Volt and Magpie, that significantly outperform baselines in stock and wind speed forecasting, and naturally extend to the multitask setting.
DSTAGNN: Dynamic Spatial-Temporal Aware Graph Neural Network for Traffic Flow Forecasting ICML 2022 code PEMS This paper proposes a novel Dynamic Spatial-Temporal Aware Graph Neural Network (DSTAGNN) to model the complex spatial-temporal interaction in road network.
Multi-Granularity Residual Learning with Confidence Estimation for Time Series Prediction WWW 2022 Code Electricity, Stock we design a novel residual learning net to model the prior knowledge of the fine-grained data's distribution through the coarse-grained one. Furthermore, to alleviate the side effect of validity differences, we introduce a self-supervised objective for confidence estimation, which delivers more effective optimization without the requirement of additional annotation efforts.
CAMul: Calibrated and Accurate Multi-view Time-Series Forecasting WWW 2022 Code google-symptoms, covid19, power, tweet We propose a general probabilistic multi-view forecasting framework CAMul, which can learn representations and uncertainty from diverse data sources. It integrates the information and uncertainty from each data view in a dynamic context-specific manner, assigning more importance to useful views to model a well-calibrated forecast distribution.
EXIT: Extrapolation and Interpolation-based Neural Controlled Differential Equations for Time-series Classification and Forecasting WWW 2022 Code MuJoCo, Google Stock we propose to i) generate another latent continuous path using an encoder-decoder architecture, which corresponds to the interpolation process of NCDEs, i.e., our neural network-based interpolation vs. the existing explicit interpolation, and ii) exploit the generative characteristic of the decoder, i.e., extrapolation beyond the time domain of original data if needed.
RETE: Retrieval-Enhanced Temporal Event Forecasting on Unified Query Product Evolutionary Graph WWW 2022 - Yelp, E-commerce RETE efficiently and dynamically retrieves relevant entities centrally on each user as high-quality subgraphs, preventing the noise propagation from the densely evolutionary graph structures that incorporate abundant search queries.
CATN: Cross Attentive Tree-aware Network for Multivariate Time Series Forecasting AAAI 2022 - Traffic, Electricity, PeMSD7(M), METR-LA studied the hierarchical and grouped correlation mining problem of multivariate time-series data and proposed CATN for multi-step forecasting.
Reinforcement Learning based Dynamic Model Combination for Time Series Forecasting AAAI 2022 - DATA a novel and practically effective online ensemble aggregation framework for time-series forecasting that employs a deep reinforcement learning approach as a meta-learning technique.
Conditional Local Convolution for Spatio-temporal Meteorological Forecasting AAAI 2022 Code link WeatherBench (Rasp et al. 2020) a local conditional convolution to capture and imitate the meteorological flows of local patterns on the whole sphere
TLogic: Temporal Logical Rules for Explainable Link Forecasting on Temporal Knowledge Graphs AAAI 2022 Code link Integrated Crisis Early Warning System, Split method the first symbolic framework that directly learns temporal logical rules from temporal knowledge graphs and applies these rules for link forecasting
Spatio-Temporal Recurrent Networks for Event-Based Optical Flow Estimation AAAI 2022 - The MVSEC dataset (Zhu et al. 2018a) novel input representation to effectively extract the spatio-temporal information from event input.
A GNN-RNN Approach for Harnessing Geospatial and Temporal Information: Application to Crop Yield Prediction AAAI 2022 - Crop a novel GNN-RNN framework to innovatively incorporate both geospatial and temporal knowledge into crop yield prediction.
ST-GSP: Spatial-Temporal Global Semantic Representation Learning for Urban Flow Prediction WSDM 2022 Code link TaxiBJ, BikeNYC our model explicitly models the correlation among temporal dependencies of different scales to extract global temporal dependencies + new simple fusion strategy + self-supervised learning
Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting ICLR 2022 Code link Electricity, Wind, ETT data and App Flow a novel model based on pyramidal attention that can effectively describe both short and long temporal dependencies with low time and space complexity.
DEPTS: Deep Expansion Learning for Periodic Time Series Forecasting ICLR 2022 Code link Electricity, Traffic, and M4 (Hourly) model complicated periodic dependencies and capture sophisticated compositions of diversified periods simultaneously.
TAMP-S2GCNets: Coupling Time-Aware Multipersistence Knowledge Representation with Spatio-Supra Graph Convolutional Networks for Time-Series Forecasting ICLR 2022 Code link PeMSD3, PeMSD4, PeMSD8 and COVID-19 The developed TAMP-S2GCNets model is shown to yield highly competitive forecasting performance on a wide range of datasets, with much lower computational costs.
CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting ICLR 2022 Code link ETT,Electricity,Weather proposed CoST, a contrastive learning framework that learns disentangled seasonal-trend representations for time series forecasting tasks.
Reversible Instance Normalization for Accurate Time-Series Forecasting against Distribution Shift ICLR 2022 - ETT, Electricity Consuming Load (ECL) address the distribution shift problem in time series, proposing a simple yet effective normalization-and-denormalization method, reversible instance normalization (RevIN) (see the normalization sketch after this table)
Temporal Alignment Prediction for Supervised Representation Learning and Few-Shot Sequence Classification ICLR 2022 Code link MSR Action3D, MSR Daily Activity3D, "Spoken Arabic Digits (SAD)" dataset, ChaLearn present TAP, which is a learnable distance for sequences.
Deep Switching Auto-Regressive Factorization: Application to Time Series Forecasting AAAI 2021 - Pacific Ocean Temperature Dataset, Parking Birmingham Data Set, ........ it parameterizes the weights in terms of a deep switching vector auto-regressive likelihood governed with a Markovian prior
Dynamic Gaussian Mixture Based Deep Generative Model for Robust Forecasting on Sparse Multivariate Time Series AAAI 2021 - USHCN, KDD-CUP, MIMIC-III provides a novel and general solution that explicitly defines temporal dependency between Gaussian mixture distributions at different time steps
Temporal Latent Autoencoder: A Method for Probabilistic Multivariate Time Series Forecasting AAAI 2021 - Traffic, Electricity, Wiki introduced a novel temporal latent auto-encoder method which enables nonlinear factorization of multivariate time series, learned end-to-end with a temporal deep learning latent space forecast model. By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting AAAI 2021 - Electricity (UCI), Traffic, Environment (Li et al., 2019a) presented a novel approach based on the deep conditional generative model to jointly learn from heterogeneous temporal sequences.
Time-Series Event Prediction with Evolutionary State Graph WSDM 2021 Code link DJIA30, WebTraffic, NetFlow, ClockErr, and AbServe proposed a novel representation, the evolutionary state graph, to present the time-varying relations among time-series states.
Long Horizon Forecasting With Temporal Point Processes WSDM 2021 Code link Election, Taxi, Traffic-911, and EMS-911. a novel MTPP model specifically designed for long-term forecasting of events.
Modeling Inter-station Relationships with Attentive Temporal Graph Convolutional Network for Air Quality Prediction WSDM 2021 - Beijing, Tianjin and POIs data encode multiple types of inter-station relationships into graphs and design parallel GCN-based encoding and decoding modules to aggregate features from related stations using different graphs.
Predicting Crowd Flows via Pyramid Dilated Deeper Spatial-temporal Network WSDM 2021 - Wi-Fi connection log, bike in New York city and taxi ride in New York ConvLSTM + pyramid dilated residual network + integrated attention
Z-GCNETs: Time Zigzags at Graph Convolutional Networks for Time Series Forecasting ICML 2021 Code link Decentraland, Bytom, PeMSD4 and PeMSD8. The new Z-GCNETs layer allows us to track the salient time-aware topological characterizations of the data persisting over time.
Explaining Time Series Predictions with Dynamic Masks ICML 2021 Code link MIMIC-III These masks are endowed with an insightful information theoretic interpretation and offer a neat improvement in terms of performance.
End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series ICML 2021 Code link Labour, Traffic, Tourism, Tourism-L, and Wiki a single, global model that does not require any adjustments to produce coherent, probabilistic forecasts, a first of its kind.
Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting ICML 2021 Code link Exchange, Solar and Electricity, Traffic, Taxi and Wikipedia a combination of improved variance schedule and an L1 loss to allow sampling with fewer steps at the cost of a small reduction in quality if such a trade-off is required.
Conformal prediction interval for dynamic time-series ICML 2021 Code link solar and wind energy data present a predictive inference method for dynamic time-series.
RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting ICML 2021 - PeMSD3, PeMSD4, PeMSD7 and PeMSD8 propose a state-space probabilistic modeling framework for multivariate time-series prediction that can process information provided in the form of a graph that specifies (probable) predictive or causal relationships.
ST-Norm: Spatial and Temporal Normalization for Multi-variate Time Series Forecasting KDD 2021 Code link BikeNYC, PeMSD7 and Electricity propose two kinds of normalization modules -- temporal and spatial normalization -- which separately refine the high-frequency component and the local component underlying the raw data.
MiniRocket: A Fast (Almost) Deterministic Transform for Time Series Classification KDD 2021 Code link UCR archive reformulate Rocket into a new method, MiniRocket. MiniRocket is up to 75 times faster than Rocket on larger datasets, and almost deterministic.
Dynamic and Multi-faceted Spatio-temporal Deep Learning for Traffic Speed Forecasting KDD 2021 Code link PeMSD4, PeMSD8 and England design a dynamic graph construction method to learn the time-specific spatial dependencies of road segments.
Forecasting Interaction Order on Temporal Graphs KDD 2021 Code link COLLEGEMSG, EMAIL-EU and FBWALL devise an attention mechanism to aggregate neighborhoods' information based on their representations and time encodings attached to their specific edges.
Quantifying Uncertainty in Deep Spatiotemporal Forecasting KDD 2021 - air quality PM2.5, road network traffic, and COVID-19 incident deaths conduct benchmark studies on uncertainty quantification in deep spatiotemporal forecasting from both Bayesian and frequentist perspectives.
Spatial-Temporal Graph ODE Networks for Traffic Flow Forecasting KDD 2021 Code link - we capture spatial-temporal dynamics through a tensor-based ordinary differential equation (ODE), as a result, deeper networks can be constructed and spatial-temporal features are utilized synchronously.
A PLAN for Tackling the Locust Crisis in East Africa: Harnessing Spatiotemporal Deep Models for Locust Movement Forecasting KDD 2021 Code link - PLAN's novel spatio-temporal deep learning architecture enables representing PlantVillage's crowdsourced locust observation data using novel image-based feature representations, and its design is informed by several unique insights about this problem domain.
Topological Attention for Time Series Forecasting NeurIPS 2021 Code link M4 competition dataset propose topological attention, which allows attending to local topological features within a time horizon of historical data.
MixSeq: Connecting Macroscopic Time Series Forecasting with Microscopic Time Series Data NeurIPS 2021 - Rossmann, Wiki and M5 an end-to-end mixture model to cluster microscopic time series, where all the components come from a family of Seq2seq models parameterized by different parameters.
Test-time Collective Prediction NeurIPS 2021 - Boston...... Our approach takes inspiration from the literature in social science on human consensus-making.
Bubblewrap: Online tiling and real-time flow prediction on neural manifolds NeurIPS 2021 Code link - we propose a method that combines fast, stable dimensionality reduction with a soft tiling of the resulting neural manifold, allowing dynamics to be approximated as a probability flow between tiles.
Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting NeurIPS 2021 Code link ETT, Electricity, Exchange, and Traffic we propose Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism.
Learning to Learn the Future: Modeling Concept Drifts in Time Series Prediction CIKM 2021 - Climate Dataset, Stock Dataset and Synthetic Dataset propose a novel framework called learning to learn the future. Specifically, we develop a learning method to model the concept drift during the inference stage, which can help the model generalize well in the future.
AdaRNN: Adaptive Learning and Forecasting of Time Series CIKM 2021 - UCI activity, Air quality, Electric power and Stock price AdaRNN is a general framework with flexible distribution distances integrated.
Actionable Insights in Urban Multivariate Time-series CIKM 2021 - Gaussian, Insect, Wikipedia and so on ...... introduce and formalize a novel problem RaTSS that aims to find such time-series (rationalizations), which are actionable for the segmentation. We also propose an algorithm Find-RaTSS to find them for any black-box segmentation.
Historical Inertia: A Neglected but Powerful Baseline for Long Sequence Time-series Forecasting CIKM 2021 - ETT, Electricity introduce a new baseline for LSTF, the historical inertia (HI), which refers to the most recent historical data points in the input time series (see the sketch after this table).
AGCNT: Adaptive Graph Convolutional Network for Transformer-based Long Sequence Time-Series Forecasting CIKM 2021 - ETT a ProbSparse adaptive graph self-attention; a stacked encoder with distilling ProbSparse graph self-attention that integrates the graph attention mechanism; and a stacked decoder with generative inference that generates all prediction values in one forward operation
PIETS: Parallelised Irregularity Encoders for Forecasting with Heterogeneous Time-Series ICDM 2021 - Covid-19 design a novel architecture, PIETS, to model heterogeneous time-series.
Attentive Neural Controlled Differential Equations for Time-series Classification and Forecasting ICDM 2021 Code link Character Trajectories, PhysioNet Sepsis and Google Stock. present Attentive Neural Controlled Differential Equations (ANCDEs) for time-series classification and forecasting, where dual NCDEs are used: one for generating attention values, and the other for evolving hidden vectors for a downstream machine learning task.
SSDNet: State Space Decomposition Neural Network for Time Series Forecasting ICDM 2021 - Electricity, Exchange, Solar, ...... SSDNet combines the Transformer architecture with state space models to provide probabilistic and interpretable forecasts, including trend and seasonality components and previous time steps important for the prediction.
Two Birds with One Stone: Series Saliency for Accurate and Interpretable Multivariate Time Series Forecasting IJCAI 2021 - electricity, Air-quality, Industry data Series saliency is model agnostic and performs as an adaptive data augmentation method for training deep models. Moreover, by slightly changing the objective, we optimize series saliency to find a mask for interpretable forecasting in both feature and time dimensions.
TE-ESN: Time Encoding Echo State Network for Prediction Based on Irregularly Sampled Time Series Data IJCAI 2021 - MG system, SILSO, USHCN, COVID-19 propose a novel Time Encoding (TE) mechanism. TE can embed the time information as time vectors in the complex domain.
DeepFEC: Energy Consumption Prediction under Real-World Driving Conditions for Smart Cities WWW 2021 Code link SPMD, VED presents a novel framework that identifies vehicle/driving environment-dependent factors to predict energy consumption over a road network based on historical consumption data for different vehicle types.
HINTS: Citation Time Series Prediction for New Publications via Dynamic Heterogeneous Information Network Embedding WWW 2021 - the AMiner Computer Science dataset and the American Physical Society (APS) Physics dataset a novel end-to-end deep learning framework that converts citation signals from dynamic heterogeneous information networks (DHIN) into citation time series.
Variable Interval Time Sequence Modeling for Career Trajectory Prediction: Deep Collaborative Perspective WWW 2021 - traffic data from 1988.1 to 2018.11 propose a unified time-aware career trajectory prediction framework, namely TACTP, which is capable of jointly providing the above three abilities for better understanding the career trajectories of talents.
REST: Reciprocal Framework for Spatiotemporal coupled predictions WWW 2021 - a traffic dataset released by Li et al. and a web dataset come up with a novel Reciprocal SpatioTemporal (REST) framework, which introduces Edge Inference Networks (EINs) to couple with GCNs.
AutoSTG: Neural Architecture Search for Predictions of Spatio-Temporal Graph WWW 2021 Code link PEMS-BAY and METR-LA propose a novel framework, entitled AutoSTG, for automated spatio-temporal graph prediction. In our AutoSTG, spatial graph convolution and temporal convolution operations are adopted in our search space to capture complex spatio-temporal correlations. Besides, we employ the meta learning technique to learn the adjacency matrices of spatial graph convolution layers and kernels of temporal convolution layers from the meta knowledge of the attributed graph.
Fine-grained Urban Flow Prediction WWW 2021 - TaxiBJ+, HappyValley Spatio-Temporal Relation Network (STRN) to predict fine-grained urban flows. First, a backbone network is used to learn high-level representations for each cell. Second, we present a Global Relation Module (GloNet) that captures global spatial dependencies much more efficiently compared to existing methods. Third, we design a Meta Learner that takes external factors and land functions (e.g., POI density) as inputs to produce meta knowledge and boost model performances.
Probabilistic Time Series Forecasting with Shape and Temporal Diversity NeurIPS 2020 Code link - Diversity is controlled via two proposed differentiable positive semi-definite kernels for shape and time and exploits a forecasting model with a disentangled latent space.
Benchmarking Deep Learning Interpretability in Time Series Predictions NeurIPS 2020 Code link - a comprehensive synthetic benchmark where positions of informative features are known.
Adversarial Sparse Transformer for Time Series Forecasting NeurIPS 2020 - electricity, traffic, wind, solar, M4-Hourly By adversarial learning, we improve the continuity and fidelity at the sequence level. We further propose the Sparse Transformer to improve the ability to pay more attention to relevant steps in time series.
Deep Rao-Blackwellised Particle Filters for Time Series Forecasting NeurIPS 2020 - electricity, traffic, solar, exchange, wiki proposed an extension of the classical SGLS that addresses two weaknesses
Gamma-Models: Generative Temporal Difference Learning for Infinite-Horizon Prediction NeurIPS 2020 - - introduced a new class of predictive model, the γ-model, that is a hybrid between standard model-free and model-based mechanisms
EvolveGraph: Multi-Agent Trajectory Prediction with Dynamic Relational Reasoning NeurIPS 2020 - Honda 3D Dataset (H3D), NBA SportVU Dataset (NBA), and Stanford Drone Dataset (SDD) present a generic trajectory forecasting framework with explicit relational reasoning among multiple heterogeneous, interactive agents with a graph representation.
Multi-agent Trajectory Prediction with Fuzzy Query Attention NeurIPS 2020 Code link ETH-UCY, Collisions, NGsim, Charges, NBA a general architecture designed to predict trajectories in multi-agent systems while modeling the crucial inductive biases of motion, namely, inertia, relative motion, intents and interactions.
Set Functions for Time Series ICML 2020 Code link MIMIC-III, Physionet 2012 Mortality Prediction Challenge presented a novel approach for classifying time series with irregularly-sampled and unaligned observations.
Learning from Irregularly-Sampled Time Series: A Missing Data Perspective ICML 2020 Code link MNIST..... introduced an encoder-decoder framework for modeling general missing data problems and introduced two model families leveraging this framework: P-VAE and P-BiGAN.
Unsupervised Transfer Learning for Spatiotemporal Predictive Networks ICML 2020 Code link - studied a new unsupervised transfer learning problem of using multiple pretrained models to improve the performance of a new spatiotemporal predictive learning task.
Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks KDD 2020 Code link Solar-Energy, Traffic, Electricity, Exchange-Rate and PeMS ...... a novel framework for multivariate time series forecasting
Deep State-Space Generative Model For Correlated Time-to-Event Predictions KDD 2020 Code link MIMIC-III proposed a deep latent state-space generative model to capture the relations between patients' mortality risk and the associated organ failure risks.
Attention based multi-modal new product sales time-series forecasting KDD 2020 - - propose and empirically evaluate several novel attention-based multi-modal encoder-decoder models to forecast the sales for a new product purely based on product images, any available product attributes and also external factors like holidays, events, weather, and discount.
BusTr: predicting bus travel times from real-time traffic KDD 2020 - - demonstrates excellent generalization to test data that differs both spatially and temporally from the training examples we use, allowing our model to cope gracefully with the ever-changing world.
CompactETA: A Fast Inference System for Travel Time Prediction KDD 2020 - - encode high order spatial and temporal dependency into sophisticated representations by applying graph attention network on a spatiotemporal weighted road network graph. We further encode the sequential information of the travel route by positional encoding to avoid the recurrent network structure.
DATSING: Data Augmented Time Series Forecasting with Adversarial Domain Adaptation CIKM 2020 - - propose a two-phased framework which first clusters similar mixed-domain time series data and then performs a fine-tuning procedure with domain adversarial regularization to achieve better out-of-sample generalization.
Dual Sequential Network for Temporal Sets Prediction SIGIR 2020 - - addressed the problem that most of the existing methods were designed for predicting time series or temporal events, which could not be directly used for temporal sets prediction due to the difficulties of multi-level representations of items and sets, complex temporal dependencies of sets, and evolving dynamics of sequential behaviors.
Think Globally, Act Locally: A Deep Neural Network Approach to High-Dimensional Time Series Forecasting NeurIPS 2019 Code electricity, traffic, wiki, PeMS07(M) Our model can be trained effectively on high-dimensional but diverse time series, where different time series can have vastly different scales, without a priori normalization or rescaling.
Shape and Time Distortion Loss for Training Deep Time Series Forecasting Models NeurIPS 2019 Code Synth, ECG, Traffic We introduce a differentiable loss function suitable for training deep neural nets, and provide a custom back-prop implementation for speeding up optimization. We also introduce a variant of DILATE, which provides a smooth generalization of temporally-constrained Dynamic Time Warping (DTW).
Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting NeurIPS 2019 - solar, wind we first propose convolutional self-attention by producing queries and keys with causal convolution so that local context can be better incorporated into attention mechanism.
Discovering Latent Covariance Structures for Multiple Time Series ICML 2019 - - We present a pragmatic search algorithm which explores a larger structure space efficiently.
BeatGAN: Anomalous Rhythm Detection using Adversarially Generated Time Series IJCAI 2019 Code - BeatGAN outputs explainable results to pinpoint the anomalous time ticks of an input beat, by comparing them to adversarially generated beats.
Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting IJCAI 2019 Code MIT-BIH ECG dataset, CMU Motion Capture dataset Our approach involves a deep network based embodiment of the state space model, to allow for non-linear emission and transition models design, which is flexible to deal with arbitrary data distribution.
Explainable Deep Neural Networks for Multivariate Time Series Predictions IJCAI 2019 - - We design a two-stage convolutional neural network architecture which uses particular kernel sizes. This allows us to utilise gradient based techniques for generating saliency maps for both the time dimension and the features.
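
Two recurring ideas in the forecasting table above are simple enough to show in a few lines: reversible instance normalization (RevIN, the ICLR 2022 entry) normalizes each input window by its own statistics and denormalizes the forecast to counter distribution shift, and the historical-inertia baseline (the CIKM 2021 entry) simply repeats the most recent observations. The NumPy sketch below combines the two as an illustration only; it omits RevIN's learnable affine parameters and is not the authors' implementation.

```python
import numpy as np

def instance_normalize(x, eps=1e-5):
    """Normalize one input window x of shape (T, C) by its own per-channel statistics."""
    mean = x.mean(axis=0, keepdims=True)
    std = x.std(axis=0, keepdims=True) + eps
    return (x - mean) / std, (mean, std)

def instance_denormalize(y, stats):
    """Map the forecast back to the original scale using the stored statistics."""
    mean, std = stats
    return y * std + mean

def naive_forecaster(x_norm, horizon):
    """Placeholder model: repeat the last normalized observation (historical-inertia style)."""
    return np.repeat(x_norm[-1:, :], horizon, axis=0)

# Normalize the lookback window, forecast in the normalized space, then denormalize.
window = np.cumsum(np.random.randn(96, 2), axis=0)   # toy non-stationary series
x_norm, stats = instance_normalize(window)
forecast = instance_denormalize(naive_forecaster(x_norm, horizon=24), stats)
```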

Time Series Classification

Paper Conference Year Code Used Datasets Key Contribution
Omni-Scale CNNs: A Simple and Effective Kernel Size Configuration for Time Series Classification ICLR 2022 Code link MEG-TLE, UEA 30 archive, UCR 85 archive, UCR 128 archive presents a simple 1D-CNN block, namely OS-block.
Correlative Channel-Aware Fusion for Multi-View Time Series Classification AAAI 2021 - EV-Action, NTU RGB+D, UCI Daily and Sports Activities The global-local temporal encoders are developed to extract robust temporal representations for each view, and a learnable fusion mechanism is proposed to boost the multi-view label information.
Learnable Dynamic Temporal Pooling for Time Series Classification AAAI 2021 - UCR/UEA proposes a dynamic temporal pooling + a learning framework to simultaneously optimize the network parameters of a CNN classifier and the prototypical hidden series that encodes the latent semantic of the segments.
ShapeNet: A Shapelet-Neural Network Approach for Multivariate Time Series Classification AAAI 2021 Code link UEA MTS datasets We propose Mdc-CNN to embed time series subsequences of various lengths into a unified space and propose a cluster-wise triplet loss to train the network in an unsupervised fashion. We adopt MST to obtain the MST representation of time series.
Joint-Label Learning by Dual Augmentation for Time Series Classification AAAI 2021 Code UCR a novel time-series data augmentation method.
Explainable Multivariate Time Series Classification: A Deep Neural Network Which Learns To Attend To Important Variables As Well As Time Intervals WSDM 2021 - PM2.5, Seizure, Movement introduced LAXCAT, a novel, modular architecture for explainable multivariate time series classification.
Voice2Series: Reprogramming Acoustic Models for Time Series Classification ICML 2021 Code link Coffee,...... a novel approach to reprogram a pre-trained acoustic model for time series classification.
Learning Saliency Maps to Explain Deep Time Series Classifiers CIKM 2021 - Wafer, GunPoint, Computers, Earthquakes, FordA, FordB, CricketX, PTB, ECG a method that learns to highlight the timesteps that are most responsible and the degree to which they are important for the classifier's prediction.
Gaussian Process Model Learning for Time Series Classification ICDM 2021 - - proposed a novel approach for time series classification called Local Gaussian Process Model Inference Classification (LOGIC).
Contrast Profile: A Novel Time Series Primitive that Allows Classification in Real World Settings ICDM 2021 Code link UCR archive We have shown that the MPdist is more robust to noise, irrelevant data, misalignment, etc., than either Euclidean distance or DTW (a minimal DTW implementation follows this table).
Attentive Neural Controlled Differential Equations for Time-series Classification and Forecasting ICDM 2021 Code link - a novel NCDE architecture that incorporates the concept of attention.
Imbalanced Time Series Classification for Flight Data Analyzing with Nonlinear Granger Causality Learning CIKM 2020 Code link - presented a neural network classification model for imbalanced multivariate time series by leveraging the information learned from normal class, which can also learn the nonlinear Granger causality for each class, so that we can pinpoint how time series classes differ from each other.
Visualet: Visualizing Shapelets for Time Series Classification CIKM 2020 Code link UCR archive Such efficiency has made it possible for demo attendees to interact with shapelet discovery and explore high-quality shapelets. In this demo, we present Visualet -- a tool for visualizing shapelets, and exploring effective and interpretable ones.
Learning Discriminative Virtual Sequences for Time Series Classification CIKM 2020 Code link - propose a novel time series classification method named Discriminative Virtual Sequence Learning (DVSL).
Fast and Accurate Time Series Classification Through Supervised Interval Search CIKM 2020 Code link - STSF improves the classification efficiency by examining only a (set of) sub-series of the original time series, and its tree-based structure allows for interpretable outcomes.
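
Several classification entries above (e.g., the Contrast Profile paper) compare distance measures such as Euclidean distance and Dynamic Time Warping (DTW). For reference, a minimal dynamic-programming implementation of the classic DTW distance between two univariate series, without a warping-window constraint, might look like the following sketch:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic DTW: cost of the best monotonic alignment between 1-D series a and b."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed predecessor alignments.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two series with the same shape but different lengths: DTW stays small where
# a point-wise Euclidean comparison would not even be defined.
x = np.sin(np.linspace(0, 2 * np.pi, 60))
y = np.sin(np.linspace(0, 2 * np.pi, 80))
print(dtw_distance(x, y))
```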

Anomaly Detection

Paper Conference Year Code Used Datasets Key Contribution
Unsupervised Model Selection For Time-series Anomaly Detection ICLR 2023 - UCR, SMD In this paper, we explore how to select accurate time-series anomaly detection models given an unlabeled dataset and a set of candidate models. Note that the adjusted F1 score is used: the adjustment ensures that once an algorithm detects any part of an anomaly, the entire anomalous segment is considered detected (a sketch of this adjustment follows this table).
Deep Variational Graph Convolutional Recurrent Network for Multivariate Time Series Anomaly Detection ICML 2022 DND, SMD, MSL, SMAP In this paper, we model channel dependency and stochasticity within MTS by developing an embedding-guided probabilistic generative network. We combine it with adaptive Variational Graph Convolutional Recurrent Network (VGCRN) to model both spatial and temporal fine-grained correlations in MTS. To explore hierarchical latent representations, we further extend VGCRN into a deep variational network, which captures multilevel information at different layers and is robust to noisy time series.
A Semi-Supervised VAE Based Active Anomaly Detection Framework in Multivariate Time Series for Online Systems WWW 2022 - online cloud server data from two different types of game business SLA-VAE first defines anomalies based on feature extraction module, introduces semi-supervised VAE to identify anomalies in multivariate time series, and employs active learning to update the online model via a small number of uncertain samples.
Towards a Rigorous Evaluation of Time-series Anomaly Detection AAAI 2022 - Secure water treatment (SWaT), ...... applying point adjustment (PA) can severely overestimate a time-series anomaly detection (TAD) model's capability.
DeepGPD: A Deep Learning Approach for Modeling Geospatio-Temporal Extreme Events AAAI 2022 Code link the Global Historical Climatology Network (GHCN) proposed a novel deep learning architecture (DeepGPD) capable of learning the parameters of the generalized Pareto distribution while satisfying the conditions placed on those parameters.
Graph-Augmented Normalizing Flows for Anomaly Detection of Multiple Time Series ICLR 2022 - PMU-B, PMU-C, SWaT, METR-LA propose a novel flow model by imposing a Bayesian network among constituent series.
Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy ICLR 2022 - SMD MSL SMAP SWaT PSM propose the Anomaly Transformer with a new Anomaly-Attention mechanism to compute the association discrepancy. A minimax strategy is devised to amplify the normal-abnormal distinguishability of the association discrepancy.
Graph Neural Network-Based Anomaly Detection in Multivariate Time Series AAAI 2021 - SWaT and WADI proposed our Graph Deviation Network (GDN) approach, which learns a graph of relationships between sensors, and detects deviations from these patterns, while incorporating sensor embeddings
Time Series Anomaly Detection with Multiresolution Ensemble Decoding AAAI 2021 - ECG, 2D-gesture and Power-demand, Yahoo's S5 Its core is to use lower-resolution information to help long-range decoding at layers with higher resolutions. This is achieved by jointly learning multiple recurrent decoders, where each decoder has a different decoding length.
Outlier Impact Characterization for Time Series Data AAAI 2021 benchmark Webscope, Physionet study recurring outliers in time series data and aim to provide a systematic way of measuring the impact of such outliers on time series analysis
F-FADE: Frequency Factorization for Anomaly Detection in Edge Streams WSDM 2021 Code link - a new approach for detection of anomalies in edge streams, which uses a novel frequency-factorization technique to efficiently model the time-evolving distributions of frequencies of interactions between node-pairs.
FluxEV: A Fast and Effective Unsupervised Framework for Time-Series Anomaly Detection WSDM 2021 Code link - By converting the non-extreme anomalies to extreme values, our framework addresses the limitation of SPOT and achieves a huge improvement in the detection accuracy. Moreover, Method of Moments is adopted to speed up the parameter estimation in the automatic thresholding.
Event Outlier Detection in Continuous Time ICML 2021 Code link MIMIC III we develop outlier detection methods based on point processes that can take context information into account. Our methods are based on Bayesian decision theory and hypothesis testing with theoretical guarantees.
Multivariate Time Series Anomaly Detection and Interpretation using Hierarchical Inter-Metric and Temporal Embedding KDD 2021 Code link - Its core idea is to model the normal patterns inside MTS data through hierarchical Variational AutoEncoder with two stochastic latent variables, each of which learns low-dimensional inter-metric or temporal embeddings. Furthermore, we propose an MCMC-based method to obtain reasonable embeddings and reconstructions at anomalous parts for MTS anomaly interpretation.
Practical Approach to Asynchronous Multi-variate Time Series Anomaly Detection and Localization KDD 2021 Code link - Our solution is designed to leverage this behavior. The solution utilizes spectral analysis on the latent representation of a pre-trained autoencoder to extract dominant frequencies across the signals, which are then used in a subsequent network that learns the phase shifts across the signals and produces a synchronized representation of the raw multivariate.
Time Series Anomaly Detection for Cyber-physical Systems via Neural System Identification and Bayesian Filtering KDD 2021 Code link - a specially crafted neural network architecture is posed for system identification, i.e., capturing the dynamics of CPS in a dynamical state-space model; then a Bayesian filtering algorithm is naturally applied on top of the "identified" state-space model for robust anomaly detection by tracking the uncertainty of the hidden state of the system recursively over time.
Multi-Scale One-Class Recurrent Neural Networks for Discrete Event Sequence Anomaly Detection KDD 2021 Code link - a multi-scale one-class recurrent neural network for detecting anomalies in discrete event sequences.
Online false discovery rate control for anomaly detection in time series NeurIPS 2021 - - The methods proposed in this article overcome shortcomings of previous FDRC rules in the context of anomaly detection, in particular ensuring that power remains high even when the alternative is exceedingly rare (typical in anomaly detection) and the test statistics are serially dependent (typical in time series).
Detecting Anomalous Event Sequences with Temporal Point Processes NeurIPS 2021 - LOGS,STEAD The proposed method can be combined with various TPP models, such as neural TPPs, and is easy to implement.
You Only Look at One Sequence: Rethinking Transformer in Vision through Object Detection NeurIPS 2021 Code link - have explored the transferability of the vanilla ViT pre-trained on mid-sized ImageNet-1k dataset to the more challenging COCO object detection benchmark
Drop-DTW: Aligning Common Signal Between Sequences While Dropping Outliers NeurIPS 2021 - MNIST introduced an extension to the classic DTW algorithm, which relaxes the constraints of matching endpoints of paired sequences and the continuity of the path cost.
Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation NeurIPS 2021 Code link healthcare, air quality a novel approach to impute multivariate time series with conditional diffusion models.
SDFVAE: Static and Dynamic Factorized VAE for Anomaly Detection of Multivariate CDN KPIs WWW 2021 - - Our key insight is that different KPIs are constrained by certain time-invariant characteristics of the underlying system, and that explicitly modelling such invariance may help resist noise in the data. We thus propose a novel anomaly detection method called SDFVAE, short for Static and Dynamic Factorized VAE, that learns the representations of KPIs by explicitly factorizing the latent variables into dynamic and static parts.
Time-series Change Point Detection with Self-Supervised Contrastive Predictive Coding WWW 2021 - Yahoo!Benchmark, HASC, USC-HAD propose a novel self-supervised CPD method, TS-CP2, for time series. TS-CP2 learns an embedded representation to predict a future interval of a time series from historical samples.
NTAM: Neighborhood-Temporal Attention Model for Disk Failure Prediction in Cloud Platforms WWW 2021 - industrial datasets collected from millions of disks in Microsoft Azure,....... NTAM is a novel approach that not only utilizes a disk's own status data, but also considers its neighbors' status data. Moreover, NTAM includes a novel attention-based temporal component to capture the temporal nature of the disk status data. Besides, we propose a data enhancement method, called Temporal Progressive Sampling (TPS), to handle the extreme data imbalance issue.
Improving Irregularly Sampled Time Series Learning with Time-Aware Dual-Attention Memory-Augmented Networks CIKM 2021 Code link - The proposed model can leverage both time irregularity, multi-sampling rates and global temporal patterns information inherent in IASS-MTS so as to learn more effective representations for improving prediction performance.
BiCMTS: Bidirectional Coupled Multivariate Learning of Irregular Time Series with Missing Values CIKM 2021 - - BiCMTS method to represent both forward and backward value couplings within a time series by RNNs and between MTS by self-attention networks; the learned bidirectional intra- and inter-time series coupling representations are fused to estimate missing values.
Timeseries Anomaly Detection using Temporal Hierarchical One-Class Network NeurIPS 2020 - 2D-gesture, Power demand, KDD-Cup99 data, SWaT, MSL, SMAP based on a set of hierarchical structured hyperspheres. The solution uses a probabilistic relevance on cluster centers to help the model access the whole temporal history. A center orthogonality loss and a temporal self-supervision loss are also introduced for improved feature representation.
USAD: UnSupervised Anomaly Detection on multivariate time series KDD 2020 - SWaT, WADI, SMD, SMAP, MSL adversarially trained autoencoders
Application Performance Anomaly Detection with LSTM on Temporal Irregularities in Logs CIKM 2020 Code link - present a new method to perform anomaly detection, while maintaining the quantitative aspect of time, using a count of event types over time.
Multivariate Time-series Anomaly Detection via Graph Attention Network ICDM 2020 Code link SMAP, MSL, TSA propose a novel framework based on graph attention network for multivariate time-series anomaly detection.
MERLIN: Parameter-Free Discovery of Arbitrary Length Anomalies in Massive Time Series Archives ICDM 2020 - - an algorithm that can efficiently and exactly find discords of all lengths in massive time series archives.
Outlier Detection for Time Series with Recurrent Autoencoder Ensembles IJCAI 2019 Code Traffic,...... The two solutions are ensemble frameworks, specifically an independent framework and a shared framework, both of which combine multiple S-RNN based autoencoders to enable outlier detection.
Regularization for Time Series Trend Filtering IJCAI 2019 - - we adopt the Huber loss to suppress outliers, and utilize a combination of the first order and second order difference on the trend component as regularization to capture both slow and abrupt trend changes. Furthermore, an efficient method is designed to solve the proposed robust trend filtering based on majorization minimization (MM) and the alternating direction method of multipliers (ADMM).
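
Two entries above (the ICLR 2023 model-selection paper and the AAAI 2022 evaluation paper) refer to the point-adjustment (PA) protocol used to score time series anomaly detectors. The sketch below, assuming binary per-timestep labels and predictions, marks an entire ground-truth anomaly segment as detected as soon as any point inside it is flagged; comparing raw and adjusted F1 on the toy example illustrates why PA can overestimate a detector's capability.

```python
import numpy as np

def point_adjust(y_true, y_pred):
    """If any point inside a true anomaly segment is predicted, mark the whole segment as predicted."""
    y_true = np.asarray(y_true, dtype=bool)
    adjusted = np.asarray(y_pred, dtype=bool).copy()
    t = 0
    while t < len(y_true):
        if y_true[t]:
            end = t
            while end < len(y_true) and y_true[end]:
                end += 1
            if adjusted[t:end].any():
                adjusted[t:end] = True   # expand the hit to the full segment
            t = end
        else:
            t += 1
    return adjusted

def f1_score(y_true, y_pred):
    """Point-wise F1 over binary anomaly labels."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)
    precision = tp / max(np.sum(y_pred), 1)
    recall = tp / max(np.sum(y_true), 1)
    return 0.0 if tp == 0 else 2 * precision * recall / (precision + recall)

labels      = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
predictions = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]   # detects one point of the first segment only
print(f1_score(labels, predictions))                        # raw F1
print(f1_score(labels, point_adjust(labels, predictions)))  # adjusted F1 (noticeably higher)
```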

Time Series Clustering

Paper Conference Year Code Used Datasets Key Contribution
Clustering Interval-Censored Time-Series for Disease Phenotyping AAAI 2022 - - present our method, SubLign, to learn latent representations of disease progression that correct for temporal misalignment in real-world observations and consider conditions for identifiability of subtype and alignment values.
Coresets for Time Series Clustering NeurIPS 2021 - synthetic data address the problem of constructing coresets for time series data generated from Gaussian mixture models with auto-correlations across time.
Temporal Phenotyping using Deep Predictive Clustering of Disease Progression ICML 2020 Code link UKCF, Alzheimer's Disease Neuroimaging Initiative (ADNI) defined novel loss functions to encourage each cluster to have homogeneous future outcomes and designed optimization procedures to avoid trivial solutions in identifying cluster assignments and the centroids.
Learning low-dimensional state embeddings and metastable clusters from time series data NeurIPS 2019 - simulated diffusion processes This idea also leads to a kernel reshaping method for more accurate nonparametric estimation of the transition function. State embedding can be used to cluster states into metastable sets, thereby identifying the slow dynamics. Sharp statistical error bounds and misclassification rates are proved.
Learning Representations for Time Series Clustering NeurIPS 2019 - - Here we propose a novel unsupervised temporal representation learning model, named Deep Temporal Clustering Representation (DTCR), which integrates the temporal reconstruction and K-means objective into the seq2seq model.

Time Series Segmentation

Paper Conference Year Code Used Datasets Key Contribution
ClaSP - Time Series Segmentation CIKM 2021 - 98 datasets.... a novel and highly accurate method for TSS. ClaSP hierarchically splits a TS into two parts, where each split point is determined by training a binary TS classifier for each possible split point and selecting the one with the highest accuracy
Multi-series Time-aware Sequence Partitioning for Disease Progression Modeling IJCAI 2021 - sEMG improved TICC by incorporating multi-series input (M-TICC) and time-awareness (MT-TICC).
Linear Time Complexity Time Series Clustering with Symbolic Pattern Forest IJCAI 2019 Code - This paper presents a novel time series clustering algorithm that has linear time complexity. The proposed algorithm partitions the data by checking some randomly selected symbolic patterns in the time series.
Similarity Preserving Representation Learning for Time Series Clustering IJCAI 2019 - - In this paper, we bridge this gap by proposing an efficient representation learning framework that is able to convert a set of time series with various lengths to an instance-feature matrix.

Others

Paper Conference Year Code Used Datasets Key Contribution
Adaptive Conformal Predictions for Time Series ICML 2022 code Uncertainty quantification of predictive models is crucial in decision-making problems. Conformal prediction is a general and theoretically sound answer. However, it requires exchangeable data, excluding time series. While recent works tackled this issue, we argue that Adaptive Conformal Inference (ACI, Gibbs and Candès, 2021), developed for distribution-shift time series, is a good procedure for time series with general dependency. We theoretically analyse the impact of the learning rate on its efficiency in the exchangeable and auto-regressive case. We propose a parameter-free method, AgACI, that adaptively builds upon ACI based on online expert aggregation. We lead extensive fair simulations against competing methods that advocate for ACI's use in time series. We conduct a real case study: electricity price forecasting. The proposed aggregation algorithm provides efficient prediction intervals for day-ahead forecasting. All the code and data to reproduce the experiments is made available.
Modeling Irregular Time Series with Continuous Recurrent Units ICML 2022 code Pendulum Images, Climate Data (USHCN), Electronic Health Records (Physionet) In many datasets (e.g. medical records) observation times are irregular and can carry important information. To address this challenge, we propose continuous recurrent units (CRUs) – a neural architecture that can naturally handle irregular intervals between observations.
Unsupervised Time-Series Representation Learning with Iterative Bilinear Temporal-Spectral Fusion ICML 2022 HAR, SleepEDF, ECG Waveform, ETT, Weather, SWaT, WADI, SMD, SMAP, MSL We devise a novel iterative bilinear temporal-spectral fusion to explicitly encode the affinities of abundant time-frequency pairs and iteratively refine representations in a fusion-and-squeeze manner with Spectrum-to-Time (S2T) and Time-to-Spectrum (T2S) Aggregation modules
Utilizing Expert Features for Contrastive Learning of Time-Series Representations ICML 2022 code HAR, SleepEDF, ECG Waveform We present an approach that incorporates expert knowledge for time-series representation learning. Our method employs expert features to replace the commonly used data transformations in previous contrastive learning approaches.
Neural Predicting Higher-order Patterns in Temporal Networks WWW 2022 Code Tags-math-sx, Tags-ask-ubuntu, Congress-bills, DAWN, Threads-ask-ubuntu we proposed the first model, HIT, to predict higher-order patterns in temporal hypergraphs to answer what type of, when, and why interactions may expand in a node triplet. HIT can be further generalized to predict even higher-order patterns.
ONBRA: Rigorous Estimation of the Temporal Betweenness Centrality in Temporal Networks WWW 2022 Code data, data2 In this work we present ONBRA, the first sampling-based approximation algorithm for estimating the temporal betweenness centrality values of the nodes in a temporal network, providing rigorous probabilistic guarantees on the quality of its output.
Knowledge-based Temporal Fusion Network for Interpretable Online Video Popularity Prediction WWW 2022 Code medium-video dataset and a micro-video dataset from the server logs of Xigua and Douyin In this paper, we propose a Knowledge-based Temporal Fusion Network (KTFN) that incorporates knowledge graph representation to address the aforementioned challenges in the task of online video popularity prediction.
STAM: A Spatiotemporal Aggregation Method for Graph Neural Network-based Recommendation WWW 2022 Code MovieLens, Amazon, Taobao In this work, we propose a spatiotemporal aggregation method STAM to efficiently incorporate temporal information into neighbor embedding learning.
A Graph Temporal Information Learning Framework for Popularity Prediction WWW 2022 Code Sina Weibo In this paper, we propose a graph temporal information learning framework based on an improved graph convolutional network (GTGCN), which can capture both the temporal information governing the spread of information in a snapshot, and the inherent temporal dependencies among different snapshots.
PREP: Pre-training with Temporal Elapse Inference for Popularity Prediction WWW 2022 Code Sina Weibo, Twitter We design a novel pretext task for pre-training, i.e., temporal elapse inference for two randomly sampled time slices of popularity dynamics, impelling the representation model to learn intrinsic knowledge about popularity dynamics.
Conditional Loss and Deep Euler Scheme for Time Series Generation AAAI 2022 - - -
TS2Vec: Towards Universal Representation of Time Series AAAI 2022 Code link 128 UCR datasets, 30 UEA datasets, 3 ETT datasets, Electricity, Yahoo dataset, KPI dataset Performs contrastive learning in a hierarchical way over augmented context views.
Time Masking for Temporal Language Models WSDM 2022 Code link - -
Long Short-Term Temporal Meta-learning in Online Recommendation WSDM 2022 - - -
Structure Meets Sequences: Predicting Network of Co-evolving Sequences WSDM 2022 Code link - -
EvoKG: Jointly Modeling Event Time and Network Structure for Reasoning over Temporal Knowledge Graphs WSDM 2022 Code link - -
Filling the Gaps: Multivariate Time Series Imputation by Graph Neural Networks ICLR 2022 - Air quality, Traffic, and Smart Grids -
PSA-GAN: Progressive Self-Attention GANs for Synthetic Time Series ICLR 2022 Code link, Code on GluonTS Electricity, M4, Solar energy, Traffic -
Generative Semi-Supervised Learning for Multivariate Time Series Imputation AAAI 2021 - - -
Temporal Cross-Effects in Knowledge Tracing WSDM 2021 - - -
Learning Dynamic Embeddings for Temporal Knowledge Graphs WSDM 2021 - - -
Temporal Meta-path Guided Explainable Recommendation WSDM 2021 - - -
Generative Adversarial Networks for Markovian Temporal Dynamics: Stochastic Continuous Data Generation ICML 2021 Code link - -
Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning KDD 2021 Code link - -
Time-series Generation by Contrastive Imitation NeurIPS 2021 - - -
Adjusting for Autocorrelated Errors in Neural Networks for Time Series NeurIPS 2021 Code link - -
Spikelet: An Adaptive Symbolic Approximation for Finding Higher-Level Structure in Time Series ICDM 2021 - - -
STING: Self-attention based Time-series Imputation Networks using GAN ICDM 2021 Code link - -
SMATE: Semi-Supervised Spatio-Temporal Representation Learning on Multivariate Time Series ICDM 2021 Code link - -
TCube: Domain-Agnostic Neural Time-series Narration ICDM 2021 Code link - -
Towards Interpretability and Personalization: A Predictive Framework for Clinical Time-series Analysis ICDM 2021 Code link - -
Continual Learning for Multivariate Time Series Tasks with Variable Input Dimensions ICDM 2021 Code link - -
CASPITA: Mining Statistically Significant Paths in Time Series Data from an Unknown Network ICDM 2021 Code link - -
Multi-way Time Series Join on Multi-length Patterns ICDM 2021 Code link - -
Temporal Event Profiling based on Multivariate Time Series Analysis over Long-term Document Archives SIGIR 2021 Code link - -
Time-Aware Multi-Scale RNNs for Time Series Modeling ICDM 2021 Code link - -
Deep reconstruction of strange attractors from time series NeurIPS 2020 Code link - -
One Detector to Rule Them All: Towards a General Deepfake Attack Detection Framework WWW 2021 Code link - -
High-recall causal discovery for autocorrelated time series with latent confounders NeurIPS 2020 Code link - -
Learning Long-Term Dependencies in Irregularly-Sampled Time Series NeurIPS 2020 Code link - -
ARMA Nets: Expanding Receptive Field for Dense Prediction NeurIPS 2020 Code link - -
Learnable Group Transform For Time-Series ICML 2020 Code link - -
Fast RobustSTL: Efficient and Robust Seasonal-Trend Decomposition for Time Series with Complex Patterns KDD 2020 - - -
Matrix Profile XXI: A Geometric Approach to Time Series Chains Improves Robustness KDD 2020 Code link - -
Multi-Source Deep Domain Adaptation with Weak Supervision for Time-Series Sensor Data KDD 2020 Code link - -
Personalized Imputation on Wearable-Sensory Time Series via Knowledge Transfer CIKM 2020 Code link - -
Hybrid Sequential Recommender via Time-aware Attentive Memory Network CIKM 2020 Code link - -
Order-Preserving Metric Learning for Mining Multivariate Time Series ICDM 2020 Code link - -
Fast Automatic Feature Selection for Multi-period Sliding Window Aggregate in Time Series ICDM 2020 - Tianchi, PLAsTiCC, NFL, MotionSense, Gas Sensors A framework for end-to-end automatic sliding-window aggregate feature selection for time series.
Matrix Profile XXII: Exact Discovery of Time Series Motifs Under DTW ICDM 2020 Code link - -
Inductive Granger Causal Modeling for Multivariate Time Series ICDM 2020 - Finance, FMRI, Synthetic data -
Mining Recurring Patterns in Real-Valued Time Series using the Radius Profile ICDM 2020 - - -
Learning Periods from Incomplete Multivariate Time Series ICDM 2020 - - -
FilCorr: Filtered and Lagged Correlation on Streaming Time Series ICDM 2020 - - -
Unsupervised Scalable Representation Learning for Multivariate Time Series NeurIPS 2019 Code Datasets -
Latent Ordinary Differential Equations for Irregularly-Sampled Time Series NeurIPS 2019 - Human Activity dataset We generalize RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model we call ODE-RNNs. Furthermore, we use ODE-RNNs to replace the recognition network of the recently proposed Latent ODE model (a minimal numeric sketch of the ODE-RNN idea follows this table).
GRU-ODE-Bayes: Continuous Modeling of Sporadically-Observed Time Series NeurIPS 2019 Code Datasets -
Interpolation-Prediction Networks for Irregularly Sampled Time Series ICLR 2019 Code MIMIC-III, UWaveGestureLibraryAll This paper presents a new framework for supervised learning in the presence of sparse and irregularly sampled time series. The proposed framework is fully modular.
SOM-VAE: Interpretable Discrete Representation Learning on Time Series ICLR 2019 - - The SOM-VAE can recover topologically interpretable state representations on time series and static data. It provides an improvement to standard methods in terms of clustering performance and offers a way to learn discrete two-dimensional representations of the data manifold in concurrence with the reconstruction task.
U-Time: A Fully Convolutional Network for Time Series Segmentation Applied to Sleep Staging NeurIPS 2019 Code Datasets -
E^2GAN: End-to-End Generative Adversarial Network for Multivariate Time Series Imputation IJCAI 2019 - - -
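
As a companion to the Adaptive Conformal Predictions entry above, here is a toy, self-contained sketch of the underlying ACI update (Gibbs and Candès, 2021): the working miscoverage level is nudged up or down online depending on whether the previous interval covered the realized value. The absolute-residual score, the fixed learning rate `gamma`, and the synthetic data are illustrative assumptions; this is not the AgACI aggregation procedure from the paper.

```python
import numpy as np

def adaptive_conformal_intervals(calib_scores, y, point_preds, alpha=0.1, gamma=0.02):
    """Toy online ACI loop. calib_scores: initial |y - yhat| calibration scores;
    y / point_preds: realized values and point forecasts arriving one step at a time."""
    alpha_t = alpha
    scores = list(calib_scores)
    intervals = []
    for yhat, y_true in zip(point_preds, y):
        # interval half-width from the empirical (1 - alpha_t) quantile of past scores
        q = np.quantile(scores, min(max(1.0 - alpha_t, 0.0), 1.0))
        lo, hi = yhat - q, yhat + q
        intervals.append((lo, hi))
        err = 0.0 if lo <= y_true <= hi else 1.0   # miscoverage indicator
        alpha_t = alpha_t + gamma * (alpha - err)  # the ACI update rule
        scores.append(abs(y_true - yhat))          # grow the calibration set online
    return intervals

rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = np.zeros(200)                       # trivial point forecaster, for illustration
calib = np.abs(rng.normal(size=50))         # initial calibration scores
ivs = adaptive_conformal_intervals(calib, y, preds, alpha=0.1, gamma=0.02)
```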
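
The Latent ODE / ODE-RNN entry above can likewise be made concrete with a minimal numeric sketch: between irregularly spaced observations the hidden state follows an ODE (integrated here with forward Euler), and at each observation a standard RNN update is applied. The random untrained weights, the tanh dynamics, and the Euler step size are purely illustrative assumptions rather than the paper's architecture.

```python
import numpy as np

def ode_rnn_sketch(times, values, hidden_dim=8, rng=np.random.default_rng(0)):
    """Untrained ODE-RNN-style forward pass over an irregularly sampled series."""
    Wh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # dynamics / recurrent weights
    Wx = rng.normal(scale=0.1, size=(hidden_dim, 1))           # input weights
    h = np.zeros(hidden_dim)
    t_prev = times[0]
    for t, x in zip(times, values):
        # continuous-time evolution between observations: dh/dt = tanh(Wh @ h)
        n_steps = max(1, int((t - t_prev) / 0.1))
        dt = (t - t_prev) / n_steps
        for _ in range(n_steps):
            h = h + dt * np.tanh(Wh @ h)
        # discrete RNN update at the observation time
        h = np.tanh(Wh @ h + Wx @ np.array([x]))
        t_prev = t
    return h

h_final = ode_rnn_sketch(times=[0.0, 0.3, 1.7, 2.0], values=[0.5, 0.1, -0.2, 0.9])
```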

πŸ“ Time Series Libraries

Name Company Stars Explanation
πŸ“š Darts Unit8 ⭐️ 5.3K Darts is a Python library for user-friendly forecasting and anomaly detection on time series. It contains a variety of models, from classics such as ARIMA to deep neural networks. The forecasting models can all be used in the same way, using fit() and predict() functions, similar to scikit-learn. The library also makes it easy to backtest models, combine the predictions of several models, and take external data into account. Darts supports both univariate and multivariate time series and models. The ML-based models can be trained on potentially large datasets containing multiple time series, and some of the models offer rich support for probabilistic forecasting. (See the usage sketch after this table.)
πŸ“š Prophet Meta (Facebook) ⭐️ 15.5K Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well. (See the usage sketch after this table.)
πŸ“š Neural Prophet - ⭐️ 2.8K A Neural Network based Time-Series model, inspired by Facebook Prophet and AR-Net, built on PyTorch.
πŸ“š GluonTS AWS ⭐️ 3.3K GluonTS is a Python package for probabilistic time series modeling, focusing on deep learning based models, based on PyTorch and MXNet.
πŸ“š stumpy TD Ameritrade ⭐️ 2.5K STUMPY is a powerful and scalable Python library that efficiently computes something called the matrix profile, which is just an academic way of saying "for every subsequence within your time series, automatically identify its corresponding nearest neighbor". What's important is that once you've computed your matrix profile, it can then be used for a variety of time series data mining tasks. (See the usage sketch after this table.)
πŸ“š tsfresh Blue Yonder GmbH ⭐️ 7K The package provides systematic time-series feature extraction by combining established algorithms from statistics, time-series analysis, signal processing, and nonlinear dynamics with a robust feature selection algorithm. In this context, the term time-series is interpreted in the broadest possible sense, such that any types of sampled data or even event sequences can be characterised. (See the usage sketch after this table.)
πŸ“š SKTIME - ⭐️ 6.1K sktime is a library for time series analysis in Python. It provides a unified interface for multiple time series learning tasks. Currently, this includes time series classification, regression, clustering, annotation and forecasting. It comes with time series algorithms and scikit-learn compatible tools to build, tune and validate time series models.
πŸ“š pmdarima - ⭐️ 1.3K Pmdarima (originally pyramid-arima, for the anagram of 'py' + 'arima') is a statistical library designed to fill the void in Python's time series analysis capabilities.
πŸ“š tslearn - ⭐️ 2.4K The machine learning toolkit for time series analysis in Python.
πŸ“š PyTorch Forecasting - ⭐️ 2.6K PyTorch Forecasting is a PyTorch-based package for forecasting time series with state-of-the-art network architectures. It provides a high-level API for training networks on pandas data frames and leverages PyTorch Lightning for scalable training on (multiple) GPUs, CPUs and for automatic logging.
πŸ“š StatsForecast - ⭐️ 2.2K StatsForecast offers a collection of widely used univariate time series forecasting models, including automatic ARIMA, ETS, CES, and Theta modeling optimized for high performance using numba. It also includes a large battery of benchmarking models.
πŸ“š Streamz - ⭐️ 1.1K Streamz helps you build pipelines to manage continuous streams of data. It is simple to use in simple cases, but also supports complex pipelines that involve branching, joining, flow control, feedback, back pressure, and so on.
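
To make the fit()/predict() workflow mentioned in the Darts row concrete, here is a minimal forecasting sketch. The CSV path and the "date"/"value" column names are hypothetical placeholders.

```python
import pandas as pd
from darts import TimeSeries
from darts.models import ExponentialSmoothing

# hypothetical CSV with "date" and "value" columns
df = pd.read_csv("air_passengers.csv")
df["date"] = pd.to_datetime(df["date"])
series = TimeSeries.from_dataframe(df, time_col="date", value_cols="value")

train, val = series[:-36], series[-36:]        # hold out the last 36 points
model = ExponentialSmoothing()
model.fit(train)                               # scikit-learn-like fit/predict API
forecast = model.predict(len(val))             # 36-step-ahead forecast as a TimeSeries
```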
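
A similarly minimal Prophet sketch: the input frame is a hypothetical daily series and must contain the "ds" (dates) and "y" (values) columns that Prophet expects.

```python
import pandas as pd
from prophet import Prophet   # published as "fbprophet" in older releases

df = pd.read_csv("daily_sales.csv")            # hypothetical file with "ds" and "y" columns
m = Prophet(yearly_seasonality=True, weekly_seasonality=True)
m.fit(df)

future = m.make_future_dataframe(periods=30)   # extend 30 days beyond the history
forecast = m.predict(future)                   # includes yhat, yhat_lower, yhat_upper, trend
```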
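
For stumpy, the matrix profile described in the table reduces to a single call; the random walk series and the window length below are stand-in assumptions.

```python
import numpy as np
import stumpy

ts = np.random.default_rng(0).normal(size=1_000).cumsum()   # stand-in for a real series
m = 50                                                       # subsequence (window) length

mp = stumpy.stump(ts, m)        # matrix profile: one row per subsequence
# column 0: distance to the nearest-neighbor subsequence,
# column 1: index of that neighbor
profile = mp[:, 0].astype(float)
motif_idx = int(np.argmin(profile))     # most repeated pattern (low distance)
discord_idx = int(np.argmax(profile))   # most anomalous subsequence (high distance)
```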
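
And for tsfresh, systematic feature extraction runs on a long-format frame with an id column, a sort column, and a value column; the frame below is a made-up two-series example.

```python
import pandas as pd
from tsfresh import extract_features

# long format: one row per observation, with a series id and a time index
df_long = pd.DataFrame({
    "id":    [0] * 50 + [1] * 50,
    "time":  list(range(50)) * 2,
    "value": [float(v) for v in range(100)],
})

# one row of extracted features per series id (hundreds of columns by default)
features = extract_features(df_long, column_id="id", column_sort="time")
```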

πŸ“ Time Series Benchmarks and Datasets

Paper Conference Year Code Key Contribution
Monash Time Series Forecasting Repository NeurIPS 2021 paper link Many deep time series models have been evaluated on the same few datasets in recent years. Even though this works for basic benchmarking, it may not hold up when applied to a variety of temporal tasks. The repository's goal is to create a "master list" of different time series datasets and serve as an authoritative benchmark. Over 20 different datasets are included, spanning industries as diverse as health, retail, ride-share, and demographics.
Revisiting Time Series Outlier Detection: Definitions and Benchmarks NeurIPS 2021 link This paper critiques many existing time series anomaly/outlier detection datasets and proposes 35 brand-new synthetic datasets and 4 real-world datasets for benchmarking purposes.
Subseasonal Forecasting Microsoft Microsoft 2021 link Microsoft has released a dataset to facilitate machine learning for improving subseasonal forecasting (e.g. two to six weeks in the future). Forecasting subseasonally helps government agencies and farmers prepare for weather events. In general, deep learning models performed quite poorly compared to other methods in Microsoft's benchmark. A simple feed-forward model proved to be the most accurate DL model, while the Informer performed poorly.

Contributing

We appreciate all contributions to improve this paper repo!

Please feel free to open a pull request, open an issue, or send me an email (ailingzengzzz@gmail.com) to add awesome papers.