
Time Series

Time-Series Representation Learning

  • TF-C: Time-Frequency Consistency (TF-C) Model (Harvard MIMS Lab, NeurIPS 2022) - A cross-domain time-series representation model that uses contrastive learning between time-domain and frequency-domain views of the same signal (see the augmentation/view sketch after this list). By enforcing consistent embeddings across both domains, TF-C learns general features that transfer to diverse sensors (EEG, accelerometer, etc.). Weights are available.
  • TS-TCC: Time-Series Representation Learning via Temporal and Contextual Contrasting: contrastive learning. Augmentations: jittering, amplitude scaling, and permutation (see the augmentation sketch after this list).
  • TimesURL: Learning Universal Representations of Time Series via Contrastive Learning: combines contrastive learning with MAE-style reconstruction, building on TS2Vec. Augmentation: left-right cropping with frequency swapping to construct negative examples. Hard negatives are built at both the temporal and the instance level: temporal hard negatives mix segments of the same instance across different time steps, while instance-level hard negatives mix different instances.
  • TS2Vec: Towards Universal Representation of Time Series: contrastive. Augmentation: random cropping (two overlapping crops of the same series).
  • CPC: Representation Learning with Contrastive Predictive Coding: contrastive-learning methods such as SimCLR, MoCo, SwAV, and DINO build on the contrastive (InfoNCE) loss introduced in this paper. The positives are the model's predictions of future latent representations (see the InfoNCE sketch after this list).
  • TNC (Temporal Neighborhood Coding): triplet-style objective, with windows from the temporal neighborhood as positives and more distant windows as negatives.
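
A rough sketch of the augmentations named above (jitter, amplitude scaling, permutation, cropping) plus a simple FFT magnitude spectrum as a frequency-domain view in the spirit of TF-C. Function names and default values are illustrative assumptions, not taken from the papers' code.

```python
import numpy as np

def jitter(x, sigma=0.03):
    """Add Gaussian noise to a (time, channels) array."""
    return x + np.random.normal(0.0, sigma, size=x.shape)

def amplitude_scale(x, sigma=0.1):
    """Scale each channel by a random factor around 1."""
    factors = np.random.normal(1.0, sigma, size=(1, x.shape[1]))
    return x * factors

def permute_segments(x, n_segments=5):
    """Split the series into segments and shuffle their order."""
    segments = np.array_split(x, n_segments, axis=0)
    np.random.shuffle(segments)
    return np.concatenate(segments, axis=0)

def random_crop(x, crop_len):
    """Take a random contiguous crop of length crop_len."""
    start = np.random.randint(0, x.shape[0] - crop_len + 1)
    return x[start:start + crop_len]

def frequency_view(x):
    """Magnitude spectrum of each channel (a simple frequency-domain view)."""
    return np.abs(np.fft.rfft(x, axis=0))

# Example: two views of the same window for a contrastive objective.
x = np.random.randn(128, 3)               # (time steps, channels)
view_time = permute_segments(jitter(x))   # time-domain augmentation
view_freq = frequency_view(x)             # frequency-domain view (TF-C style)
```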
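A minimal InfoNCE loss sketch, the objective introduced with CPC and reused by SimCLR/MoCo-style methods. Tensor shapes and the temperature value are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def info_nce(queries, keys, temperature=0.1):
    """queries, keys: (batch, dim); keys[i] is the positive for queries[i],
    all other keys in the batch serve as negatives."""
    q = F.normalize(queries, dim=1)
    k = F.normalize(keys, dim=1)
    logits = q @ k.t() / temperature          # (batch, batch) similarity matrix
    targets = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, targets)   # positives lie on the diagonal

# Example: embeddings of two augmented views of the same batch of windows.
z1, z2 = torch.randn(32, 128), torch.randn(32, 128)
loss = info_nce(z1, z2)
```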

Novel Category Discovery

  • GCD: Generalized Category Discovery. Uses supervised contrastive loss on labeled data and self-supervised contrastive loss on all data for representation learning, then semi-supervised k-means that forces labeled samples to stay in their class clusters. Hungarian matching between cluster assignments and labels gives a clustering accuracy on the labeled data, and this accuracy is used to estimate the number of clusters (see the sketch below).
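
A sketch of how clustering accuracy on the labeled subset can be computed with Hungarian matching and used to pick the number of clusters. This is a generic implementation under plain k-means, not the GCD authors' code (GCD uses a semi-supervised k-means that pins labeled points to their class clusters).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans

def hungarian_accuracy(y_true, y_pred):
    """Clustering accuracy under the best one-to-one cluster-to-label mapping."""
    n = max(y_true.max(), y_pred.max()) + 1
    cost = np.zeros((n, n), dtype=int)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1                                   # co-occurrence counts
    row, col = linear_sum_assignment(cost.max() - cost)   # maximise matched counts
    return cost[row, col].sum() / len(y_true)

# Example: choose the k that maximises accuracy on the labeled embeddings.
X_lab = np.random.randn(200, 16)            # labeled embeddings (illustrative)
y_lab = np.random.randint(0, 5, size=200)   # their class labels
for k in (5, 8, 10):
    pred = KMeans(n_clusters=k, n_init=10).fit_predict(X_lab)
    print(k, hungarian_accuracy(y_lab, pred))
```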

General Time-Series Models

MOIRAI-MOE: Time-series foundation model that uses a mixture of experts to route inputs of different data frequencies to different experts. It is built upon MOIRAI. Other time-series foundation models are Moment, MOIRAI, Chronos, PatchTST, TimesFM, Lag-Llama, and TimeGPT-1.
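
A generic top-1 mixture-of-experts layer to illustrate the routing idea; this is a toy sketch, not the MOIRAI-MOE architecture, whose experts and gating are specific to that paper.

```python
import torch
import torch.nn as nn

class SimpleMoE(nn.Module):
    def __init__(self, dim, n_experts=4, hidden=128):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (tokens, dim)
        weights = self.gate(x).softmax(dim=-1)  # routing probabilities
        top_w, top_idx = weights.max(dim=-1)    # top-1 expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                out[mask] = top_w[mask, None] * expert(x[mask])
        return out

# Example: route 64 patch embeddings of dimension 32 through the experts.
layer = SimpleMoE(dim=32)
y = layer(torch.randn(64, 32))
```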

Autoformer, Informer, and Reformer target long-term forecasting. Some of these methods are provided in HuggingFace Time Series Models. Are Transformers Effective for Time Series Forecasting? argues that Transformers are not needed and that simple linear models can be competitive.
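
A sketch of the kind of one-layer linear forecaster that the paper above shows can be competitive with Transformer models; the look-back and horizon lengths are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LinearForecaster(nn.Module):
    """Maps a look-back window directly to the forecast horizon, per channel."""
    def __init__(self, lookback=96, horizon=24):
        super().__init__()
        self.proj = nn.Linear(lookback, horizon)

    def forward(self, x):  # x: (batch, lookback, channels)
        # Apply the same linear map to each channel independently.
        return self.proj(x.transpose(1, 2)).transpose(1, 2)  # (batch, horizon, channels)

model = LinearForecaster()
forecast = model(torch.randn(8, 96, 7))   # e.g. a 7-variate series
```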

Discriminative Representation

Representations that can be used in GCD (Generalized Category Discovery) methods (GCD, SelEx).

Candidates include contrastive learning, sparse autoencoders, or older methods such as DEC (Deep Embedded Clustering) and SOM (Self-Organizing Maps).
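
A minimal sketch of the DEC clustering objective: soft assignments to cluster centroids with a Student's t kernel, sharpened into a target distribution and matched with a KL divergence. The embeddings and centroids below are illustrative placeholders.

```python
import torch

def dec_soft_assignments(z, centroids, alpha=1.0):
    """q_ij: probability of embedding i belonging to cluster j."""
    dist2 = torch.cdist(z, centroids) ** 2
    q = (1.0 + dist2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def dec_target_distribution(q):
    """Sharpen q to emphasise high-confidence assignments."""
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

z = torch.randn(256, 16)         # encoder embeddings
centroids = torch.randn(10, 16)  # cluster centres (e.g. from a k-means init)
q = dec_soft_assignments(z, centroids)
p = dec_target_distribution(q)
kl_loss = torch.nn.functional.kl_div(q.log(), p, reduction="batchmean")
```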

Characteristics of Time Series

Implicit Reasoning in Deep Time Series Forecasting: It is observed that certain linear, MLP-based, and patch-based Transformer models generalize effectively in carefully structured out-of-distribution scenarios, suggesting underexplored reasoning capabilities beyond simple pattern memorization.