Time Series
Time-Series Representation Learning
- TF-C: Time-Frequency Consistency (Harvard MIMS Lab, NeurIPS 2022): a cross-domain time-series representation model that applies contrastive learning between time-domain and frequency-domain views of the same signal. By enforcing consistent embeddings in both domains, TF-C learns general features transferable to diverse sensors (EEG, accelerometer, etc.). Pretrained weights are available.
- TS-TCC: Time-Series Representation Learning via Temporal and Contextual Contrasting: contrastive. Augmentations: jitter, amplitude scaling, and permutation (see the augmentation/loss sketch after this list).
- TimesURL: Learning Universal Representations of Time Series via Contrastive Learning: contrastive plus reconstruction (MAE-style), built on the TS2Vec model. Augmentation: left-right cropping with frequency swapping to form the negative example. Hard negatives are constructed at both the temporal and the instance level: the temporal variant mixes segments from different time steps, while the instance-based variant mixes different instances.
- TS2Vec: Towards Universal Representation of Time Series. Augmentation: cropping.
- CPC: Representation Learning with Contrastive Predictive Coding: contrastive-learning works such as SimCLR, MoCo, SwAV, and DINO build on the contrastive (InfoNCE) loss introduced in this paper. The positives are the predicted future latents.
- TNC (Temporal Neighborhood Coding): triplet-style loss (temporal neighbors as positives, distant windows as negatives).
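Most of the methods above combine simple augmentations with an InfoNCE-style contrastive loss. Below is a minimal sketch (function names and hyperparameters are my own choices, not taken from any of these papers) of the jitter/scaling/permutation augmentations used by TS-TCC and the InfoNCE loss that CPC introduced:

```python
import torch
import torch.nn.functional as F

def jitter(x, sigma=0.03):
    """Add Gaussian noise per time step (x: [batch, time, channels])."""
    return x + sigma * torch.randn_like(x)

def scale(x, sigma=0.1):
    """Randomly scale each channel's amplitude around 1."""
    factor = 1.0 + sigma * torch.randn(x.size(0), 1, x.size(2), device=x.device)
    return x * factor

def permute(x, n_segments=4):
    """Split the time axis into segments and shuffle their order."""
    segments = torch.chunk(x, n_segments, dim=1)
    order = torch.randperm(len(segments))
    return torch.cat([segments[i] for i in order], dim=1)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE: matching rows of z1/z2 are positives, all other pairs
    in the batch are negatives (z1, z2: [batch, dim] embeddings)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = (z1 @ z2.t()) / temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)
```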
Generalized Category Discovery (GCD)
- GCD: Generalized Category Discovery. Uses supervised contrastive learning on the labeled data plus self-supervised contrastive learning on all data for representation learning. Then runs semi-supervised k-means, forcing each labeled sample to stay in its class's cluster. Hungarian matching defines the clustering accuracy on the labeled data, and the number of clusters is chosen to maximize this accuracy (a sketch of both steps follows).
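A rough sketch of those two steps, assuming the known classes are mapped to cluster ids 0..n_classes-1; helper names are mine, and this is an illustration, not the official GCD code:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def semi_supervised_kmeans(X, y, labeled_mask, n_clusters, n_iter=50, seed=0):
    """k-means in which labeled points are pinned to the cluster of their class."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)].copy()
    for _ in range(n_iter):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = dists.argmin(1)
        assign[labeled_mask] = y[labeled_mask]   # constraint on labeled data
        for k in range(n_clusters):
            if (assign == k).any():
                centers[k] = X[assign == k].mean(0)
    return assign

def hungarian_accuracy(y_true, y_pred, n_clusters):
    """Match clusters to labels one-to-one, then report accuracy."""
    contingency = np.zeros((n_clusters, n_clusters))
    for t, p in zip(y_true, y_pred):
        contingency[p, t] += 1
    row, col = linear_sum_assignment(contingency.max() - contingency)
    return contingency[row, col].sum() / len(y_true)
```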
Imbalanced Generalized Category Discovery (GCD)
- SimGCD: Clustering is trained jointly with the representation-learning network. This is not an imbalanced-data setting; however, an entropy regularizer is used to prevent the model from over-predicting certain classes. That issue is one of imbalanced prediction, which arises during joint learning of clustering and representation but not in separate training as done in the original GCD method (see the regularizer sketch after this list). MASA: Multi-Activity Sequence Alignment via Implicit Clustering is related in terms of parametric clustering, though it addresses a different task.
- LegoGCD
- AGCD
- BaCon
- Generalized Category Discovery under the Long-Tailed Distribution
- DebiasGCD
- Long-tailed GCD
- ImbaGCD
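A minimal sketch of the mean-entropy regularizer idea used in SimGCD; the exact form and weighting in the paper may differ:

```python
import torch

def mean_entropy_regularizer(logits):
    """Entropy of the batch-averaged prediction; returning its negative lets
    you *add* this term to the loss, pushing the average class distribution
    toward uniform and preventing over-prediction of a few classes."""
    mean_probs = torch.softmax(logits, dim=1).mean(dim=0)
    entropy = -(mean_probs * torch.log(mean_probs + 1e-8)).sum()
    return -entropy

# usage: loss = supervised_loss + unsupervised_loss + weight * mean_entropy_regularizer(logits)
```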
General Time-Series Models
- MOIRAI-MoE: a time-series foundation model that uses a mixture of experts so that inputs with different data frequencies are routed to different experts. It is built upon MOIRAI. Other time-series foundation models are Moment, MOIRAI, Chronos, PatchTST, TimesFM, Lag-Llama, and TimeGPT-1. A toy routing sketch follows.
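This is not MOIRAI-MoE's actual architecture, just a toy PyTorch illustration (all names my own) of top-1 expert routing, the mechanism that lets experts specialize, e.g., per sampling frequency:

```python
import torch
import torch.nn as nn

class ToyMoE(nn.Module):
    """Top-1 mixture-of-experts layer: a learned gate routes each input
    to one expert, so experts can specialize on subsets of the data."""
    def __init__(self, dim, n_experts=4):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: [batch, dim]
        weights = torch.softmax(self.gate(x), dim=-1)
        top_w, top_idx = weights.max(dim=-1)    # top-1 routing
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            sel = top_idx == i
            if sel.any():
                out[sel] = top_w[sel].unsqueeze(1) * expert(x[sel])
        return out
```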
- Autoformer, Informer, and Reformer address long-term forecasting. Some of these methods are provided in HuggingFace Time Series Models. "Are Transformers Effective for Time Series Forecasting?" argues that Transformers are not needed, showing that a simple linear model over the lookback window (DLinear) is competitive; a sketch follows.
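A rough univariate sketch in the spirit of that paper's DLinear baseline (the official implementation differs in details such as padding and per-channel handling):

```python
import torch
import torch.nn as nn

class DLinearStyle(nn.Module):
    """Trend/seasonal decomposition plus one linear layer each: forecast
    the moving-average trend and the residual separately, then sum."""
    def __init__(self, lookback, horizon, kernel=25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel, stride=1, padding=kernel // 2,
                                count_include_pad=False)
        self.trend = nn.Linear(lookback, horizon)
        self.seasonal = nn.Linear(lookback, horizon)

    def forward(self, x):                         # x: [batch, lookback]
        trend = self.avg(x.unsqueeze(1)).squeeze(1)   # moving-average trend
        seasonal = x - trend                          # residual component
        return self.trend(trend) + self.seasonal(seasonal)
```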
Discriminative Representation
Representations that can be used in GCD (Generalized Category Discovery) methods (e.g., GCD, SelEx).
Candidates: contrastive learning, sparse autoencoders, or older methods such as DEC (Deep Embedded Clustering) and SOM (Self-Organizing Maps). A sketch of DEC's objective follows.
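The equations below follow the DEC paper (Student-t soft assignment and sharpened target distribution); the helper names are my own:

```python
import torch

def dec_soft_assignment(z, centers, alpha=1.0):
    """DEC's Student-t soft assignment q_ij of embedding i to centroid j."""
    d2 = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def dec_target_distribution(q):
    """Sharpened target p_ij; training minimizes KL(p || q) to refine clusters."""
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

# one refinement step:
# q = dec_soft_assignment(encoder(x), centers)
# loss = torch.nn.functional.kl_div(q.log(), dec_target_distribution(q).detach(),
#                                   reduction="batchmean")
```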
Characteristics of Time Series
Implicit Reasoning in Deep Time Series Forecasting: It is observed that certain linear, MLP-based, and patch-based Transformer models generalize effectively in carefully structured out-of-distribution scenarios, suggesting underexplored reasoning capabilities beyond simple pattern memorization.