We collected 158 time series anomaly detection algorithms from different research areas: Deep Learning, Statistics, Outlier Detection, Signal Analysis, Classic Machine Learning, Data Mining, and Stochastic Learning. Some of these algorithms can also detect anomalies in multivariate (multidimensional) time series.
Overview
We implemented 71 algorithms of the total collection and used them in our evaluation. The following table lists all collected algorithms with their research area and supported data type; the column Implemented indicates which algorithms are part of the evaluation.
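The data type column distinguishes detectors that only handle a single channel (univariate) from those that accept several channels (multivariate). As a minimal illustration of one common array-layout convention for these two cases (the shapes below are our assumption for this sketch, not a prescribed input format):

```python
import numpy as np

# Univariate series: one value per timestamp -> shape (n,)
univariate = np.sin(np.linspace(0, 20 * np.pi, 1000))

# Multivariate series: d channels per timestamp -> shape (n, d)
multivariate = np.column_stack([
    univariate,
    np.random.default_rng(42).normal(size=1000),  # a second, noisy channel
])

print(univariate.shape)    # (1000,)
print(multivariate.shape)  # (1000, 2)
```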
| Method Name | Research Area | Method data type | Implemented |
|---|---|---|---|
| AE | Deep Learning | multivariate | True |
| ARIMA | Statistics (Regression & Forecasting) | univariate | True |
| Bagel | Deep Learning | univariate | True |
| CBLOF | Outlier Detection | multivariate | True |
| COF | Outlier Detection | multivariate | True |
| COPOD | Outlier Detection | multivariate | True |
| DAE | Deep Learning | multivariate | True |
| DBStream | Outlier Detection | multivariate | True |
| DSPOT | Statistics (Regression & Forecasting) | univariate | True |
| DWT-MLEAD | Signal Analysis | univariate | True |
| DeepAnT | Deep Learning | multivariate | True |
| DeepNAP | Deep Learning | multivariate | True |
| Donut | Deep Learning | univariate | True |
| EIF | Classic Machine Learning | multivariate | True |
| EncDec-AD | Deep Learning | multivariate | True |
| Ensemble GI | Data Mining | univariate | True |
| FFT | Signal Analysis | univariate | True |
| Fast-MCD | Statistics (Regression & Forecasting) | multivariate | True |
| GrammarViz | Data Mining | univariate | True |
| HBOS | Classic Machine Learning | multivariate | True |
| HIF | Outlier Detection | multivariate | True |
| HOT SAX | Data Mining | univariate | True |
| HealthESN | Deep Learning | univariate | True |
| Hybrid KNN | Deep Learning | multivariate | True |
| IE-CAE | Deep Learning | univariate | True |
| IF-LOF | Outlier Detection | multivariate | True |
| KNN | Classic Machine Learning | multivariate | True |
| LOF | Outlier Detection | multivariate | True |
| LSTM-AD | Deep Learning | multivariate | True |
| LSTM-VAE | Deep Learning | multivariate | True |
| LaserDBN | Stochastic Learning | multivariate | True |
| Left STAMPi | Data Mining | univariate | True |
| MSCRED | Deep Learning | multivariate | True |
| MTAD-GAT | Deep Learning | multivariate | True |
| MedianMethod | Statistics (Regression & Forecasting) | univariate | True |
| MultiHMM | Stochastic Learning | multivariate | True |
| NormA | Data Mining | univariate | True |
| Normalizing Flows | Deep Learning | multivariate | True |
| NoveltySVR | Classic Machine Learning | univariate | True |
| NumentaHTM | Deep Learning | univariate | True |
| OceanWNN | Deep Learning | univariate | True |
| OmniAnomaly | Deep Learning | multivariate | True |
| PCC | Classic Machine Learning | multivariate | True |
| PCI | Statistics (Regression & Forecasting) | univariate | True |
| PS-SVM | Classic Machine Learning | univariate | True |
| PST | Data Mining | univariate | True |
| RBForest | Classic Machine Learning | multivariate | True |
| RForest | Classic Machine Learning | univariate | True |
| RobustPCA | Classic Machine Learning | multivariate | True |
| S-H-ESD | Statistics (Regression & Forecasting) | univariate | True |
| SAND | Data Mining | univariate | True |
| SARIMA | Statistics (Regression & Forecasting) | univariate | True |
| SR | Signal Analysis | univariate | True |
| SR-CNN | Deep Learning | univariate | True |
| SSA | Data Mining | univariate | True |
| STAMP | Data Mining | univariate | True |
| STOMP | Data Mining | univariate | True |
| Series2Graph | Data Mining | univariate | True |
| Sub-Fast-MCD | Statistics (Regression & Forecasting) | univariate | True |
| Sub-IF | Outlier Detection | univariate | True |
| Sub-LOF | Outlier Detection | univariate | True |
| TARZAN | Data Mining | univariate | True |
| TAnoGAN | Deep Learning | multivariate | True |
| TSBitmap | Data Mining | univariate | True |
| Telemanom | Deep Learning | multivariate | True |
| Torsk | Deep Learning | multivariate | True |
| Triple ES | Statistics (Regression & Forecasting) | univariate | True |
| VALMOD | Data Mining | univariate | True |
| XGBoosting | Classic Machine Learning | univariate | True |
| iForest | Outlier Detection | multivariate | True |
| k-Means | Classic Machine Learning | multivariate | True |
| AD-LTI | Deep Learning | multivariate | False |
| AMD Segmentation | Statistics (Regression & Forecasting) | univariate | False |
| ANODE | Statistics (Regression & Forecasting) | | False |
| AOSVM | Classic Machine Learning | univariate | False |
| AR | Statistics (Regression & Forecasting) | univariate | False |
| ARMA | Statistics (Regression & Forecasting) | univariate | False |
| BLOF | Outlier Detection | | False |
| BoehmerGraph | Data Mining | | False |
| Box Plot | Statistics (Regression & Forecasting) | univariate | False |
| CHEB | Statistics (Regression & Forecasting) | univariate | False |
| CoalESN | Deep Learning | | False |
| ConInd | Statistics (Regression & Forecasting) | multivariate | False |
| CxDBN | Stochastic Learning | multivariate | False |
| DAD | Data Mining | univariate | False |
| DADS | Data Mining | univariate | False |
| DILOF | Outlier Detection | multivariate | False |
| Deep K-Means | Deep Learning | | False |
| Deep OCSVM | Deep Learning | | False |
| DeepLSTM | Deep Learning | univariate | False |
| DeepPCA | Deep Learning | | False |
| DissimilarityAlgo | Data Mining | univariate | False |
| Double ES (Holt's) | Statistics (Regression & Forecasting) | | False |
| Dynamic State Estimator (DSE) | Statistics (Regression & Forecasting) | | False |
| EDBN | Stochastic Learning | | False |
| EM-HMM | Stochastic Learning | | False |
| EWMA-STR | Statistics (Regression & Forecasting) | | False |
| Eros-SVMs | Classic Machine Learning | multivariate | False |
| FuzzyDNBC | Stochastic Learning | multivariate | False |
| GLA | Stochastic Learning | univariate | False |
| GeckoFSM | Outlier Detection | multivariate | False |
| GridLOF | Outlier Detection | | False |
| HMAD | Stochastic Learning | univariate | False |
| HSDE | Outlier Detection | univariate | False |
| HSMM | Stochastic Learning | univariate | False |
| Hybrid K-means | Classic Machine Learning | | False |
| I-HMM | Stochastic Learning | univariate | False |
| ILOF | Data Mining | multivariate | False |
| K-LOF | Outlier Detection | | False |
| KNN (PTSA) | Classic Machine Learning | | False |
| Kalman Filter | Statistics (Regression & Forecasting) | | False |
| KnorrSeq2 | Data Mining | | False |
| LAMP (GPU) | Deep Learning | | False |
| LOCI/aLOCI | Outlier Detection | multivariate | False |
| LSTM (PTSA) | Deep Learning | | False |
| LSTM-based VAE-GAN | Deep Learning | multivariate | False |
| MA | Statistics (Regression & Forecasting) | univariate | False |
| MAD-GAN | Deep Learning | multivariate | False |
| MCD | Statistics (Regression & Forecasting) | | False |
| MCOD | Outlier Detection | univariate | False |
| MERLIN | Data Mining | univariate | False |
| MGDD | Statistics (Regression & Forecasting) | multivariate | False |
| MS-SVDD | Classic Machine Learning | multivariate | False |
| MoteESN | Deep Learning | | False |
| MultiHTM | Deep Learning | multivariate | False |
| NetworkSVM | Classic Machine Learning | multivariate | False |
| NorM | Data Mining | univariate | False |
| NorM (SAD) | Data Mining | | False |
| OC-KFD | Classic Machine Learning | multivariate | False |
| Online DWT-MLEAD | Signal Analysis | univariate | False |
| PAD | Deep Learning | univariate | False |
| PCA | Classic Machine Learning | multivariate | False |
| Poly (PTSA) | Statistics (Regression & Forecasting) | univariate | False |
| RADM | Deep Learning | multivariate | False |
| RPIF | | | False |
| RUSBoost | Classic Machine Learning | univariate | False |
| RePAD | Statistics (Regression & Forecasting) | univariate | False |
| Robust Deep AutoEncoder | Deep Learning | | False |
| S-SVM | Classic Machine Learning | univariate | False |
| SALOF | Outlier Detection | | False |
| SCRIMP++ | Data Mining | | False |
| SH-ESD+ | Statistics (Regression & Forecasting) | univariate | False |
| SLADE-MTS | Classic Machine Learning | | False |
| SLADE-TS | Classic Machine Learning | | False |
| STORN | Deep Learning | univariate | False |
| Simple ES (EWMA) | Statistics (Regression & Forecasting) | | False |
| SmartSifter | Stochastic Learning | multivariate | False |
| Sparse AutoEncoder | Deep Learning | | False |
| Structured Denoising AutoEncoder (StrDAE) | Deep Learning | | False |
| SurpriseEncoding | Data Mining | multivariate | False |
| TCN-AE | Deep Learning | univariate | False |
| TOLF | Outlier Detection | | False |
| TwoFinger | Data Mining | univariate | False |
| U-GMM-HMM | Stochastic Learning | univariate | False |
| VELC | Deep Learning | univariate | False |
| Yesterday (PTSA) | Statistics (Regression & Forecasting) | univariate | False |
| pEWMA | Statistics (Regression & Forecasting) | | False |
| sequenceMiner | Classic Machine Learning | univariate | False |
Implementation Details
More than half of the 71 selected algorithms had to be reimplemented by us; for the remaining ones, the original authors provide an implementation or a community version exists (e.g., in PyOD). All implementations can be found in our Github repository.
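As an illustration of how one of the community-based detectors can be called, the sketch below scores a multivariate series with PyOD's isolation forest, roughly following the iForest parameters listed in the Parameterization section. The wrapper function is hypothetical and only shows the general usage pattern; the actual adapter code is in the repository.

```python
import numpy as np
from pyod.models.iforest import IForest  # community implementation used for iForest

def score_with_iforest(data: np.ndarray) -> np.ndarray:
    """Return a raw anomaly score per time point for an (n, d) multivariate series."""
    detector = IForest(
        n_estimators=500,    # corresponds to n_trees in the parameter table
        max_samples="auto",  # max_samples=None in the table falls back to the library default
        max_features=1.0,
        bootstrap=False,
        random_state=42,
        n_jobs=1,
    )
    detector.fit(data)
    # decision_scores_ holds the outlier score of every fitted sample
    return detector.decision_scores_

scores = score_with_iforest(np.random.default_rng(42).normal(size=(1000, 3)))
print(scores.shape)  # (1000,)
```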
| Method Name | Source Code Origin | Language | License | Method Family | |
|---|---|---|---|---|---|
| ARIMA | own (John Paparrizos and team) | Python | no license | forecasting | →Github |
| AE | own | Python, Tensorflow | MIT | reconstruction | →Github |
| Bagel | original | Python | no license | reconstruction | →Github |
| CBLOF | community (PyOD) | Python | BSD 2 | distance | →Github |
| COF | community (PyOD) | Python | BSD 2 | distance | →Github |
| COPOD | community (PyOD) | Python | BSD 2 | distribution | →Github |
| DAE | own | Python, Tensorflow | MIT | reconstruction | →Github |
| DBStream | original | R | no license | distance | →Github |
| DeepAnT | own | Python, Pytorch | no license | forecasting | →Github |
| DeepNAP | own | Python, Pytorch | MIT | forecasting | →Github |
| Donut | original | Python, Pytorch | no license | reconstruction | →Github |
| DSPOT | original | Python | GPL 3.0 | distribution | →Github |
| DWT-MLEAD | own | Python | MIT | distribution | →Github |
| EIF | original | Python | UIUC | trees | →Github |
| EncDec-AD | own | Python, Pytorch | MIT | reconstruction | →Github |
| Ensemble GI | own | Python | MIT | encoding | →Github |
| Fast-MCD | own | Python | MIT | distribution | →Github |
| FFT | own | Python | MIT | reconstruction | →Github |
| RForest | own | Python | MIT | forecasting | →Github |
| XGBoosting | own | Python | MIT | forecasting | →Github |
| GrammarViz | original | Java | GPL 2.0 | encoding | →Github |
| HBOS | community (PyOD) | Python | BSD 2 | distribution | →Github |
| HealthESN | own | Python | MIT | forecasting | →Github |
| HIF | original | Python | GPL 2.0 | trees | →Github |
| HOT SAX | original | Python | GPL 2.0 | distance | →Github |
| Hybrid KNN | own | Python, Pytorch | MIT | distance | →Github |
| IF-LOF | own | Python | MIT | trees | →Github |
| iForest | community (PyOD) | Python | BSD 2 | trees | →Github |
| IE-CAE | own | Python, Pytorch | MIT | reconstruction | →Github |
| k-Means | own | Python | MIT | distance | →Github |
| KNN | community (PyOD) | Python | BSD 2 | distance | →Github |
| LaserDBN | own | Python | MIT | encoding | →Github |
| Left STAMPi | original | Python | BSD | distance | →Github |
| LOF | community (PyOD) | Python | BSD 2 | distance | →Github |
| LSTM-AD | own | Python, Pytorch | MIT | forecasting | →Github |
| LSTM-VAE | own | Python, Tensorflow | MIT | reconstruction | →Github |
| MedianMethod | own | Python | MIT | forecasting | →Github |
| MSCRED | own | Python, Tensorflow | MIT | reconstruction | →Github |
| MTAD-GAT | own | Python, Pytorch | MIT | forecasting | →Github |
| MultiHMM | own | Python | MIT | encoding | →Github |
| NormA | original | Python | private | distance | →Github |
| Normalizing Flows | own | Python, Pytorch | MIT | distribution | →Github |
| NoveltySVR | own | Python | GPL 3.0 | forecasting | →Github |
| NumentaHTM | original | Python | AGPL | forecasting | →Github |
| OceanWNN | own | Python, Pytorch | MIT | forecasting | →Github |
| OmniAnomaly | original | Python, Tensorflow | MIT | reconstruction | →Github |
| PCC | community (PyOD) | Python | BSD 2 | reconstruction | →Github |
| PCI | own | Python | MIT | reconstruction | →Github |
| PS-SVM | own | Python | MIT | distance | →Github |
| PST | own | R | GPL | encoding | →Github |
| RBForest | own | Python | MIT | forecasting | →Github |
| RobustPCA | community | Python | MIT | reconstruction | →Github |
| S-H-ESD | own | R | GPL 3.0 | distribution | →Github |
| SAND | original | Python | private | distance | →Github |
| SARIMA | own | Python | BSD 3.0 | forecasting | →Github |
| Series2Graph | original | Python | private | distance | →Github |
| SR | original | Python | MIT | reconstruction | →Github |
| SR-CNN | original | Python, Pytorch | MIT | reconstruction | →Github |
| SSA | own (John Paparrizos and team) | Python | no license | distance | →Github |
| STAMP | original | R | Apache | distance | →Github |
| STOMP | original | R | Apache | distance | →Github |
| Sub-Fast-MCD | own | Python | MIT | distribution | →Github |
| Sub-IF | own | Python | MIT | trees | →Github |
| Sub-LOF | own | Python | MIT | distance | →Github |
| TAnoGAN | own | Python, Pytorch | no license | reconstruction | →Github |
| TARZAN | original | Python | no license | encoding | →Github |
| Telemanom | original | Python, Tensorflow | Caltech | forecasting | →Github |
| Torsk | original | Python, Pytorch | no license | forecasting | →Github |
| Triple ES | own | Python | MIT | forecasting | →Github |
| TSBitmap | community | Python | no license | encoding | →Github |
| VALMOD | original | R | Apache | distance | →Github |
Parameterization
After an independent parameter search, we conducted the experiments with the following parameter settings. Some parameter values depend on dataset properties, such as the dataset's period size, its length, or the maximum anomaly length.
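Several of the values below are given relative to dataset properties, e.g. "1.0 dataset period size", "10% of dataset length", or "max anomaly length". The following sketch shows how such data-dependent values could be resolved into concrete numbers before a run; the autocorrelation-based period estimate is an illustrative assumption, not necessarily the procedure we used.

```python
import numpy as np

def estimate_period_size(series: np.ndarray, max_lag=None) -> int:
    """Rough period estimate: lag with the highest autocorrelation after the first zero crossing."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf /= acf[0]                                   # normalize so that lag 0 == 1
    max_lag = max_lag or x.size // 2
    negative = np.flatnonzero(acf[:max_lag] < 0)    # skip the initial decay of the ACF
    start = int(negative[0]) if negative.size else 1
    return int(np.argmax(acf[start:max_lag]) + start)

def resolve_parameters(series: np.ndarray, max_anomaly_length: int) -> dict:
    """Map the relative specifications from the tables below to concrete values."""
    period = estimate_period_size(series)
    return {
        "window_size": int(1.0 * period),           # "1.0 dataset period size"
        "max_lag": int(0.1 * len(series)),          # "10% of dataset length"
        "anomaly_window_size": max_anomaly_length,  # "max anomaly length"
    }
```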
ARIMA
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| max_lag | 10% of dataset length |
| p_start | 1 |
| q_start | 1 |
| max_p | 5 |
| max_q | 5 |
| differencing_degree | 1 |
| distance_metric | twed |
| random_state | 42 |
AE
| Parameter | Value |
|---|---|
| latent_size | 32 |
| epochs | 500 |
| learning_rate | 0.001 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| random_state | 42 |
Bagel
| Parameter | Value |
|---|---|
| window_size | 2.0 dataset period size |
| latent_size | 6 |
| hidden_layer_shape | [100, 100] |
| dropout | 0.1 |
| cuda | False |
| epochs | 500 |
| batch_size | 64 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| random_state | 42 |
CBLOF
| Parameter | Value |
|---|---|
| n_clusters | 50 |
| alpha | default |
| beta | default |
| use_weights | default |
| random_state | 42 |
| n_jobs | 1 |
COF
| Parameter | Value |
|---|---|
| n_neighbors | 50 |
| random_state | 42 |
COPOD
| Parameter | Value |
|---|---|
| random_state | 42 |
DAE
| Parameter | Value |
|---|---|
| latent_size | 32 |
| epochs | 500 |
| learning_rate | 0.001 |
| noise_ratio | 0.1 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| random_state | 42 |
DBStream
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| radius | 1.3 |
| lambda | 0.001 |
| distance_metric | euclidean |
| shared_density | True |
| n_clusters | 30 |
| alpha | 0.5 |
| min_weight | 0 |
| random_state | 42 |
DeepAnT
| Parameter | Value |
|---|---|
| epochs | 500 |
| window_size | 0.5 dataset period size |
| prediction_window_size | 50 |
| learning_rate | 0.001 |
| batch_size | 64 |
| random_state | 42 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
DeepNAP
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| partial_sequence_length | 3 |
| lstm_layers | 1 |
| rnn_hidden_size | 200 |
| dropout | 0.5 |
| linear_hidden_size | 100 |
| batch_size | 64 |
| validation_batch_size | 64 |
| epochs | 500 |
| learning_rate | 0.001 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| random_state | 42 |
Donut
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| latent_size | 5 |
| regularization | 0.001 |
| linear_hidden_size | 130 |
| epochs | 500 |
| random_state | 42 |
DSPOT
| Parameter | Value |
|---|---|
| q | default |
| n_init | 1000 |
| level | 0.99 |
| up | True |
| down | True |
| alert | default |
| bounded | True |
| max_excess | 200 |
| random_state | 42 |
DWT-MLEAD
| Parameter | Value |
|---|---|
| start_level | 3 |
| quantile_epsilon | 0.1 |
| random_state | 42 |
EIF
| Parameter | Value |
|---|---|
| n_trees | 500 |
| max_samples | None |
| extension_level | None |
| limit | None |
| random_state | 42 |
EncDec-AD
| Parameter | Value |
|---|---|
| lstm_layers | 3 |
| anomaly_window_size | max anomaly length |
| latent_size | -30% default value |
| batch_size | 64 |
| validation_batch_size | 64 |
| epochs | 500 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| learning_rate | 0.001 |
| random_state | 42 |
| window_size | 1.0 dataset period size |
| test_batch_size | 64 |
Ensemble GI
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| n_estimators | 500 |
| max_paa_transform_size | 20 |
| max_alphabet_size | 10 |
| selectivity | 0.8 |
| random_state | 42 |
| n_jobs | 1 |
| window_method | sliding |
Fast-MCD
| Parameter | Value |
|---|---|
| store_precision | True |
| support_fraction | default |
| random_state | 42 |
FFT
| Parameter | Value |
|---|---|
| fft_parameters | 3 |
| context_window_size | 5 |
| local_outlier_threshold | 0.78 |
| max_anomaly_window_size | max anomaly length |
| max_sign_change_distance | 20 |
| random_state | 42 |
RForest
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| n_trees | 500 |
| max_features_method | auto |
| bootstrap | True |
| max_samples | None |
| random_state | 42 |
| verbose | 0 |
| n_jobs | 1 |
| max_depth | 4 |
| min_samples_split | 2 |
| min_samples_leaf | 1 |
XGBoosting
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| n_estimators | 500 |
| learning_rate | 0.001 |
| booster | gbtree |
| tree_method | auto |
| n_trees | 500 |
| max_depth | 4 |
| max_samples | None |
| colsample_bytree | None |
| colsample_bylevel | None |
| colsample_bynode | None |
| random_state | 42 |
| verbose | 0 |
| n_jobs | 1 |
GrammarViz
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| paa_transform_size | 5 |
| alphabet_size | 6 |
| normalization_threshold | 0.01 |
| random_state | 42 |
HBOS
| Parameter | Value |
|---|---|
| n_bins | 20 |
| alpha | default |
| bin_tol | default |
| random_state | 42 |
HealthESN
| Parameter | Value |
|---|---|
| linear_hidden_size | default |
| prediction_window_size | 50 |
| connectivity | default |
| spectral_radius | default |
| activation | default |
| random_state | 42 |
HIF
| Parameter | Value |
|---|---|
| n_trees | 500 |
| max_samples | None |
| random_state | 42 |
HOT SAX
| Parameter | Value |
|---|---|
| num_discords | None |
| anomaly_window_size | max anomaly length |
| paa_transform_size | 3 |
| alphabet_size | 3 |
| normalization_threshold | 0.01 |
| random_state | 42 |
Hybrid KNN
| Parameter | Value |
|---|---|
| linear_layer_shape | default |
| split | 0.8 |
| anomaly_window_size | max anomaly length |
| batch_size | 64 |
| test_batch_size | 64 |
| epochs | 500 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| learning_rate | 0.001 |
| n_neighbors | 10 |
| n_estimators | 500 |
| random_state | 42 |
IF-LOF
| Parameter | Value |
|---|---|
| n_trees | 500 |
| max_samples | default |
| n_neighbors | 50 |
| alpha | default |
| m | default |
| random_state | 42 |
iForest
| Parameter | Value |
|---|---|
| n_trees | 500 |
| max_samples | None |
| max_features | 1.0 |
| bootstrap | False |
| random_state | 42 |
| verbose | 0 |
| n_jobs | 1 |
IE-CAE
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| kernel_size | default |
| num_kernels | 32 |
| latent_size | 100 |
| leaky_relu_alpha | 0.03 |
| batch_size | 64 |
| test_batch_size | 64 |
| learning_rate | 0.001 |
| epochs | 500 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| random_state | 42 |
k-Means
| Parameter | Value |
|---|---|
| n_clusters | 50 |
| anomaly_window_size | max anomaly length |
| stride | 1 |
| n_jobs | 1 |
| random_state | 42 |
KNN
| Parameter | Value |
|---|---|
| n_neighbors | 50 |
| leaf_size | 20 |
| method | default |
| radius | default |
| distance_metric_order | 2 |
| n_jobs | 1 |
| random_state | 42 |
LaserDBN
| Parameter | Value |
|---|---|
| timesteps | 2 |
| n_bins | 10 |
| random_state | 42 |
Left STAMPi
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| n_init_train | 10% of dataset length or until first anomaly |
| random_state | 42 |
LOF
| Parameter | Value |
|---|---|
| n_neighbors | 50 |
| leaf_size | 20 |
| distance_metric_order | 2 |
| n_jobs | 1 |
| random_state | 42 |
LSTM-AD
| Parameter | Value |
|---|---|
| lstm_layers | 1 |
| split | 0.8 |
| window_size | 2.0 dataset period size |
| prediction_window_size | 50 |
| batch_size | 64 |
| validation_batch_size | 64 |
| epochs | 500 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| learning_rate | 0.001 |
| random_state | 42 |
| test_batch_size | 64 |
LSTM-VAE
| Parameter | Value |
|---|---|
| rnn_hidden_size | 5 |
| latent_size | 5 |
| learning_rate | 0.001 |
| batch_size | 64 |
| epochs | 500 |
| window_size | 1.0 dataset period size |
| lstm_layers | 10 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
MedianMethod
| Parameter | Value |
|---|---|
| neighbourhood_size | 2.0 dataset period size |
| random_state | 42 |
MSCRED
| Parameter | Value |
|---|---|
| windows | default |
| gap_time | 10 |
| window_size | 1.0 dataset period size |
| batch_size | 64 |
| learning_rate | 0.001 |
| epochs | 500 |
| early_stopping_patience | 10 |
| early_stopping_delta | 0.05 |
| split | 0.8 |
| test_batch_size | 64 |
| random_state | 42 |
MTAD-GAT
| Parameter | Value |
|---|---|
| mag_window_size | 40 |
| score_window_size | 52 |
| threshold | 6 |
| context_window_size | 30 |
| kernel_size | 7 |
| learning_rate | 0.001 |
| epochs | 500 |
| batch_size | 64 |
| window_size | 2.0 dataset period size |
| gamma | 0.8 |
| latent_size | 5 |
| linear_layer_shape | [5, 5, 5] |
| early_stopping_patience | 10 |
| early_stopping_delta | 0.05 |
| split | 0.8 |
| random_state | 42 |
MultiHMM
| Parameter | Value |
|---|---|
| discretizer | choquet |
| n_bins | 5 |
| random_state | 42 |
NormA
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| normal_model_percentage | 0.5 |
| random_state | 42 |
Normalizing Flows
| Parameter | Value |
|---|---|
| n_hidden_features_factor | 1.0 |
| hidden_layer_shape | [100, 100] |
| window_size | 1.0 dataset period size |
| split | 0.8 |
| epochs | 500 |
| batch_size | 64 |
| test_batch_size | 64 |
| teacher_epochs | 100 |
| distillation_iterations | 100 |
| percentile | 0.05 |
| early_stopping_patience | 10 |
| early_stopping_delta | 0.05 |
| random_state | 42 |
NoveltySVR
| Parameter | Value |
|---|---|
| n_init_train | 10% of dataset length or until first anomaly |
| forgetting_time | None |
| train_window_size | 500 |
| anomaly_window_size | max anomaly length |
| lower_suprise_bound | None |
| scaling | standard |
| epsilon | 0.1 |
| verbose | 0 |
| C | 1.0 |
| kernel | rbf |
| degree | 3 |
| gamma | None |
| coef0 | 0.0 |
| tol | 0.001 |
| stabilized | True |
| random_state | 42 |
NumentaHTM
| Parameter | Value |
|---|---|
| encoding_input_width | 21 |
| encoding_output_width | 75 |
| autoDetectWaitRecords | 50 |
| columnCount | 2048 |
| numActiveColumnsPerInhArea | 50 |
| potentialPct | 0.1 |
| synPermConnected | 0.1 |
| synPermActiveInc | 0.05 |
| synPermInactiveDec | 0.01 |
| cellsPerColumn | 32 |
| inputWidth | 2048 |
| newSynapseCount | 15 |
| maxSynapsesPerSegment | 32 |
| maxSegmentsPerCell | 128 |
| initialPerm | 0.15 |
| permanenceInc | 0.1 |
| permanenceDec | 0.1 |
| globalDecay | 0 |
| maxAge | 0 |
| minThreshold | 9 |
| activationThreshold | 12 |
| pamLength | 1 |
| alpha | 0.5 |
| random_state | 42 |
OceanWNN
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| hidden_size | 20 |
| batch_size | 64 |
| test_batch_size | 64 |
| epochs | 500 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
| learning_rate | 0.001 |
| wavelet_a | -3.25 |
| wavelet_k | -1.95 |
| wavelet_wbf | mexican_hat |
| wavelet_cs_C | 2.275 |
| threshold_percentile | 0.99 |
| random_state | 42 |
| with_threshold | True |
OmniAnomaly
| Parameter | Value |
|---|---|
| latent_size | 4 |
| rnn_hidden_size | 100 |
| window_size | 1.0 dataset period size |
| linear_hidden_size | 100 |
| nf_layers | 5 |
| epochs | 500 |
| split | 0.8 |
| batch_size | 64 |
| l2_reg | 0.0001 |
| learning_rate | 0.001 |
| random_state | 42 |
PCC
| Parameter | Value |
|---|---|
| n_components | default |
| n_selected_components | default |
| whiten | default |
| svd_solver | auto |
| tol | default |
| max_iter | default |
| random_state | 42 |
PCI
| Parameter | Value |
|---|---|
| window_size | 0.5 dataset period size |
| thresholding_p | 0.05 |
| random_state | 42 |
PS-SVM
| Parameter | Value |
|---|---|
| embed_dim_range | [0.5, 1.0, 1.5] * dataset period size |
| project_phasespace | False |
| nu | 0.5 |
| kernel | rbf |
| gamma | None |
| degree | 3 |
| coef0 | 0.0 |
| tol | 0.001 |
| random_state | 42 |
PST
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| max_depth | 4 |
| n_min | 1 |
| y_min | default |
| n_bins | 5 |
| sim | SIMn |
| random_state | 42 |
RBForest
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| n_estimators | 500 |
| max_features_per_estimator | 0.5 |
| n_trees | 500 |
| max_features_method | auto |
| bootstrap | True |
| max_samples | None |
| random_state | 42 |
| verbose | 0 |
| n_jobs | 1 |
| max_depth | 4 |
| min_samples_split | 2 |
| min_samples_leaf | 1 |
RobustPCA
| Parameter | Value |
|---|---|
| max_iter | default |
| random_state | 42 |
S-H-ESD
| Parameter | Value |
|---|---|
| max_anomalies | dataset contamination |
| timestamp_unit | m |
| random_state | 42 |
SAND
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| n_clusters | 50 |
| n_init_train | 10% of dataset length or until first anomaly |
| iter_batch_size | 500 |
| alpha | 0.5 |
| random_state | 42 |
SARIMA
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| prediction_window_size | 50 |
| max_lag | 10% of dataset length |
| period | dataset period size |
| max_iter | default |
| exhaustive_search | False |
| n_jobs | 1 |
| random_state | 42 |
Series2Graph
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| query_window_size | 1.5*window_size |
| rate | 100 |
| random_state | 42 |
SR
| Parameter | Value |
|---|---|
| mag_window_size | 40 |
| score_window_size | 40 |
| window_size | 1.0 dataset period size |
| random_state | 42 |
SR-CNN
| Parameter | Value |
|---|---|
| window_size | 1.5 dataset period size |
| random_state | 42 |
| step | 64 |
| num | 10 |
| learning_rate | 0.001 |
| epochs | 500 |
| batch_size | 64 |
| n_jobs | 1 |
| split | 0.8 |
| early_stopping_delta | 0.05 |
| early_stopping_patience | 10 |
SSA
| Parameter | Value |
|---|---|
| ep | 3 |
| window_size | 2.0 dataset period size |
| rf_method | alpha |
| alpha | 0.2 |
| random_state | 42 |
STAMP
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| exclusion_zone | 0.5 |
| verbose | 0 |
| n_jobs | 1 |
| random_state | 42 |
STOMP
| Parameter | Value |
|---|---|
| anomaly_window_size | max anomaly length |
| exclusion_zone | 0.5 |
| verbose | 0 |
| n_jobs | 1 |
| random_state | 42 |
Sub-Fast-MCD
| Parameter | Value |
|---|---|
| store_precision | True |
| support_fraction | default |
| random_state | 42 |
Sub-IF
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| n_trees | 500 |
| max_samples | None |
| max_features | 1.0 |
| bootstrap | False |
| random_state | 42 |
| verbose | 0 |
| n_jobs | 1 |
Sub-LOF
| Parameter | Value |
|---|---|
| window_size | 1.0 dataset period size |
| n_neighbors | 50 |
| leaf_size | 20 |
| distance_metric_order | 2 |
| n_jobs | 1 |
| random_state | 42 |
TAnoGAN
| Parameter | Value |
|---|---|
| epochs | 500 |
| cuda | False |
| window_size | 1.0 dataset period size |
| learning_rate | 0.001 |
| batch_size | 64 |
| n_jobs | 1 |
| random_state | 42 |
| early_stopping_patience | 10 |
| early_stopping_delta | 0.05 |
| split | 0.8 |
| iterations | 25 |
TARZAN
| Parameter | Value |
|---|---|
| random_state | 42 |
| anomaly_window_size | max anomaly length |
| alphabet_size | 4 |
Telemanom
| Parameter | Value |
|---|---|
| batch_size | 64 |
| smoothing_window_size | 30 |
| smoothing_perc | 0.05 |
| error_buffer | 100 |
| dropout | 0.5 |
| lstm_batch_size | 64 |
| epochs | 500 |
| split | 0.8 |
| early_stopping_patience | 10 |
| early_stopping_delta | 0.05 |
| window_size | 1.5 dataset period size |
| prediction_window_size | 50 |
| p | 0.17 |
| random_state | 42 |
Torsk
| Parameter | Value |
|---|---|
| input_map_size | 100 |
| input_map_scale | 0.125 |
| context_window_size | 10 |
| train_window_size | 100 |
| prediction_window_size | 5 |
| transient_window_size | 20% of train_window_size |
| spectral_radius | 2.0 |
| density | 0.01 |
| reservoir_representation | sparse |
| imed_loss | False |
| train_method | pinv_svd |
| tikhonov_beta | None |
| verbose | 0 |
| scoring_small_window_size | 10 |
| scoring_large_window_size | 100 |
| random_state | 42 |
Triple ES
| Parameter | Value |
|---|---|
| train_window_size | 500 |
| period | dataset period size |
| trend | add |
| seasonal | add |
| random_state | 42 |
TSBitmap
| Parameter | Value |
|---|---|
| feature_window_size | 500 |
| lead_window_size | 200 |
| lag_window_size | 500 |
| alphabet_size | 4 |
| level_size | 2 |
| compression_ratio | 1 |
| random_state | 42 |
VALMOD
| Parameter | Value |
|---|---|
| min_anomaly_window_size | 1.0 dataset period size |
| max_anomaly_window_size | 2.0 dataset period size |
| heap_size | 50 |
| exclusion_zone | 0.5 |
| verbose | 0 |
| random_state | 42 |