Results of fault diagnosis using MLP-256

This notebook presents experimental results of fault diagnosis on the Tennessee Eastman Process (TEP) dataset using the MLP-256 model.

Importing libraries.

import numpy as np
import torch
from sklearn.preprocessing import StandardScaler
from tqdm.auto import trange
from ice.fault_diagnosis.datasets import FaultDiagnosisReinartzTEP
from ice.fault_diagnosis.models import MLP

Downloading the TEP dataset.

dataset = FaultDiagnosisReinartzTEP()
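
The dataset object exposes the sensor table and labels through the attributes used below (df, target, train_mask, test_mask). A quick sanity check of the sizes can be done as follows; the exact numbers depend on the dataset version, so this is illustrative only.

# Inspect the loaded data: full table plus train/test row counts.
print(dataset.df.shape)
print(dataset.df[dataset.train_mask].shape, dataset.df[dataset.test_mask].shape)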

Standardizing the input data. The scaler is fitted on the training part only and then applied to the test part.

scaler = StandardScaler()
dataset.df[dataset.train_mask] = scaler.fit_transform(dataset.df[dataset.train_mask])
dataset.df[dataset.test_mask] = scaler.transform(dataset.df[dataset.test_mask])
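
After this step the training features should have approximately zero mean and unit variance. A small sanity check, not part of the original experiment:

# Training features should be roughly zero-mean and unit-variance after scaling.
train_scaled = dataset.df[dataset.train_mask]
print(train_scaled.mean().abs().max(), train_scaled.std().max())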

Training the model and calculating the metrics over five runs with different random seeds.

metrics = []
for i in trange(5):
    torch.random.manual_seed(i)
    model = MLP(
        window_size=10,
        hidden_dim=256,
        batch_size=128,
        lr=0.001,
        num_epochs=10,
        verbose=False,
        device='cuda',
    )
    model.fit(
        dataset.df[dataset.train_mask], dataset.target[dataset.train_mask])
    metrics.append(
        model.evaluate(
            dataset.df[dataset.test_mask], dataset.target[dataset.test_mask]))
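
Each call to model.evaluate returns a dictionary of metrics; the true/false positive rates are per-fault sequences with one entry per fault type. As a quick, optional check of the structure returned by the last run (the field names are the ones used in the aggregation below):

# Print the available metric names and the number of per-fault TPR entries.
print(sorted(metrics[-1].keys()))
print(len(metrics[-1]['true_positive_rate']))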

Printing the metrics averaged over the five runs, reported as mean ± two standard deviations.

acc = []
cdr = []
tpr = []
fpr = []
for metrics_i in metrics:
    acc.append(metrics_i["accuracy"])
    cdr.append(metrics_i["correct_daignosis_rate"])
    tpr.append(metrics_i["true_positive_rate"])
    fpr.append(metrics_i["false_positive_rate"])
tpr_mean, tpr_std = np.mean(tpr, axis=0), np.std(tpr, axis=0)
fpr_mean, fpr_std = np.mean(fpr, axis=0), np.std(fpr, axis=0)

print(f'Accuracy: {np.mean(acc):.4f} ± {2*np.std(acc):.4f}')
print(f'CDR: {np.mean(cdr):.4f} ± {2*np.std(cdr):.4f}')
for i in range(len(tpr_mean)):
    print(f'TPR/FPR {i+1}: {tpr_mean[i]:.4f} ± {2*tpr_std[i]:.4f} / {fpr_mean[i]:.4f} ± {2*fpr_std[i]:.4f}')
Accuracy: 0.8500 ± 0.0029
CDR: 0.9582 ± 0.0013
TPR/FPR 1: 0.9948 ± 0.0039 / 0.0000 ± 0.0000
TPR/FPR 2: 0.9951 ± 0.0008 / 0.0000 ± 0.0000
TPR/FPR 3: 0.9457 ± 0.0487 / 0.0000 ± 0.0000
TPR/FPR 4: 0.9963 ± 0.0020 / 0.0000 ± 0.0000
TPR/FPR 5: 0.2423 ± 0.0490 / 0.0009 ± 0.0004
TPR/FPR 6: 0.9984 ± 0.0002 / 0.0000 ± 0.0000
TPR/FPR 7: 0.9978 ± 0.0001 / 0.0000 ± 0.0000
TPR/FPR 8: 0.9782 ± 0.0136 / 0.0000 ± 0.0000
TPR/FPR 9: 0.3068 ± 0.1355 / 0.0001 ± 0.0001
TPR/FPR 10: 0.9754 ± 0.0018 / 0.0000 ± 0.0000
TPR/FPR 11: 0.9539 ± 0.0130 / 0.0000 ± 0.0000
TPR/FPR 12: 0.9098 ± 0.0086 / 0.0000 ± 0.0000
TPR/FPR 13: 0.9702 ± 0.0074 / 0.0000 ± 0.0000
TPR/FPR 14: 0.9949 ± 0.0006 / 0.0000 ± 0.0000
TPR/FPR 15: 0.0000 ± 0.0000 / 0.0000 ± 0.0000
TPR/FPR 16: 0.2772 ± 0.0309 / 0.0002 ± 0.0001
TPR/FPR 17: 0.9786 ± 0.0006 / 0.0000 ± 0.0000
TPR/FPR 18: 0.9635 ± 0.0019 / 0.0000 ± 0.0000
TPR/FPR 19: 0.9918 ± 0.0004 / 0.0000 ± 0.0000
TPR/FPR 20: 0.9739 ± 0.0006 / 0.0000 ± 0.0000
TPR/FPR 21: 0.0000 ± 0.0000 / 0.0000 ± 0.0000
TPR/FPR 22: 0.5031 ± 0.1355 / 0.0000 ± 0.0000
TPR/FPR 23: 0.8086 ± 0.0242 / 0.0000 ± 0.0000
TPR/FPR 24: 0.9890 ± 0.0011 / 0.0000 ± 0.0000
TPR/FPR 25: 0.9872 ± 0.0007 / 0.0000 ± 0.0000
TPR/FPR 26: 0.9696 ± 0.0064 / 0.0000 ± 0.0000
TPR/FPR 27: 0.9742 ± 0.0111 / 0.0000 ± 0.0000
TPR/FPR 28: 0.3768 ± 0.0459 / 0.0002 ± 0.0001