IEEE SysTol, Nice, France, October 6-10, 2010
All methods for Fault Detection and Isolation (FDI) involve internal parameters, often called hyperparameters, that have to be carefully tuned. Most often, tuning is ad hoc, which makes it difficult to ensure that any comparison between methods is unbiased. We propose to consider the evaluation of the performance of a method with respect to its hyperparameters as a computer experiment, and to achieve tuning via global optimization based on Kriging and Expected Improvement. This approach is applied to several residual-evaluation (or change-detection) algorithms on classical test-cases. Simulation results demonstrate the interest, practicability and performance of this methodology, which should facilitate the automatic tuning of the hyperparameters of a method and allow a fair comparison of a collection of methods on a given set of test-cases. The computational cost turns out to be much lower than that of other general-purpose optimization methods such as genetic algorithms.
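The core loop the abstract refers to, Kriging-based global optimization with Expected Improvement (often called Efficient Global Optimization), can be sketched as follows. This is a hedged, minimal illustration, not the paper's actual implementation: the squared-exponential kernel, its length-scale, the toy one-dimensional "performance criterion", and the candidate grid are all illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

def rbf_kernel(a, b, length=0.3):
    """Squared-exponential covariance between two 1-D sets of points (assumed kernel)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def norm_pdf(z):
    return np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """Kriging predictor: posterior mean and standard deviation at test points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ y_train
    var = 1.0 - np.sum((Ks @ Kinv) * Ks, axis=1)  # prior variance is 1 for this kernel
    return mu, np.sqrt(np.maximum(var, 0.0))

def expected_improvement(mu, sigma, y_best):
    """EI for minimization: E[max(y_best - Y, 0)] under the Gaussian posterior."""
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (y_best - mu) / sigma
        ei = (y_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)
    return np.where(sigma > 0, ei, 0.0)  # no improvement possible where sigma = 0

# Toy stand-in for the (expensive) performance criterion of one hyperparameter.
f = lambda x: (x - 0.65) ** 2

x_train = np.array([0.1, 0.5, 0.9])          # initial design
y_train = f(x_train)
x_grid = np.linspace(0.0, 1.0, 201)          # candidate hyperparameter values

for _ in range(5):                           # a few EI-driven evaluations
    mu, sigma = gp_posterior(x_train, y_train, x_grid)
    x_next = x_grid[np.argmax(expected_improvement(mu, sigma, y_train.min()))]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, f(x_next))

x_best = x_train[np.argmin(y_train)]
print(x_best)
```

The point of the EI criterion is the budget trade-off the abstract emphasizes: each new hyperparameter setting is chosen to balance exploiting the surrogate's predicted minimum against exploring regions of high posterior uncertainty, so far fewer evaluations of the FDI performance criterion are needed than with population-based methods such as genetic algorithms.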