Empirical Evaluation of Diagnostic Algorithm Performance Using a Generic Framework
A variety of rule-based, model-based, and data-driven techniques have been proposed for the detection and isolation of faults in physical systems. However, there have been few efforts to comparatively analyze the performance of these approaches on the same system under identical conditions. One reason for this is the lack of a standard framework for performing such a comparison. In this paper we introduce a framework, called DXF, that provides a common language to represent the system description, sensor data, and fault diagnosis results; a run-time architecture to execute the diagnosis algorithms under identical conditions and collect the diagnosis results; and an evaluation component that computes performance metrics from the diagnosis results to compare the algorithms. We have used DXF to perform an empirical evaluation of 13 diagnostic algorithms on a hardware testbed (ADAPT) at NASA Ames Research Center and on a set of synthetic circuits typically used as benchmarks in the model-based diagnosis community. Based on these empirical data, we analyze the performance of each algorithm and suggest directions for future development.
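The abstract describes an evaluation component that computes performance metrics from collected diagnosis results. DXF's actual schema and metric definitions are given in the paper; as a rough illustration of the idea, the following is a hypothetical sketch in which each scenario records the injected (ground-truth) faults, the faults an algorithm diagnosed, and the detection time, and common diagnosis metrics are aggregated over the scenario set. All names here (`ScenarioResult`, `evaluate`, the metric keys) are invented for this sketch, not part of DXF.

```python
from dataclasses import dataclass
from typing import FrozenSet, Optional, List, Dict

# Hypothetical record type; DXF defines its own common language for
# scenarios and diagnosis results.
@dataclass
class ScenarioResult:
    injected_faults: FrozenSet[str]      # ground-truth fault labels
    diagnosed_faults: FrozenSet[str]     # faults reported by the algorithm
    detection_time_s: Optional[float]    # None if no fault was detected

def evaluate(results: List[ScenarioResult]) -> Dict[str, Optional[float]]:
    """Aggregate illustrative diagnosis metrics over a set of scenarios."""
    faulty = [r for r in results if r.injected_faults]
    nominal = [r for r in results if not r.injected_faults]
    detected = [r for r in faulty if r.detection_time_s is not None]
    isolated = [r for r in detected
                if r.diagnosed_faults == r.injected_faults]
    return {
        # Fraction of faulty scenarios in which a fault was detected.
        "detection_rate": len(detected) / len(faulty) if faulty else 0.0,
        # Of the detected scenarios, fraction with the exact fault set isolated.
        "isolation_accuracy": len(isolated) / len(detected) if detected else 0.0,
        # Fraction of nominal scenarios in which a fault was falsely reported.
        "false_alarm_rate": (sum(1 for r in nominal if r.diagnosed_faults)
                             / len(nominal)) if nominal else 0.0,
        # Mean time to detection over the detected scenarios.
        "mean_detection_time_s": (sum(r.detection_time_s for r in detected)
                                  / len(detected)) if detected else None,
    }
```

Because every algorithm is run on the same scenarios under the same run-time architecture, metrics computed this way are directly comparable across algorithms, which is the point of the framework.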
Complete Metadata
| @type | dcat:Dataset |
|---|---|
| accessLevel | public |
| accrualPeriodicity | irregular |
| bureauCode | ["026:00"] |
| contactPoint | { "fn": "SCOTT POLL", "@type": "vcard:Contact", "hasEmail": "mailto:scott.d.poll@nasa.gov" } |
| description | A variety of rule-based, model-based, and data-driven techniques have been proposed for the detection and isolation of faults in physical systems. However, there have been few efforts to comparatively analyze the performance of these approaches on the same system under identical conditions. One reason for this is the lack of a standard framework for performing such a comparison. In this paper we introduce a framework, called DXF, that provides a common language to represent the system description, sensor data, and fault diagnosis results; a run-time architecture to execute the diagnosis algorithms under identical conditions and collect the diagnosis results; and an evaluation component that computes performance metrics from the diagnosis results to compare the algorithms. We have used DXF to perform an empirical evaluation of 13 diagnostic algorithms on a hardware testbed (ADAPT) at NASA Ames Research Center and on a set of synthetic circuits typically used as benchmarks in the model-based diagnosis community. Based on these empirical data, we analyze the performance of each algorithm and suggest directions for future development. |
| distribution | [ { "@type": "dcat:Distribution", "title": "ijPHM_10_002.pdf", "format": "PDF", "mediaType": "application/pdf", "description": "ijPHM_10_002.pdf", "downloadURL": "https://c3.nasa.gov/dashlink/static/media/publication/ijPHM_10_002.pdf" } ] |
| identifier | DASHLINK_864 |
| issued | 2013-12-18 |
| keyword | ["ames", "dashlink", "nasa"] |
| landingPage | https://c3.nasa.gov/dashlink/resources/864/ |
| modified | 2025-03-31 |
| programCode | ["026:029"] |
| publisher | { "name": "Dashlink", "@type": "org:Organization" } |
| title | Empirical Evaluation of Diagnostic Algorithm Performance Using a Generic Framework |