Abstract:
This thesis studies the maintenance of a dynamic system consisting of several components that age over time at a given failure rate. The states of the components are hidden. In each decision epoch, a decision must be made whether to replace a component or to do nothing. The major difference between this problem and other maintenance problems is its complex structure, which arises from the large number of components. Two versions of the maintenance problem are studied. In the first, it is possible to estimate the reliability of the whole system, and the aim is to minimize the maintenance cost subject to the constraint that the system reliability always remains above a predetermined threshold. In the second, partial observations, i.e., signals related to the components, are observed in each time period, and obtaining the next observation may incur a cost to the decision maker. This problem is a partially observable Markov decision process (POMDP).

Dynamic Bayesian networks (DBNs) are proposed as a solution to the first problem, and four heuristic approaches are presented to select the component to be replaced. A hierarchical heuristic solution procedure is proposed to solve the second problem: an aggregate model is developed by aggregating states and actions so that it can be solved with exact POMDP solvers, and disaggregation is carried out by simulating the process with a DBN and applying troubleshooting approaches in the decision epochs where a replacement is planned in the aggregate policy.