
IMIS


Optimal inspection and maintenance planning for deteriorating structural components through dynamic Bayesian networks and Markov decision processes
Morato, P.G.; Papakonstantinou, K.G.; Andriotis, C.P.; Nielsen, J.S.; Rigo, P. (2022). Optimal inspection and maintenance planning for deteriorating structural components through dynamic Bayesian networks and Markov decision processes. Structural Safety 94: 102140. https://dx.doi.org/10.1016/j.strusafe.2021.102140
In: Structural Safety. ELSEVIER SCIENCE BV: Amsterdam. ISSN 0167-4730; e-ISSN 1879-3355
Peer reviewed article  


Keyword
    Marine/Coastal
Author keywords
    Infrastructure management; Inspection and maintenance; Partially Observable Markov Decision Processes; Deteriorating structures; Dynamic Bayesian networks; Decision analysis

Authors
  • Morato, P.G.
  • Papakonstantinou, K.G.
  • Andriotis, C.P.
  • Nielsen, J.S.
  • Rigo, P.

Abstract
    Civil and maritime engineering systems, among others, ranging from bridges to offshore platforms and wind turbines, must be efficiently managed, as they are exposed to deterioration mechanisms such as fatigue and corrosion throughout their operational life. Identifying optimal inspection and maintenance policies demands the solution of a complex sequential decision-making problem under uncertainty, with the main objective of efficiently controlling the risk associated with structural failures. To address this complexity, risk-based inspection planning methodologies, often supported by dynamic Bayesian networks, evaluate a set of pre-defined heuristic decision rules in order to reasonably simplify the decision problem. However, the resulting policies may be compromised by the limited space considered in the definition of the decision rules. Avoiding this limitation, Partially Observable Markov Decision Processes (POMDPs) provide a principled mathematical methodology for stochastic optimal control under uncertain action outcomes and observations, in which optimal actions are prescribed as a function of the entire, dynamically updated, state probability distribution. In this paper, we combine dynamic Bayesian networks with POMDPs in a joint framework for optimal inspection and maintenance planning, and we provide the relevant formulation for developing both infinite- and finite-horizon POMDPs in a structural reliability context. The proposed methodology is implemented and tested for the case of a structural component subject to fatigue deterioration, demonstrating the capability of state-of-the-art point-based POMDP solvers to solve the underlying stochastic planning optimization problem. Within the numerical experiments, POMDP- and heuristic-based policies are thoroughly compared, and the results show that POMDPs achieve substantially lower costs than their heuristic counterparts, even for traditional problem settings.
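
    The policies described in the abstract condition decisions on a dynamically updated state probability distribution (belief) over deterioration states. The sketch below is purely illustrative and is not taken from the paper: it shows a generic Bayesian belief update of the kind such a POMDP policy would condition on, assuming a hypothetical four-state deterioration discretization, a hypothetical transition matrix T, and a hypothetical inspection likelihood matrix O.

    # Illustrative sketch only (not the authors' implementation): Bayesian belief
    # update over discretized deterioration states, the quantity a POMDP policy
    # conditions inspection/repair decisions on. T, O, and the four-state
    # discretization are hypothetical placeholders.
    import numpy as np

    # Hypothetical states: intact, minor damage, major damage, failed
    T = np.array([[0.90, 0.08, 0.02, 0.00],   # hypothetical yearly transition matrix
                  [0.00, 0.85, 0.12, 0.03],
                  [0.00, 0.00, 0.80, 0.20],
                  [0.00, 0.00, 0.00, 1.00]])
    # Hypothetical inspection likelihoods P(outcome | state); outcomes: 0 = no detection, 1 = detection
    O = np.array([[0.95, 0.05],
                  [0.70, 0.30],
                  [0.30, 0.70],
                  [0.05, 0.95]])

    def predict(belief):
        """Propagate the belief one time step through the deterioration model."""
        return belief @ T

    def update(belief, outcome):
        """Bayesian update of the belief after observing an inspection outcome (0 or 1)."""
        posterior = belief * O[:, outcome]
        return posterior / posterior.sum()

    belief = np.array([1.0, 0.0, 0.0, 0.0])   # component assumed intact at installation
    for year in range(5):
        belief = predict(belief)              # deterioration step, no intervention
    belief = update(belief, outcome=0)        # inspection at year 5 reports no detection
    print("Belief after 5 years and a no-detection inspection:", belief)

    In a POMDP-based planning setting, a policy maps such beliefs to actions (do nothing, inspect, repair); the heuristic decision rules mentioned in the abstract would instead trigger actions from fixed thresholds or pre-set inspection intervals.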
