

Markov Model


Markov Model - definition(s)

Markov Model - a probabilistic model that can accurately capture the effects of order-dependent component failures and of changing failure rates resulting from stress or other factors. In general, Markov modeling is used to evaluate system reliability as a function of time by mapping out the states of the system (fully operational, degraded, failed) and the probability of moving from one state to another. Markov models are most useful for modeling the complex behavior associated with fault-tolerant systems, degraded modes of operation, repairable systems, sequence-dependent behavior, and time-varying failure rates.
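As a brief illustration of the state-transition idea described above, the Python sketch below sets up a hypothetical discrete-time Markov chain with the three states mentioned in the definition (fully operational, degraded, failed) and an assumed transition matrix. The state names and probabilities are illustrative assumptions, not part of the original definition.

import numpy as np

# Hypothetical states of the system.
states = ["operational", "degraded", "failed"]

# Assumed one-step transition matrix P[i][j]: probability of moving
# from state i to state j in one time step (each row sums to 1).
P = np.array([
    [0.95, 0.04, 0.01],   # operational -> operational / degraded / failed
    [0.00, 0.90, 0.10],   # degraded    -> degraded / failed
    [0.00, 0.00, 1.00],   # failed is an absorbing state
])

# Start fully operational and evolve the state-probability vector over time.
p = np.array([1.0, 0.0, 0.0])
for t in range(1, 11):
    p = p @ P
    print(f"t={t:2d}  " + "  ".join(f"{s}={v:.3f}" for s, v in zip(states, p)))

# Reliability at time t can be read off as the probability of not being
# in the failed state: 1 - p[states.index("failed")].

In this simple example, system reliability as a function of time falls out directly from repeated multiplication of the state-probability vector by the transition matrix.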

[Category=Quality]

Source: The Quality Portal, 15 April 2011 09:52:52, http://thequalityportal.com/glossary/l.htm


Data Quality Glossary.  A free resource from GRC Data Intelligence. For comments, questions or feedback: dqglossary@grcdi.nl