
Accession Number : AD0685735
Title : BAYESIAN LEARNING IN MARKOV CHAINS WITH OBSERVABLE STATES
Corporate Author : MICHIGAN STATE UNIV EAST LANSING DIV OF ENGINEERING RESEARCH
Personal Author(s) : Dubes, Richard C. ; Donoghue, Patrick J.
Report Date : MAR 1969
Pagination or Media Count : 34
Abstract : Two practical and related problems concerning decision-making with observations from Markov chains are considered in this report. First, Bayesian learning theory is used to develop recursive relations for the densities of the unknown parameters in a Markov chain, based on classified observations of the chain's states. Computationally simple results are obtained using a matrix-beta distribution for the chain's parameters. In the case of unsupervised observations, the basic relations for learning are derived and methods for their implementation are discussed. Second, the related problem of deciding which of a set of chains is active, based on state observations, is considered. Two data-generating models are proposed and decision rules are derived. A particularly useful result is derived for one model using the matrix-beta distribution for the unknown parameters. The decision rule for the more difficult model is then derived and its implications discussed. Simulation results for a specific example show the probability of error for different amounts of training data and demonstrate the inherent practicality of the results. (Author)
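Note : The matrix-beta prior described in the abstract corresponds to placing an independent Dirichlet distribution on each row of the transition matrix, which is conjugate to the Markov-chain likelihood. A minimal sketch of the two ideas in the abstract (conjugate updating from classified state observations, and choosing the active chain via posterior-predictive likelihoods) might look as follows; the function names and parameter choices are illustrative assumptions, not the report's own notation or derivation:

```python
import numpy as np

def update_matrix_beta(alpha, states):
    """Conjugate update for a matrix-beta (row-wise Dirichlet) prior:
    add one to alpha[s, t] for each observed transition s -> t.
    (Illustrative; not the report's notation.)"""
    alpha = alpha.astype(float).copy()
    for s, t in zip(states[:-1], states[1:]):
        alpha[s, t] += 1.0
    return alpha

def posterior_mean(alpha):
    """Posterior mean estimate of the transition matrix: normalize each row."""
    return alpha / alpha.sum(axis=1, keepdims=True)

def log_predictive(alpha, states):
    """Log posterior-predictive probability of a state sequence under the
    matrix-beta posterior, computed sequentially (Polya-urn style):
    each transition's probability is alpha[s, t] / sum(alpha[s]),
    with the count incremented after it is used."""
    a = alpha.astype(float).copy()
    lp = 0.0
    for s, t in zip(states[:-1], states[1:]):
        lp += np.log(a[s, t] / a[s].sum())
        a[s, t] += 1.0
    return lp
```

To decide which of a set of chains is active, one would train a posterior for each candidate from its classified observations and then pick the candidate whose `log_predictive` is largest on the new state sequence.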
Descriptors : (*LEARNING, *DECISION THEORY), (*PATTERN RECOGNITION, *INFORMATION THEORY), ELECTROENCEPHALOGRAPHY, MATHEMATICAL MODELS, TRANSFER OF TRAINING, DECISION MAKING, SIMULATION, PROBABILITY
Subject Categories : Personnel Management and Labor Relations ; Cybernetics ; Bionics
Distribution Statement : APPROVED FOR PUBLIC RELEASE