aofsorular.com
İŞL353U

Markov Chains

Unit 8: 20 Questions
S

Why are Markov chains independent of events in the past?

Markov chains have a special property: the probabilities describing how the process will evolve in the future depend only on the present state of the process, and so are independent of events in the past.
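
In symbols, a standard statement of this Markov property for a discrete-time chain (added here for reference, not quoted from the unit) is:

P(X(n+1) = j | X(n) = i, X(n-1) = i_{n-1}, ..., X(0) = i_0) = P(X(n+1) = j | X(n) = i)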

S

What is the difference between the discrete fashion and the continuous fashion in which time can be treated in stochastic processes?

In the continuous case, the inventory level of a product in a store can be observed at any point in time, while in the discrete case the inventory level is observed only at certain intervals, such as every minute or once a month.

S

What is a stochastic process?

A stochastic process is a family of random variables {X(t), t ∈ T} defined on a given probability space, indexed by the parameter t, where t varies over an index set T.

S

What is necessary to be able to analyze a stochastic process?

In order to analyze a stochastic process, it is necessary to make some assumptions about the dependence between the random variables.

S

What is the most common dependence structure called?

The most common dependence structure is called the Markov property.

S

How is a Markov chain described? 

A Markov chain is described in terms of its transition probabilities, and the key assumption underlying it is that the stochastic process satisfies the Markov property.
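
Written out (assuming, as is standard, a time-homogeneous chain whose probabilities do not depend on the step n), the one-step transition probabilities are:

p_ij = P(X(n+1) = j | X(n) = i) for all n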

S

What is the assumption behind the conditional probability of an event B?

The assumption is that an event A has already occurred; the conditional probability of B is then the probability that B will occur given that knowledge.
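
In symbols, the standard definition (assuming P(A) > 0) is:

P(B | A) = P(A ∩ B) / P(A)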

S

What is a convenient way of showing all the transition probabilities?

A convenient way of showing all the transition probabilities is the transition probability matrix, P, which is simply a two-dimensional array whose element at the ith row and jth column is p_ij.
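
A minimal sketch in Python (hypothetical numbers; numpy assumed available) of such a matrix, with a check that each row sums to 1:

import numpy as np

# Hypothetical 3-state transition probability matrix.
# P[i, j] is the probability of moving from state i to state j in one step.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# From any state the process must move somewhere, so each row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)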

S

What is P^k?

P^k gives the probabilities of a transition from one state to another in k repetitions.
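
Continuing the hypothetical matrix above, the k-step probabilities can be read off the kth matrix power:

import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Probability of going from state 0 to state 2 in exactly 3 steps.
P3 = np.linalg.matrix_power(P, 3)
print(P3[0, 2])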

S

What is the n-step transition probability p_ij(n)?

It is the probability that a process currently in state i will be in state j after n additional transitions.
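
The n-step probabilities are simply the entries of the matrix power P^n, and they satisfy the Chapman-Kolmogorov equations (a standard fact, stated here for reference):

p_ij(n) = (P^n)_ij,  and  p_ij(n + m) = Σ_k p_ik(n) p_kj(m)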

S

When is a transition matrix called regular? 

A transition matrix is called regular if some power of the matrix has all entries strictly positive.
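
A rough numerical check (a sketch only, assuming a small matrix and an arbitrary cap of 50 on the powers tested):

import numpy as np

def is_regular(P, max_power=50):
    # Return True if some power of P up to max_power has all positive entries.
    Q = np.eye(P.shape[0])
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(P))  # True: the square of P already has only positive entries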

S

What makes a Markov chain a regular Markov chain ?

A Markov chain is a regular Markov chain if its transition matrix is regular.

S

In which circumstances do states i and j take place in the same communicating class?

States i and j are in the same communicating class if each state is accessible from the other, i.e., i↔j.

S

If a Markov chain has more than one class, what is it called?

If a Markov chain has more than one communicating class, it is called reducible.

S

When is a state called periodic?

A state is called periodic if it can only return to itself after a number of transitions that is a multiple of some fixed integer greater than 1.
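
More formally, the standard definition (given here for reference rather than quoted from the unit) works with the period of a state i,

d(i) = gcd{ n ≥ 1 : p_ii(n) > 0 },

and calls state i periodic if d(i) > 1 and aperiodic if d(i) = 1.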

S

What is an absorbing chain?

Some of the most important applications of Markov chains involve a special class of Markov chains called absorbing chains. A state i of a Markov chain is called an absorbing state if, once the Markov chain enters the state, it remains there forever; that is, the probability of leaving the state is zero. An absorbing chain is a Markov chain that contains at least one absorbing state and in which an absorbing state can be reached from every state.
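
A small illustration with hypothetical numbers: state 2 below is absorbing because its row places probability 1 on itself.

import numpy as np

# Hypothetical 3-state chain; state 2 is absorbing (row = [0, 0, 1]).
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
])

# Once in state 2, the chain never leaves: starting from state 2,
# the probability of still being there after many steps is 1.
print(np.linalg.matrix_power(P, 100)[2, 2])  # 1.0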

S

What is the main goal of a Markov chain?

The main goal of a Markov chain is to calculate the probability that a system occupies a given state S_i when n is very large. This probability is called the limiting probability, and it may converge to steady-state values that are independent of the initial state.
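
A minimal sketch of computing these limiting probabilities for a small regular chain (hypothetical numbers; solves πP = π together with the normalization Σπ = 1 by least squares):

import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

n = P.shape[0]
# pi P = pi  is equivalent to  (P.T - I) pi = 0, plus the constraint sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # limiting probabilities, independent of the starting state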

S

What does the steady-state behavior of a Markov chain mean?

The steady-state behavior of a Markov chain means that the chain does not stop changing, but that enough time has elapsed so that the state probabilities no longer change with respect to time.

S

What is brand switching?

Brand switching is the process of choosing to switch from routine use of one brand or product to steady use of a different but similar brand or product.
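
As a toy illustration (hypothetical numbers), brand switching is often modelled as a Markov chain whose states are brands and whose transition probabilities give the chance of switching from one purchase occasion to the next; the long-run market shares are then the steady-state probabilities:

import numpy as np

# Hypothetical two-brand switching matrix: rows/columns = [Brand A, Brand B];
# entry [i, j] is the probability that a user of brand i buys brand j next time.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# Long-run market shares: iterate the chain from any initial split.
share = np.array([1.0, 0.0])
for _ in range(1000):
    share = share @ P
print(share)  # approximately [0.6, 0.4]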

S

Who was the novelist who inspired A. A. Markov to develop the Markov chain?

Alexander Pushkin