Q.1 What is a Markov process primarily used to model?
Random variables with memory
Processes with no memory
Deterministic events
Non-random trends
Explanation - Markov processes are memoryless, meaning the future state depends only on the present state, not on past states.
Correct answer is: Processes with no memory
Q.2 The property that future states depend only on the current state is called?
Independence property
Stationarity property
Markov property
Transition property
Explanation - The Markov property defines that the probability of moving to the next state depends only on the current state.
Correct answer is: Markov property
Q.3 In Markov analysis, probabilities of moving from one state to another are called?
Stationary distributions
Transition probabilities
Conditional means
Limit values
Explanation - Transition probabilities specify the likelihood of moving from one state to another in a Markov chain.
Correct answer is: Transition probabilities
Q.4 What is the mathematical representation of a Markov chain?
Differential equation
Transition probability matrix
Decision tree
Linear regression model
Explanation - A Markov chain is usually represented by a transition probability matrix, where each entry indicates the probability of transitioning between states.
Correct answer is: Transition probability matrix
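For illustration, here is a minimal sketch in Python (using NumPy) of a transition probability matrix with hypothetical values; entry P[i, j] gives the probability of moving from state i to state j in one step.

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "Sunny", state 1 = "Rainy".
# Row i lists the probabilities of moving from state i to each state.
P = np.array([
    [0.9, 0.1],  # from Sunny: stay Sunny with 0.9, switch to Rainy with 0.1
    [0.5, 0.5],  # from Rainy: switch to Sunny with 0.5, stay Rainy with 0.5
])

print(P[0, 1])  # P(next = Rainy | current = Sunny) -> 0.1
```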
Q.5 If the sum of each row in a transition matrix is not equal to 1, what does it imply?
Invalid Markov chain
Steady state exists
Process is reversible
It is a continuous process
Explanation - For a valid transition matrix, the sum of probabilities in each row must equal 1.
Correct answer is: Invalid Markov chain
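This condition is easy to check programmatically. A minimal sketch with hypothetical matrices (the function name is illustrative):

```python
import numpy as np

def is_valid_transition_matrix(P, tol=1e-9):
    # Valid (row-)stochastic matrix: non-negative entries, each row sums to 1.
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_valid_transition_matrix([[0.9, 0.1], [0.5, 0.5]]))  # True
print(is_valid_transition_matrix([[0.9, 0.2], [0.5, 0.5]]))  # False: row 0 sums to 1.1
```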
Q.6 A Markov chain that eventually reaches the same distribution regardless of the initial state is said to have?
Ergodic property
Absorbing property
Reversible property
Memory property
Explanation - An ergodic Markov chain converges to the same stationary distribution, independent of the starting state.
Correct answer is: Ergodic property
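A small numerical sketch of this convergence, reusing the hypothetical two-state matrix from above: two opposite starting distributions end up at (almost) the same distribution after many steps.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

P_many = np.linalg.matrix_power(P, 50)  # 50-step transition probabilities

# Two opposite starting points converge to the same distribution.
print(np.array([1.0, 0.0]) @ P_many)  # approx [0.8333, 0.1667]
print(np.array([0.0, 1.0]) @ P_many)  # approx [0.8333, 0.1667]
```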
Q.7 What does an absorbing state in a Markov chain mean?
It has zero probability
Once entered, cannot be left
It changes after each step
It cycles back to itself only once
Explanation - An absorbing state is one that, once entered, cannot transition to another state.
Correct answer is: Once entered, cannot be left
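In matrix terms, state i is absorbing when P[i, i] = 1. A minimal sketch with a hypothetical three-state chain (the helper name is illustrative):

```python
import numpy as np

def absorbing_states(P):
    # Absorbing states have a self-transition probability of 1:
    # once entered, the chain never leaves.
    P = np.asarray(P, dtype=float)
    return [i for i in range(len(P)) if np.isclose(P[i, i], 1.0)]

P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])  # state 2 is absorbing

print(absorbing_states(P))  # [2]
```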
Q.8 Which condition ensures a Markov chain has a stationary distribution?
It must be absorbing
It must be ergodic
It must be reducible
It must be transient
Explanation - An ergodic chain guarantees convergence to a unique stationary distribution.
Correct answer is: It must be ergodic
Q.9 In commerce, Markov analysis is commonly used to model?
Consumer brand switching
Compound interest
Market equilibrium
Linear costs
Explanation - Markov analysis is often used in marketing to model brand loyalty and switching patterns among consumers.
Correct answer is: Consumer brand switching
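A brand-switching sketch with hypothetical numbers: multiplying the current market shares by the transition matrix gives next period's shares.

```python
import numpy as np

# Hypothetical switching matrix for brands A and B.
# Row = brand used this period, column = brand used next period.
P = np.array([[0.8, 0.2],   # A-users: 80% stay, 20% switch to B
              [0.3, 0.7]])  # B-users: 30% switch to A, 70% stay

shares = np.array([0.5, 0.5])  # current market shares of A and B
print(shares @ P)              # next period's shares: [0.55, 0.45]
```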
Q.10 What type of matrix is a transition probability matrix?
Stochastic matrix
Symmetric matrix
Deterministic matrix
Diagonal matrix
Explanation - A transition matrix is stochastic, as each row sums to 1 and all entries are non-negative.
Correct answer is: Stochastic matrix
Q.11 If state A always leads to state B with probability 1, what does it imply?
Absorbing property
Deterministic transition
Random behavior
Stationary distribution
Explanation - A transition probability of 1 indicates a deterministic move from A to B.
Correct answer is: Deterministic transition
Q.12 The long-run behavior of a Markov chain is studied using?
Immediate transitions
Stationary distribution
Absorbing states
Initial probabilities
Explanation - The stationary distribution describes the long-run probabilities of being in different states.
Correct answer is: Stationary distribution
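One common way to compute the stationary distribution is to solve pi @ P = pi with the entries of pi summing to 1, e.g. via the left eigenvector of P for eigenvalue 1. A sketch under that approach (the helper name is illustrative):

```python
import numpy as np

def stationary_distribution(P):
    # Solve pi @ P = pi, sum(pi) = 1, using the left eigenvector
    # of P associated with eigenvalue 1.
    P = np.asarray(P, dtype=float)
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(stationary_distribution(P))  # approx [0.8333, 0.1667]
```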
Q.13 If a transition matrix has multiple absorbing states, what can happen?
System oscillates
Process ends in one absorbing state
All states are transient
No steady state exists
Explanation - When multiple absorbing states exist, the process eventually gets absorbed into one of them.
Correct answer is: Process ends in one absorbing state
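Which absorbing state the process ends in is itself probabilistic. One standard way to compute the absorption probabilities uses the fundamental matrix N = (I - Q)^-1, sketched here for a hypothetical chain written with its transient state first:

```python
import numpy as np

# Hypothetical chain: state 0 is transient; states 1 and 2 are absorbing.
P = np.array([[0.5, 0.3, 0.2],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

Q = P[:1, :1]                     # transient -> transient block
R = P[:1, 1:]                     # transient -> absorbing block
N = np.linalg.inv(np.eye(1) - Q)  # fundamental matrix
print(N @ R)                      # absorption probabilities: [[0.6, 0.4]]
```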
Q.14 Which of the following is NOT a property of Markov chains?
Memoryless
Probabilistic transitions
Deterministic outcomes
Stochastic nature
Explanation - Markov chains are inherently probabilistic, not deterministic.
Correct answer is: Deterministic outcomes
Q.15 What does 'reducible' mean in a Markov chain?
Some states cannot be reached from others
It has only one absorbing state
It has a stationary distribution
It is always ergodic
Explanation - A reducible chain has states that cannot be reached from some other states.
Correct answer is: Some states cannot be reached from others
Q.16 What is meant by a transient state?
A state that is visited infinitely often
A state not revisited once left
A permanent absorbing state
A deterministic state
Explanation - A transient state may be visited early in the process, but it is eventually left for good and never revisited.
Correct answer is: A state not revisited once left
Q.17 If the probability of remaining in the same state is 1, the state is?
Transient
Recurrent
Absorbing
Ergodic
Explanation - An absorbing state has a self-transition probability of 1.
Correct answer is: Absorbing
Q.18 In brand-switching analysis, the rows of a transition matrix represent?
Probabilities of future states
Initial conditions
Current states (brands currently used)
Long-run outcomes
Explanation - Each row represents the current brand a consumer is using and the probabilities of moving to other brands.
Correct answer is: Current states (brands currently used)
Q.19 Which matrix operation is used to find probabilities after multiple steps?
Inverse matrix
Matrix multiplication (powers)
Determinant
Row operations
Explanation - By raising the transition matrix to higher powers, we can compute probabilities for multiple-step transitions.
Correct answer is: Matrix multiplication (powers)
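A sketch of the n-step computation: the (i, j) entry of P raised to the n-th power gives the probability of reaching state j in n steps, starting from state i.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

P3 = np.linalg.matrix_power(P, 3)  # 3-step transition probabilities
print(P3[0, 1])  # P(in state 1 after 3 steps | start in state 0) -> approx 0.156
```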
Q.20 What does it mean if a Markov chain is aperiodic?
It has a fixed cycle
It does not follow cycles
It has only absorbing states
It is reducible
Explanation - In an aperiodic chain, returns to a state are not restricted to multiples of some fixed period, so the chain does not get trapped in cycles of fixed length.
Correct answer is: It does not follow cycles
Q.21 The limiting probabilities of a Markov chain are also known as?
Absorbing probabilities
Stationary distribution
Transient values
Initial conditions
Explanation - The stationary distribution gives the long-term probabilities of being in each state.
Correct answer is: Stationary distribution
Q.22 What is a key assumption in applying Markov analysis to business problems?
Future depends on entire history
Future depends only on present
Probabilities are unknown
Probabilities change randomly
Explanation - Markov analysis assumes the next step depends only on the current state, not on the full history.
Correct answer is: Future depends only on present
Q.23 What is meant by 'steady state' in Markov chains?
Equal probability of all states
Probabilities no longer change
Absorbing condition
Only one state exists
Explanation - In steady state, the distribution of probabilities remains constant over time.
Correct answer is: Probabilities no longer change
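The definition suggests a direct computation: keep applying the transition matrix until the distribution stops changing. A minimal sketch (function name and tolerance are illustrative):

```python
import numpy as np

def steady_state(P, start, tol=1e-10, max_steps=10_000):
    # Apply the transition matrix repeatedly until the distribution
    # stops changing -- that unchanged distribution is the steady state.
    dist = np.asarray(start, dtype=float)
    for _ in range(max_steps):
        nxt = dist @ P
        if np.allclose(nxt, dist, atol=tol):
            return nxt
        dist = nxt
    return dist

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(steady_state(P, [0.5, 0.5]))  # approx [0.8333, 0.1667]
```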
Q.24 Which business example suits Markov analysis best?
Estimating future cash flows
Customer migration between services
Fixed cost allocation
Interest compounding
Explanation - Markov chains model probabilities of movement between states, such as customers switching services.
Correct answer is: Customer migration between services
Q.25 What ensures the existence of a unique stationary distribution?
Irreducibility and aperiodicity
Reducibility and periodicity
Absorbing property
Symmetry of matrix
Explanation - A finite Markov chain that is irreducible and aperiodic has a unique stationary distribution, and it converges to that distribution from any starting state.
Correct answer is: Irreducibility and aperiodicity
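For finite chains there is a handy sufficient test: if some power of P has all entries strictly positive (a "regular" chain), the chain is irreducible and aperiodic. A sketch using Wielandt's bound of (n-1)^2 + 1 on how many powers need checking:

```python
import numpy as np

def is_regular(P):
    # A finite chain is regular (hence irreducible and aperiodic)
    # if some power of P is strictly positive; by Wielandt's bound it
    # suffices to check powers up to (n-1)**2 + 1.
    P = np.asarray(P, dtype=float)
    n = len(P)
    Pk = np.eye(n)
    for _ in range((n - 1) ** 2 + 1):
        Pk = Pk @ P
        if np.all(Pk > 0):
            return True
    return False

print(is_regular(np.array([[0.9, 0.1], [0.5, 0.5]])))  # True
print(is_regular(np.array([[0.0, 1.0], [1.0, 0.0]])))  # False: period 2
```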
