# Optimization Techniques: MCQ Practice Set

Q.1 Which of the following is the main goal of optimization techniques?

To maximize or minimize an objective function
To create random solutions
To analyze data without decision making
To compute only averages
Explanation - Optimization focuses on finding the best solution by either maximizing or minimizing an objective function subject to constraints.
Correct answer is: To maximize or minimize an objective function

Q.2 Linear Programming problems assume that:

Objective function is quadratic
Variables are continuous and linear
Constraints are always nonlinear
Solutions are probabilistic
Explanation - Linear Programming (LP) deals with linear objective functions and linear constraints involving continuous variables.
Correct answer is: Variables are continuous and linear

Q.3 The feasible region in Linear Programming is always:

Non-convex
Convex
Discrete
Unbounded only
Explanation - In LP, the feasible region formed by linear constraints is always a convex polyhedron.
Correct answer is: Convex

Q.4 The Simplex method is used to solve:

Nonlinear programming problems
Linear programming problems
Integer programming problems
Graph theory problems
Explanation - The simplex method is a systematic procedure for solving linear programming problems by moving between vertices of the feasible region until the optimum is reached.
Correct answer is: Linear programming problems
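As a quick sketch of solving an LP in practice (assumes SciPy is installed; the problem data below is illustrative, not from the question):

```python
# SciPy's linprog with method="highs" runs the HiGHS solver, which includes a
# modern simplex implementation. Example LP:
#   maximize 3*x + 2*y  subject to  x + y <= 4,  x <= 2,  x, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
from scipy.optimize import linprog

res = linprog(c=[-3, -2],
              A_ub=[[1, 1], [1, 0]],
              b_ub=[4, 2],
              bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, -res.fun)  # optimum at x = 2, y = 2 with objective value 10
```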

Q.5 Which of the following is NOT an optimization technique?

Simplex method
Gradient descent
Newton's method
Binary search
Explanation - Binary search is an algorithm for locating an element in sorted data; it is a search technique, not an optimization technique.
Correct answer is: Binary search

Q.6 In optimization, 'constraints' refer to:

Random numbers used in function
Conditions that must be satisfied by the solution
Approximation methods
Error tolerance levels
Explanation - Constraints define the feasible region by setting limits within which the solution must lie.
Correct answer is: Conditions that must be satisfied by the solution

Q.7 Integer Programming problems are harder than Linear Programming because:

They involve only linear equations
They restrict variables to discrete values
They cannot be solved by any algorithm
They always have infinite solutions
Explanation - Restricting variables to integers makes the solution space non-convex, increasing complexity.
Correct answer is: They restrict variables to discrete values

Q.8 The Kuhn–Tucker conditions are associated with:

Unconstrained optimization
Linear programming
Nonlinear programming with inequality constraints
Graph coloring
Explanation - The Karush–Kuhn–Tucker (KKT) conditions provide necessary conditions for optimality in nonlinear programming problems with inequality constraints.
Correct answer is: Nonlinear programming with inequality constraints

Q.9 In gradient descent, the step size is also called:

Learning rate
Slack variable
Dual variable
Feasible solution
Explanation - The step size in gradient descent is called the learning rate; it controls how far each update moves along the negative gradient, and hence how fast (or whether) the algorithm converges.
Correct answer is: Learning rate
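A minimal gradient descent sketch (the function and learning rate below are illustrative):

```python
# Gradient descent on f(x) = (x - 3)**2, whose minimum is at x = 3.
# The learning rate (step size) scales each move along the negative gradient.
def gradient_descent(grad, x0, learning_rate=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # step against the gradient
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)  # converges toward 3
```

Too large a learning rate makes the iterates overshoot and diverge; too small a rate makes convergence needlessly slow.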

Q.10 Dynamic Programming is best suited for problems with:

Independent subproblems
Overlapping subproblems and optimal substructure
Random decision-making
Nonlinear constraints only
Explanation - Dynamic Programming works well when a problem can be broken into overlapping subproblems with optimal substructure.
Correct answer is: Overlapping subproblems and optimal substructure
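Both ingredients can be seen in a classic toy example (Fibonacci, used here purely for illustration):

```python
# fib(n) = fib(n-1) + fib(n-2) has optimal substructure, and the naive
# recursion revisits the same subproblems many times (overlap). Caching each
# subproblem's result (memoization) drops the cost from exponential to linear.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```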

Q.11 Which algorithm is widely used for unconstrained nonlinear optimization?

Simplex method
Newton’s method
Hungarian method
Ford-Fulkerson method
Explanation - Newton’s method is effective for unconstrained nonlinear optimization because it uses second-order derivative (curvature) information.
Correct answer is: Newton’s method
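A one-dimensional sketch of the method (the example function is illustrative):

```python
# Newton's method for unconstrained minimization in 1-D: iterate
#   x <- x - f'(x) / f''(x),
# i.e. a first derivative divided by a second derivative (curvature).
# Example: f(x) = x**4 - 3*x**2 has a local minimum at x = sqrt(3/2).
def newton_minimize(d1, d2, x0, steps=50):
    x = x0
    for _ in range(steps):
        x -= d1(x) / d2(x)  # Newton step
    return x

x_star = newton_minimize(lambda x: 4 * x**3 - 6 * x,   # f'(x)
                         lambda x: 12 * x**2 - 6,      # f''(x)
                         x0=2.0)
```

Near the solution the method converges quadratically, much faster than gradient descent, at the cost of evaluating second derivatives.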

Q.12 The dual problem in Linear Programming provides:

The same solution as the primal
No useful information
An upper/lower bound on the primal solution
Exact solution only for integers
Explanation - Duality provides bounds and useful insights into the optimality of primal solutions.
Correct answer is: An upper/lower bound on the primal solution

Q.13 Which method is suitable for solving large-scale Linear Programming problems?

Simplex method
Interior point method
Dynamic programming
Branch and bound
Explanation - Interior point methods are efficient for large-scale LP problems compared to the simplex method, since their iteration count grows slowly with problem size.
Correct answer is: Interior point method

Q.14 Slack variables in Linear Programming are introduced to:

Convert inequalities into equalities
Reduce dimensions
Find integer solutions
Change convexity of region
Explanation - Slack variables are added to constraints to convert inequalities into equalities, the standard form required by the simplex method.
Correct answer is: Convert inequalities into equalities
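A tiny numeric illustration (the constraint coefficients are made up):

```python
# Constraint: 2*x1 + 3*x2 <= 12. Adding a slack variable s turns it into the
# equality 2*x1 + 3*x2 + s = 12 with s >= 0; s measures the constraint's
# unused capacity at a given point.
def slack(x1, x2):
    return 12 - (2 * x1 + 3 * x2)  # the slack s; the point is feasible iff s >= 0

s_interior = slack(1, 2)  # positive slack: constraint is not binding here
s_binding = slack(3, 2)   # zero slack: constraint is active at this point
```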

Q.15 Branch and Bound method is typically used for:

Linear Programming
Integer Programming
Unconstrained optimization
Graph traversal
Explanation - Branch and Bound is an effective technique to solve integer programming problems.
Correct answer is: Integer Programming
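A compact sketch of the idea on the 0/1 knapsack problem, pruning branches with the fractional (LP-relaxation) bound; the instance data is the classic textbook example, used here only for illustration:

```python
# Branch and Bound for 0/1 knapsack: branch on take/skip decisions, and prune
# any branch whose optimistic bound cannot beat the best solution found so far.
def knapsack_bb(values, weights, capacity):
    # Sort items by value density so the fractional bound is easy to compute.
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic bound: fill the remaining room fractionally (LP relaxation).
        b = value
        for v, w in items[i:]:
            if w <= room:
                room -= w
                b += v
            else:
                b += v * room / w
                break
        return b

    def dfs(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return  # prune: this branch cannot beat the incumbent
        v, w = items[i]
        if w <= room:
            dfs(i + 1, value + v, room - w)  # branch: take item i
        dfs(i + 1, value, room)              # branch: skip item i

    dfs(0, 0, capacity)
    return best

opt = knapsack_bb(values=[60, 100, 120], weights=[10, 20, 30], capacity=50)
```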

Q.16 The Lagrangian function is used in optimization to:

Eliminate constraints
Transform constrained problems into unconstrained problems
Randomize solutions
Approximate nonlinear functions
Explanation - The Lagrangian function incorporates the constraints into the objective using Lagrange multipliers.
Correct answer is: Transform constrained problems into unconstrained problems
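A worked sketch on an illustrative problem (not from the question): minimizing f(x, y) = x^2 + y^2 subject to x + y = 1 via the Lagrangian.

```latex
% Problem: minimize f(x,y) = x^2 + y^2  subject to  g(x,y) = x + y - 1 = 0
\mathcal{L}(x, y, \lambda) = x^2 + y^2 + \lambda\,(x + y - 1)
% Setting the partial derivatives to zero (stationarity):
\frac{\partial \mathcal{L}}{\partial x} = 2x + \lambda = 0, \qquad
\frac{\partial \mathcal{L}}{\partial y} = 2y + \lambda = 0, \qquad
\frac{\partial \mathcal{L}}{\partial \lambda} = x + y - 1 = 0
% Solving gives x = y = 1/2 with \lambda = -1.
```

The constrained problem is thus replaced by finding a stationary point of the unconstrained function L.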

Q.17 Convex functions are important in optimization because:

They always diverge
Local minima are also global minima
They cannot be optimized
They require random search
Explanation - For convex functions, any local minimum is guaranteed to be a global minimum, simplifying optimization.
Correct answer is: Local minima are also global minima

Q.18 The Hungarian method is used for:

Assignment problems
Nonlinear programming
Graph coloring
Sorting arrays
Explanation - The Hungarian method solves assignment problems optimally in polynomial time.
Correct answer is: Assignment problems
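In practice one can use SciPy's assignment-problem solver, `linear_sum_assignment`, a variant of this approach (assumes SciPy and NumPy are installed; the cost matrix is illustrative):

```python
# cost[i][j] is the cost of assigning worker i to task j (made-up numbers).
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
row_ind, col_ind = linear_sum_assignment(cost)  # optimal worker->task pairing
total = cost[row_ind, col_ind].sum()            # minimum total assignment cost
```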

Q.19 Penalty methods in optimization are used to:

Handle constraints by adding penalty terms to the objective
Reduce dimensionality
Randomize gradient direction
Improve duality gap
Explanation - Penalty methods incorporate constraints into the objective using penalty terms for violations.
Correct answer is: Handle constraints by adding penalty terms to the objective
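A quadratic-penalty sketch on an illustrative problem (minimize x^2 subject to x >= 1, whose constrained optimum is x = 1):

```python
# Penalty method: add mu * max(0, 1 - x)**2 to the objective and minimize the
# penalized function without constraints. As mu grows, the minimizer of the
# penalized problem approaches the constrained optimum x = 1.
def minimize_penalized(mu, x=0.0, lr=0.004, steps=5000):
    for _ in range(steps):
        grad = 2 * x - 2 * mu * max(0.0, 1 - x)  # gradient of penalized objective
        x -= lr * grad                            # plain gradient descent
    return x

x_pen = minimize_penalized(mu=100.0)  # close to, but just short of, x = 1
```

With a finite penalty weight the solution slightly violates or undershoots the constraint (here x = mu / (1 + mu)); penalty methods therefore re-solve with increasing mu.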

Q.20 In convex optimization, the feasible region is always:

Convex
Concave
Discrete
Irregular
Explanation - Convex optimization problems require both objective function and feasible region to be convex.
Correct answer is: Convex

Q.21 Simulated Annealing is inspired by:

Neural networks
Thermal annealing process in metallurgy
Random walk theory
Branch and bound
Explanation - Simulated Annealing uses probabilistic techniques inspired by annealing in metallurgy to escape local minima.
Correct answer is: Thermal annealing process in metallurgy

Q.22 Which optimization method can escape local minima?

Gradient descent
Newton's method
Simulated Annealing
Simplex method
Explanation - Simulated Annealing can probabilistically escape local minima by accepting worse solutions with some probability.
Correct answer is: Simulated Annealing
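A minimal sketch on an illustrative multimodal function (all parameters are made up, not tuned):

```python
# Simulated annealing on f(x) = x**2 + 10*sin(x), which has several local
# minima. Starting in a poor basin (x = 4), the temperature schedule lets the
# search accept uphill moves early and settle down as it cools.
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.999, steps=5000, seed=0):
    rng = random.Random(seed)
    x, best, temp = x0, x0, t0
    for _ in range(steps):
        cand = x + rng.gauss(0, 0.5)                 # random neighbour
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand                                  # accept (possibly uphill) move
        if f(x) < f(best):
            best = x
        temp *= cooling                               # cool down
    return best

f = lambda x: x**2 + 10 * math.sin(x)
best_x = simulated_annealing(f, x0=4.0)  # escapes the starting basin
```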

Q.23 The duality gap in optimization refers to:

Difference between primal and dual objective values
Slack variable values
Learning rate size
Gradient approximation error
Explanation - The duality gap measures the difference between the primal and dual objective values; for convex problems satisfying strong duality, the gap is zero at optimality.
Correct answer is: Difference between primal and dual objective values

Q.24 Which of the following is a metaheuristic optimization algorithm?

Simplex method
Genetic Algorithm
Interior point method
Dynamic programming
Explanation - A genetic algorithm is a metaheuristic inspired by natural selection, used to solve complex optimization problems.
Correct answer is: Genetic Algorithm
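A minimal genetic-algorithm sketch on the OneMax toy problem (maximize the number of 1-bits in a string); the parameters are illustrative, not tuned:

```python
# Tournament selection, one-point crossover, and bit-flip mutation: the three
# classic GA operators, evolving a population toward the all-ones string.
import random

def genetic_algorithm(n_bits=20, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    fitness = sum  # OneMax fitness: the number of 1s in the individual
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = max(rng.sample(pop, 3), key=fitness)  # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_bits):                    # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm()  # should be at or near the all-ones string
```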

Q.25 In Newton's method, the Hessian matrix represents:

First derivatives
Second derivatives
Random updates
Constraints
Explanation - The Hessian matrix contains second-order partial derivatives, crucial in Newton’s method.
Correct answer is: Second derivatives