Analysis and Design of Algorithm Final
INTERNAL ASSIGNMENT
SET I
Q. No1
(a). What are the properties of an algorithm? Explain branch and bound algorithm
with an example
This approach finds the optimal answer faster than brute-force methods by avoiding unnecessary computations.
5. Optimize if needed
Use memoization or bottom-up DP to reduce redundant computations.
Example: optimizing Fibonacci reduces time complexity from O(2ⁿ) to O(n).
By following this systematic approach, the efficiency of recursive algorithms can be effectively analyzed.
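The Fibonacci optimization mentioned above can be sketched with a simple memoization cache (a minimal sketch using `functools.lru_cache`; the function name is illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each value of n is computed only once, so the naive O(2^n)
    # recursion drops to O(n) time.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```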
Q. No2 Differentiate between bottom-up and top-down heap construction with example.
For large-scale applications like priority queues and sorting algorithms, bottom-up construction is preferred because of its efficiency.
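As a hedged sketch of the bottom-up (Floyd's) method: sift down each internal node, starting from the last one, which builds a max-heap in O(n) overall (the helper names here are illustrative):

```python
def sift_down(a, i, n):
    # Push a[i] down until the max-heap property holds below it.
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def build_heap_bottom_up(a):
    # Start from the last internal node and work back to the root.
    for i in range(len(a) // 2 - 1, -1, -1):
        sift_down(a, i, len(a))
    return a

heap = build_heap_bottom_up([2, 9, 7, 6, 5, 8])
print(heap[0])  # the largest element ends up at the root
```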
Q. No3
(a). How is Divide and Conquer a better method for sorting?
4. Parallelizability
Merge sort can be effectively parallelized, since sorting is carried out independently on subarrays before merging.
Conclusion
Divide and conquer provides better time complexity, scalability, and efficiency, making it a superior approach for sorting large datasets compared to simpler strategies.
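The divide-and-conquer structure described above can be illustrated with a minimal merge sort sketch (each half could, in principle, be sorted in parallel before the merge):

```python
def merge_sort(a):
    # Divide: split the list in half.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Conquer: merge the two sorted halves in O(n).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```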
(b). What is the best, worst and average cases in an insertion sort?
Example:
[5,4,3,2,1]
The first element is already sorted.
The second element moves one step left.
The third element moves two steps left, and so on.
This results in approximately n²/2 comparisons and swaps.
3. Average Case: O(n²)
Situation: The elements are randomly ordered.
Reason: On average, each element is inserted halfway into the sorted portion, leading to O(n²) comparisons and swaps.
Conclusion
Best Case: O(n) (already sorted)
Worst Case: O(n²) (reverse sorted)
Average Case: O(n²) (random order)
Consequently, insertion sort is efficient for nearly sorted data but slow for large, unsorted inputs.
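A minimal sketch illustrating the case analysis above, with a comparison counter added purely for demonstration (the counter is not part of the standard algorithm):

```python
def insertion_sort(a):
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            comparisons += 1
            a[j + 1] = a[j]
            j -= 1
        if j >= 0:
            comparisons += 1  # the final comparison that stops the shifting
        a[j + 1] = key
    return a, comparisons

print(insertion_sort([1, 2, 3, 4, 5])[1])  # best case: n - 1 = 4 comparisons
print(insertion_sort([5, 4, 3, 2, 1])[1])  # worst case: n(n-1)/2 = 10 comparisons
```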
Q. No4 Explain the algorithm to solve the Knapsack problem using the dynamic
programming method.
1. Define the State
Let dp[i][w] represent the maximum value that can be obtained using the first i items with weight capacity w.
2. Recurrence Relation
dp[i][j] = dp[i−1][j]                                 if w[i] > j (item i cannot be included)
dp[i][j] = max(dp[i−1][j], v[i] + dp[i−1][j−w[i]])    otherwise
Here:
If the item's weight exceeds j, we cannot include it.
Otherwise, we decide whether to include it or not based on maximum value gain.
3. Base Case
A 2D table dp[n+1][W+1] is initialized, where:
dp[i][0] = 0 (no value if knapsack capacity is 0)
dp[0][j] = 0 (no value if there are no items)
We fill the table using the recursive relation, iterating over all items and capacities.
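The table-filling step above can be sketched as follows (a minimal 0/1 knapsack; variable names and the test values are illustrative):

```python
def knapsack(values, weights, W):
    n = len(values)
    # dp[i][j]: max value using the first i items with capacity j.
    dp = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(W + 1):
            if weights[i - 1] > j:  # item i cannot be included
                dp[i][j] = dp[i - 1][j]
            else:                   # include it or not, whichever gives more value
                dp[i][j] = max(dp[i - 1][j],
                               values[i - 1] + dp[i - 1][j - weights[i - 1]])
    return dp[n][W]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220
```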
However, direct computation using factorials can result in overflow issues and inefficiency due to redundant calculations. Instead, we use dynamic programming (DP) to compute binomial coefficients efficiently.
Recurrence (Pascal's identity): C(n, k) = C(n−1, k−1) + C(n−1, k).
If item n is included, we choose the remaining k−1 items from the other n−1 elements; if it is not included, we choose all k items from those n−1 elements.
Base cases:
C(n, 0) = C(n, n) = 1
These mirror the facts that:
Selecting zero items from any set always has exactly 1 way (choosing nothing), and choosing all n items from an n-element set also has exactly 1 way.
Implementation:
def binomial_coefficient(n, k):
    dp = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(min(i, k) + 1):  # j can never be greater than i
            if j == 0 or j == i:
                dp[i][j] = 1  # Base cases
            else:
                dp[i][j] = dp[i-1][j-1] + dp[i-1][j]  # Pascal's identity
    return dp[n][k]

# Example usage:
print(binomial_coefficient(5, 2))  # Output: 10
Time and Space Complexity
Time Complexity: O(nk), since each value is computed once.
Space Complexity: O(nk), due to the 2D table.
Implementation
def binomial_coefficient_optimized(n, k):
    dp = [0] * (k + 1)
    dp[0] = 1  # Base case: C(n, 0) = 1
    for i in range(1, n + 1):
        for j in range(min(i, k), 0, -1):  # update from right to left
            dp[j] += dp[j - 1]
    return dp[k]

# Example usage:
print(binomial_coefficient_optimized(5, 2))  # Output: 10
Conclusion
Using dynamic programming, we avoid redundant calculations, making binomial coefficient computation efficient. The optimized 1D DP approach further improves space usage, making it ideal for large inputs.
Q No.6
(a). Describe greedy choice property
Conclusion
If a problem satisfies the greedy choice property along with optimal substructure, a greedy algorithm can efficiently find the optimal solution without the need for backtracking or dynamic programming.
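As an illustration of the greedy choice property, activity selection is a standard example: picking the activity that finishes earliest is provably part of some optimal solution, so no backtracking is needed (a minimal sketch; the interval data is made up for demonstration):

```python
def select_activities(intervals):
    # Greedy choice: always pick the compatible activity that
    # finishes earliest; this choice never needs to be revisited.
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(intervals, key=lambda x: x[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (8, 9), (5, 9)]
print(len(select_activities(acts)))  # 3 non-overlapping activities
```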
(b). Explain the sorting problem with the help of a decision tree
This shows that any comparison-based sorting algorithm (like merge sort, quicksort, or heapsort) requires at least Ω(n log n) comparisons in the worst case.
3. Example: Sorting 3 Elements {A, B, C}
The root compares A and B.
Depending on the result, it then compares B and C or A and C.
The tree has 3! = 6 leaves, confirming the Ω(n log n) bound.
Conclusion
Decision trees prove that no comparison-based sorting algorithm can be faster than Ω(n log n) in the worst case, making them a fundamental concept in sorting theory.
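The leaf-counting argument can be checked numerically: a decision tree with n! leaves must have height at least ⌈log₂(n!)⌉, which gives the minimum number of comparisons (a minimal sketch):

```python
import math

def min_comparisons(n):
    # A binary decision tree with n! leaves has height >= ceil(log2(n!)).
    return math.ceil(math.log2(math.factorial(n)))

print(min_comparisons(3))  # at least 3 comparisons are needed for 3 elements
```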