Debugging, Problem Solving & Complexity Calculation
# Supported features
The following features are supported:
Launch/Attach - You can either launch the Java project within VS Code or attach to any running JVM process in debug mode, locally or remotely.
Breakpoints - Conditional breakpoints and Hit Count breakpoints are supported and can easily be set using the inline breakpoint settings window. This allows you to conveniently add conditional breakpoints to your code, directly in the source viewer, without requiring a modal window. Break on exceptions is also supported.
Control flow - Including Pause, Continue (F5), Step over (F10), Step into (F11), and Step out (Shift+F11).
Data inspection - When you're stopped at a breakpoint, the debugger has access to the variable names and values that are currently stored in memory. Inspect/Watch/Set Variables are supported.
Diagnostics - The CALL STACK panel shows the call stack of your program and allows you to navigate through the call path. Multi-threaded debugging is supported via parallel stacks.
Debug Console - The Debug Console lets you see information from both stdout and stderr.
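As a sketch of the Launch/Attach configurations described above, a minimal launch.json for the Debugger for Java extension might look as follows (the main class name, host, and port are placeholder values):

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "java",
            "name": "Launch Main",
            "request": "launch",
            "mainClass": "com.example.Main"
        },
        {
            "type": "java",
            "name": "Attach to remote JVM",
            "request": "attach",
            "hostName": "localhost",
            "port": 5005
        }
    ]
}
```

The "launch" entry starts the project inside VS Code, while the "attach" entry connects to an already-running JVM started in debug mode.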
Introduction - Problem Solving, Debugging
Space Complexity: O(1), since the algorithm uses only a fixed amount of extra space regardless of the input size. This approach does, however, use an extra temporary variable to preserve the values; that space can be optimised away in the later approaches.
Method 2: Using Arithmetic Operators + and −
Space Complexity: O(1), since the algorithm uses only a fixed amount of extra space regardless of the input size. Unlike the previous approach, no extra temporary variable is needed to preserve the values.
2. Asymptotic Notations
Asymptotic notations describe the efficiency and scalability of algorithms. They provide a high-level understanding of an algorithm's behavior, especially as the input size grows. Asymptotic notations categorize algorithms based on their performance as the input size grows, and they help predict how an algorithm will perform under different conditions.
We mainly use THREE types of asymptotic notations, and those are as follows:
Big-Oh (O) - expresses the worst-case scenario.
Big-Omega (Ω) - expresses the best-case scenario.
Big-Theta (Θ) - expresses both the worst and best cases: f(n) = Θ(g(n)) if ∃ constants c₁, c₂ > 0 and n₀ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀.
Rule 2: Nested loops - the total running time is the product of the sizes of all the loops, since the statements in the innermost loop execute once for every combination of loop indices.
Ex:-
for(i=0;i<n;i++){                    ---- n+1
    for(j=0;j<n;j++){                ---- n*(n+1)
        c[i][j]=a[i][j]+b[i][j];     ---- n*n
    }
}
Total is 2n² + 2n + 1
Ignore the lower exponents & constants,
so the time complexity is O(n²)
Rule 4: if/else - the running time is the maximum of the running times of the if and else blocks.
Ex:-
if(n<0)                  ---- 1
    return n;            ---- 1
else{                    ---- 1
    for(i=0;i<n;i++)     ---- n+1
        s=s+i;           ---- n
    return s;            ---- 1
}
Total is 2n+5
Ignore the constants
So the time complexity is O(n)
Rule 5: Recursion
The time complexity of a recursive function depends on the number of times the function is
called and the time complexity of a single call. The total time complexity is the product of these
values.
Ex:-
void f(int n) {
    if (n == 1)
        return;
    f(n-1);
}
The call f(n) causes n function calls, and the time complexity of each call is O(1). Thus, the total time complexity is O(n).
Ex:-
void g(int n) {
    if (n == 1)
        return;
    g(n-1);
    g(n-1);
}
In this case each function call generates two other calls, except for n = 1. Let us see what happens when g is called with parameter n. The following table shows the function calls produced by this single call:
function call    number of calls
g(n)             1
g(n−1)           2
g(n−2)           4
···              ···
g(1)             2ⁿ⁻¹
Based on this, the time complexity is 1 + 2 + 4 + ··· + 2ⁿ⁻¹ = 2ⁿ − 1 = O(2ⁿ).