Big O notation describes the time complexity of an algorithm: how its running time grows as the input size grows. It is most commonly used to express the worst-case scenario. Common notations include O(1) for constant time, O(n) for linear time, and O(n^2) for quadratic time. Because it ignores constants and lower-order terms, it highlights the factors that dominate runtime for large inputs, which makes it useful for comparing algorithms' efficiency.
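A minimal sketch in Python (not from the original note) illustrating the three growth rates mentioned above; the function names are hypothetical examples chosen for illustration:

```python
def first_element(items):
    """O(1): a single index lookup, regardless of how long the list is."""
    return items[0]

def contains(items, target):
    """O(n): in the worst case, every element is examined once."""
    for item in items:
        if item == target:
            return True
    return False

def has_duplicates(items):
    """O(n^2): in the worst case, every pair of elements is compared."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input size leaves `first_element` unchanged, roughly doubles the work in `contains`, and roughly quadruples the work in `has_duplicates`.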
| To Center    | Best-Case  |
| To Periphery | Worst-Case |
Links