Add asymptotic notation content

pull/2896/head^2
Kamran Ahmed 2 years ago
parent 19ae880d6a
commit 75843e114f
  1. 11
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/100-big-o-notation.md
  2. 8
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/101-big-theta-notation.md
  3. 8
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/102-big-omega-notation.md
  4. 8
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/100-constant.md
  5. 8
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/101-logarithmic.md
  6. 8
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/102-linear.md
  7. 15
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/103-polynomial.md
  8. 17
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/104-exponential.md
  9. 12
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/105-factorial.md
  10. 18
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/103-common-runtimes/readme.md
  11. 17
      content/roadmaps/103-computer-science/content/103-asymptotic-notation/readme.md

@ -1 +1,10 @@
# Big o notation
# Big O Notation
Big O Notation describes how the running time of an algorithm grows as the input size grows. It expresses an upper bound, so it is commonly used to describe the worst-case scenario of an algorithm and to compare algorithms to decide which one scales better.
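As a rough illustration (the helper functions below are made up for this sketch), both functions compute 1 + 2 + … + n; the loop does work proportional to `n`, while the closed-form version does a fixed amount of work, which is exactly the difference Big O captures:
```python
def sum_with_loop(n):
    # Visits every number from 1 to n, so the work grows with n: O(n).
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_with_formula(n):
    # Uses the closed-form formula n(n + 1) / 2: a fixed amount of work, O(1).
    return n * (n + 1) // 2
```
Both return the same value for any `n`, but only the first one gets slower as `n` grows.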
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://archive.org/details/ucberkeley_webcast_VIS4YDpuP98'>CS 61B Lecture 19: Asymptotic Analysis</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=ei-A_wy5Yxw&list=PL1BaGV1cIH4UhkL8a9bJGG356covJ76qN&index=3'>Big Oh Notation (and Omega and Theta)</BadgeLink>

@ -1 +1,7 @@
# Big theta notation
# Big Theta Notation
While Big O Notation gives an upper bound on a function's growth, Big Theta Notation gives a tight bound: the function is bounded both above and below by the same growth rate, so it describes the exact order of growth. It is denoted by the symbol Θ.
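For example (a minimal sketch, not taken from the original content): summing a list visits every element exactly once regardless of its contents, so the running time is bounded above and below by a multiple of `n`, i.e. it is Θ(n):
```python
def total(values):
    # Every element is visited exactly once whatever the values are,
    # so the best case and the worst case both take about n steps: Θ(n).
    result = 0
    for value in values:
        result += value
    return result
```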
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=ei-A_wy5Yxw&list=PL1BaGV1cIH4UhkL8a9bJGG356covJ76qN&index=3'>Big Oh Notation (and Omega and Theta)</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=iOq5kSKqeR4'>Asymptotic Notation - CS50</BadgeLink>

@ -1 +1,7 @@
# Big omega notation
# Big Omega Notation
Big Omega notation describes the lower bound of a function, the counterpart of Big O's upper bound. While Big O is typically used to describe the worst-case scenario of an algorithm, Big Omega is typically associated with its best-case scenario.
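Linear search is a handy illustration (a minimal sketch, not from the original content): in the best case the target is the first element and the search finishes after a constant number of steps (Ω(1)), while in the worst case it scans the whole list (O(n)):
```python
def linear_search(values, target):
    # Best case: target sits at index 0 and we return immediately -> Ω(1).
    # Worst case: target is absent and all n elements are checked -> O(n).
    for index, value in enumerate(values):
        if value == target:
            return index
    return -1
```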
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=ei-A_wy5Yxw&list=PL1BaGV1cIH4UhkL8a9bJGG356covJ76qN&index=3'>Big Oh Notation (and Omega and Theta)</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=iOq5kSKqeR4'>Asymptotic Notation - CS50</BadgeLink>

@ -1 +1,7 @@
# Constant
# Constant
Constant-time algorithms always take the same amount of time to run, regardless of the size of the input. Written O(1), this is the best time complexity an algorithm can have, although only very simple operations achieve it.
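A couple of constant-time operations as a sketch (the function names are made up for this example):
```python
def first_element(values):
    # Indexing a Python list takes the same time whether it holds
    # ten elements or ten million: O(1).
    return values[0]

def is_even(n):
    # A single arithmetic check: O(1).
    return n % 2 == 0
```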
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,7 @@
# Logarithmic
# Logarithmic
Logarithmic algorithms, written O(log n), grow very slowly: each doubling of the input size adds only a constant amount of extra work. They are slower than constant-time algorithms but much faster than linear ones.
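Binary search is the classic logarithmic example: each step halves the remaining range, so a sorted list of `n` elements needs only about log2(n) comparisons (a minimal sketch, assuming the input list is already sorted):
```python
def binary_search(sorted_values, target):
    # The search range is halved on every iteration, so the loop
    # runs at most about log2(n) times: O(log n).
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid
        if sorted_values[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```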
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,7 @@
# Linear
# Linear
Linear algorithms have a runtime that is directly proportional to the size of the input: if the input grows by a factor of 10, the runtime grows by roughly a factor of 10 as well. For example, if an input of size 1 takes one unit of time, an input of size 100 takes about 100 units.
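Finding the largest value in a list is a typical linear-time task, since every element must be examined once (a minimal sketch, assuming a non-empty list):
```python
def find_max(values):
    # One pass over all n elements: O(n).
    largest = values[0]
    for value in values[1:]:
        if value > largest:
            largest = value
    return largest
```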
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,14 @@
# Polynomial
# Polynomial
Polynomial algorithms have a runtime that is a polynomial function of the input size, i.e. of the form `n^k` where `k` is a constant. For example, the runtime of the following algorithm is `n^2`:
```python
def polynomial_algorithm(n):
    # Two nested loops over n items -> n * n iterations: O(n^2).
    for i in range(n):
        for j in range(n):
            print(i, j)
```
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,16 @@
# Exponential
# Exponential
Exponential algorithms are those whose runtime grows at a rate of roughly `2^n`: each time the input size increases by one, the running time doubles. The following function is an example of an exponential algorithm; it computes `2^n` using two recursive calls per step, so the total number of calls doubles each time `n` grows by one:
```python
def exponential(n):
    if n == 0:
        return 1
    # Two recursive calls per level -> about 2^n calls in total: O(2^n).
    return exponential(n - 1) + exponential(n - 1)
```
As you can see, increasing `n` by one doubles the amount of work, so the runtime grows exponentially.
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,11 @@
# Factorial
# Factorial
Factorial complexity algorithms have a runtime of `O(n!)`, one of the fastest-growing complexities in common use; they become impractical even for small inputs and should be avoided whenever possible. The function below computes the value `n!` itself, which is how many steps an `O(n!)` algorithm performs, for example when it has to examine every possible ordering of `n` items:
```python
def factorial(n):
    # Computes the value n! (the number of possible orderings of n items).
    # An O(n!) algorithm performs on the order of this many steps.
    if n == 0:
        return 1
    else:
        return n * factorial(n - 1)
```
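For a procedure whose running time is genuinely `O(n!)` (a minimal sketch, not from the original content), consider brute-force permutation enumeration: printing every ordering of `n` items means visiting all `n!` permutations:
```python
from itertools import permutations

def print_all_orderings(items):
    # There are n! orderings of n items, so this loop runs n! times:
    # O(n!) overall, even though each individual step is cheap.
    for ordering in permutations(items):
        print(ordering)
```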

@ -1 +1,17 @@
# Common runtimes
# Common Runtimes
Given below is a list of common algorithmic runtimes, in ascending order of complexity; a short sketch after the list shows how quickly they diverge even for a small input.
* O(1) - Constant
* O(log n) - Logarithmic
* O(n) - Linear
* O(n log n) - Linearithmic
* O(n^2) - Quadratic
* O(n^3) - Cubic
* O(n^k) - Polynomial (for a constant k)
* O(2^n) - Exponential
* O(n!) - Factorial
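As a quick illustration (the numbers below are computed, not taken from the original content), here is roughly how many operations each runtime implies for an input of size n = 10:
```python
import math

n = 10
runtimes = {
    "O(1)": 1,
    "O(log n)": math.log2(n),
    "O(n)": n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)": n ** 2,
    "O(n^3)": n ** 3,
    "O(2^n)": 2 ** n,
    "O(n!)": math.factorial(n),
}

# Print the approximate operation counts in ascending order of growth.
for name, count in runtimes.items():
    print(f"{name:>10}: {count:,.0f}")
```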
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=V6mKVRU1evU'>Big O Notations</BadgeLink>

@ -1 +1,16 @@
# Asymptotic notation
# Asymptotic Notation
The efficiency of an algorithm depends on the amount of time, storage and other resources it requires. This efficiency is measured and communicated using asymptotic notations.
An algorithm may not perform the same way for different kinds of inputs, and its performance changes as the input size grows.
The study of how an algorithm's performance changes with the order of the input size is called asymptotic analysis.
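A tiny sketch of the idea (the helper below is illustrative, not part of the original content): instead of timing an algorithm, asymptotic analysis looks at how the number of basic steps grows as the input size grows.
```python
def count_comparisons(values, target):
    # Counts how many comparisons a linear search performs, so we can
    # observe how the step count grows with the size of the input.
    comparisons = 0
    for value in values:
        comparisons += 1
        if value == target:
            break
    return comparisons

# Worst case (target absent): the number of steps equals the input size.
for n in (10, 100, 1000):
    print(n, count_comparisons(list(range(n)), -1))
```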
<ResourceGroupTitle>Free Content</ResourceGroupTitle>
<BadgeLink colorScheme='yellow' badgeText='Read' href='https://www.programiz.com/dsa/asymptotic-notations'>Asymptotic Analysis: Big-O Notation and More</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=Z0bH0cMY0E8'>Big O Notation — Calculating Time Complexity</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=__vX2sjlpXU'>Big O Notation in 5 Minutes</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://www.youtube.com/watch?v=iOq5kSKqeR4'>Asymptotic Notation - CS50</BadgeLink>
<BadgeLink colorScheme='red' badgeText='Watch' href='https://archive.org/details/ucberkeley_webcast_VIS4YDpuP98'>CS 61B Lecture 19: Asymptotic Analysis</BadgeLink>
<BadgeLink colorScheme='yellow' badgeText='Read' href='https://www.bigocheatsheet.com/'>Big-O Cheat Sheet</BadgeLink>
