The efficiency of an algorithm depends on two parameters:
- Time Complexity
- Space Complexity
Time Complexity: Time complexity is defined as the number of times a particular instruction set is executed, rather than the total time taken. This is because the total time taken also depends on external factors such as the compiler used and the processor's speed.
Space Complexity: Space complexity is the total memory space required by the program for its execution.
Both are expressed as functions of the input size (n).
Note that, in addition to these parameters, the efficiency of an algorithm also depends on the nature and size of the input.
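A minimal sketch of the idea that we count executed operations rather than wall-clock time: the function below (an illustrative helper, not from the original article) sums a list while counting the additions it performs. The count depends only on the input size n, regardless of compiler or processor, which is exactly why complexity is stated in operations.

```python
def count_additions(arr):
    """Sum a list while counting the additions performed.

    The additions count depends only on the input size n, not on the
    machine or compiler -- which is why algorithm efficiency is
    measured in executed operations, not in seconds.
    """
    total = 0
    additions = 0
    for x in arr:
        total += x      # one addition per element -> O(n) time
        additions += 1
    return total, additions

total, ops = count_additions([5, 1, 4, 2])
# ops == 4: one addition per element; only O(1) extra space is used
```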
Types of Time Complexity:
- Best-case time complexity: the input for which the algorithm takes the minimum time; it gives a lower bound on the algorithm's running time. Example: in linear search, the best case occurs when the search key is at the first position of the array.
- Average-case time complexity: computed by taking all possible inputs, calculating the computation time for each, and dividing the total by the number of inputs.
- Worst-case time complexity: the input for which the algorithm takes the maximum time; it gives an upper bound on the algorithm's running time. Example: in linear search, the worst case occurs when the search key is at the last position of the array.
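The linear-search example above can be sketched directly; the comparison counter is an illustrative addition (not part of the standard algorithm) used to make the best and worst cases visible:

```python
def linear_search(arr, target):
    """Scan left to right; return (index, comparisons).

    The comparisons count is the work done: 1 in the best case
    (target first), n in the worst case (target last).
    """
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 1001))          # 1000 elements
# Best case: target at the first position -> 1 comparison
# Worst case: target at the last position -> 1000 comparisons
```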
Following is a quick revision sheet that you may refer to at the last minute
| Algorithm | Best | Average | Worst | Space Complexity (Worst) |
|---|---|---|---|---|
| Selection Sort | Ω(n^2) | θ(n^2) | O(n^2) | O(1) |
| Bubble Sort | Ω(n) | θ(n^2) | O(n^2) | O(1) |
| Insertion Sort | Ω(n) | θ(n^2) | O(n^2) | O(1) |
| Heap Sort | Ω(n log(n)) | θ(n log(n)) | O(n log(n)) | O(1) |
| Quick Sort | Ω(n log(n)) | θ(n log(n)) | O(n^2) | O(n) |
| Merge Sort | Ω(n log(n)) | θ(n log(n)) | O(n log(n)) | O(n) |
| Bucket Sort | Ω(n + k) | θ(n + k) | O(n^2) | O(n) |
| Radix Sort | Ω(nk) | θ(nk) | O(nk) | O(n + k) |
| Count Sort | Ω(n + k) | θ(n + k) | O(n + k) | O(k) |
| Shell Sort | Ω(n log(n)) | θ(n log(n)) | O(n^2) | O(1) |
| Tim Sort | Ω(n) | θ(n log(n)) | O(n log(n)) | O(n) |
| Tree Sort | Ω(n log(n)) | θ(n log(n)) | O(n^2) | O(n) |
| Cube Sort | Ω(n) | θ(n log(n)) | O(n log(n)) | O(n) |
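The gap between a best case of Ω(n) and a worst case of O(n^2) in the table (e.g. for insertion sort) can be demonstrated empirically. This is a minimal sketch with an added comparison counter (the counter is illustrative, not part of the algorithm): an already-sorted input needs n − 1 comparisons, while a reverse-sorted input needs n(n − 1)/2.

```python
def insertion_sort(arr):
    """Insertion sort on a copy of arr, counting element comparisons."""
    a = list(arr)                 # copy so the caller's list is untouched
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:        # shift larger elements right
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 100
_, cmp_best = insertion_sort(list(range(n)))         # already sorted
_, cmp_worst = insertion_sort(list(range(n, 0, -1))) # reverse sorted
# cmp_best == n - 1 == 99 (the Ω(n) best case)
# cmp_worst == n * (n - 1) // 2 == 4950 (the O(n^2) worst case)
```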
Last Updated : 10 Jan, 2023
In summary:
- Time Complexity measures how an algorithm's runtime grows with the size of its input. It is commonly expressed in Big O notation and analyzed for the best, average, and worst cases, each of which describes the algorithm's behavior under different input conditions.
- Space Complexity measures the memory an algorithm requires relative to the input size. Like time complexity, it indicates the scalability and efficiency of an algorithm.
- The revision sheet above lists sorting algorithms with their best-case, average-case, and worst-case time complexities and their worst-case space complexities. These characteristics help in selecting the most suitable algorithm for a specific problem based on the size and nature of its input.
Understanding these concepts is crucial for designing efficient algorithms and solving computational problems effectively.