The term algorithm derives from the name of Abu Ja'far Muhammad ibn Musa al-Khwarizmi. An algorithm is defined as follows:
Every algorithm should have the following characteristic features:
Understand the problem completely and, if possible, split it into subproblems. For each subproblem, different instances are to be defined, since a specific input may give rise to a different instance. If a specific instance is not considered, the algorithm will work for only some of the inputs.
We can classify algorithms into two broad categories: sequential algorithms and parallel algorithms. Algorithm design does not depend on the speed or the amount of memory of a particular system; designing a good algorithm depends on the problem, not on the system specification. The size of the data and the complexity of processing it determine the required system specification, and an efficient algorithm may solve such problems even on a system with a low configuration.
Algorithm + Data structure = Program
Designing a suitable data structure to handle the inputs and outputs processed by an algorithm improves the algorithm's efficiency in terms of time and space. Most programs today are object oriented, and hence data structures are crucial in designing an algorithm.
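As a small illustration (the timing harness below is an assumption of mine, not from the text), the choice between a Python list and a set changes how long membership tests take: a list is scanned element by element, while a set uses hashing and answers in near-constant time.

    # Illustrative sketch: how the choice of data structure affects running time.
    import time

    def build_and_probe(container_type, n, probes):
        data = container_type(range(n))
        start = time.perf_counter()
        for p in probes:
            _ = p in data          # the basic operation being measured
        return time.perf_counter() - start

    if __name__ == "__main__":
        n = 100_000
        probes = list(range(0, 2 * n, 997))   # a mix of hits and misses
        print("list:", build_and_probe(list, n, probes))
        print("set: ", build_and_probe(set, n, probes))

On typical hardware the set version finishes far faster, even though both containers hold exactly the same data.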
An algorithm may be specified in any one of the following ways:
a) Natural language: Natural languages are the languages that people speak, such as English.
b) Flow chart: The procedure can be described using a set of predefined geometric shapes. However, flow charts are not well suited to representing complex algorithms.
c) Pseudo code: A mixture of natural language and programming-language constructs. There are different forms of pseudo code, and all these dialects are close to one another; a small illustration follows.
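For instance (an illustrative sketch, not part of the original text), a pseudo-code description of finding the largest element in a list maps almost line by line onto Python:

    # Pseudo code:
    #   max <- A[0]
    #   for each remaining x in A:
    #       if x > max then max <- x
    #   return max
    def find_max(a):
        """Return the largest element of a non-empty list a."""
        largest = a[0]
        for x in a[1:]:
            if x > largest:
                largest = x
        return largest

    print(find_max([3, 7, 2, 9, 4]))   # prints 9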
A correct algorithm is not one that works most of the time but one that works correctly for all legitimate inputs. To establish correctness we must also consider preconditions and postconditions. For large programs, the program is broken into smaller modules and the correctness of each module is verified.
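For example, here is a hedged sketch (the routine and assertions are illustrative assumptions, not from the text) of checking a precondition and a postcondition for a binary search: the precondition is that the input list is sorted, and the postcondition is that a returned index really holds the key.

    def binary_search(a, key):
        # Precondition: the list must be sorted.
        assert all(a[i] <= a[i + 1] for i in range(len(a) - 1)), "precondition: a is sorted"
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == key:
                assert a[mid] == key        # postcondition: the index holds the key
                return mid
            elif a[mid] < key:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1                           # key not present

    print(binary_search([1, 3, 5, 7, 9], 7))   # prints 3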
A program requires storage space for its instructions, for the constants and variables it uses, and for the input data. If the amount of space needed varies with the input, it must be estimated with a worst-case or average-case analysis.
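As an illustration (the functions are my own, not from the text), two ways of reversing a sequence show how space requirements can differ: one allocates extra space proportional to the input size, the other uses only a constant amount of auxiliary space.

    def reversed_copy(a):
        return a[::-1]                 # extra space proportional to len(a)

    def reverse_in_place(a):
        i, j = 0, len(a) - 1
        while i < j:
            a[i], a[j] = a[j], a[i]    # constant auxiliary space
            i, j = i + 1, j - 1
        return a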
The time needed to execute a program depends on the computer configuration, the platform, the programming language, the programmer's style, and so on. An algorithm's efficiency should not depend on these extraneous factors, so when estimating the time needed only the basic operations in the algorithm are considered.
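For instance, here is a small sketch (the function and counts are illustrative, not from the text) that counts the basic operation, the addition inside the loop, rather than measuring machine time: summing n numbers costs n additions on any computer.

    def sum_with_count(a):
        total, additions = 0, 0
        for x in a:
            total += x
            additions += 1        # one basic operation per element
        return total, additions

    print(sum_with_count([4, 8, 15, 16, 23, 42]))   # prints (108, 6)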
An optimal algorithm is one that involves the fewest basic operations. There may be many ways to solve a given problem; the best method, the one that uses the fewest operations and the least memory and time, can be considered the optimal algorithm.
Sometimes the simplest and most straightforward method is not the optimal one: its correctness may be easier to verify, but it may take more time.
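A simple illustration (the functions are mine, not from the text): both routines below compute 1 + 2 + ... + n. The straightforward loop is easy to verify but performs n additions; the closed-form formula uses a constant number of operations and is the optimal method here.

    def sum_loop(n):
        total = 0
        for i in range(1, n + 1):
            total += i             # n additions
        return total

    def sum_formula(n):
        return n * (n + 1) // 2    # constant number of operations

    assert sum_loop(1000) == sum_formula(1000) == 500500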
Analysis of an algorithm means estimating the efficiency of the algorithm in terms of the time and space it requires.
1. The first step is to determine which operations must be performed and what their relative costs are. Basic operations such as addition, subtraction, and comparison take constant time.
2. The second task is to determine the number of data sets that cause the algorithm to exhibit all possible patterns of behaviour.
This requires understanding the best and worst behaviour of the algorithm on different data configurations.
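For example (an illustrative sketch, not from the text), the same linear search exhibits its best and worst behaviour on different data configurations: one comparison when the key is the first element, n comparisons when the key is absent.

    def linear_search(a, key):
        comparisons = 0
        for i, x in enumerate(a):
            comparisons += 1
            if x == key:
                return i, comparisons
        return -1, comparisons

    data = [5, 1, 4, 2, 3]
    print(linear_search(data, 5))    # best case:  (0, 1)
    print(linear_search(data, 9))    # worst case: (-1, 5)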
1. A priori analysis: we find functions that bound the algorithm's time and space complexities. With the help of these functions it is possible to compare the efficiency of algorithms.
2. A posteriori (posterior) analysis: the actual time and space used while the algorithm executes are measured.
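A small sketch of the two kinds of analysis applied to the same routine (the code is an assumption of mine, not from the text): a priori, summing n numbers is bounded by a constant times n basic operations; a posteriori, the actual running time is measured on a particular machine.

    import time

    def total(a):
        s = 0
        for x in a:
            s += x                 # a priori bound: n additions, i.e. O(n)
        return s

    data = list(range(1_000_000))
    start = time.perf_counter()
    total(data)
    elapsed = time.perf_counter() - start   # a posteriori: measured on this machine
    print(f"measured time: {elapsed:.4f} s")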
1. The process of producing correct answers for all given legal inputs is called algorithm validation.
2. The purpose of validation is to show that the algorithm works correctly, independently of the language in which it will be implemented.
3. The validation process takes two forms: one produces the corresponding outputs for the given inputs, and the other is the specification of the outputs that the given inputs should produce.
4. These two forms are shown to be equivalent for a given input.
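As an illustration (the sorting routine and test cases are assumptions of mine, not from the text), a sorting algorithm can be validated by checking, for sample legal inputs, that its outputs match the specification; here Python's built-in sorted serves as the reference specification.

    def insertion_sort(a):
        a = list(a)
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    for case in ([], [1], [3, 1, 2], [5, 5, 1], list(range(10, 0, -1))):
        assert insertion_sort(case) == sorted(case), f"fails specification on {case}"
    print("all sample inputs validated")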
1. Greedy: Minimum Spanning Tree, Shortest Path, Huffman Codes
2. Iterative Improvement: Network Flow, Matching
3. Divide and Conquer: Mergesort, Quicksort, Binary Search (a mergesort sketch follows this list).
4. Dynamic Programming: Storing intermediate values to avoid recomputing them (widespread in string processing).
5. Randomized: versions of all of the above (from 1 to 4).
6. Exhaustive Search: Best-First with Pruning, Branch-and-Bound (very useful, but the algorithms themselves are trivial; everything revolves around the bounds)
7. Linear & Integer Programming: Massively useful, but highly specialized and best learned in a dedicated course on optimization.
8. Approximation Algorithms: Any of the above strategies can be used for this (from 1 to 7); in addition, specialized techniques for approximation often combine several of these techniques; randomization is particularly useful in this context.
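To make one of these strategies concrete, here is a minimal sketch (not taken from the text) of divide and conquer, illustrated by mergesort: split the input in half, sort each half recursively, then merge the two sorted halves.

    def merge_sort(a):
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        # Merge the two sorted halves into one sorted list.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]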