- The greedy method. A greedy algorithm is similar to a dynamic programming algorithm, but the difference is that at each stage you do not need the solutions to the subproblems; instead you can make a "greedy" choice of whatever looks best at the moment.
- Linear programming. When you solve a problem using linear programming, you express the problem as a set of linear inequalities and then try to maximize (or minimize) a linear objective function of the inputs. Many problems (such as maximum flow in directed graphs) can be stated as linear programs and then solved by a 'generic' algorithm such as the simplex algorithm.
- Search and enumeration. Many problems (such as playing chess) can be modelled as problems on graphs. A graph exploration algorithm specifies rules for moving around a graph and is useful for such problems. This category also includes search algorithms and backtracking.
- The probabilistic and heuristic paradigm. Algorithms belonging to this class fit the definition of an algorithm more loosely. Probabilistic algorithms are those that make some choices randomly (or pseudo-randomly); for some problems, it can in fact be proved that the fastest solutions must involve some randomness. Genetic algorithms attempt to find solutions to problems by mimicking biological evolutionary processes, with a cycle of random mutations yielding successive generations of 'solutions'. Thus, they emulate reproduction and "survival of the fittest". In genetic programming, this approach is extended to algorithms, by regarding the algorithm itself as a 'solution' to a problem. There are also heuristic algorithms, whose general purpose is not to find an exact solution, but an approximate one, for use when the time or resources needed to find a perfect solution are not practical. An example of this would be simulated annealing algorithms, a class of heuristic probabilistic algorithms that vary the solution of a problem by a random amount. The name 'simulated annealing' alludes to the metallurgic term meaning the heating and cooling of metal to achieve freedom from defects. The purpose of the random variation is to find close to globally optimal solutions rather than merely locally optimal ones, the idea being that the random element is decreased as the algorithm settles down to a solution.
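A minimal sketch of the greedy method described above, using the classic making-change example (the coin denominations here are an illustrative assumption; the greedy choice happens to be globally optimal for this coin system, though not for every one):

```python
# Greedy change-making: at each stage, take the largest coin that still
# fits, without reconsidering earlier choices or solving subproblems.

def greedy_change(amount, coins=(25, 10, 5, 1)):
    result = []
    for coin in coins:            # coins listed largest-first
        while amount >= coin:     # greedy choice: take it while it fits
            result.append(coin)
            amount -= coin
    return result

print(greedy_change(63))  # [25, 25, 10, 1, 1, 1]
```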
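To illustrate putting a problem into linear inequalities, the toy solver below handles a two-variable linear program by enumerating the candidate vertices of the feasible region. This is a sketch for illustration only, not the simplex algorithm (which pivots between vertices instead of enumerating them all); the example objective and constraints are made up:

```python
from itertools import combinations

def solve_lp_2d(c, constraints):
    """Maximize c[0]*x + c[1]*y subject to a*x + b*y <= rhs
    for each constraint given as a triple (a, b, rhs)."""
    best = None
    for (a1, b1, r1), (a2, b2, r2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:          # parallel boundaries: no unique intersection
            continue
        # Intersection of the two constraint boundaries (Cramer's rule).
        x = (r1 * b2 - r2 * b1) / det
        y = (a1 * r2 - a2 * r1) / det
        # Keep the point only if it satisfies every inequality.
        if all(a * x + b * y <= r + 1e-9 for a, b, r in constraints):
            value = c[0] * x + c[1] * y
            if best is None or value > best[0]:
                best = (value, x, y)
    return best

# maximize 3x + 5y  s.t.  x <= 4,  2y <= 12,  3x + 2y <= 18,  x, y >= 0
constraints = [(1, 0, 4), (0, 2, 12), (3, 2, 18), (-1, 0, 0), (0, -1, 0)]
print(solve_lp_2d((3, 5), constraints))  # optimum 36.0 at x = 2, y = 6
```

This works because a linear objective over a convex polygon always attains its maximum at a vertex, which is also the fact the simplex algorithm exploits.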
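The search-and-backtracking paradigm can be sketched with the classic N-queens puzzle: explore partial placements depth-first and abandon (backtrack from) any branch that cannot lead to a valid solution.

```python
# Backtracking search: cols[r] is the column of the queen on row r.
# Each recursive call extends a valid partial placement by one row.

def n_queens(n, cols=()):
    row = len(cols)
    if row == n:
        return 1                   # a complete, valid placement
    count = 0
    for col in range(n):
        # Prune: skip columns attacked by an earlier queen.
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(cols)):
            count += n_queens(n, cols + (col,))
    return count

print(n_queens(6))  # 4 solutions on a 6x6 board
```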
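A toy simulated annealing run, illustrating the heuristic just described: perturb the current solution by a random amount, accept worse moves with a probability that shrinks as the "temperature" cools, so the search can escape local minima early and settles down to a solution late. The objective function, starting point, and cooling schedule here are arbitrary choices for illustration:

```python
import math
import random

def anneal(f, x, temp=10.0, cooling=0.99, steps=5000):
    """Minimize f by simulated annealing from starting point x."""
    random.seed(0)                                 # reproducible run
    for _ in range(steps):
        candidate = x + random.uniform(-1, 1)      # random variation
        delta = f(candidate) - f(x)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / temp), which shrinks as temp cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        temp *= cooling                            # settle down over time
    return x

# f has a local minimum near x = 2 and its global minimum near x = -2.
f = lambda x: (x * x - 4) ** 2 + x
print(anneal(f, 2.0))
```

Starting in the basin of the local minimum, the early high-temperature phase usually lets the search jump the barrier into the global basin, though as with any heuristic this is not guaranteed.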
Another way to classify algorithms is by implementation. A recursive algorithm is one that invokes (makes reference to) itself repeatedly until a certain condition is met, a method common to functional programming. Algorithms are usually discussed with the assumption that computers execute one instruction of an algorithm at a time. Such computers are sometimes called serial computers. An algorithm designed for such an environment is called a serial algorithm, as opposed to parallel algorithms, which take advantage of computer architectures where several processors can work on a problem at the same time. The various heuristic algorithms would arguably also fall into this category, as their names (e.g. genetic algorithm) describe their implementation.
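The recursive style mentioned above can be sketched with the textbook factorial function, which invokes itself on a smaller input until the base-case condition is reached:

```python
# Recursive algorithm: the function calls itself on a smaller problem
# until the base case stops the recursion.

def factorial(n):
    if n == 0:                        # base case: recursion stops here
        return 1
    return n * factorial(n - 1)       # recursive call on a smaller input

print(factorial(5))  # 120
```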
A list of algorithms discussed in Wikipedia is available.
See also
- Bulletproof algorithms
- Numerical analysis
- Cryptographic algorithms
- Sort algorithms
- Search algorithms
- Merge algorithms
- String algorithms
- List of algorithms
- Timeline of algorithms
- Data structures
- Genetic algorithms
- Randomised algorithms
- Computability logic
Other sources
- Donald E. Knuth: The Art of Computer Programming, Vols. 1–3, Addison-Wesley, 1998. Widely regarded as a definitive reference. ISBN 0-201-48541-9.
- Important algorithm-related publications
External links
- Gaston H. Gonnet and Ricardo Baeza-Yates: Example programs from Handbook of Algorithms and Data Structures. Free source code for lots of important algorithms.
- Dictionary of Algorithms and Data Structures. "This is a dictionary of algorithms, algorithmic techniques, data structures, archetypical problems, and related definitions."
- Numerical Recipes