In parallel computing, an embarrassingly parallel workload or problem (also called embarrassingly parallelizable, perfectly parallel, delightfully parallel, or pleasingly parallel) is one where little or no effort is needed to separate the problem into a number of parallel tasks.[1] This is often the case where there is little or no dependency between those parallel tasks and little or no need for them to communicate or share intermediate results.[2]
These problems thus differ from distributed computing problems, which require communication between tasks, especially communication of intermediate results. They are easy to run on server farms that lack the special infrastructure used in a true supercomputer cluster, and, because they do not suffer from parallel slowdown, they are well suited to large, Internet-based volunteer computing platforms such as BOINC. The opposite of an embarrassingly parallel problem is an inherently serial problem, which cannot be parallelized at all.
A common example of an embarrassingly parallel problem is 3D video rendering handled by a graphics processing unit, where each frame (forward method) or each pixel (ray tracing method) can be handled with no interdependency.[3] Password cracking is another embarrassingly parallel task that is easily distributed across central processing units, CPU cores, or clusters.
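The password-cracking case can be sketched in a few lines: each candidate password is tested independently, so the candidates can simply be split among worker processes with no coordination. The target hash and the candidate list below are illustrative, not taken from any real system.

```python
import hashlib
from multiprocessing import Pool

# Hypothetical target: the SHA-256 digest of the password "secret".
TARGET = hashlib.sha256(b"secret").hexdigest()

def check(candidate: str):
    """Test one candidate. Tasks share no state and exchange no results,
    which is what makes the workload embarrassingly parallel."""
    if hashlib.sha256(candidate.encode()).hexdigest() == TARGET:
        return candidate
    return None

if __name__ == "__main__":
    candidates = ["password", "letmein", "secret", "hunter2"]
    with Pool(4) as pool:               # split candidates across 4 workers
        hits = [c for c in pool.map(check, candidates) if c is not None]
    print(hits)
```

Because no worker ever needs another worker's output, the same code scales from CPU cores on one machine to a cluster with only a change of task-distribution mechanism.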
Etymology
"Embarrassingly" is used here to refer to parallelization problems which are "embarrassingly easy".[4] The term may imply embarrassment on the part of developers or compilers: "Because so many important problems remain unsolved mainly due to their intrinsic computational complexity, it would be embarrassing not to develop parallel implementations of polynomial homotopy continuation methods."[5] The term is first found in the literature in a 1986 book on multiprocessors by MATLAB's creator Cleve Moler,[6] who claims to have invented the term.[7]
An alternative term, pleasingly parallel, has gained some use, perhaps to avoid the negative connotations of embarrassment in favor of a positive reflection on the parallelizability of the problems: "Of course, there is nothing embarrassing about these programs at all."[8]
Examples
Some examples of embarrassingly parallel problems include:
In R (programming language) – the Simple Network of Workstations (SNOW) package implements a simple mechanism for using a set of workstations or a Beowulf cluster for embarrassingly parallel computations.[16] Similar R packages include "future", "parallel", and others.
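The parallel-map idiom that SNOW provides in R exists in most languages. A minimal Python sketch of the same pattern is shown below; the `simulate` function is a toy, deterministic stand-in for an independent replicate of some computation, not part of any of the packages named above.

```python
from concurrent.futures import ProcessPoolExecutor

def simulate(seed: int) -> float:
    """One independent replicate: a simple deterministic linear-congruential
    step, so the example is reproducible without external libraries."""
    return (seed * 9301 + 49297) % 233280 / 233280.0

if __name__ == "__main__":
    seeds = range(8)
    # One task per seed; workers never communicate, so the map is
    # embarrassingly parallel.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate, seeds))
    print(results)
```

As with SNOW's cluster functions, the caller supplies a function and a list of inputs, and the framework handles distributing the independent calls across workers.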
See also
Amdahl's law defines the value P, the parallelizable fraction of a program, which is almost or exactly equal to 1 for embarrassingly parallel problems.
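The role of P can be made concrete. Amdahl's law gives the speedup on N processors as

```latex
S(N) = \frac{1}{(1 - P) + \frac{P}{N}}
```

For an embarrassingly parallel problem, P ≈ 1 and S(N) ≈ N, i.e. near-linear speedup. By contrast, even P = 0.95 caps the achievable speedup at 1/(1 − 0.95) = 20, no matter how many processors are used.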
^ Herlihy, Maurice; Shavit, Nir (2012). The Art of Multiprocessor Programming, Revised Reprint (revised ed.). Elsevier. p. 14. ISBN 9780123977953. Retrieved 28 February 2016. "Some computational problems are 'embarrassingly parallel': they can easily be divided into components that can be executed concurrently."
^ Matloff, Norman (2011). The Art of R Programming: A Tour of Statistical Software Design. No Starch Press. p. 347. ISBN 9781593274108.
^ Leykin, Anton; Verschelde, Jan; Zhuang, Yan (2006). "Parallel Homotopy Algorithms to Solve Polynomial Systems". Mathematical Software – ICMS 2006. Lecture Notes in Computer Science. Vol. 4151. pp. 225–234. doi:10.1007/11832225_22. ISBN 978-3-540-38084-9.
^ Moler, Cleve (1986). "Matrix Computation on Distributed Memory Multiprocessors". In Heath, Michael T. (ed.). Hypercube Multiprocessors. Philadelphia: Society for Industrial and Applied Mathematics. ISBN 978-0898712094.