Parallel computing can often speed up common mathematical operations used in science and engineering, and enable larger data sets to be processed. Let's briefly take a look at the kinds of algorithms and code structures that perform well when parallelized, using a combination of two scientific computing tools: Star-P from Interactive Supercomputing, and MATLAB® from The MathWorks.
First of all, it is important to note that not all algorithms and code structures lend themselves equally well to parallelization. There are two types of programming approaches that leverage parallel computing power: task-parallel and data-parallel computations. Figure 1 illustrates a strategy for porting a custom serial algorithm to a parallel implementation: some computations make sense to leave on the desktop, others lend themselves well to task parallelism, and others to data parallelism:

Execute in serial: some computations are so trivial that parallelization (and the associated overhead) is unnecessary. For the same reason you wouldn't take a plane to the neighborhood convenience store, trivial operations that take fractions of a second and/or operate on small data sets or text strings are best left on the desktop for serial execution.

Task-parallel computations: Task parallelism (sometimes called "coarse-grained" or "embarrassingly parallel") is a powerful method for carrying out many independent calculations in parallel, such as Monte Carlo simulations, or "unrolling" serial for loops. For example, in a medical application involving image processing on multiple brain slices, Star-P can distribute the images across several processors and process them simultaneously.

Data-parallel computations: Data parallelism (sometimes called "global array syntax") is used for high-level matrix and vector operations on large data sets, for example a two-dimensional Fast Fourier Transform of a high-resolution image.
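As a rough sketch of the idea (the image size is an assumption, and this presumes fft2 is among the MATLAB functions Star-P overloads for distributed arrays), the FFT example could look like this in serial MATLAB and with Star-P's *p syntax:

```matlab
% Serial MATLAB: two-dimensional FFT of a large image
img = rand(4096, 4096);
f = fft2(img);

% Star-P sketch: tag a dimension with *p so the image is
% distributed across the parallel server, then call fft2 as usual
img = rand(4096*p, 4096);
f = fft2(img);   % the transform executes on the parallel server
```

Note that the algorithm code itself is unchanged; only where the data lives differs.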







Together, task parallelism and data parallelism make for a very powerful combination in writing custom parallel applications.







Contact the author, Ilya Mirman, at imirman@interactivesupercomputing.com.
For more information on this topic, check out Interactive Supercomputing’s Interactive Tour, which has live demonstrations and various math algorithms running in parallel at http://www.interactivesupercomputing.com/IT/
Implementing Two Modes of Parallelism
Let's take a look at how to implement the two modes of parallelism with a couple of simple examples.
Data Parallel
To take advantage of Star-P's global array syntax, just add the *p construct to the dimension of the variable. Adding the *p makes the variable parallel; through propagation, related variables also become parallel. The following MATLAB script is a simple example: tossing a coin two million times and adding up the number of heads using a random vector, first in serial MATLAB (in the green frame), then taking advantage of Star-P (red frame) to create the random vector on the parallel server.
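Since the original frames appear as figures, here is a sketch of what the two versions might look like (the variable names and the 0.5 threshold for "heads" are assumptions):

```matlab
% Serial MATLAB (green frame): toss a coin two million times
n = 2000000;
tosses = rand(n, 1);        % two million uniform random numbers
heads = sum(tosses > 0.5);  % count the tosses that came up heads

% Star-P (red frame): tag the dimension with *p so the random
% vector is created directly on the parallel server
tosses = rand(n*p, 1);
heads = sum(tosses > 0.5);  % the sum runs in parallel; heads
                            % propagates back as an ordinary scalar
```

The only change between the two versions is the *p on the dimension; the rest of the script is untouched.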
Task Parallel
To illustrate task parallelism, let's consider the example of a three-dimensional array (100x100x500 elements). Think of it as 500 planes of 100x100 elements each. Our goal is to compute the inverse of each plane. The green frame below illustrates a "for" loop in MATLAB executed 500 times.
To do exactly the same thing with Star-P (red frame), we would tag the last dimension of the matrix with a *p, and then use the ppeval command (analogous to MATLAB's "feval," or "evaluate function," command). The arguments we pass it are the function we are performing ('inv' in this case) and the variable we are performing it on, the matrix a. Star-P takes care of distributing and computing this transparently, independent of the number of parallel processors.
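A sketch of what the two frames likely contain (the variable name b and the use of rand for test data are assumptions; the source names only the matrix a and the 'inv' function):

```matlab
% Serial MATLAB (green frame): invert each 100x100 plane in a loop
a = rand(100, 100, 500);
b = zeros(100, 100, 500);       % preallocate the result
for k = 1:500
    b(:, :, k) = inv(a(:, :, k));
end

% Star-P (red frame): distribute along the last dimension, then
% apply inv to every plane in parallel with ppeval
a = rand(100, 100, 500*p);
b = ppeval('inv', a);           % all 500 inverses computed at once
```

Because the 500 inversions are independent of one another, this is a textbook task-parallel ("embarrassingly parallel") workload.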
This trick is useful if you would like to execute your MATLAB code in parallel, whether to solve bigger problems or to get faster execution.
Got a cool software trick? Send us details, including any documentation and supporting code, to kfield@reedbusiness. If we publish your trick, we'll send you a super cool Design News T-shirt.