How do computers blur images?
Basically, they replace each pixel with an average of its nearby pixels. Weighting input pixels by how close they are to the output pixel gives the best-looking results; this is what a Gaussian blur does.
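Here's a minimal sketch of that idea in Python with NumPy. The function names (`gaussian_kernel_2d`, `blur_naive`) are illustrative, not from any particular library:

```python
import numpy as np

def gaussian_kernel_2d(radius, sigma):
    """Build a (2*radius+1)^2 grid of weights that fall off with distance."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return kernel / kernel.sum()  # normalize so overall brightness is preserved

def blur_naive(image, radius=2, sigma=1.0):
    """Replace each pixel with the weighted average of its neighborhood."""
    kernel = gaussian_kernel_2d(radius, sigma)
    padded = np.pad(image, radius, mode="edge")  # repeat edge pixels at the border
    out = np.zeros(image.shape, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2*radius + 1, x:x + 2*radius + 1]
            out[y, x] = np.sum(window * kernel)
    return out

# Tiny demo on a random grayscale image
img = np.random.rand(32, 32)
blurred = blur_naive(img, radius=3, sigma=1.5)
```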
This qualifies as "embarrassingly parallel", since every output pixel can be computed independently. The catch is that the more you blur the image, the more input pixels each output pixel needs to sample, and hence the slower the operation becomes: doubling the blur radius roughly quadruples the work per pixel.
One optimization is to split the blur into separate horizontal and vertical passes, so that each pixel needs 2n weights instead of n^2 weights.
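A sketch of that separable version, with the same caveats as above (the helper names are hypothetical):

```python
import numpy as np

def gaussian_kernel_1d(radius, sigma):
    """1D Gaussian weights, normalized to sum to 1."""
    ax = np.arange(-radius, radius + 1)
    kernel = np.exp(-(ax**2) / (2 * sigma**2))
    return kernel / kernel.sum()

def blur_separable(image, radius=2, sigma=1.0):
    """Blur with two 1D passes: horizontal, then vertical.
    Samples 2*(2r+1) pixels per output instead of (2r+1)^2."""
    kernel = gaussian_kernel_1d(radius, sigma)
    h, w = image.shape

    # Horizontal pass: weighted sum of horizontally shifted copies.
    padded = np.pad(image, ((0, 0), (radius, radius)), mode="edge")
    horiz = np.zeros((h, w), dtype=float)
    for i, weight in enumerate(kernel):
        horiz += weight * padded[:, i:i + w]

    # Vertical pass over the horizontally blurred result.
    padded = np.pad(horiz, ((radius, radius), (0, 0)), mode="edge")
    out = np.zeros((h, w), dtype=float)
    for i, weight in enumerate(kernel):
        out += weight * padded[i:i + h, :]
    return out
```

Up to floating-point error this matches the full 2D version, because a 2D Gaussian factors into the product of two 1D Gaussians.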