What is the nature of data you are processing, that it requires iterations?
Depending on the nature of the data, the simplest approach may be for the driver to run multiple MapReduce jobs in a while loop.
If the number of iterations is known and constant, or can be determined by a first-pass job, running the loop is straightforward.
If it's not known, or varies with the data, then the reducer in each job is responsible for telling the driver whether more iterations are needed or the terminating condition has been satisfied.
It can do this either via a status file on HDFS, or using Counters.
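A minimal sketch of that driver-side loop, with the job submission and counter readback replaced by a stand-in function (`runJobPass` is hypothetical, not a Hadoop API) — in a real driver it would submit one MapReduce job via `Job.waitForCompletion` and then read a reducer-incremented counter from `job.getCounters()`:

```java
public class IterativeDriver {

    // Stand-in for one MapReduce pass. In a real driver this would
    // configure and submit a Job, then return the value of a counter
    // that the reducer increments for every record still "changing".
    // A return value of 0 signals that the terminating condition holds.
    static long runJobPass(long pending) {
        return pending / 2; // dummy convergence so the sketch terminates
    }

    public static void main(String[] args) {
        long pending = 100; // hypothetical initial "still changing" count
        int iterations = 0;
        // Loop until the reducer-reported counter says we have converged.
        while (pending > 0) {
            pending = runJobPass(pending);
            iterations++;
        }
        System.out.println("converged after " + iterations + " iterations");
    }
}
```

The same structure works for the fixed-iteration case: replace the counter check with a plain `for` loop over the known pass count, chaining each pass's output directory into the next pass's input.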
Another option is to use Giraph itself, especially its blocks framework.