How to control the order in which Java Futures are "submitted"?

In this example, I am submitting a few files to my comparator object. It all works fine, except that I noticed the order in which files are submitted is not always the same order in which they are returned. Any suggestions on how I can better control this?

    ExecutorService pool = Executors.newFixedThreadPool(5);
    CompletionService<Properties> completion = new ExecutorCompletionService<Properties>(pool);

    // Submit one loading task per target file.
    for (String target : p.getTargetFiles()) {
        completion.submit(new PropertiesLoader(target, p));
    }

    // Collect one result per submitted task; take() returns them
    // in completion order, not submission order.
    for (@SuppressWarnings("unused") String target : p.getTargetFiles()) {
        Properties r = null;
        try {
            r = completion.take().get();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        }
        p.addTargetFilesProperties(r);
    }

    pool.shutdown();

Answers


The main point of CompletionService.take() is that it returns whichever Future has finished, regardless of the order in which the tasks were submitted. If you want the results back in submission order, you might as well not use take() at all (you might not even need a CompletionService): keep a list of the Future objects returned from submit() and call get() on each one in order; get() will block until that result is available.
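A minimal self-contained sketch of that approach, using simple Callable<Integer> tasks in place of the question's PropertiesLoader (which isn't shown here): each task is submitted to the pool, the Futures are kept in a list, and get() is called on them in submission order, so the results come back in that order even though the later tasks finish first.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class OrderedFutures {
    // Submit n tasks, then call get() on each Future in submission order.
    static List<Integer> resultsInSubmissionOrder(int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(5);
        List<Future<Integer>> futures = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            final int id = i;
            futures.add(pool.submit(() -> {
                Thread.sleep((n - id) * 50L); // later submissions finish first
                return id;
            }));
        }
        List<Integer> results = new ArrayList<>();
        for (Future<Integer> f : futures) {
            results.add(f.get()); // blocks until this task's result is ready
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(resultsInSubmissionOrder(5)); // [0, 1, 2, 3, 4]
    }
}
```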


When you submit multiple tasks to a ThreadPoolExecutor at once, you have no real control over their completion times, since they are executed concurrently by multiple threads. The completion service returns results in the order the tasks complete, which may vary from one run to the next. If you execute the tasks serially, the completion order matches the submission order.
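To see that take() yields completion order rather than submission order, here is a small sketch (with made-up sleep times, not from the question) in which the first-submitted task is made to finish last; take() then hands the results back reversed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class CompletionOrderDemo {
    // Submit tasks that deliberately finish in reverse submission order,
    // then drain them with take(), which yields results as they complete.
    static List<Integer> completionOrder() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        CompletionService<Integer> completion =
                new ExecutorCompletionService<>(pool);
        for (int i = 0; i < 3; i++) {
            final int id = i;
            completion.submit(() -> {
                Thread.sleep((3 - id) * 100L); // task 0 sleeps the longest
                return id;
            });
        }
        List<Integer> results = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            results.add(completion.take().get());
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(completionOrder()); // [2, 1, 0]
    }
}
```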

--EDIT--

If you still want concurrency but need to consume the results in a specific order, don't use the completion service. Simply loop over the futures in the required order and call their get() method, which blocks until that result is ready.

