
HPC::Runner::Command::submit_jobs::Utils::Scheduler::ResolveDeps;

Once we have parsed the input file, parse each job_type for job_batches

Attributes

schedule

Schedule our jobs
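
The attribute definition itself is not shown on this page; purely as a sketch, an ordered-schedule attribute in a Moose-based class could be declared as below. The package name, type constraint, and options are illustrative, not taken from the module.

    package My::ResolveDeps::Sketch;    # hypothetical package, not the real module
    use Moose;

    # Holds the ordered list of job names once the deps have been resolved
    has 'schedule' => (
        is      => 'rw',
        isa     => 'ArrayRef',
        default => sub { [] },
    );

    1;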

Subroutines

schedule_jobs

Use Algorithm::Dependency to schedule the jobs

Catch any scheduling errors not caught by the sanity check
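
Algorithm::Dependency ships a hash-of-arrays source and an ordered scheduler, which is one straightforward way to produce such a schedule. The sketch below uses an invented dependency map and shows the general technique, not the module's actual code.

    use strict;
    use warnings;
    use Algorithm::Dependency::Ordered;
    use Algorithm::Dependency::Source::HoA;

    # Hypothetical dependency map: job_name => [ job deps ]
    my %deps = (
        raw_fastqc  => [],
        trimmomatic => [],
        align       => ['trimmomatic'],
        qc          => [ 'raw_fastqc', 'align' ],
    );

    my $source = Algorithm::Dependency::Source::HoA->new( \%deps );
    my $dep    = Algorithm::Dependency::Ordered->new( source => $source )
        or die "Could not build the dependency tree!";

    # schedule_all returns the jobs in an order that satisfies every dep;
    # a false return signals a scheduling error such as a circular dependency
    my $schedule = $dep->schedule_all
        or die "Error resolving the schedule (circular dependency?)\n";

    print join( "\n", @{$schedule} ), "\n";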

sanity_check_schedule

Run a sanity check on the schedule. All job deps should refer to existing job names
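
As an illustration of that check, the loop below walks a hash-of-arrays dependency map and flags any dep that does not match a declared job name. The data and messages are invented for the example.

    use strict;
    use warnings;

    # Hypothetical dependency map: job_name => [ job deps ]
    my %deps = (
        trimmomatic => [],
        align       => ['trimmomatic'],
        qc          => ['aln'],    # typo: there is no job named 'aln'
    );

    my @errors;
    for my $job ( sort keys %deps ) {
        for my $dep ( @{ $deps{$job} } ) {
            push @errors, "Job '$job' depends on '$dep', which is not a declared job name"
                unless exists $deps{$dep};
        }
    }
    warn "$_\n" for @errors;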

chunk_commands

Chunk commands per job into batches

#TODO Clean this up
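
One simple way to chunk a command list is List::MoreUtils' natatime iterator; the batch size and commands below are illustrative, and the module itself may chunk differently.

    use strict;
    use warnings;
    use List::MoreUtils qw(natatime);

    my $commands_per_node = 4;                    # illustrative value
    my @cmds = map { "echo task $_" } 1 .. 10;    # illustrative commands

    # Walk the command list N at a time; each chunk becomes one batch
    my @batches;
    my $it = natatime( $commands_per_node, @cmds );
    while ( my @chunk = $it->() ) {
        push @batches, [@chunk];
    }

    printf "batch %d has %d tasks\n", $_ + 1, scalar @{ $batches[$_] }
        for 0 .. $#batches;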

resolve_max_array_size

Arrays should not be greater than the max_array_size variable

If an array is larger than that, it needs to be chunked into multiple arrays

Each array becomes its own 'batch'
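
For example, with a scheduler limit of max_array_size, an oversized task index range can be split into consecutive sub-ranges, each of which would be submitted as its own array job. This is a sketch with invented numbers, not the module's code.

    use strict;
    use warnings;

    my $max_array_size = 1000;            # e.g. the scheduler's max array size limit
    my ( $start, $end ) = ( 1, 2500 );    # illustrative task index range

    # Split [start, end] into ranges no longer than max_array_size;
    # each range becomes its own 'batch'
    my @ranges;
    while ( $start <= $end ) {
        my $chunk_end = $start + $max_array_size - 1;
        $chunk_end = $end if $chunk_end > $end;
        push @ranges, [ $start, $chunk_end ];
        $start = $chunk_end + 1;
    }

    print "array $_->[0]-$_->[1]\n" for @ranges;    # 1-1000, 1001-2000, 2001-2500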

assign_batch_stats

Iterate through the batches to assign stats (number of batches per job, number of tasks per command, etc)
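
Collecting those stats amounts to a pass over the batches; the data layout and field names below are invented for the example.

    use strict;
    use warnings;

    # Hypothetical batches: each inner array ref is one batch of commands
    my %job_batches = (
        align => [ [ 'cmd1', 'cmd2' ], ['cmd3'] ],
        qc    => [ ['cmd4'] ],
    );

    my %stats;
    for my $job ( keys %job_batches ) {
        my $batches = $job_batches{$job};
        $stats{$job}{num_batches} = scalar @{$batches};
        $stats{$job}{num_tasks}   = 0;
        $stats{$job}{num_tasks} += scalar @{$_} for @{$batches};
    }

    printf "%s: %d batches, %d tasks\n",
        $_, $stats{$_}{num_batches}, $stats{$_}{num_tasks}
        for sort keys %stats;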

assign_batches

Each jobtype has one or more batches. Iterate over the batches to get some data and assign it to each batch.

For batches: each HPC::Runner::Command::submit_jobs::Utils::Scheduler::Batch is an element in the array. Each element can have up to commands_per_node tasks.
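
As a sketch of that structure, the example below chunks each jobtype's commands into batches of commands_per_node and represents each Batch as a plain hash ref. The real class is a full object; the jobs, commands, and keys here are illustrative.

    use strict;
    use warnings;
    use List::MoreUtils qw(natatime);

    my $commands_per_node = 2;    # illustrative value
    my %jobs = (
        align => [ map {"align sample$_"} 1 .. 5 ],
        qc    => ['multiqc .'],
    );

    # A plain hash ref stands in here for a
    # HPC::Runner::Command::submit_jobs::Utils::Scheduler::Batch object
    my %job_batches;
    for my $job ( sort keys %jobs ) {
        my $it = natatime( $commands_per_node, @{ $jobs{$job} } );
        while ( my @cmds = $it->() ) {
            push @{ $job_batches{$job} }, { cmds => [@cmds], batch_tags => [] };
        }
    }

    printf "%s: %d batches\n", $_, scalar @{ $job_batches{$_} }
        for sort keys %job_batches;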

assign_batch_tags

Parse the #TASK lines to get batch_tags. #TODO We should do this while we are reading in the file
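
A minimal sketch of that parsing step, assuming '#TASK tags=...' lines with comma-separated tags; the input lines are invented.

    use strict;
    use warnings;

    # Illustrative submission file contents
    my @lines = (
        '#TASK tags=Sample_A',
        'fastqc Sample_A.fastq.gz',
        '#TASK tags=Sample_B,Sample_C',
        'fastqc Sample_B.fastq.gz Sample_C.fastq.gz',
    );

    my @batch_tags;
    for my $line (@lines) {
        # Tags may be comma separated: tags=Sample_B,Sample_C
        push @batch_tags, split /,/, $1 if $line =~ /^#TASK\s+tags=(\S+)/;
    }

    print join( ', ', @batch_tags ), "\n";    # Sample_A, Sample_B, Sample_C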