Problem

Given n jobs arriving at time 0, each job i has a processing time p_i and a deadline d_i. A schedule assigns a start time s_i to each job; the finish time is f_i = s_i + p_i. The lateness of job i is l_i = max(0, f_i - d_i). Find a schedule that minimizes the maximum lateness L = max_i l_i.
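
To make the objective concrete, here is a minimal Python sketch that evaluates the maximum lateness of a given schedule; the function name max_lateness_of_schedule is illustrative, and the code only restates the definitions above.

def max_lateness_of_schedule(jobs, starts):
  # jobs: list of (p, d) pairs; starts[i] is the start time s_i chosen for job i
  L = 0
  for (p, d), s in zip(jobs, starts):
    f = s + p                  # finish time f_i = s_i + p_i
    L = max(L, max(0, f - d))  # lateness l_i = max(0, f_i - d_i)
  return L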

Examples

Example 1

gantt
  title Jobs
  dateFormat  YYYY
  axisFormat  %y
  section Schedule
  Job3             : 2000, 1y
  Job1             : 2001, 3y
  Job2             : 2004, 2y
  section Deadlines
  Job3 deadline    :milestone, 2003, 0
  Job1 deadline    :milestone, 2004, 0
  Job2 deadline    :milestone, 2006, 0
  
Input: jobs = [(3,4), (2,6), (1,3)], where each pair is (p, d)
Output: 0
Explanation: One optimal schedule (by non-decreasing deadline) is job3 -> job1 -> job2; every job finishes by its deadline, so the maximum lateness is 0.

Solution

The optimal greedy policy is to schedule jobs in non-decreasing order of deadlines (Earliest Deadline First). Below we give the intuition, a precise approach, and code in multiple languages.

Method 1 - Earliest Deadline First

Intuition

Scheduling the job with the earliest deadline next prevents imminent deadlines from being missed. An exchange argument shows that any optimal schedule can be transformed (without increasing maximum lateness) into one that orders jobs by non-decreasing d_i.
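
As a quick sanity check (not a proof), one can compare EDF against every permutation on a small instance; the following Python sketch, with illustrative helper names, does exactly that for the example input above.

from itertools import permutations

def max_lateness(order):
  # order: jobs as (p, d) pairs, run back to back starting at time 0
  time, L = 0, 0
  for p, d in order:
    time += p
    L = max(L, time - d)
  return max(0, L)

jobs = [(3, 4), (2, 6), (1, 3)]
edf = sorted(jobs, key=lambda j: j[1])  # earliest deadline first
best = min(max_lateness(list(perm)) for perm in permutations(jobs))
assert max_lateness(edf) == best        # EDF matches the brute-force optimum here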

Approach

  1. Sort jobs by deadline in non-decreasing order.

  2. Set time = 0.

  3. For each job in sorted order:

  • Assign start = time and finish = time + p.

  • Compute lateness = max(0, finish - deadline) and update L = max(L, lateness).

  • Set time = finish.

  4. Return L (maximum lateness) and the schedule if required (see the sketch after the edge cases below).

Edge cases:

  • If n == 0 return L = 0.
  • Jobs with equal deadlines may be ordered arbitrarily; the maximum lateness is identical for any order among equal-deadline jobs.
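
Since the main code listings below return only L, here is a minimal Python sketch of the same approach that also reconstructs the schedule; the function name edf_schedule is illustrative.

def edf_schedule(jobs):
  # jobs: list of (p, d); returns (L, schedule), where schedule holds (start, finish)
  # for each job in the order it is run
  order = sorted(jobs, key=lambda j: j[1])  # step 1: sort by deadline
  time, L, schedule = 0, 0, []              # step 2: start the clock at 0
  for p, d in order:                        # step 3: place jobs back to back
    start, finish = time, time + p
    schedule.append((start, finish))
    L = max(L, finish - d)
    time = finish
  return max(0, L), schedule                # step 4: report L and the schedule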

Code

#include <algorithm>
#include <utility>
#include <vector>
using namespace std;

class Solution {
 public:
  int minimizeMaximumLateness(vector<pair<int,int>> jobs) {
    // jobs: vector of (p, d); sort by deadline (Earliest Deadline First)
    sort(jobs.begin(), jobs.end(),
         [](const auto &a, const auto &b){ return a.second < b.second; });
    int time = 0;  // finish time of the schedule built so far
    int L = 0;     // maximum lateness seen so far
    for (auto &job : jobs) {
      int p = job.first, d = job.second;
      time += p;
      L = max(L, time - d);
    }
    return max(0, L);  // clamp to 0 when every job meets its deadline
  }
};
package main

import "sort"

func minimizeMaximumLateness(jobs [][2]int) int {
  // jobs: [[p,d], ...]; sort by deadline (Earliest Deadline First)
  sort.Slice(jobs, func(i, j int) bool { return jobs[i][1] < jobs[j][1] })
  time := 0 // finish time of the schedule built so far
  L := 0    // maximum lateness seen so far
  for _, job := range jobs {
    p, d := job[0], job[1]
    time += p
    if time-d > L {
      L = time - d
    }
  }
  if L < 0 {
    return 0
  }
  return L
}
import java.util.Arrays;

class Solution {
  public int minimizeMaximumLateness(int[][] jobs) {
    // jobs: [[p,d], ...]
    Arrays.sort(jobs, (a, b) -> Integer.compare(a[1], b[1]));
    int time = 0;
    int L = 0;
    for (int[] job : jobs) {
      int p = job[0], d = job[1];
      time += p;
      L = Math.max(L, time - d);
    }
    return Math.max(0, L);
  }
}
class Solution:
  def minimize_max_lateness(self, jobs: list[tuple[int,int]]) -> int:
    # jobs: list of (p, d)
    if not jobs:
      return 0
    jobs.sort(key=lambda x: x[1])
    time = 0
    L = 0
    for p, d in jobs:
      time += p
      L = max(L, time - d)
    return max(0, L)
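
For instance, running the Python version on the example input above should report a maximum lateness of 0:

sol = Solution()
print(sol.minimize_max_lateness([(3, 4), (2, 6), (1, 3)]))  # expected output: 0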

Complexity

  • ⏰ Time complexity: O(n log n) for sorting the n jobs; the scan is O(n), so total O(n log n).
  • 🧺 Space complexity: O(1) extra space beyond input (or O(n) if the sort implementation requires auxiliary space).