You are given a 2D integer matrix grid of size n x m, an integer array limits of length n, and an integer k. The task is to find the maximum sum of at most k elements from the matrix grid such that:
The number of elements taken from the ith row of grid does not exceed limits[i].
Return the maximum sum.
Example 1:
Input: grid = [[1,2],[3,4]], limits = [1,2], k = 2
Output: 7
Explanation:
From the second row, we can take at most 2 elements. The elements taken are 4 and 3.
The maximum possible sum of at most 2 selected elements is 4 + 3 = 7.
Example 2:
Input: grid = [[5,3,7],[8,2,6]], limits = [2,2], k = 3
Output: 21
Explanation:
From the first row, we can take at most 2 elements. The element taken is 7.
From the second row, we can take at most 2 elements. The elements taken are 8 and 6.
The maximum possible sum of at most 3 selected elements is 7 + 8 + 6 = 21.
Constraints:
n == grid.length == limits.length
m == grid[i].length
1 <= n, m <= 500
0 <= grid[i][j] <= 10^5
0 <= limits[i] <= m
0 <= k <= min(n * m, sum(limits))
Solutions
Solution 1: Greedy + Priority Queue (Min-Heap)
We can use a priority queue (min-heap) \(\textit{pq}\) to maintain the largest \(k\) candidate elements.
Traverse the rows one by one. For the \(i\)-th row, sort its elements and push the largest \(\textit{limits}[i]\) of them into \(\textit{pq}\); whenever the size of \(\textit{pq}\) exceeds \(k\), pop the top (smallest) element of the heap.
Finally, sum the elements remaining in \(\textit{pq}\).
The time complexity is \(O(n \times m \times (\log m + \log k))\), and the space complexity is \(O(k)\). Here, \(n\) and \(m\) are the number of rows and columns of the matrix \(\textit{grid}\), respectively.
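A minimal Python sketch of this approach (the function name maxSum and its signature are assumptions for illustration, not fixed by the text above):

```python
import heapq
from typing import List


def maxSum(grid: List[List[int]], limits: List[int], k: int) -> int:
    # Min-heap holding the k largest candidates seen so far.
    pq: List[int] = []
    for row, limit in zip(grid, limits):
        row.sort()
        # Only the largest `limit` elements of this row can ever be chosen.
        for x in row[len(row) - limit:]:
            heapq.heappush(pq, x)
            if len(pq) > k:
                # Evict the smallest candidate to keep only the k largest.
                heapq.heappop(pq)
    return sum(pq)
```

For Example 1, maxSum([[1, 2], [3, 4]], [1, 2], 2) pushes 2 from the first row and 3, 4 from the second, evicts 2 once the heap exceeds size 2, and returns 3 + 4 = 7. Note the slice row[len(row) - limit:] rather than row[-limit:], which handles limits[i] == 0 correctly by taking nothing.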