## Greedy Algorithms

In this section, we'll learn about the basic characteristics of greedy algorithms by working through a classic example. We'll then compare and contrast greedy algorithms with dynamic programming to understand when to use each.

###### DESCRIPTION (credit Leetcode.com)

You are given two integer arrays:

- greeds (of size n), where each element represents the minimum size of a cookie that a child needs to be satisfied.
- cookies (of size m), where each element represents the size of a cookie.

Your task is to assign cookies to children such that as many children as possible are satisfied. A child is satisfied if the cookie they receive is equal to or greater than their greed factor. Each child can receive at most one cookie, and each cookie can be given to only one child. Write a function to return the maximum number of children that can be satisfied.

**Example 1**:
Input:

```python
greeds = [1, 2, 3]
cookies = [1, 1]
```

Output: 1

Explanation:

The first child with a greed of 1 can be satisfied with the first cookie of size 1. The second cookie of size 1 cannot satisfy the second child with a greed of 2. Therefore, only one child can be satisfied.

**Example 2**:

Input:

```python
greeds = [1, 2]
cookies = [1, 2, 3]
```

Output: 2

Explanation:

The first child with a greed of 1 can be satisfied with the first cookie of size 1. The second child with a greed of 2 can be satisfied with the second cookie of size 2. The third cookie of size 3 is not needed as both children are already satisfied. Therefore, both children (2 in total) can be satisfied.

### Intuition

Intuitively, we want to give each child the smallest cookie that satisfies them. This saves the larger cookies for the greedier children, allowing us to maximize the number of satisfied children.

### Greedy Algorithm

The greedy algorithm sorts both the greeds and cookies arrays in ascending order. This places the child with the smallest greed and the smallest cookie at the front of each array.

For example:

```python
greeds = [1, 3, 3, 4]
cookies = [2, 2, 3, 3]
```

We then initialize two pointers i and j to the start of the greeds and cookies arrays, respectively. i represents the current child and j represents the current cookie.

If cookies[j] >= greeds[i], that means the current cookie can satisfy the current child. We increment the number of satisfied children and move to the next child and cookie.

If cookies[j] < greeds[i], that means the current cookie *cannot* satisfy the current child, so we move to the next cookie to see if it can.

We can continue this process until we reach the end of either the greeds or cookies arrays, and return the number of satisfied children as the result.

```python
def findContentChildren(greeds, cookies):
    greeds.sort()
    cookies.sort()
    count = 0
    i, j = 0, 0
    while i < len(greeds) and j < len(cookies):
        # current cookie can satisfy current child
        if cookies[j] >= greeds[i]:
            count += 1
            i += 1
        j += 1
    return count
```

#### Complexity Analysis

**Time Complexity**: O(n log n + m log m) where n is the number of children and m is the number of cookies. We sort the greeds and cookies arrays in O(n log n + m log m) time, and then iterate through the arrays in O(n + m) time.

**Space Complexity**: O(1), aside from any auxiliary space used by the sorting routine, since we only use a constant amount of extra space for the pointers and counter.

### What Makes this a Greedy Algorithm?

There are a few characteristics of this algorithm that make it a greedy algorithm:

**Greedy Choice Property**
By repeatedly making a *locally* optimal choice, we can arrive at a *globally* optimal solution.

In this case, after sorting the arrays, the locally optimal (or "greedy") choice is to give each child the smallest cookie that will satisfy them. After iterating over the array, we will have maximized the number of satisfied children (the global optimal solution).

In other words, we always make the best possible choice without worrying about the future consequences of that choice. When we do need to worry about future consequences, we often need to use dynamic programming instead.

**Optimal Substructure**
The optimal solution to the problem can be constructed from the optimal solutions to its subproblems.

For this question, once we have assigned a cookie to a child, we can safely remove the child and the cookie from the arrays, and the problem reduces to assigning the remaining cookies to the remaining children. This allows us to solve the problem by making a series of locally optimal choices.

**No Backtracking**
Greedy algorithms make a decision once and do not revisit it. In this case, once we have assigned a cookie to a child, we never revisit that decision by taking the cookie back and giving it to another child, or giving the child a different cookie.

### Greedy vs. Dynamic Programming

Greedy algorithms and dynamic programming are both used to solve optimization problems, so it's useful to build an understanding of when to use each approach.

We'll take a brief look at a problem in which a greedy approach does not work, and see how dynamic programming solves the issues that greedy algorithms face.

###### DESCRIPTION

Given an array of integers, find the longest increasing subsequence (LIS) in the array. The subsequence does not have to be contiguous.

**Example**:
Input:

```python
nums = [10, 13, 2, 5, 3, 7, 101, 18]
```

Output: 4. The longest increasing subsequence is [2, 3, 7, 101].

A greedy approach to this problem might involve choosing the first element in the array as the start of the subsequence, and then iterating over the array to add the next element to the subsequence if it is greater than the last element in the subsequence.

For example, if nums = [10, 13, 2, 5, 3, 7, 101, 18], this would yield the subsequence [10, 13, 101].

**The issue with this approach is that choosing the first element 10 affects our ability to actually choose the optimal subsequence later on**, which starts with 2. This means that the *greedy choice* does not yield the *globally optimal solution*.
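To make the failure mode concrete, here is a minimal sketch of that greedy strategy (the function name `greedy_lis` is our own, for illustration only):

```python
def greedy_lis(nums):
    """Greedy sketch: start the subsequence at the first element and
    append any later element greater than the current last element.
    This is NOT correct in general -- it's shown here to illustrate
    why the greedy choice fails for this problem."""
    if not nums:
        return []
    subsequence = [nums[0]]
    for num in nums[1:]:
        if num > subsequence[-1]:
            subsequence.append(num)
    return subsequence

greedy_lis([10, 13, 2, 5, 3, 7, 101, 18])  # → [10, 13, 101], length 3
```

Committing to `10` up front locks the algorithm out of the longer subsequence `[2, 3, 7, 101]`, so it returns a subsequence of length 3 rather than the optimal 4.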

The dynamic programming approach instead involves iterating through the array and keeping track of the length of the longest increasing subsequence ending at each element, allowing us to consider all possible subsequences and choose the longest one.
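That idea can be sketched as a standard O(n²) dynamic programming solution, where `dp[i]` holds the length of the longest increasing subsequence ending at `nums[i]`:

```python
def length_of_lis(nums):
    """dp[i] = length of the longest increasing subsequence ending at
    nums[i]. Each element starts as a subsequence of length 1, and can
    extend any earlier, smaller element's subsequence."""
    if not nums:
        return 0
    dp = [1] * len(nums)
    for i in range(1, len(nums)):
        for j in range(i):
            if nums[j] < nums[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)

length_of_lis([10, 13, 2, 5, 3, 7, 101, 18])  # → 4
```

Unlike the greedy approach, `dp[i]` considers *every* earlier element as a possible predecessor, so no early commitment can lock the algorithm out of the optimal answer.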
