Has Biden Run America into the Ground?

It used to be just right-wing conservatives claiming that President Biden has failed our nation. Now more and more left-wing Democrats are hopping on board with the idea that Biden has run our great country into the ground. Liz breaks down the current state of our country.