In the '70s and '80s, what private equity did was change corporate America. It started holding companies accountable, and for the first time managers started thinking like owners.