New study finds majority of employees within corporate America believe companies need to play a more active role in addressing important societal issues

News provided by Povaddo
May 23, 2017, 07:00 ET