How 'The Office' Changed The Way We Work In Offices

What started as a mirror of everyday office life ended up shaping the real world. By the time "The Office" wrapped up, the workplace culture depicted on the show barely resembled the average American office, and that's a good thing. We loved the show, and we're glad for the lessons it taught us and the ways it changed how we work!