Hollywood’s Negative Impact on Americans’ Body Image
The problem is that this negatively affects people's views of their self-worth. The negative body image that Hollywood promotes greatly affects women in their teens as well...