People’s first impression of Western culture often comes from Western TV and movies, but this can be misleading and reinforces stereotypes.
Whilst TV and movies reflect and influence culture to an extent, they are not a reliable indicator of everyday life.
If we believed everything on TV, I’d think all Chinese people are kung fu masters, and you’d think Westerners have affairs all the time.
There are elements of truth in both, but neither is an accurate depiction.
So where does Western culture come from?