I'm asking about social norms that are deemed acceptable, not "what I think is wrong with America."
What do you dislike about life here in the USA but haven't felt ready to say openly?
I suspect many accepted social norms really aren't acceptable, but who is willing to tell us which ones?