Are you claiming that immigrants should leave "their culture" at the door when they come to the US and embrace "US culture"?
Yes.
Imagine if you (as an American) moved to any non-Western country and did not integrate into its society.
What do you think would happen to you?
And how do you think you (as an American) would be received by that country’s general population, media, law enforcement, etc?
By all means, name a specific country or countries you’re thinking about for any examples.
I’ve done lots of solo travel abroad (incredible experiences) and lived abroad, but I never wanted to stick out like a sore thumb, especially after the Iraq war in 2003; the US and individual Americans became a **lot** less popular after that disaster.
It was uniquely bad for Americans abroad around 2005, and we’re going to pay the price for invading Iraq for an entire generation. But I digress.
Obviously, you can’t completely avoid standing out, but there are some simple, respectful things you can do to help blend into (and integrate with) other societies, and locals appreciate that. It’s respectful of their culture, customs, and way of life. And it’s safer.
Related: North America and Europe are the only places that actually pay immigrants not to integrate.
I assume this started for the US after the establishment of LBJ’s mid-1960s welfare programs.