I ask this having attended events featuring national or ethnic dress, food, and other cultural traditions. What can a white American say their culture is? It feels like, for better or worse, it’s all been melted together.
Trying to trace things back to European roots feels disingenuous, because I’ve been disconnected from those roots for a few generations.
This also makes me wonder: was there any political motive in making white American culture everything and nothing?
As a white American myself, I define it as a lot of things. It’s mostly European in origin. Things like country music, burgers and fries, flannel shirts, line dancing, and the Beach Boys are just some of the things that scream white Americana.