In my younger years I traveled a LOT throughout the States: Midwest, East Coast, the South.
Alas, I missed out on the western U.S., places like California and Texas.
On my travels, I met nothing but kind-hearted Americans. Now that I have traveled far less over the last 40 or so years, I keep hearing and reading about how "bad" it's gotten. I never noticed "bad" in my home state of Minnesota. Then again, the George Floyd incident happened.
When reading the headlines, I have to wonder: HAVE things gone from "bad" to "worse"? HAS civil discourse deteriorated? Is it less safe to travel NOW, and if I were in my youth NOW and traveled throughout the States, would my experiences be different?
Some perspective:
The Best Countries to Travel Alone ranking draws on a global perceptions-based survey and ranks countries by their scores on seven attributes: culturally accessible, fun, friendly, pleasant climate, safe, scenic and approachable.
https://www.usnews.com/news/best-countries/best-countries-to-travel-alone
Of course, we all have our own perspectives and experiences, and I have some issues with the list in the link. For example: Malaysia is listed ahead of the U.S., and I personally would never travel alone to Malaysia. Canada is ranked 13th, even though from personal experience traveling alone there has never been an issue. Yet Brazil is listed as the #7 best country for traveling alone.
We could quibble over whether "best" also means "safest," but when I look at lists like the one above (an internet search will turn up plenty of other sources and links) and see how far the U.S. has fallen, I wonder.
So, how about YOUR experiences? Is the U.S. more or less as safe as ever, or is it now less safe? Setting safety aside, would you recommend people visit the U.S. in the current political climate? OR are there still enough decent Americans that, despite what we hear and read in the media, the U.S. remains an awesome country to travel through or visit?