Do all American women wear bikinis?

The US is a bit of a mixed (bikini) bag. The majority of American women wear full-coverage suits, one-pieces, and tankinis; however, places like Miami, California, and Hawaii embrace the beach lifestyle, and with it the bikinis get smaller and smaller!