I know that geographically, Florida is about as far south as you can get in the US. However, when I visited Miami, I didn't feel like I was in the South; it felt more like Los Angeles or Orange County, especially with everyone showing off their wealth and material possessions. Orlando didn't feel very southern either, and the same goes for Jacksonville. What about the rest of Florida? Would Tampa Bay, Fort Myers, or places like that be considered southern?