Do y'all consider Florida a southern state?

I'm from Central Florida, right outside of Daytona Beach. Now, I've traveled all throughout the South (excluding Tennessee & Kentucky), but I have never been south of Tampa, so I really don't know what that third of Florida is like.

Most people where I'm from consider Florida southern, with the exception of Miami. I've met people who think only North and Central Florida are southern, and I've also met people who think Florida is entirely southern, because having one big city with a diverse cultural scene doesn't take away from the rest of the state being southern.

Obviously, compared to non-southern states, Florida is southern. But compared to other southern states, is Florida southern?

I'm going to go ahead and agree with the statement that Florida as a whole is southern. I look at it this way: 1) We were in the Confederacy, which was the South. 2) Even if you only count two-thirds of the state as southern (and I don't see it that way), that's still a majority, and the majority sets the generalization. 3) Unless they were born or raised in a different state or country, everyone from Florida that I have ever met has a southern drawl.

But that's just my opinion, I want to know other people's opinions!
