Movie stars are typically Northerners, West Coast folks, Midwesterners, Canadians, Brits, Aussies, and Kiwis. Very few are true Southerners, though.
Why is that? Do they only star in Southern-based films and never venture out of Dixie?
By Southerners, I mean people from Texas to Virginia.