Is it true that Hollywood never actually had US Southerners in films or TV shows?

Movie stars are typically Northerners, West Coast folks, Midwesterners, Canadians, Brits, Aussies, and Kiwis. Very few true Southerners, though.

Why? Do they only star in Southern-based films, and never venture out of Dixie?

I mean people from Texas to Virginia.
