It's a hotly debated question whether the Empire was a good or a bad thing for the world.
The Empire stopped a LOT of slavery in Africa, the Caribbean and even Asia; it also left many African and Asian countries more economically developed, with better infrastructure, legal systems, and rules for trade and banking.
A lot of imperial history still stands today, such as in Mumbai, India, where many of the buildings are ones the British built. I mean, without the Empire, Hong Kong wouldn't even exist!
Unfortunately, the Empire wasn't all good. Towards its end there were many rebellions and uprisings in the countries Britain ruled, which resulted in a great deal of bloodshed and several massacres. Some people say that the British "raped" countries of their resources and money and took it all back to Britain, which in turn stopped those countries from developing even further.
So what's your view?
(Whatever your opinion, please explain why you have that opinion)