"The debate is open! »
- Western media take an interest only in Africa's "heads" side, as if the coin had no "tails." All day long they talk about Africa's misery, hunger, disease, wars (...).
- They talk about Africa as if the West (America, Europe) were a paradise everywhere. Yet in the West (America, Europe) there are also plenty of things that leave something to be desired.
If every coin has its flip side, then Africa, like any other continent, has both a positive side and a negative side. So any argument meant to denigrate Africa is nothing but slapstick and burlesque farce... True or false?