The world’s media portray Africa as a land where people war for a living and are plagued by poverty and disease. In fact, some people in the Western world see Africa as a single country, so do not be surprised if you hear or read statements like, “in China, the US, France and Africa…”. But the reality is, Africa…