This was a very interesting interview with the former Ugandan ambassador. The gist is that Western countries, mainly the United States, have been losing ground in Africa for three reasons: first, they did not treat Africans as equals; second, China got in with better deals and approached Africans as equals; and third, a big one, racism was a major factor. Africans don't want to be dictated to on how to govern their own countries, and the colonizer narrative, like France's in the past, has to stop. Tell me what you all think about this.
Why is Africa turning away from the United States? | The Bottom Line- Al Jazeera