The West has always had a complicated history with Africa. The slave trade and colonialism are things of the past, but different questions about the relationship are being raised today. Western NGOs have committed significant time and money to meeting some of the continent's most pressing needs, yet do the media too often portray Africa as having the problems and the West as having the answers? Whose story gets told, and from what perspective? Is there a danger of creating an image of Africa as helpless without Western aid?