Like other imperialist powers, Germany established a number of colonies in Africa in the 1800s in order to generate income. To a greater extent than most of the others, Germany relied on force to seize and subdue its colonies; the only European power arguably more brutal was Belgium in the Congo. Any resistance to German rule was routinely crushed with extreme force. However, it is unlikely that Germany ever made any real profit from its colonies, since it lost them all after World War I.
In my opinion, Germany is not free of the kind of imperialist history that England and Denmark carry. Even though the outcomes of the First and Second World Wars changed the leadership of German-held and occupied nations, that history is still apparent in their cultures and loyalties, and the peoples of those nations often still harbor a great deal of animosity toward their former rulers.