The question is vague, but one presumes it means progress towards a more enlightened or less violent society. There is no question that, judging from history and other cultures, the top cause of death in the West is no longer murder, which reflects a significant overall decline in violence within society. International wars are also less frequent than they have been in the past.
I believe the West has made progress considering its past violence. Violence tends to go up and down over time, and more than likely that is the case for the West as well. It has reached a low point, but it is unlikely to stay that way for long. Unfortunately.
I believe that the West has made progress considering past violence. I think too much attention has been paid to the violent histories of Western countries. The world was a different place in the past; it is more important that modern societies focus instead on the future.
Scotland and Scandinavia? Oh hell no! Other countries, yes. Remember, all of the world was like this a long time ago, but some countries are learning from their mistakes (Ireland, Germany, England, France, Spain). So no, you can't say all of the West has learned from its violent past, because it hasn't.
No, the West has not made progress considering past violence, because there is still a great amount of violence. Even within our own country, domestic violence is tolerated more than ever before. Drugs, and the violent deaths associated with their trade, are still a pervasive problem that we tolerate far too much.