When you say 'nordics', what do you mean? Do you mean white people of Nordic/European descent? If so, then definitely not. That group largely overlaps with the greater western world, and the greater western world is the most accepting, egalitarian, multicultural, and supportive society on the face of the earth, with the highest reported support for universal equal rights for all people regardless of race, gender, orientation, ability, etc. Given that, it would be hard to argue that the population of that region is ignorant of issues about race.
Realistically, who has done more to make the world a safe, equal, accepting place than 'nordics', as you put it? Who put a stop to slavery? The west. Who gave the working man the vote? The west. Who gave women and minorities equal rights and opportunities under the law, including an equal vote? The west. Who has the most robust system of universal human rights and freedoms in the world? The west.
Sounds like they are not only AWARE of the issues, but fighting constantly and consistently against tyranny in this regard, fighting hardest for universal equal rights.
As much as people seem to want to paint the west as some horrible society that hates women and minorities, the argument falls flat when you look at how well the west is doing compared to the rest of the world.
Granted, with people spitting in the face of that history of rising above the superstitions and prejudices of old, making it seem like we are still in the dark ages, I am not sure how long it will last... People are fed up with the doublethink. How the hell else do you think Trump got elected?