The definition of "The West" or "Western", as it is commonly used in this context, identifies the West with, essentially, the nations where Christianity (particularly non-Eastern-Orthodox Christianity) established a strong foothold and became by far the most popular religion (e.g. the EU, Canada, etc.). One can debate whether "Is Christianity a Western thing?" makes sense if one ignores the connotations surrounding the word "West", but if one takes into account the meanings ascribed to the word, the statement claims something perfectly reasonable: Christianity is predominantly a religion practiced by these Western countries. Sure, one could argue the question implies an origin, but that view (and its opposite) are rather arbitrary, so I vote with my typical bias: toward the side that doesn't try to nitpick.
Christianity began in the Middle East, where Jesus our savior was born. From that region, his influence spread slowly, and it did eventually reach Europe. Then the Middle East produced its own faith, Islam, which threatened Christians. This is when the Catholic Church in Europe (you could say the "West") got involved by launching military campaigns to defend the holy ground in Jerusalem (modern-day Israel/Palestine). To conclude, it is NOT a "Western thing". Even though a lot of Westerners are Christians (Catholics and Protestants), it is still not.
This has to be one of the dumbest debate topics ever asked. Who would think of something like this in the first place? People have the freedom to believe in whatever religion they choose. I myself am Christian, but even if I weren't, I would still believe this is a stupid topic to debate.
Sometimes people should not even put this kind of question to the public. A lot of people, and I mean a lot of people, will be offended, because some people might be Canadian or American or even Russian and still be Christian. So I personally think it shouldn't be discussed publicly.
Christianity first became real in Asia. Look in the Bible: that is where it first started. We think it's a Western thing because a lot of Americans (not to be racist) are Christians, and so we assume it started in the Western part of the world.