This is going to be a bit of a weird topic, so please bear with me if it's unclear or if I've worded it poorly.
Since, in the West at least, we seem to be witnessing the twilight of Christian civilisation as it regresses into paganism and libertinism, I've started to wonder whether Christian civilisation as we've known it was an accident.
Do you think Christianity was meant to take hold in the way that it did? Do you think it would have been possible for the Faith to have existed and thrived without being compromised by the earthly authority of kings and emperors? Because from this tiny seed it ended up encompassing massive empires which grew rich and went to war with each other.
To me it seems more probable that Christian civilisation should have ended up looking something like the austere conditions of Cluny or Sketis, or even like modern Egypt or Iraq, where Christians are constantly menaced with death, rather than the opulence of Constantinople or even Rome.
What I'm asking is, basically: do you think that Christianity was ever meant to be a "dominant" or "majority" religion, rather than one always threatened by the "rulers of darkness of this world"? Because as time goes on, it seems very likely that, either literally or figuratively, we will be going back into the catacombs. Do you think it should always have been this way? Or was the creation of Christian nations an inevitability, and it's only our own fallen nature which makes it hard to look past worldly attachments like money, power, glory, etc.?
Please note this is not an attack on the church, ecclesiology, evangelisation, or the Faith in general. I don't believe that Christianity was or ever should become some sort of exclusive "mystery religion" like Mithraism; I only wonder whether "Christian nations" should have existed at all.