Official acceptance in the East and West
In AD 313, following Constantine's conversion the year before, the Edict of Milan ended the persecutions and Christianity became a legally recognized religion of the Roman Empire. The church grew and spread like wildfire, from the farthest eastern outposts to the islands of the far western reaches of the Empire. It still had its share of controversies and scandals, but believers were finally free from the threat of death for their faith.
(A discussion of the organizational changes, unifying theological creeds, and eventual schisms of the following centuries is beyond the scope of this book. A thorough analysis of the growth of Christianity from this period to the present day can be found in Diarmaid MacCulloch's Christianity: The First Three Thousand Years.)
Christianity remained the dominant faith of the East until the armies of Islam poured out of Arabia in the 630s and 640s, conquering kingdom after kingdom and establishing Islam as the official religion. In the wake of these conquests, Christian communities virtually disappeared from large sections of the Near East and North Africa.
In the West, Christianity was not only the dominant religion; it shaped culture and society as a whole. Then in 1565, the Spanish crossed the Atlantic Ocean, established the first permanent European settlement in what is now the United States at Saint Augustine, Florida, and brought the first Catholic missionaries to the land. In 1607, the English settled in Virginia, bringing with them the first Protestant clergy in America. For the next 400 years, the vast majority of America's citizens would profess a belief in the Christian faith.