Christendom
Christianity, or Christians collectively, or the regions where Christianity is the dominant faith (“the Christian world”). The term has been associated, at times pejoratively, with the concept of a “Christian state” or “Christian society,” a concept that can be traced to Constantine. The contemporary United States has been described as entering a “post-Christendom” era.
Glossary definitions provided courtesy of Church Publishing Incorporated, New York, NY (all rights reserved), from “An Episcopal Dictionary of the Church: A User-Friendly Reference for Episcopalians,” Don S. Armentrout and Robert Boak Slocum, editors.