While the United States remains shaped by Christianity, the faith’s influence—particularly as a force in American politics and culture—is slowly waning. A growing share of religiously unaffiliated Americans, a steady drop in church attendance, the recent Supreme Court decision on same-sex marriage, and rising tension over religious freedoms all point to a larger secularizing trend sweeping across the nation.