I think it would be better if Christianity were left out of schools. It seems that the more we try to teach Christianity, the more distorted it becomes. It is perhaps the most unteachable religion.
People also pick up preconceived notions of Christianity when they only get a "small slice of the basics" in school. Maybe it's better to keep it out of state schools so that people can discover Christianity in their own time.
I reckon we should keep Christianity something of a secret, preserved underground for future generations. Rather than having its concepts shoved into people's heads in high school, people would get to experience it later in life, when they are ready to comprehend its meaning.
Christianity doesn't "own" the West, so why should it insist on Western governments passing a law to have Christianity taught in schools? Why are we mixing Christianity with Western culture and politics?
Christianity is Christianity. It doesn't need to be integrated into a political system.