My understanding is that it is taught to children in US schools so long as the parent agrees to allow it.
I don't think it's that crazy, though, that a parent might prefer to talk about it with their child personally.
But parents don't always know the best way to inform their children. It's the same as driving education: parents who teach their kids tend to pass on bad habits, biases, and misunderstandings of what's right and wrong instead of teaching the objective facts of the situation. At worst, sex education should be done in conjunction with the schools, and it should be mandatory. It's best to let nature take its course with kids having a full understanding of what happens, rather than giving them misinformation or, worse, skipping over certain areas entirely.