My point in bringing this up is, one: holy fucking shit, what the hell is wrong with the human species, and two: why don't we hear about THIS? We (Americans, I mean) study WWII, but does anyone ONCE mention this event? Not commonly.
Also, I feel like I just need to talk about this now.
Why? America is a very America-centric country. You're probably not taught much beyond things pertaining to your own country. The Sino-Japanese wars were exceptionally brutal and undeniably devastating.
I think our education in general is shaped by what serves the government's interests. I seriously doubt US citizens learn about how much blood is on Uncle Sam's hands (directly and indirectly), except maybe for the Hiroshima/Nagasaki bombs.
I wish I'd learnt more about Europe and Asia, but I think I had a pretty rich History education. It covered Greece, Mesopotamia, our Portuguese colonization of course, the Dutch invasion, Africa, the 20th century, Brazilian History in general, lots of wars, US policy, the 1800s and so on... the rest I get from books.
I've always wondered what you guys from other parts of the world learn in History. Also, you shouldn't forget that America is a continent, not a country.