I wasn’t too different from most children who grow up in the United States and take lessons in history — specifically, the history of the United States and how it came to be. We learned about the Pilgrims and the Native Americans, and how wonderfully everything went when the Pilgrims settled the colonies in an effort to escape religious persecution. We learned about the first Thanksgiving meal and how much the Pilgrims learned about growing new crops and building homes.
It wasn’t until I entered university that I learned that everything I just mentioned was a pack of lies, sold to us without our knowledge. Not only that, but those same lies continue to be sold to children in the United States to this day.
Why is there this need to change the history of this country to make it sound so warm and wonderful?
For what reason do we not give information that is at least close to reality — that we weren’t exactly a welcome presence on this continent, that the people who bought Manhattan all but stole it with their “deal,” and that millions of Native inhabitants died from diseases brought to this country by settlers?
Okay, so I can see why we would want to tone down that last part just a little. Still, I can’t help but think that we should not romanticize the settling of the United States quite as much as teachers routinely do now.
What could teachers in the 21st century teach children that would not leave them screaming with never-ending nightmares, yet would not be mostly fictional?