As we all know by now, at our first contact with Europeans, the Eastern part of Indian Country was well settled with towns, cities, and nations of law-abiding, religious citizens.
Most Hollywood films from about 1902 until the 1960s were about drunken Indians of the Plains or the West, dancing around, whooping and banging their mouths with their hands, and killing Europeans, African Americans, and sometimes themselves.
Then, from the 1960s until now, many films have been about bad Europeans mistreating African Americans and Native Americans.
From the beginning until now, has Hollywood made even one really honest film about the good, the bad, and the ugly treatment of our people at first contact by these same Europeans?
The Hollywood westerns made a lot of money for the film industry in the 1940s and '50s.
Actors like Randolph Scott, Gary Cooper, and John Wayne were always shown protecting the poor, mistreated settlers from the murdering savages who killed for no reason.
And our people were almost always the villains.
Now times have changed, and because of some good films showing the other side of our story, films like Dances with Wolves and The Outlaw Josey Wales, everybody has seen that the films made about us in the past were dead wrong.
However, one must wonder why there were no films about the villages of the East Coast of this country.
Of how we took in these strangers? And of how people like Christopher Columbus, the Vikings, the Spanish conquistadors, the Dutch colonists, France, England, and so on abused, enslaved, and killed (no, slaughtered) the original inhabitants of this country?
Now all the rest of the world believes that all but the Casino Indians are long dead,