- Calvin Davis
- Houston, TX
- United States
Should the American school system place a greater focus on International Studies?
Some places in the US, Texas for example, teach state history with little focus beyond the state through high school. Given the current state of technology and an ever-increasing global economy, shouldn't we be concerned about avoiding the "American" stigma and ensuring that our children are knowledgeable about world history and the world outside America?
It may seem like a common-sense question, but I just don't see the focus, especially in public school systems.