Black History in America
Reflect on what you have learned in school about Black history in this country. Has your view of our country's history changed in light of recent events? How and why? What are your thoughts on the ways Black history is portrayed and taught? What, and who, is left out when these conversations happen in the classroom? What can be done to improve this?