Answer:

The American Civil War was one of the most important events in American history, forever changing the lives of Americans. ... The war also ended slavery, granting African American slaves their rights and freedom.
