Is it true that some people believe that if the Confederacy had won the war, its foreign policies and ideas would have made this country a better place?

Regards,

Thomas Murphy

? ? ?

Dear Mr. Murphy,

As far as I—and Dana Shoaf, editor of Civil War Times—can determine, no American in his right mind believes that.

Sincerely,

Jon Guttman
Research Director
World History
www.historynet.com

More Questions at Ask Mr. History