Civil war is nothing new to countries throughout the world. Most have experienced, or are even still experiencing, conflicts within their own borders. So how did other countries view America's Civil War from their own unique perspectives?
For Britain and France, the question was whether to intervene in the war. However, that was a consideration of the governments, not of the people of those countries. Popular opinion in both nations largely sided with the North, which pressured their governments to stay out of the conflict. Since slavery had already been abolished in Britain and France, it makes sense that their people would sympathize with the North.