The Treaty of Washington in 1871 Is When the U.S. Became a Colony of the British Empire