
Germany was founded in 1871. At the time, it was called the German Empire (Deutsches Reich), a name it kept until 1945. Where was the German Empire proclaimed?