After the "Norman conquest of England", many things including languages, the ways of life, etc. have changed a lot in Europe. Especially it affected the English language significantly that a lot of French words were borrowed by English.
Now, I would like to know how history describes the period when England was ruled by royal families of French origin. Also, if England was regarded as a colony, was there ever a war of independence against French rule? When, and through what event, did England become free of France's political influence?