Definition of Mandate
Mandate
man·date


Definition/Meaning
(noun)
an official order or instruction from a higher authority;

e.g. The government received a mandate from the people to reduce taxes.

(noun)
the authority to carry out a particular course of action or enact a particular policy, granted to a representative, especially one who has won an election;

e.g. The candidate respected the mandate of the citizens and developed policies to benefit them.

(noun)
in historical usage, a commission granted by the League of Nations to a member country to establish a government in and administer a conquered territory;

e.g. The mandate established a new government in the territory.

(verb)
to officially order or require something, i.e. to make it mandatory;

e.g. The terms of your contract mandate your attendance at this event.

(verb)
to officially require or direct someone to do something, especially to assign them a responsibility;

e.g. The new law will mandate them to take sole responsibility for administering this territory.


