Glossary / Lexicon
Enlightenment
The Enlightenment refers to a period during which Western philosophy embraced the belief that unbiased reason or the objective methods of science could reveal the principles governing the universe. Once discovered, these principles could be put to use for the betterment of humankind.