Androcentrism refers to the practice of viewing the world with the male at the center.
Alternative definition:
Androcentrism refers to the belief that the male is the norm.