Why are college campuses more liberal?

I went to a major university in the South, and it leaned far more liberal than the rest of the state.

Not trying to get political, but if we're talking about "DEI hires", it felt like for a while they were intentionally hiring more right-leaning professors. That was until our governor took complete control of our federally funded colleges to ensure an "unbiased education" that is sure to teach nationalism.

I never felt I had any stance pushed on me. I asked one education professor how he viewed politics in the classroom, and his response was, "I am practically a socialist, but I firmly believe that my politics shouldn't influence the classroom at all." Meanwhile, my right-leaning professors wore "Liberal Tears" shirts, drank from mugs with the same slogan, and showed us John Stossel and Tucker Carlson in class.

I haven't felt indoctrinated by left-leaning people. I was fairly independent until recently, but it always felt like left-leaning people focused on critical thinking and research skills, while conservatives focused on making our answers match theirs. They routinely shut down dissenting opinions, while my liberal professors welcomed them.

Sorry if I'm jaded. I'm just really anxious about the future, and I'm scared to go into teaching.