We no longer know what "college" or "higher ed" means. I used to think of "college" as referring to the liberal arts, but I am way out of date. Wasn't a liberal arts education always an elite thing anyway, except for the very curious and self-educated?
A lot of higher ed these days is work- or career-related.
College Is Not For All