There is an art and a science behind DEI. There are entire schools of thought on how to do it effectively. So why shouldn't it be taught in universities?
And if universities are going to teach it, they should practice it. So why shouldn't there be full-time university positions supporting DEI?
The company or university that is more inclusive makes better use of its non-male, non-white staff, which, last I checked, is more than 75% of everyone. Why not take that improvement?