A lateral concern to this:
I do not think that you can make general AI without incorporating some model of emotional states or moods.
When I bring this up, people tend to think of human emotions and complex moods, perhaps in the context of a chatbot. But I'm talking about something much more basic. If you want to model the intelligence of an ant, so that it can solve a maze or whatever, you need some model of moods to really capture what an ant is doing, IMO.
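As a toy illustration of what I mean (my own sketch, not any standard model): imagine a maze agent with a single "frustration" scalar. Frustration rises when the agent hits dead ends and decays otherwise, and it modulates how readily the agent abandons its preferred direction for random exploration. All names and numbers here are invented for illustration.

```python
import random

class AntAgent:
    """Toy maze agent whose 'mood' (a single frustration scalar)
    modulates its exploration rate. Purely illustrative."""

    def __init__(self, seed=0):
        self.frustration = 0.0       # mood state, clamped to [0, 1]
        self.rng = random.Random(seed)

    def update_mood(self, hit_dead_end):
        # Frustration rises on dead ends and decays otherwise.
        if hit_dead_end:
            self.frustration = min(1.0, self.frustration + 0.3)
        else:
            self.frustration = max(0.0, self.frustration - 0.1)

    def choose(self, preferred, alternatives):
        # The more frustrated the ant, the more likely it abandons
        # its preferred direction and tries something random.
        if self.rng.random() < self.frustration:
            return self.rng.choice(alternatives)
        return preferred

ant = AntAgent(seed=1)
print(ant.choose("north", ["east", "west"]))  # calm ant sticks to its plan
for _ in range(5):
    ant.update_mood(hit_dead_end=True)
print(ant.choose("north", ["east", "west"]))  # frustrated ant explores
```

The point isn't the specific mechanism, just that even a one-number "mood" changes behavior in a way a stateless policy can't.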