First, machine learning systems, unlike bodies, have nothing at stake. Their creators do, but the machine itself doesn't give a hoot if it gets something wrong or causes harm, any more than your knife cares whether you sharpen it. A model, of course, can be constrained by what it's trained on and then by guardrails that apply rules, often over-vigorously, to the words it's allowed to utter. But the model follows those rules without caring about them in the least. Why does caring matter for AI? Because caring is not a simple sensation or reasoning process. It's a whole-body and whole-mind phenomenon. Getting it right, especially when it comes to specifics, seems like one of the most complex and wide-ranging functions of the embodied, situated individual that you are.
Second, AI agents will have lots of specific data about us, especially what we absorb or create with our computing devices and the data trails we leave behind, but our embodiment goes beyond what we type. What we write is a portrait of ourselves that we create. Bodies tell a different story, often so softly that it stays tacit.