IKHYF (2024)
Using the ChatGPT API inside TouchDesigner, IKHYF (I Know How You Feel) listens to an audience member's voice input, analyzes its emotional undertone, and outputs an animated design.
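
The pipeline can be sketched roughly as below. This is a minimal illustration, not the project's actual code: the model name, prompt, emotion labels, and color mapping are assumptions, and the transcription step is taken as already handled upstream in TouchDesigner.

# Sketch: classify the emotional undertone of a transcribed voice snippet
# with the ChatGPT API and map it to an RGB color that a TouchDesigner
# network (e.g. a Script DAT driving a Constant TOP) could animate.
# Assumptions: the openai Python package is available and OPENAI_API_KEY
# is set in the environment. All names below are illustrative.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EMOTION_COLORS = {
    # Illustrative palette; the project's actual mapping may differ.
    "joy": (1.0, 0.8, 0.2),
    "sadness": (0.2, 0.4, 0.8),
    "anger": (0.9, 0.2, 0.2),
    "calm": (0.4, 0.8, 0.6),
    "neutral": (0.7, 0.7, 0.7),
}

def classify_emotion(transcript: str) -> str:
    """Ask the chat model for a single emotion label."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system",
             "content": "Classify the speaker's emotional undertone as one of: "
                        + ", ".join(EMOTION_COLORS)
                        + ". Reply with the label only."},
            {"role": "user", "content": transcript},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in EMOTION_COLORS else "neutral"

def emotion_to_color(transcript: str) -> tuple[float, float, float]:
    """Return an RGB triple for the downstream animated design."""
    return EMOTION_COLORS[classify_emotion(transcript)]

if __name__ == "__main__":
    print(emotion_to_color("I can't believe it finally worked, this is amazing!"))

In the installation itself, the returned color and label would feed TouchDesigner operators that generate the animation, rather than being printed.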

What are we not noticing when we speak? To what extent does AI understand human emotions? And when we interact with the machine, what psychological process do we go through? The project aims not only to help designers see the color in their voice, but also to evoke reflection on the continuity between human and machine.