Embodied Companionship

Embodied Companionship is a wearable research project that aims to reframe existing relationships to machine learning and technological artifacts.

Approaching machine learning as an intra-active process, one that moves beyond quantification and the instrumentalization of embodied experiences and expression, the project proposes new frameworks in which to explore machine learning and human-data interaction.

Using a wearable artifact, in this case a sleeve made of hand-woven fabric that incorporates conductive and high-performance yarns, connected to custom electronics and an Arduino Nano 33 BLE running a machine learning model, Embodied Companionship experiments with gesture recognition and the embodied relationship between machine and human. Going beyond “gestural interfaces,” the project looks at the way human gestures themselves form an ambivalent vocabulary, one that evades machinic intelligences, and asks how this ambivalence can become part of algorithmic systems.
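As an illustration of how such a pipeline might look on the Nano 33 BLE, the sketch below follows the common TinyML pattern of reading windows of accelerometer and gyroscope data from the board's onboard LSM9DS1 IMU and passing them to a TensorFlow Lite Micro classifier. This is a minimal, hypothetical sketch rather than the project's actual firmware: the model header (gesture_model.h / g_gesture_model), the window length, and the number of gesture classes are placeholders.

```cpp
// Hypothetical sketch: capture one window of IMU samples on an Arduino Nano 33 BLE
// and classify it with a TensorFlow Lite Micro model exported as a C array.
#include <Arduino_LSM9DS1.h>
#include <TensorFlowLite.h>
#include <tensorflow/lite/micro/all_ops_resolver.h>
#include <tensorflow/lite/micro/micro_error_reporter.h>
#include <tensorflow/lite/micro/micro_interpreter.h>
#include <tensorflow/lite/schema/schema_generated.h>
#include "gesture_model.h"                 // placeholder: trained model as g_gesture_model

constexpr int kSamplesPerGesture = 119;    // ~1 s of motion at 119 Hz (assumed window)
constexpr int kInputsPerSample   = 6;      // ax, ay, az, gx, gy, gz
constexpr int kNumGestures       = 3;      // placeholder label count

constexpr int kTensorArenaSize = 8 * 1024;
alignas(16) static uint8_t tensor_arena[kTensorArenaSize];

tflite::MicroErrorReporter error_reporter;
const tflite::Model* model = nullptr;
tflite::MicroInterpreter* interpreter = nullptr;
TfLiteTensor* input = nullptr;

void setup() {
  Serial.begin(9600);
  if (!IMU.begin()) {
    Serial.println("Failed to initialise IMU");
    while (true) {}
  }
  model = tflite::GetModel(g_gesture_model);
  static tflite::AllOpsResolver resolver;
  static tflite::MicroInterpreter static_interpreter(
      model, resolver, tensor_arena, kTensorArenaSize, &error_reporter);
  interpreter = &static_interpreter;
  interpreter->AllocateTensors();
  input = interpreter->input(0);
}

void loop() {
  // Fill one window of IMU samples, normalised to 0..1 as in the common Arduino example.
  int samples = 0;
  while (samples < kSamplesPerGesture) {
    if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
      float ax, ay, az, gx, gy, gz;
      IMU.readAcceleration(ax, ay, az);
      IMU.readGyroscope(gx, gy, gz);
      float* row = input->data.f + samples * kInputsPerSample;
      row[0] = (ax + 4.0f) / 8.0f;          // accelerometer range ±4 g
      row[1] = (ay + 4.0f) / 8.0f;
      row[2] = (az + 4.0f) / 8.0f;
      row[3] = (gx + 2000.0f) / 4000.0f;    // gyroscope range ±2000 dps
      row[4] = (gy + 2000.0f) / 4000.0f;
      row[5] = (gz + 2000.0f) / 4000.0f;
      samples++;
    }
  }

  if (interpreter->Invoke() != kTfLiteOk) return;

  // Report the most likely gesture; downstream code maps it to a heat response.
  TfLiteTensor* output = interpreter->output(0);
  int best = 0;
  for (int i = 1; i < kNumGestures; i++) {
    if (output->data.f[i] > output->data.f[best]) best = i;
  }
  Serial.print("gesture: ");
  Serial.println(best);
}
```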

The wearable, named The Little Creature, is itself an ambiguous and peculiar object, already presaging its indeterminacy, attached to the body or waiting to be so. It is woven on a handloom with a combination of conventional, reflective, and conductive yarns, and the pliability of the metal in the conductive yarns gives it a sculptural and tentacular form. A microcontroller, also connected to the overall construction with conductive yarns, runs a machine learning model trained to recognize a number of gestures; in response, different sections of the conductive yarns in the woven fabric produce a cadence of heat, ranging from a murmur to a burning, almost aggressive sensation on the skin, where the mathematics of the weave relate to the skin as pixels.
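The response side could be sketched roughly as follows, assuming (hypothetically) that each woven heating section is switched through a MOSFET on a PWM pin, so that duty cycle and rhythm place the warmth somewhere between a murmur and something close to burning. The pin numbers, cadences, and gesture mapping are illustrative, not the project's actual circuit.

```cpp
// Hypothetical sketch of the heat response: three woven heating sections,
// each driven through a MOSFET on a PWM pin, pulsed at different intensities
// and rhythms depending on the recognised gesture.
const int kHeatPins[3] = {3, 5, 6};   // placeholder PWM pins, one per woven section

// Pulse one section: duty cycle sets intensity, on/off times set the rhythm.
void pulseSection(int pin, int duty, int onMs, int offMs, int repeats) {
  for (int i = 0; i < repeats; i++) {
    analogWrite(pin, duty);
    delay(onMs);
    analogWrite(pin, 0);
    delay(offMs);
  }
}

void setup() {
  for (int i = 0; i < 3; i++) pinMode(kHeatPins[i], OUTPUT);
}

// Called with the index of the recognised gesture (see the inference sketch above).
void respondToGesture(int gesture) {
  switch (gesture) {
    case 0:  // a murmur: low duty cycle, slow rhythm, one section
      pulseSection(kHeatPins[0], 60, 400, 1200, 3);
      break;
    case 1:  // insistent: medium duty cycle, quicker rhythm, two sections
      pulseSection(kHeatPins[0], 140, 300, 300, 4);
      pulseSection(kHeatPins[1], 140, 300, 300, 4);
      break;
    default: // near-burning: high duty cycle, sustained, all sections
      for (int i = 0; i < 3; i++) analogWrite(kHeatPins[i], 230);
      delay(2000);
      for (int i = 0; i < 3; i++) analogWrite(kHeatPins[i], 0);
      break;
  }
}

void loop() {
  // In the full system respondToGesture() would be driven by the classifier;
  // here the cadences are simply cycled for illustration.
  for (int g = 0; g < 3; g++) {
    respondToGesture(g);
    delay(2000);
  }
}
```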

Embodied Companionship attempts to create a discursive relationship between machine learning and humans, one centered on nuance, curiosity, and second-order feedback loops. By using machine learning not only to train on and “learn” human behaviours but also to build a symbiotic relationship with a technological artifact, one that rests on a mutual progression of understanding, the project aims to embody and make legible machine learning processes and to shed some light on the algorithmic black box.

Below is a link to a blog post I wrote that discusses the challenges, considerations, and implications of applying machine learning to our body gestures.

https://blog.umbrellium.co.uk/post/627420566849257472/learning-nothing

Embodied Companionship is a collaborative research project between researcher and designer Despina Papadopoulos and myself, funded by Human Data Interaction.