Harpsicorpse

The harpsicorpse is a musical instrument I built that tracks bodies and colors in a video and interprets them into sound. The original idea was to make an instrument that responded to bodies on screen, so that I could feed in a scene and generate sounds from the movements, gestures, and expressions of the people within it.
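To make the idea concrete, here is a minimal sketch of that kind of mapping. The harpsicorpse's actual pipeline (video decoding, body and color tracking) lives in its repository; the function below, including its name and parameters, is purely illustrative and assumes a tracked body's position has already been normalized to the 0..1 range.

```python
def body_to_sound(x, y, low_midi=36, high_midi=84):
    """Map a normalized body position (x, y in 0..1) to a MIDI pitch and volume.

    This is a hypothetical mapping, not the instrument's real interface:
    horizontal position selects pitch, vertical position selects volume.
    """
    # Horizontal position sweeps across the chosen MIDI pitch range.
    pitch = round(low_midi + x * (high_midi - low_midi))
    # Vertical position (top of frame = loud) selects a MIDI volume 0..127.
    volume = round((1.0 - y) * 127)
    return pitch, volume

# A body in the center of the frame lands mid-range:
print(body_to_sound(0.5, 0.5))  # → (60, 64)
```

Because every frame's tracked positions feed directly into a function like this, the sound always follows the image, which is what makes the mapping one-to-one.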

Clearly, films already use sound as another dimension alongside the image to create moods and ideas in the viewer. However, a composer or sound designer has the ability to decide when to complement or juxtapose the image with the sound.

The harpsicorpse, on the other hand, always maps one-to-one with the image. Thus, I would not necessarily reach for it as some sort of "soundtrack generator."

Instead, the harpsicorpse is an instrument you play by sending it videos. It is not the first instrument to interpret human bodies into sound, but it is my own attempt at, and interpretation of, such an instrument.

Below are a few tests I ran with the instrument, along with the GitHub repo where you can find the code to run the harpsicorpse yourself.


Harpsicorpse Webcam Test (April 5th, 2024)


Harpsicorpse Camcorder Test (April 5th, 2024)


Harpsicorpse Malina Test (April 5th, 2024)


Here is the link to the GitHub repository for the harpsicorpse: Harpsicorpse Repo