The concept for the video was inspired by the architectural movement Archigram, an idea-driven approach whose designs, though never built, emphasized technology as a common denominator: a base that is flexible and versatile in its use.
Our artwork tries to evoke this idea of technology as a core and an enabler, and the ways it interacts with and enhances other disciplines such as art.
GAZE, 2025
Animation in Houdini & Redshift
I was commissioned by GAZE Spacial to design a poster for their upcoming listening event at Empire Bio. The brief called for a design that functioned both as an animation and a still image while conveying a sense of holiness and mystery, which led me to think of an abstract sculpture that embodies both.
Fluid Flows, 2024
Animation in Houdini & Redshift
This research aims to celebrate the unpredictability of fluid dynamics and explore how its instability can be represented visually. It also shows how much we can learn about nature, people and water by studying them digitally.
Through the use of 3D renders and simulations, I've visualised the unpredictability and complexity of fluid dynamics.
Vellum Dreams, 2024
Stills made in Houdini & Redshift
An exploration in creating calm, abstract renders using Vellum simulations.
There is so much room for playfulness when it comes to lighting the scene and finding the right textures. Small adjustments to the simulation would create a completely new look.
These two renders showcase exactly that.
Sunglasses, 2023
Project made in Cinema 4D & Redshift for Fine Chaos.
Fine Chaos created two new pairs of sunglasses as part of their new collection and asked me to visualize how they could look in a 3D space.
I modelled both pairs from the ground up and learned a lot about everything from modelling to UV mapping.
This process not only enriched my modelling skills but also deepened my understanding of how to conceptually brand accessories like sunglasses.
Flowers, 2023
Project made in TouchDesigner using Stable Diffusion
I installed Stable Diffusion locally on my MacBook and integrated it with TouchDesigner via an API.
This integration allows me to dictate which images enter the Stable Diffusion model, leading to unique and intriguing outcomes.
The particular Stable Diffusion technique I used is img2img, which brings a flickering style to the video.
Additionally, I configured this piece to respond to music by utilizing CHOP nodes in TouchDesigner, enabling the flowers to move in sync with the rhythm.
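As a rough illustration of that round trip, the Python sketch below shows how a TouchDesigner script might hand a rendered frame to a locally running img2img endpoint. It assumes an AUTOMATIC1111-style local API; the endpoint, payload fields and the send_frame helper are illustrative assumptions, not the exact setup used in the piece.

# Minimal sketch: send one TouchDesigner frame to a local img2img API.
# Assumes an AUTOMATIC1111-style server on 127.0.0.1:7860 (hypothetical setup).
import base64
import requests

API_URL = "http://127.0.0.1:7860/sdapi/v1/img2img"  # assumed local endpoint

def send_frame(frame_path, prompt, strength=0.45):
    """Encode a saved frame and request an img2img generation."""
    with open(frame_path, "rb") as f:
        init_image = base64.b64encode(f.read()).decode("utf-8")

    payload = {
        "init_images": [init_image],     # the frame that steers the output
        "prompt": prompt,                # e.g. "blooming flowers, soft light"
        "denoising_strength": strength,  # lower values preserve the input's motion
        "steps": 20,
    }
    response = requests.post(API_URL, json=payload, timeout=120)
    response.raise_for_status()
    # The server returns generated images as base64 strings; decode the first one.
    return base64.b64decode(response.json()["images"][0])

Inside TouchDesigner, a Script DAT could save the current render with op('render_top').save('frame.jpg'), call send_frame, and write the returned bytes back to disk for a Movie File In TOP, while an audio-analysis CHOP drives parameters such as the denoising strength to keep the result in sync with the music.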
Alienation, 2024
Render made in Cinema 4D & Redshift
In this project I wanted to dig deeper into world building: how to make suitable surroundings for an abstract 3D object.
I modelled this shape using different modifiers and sculpt brushes, and afterwards spent a lot of time lighting the scene to give it a dystopian look.