The whole thing was an experiment in which I kept introducing new ideas as it progressed.
At first, only a dark, closed room was to be visible. In Stable Diffusion A1111, though, a rectangular room can only be generated with a ControlNet template. By varying the parameters you can still influence how much of its own character the AI adds, and a lot is left to chance. In one of the many generated rooms, for example, a gap appeared in the floor through which light shone in from outside. That gave me the idea of having a person walk along the outer wall, their shadow falling into the room. The whole scene was meant to radiate something eerie.

Animating this was a somewhat laborious process: for the floor patch I took individual frames from a screen recording, painted them in Photoshop, and then converted them back into a video.
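The frame round-trip (video to single frames, retouch in Photoshop, frames back to video) is the kind of step ffmpeg handles well. The sketch below only builds the two ffmpeg command lines; the filenames, frame rate, and codec are my assumptions, not details from the original workflow.

```python
# Hypothetical sketch of the frame round-trip; all names and
# parameters (screen_recording.mp4, 25 fps, libx264) are illustrative.

def extract_cmd(video, pattern="frames/frame_%04d.png"):
    # ffmpeg call that splits the recording into numbered PNG frames
    return ["ffmpeg", "-i", video, pattern]

def assemble_cmd(pattern="frames/frame_%04d.png",
                 out="floor_patch.mp4", fps=25):
    # ffmpeg call that reassembles the retouched frames into a video
    return ["ffmpeg", "-framerate", str(fps), "-i", pattern,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out]

print(" ".join(extract_cmd("screen_recording.mp4")))
print(" ".join(assemble_cmd()))
```

The commands could be run with `subprocess.run(...)` once ffmpeg is installed; building them as lists first keeps the sketch testable without it.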
I was surprised that the pinned video could replace the floor patch.
Inpainting the eyes succeeded only after I switched from A1111 to Fooocus.
The fact that the sound of the footsteps is almost in sync with the shadow makes the intended effect a little more believable; with more fine-tuning it could be matched even more precisely. Who is walking remains my secret.
