A month from now the yearly European science fiction conference called ‘Eurocon’ will open its doors in Rotterdam, the Netherlands. Each yearly version has its own name, and this one is called ‘Erasmuscon’. Seeing that Rotterdam isn’t far from where I live, I contacted the organisers about a year ago, hoping they might be interested in adding a bit of speculative biology to the conference. After all, speculative biology went down very well at a similar science fiction convention in London (see here and here).
The result is that I am now scheduled to give a 45-minute talk on Furaha. For those interested, the programme can be found on the Erasmuscon site; I will speak right after the opening ceremony on 16 August, 2024.
The organisers asked participants to come up with a video to use in social media. I wish they had asked that a year ago, so I could have had more time, but I managed to cobble something together. I will present it here, and there is a larger version on YouTube as well.
I decided to do a micro-documentary. The organisers had asked for a short clip lasting 30 to 60 seconds, so I aimed for one minute. The end result is a minute and a half though. I decided that the paintings and a few animations should take centre stage, but just showing an unmoving painting seemed boring, so I decided to play with making paintings move.
To do that, I cut two paintings into layers representing distance from the viewer. The farthest layer showed the sky and an empty landscape. The next layer represented a part of the landscape closer by, with all the really distant parts removed, leaving transparent emptiness. The next layers contained progressively closer objects. I wrote a quick Matlab program to stack the layers on top of one another again, shifting each layer further to the right the closer it is to the viewer. Repeat that to make as many frames as you want, take a snapshot each time, assemble the snapshots into a video, and you get the impression of a camera moving through a 3D scene. Anyway, that was the idea, and I will leave it to you to judge whether it worked. For other scenes, the ‘camera’ simply moved over the painting, zooming in or out as desired.
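The original was a quick Matlab program; as a rough illustration of the same idea, here is a minimal Python sketch (all function names are my own invention). Each layer is an RGBA array, the farthest layer first; closer layers are shifted further sideways each frame and alpha-composited back to front.

```python
import numpy as np

def shift_layer(layer, dx):
    """Shift an RGBA layer dx pixels to the right, filling the gap with transparency."""
    out = np.zeros_like(layer)
    if dx == 0:
        return layer.copy()
    if dx > 0:
        out[:, dx:] = layer[:, :-dx]
    else:
        out[:, :dx] = layer[:, -dx:]
    return out

def composite_frame(layers, t, speed_per_layer):
    """Alpha-composite layers back to front for frame t.

    layers[0] is the farthest layer (shift 0); each closer layer
    shifts speed_per_layer pixels more per frame, giving parallax.
    Layers are float RGBA arrays with values in [0, 1].
    """
    h, w, _ = layers[0].shape
    frame = np.zeros((h, w, 3))
    for i, layer in enumerate(layers):
        shifted = shift_layer(layer, int(round(t * speed_per_layer * i)))
        alpha = shifted[..., 3:4]
        frame = shifted[..., :3] * alpha + frame * (1.0 - alpha)
    return frame
```

Generating one such frame per time step and writing the frames to a video file (e.g. with an image/video library of your choice) gives the moving-camera effect described above.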
I wanted to add a voice-over. I had once experimented with a microphone, which was not a success at all: I had to repeat each phrase many times because I made errors, and the few times I didn’t, an airplane flew over or some other noise intruded. This time I looked at voice generation and decided to use that: you type in a sentence and hear an AI voice speaking those words. It’s amazing that this works, given the general lack of correspondence between the sounds of English words and the way they are spelled. Apparently the AI behaves like humans in this respect: it ‘knows’ what English words sound like irrespective of their spelling. A funny thing is that the AI did not manage to say the word ‘Furaha’ the way a human would pronounce it; writing it as ‘Fooh raha’ worked a lot better.
So here is the video; I do not know whether the Erasmuscon people will actually use it in their communications, as I just handed it in. Perhaps I will see you in Rotterdam next month!
Please visit the accompanying website: Life on Nu Phoenicis IV, the planet Furaha. This blog is about speculative biology. Recurrent themes are biomechanics, the works of other world builders, and, of course, the planet Furaha.
Superb work on the video, and that’s great news about attending - congrats!
Wishing you all the best of fortune at it.
Keenir: thank you!