We all know the Zoom face-filter get-togethers that usually end in a frantic show-off, with individuals doing improvised searches through the Snap Camera filters.
But when everyone present activates the same interactive effect, a different kind of collective experience emerges. On this page I'm listing the effects I'm creating for this kind of setting.
Mist - Feb 2021
This effect turns the session into a lively guessing game. Every minute there's a new algorithm to trigger with your movements. Can you discover what makes you (dis)appear?
What happens when all participants in a Zoom session activate the same filter-effect? Which characteristics of an effect add value? And when is it worthwhile to use an effect collectively with a group, rather than experiencing it individually? The past few weeks I've been exploring these questions by creating multi-user interactive filter-effects and testing them with small groups of people.
This one was a success! It's an interactive filter-effect driven by an algorithm that changes every minute. Participants had to find out which movement or action was making them (dis)appear, so they tried out all sorts of movements (a great way of exercising during a long day of Zoom meetings!). Sometimes they found their way to visibility by accident: for example, when they were all hidden and started chattering about their attempts during the minute when 'speaking' was the trigger to appear. (In the next minute the rule reversed: by speaking you would become invisible. But by then the participants were better prepared, knowing that voice was one of the possible triggers the algorithm could react to.)
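The minute-by-minute rotation described above can be sketched in plain JavaScript. This is not the actual Lens Studio script: the trigger names are hypothetical and the real movement/voice detectors are left out. The key idea is that every client derives the active trigger from the wall clock alone, so all participants switch rules at the same moment without any server.

```javascript
// Sketch of the minute-based trigger rotation (hypothetical trigger names;
// the real effect's detectors for movement, voice, etc. are stubbed out).
const TRIGGERS = ["wave", "speak", "nod", "lean"];

// Each client maps the current minute to the same trigger,
// so everyone's rule changes in sync.
function activeTrigger(now = new Date()) {
  const minuteIndex = Math.floor(now.getTime() / 60000);
  return TRIGGERS[minuteIndex % TRIGGERS.length];
}

// Whether performing `action` makes you appear during this minute.
function makesYouAppear(action, now = new Date()) {
  return action === activeTrigger(now);
}
```

The reversal the participants ran into (speaking makes you appear one minute, disappear the next) falls out of the same scheme: the rule simply changes when the minute index advances.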
"Waiting for Godot" - jan 2021
Multi-user experience for Zoom. Sit on the left to act as Vladimir, sit on the right to be Estragon. Read your autocue text when it appears.
The participants in the Zoom session below are wearing a virtual outfit. A custom face-filter script causes their outfits to change color at regular intervals, in a synchronised way.
Unfortunately @TheSnapCamera facefilters cannot (yet) connect to external sources for data exchange. But synchronisation of #AR outfits based on the systemclock works well! (Except when someone is logging into #Zoom from another continent...) pic.twitter.com/BG8jpk2b12
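The clock-based synchronisation the tweet describes can be sketched as follows. This is a minimal standalone sketch, not the actual filter script: the palette and the 10-second interval are hypothetical. Each client computes the same palette index from the system clock, so outfits stay in step without any data exchange between participants.

```javascript
// Sketch of clock-driven outfit colours: every client independently maps
// the system time to the same palette slot, so no network sync is needed.
// Palette and interval are hypothetical stand-ins for the real filter's values.
const PALETTE = ["red", "green", "blue", "yellow"];
const INTERVAL_MS = 10000; // how long each colour is shown

function outfitColor(now = new Date()) {
  const slot = Math.floor(now.getTime() / INTERVAL_MS);
  return PALETTE[slot % PALETTE.length];
}
```

Note that `Date.getTime()` returns the UTC epoch, so participants in different timezones would still agree; the "other continent" glitch in the tweet suggests the filter was reading local clock time instead.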
Want to try this synchronised outfit with colleagues in your next Zoom session? Install the "Snap Camera" program, and copy/paste the link below into the 'search' field. Then set 'Snap Camera' as the video source in your video-conferencing software.