As part of the workshops, we evaluate our new findings and knowledge to consider how best to approach Virtual Production technology in an educational setting.
For this article, José Raimundo of IPCA (Polytechnic Institute of Cávado and Ave), workshop participant and member of the VPSN project, has reflected upon “Workshop 02: Train the Trainer”, held in the spring of 2023.
IPCA XR Stage Workshop Report
Next, we provide overall considerations regarding VPSN experiential Workshop 02, devoted to shooting a short film at the XR stage at BUas (Breda University of Applied Sciences). We highlight the efforts of BUas staff in preparing knowledge-base documents, as well as the support provided to participants’ hands-on experience throughout the workshop. Here, we attempt to contribute an overview of our perception of this experience, along with constructive criticism on how to improve the documents and workshops, and on how all this knowledge can be integrated into academic courses.
Pre-Workshop activities
Activities in this stage were geared towards developing a PreVis from each participating partner institution (IPCA and NORD) by 21st April 2023. Institutions also had to schedule all aspects of their PreVis production, such as actors, equipment, props, and camera and light positions, while making sure that these documents could be readily understood by team members during shooting. Figure 1 shows IPCA’s take.
Fig. 1. Left to right: IPCA’s Moodboard, Storyboard and UNREAL scene.
We provide this example as a means to reflect on mood, style and scripting choices, even though this approach would likely be overwhelmingly difficult to produce within a week’s time span. The overall idea was to push the boundaries of VP (Virtual Production) in general: to find out whether it is suitable for stylized shooting (e.g., film noir) rather than the more common realism (and the uncanny valley this may entail), and also to help pinpoint implications and limitations in terms of characterization, lighting, transitions, and so on. We learned from our colleagues that shooting in black and white is a lot less “forgiving” of the lighting problems that arise in VP (which need to be well controlled, as light tends to bounce all over from the LED screen). We would say that this is a rather interesting problem: it departs from conventional film and could prove to be a valuable niche to research under the VP umbrella, both artistically and technologically.
That said, we are aware that our level of UNREAL knowledge did not allow us to develop a truly convincing PreVis, mainly because the camera and light motion were not smooth enough, nor did they provide seamless transitions. The PreVis would, of course, also have benefited from better voice-overs than those we improvised ourselves. Further, we should have resorted to motion capture so that the implemented Metahumans could act in a more natural way (we did not record movement through puppeteering, as this would have delayed the delivery of the PreVis). With this in mind, we suggest complementing the extensive list of UNREAL tutorials with others covering Metahuman animation, to improve PreVis quality.
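To illustrate the kind of camera smoothing we were missing, the minimal sketch below densifies a handful of hand-placed position keyframes with a Catmull-Rom spline, so the virtual camera eases through each key instead of changing direction abruptly. This is plain Python for illustration only, not the UNREAL Sequencer API, and the keyframe values are made up.

```python
# Minimal sketch: smooth sparse camera keyframes with a Catmull-Rom spline.
# Plain Python for illustration only -- not the UNREAL Sequencer API.
# The keyframe positions below are hypothetical.

def catmull_rom(p0, p1, p2, p3, t):
    """Interpolate between p1 and p2 (t in [0, 1]) using neighbours p0 and p3."""
    return tuple(
        0.5 * (2 * b + (c - a) * t
               + (2 * a - 5 * b + 4 * c - d) * t ** 2
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def smooth_path(keys, samples_per_segment=24):
    """Densify a sparse keyframe list into a smooth camera path."""
    padded = [keys[0]] + keys + [keys[-1]]   # clamp the end points
    path = []
    for i in range(1, len(padded) - 2):
        for s in range(samples_per_segment):
            path.append(catmull_rom(*padded[i - 1:i + 3], s / samples_per_segment))
    path.append(keys[-1])
    return path

# Hypothetical dolly move: three hand-placed keyframes, densified to 24 samples per segment.
keys = [(0.0, 0.0, 160.0), (120.0, 40.0, 160.0), (300.0, 40.0, 180.0)]
for frame, pos in enumerate(smooth_path(keys)):
    print(frame, pos)
```

The densified path would still need to be brought back into the engine’s sequencer, which is the engine-specific step we leave out here.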
Time had to be focused on learning and on developing materials for the workshops, so as to obtain results within the project deadlines; however, we believe that this could have been a wonderful opportunity to explore film and effects beyond today’s trends, norms and conventions. As a matter of fact, this broader exploration would allow VPSN (the Virtual Production Studio Networks) to innovate further and to break new ground beyond what VP can provide today. For example, the nature of the script and its enclosed environment were rather suitable for exploration as a photo-romance in the guise of La Jetée (Chris Marker, 1962), yet with an emphasis on camera movement.
Nevertheless, once all proposals were submitted, BUas selected NORD’s version, as it was more refined in terms of camera movement, the challenges posed to VP, and feasibility under the supervision of directors and producer – criteria we had to concur with, given NORD’s experience. The production bible was then updated by NORD, and roles were assigned according to the experience and background of the IPCA/NORD team members.
Troubleshooting throughout the workshop
Overall, Workshop 02 went well, considering that we shot everything we predicted would be necessary. Nevertheless, we ran into several issues that should be taken into consideration to prevent them, as much as possible, from occurring again. The main issue identified is the limited LED panel area one can work with at the XR stage at BUas. This size does not lend itself to more dynamic, wider pans and shots and, depending on the real props and ground setup, may not allow shooting the actors’ full figures. Full-figure shooting is by and large unnecessary in most cases; however, when there is no alternative, the constraint can restrict filming to quasi-static perspectives. We would therefore like to learn how to mitigate these limitations: how can we make a long over-the-shoulder shot following a protagonist down into a cave in such a setting? Better still, how can we shoot a pursuit, or an escape? Such limitations are visible in Figure 2.
Fig. 2. Behind-the-scenes photo of Workshop 02, Breda University of Applied Sciences.
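To make the size constraint concrete, a back-of-the-envelope frustum check is useful: at distance d, the horizontal span the camera frames at the wall is 2·d·tan(FOV/2), so the wall width caps how far back (and how far off-axis) the camera can go before the frame spills past the LED edges. The sketch below illustrates this with hypothetical numbers; they are not the actual BUas stage dimensions.

```python
import math

# Back-of-the-envelope frustum check: how close must the camera stay so the
# frame does not spill past the LED wall? All numbers are hypothetical --
# they are not the actual BUas stage dimensions.

def max_camera_distance(wall_width_m, horizontal_fov_deg):
    """Farthest camera-to-wall distance at which the frame still fits the wall."""
    half_fov = math.radians(horizontal_fov_deg) / 2
    return wall_width_m / (2 * math.tan(half_fov))

def max_pan_deg(wall_width_m, horizontal_fov_deg, distance_m):
    """Largest pan (either side of centre) that keeps the frame on the wall."""
    half_fov = math.radians(horizontal_fov_deg) / 2
    wall_half_angle = math.atan(wall_width_m / (2 * distance_m))
    return max(0.0, math.degrees(wall_half_angle - half_fov))

wall = 6.0   # hypothetical wall width (m)
fov = 40.0   # hypothetical horizontal field of view (deg)
print(f"max distance: {max_camera_distance(wall, fov):.1f} m")
print(f"pan headroom at 4 m: ±{max_pan_deg(wall, fov, 4.0):.1f}°")
```

With these illustrative numbers, the camera must stay within roughly 8.2 m of the wall and has only about ±17° of pan headroom at 4 m, which makes it easy to see why wider, more dynamic moves quickly collide with the panel edges.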
As depicted in Figure 2, most of the shots were taken with a dolly (or a static camera) in the same position with respect to the stage. This was done because moving cameras and lighting, while making sure that they conformed to the expected appearance between shots, would have introduced delays. In the end, the solution presented itself: the team simply moved both the physical set and the UNREAL scenario to match the intended scenes. This solution illustrates an economy of both means and time, while keeping the lighting constant – one of the most difficult aspects to control in VP.
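The same trick can be expressed compactly: instead of transforming the camera by some move T, transform the scenario by the inverse of T and the camera-relative geometry (and hence the rendered parallax) is identical, while the physical camera-to-LED-to-light relationship stays fixed. A minimal 2D sketch of that equivalence, with made-up coordinates:

```python
import math

# Minimal 2D sketch of "move the world, not the camera": applying the inverse
# of the intended camera transform to the scene yields the same
# camera-relative geometry, so lighting on the stage never has to change.
# Coordinates and angles are made up for illustration.

def rotate(p, deg):
    r = math.radians(deg)
    x, y = p
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r))

def to_camera_space(point, cam_pos, cam_yaw_deg):
    """World point expressed in the camera's frame."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    return rotate((dx, dy), -cam_yaw_deg)

prop = (5.0, 2.0)                     # a virtual prop in world space

# Option A: the camera actually moves (translate + yaw).
a = to_camera_space(prop, cam_pos=(1.0, 0.5), cam_yaw_deg=15.0)

# Option B: the camera stays put; the scene gets the inverse move.
moved_prop = rotate((prop[0] - 1.0, prop[1] - 0.5), -15.0)
b = to_camera_space(moved_prop, cam_pos=(0.0, 0.0), cam_yaw_deg=0.0)

print(a)  # both print the same camera-relative position
print(b)
```

The underlying point is that the render only needs camera-relative geometry, not absolute positions, which is why relocating the scenario produced identical shots under constant lighting.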
The filming procedure was mostly traditional; the team took many improvisational steps to keep up with the schedule while complying with the pre-workshop script, storyboard and time sheets, and while overcoming issues such as UNREAL latency, real-time camera motion capture, and lens focus synchronization. In other words, VP technology was explored only marginally, by the two team members in charge of synchronizing the UNREAL scenario with the actual film shooting. Other team members, with less knowledge of the technological aspects, could not afford to switch and learn those roles, as this could have led to further delays. We believe that participants should try their hand at the brain bar, as otherwise they will not, in turn, be able to help students. This is crucial: it is about hands-on experience, and it cannot be learned simply by testing software detached from XR stage setups and equipment.
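One of the brain-bar tasks hinted at above, compensating stream latency, can be illustrated simply: if one stream (say, the tracking metadata) arrives a known number of frames ahead of the video it describes, buffering the faster stream by that offset re-aligns the two. The toy sketch below assumes a fixed, pre-measured delay; real VP pipelines align streams via timecode and genlock, and all numbers here are hypothetical.

```python
from collections import deque

# Toy sketch of fixed-delay compensation for a camera-tracking stream.
# Real VP pipelines align streams via timecode/genlock; here we just assume a
# pre-measured, constant delay of N frames. All samples are hypothetical.

class DelayCompensator:
    def __init__(self, delay_frames):
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, tracking_sample):
        """Feed the latest tracking sample; get the one matching the current video frame."""
        self.buffer.append(tracking_sample)
        return self.buffer[0]   # oldest buffered sample = the aligned one

comp = DelayCompensator(delay_frames=3)
for frame in range(8):
    sample = {"frame": frame, "pan_deg": frame * 1.5}   # fake tracker data
    aligned = comp.push(sample)
    print(f"video frame {frame} renders with tracker frame {aligned['frame']}")
```

In practice the offset would be measured per stage setup rather than hard-coded, which is exactly the kind of hands-on calibration work that cannot be learned away from the stage.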
On a more positive note, the whole process points to the need for a well-oiled team, one whose members are able to switch rapidly between traditional film roles and maintain a high level of awareness of the stakes and of what needs to be done at any given time on set. It is this synergy and working environment that we need to secure, not only in studio productions but also in learning settings, if we want VP to advance. Students must have a basic, holistic knowledge of as many roles as possible but, most importantly, the soft skills to handle pressure and obstacles, and to communicate well with each other.
Still, we cannot refrain from asking ourselves: why, in these circumstances, were we using the LED screen at all? This may seem a counterproductive question, but the fact is that the screen did not provide anything special: mostly it was used as a relatively static image reacting to small camera movements, and it was blurred most of the time (perhaps due to the moiré effect, or because it contributed to an uncanny-valley sense of disbelief). It seemed that we were simply driving a nail with a sledgehammer. One can argue that the screen provided a fictitious setting invaded by tropical plants that would not otherwise have been possible, but it still did not seem to add much novelty to the film or to the VP process. We can consider the LED screen used in the workshop as a demonstration that things can work with this technology in this way, but we might have been able to push the outcome a little further.
Finally, we cannot overemphasize the need for UNREAL scene optimization, i.e., limiting the size of the virtual set, actors and props, as this helps to overcome real-time interaction performance issues. In fact, by optimizing in UNREAL, we may even find novel ways to present our worlds and stories. This is also a concern because many students will not have personal computers capable of handling the UNREAL Engine software, let alone overly complex UNREAL scenes, so optimization may be the only option for those who still want to develop themselves in this area.
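As a starting point for the optimization we describe, even a simple per-asset budget check helps students see where a scene blows past real-time limits. The sketch below runs on a hypothetical asset inventory (names, triangle counts and texture memory are invented, as are the thresholds; they are not UNREAL's actual limits), such as one exported from the engine's asset statistics.

```python
# Minimal sketch of a scene budget check. The inventory would come from the
# engine's asset statistics (e.g. an exported CSV); the entries and the
# thresholds below are hypothetical, illustrative numbers only.

TRI_BUDGET_PER_ASSET = 150_000    # illustrative ceiling, not an engine limit
TEX_BUDGET_MB_PER_ASSET = 64      # illustrative ceiling, not an engine limit

inventory = [
    {"name": "SM_JunglePlant_Hero", "tris": 410_000, "tex_mb": 96},
    {"name": "SM_CaveWall_Mod_A",   "tris":  80_000, "tex_mb": 32},
    {"name": "MH_Protagonist",      "tris": 190_000, "tex_mb": 128},
]

def over_budget(asset):
    """Return the list of reasons an asset exceeds the (illustrative) budgets."""
    reasons = []
    if asset["tris"] > TRI_BUDGET_PER_ASSET:
        reasons.append(f"{asset['tris']:,} tris > {TRI_BUDGET_PER_ASSET:,}")
    if asset["tex_mb"] > TEX_BUDGET_MB_PER_ASSET:
        reasons.append(f"{asset['tex_mb']} MB textures > {TEX_BUDGET_MB_PER_ASSET} MB")
    return reasons

for asset in inventory:
    reasons = over_budget(asset)
    if reasons:
        print(f"{asset['name']}: " + "; ".join(reasons))
```

In a teaching setting, thresholds like these can be tuned to whatever the lab machines actually sustain, making the budget an explicit, checkable part of the course material rather than tacit knowledge.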