
Week 4 - Testing Virtual Production Part 1.5

Porter Justus

Updated: Feb 9, 2023

In week 2, I tested the viability of importing CamTrackAR camera movement into Unreal. In week 3, I found a YouTube video that showed a workflow for recording the movement of a cinema camera with CamTrackAR. Well, this week I tested that workflow, along with Move.ai and developing our Metahuman for wide shots.


CamTrackAR and Cinema Camera Test


Here are pictures of my camera with my phone mounted to it.



This is the footage captured in CamTrackAR, along with an accompanying FBX file containing the camera's movement.

This is the footage from the Blackmagic Pocket Cinema Camera 4K.



We used a clap sync to time everything together.
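If you'd rather not line the claps up by eye, the offset can also be found programmatically. Below is a minimal Python sketch that cross-correlates the two audio tracks; the file names are placeholders, and it assumes both clips were exported as mono WAVs at the same sample rate.

```python
# A sketch of finding the clap offset by cross-correlating the audio.
# File names are placeholders; assumes mono WAVs at matching sample rates.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate, correlation_lags

rate_a, phone = wavfile.read("camtrackar_clip.wav")  # iPhone/CamTrackAR audio
rate_b, cinema = wavfile.read("bmpcc4k_clip.wav")    # BMPCC4K audio
assert rate_a == rate_b, "resample first if the rates differ"

# Normalize so loud and quiet recordings correlate cleanly
phone = phone.astype(np.float64)
cinema = cinema.astype(np.float64)
phone /= np.max(np.abs(phone))
cinema /= np.max(np.abs(cinema))

# The lag at the correlation peak is the offset between the claps;
# a positive value means the clap happens later in the cinema clip
corr = correlate(cinema, phone, mode="full")
lags = correlation_lags(len(cinema), len(phone), mode="full")
print(f"Offset: {lags[np.argmax(corr)] / rate_a:.3f} s")
```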


I brought the camera tracking data into Unreal and used an empty actor as the mover for the animated camera created from the FBX. I then parented a second camera, which would be my render camera, to the imported camera and gave the render camera the correct offset based on the offset of the iPhone camera from the BMPCC4K sensor.
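For anyone who wants to script that setup instead of doing it by hand, here's a rough Unreal Editor Python sketch of the parenting and offset step. The actor label "CamTrackAR_Camera" and the offset numbers are placeholders of mine; measure the real iPhone-lens-to-sensor distance on your own rig.

```python
# A sketch of the parent/offset setup in Unreal Editor Python.
# "CamTrackAR_Camera" and the offset values are placeholders.
import unreal

# Find the animated camera that came in with the CamTrackAR FBX
tracked = None
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() == "CamTrackAR_Camera":
        tracked = actor
        break

# Spawn the render camera and parent it to the tracked camera
render_cam = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0, 0, 0))
render_cam.attach_to_actor(
    tracked, "",
    unreal.AttachmentRule.KEEP_RELATIVE,
    unreal.AttachmentRule.KEEP_RELATIVE,
    unreal.AttachmentRule.KEEP_RELATIVE,
    False)

# Apply the measured phone-to-sensor offset (in cm, placeholder values)
render_cam.set_actor_relative_location(unreal.Vector(2.0, 0.0, -8.0), False, False)
```

Keeping the offset as a relative location on the attached camera leaves the FBX animation untouched; the render camera just rides along.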



This is a screen capture in Unreal Engine of the imported camera from CamTrackAR.

Using the Nuke Server, we were able to trial the composite live and ultimately finish it.
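For reference, the node graph behind that preview is simple: the Unreal render comes in through an Unreal Reader node and gets merged with the keyed plate. Here's a tiny Nuke Python sketch of it; the node class name "UnrealReader" and the existing Read node named "Plate" are assumptions on my part.

```python
# A minimal Nuke Python sketch of the live-composite graph.
# Assumes NukeX with the Unreal Reader plugin; the node class name
# "UnrealReader" and the Read node named "Plate" are assumptions.
import nuke

cg = nuke.createNode("UnrealReader")  # pulls renders from Unreal via the Nuke Server
plate = nuke.toNode("Plate")          # the keyed greenscreen footage

merge = nuke.createNode("Merge2")     # comp the keyed actor over the Unreal render
merge.setInput(0, cg)                 # B input: Unreal background
merge.setInput(1, plate)              # A input: keyed foreground
```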



***Note that this screen grab was taken after multiple attempts to reconnect the Unreal Reader node; this workflow demands a lot from the system's GPU and RAM.***


This is the test result of one of our more complicated shots: a 90-degree rotation around the subject.


Move.ai Motion Capture


We tested Move.ai with two iPhones and two iPads. We first tried to host a session with an iPad, but the network connection was weak, and we determined that it was interfering with the devices connecting to the session. So, we recorded on the four devices without the app and uploaded the footage from a computer.

Here is a video of the calibration.

We learned after the fact that there were issues with the calibration and the cameras used. Ultimately, our camera lens angles didn't match, and I wasn't patient enough during the calibration, moving incorrectly from one camera to the next.


The results were not great. Definitely not as successful as some of the tests I've seen other people get.

Moving forward, and hoping to give us our best shot, I have invested in four GoPro 9s that we will use for our next test, so next week I will take a deeper dive into that. Needless to say, in my research I have seen more people get better results when using GoPros. I have not seen much on iPhones, but GoPros are cheaper. I look forward to sharing my findings next week.


Metahuman Creation


One of the great tools of Unreal Engine is Metahumans, and last year Epic introduced Mesh to Metahuman. So, to start the process, I took a photogrammetry scan of the head of Valentina, our actress, using Polycam.



After that, I took it into Unreal, created the Metahuman Identity, and converted the mesh into a Metahuman. It does take a bit of work, and I followed the tutorial provided by Epic Games.



This is the result.



This next week I will be changing the outfit and hairstyle to better match our actress's character. We will also be testing motion capture targeted to the Metahuman; there are some great tutorials I've found that will help with that.




Other things that I accomplished were changing camera moves and angles to be wider at the beginning and end.


This next week, my to-do list is:

- Rigorous test of Move.ai with the GoPros

- Complete CamTrackAR test comps

- Complete the Metahuman prep

- Coordinate and plan the greenscreen shoot on the 18th



"Out of the ashes of each passing project, rise to create again."
