This week I tried live streaming my face by connecting Live Link Face to Unreal Engine. Hopefully next week we'll be able to record body and face movement together, streaming directly into Unreal Engine. https://youtu.be/n03edMR40WU

Move list:

bench press - will, jaye, summ
burpees - will, summ
curls - will, jaye
deadlift - will, jaye, summ
lunge - will, jaye, summ
plank - will, summ (jaye - coach)
pushup - will, jaye, summer
russian twists - will, summer
squat - will, jaye, summ
step up - jaye, summ (will - coach)

We realized that we didn't start with a T-pose in last week's motion capture session, which means the data would be really difficult to retarget. So this week we made especially sure that our actors started in a T-pose before recording.
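To make the T-pose issue concrete: retargeting maps each joint's captured rotation onto the target rig relative to a known, shared rest pose. If the actor's rest pose is unknown (or differs from the rig's), every frame carries an unremovable offset. Below is a minimal Python/NumPy sketch of this idea, not our actual Motive/Unreal pipeline; the angles and the single-joint setup are hypothetical, chosen only to show how a rest-pose mismatch is corrected.

```python
import numpy as np

def rot_z(deg):
    """3x3 rotation matrix about the Z axis, angle in degrees."""
    t = np.radians(deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Rest orientation of one joint (say, an upper arm).
# If the actor is captured in a T-pose this matches the rig's rest pose
# and the correction below is the identity. Here we pretend the actor
# stood in an A-pose instead, ~45 degrees off the rig's T-pose.
source_rest = rot_z(45)   # actor's rest pose (hypothetical A-pose)
target_rest = rot_z(0)    # rig's authored rest pose (T-pose)

# One-time correction mapping the actor's rest frame onto the rig's.
# This is only computable if you know the actor's rest pose.
offset = target_rest @ source_rest.T

# A captured frame: the arm lifted 30 degrees past the actor's rest.
captured_frame = rot_z(45 + 30)

# Applying the correction recovers the motion relative to the rig.
retargeted = offset @ captured_frame

print(np.allclose(retargeted, rot_z(30)))  # True
```

Without a recorded T-pose, `source_rest` has to be guessed, and every retargeted frame inherits whatever error is in that guess, which is why starting each take in a T-pose matters.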

This week we started with Ryan's football movements and then moved on to my project. During the transition, while trying to create new skeletons for my actors, we spent a good chunk of time figuring out why we couldn't create new characters. We all thought the tracking markers were causing the issue, so we recalibrated the cameras, but at the end of the day we realized it was actually Motive that wouldn't let us create more than three skeletons.

I want to use the motion capture data in Unity to build a VR experience. This format makes the archive more than a static recording—it lets people step inside and engage with the movements from different perspectives, whether as a trainer, a peer, or the performer themselves. Unity makes this technically feasible and accessible, while VR preserves the embodied, spatial quality of the movements in a way that flat video cannot.

BTS of this week's motion capture session: