This is an in-depth look at how we filmed the music video for Boomboxhead 2 inside virtual reality. Watch on YouTube: https://www.youtube.com/watch?v=a5eT5UpHDiY
- Song Title: Walking Talking Boomboxhead
- Album: Boomboxhead 2
- Shooting Date: Sat Oct 19th 2019
- Time: 10 PM Central
- Discord: painterpainting#5603
- One moving camera
- One continuous shot for the entire video length and one green-screen composite VFX shot.
- The camera animation is smooth, moving around the studio and focusing on each set/facade and the subject of each scene.
- The set was designed with photogrammetry scans, and the camera movement was manually animated to the musical phrase changes.
- There is a real-time light on the camera to light the subjects, and the camera starts and stops with an action triggered by the director.
- Each shot is about 10-20 seconds of focus on a subject, with continuous camera movement for a handheld feel throughout the video.
- At the end of the video, during the camera movement in the sky, we respawn/teleport everyone, bringing them together for one final shot.
- Each set "backdrop" features avatars in full-body tracking: some dancing, some in PC mode, some standing, and others sitting or acting naturally.
- 20 avatars, 3 dance scenes (choreographed or improvised)
Photogrammetry Sets by JIN captured in LA
- Starts out on a Boomboxhead sitting on a bench with a boombox at his side. Logo overlays of the cast and crew animate through a logo sequence.
- The beat drops, the logos go away, and the camera motion starts. The camera reveals one dancer starting to pop and lock to the beat on the back wall, then pans to a person sitting on the stoop (RollTheRed), nodding his head.
- The camera flies over to the camera controls and then to the first set of the cast on a photogrammetry backdrop. The camera zooms in and out for 10 seconds.
- The camera flies out and right over to a dancer (with a few of the crew in the background); when the camera stops he starts to do breakdancing moves as the camera pans in and out.
- Then the camera pans out and over to the studio backdrop, where dancers perform for 15 seconds as the camera pans in and out as well.
- The camera rolls out and to the side, revealing the main video console with Boomboxhead as the subject (a fast transition).
- The camera rotates and zooms into the next set.
- The camera then tracks to the side and goes through the hallway, revealing merch and product placements for the album (album marketing has been placed throughout the video as 3D objects, posters, and images). There is a person in the hallway signaling the camera to "come on, this way!"
- The camera reveals the outside area. Once the camera pans and goes outside, it pans into the sky, and then we do a VFX shot revealing the music video inside a TV screen in the VR recording studio, with a camera pan in and back out into the music video for the final shot.
- The final shot is a wide-angle dolly shot moving backwards. Everyone respawns while the camera is focused on the sky. Each avatar finishes in a unique pose.
- At the last beat drop the video cuts to credits: a black screen with glitch effects and links to behind-the-scenes footage and documentation.
We created a pre-visualization of the final video to home in on the details, context of shots, and framing. Stand-in markers (avatars) were used to capture the focal length, depth of field, and general look in a format we could share internally with our production team. To get everyone on the same page we used the previsualization video to communicate the project details to the production manager, cast, and crew. Documents like this one were written as reference so that everyone could follow along with the project.
Green Screens Used In Production
| Cast | | |
| --- | --- | --- |
| Kinetic Dance Crew | .JIN | IzayaShi |
| Forceable | Snick Digglers | Ronn |
| J4key | Vibe Crew | Text |
| Lowren | JJFX-Multimedia | Cece |
| Moximox | ItsLumi | NWKayz |
| Rocktopus | Lowrhen | RollTheRed |
| AtomJayy | AtomJayy | Boomboxhead |
| Tierson | VRPill | NoLogicDavid |
We contacted dancers, actors, vtubers, and personalities that we wanted featured in the video and sent them the previs three days in advance. From there they could see the plan and consider being part of the shoot. In total, three dance crews signed on to the shoot. We built our cast and crew completely over the network, regardless of location, saving time and money for the entire production while sourcing 20+ talented people for the shoot.
Destiny
.Storm
Inevanable
Cascadian
Peptron1
Bigtin
Toasty
Chano
DyllanTheVillainn
TheLastCrotch
Big Boi Gator
Bob_The_Wizard
Wiggly Piggly
Tree 224c
- Godfrey - Director / Producer
- Jin - Production Manager - Photogrammetry / Set Design
- Rocktopus - Photographer / Camera Tech Assistance / Virtual Room Moderator
Markers were used to direct the cast on where to stand.
- Baked studio lighting
- One camera light attached to the animated camera
- Baked reflection probes
- Set all switches to Local so that the actors do not need to render the lights or reflection probes (optimization)
- Baked lighting to blend between photogrammetry and 3d geometry
- Added street lamps and red exit lights to give some color to the scenes
- Used light probes
Real-time lighting was used (one light on the camera)
- Repainted textures for album posters and props
- Repainted the metallic channel on various sets for water spots
- Mixed 3D objects and scanned objects
- Added normal maps to photogrammetry assets
- Optimized textures with crunch compression, downscaling most to 1024 resolution (a rough editor sketch of this pass follows this list)
- Used the Standard shader
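As a rough illustration of that texture pass, here is a minimal Unity editor sketch that batch-applies crunch compression and a 1024 maximum size through the TextureImporter. The class and menu names are hypothetical; in production these settings can just as well be changed by hand in each texture's import settings.

```csharp
// Editor-only sketch (hypothetical class/menu names): batch-apply crunch
// compression and a 1024 max size to every Texture2D in the project via
// the TextureImporter. Assumes Unity 2017.3+ where crunchedCompression exists.
using UnityEditor;

public static class TextureOptimizationSketch
{
    [MenuItem("Tools/Optimize Textures (Sketch)")]
    private static void OptimizeAllTextures()
    {
        foreach (string guid in AssetDatabase.FindAssets("t:Texture2D"))
        {
            string path = AssetDatabase.GUIDToAssetPath(guid);
            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer == null) continue;

            importer.maxTextureSize = 1024;       // downscale most textures to 1024
            importer.crunchedCompression = true;  // enable crunch compression
            importer.compressionQuality = 50;     // crunch quality (0-100)
            importer.SaveAndReimport();
        }
    }
}
```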
The camera control system we used is available on GitHub if you would like to use it to create your own videos in VRChat.
- Camera on a track
- Added in 9 other cameras for B-shots (in case we need more shots)
- Added camera handheld shake with the Standard Assets Unity script (a Perlin-noise stand-in is sketched after this list)
- Added Post Processing Stack v2 for motion blur plus color correction
- Local camera on/off switch
- Animated the camera manually to the musical phrase changes in the beat
- Set the camera near clipping plane to 0.2 (from 0.1) to stop Z-fighting issues with depth resolution
- Forward Rendering Path
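We used the Standard Assets script for the handheld shake itself, which we don't reproduce here. The component below is a hypothetical stand-in (names and values are ours) that layers low-frequency Perlin-noise rotation on top of the keyframed camera track to get a similar handheld drift.

```csharp
// Hypothetical stand-in for the Standard Assets handheld shake we used in
// production: layers low-frequency Perlin-noise rotation on top of the
// keyframed camera move. Attach to a child of the animated camera rig so
// the noise does not fight the baked track.
using UnityEngine;

public class HandheldShakeSketch : MonoBehaviour
{
    public float frequency = 0.5f;  // how quickly the drift wanders
    public float amplitude = 1.5f;  // maximum rotation offset in degrees

    void LateUpdate()
    {
        float t = Time.time * frequency;
        // PerlinNoise returns 0..1; remap to -1..1 for a centered drift.
        float pitch = (Mathf.PerlinNoise(t, 0f) - 0.5f) * 2f * amplitude;
        float yaw   = (Mathf.PerlinNoise(0f, t) - 0.5f) * 2f * amplitude;
        transform.localRotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```

Because this kind of noise is procedural rather than baked into keyframes, separate takes drift differently, which is exactly the match-moving problem described in the lessons-learned list further down.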
We aim to use optimized avatars, but there will be about 20 in the shoot and the scene is 150 MB+. To improve the user experience we designed the room with switches that locally control the lights and reflection probes. Furthermore, only the camera operator needs to have everything rendering, while the other users can hide/show objects as needed.
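As a minimal sketch of what such a local switch looks like, assuming plain Unity rather than the VRC SDK trigger setup we actually configured in the inspector, the hypothetical component below toggles heavy objects only on the client that presses it:

```csharp
// Hypothetical illustration of a "local" switch: toggles heavy objects
// (lights, reflection probes, props) only on the client that uses it,
// since nothing here is networked. Visitors can hide what they don't need
// while the camera operator keeps everything rendering.
using UnityEngine;

public class LocalRenderSwitchSketch : MonoBehaviour
{
    public GameObject[] heavyObjects;  // lights, reflection probes, props

    // Hook this up to a button or interact event on the switch object.
    public void Toggle()
    {
        foreach (GameObject go in heavyObjects)
        {
            if (go != null)
                go.SetActive(!go.activeSelf);
        }
    }
}
```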
- Made items pickupable for props
- Music and camera animation Start/Sync trigger with the VRC SDK (a plain-Unity sketch of this start/cue logic follows the list)
- Added a RED/GREEN visual cue for the actors, triggerable with the VRC SDK
- Added colliders for actors to stand on
- Added standing markers for the previs video
- Added seats in certain spots so actors can sit down
- Added GIF animations to the TV screen for animation and movement
- Used a projector to overlay the transparent PNGs for the logos
- Moved the spawn points outside so that we can all respawn at the end for the final shot
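The start/sync trigger and the RED/GREEN cue were built with VRC SDK triggers in the inspector; the hypothetical component below is only a plain-Unity approximation of that logic (all names are ours), starting the music and the camera animation together and flipping the cue light:

```csharp
// Plain-Unity approximation of the start/sync trigger and RED/GREEN cue
// (hypothetical names; the real world wired this up with VRC SDK triggers).
// One event starts the music and the camera animation together and flips
// the cue light to green so the cast knows the take is rolling.
using UnityEngine;

public class ShootStartSketch : MonoBehaviour
{
    public AudioSource musicTrack;  // the song
    public Animator cameraRig;      // the animated camera track
    public Renderer cueLight;       // RED/GREEN visual cue for the actors

    public void StartShoot()
    {
        cueLight.material.color = Color.green;  // "rolling" cue
        musicTrack.Play();                      // start the song
        cameraRig.SetTrigger("Start");          // hypothetical trigger name on the camera rig
    }

    public void StopShoot()
    {
        cueLight.material.color = Color.red;    // back to standby
        musicTrack.Stop();
    }
}
```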
We used projectors with alpha to project PNGs (logos) into the scene to brand the cast and crew. By swapping between them with an in-game animation on the intro/outro, we entirely eliminated that post-production edit.
Virtual Reality Production Studio by Godfrey Meyer
We created a credits list and took screenshots of the users in-game who helped us with all aspects of production. We made sure to collect the URL of everybody involved in the project for social media. We link everyone who was involved in the production in the description and metadata, as this was a collaboration and many people were involved in making it possible.
We set up a communication channel on Discord for all of the cast and crew who corresponded and signed on. On the day of filming we all meet up in virtual reality (the production crew arrives 30 minutes early to set up the room) and do rehearsal takes, directing the actors, who have had a few days to choreograph their parts. Once everyone is on the same page, we shoot the entire music video in one take.
- The music video "Walking Talking Boomboxhead" accompanies the release of the album "Boomboxhead 2" by Godfrey Meyer, coming out Nov 5th. After the video is live, a public world will be released with links to the video, case study, and behind-the-scenes production.
- Occlusion culling turned on for performance
- Camera desync
- Network limitations: syncing physics over the network introduces delay and unpredictable behaviors in the animation.
- Originally we wanted to put an avatar in a seat and have him hold a camera, but we decided to use the Standard Assets handheld-camera script to simulate handheld camera movement, as it would be more predictable in the final shoot.
- Used Cinema 4D to render out moving cubes with audio data for feedback
- First we tried a camera match move starting from the animated camera track in Unity. We used FBX Exporter to export the camera track and scene geometry to Cinema 4D, a third-party 3D application. Using World Space export, the tracks were fine, but the focal length, sensor size, and horizontal/vertical field of view in the Cinema 4D camera settings did not match the field of view in Unity, so we had a camera mismatch and decided to render the sky in real time instead (the FOV relationship is sketched after this list).
- Alpha mattes concept: since one person currently records the output of the virtual production to OBS, there are no depth maps, alpha mattes, or other render buffers designed into VRChat's output. We found that doing the animation and recording everything in real time inside VRChat worked best for our needs, but we will eventually need additional render buffers, as well as a recording of all telemetry/locomotion of everything in the room for real-time playback. This would enable a clean camera import/export workflow for third-party compositing and standard VFX pipelines.
- We didn't rehearse the final shot (with 20 avatars) and found that, on the computer we were using to record, frame drops and lag were introduced with that many avatars being rendered together with the full post-processing stack, real-time lighting, and shadows. Next time, a top-of-the-line computer will be a must for recording with no frame drops or lag (other than network lag).
- In post-production, it was very difficult to line up the different takes (an empty BG plate and the two full takes that we shot). We used a script on the camera for handheld movement, but because that movement was not baked, all of the takes were slightly different and not camera-locked. This made camera match moving in post-production almost impossible.
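For reference, the FOV relationship behind that mismatch: Unity's Camera.fieldOfView is a vertical field of view in degrees, while Cinema 4D derives its fields of view from focal length and sensor size. The quick check below uses illustrative numbers, not our production settings:

```csharp
// Quick check of the FOV relationships behind the Unity / Cinema 4D mismatch.
// Vertical FOV = 2 * atan(sensorHeight / (2 * focalLength)); horizontal FOV
// follows from the sensor width. Unity's Camera.fieldOfView is the vertical
// FOV in degrees, so an export that assumes a different sensor size (or
// treats the value as horizontal) produces a mismatched match-move.
// The numbers below are illustrative, not our production settings.
using System;

public static class FovCheck
{
    public static void Main()
    {
        double focalLength  = 36.0;  // mm (example)
        double sensorWidth  = 36.0;  // mm (example)
        double sensorHeight = 24.0;  // mm (example)

        double vFov = 2.0 * Math.Atan(sensorHeight / (2.0 * focalLength)) * 180.0 / Math.PI;
        double hFov = 2.0 * Math.Atan(sensorWidth  / (2.0 * focalLength)) * 180.0 / Math.PI;

        Console.WriteLine($"vertical FOV:   {vFov:F2} deg");
        Console.WriteLine($"horizontal FOV: {hFov:F2} deg");
    }
}
```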
Merch Product Placement In The Video for Album Release
Ron
MehStrongBad - Technical Help
VRChat SDK / Company
Gunter - Assets
VRC Community Prefabs
Jetdog Prefabs
RollTheRed
- VRC Virtual Production Camera System https://github.com/gm3/virtualproduction-vrchat
Godfrey Meyer Dev Portfolio https://gm3.github.io/devportfolio/
Boomboxhead http://www.boomboxhead.com
Boomboxhead 2 Album Release: https://distrokid.com/hyperfollow/godfreymeyer/boomboxhead-2
Gunters Universe https://www.youtube.com/channel/UCWVxbkYs52eX0FatbJCKBbg
VRC Community Prefabs https://twitter.com/vrcprefabs?lang=en
Jetdog Prefabs https://github.com/jetdog8808/JetDogs-Prefabs
RollTheRed https://www.youtube.com/channel/UCrbP1bylN30XJZobnq8O2rw
Moximox https://www.twitch.tv/moxi_moxi
VIRZ https://twitter.com/VirzM8
Forceable https://www.youtube.com/user/TheForceableDoom
J4key https://www.youtube.com/user/JJ990
Lowrhen https://www.twitch.tv/lowrhen
NoLogicDavid https://www.youtube.com/channel/UC_OZ31SBtcSkG19vx1_tRXw
VRChat http://www.vrchat.com
Unity http://www.unity3d.com
Ron https://twitter.com/vrdesignguy
Rocktopus https://twitter.com/mjmurdoc
AtomJayy
Tierson
.JIN http://www.twitter.com/dankvr
Snick Digglers
Vibe Crew
JJFX-Multimedia https://twitter.com/jjfx_multimedia?lang=en
VIRZ https://twitter.com/virzm8
ItsLumi https://www.twitch.tv/itslumivr
VRPill https://twitter.com/vrpill?lang=en
Izayashi https://twitter.com/Izayashii
Ronn
Cece https://www.twitch.tv/cecevr
NWKayz https://twitter.com/nwkayz