“Karta” is the Swedish word for map. With KartaVR you can easily stitch, composite, retouch, and remap any kind of panoramic video: from any projection to any projection. The KartaVR plug-in works inside of Blackmagic Design’s powerful node-based Fusion Standalone 9 and Resolve 15 software. KartaVR provides the essential tools for VR, panoramic 360° video stitching, and image editing workflows.
KartaVR running in Fusion Standalone has a full set of custom “bin” icons you can set up to view all the macro nodes.
KartaVR unlocks a massive VR toolset consisting of 138 nodes, 57 scripts, and 6 macroLUTs that will enable you to convert image projections, apply panoramic masking, retouch images, render filters and effects, edit stereoscopic 3D media, create panoramic 3D renderings, and review 360° media in Fusion’s 2D and 3D viewers.
KartaVR integrates with the rest of your production pipeline through a series of “Send Media to” scripts. With a single click you can send footage from your Fusion composite to other content creation tools, including Adobe After Effects, Adobe Photoshop, Adobe Illustrator, Affinity Photo & Designer, PTGui, Autopano, and more.
KartaVR provides a large collection of Lua based pipeline scripts that ship with full source code, allowing for deep customization.
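As a rough illustration of the kind of pipeline script that ships with full source, here is a hedged Lua sketch that walks the selected Loader nodes in the current comp and collects their clip paths, which is roughly the first step a “Send Media to” style script has to take before handing footage to an external tool. The GetToolList and GetAttrs calls are standard Fusion scripting API; the loop and the handoff comment are illustrative and are not copied from KartaVR’s actual scripts.

```lua
-- Hedged sketch: gather the clip paths from the selected Loader nodes in the
-- active comp, the first step a "Send Media to" style script would take.
-- Runs in Fusion's script/console environment where "comp" is predefined.
local loaders = comp:GetToolList(true, "Loader")  -- true = selected tools only

local clips = {}
for _, tool in pairs(loaders) do
  local attrs = tool:GetAttrs()
  local clipName = attrs.TOOLST_Clip_Name and attrs.TOOLST_Clip_Name[1]
  if clipName then
    table.insert(clips, clipName)
    print(tool.Name .. " -> " .. clipName)
  end
end

-- "clips" could now be handed off to an external tool such as PTGui or Photoshop.
```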
Pricing and Availability
KartaVR v4 is freeware distributed exclusively through the Steak Underwater user community platform via the WSL Reactor Package Manager.
KartaVR v4 can be used on personal and commercial projects at no cost. KartaVR can legally be installed, for free, on an unlimited number of computers and render nodes via the Reactor Package Manager.
Reactor makes it easy to install the exact KartaVR tools and content you want on your system.
KartaVR works with Fusion (Free) v9, Fusion Studio v9, Fusion Render Node v9, Resolve (Free) v15+, and Resolve Studio v15+. KartaVR runs on Windows 7-10 64-Bit, macOS 10.10 – 10.14, and Linux 64-Bit RHEL 7+, CentOS 7+, and Ubuntu 14+ distributions.
KartaVR has low system requirements. If you can run Fusion Standalone or Resolve, you can run KartaVR. There is no need to upgrade your system to a big GPU when you are just starting out in 360VR production on a tight budget. KartaVR offers full cross-platform support and runs inside of Fusion Standalone 9 and Resolve 15 on macOS, Windows, and Linux.
New Features in KartaVR 4
Steak Underwater “Reactor” package manager support was added, along with a new full-featured KartaVR freeware license that allows commercial use of the VR tools for $0.
macOS based users of KartaVR can run the new “Video Snapshot” tool, which allows Fusion to capture live action footage from HDMI/SDI/USB video sources to disk. The captured media is accessed inside of Fusion using a managed Loader node that can be added to the foreground comp with a single click inside the “Video Snapshot” window. The tool can be used for stop motion animation work, a VFX supervisor can use it to grab footage from a video camera to help with on-set production comp-viz work, and an XR media producer can run a fast node based 360VR stitching test in Fusion to make sure the footage captured on location can be fine-stitched in post without any show-stopping issues.
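To give a sense of what the managed Loader step involves, here is a hedged Lua sketch that adds a Loader pointing at a captured clip. The AddTool, MapPath, and Clip calls are standard Fusion scripting API, while the file path and node name are made-up placeholders; this is not the actual Video Snapshot implementation.

```lua
-- Hedged sketch of adding a managed Loader that points at a captured clip.
-- The path below is a made-up placeholder, not where Video Snapshot writes files.
local capturePath = comp:MapPath("Comp:/VideoSnapshots/capture_0001.mov")

-- The -32768, -32768 coordinates ask Fusion to place the node at a default flow position.
local loader = comp:AddTool("Loader", -32768, -32768)
loader.Clip[comp.CurrentTime] = capturePath
loader:SetAttrs({ TOOLS_Name = "VideoSnapshotLoader" })
```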
Added an AcerWMRStereoRenderer3D macro (a Renderer3D based macro) that creates stereoscopic 3D 2880x1440px output from the Fusion 3D system. That interactively rendered output can be displayed directly on an Acer Windows Mixed Reality HMD on macOS/Win/Linux via a floating image view.
Added a ViewerAcerWMR2StereoOU node for displaying panoramic images on an Acer Windows Mixed Reality HMD on macOS/Win/Linux via a floating image view.
KartaVR Example 360VR Stitching Comps
KartaVR provides example compositing projects that include large media files and Fusion .comp files. This media will get you up to speed with node based live action panoramic 360° video stitching and photogrammetry workflows in Fusion.
Xmas 2017 Storm at Peggy's Cove, Nova Scotia, Canada
This video shows the stormy weather at Peggy’s Cove, Nova Scotia, Canada on December 25, 2017.
The winds were blowing from a south-westerly direction at 86 km/hr (~Beaufort Scale 9) when the video was recorded. The seas were rough with gale force winds causing lots of spray and white foam crests on the waves.
Another example video shows the results of a KartaVR based VR workflow R&D experiment titled “West Dover Forest Z360 Disparity Depth Stitch”. The end goal was to be able to create a node based 6DoF Stereo 3D scene in KartaVR that had depth based volumetric fog layered into the shot.
The forest scene started out with 6 fisheye views (three sets of stereo pairs) that were filmed using a Nodal Ninja panoramic head mounted on a Jasper Engineering 12″ stereo slidebar.
The fisheye footage was loaded into KartaVR and a node based stitch was done. A disparity mapping based approach made it possible to generate a LatLong formatted depthmap that matched the LatLong color image data.
Next, an omni-stereo 6DoF workflow was used in KartaVR where the color and depthmap based LatLong views were merged into a combined 360 based over/under image layout that had the color view on the top, and the depthmap view on the bottom.
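As a minimal way to picture that over/under RGBZ layout, the standalone Lua sketch below maps an output row to its source: the top half of the frame holds the color LatLong and the bottom half holds the matching depthmap row. The frame height in the example is an arbitrary assumption, and the sketch is only an illustration of the layout, not KartaVR code.

```lua
-- Minimal sketch of the over/under RGBZ frame layout: color on top, depth below.
local function sampleOverUnder(y, frameHeight)
  local half = frameHeight / 2
  if y < half then
    return "color", y          -- top half holds the color LatLong row
  else
    return "depth", y - half   -- bottom half holds the matching depthmap row
  end
end

print(sampleOverUnder(100, 2048))   -- color  100
print(sampleOverUnder(1500, 2048))  -- depth  476
```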
The KartaVR based Z360Merge, Z360VRDolly, and Z360Stereo nodes made it easy to convert the over/under formatted RGBZ data into a standard stereo pair of left and right eye LatLong views that could be displayed on an HMD or viewed in the standard viewer windows with anaglyph glasses.
The thing that makes KartaVR’s Z360 based 6DoF omni-stereo tools so useful is that you can use a Z360VRDolly node to freely rotate the color+depthmap based panorama on the yaw/pitch/roll axes. With the color and depth data isolated this way, the quality of the stereo depth is not degraded, since the final color left and right eye stereo LatLong views haven’t been generated yet.
Finally, the rotated Z360 frame is routed from a Z360VRDolly node into a Z360Stereo node, and your traditional left and right eye color LatLong frames can be generated. With Z360 you have the ability to adjust the virtual camera spacing and the convergence setting to produce a comfortable-to-view stereoscopic 3D image.
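The underlying idea can be sketched with a few lines of standalone Lua. This is a rough approximation for illustration only, not the actual Z360Stereo math: the depth value at a pixel, the virtual camera spacing, and an assumed convergence distance determine how far that pixel is shifted horizontally in each eye’s LatLong view.

```lua
-- Hedged sketch of depth-based omni-stereo view synthesis. For a viewing circle
-- of radius r (half the camera spacing), a point at distance d produces a
-- parallax angle of roughly asin(r / d); subtracting the angle at the
-- convergence distance puts zero parallax at that distance.
local function eyeShiftPixels(depth, cameraSpacing, convergenceDist, imageWidth)
  local r = cameraSpacing / 2
  local parallax = math.asin(math.min(1, r / depth))
  local converge = math.asin(math.min(1, r / convergenceDist))
  local shiftRad = parallax - converge         -- positive = in front of convergence
  -- convert the longitude shift into pixels across the 360 degree LatLong width
  return shiftRad * imageWidth / (2 * math.pi)
end

-- Example: 6.5 cm camera spacing, converge at 2 m, a point 1 m away, 4096 px wide LatLong.
-- One eye shifts by this amount, the other eye by the opposite amount.
print(eyeShiftPixels(1.0, 0.065, 2.0, 4096))
```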
An over/under Z360 color + depthmap image can have XYZ translations and rotations applied inside the Z360VRDolly node. Then a Z360Stereo node will generate a traditional over/under left and right eye view that you can view in an HMD.
At this point the Z360Stereo node lets you translate the Z360 footage back into a traditional omni-stereo left and right eye panorama. KartaVR’s Z360Stereo tool supports real-time stereo previews in the viewer windows, and it works both as a Viewer window macroLUT and as a traditional node you would add to the flow area.
Finally, since this was an R&D workflow test, CG rendered volumetric fog was added in KartaVR to the live-action stereoscopic 360° panoramic forest scene. The depthmap data was used to place the fog at the correct depth levels throughout the LatLong image. The fog was animated with keyframes to blow through the 6DoF scene over time, so you can see the varying fog density as the video clip plays.
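As a generic illustration of how a depthmap can drive fog placement (an exponential-fog sketch, not the exact node setup used in this comp), each pixel’s fog amount grows with its depth value, so distant parts of the LatLong image pick up more fog than the foreground:

```lua
-- Hedged sketch of depth-driven fog compositing: blend each pixel toward the
-- fog color by an amount that grows exponentially with its depthmap value.
local function fogBlend(r, g, b, depth, density, fogR, fogG, fogB)
  local f = 1 - math.exp(-density * depth)   -- 0 near the camera, toward 1 far away
  return r + (fogR - r) * f,
         g + (fogG - g) * f,
         b + (fogB - b) * f
end

-- Example: a mid-distance pixel with a moderate fog density (the density could
-- be keyframed over time to animate the fog blowing through the scene).
print(fogBlend(0.2, 0.3, 0.1, 12.0, 0.05, 0.8, 0.8, 0.85))
```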