
Latest posts




Scene export doesn't contain focus distance

Hello!


I've got a shot that has some parsed metadata and dynamic focus, which I've linked in the clip input node. I can see it through the scene, but it doesn't seem to get collected when I export the scene, neither as a .fbx nor as a .usd file. Is it possible to have that in the file so the 3D department can use the focus distance?



Stabilizing a tracked shot.

Hi there, I'm super happy we have a community place where we can get help! I have a couple of handheld shots where I have to track the camera as well as a moving object, and also then line up the objects as close as possible.

I've done the camera and object tracking already.


What's the best workflow for stabilizing the shots?


Aside from a few CG elements, I'm gonna do most of the post in Fusion. I have an idea of how to chain Fusion's native tracking to track and stabilise. I'm not really clear on how to stabilise the PFTrack-imported camera, but that's a question for another forum. However, since I'm doing both camera and object tracking in PFTrack, it would be convenient if I could also stabilise there and get it out of the way.


Any tips would be appreciated.

Cheers!

Adam Hawkes
6 Nov

Hi Milko, welcome to the community!


There isn’t a way to stabilise the plate in PFTrack, and it’s usually best not to run a stabilisation pass at this stage, either before tracking or after solving. Doing so can cause alignment issues later, especially with STMaps or 3D data, because stabilising effectively translates the pixels, moving both the centre of projection and the lens distortion on each frame and creating inconsistencies.


So the best workflow is: track your camera and object in PFTrack, export to Fusion, and handle any plate stabilisation there. That keeps your tracking and distortion data clean and your comp flexible.


I’m pretty sure you can import 2D trackers as a .txt file to lock or smooth motion in the compositing system, but maybe other users familiar with Fusion will be able to help with that?
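If you do go down the .txt route, a parser for such a file might look like this minimal Python sketch. The column layout here (frame, x, y per line) is an assumption, so check what your PFTrack version actually writes before relying on it:

```python
# Hypothetical example: parse a simple per-frame 2D tracker export.
# Format ASSUMED here: "frame x y" per line, one file per tracker --
# verify the actual layout written by your PFTrack version.

def parse_tracker_txt(text):
    """Return {frame: (x, y)} from whitespace-separated lines."""
    samples = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 3:
            continue  # skip blank lines or headers
        frame, x, y = int(parts[0]), float(parts[1]), float(parts[2])
        samples[frame] = (x, y)
    return samples

if __name__ == "__main__":
    demo = "1001 960.5 540.2\n1002 962.1 538.9\n"
    print(parse_tracker_txt(demo))
```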


FBX Export Without Camera Keyframes

Hi,


I've tracked a shot in PFTrack and need to export the result as an FBX to open in C4D.


The shot starts at frame 1955497 and ends at 1955582.


When I use the Scene Export node, the FBX exports without any problems. However, when I open the file, the only camera keyframe is at F0.


The trackers export correctly.


I’ve tried adjusting the Frame Offset in both the Scene Export node and C4D, but the camera keyframes still don’t appear.


Adam Hawkes
29 Oct

Hello Henrique, welcome to the community!


It could be you’re running into a frame-time mismatch between PFTrack and Cinema 4D. The FBX format is time-based, meaning it relies on the frame rate to correctly convert frame numbers into time values during export and import. If PFTrack and C4D are using different frame rates, your camera keyframes might end up in the wrong place.


Here are some things you can do:


  1. Check that both applications use the same frame rate. Make sure your PFTrack project and C4D scene are set to identical FPS values.


  2. Inspect the Scene Attributes window in C4D. After importing the FBX, open the Scene Attributes and look at the Time Min/Max and Preview Min/Max fields to confirm the frame range matches your shot.


  3. Set the correct frame offset before export. If you want your exported FBX to start at time = 0, set the frame offset in PFTrack to the negative of your first frame number (for example, if your shot starts at frame 1001, set the offset to -1001).
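As a rough illustration of the frame-rate dependence in step 1 and the offset in step 3, here is a small Python sketch. The FPS values are hypothetical, purely to show how a mismatch displaces keyframes:

```python
# Rough illustration (hypothetical FPS values): FBX stores keyframes as
# time values, so a frame number only survives a round trip when both
# applications agree on the frame rate.

def frame_to_seconds(frame, fps):
    return frame / fps

def seconds_to_frame(seconds, fps):
    return round(seconds * fps)

export_fps, import_fps = 24.0, 25.0  # mismatched rates
t = frame_to_seconds(1955497, export_fps)
print(seconds_to_frame(t, import_fps))  # no longer frame 1955497

# Step 3 above: offset that moves the first frame to time = 0
first_frame = 1001
offset = -first_frame
print(first_frame + offset)  # 0
```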



Possible to get a survey from a 360 camera?

Hi! I've asked something close to this before, but I'm getting an error when I've solved part of a 360 camera and selected some frames to use with a "match camera" node. Does the "match camera" node only work when the survey shot has gone through the "photo survey" node? I'm getting a really good solve when solving with a "camera solve", and a not-so-good solve when using the "photo survey" node.


But I can't initialize the "match camera" node if I've solved the track with the regular "camera solve" node, only when it's been matched with the "photo survey" node.

ree

Match Camera uses a point cloud to match features of your footage, not another camera. It depends on what you want to achieve with it. You have to first solve your 360 footage using a spherical track, and then extract the view that you want to "match" to a different camera using the Survey Solver. You have to enable "Already solved" on the second input of the Survey Solver node.


Using GoPro metadata as IMU for a different camera

I've been playing with the idea of using the gyro data in action cameras for a long time (and, scattered across the web, there's lots of evidence of people using it, but not of the "how"). The idea is simple: attach the GoPro to the main camera rig and record at the same time as the footage (perfect sync doesn't matter; the IMU data records at 240 tps), then sync and export the IMU as a .csv (which is how I see other people deliver motion data). But how do we use said .csv file inside PFTrack? How do people do this the proper way? Say we have a Lumix GH5s with the GoPro attached to the hot shoe (or an Insta360 or a DJI Osmo), and we have both the footage and the .csv file: how can we use that within PFTrack? Having to always embed that data externally through EXR seems unnecessarily complicated when we…

Adam Hawkes
23 Oct

Hello Angel, welcome to the community.


There are essentially two separate problems to solve here:


  1. Loading the metadata from a file and device that PFTrack doesn’t natively support.

  2. Applying that data to another camera, in this case the main (hero) camera, which is slightly offset from the IMU.

The first step is to write a small Python script to parse the CSV and translate the data into something PFTrack understands. You’ll need to inspect what the CSV actually contains and determine how its information maps to PFTrack’s coordinate system. For example, there are many different conventions for pitch/yaw/roll data (different axis orders, sign conventions, and units), so you’ll have to work out how the IMU defines its rotations.
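As a rough sketch of what such a script could look like: the column names (`time_s`, `pitch_deg`, `yaw_deg`, `roll_deg`) and the axis/sign remapping below are placeholders, not the layout of any particular camera's export, so verify both against your actual file:

```python
import csv
import io
import math

# Hypothetical sketch: parse an IMU CSV (columns ASSUMED to be
# time_s, pitch_deg, yaw_deg, roll_deg -- inspect your actual file)
# and remap into pan/tilt/roll in radians. The axis mapping and the
# roll sign flip are placeholders you must verify for your device.

def load_imu_rows(csv_text):
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "time": float(row["time_s"]),
            "pan": math.radians(float(row["yaw_deg"])),
            "tilt": math.radians(float(row["pitch_deg"])),
            "roll": -math.radians(float(row["roll_deg"])),  # example sign flip
        })
    return rows
```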


Frame synchronisation is another important consideration. You need to make sure each line in the CSV corresponds to the correct frame of the hero shot. This can get tricky when working with streaming formats like MP4, which are common for these action cameras.
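The frame-to-sample bookkeeping itself is simple arithmetic once a sync offset has been established; for example, assuming a fixed 240 samples/s log:

```python
# Sketch of mapping hero-shot frame numbers to IMU sample indices,
# assuming the IMU logs at a fixed rate (240 samples/s here) and the
# sync offset in seconds has already been measured.

def imu_sample_for_frame(frame, fps, imu_rate=240.0, sync_offset_s=0.0):
    """Index of the IMU sample nearest the start of `frame`."""
    t = frame / fps + sync_offset_s
    return round(t * imu_rate)

print(imu_sample_for_frame(0, 25.0))   # first frame, first sample
print(imu_sample_for_frame(25, 25.0))  # one second into the shot
```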


The benefit of EXR is that the metadata is embedded directly in the frame exactly where it's needed.


Once the data is successfully imported, you could use it as a guide or hint to help solve the hero camera. In practice, IMU or gyro metadata often isn’t precise enough to be used directly for compositing, but it can provide a useful starting point for the camera solve.


You also mentioned an alternative workflow involving a “perfect point.” I’m not entirely sure what’s meant by that term, but it sounds similar to using a survey point. You could potentially track your 360° footage, export the resulting point cloud, and then re-import that data into PFTrack as survey points. From there, you could use the Survey Solver node to help align and solve the hero camera.


Tracking whip pan shots

What is the recommended PFTrack workflow for tracking whip pan shots? These are notoriously hard to track, since there is a blurry middle part that bridges the handheld start and ending. I would like to know the proper way of handling these, I have not found any specific videos for this, official or unofficial.

Adam Hawkes
22 Oct

Hi Josh, welcome to the group!


Whip pans can be tricky to solve. It’s hard to give specific advice without seeing your shot, as there’s no single approach that works for every case. If you’d like more detailed help, you can share your clip here via a link or use Enterprise Support for specific assistance. After you’ve set up your camera and lens, there are some general tips for both tracking and solving:


Tracking

For whip pans, supervised tracking (User Tracks) is generally the most reliable approach because it gives you full control over feature placement. If a tracker fails on a particular frame, you can always manually adjust its position on each frame. Distribute your trackers across both the foreground and background to give the solver as much depth information as possible.


Solving

Start with a short section of your clip where the motion is less extreme, usually at the beginning or end of the shot. Make sure there’s still some parallax visible. Use these frames to generate an initial camera solution; this gives PFTrack a stable reference before you solve the rest of the shot. After the solve, inspect your camera curves. Whip pans often produce jumpy curves due to motion blur, so smooth or tweak the curves where needed to keep the motion looking natural and consistent.
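To illustrate the kind of curve smoothing meant here, a centred moving average over per-frame values looks like this. PFTrack's own curve-editing tools are the usual route; this is just a sketch of the idea:

```python
# Minimal sketch: smooth a jumpy per-frame curve (e.g. a camera
# rotation channel) with a centred moving average. The window is
# clamped at the clip boundaries so the first and last frames are
# averaged over fewer samples.

def smooth(values, radius=2):
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

A larger `radius` smooths more aggressively but risks washing out the fast motion that makes a whip pan read as a whip pan, so keep it small around the blurred middle section.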


Revised Support Community problems

I was surprised when I came to check on the support community to see that it has changed again. I no longer see the old organization structure, and when I click on any of the links to my posts I get a "file missing" error, so my previous items are missing (in fact, I can't see any history). Also, notifications are sometimes empty and sometimes show my old posts, but again, I can't read them. Are these known bugs still being worked on?


Why the change? The previous format was working well (and was actually seeing an uptick in activity in the last few months).

Adam Hawkes
21 Oct

Hello,


You may have missed the notification we shared on October 9th regarding the recent updates. We were happy with how the forum was running before, but unfortunately, some background changes made by our hosting provider, which were beyond our control, required us to implement these updates.


We were also unable to migrate the last few threads from the previous forum (including yours), which is why you’re having trouble viewing them. This again was due to the changes made by the hosting provider. However, you can still find all your other threads by using the search tool at the top right of the page.


Postshot export script

This is an experimental export script for exporting cameras and points from PFTrack 24.12.19 and later to Jawset Postshot (https://www.jawset.com/) for Gaussian Splat training.



Before using the script, please review the usage guidelines below to get the best results.


Download
Download zip file




Lock Object Motion script

This script is for PFTrack 24.12.19 and later, and can be used to transfer the motion from a moving object geometry track to a camera, keeping the object locked in position in the first frame.



To use the script, download and unzip the file into your Documents/The Pixel Farm/PFTrack/nodes folder and relaunch PFTrack. This will create a new node called Lock Object Motion in the Python node category.
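For those curious about the maths the description implies, here is a hedged sketch (not the script's actual code) of transferring object motion to the camera. It assumes column-vector 4x4 world matrices; if O_f is the object's matrix on frame f and C_f the camera's, keeping the object fixed at its first-frame pose O_1 while preserving the same camera-space relationship gives a new camera C'_f = O_1 · inv(O_f) · C_f:

```python
import numpy as np

# Sketch of "locking" a moving object at its first-frame pose by
# moving its motion onto the camera instead. Conventions ASSUMED:
# column-vector 4x4 world matrices; verify against the actual
# Lock Object Motion script before relying on this.

def lock_object_to_first_frame(object_mats, camera_mats):
    """Per-frame camera matrices that keep the object at its frame-1 pose."""
    o1 = object_mats[0]
    return [o1 @ np.linalg.inv(of) @ cf
            for of, cf in zip(object_mats, camera_mats)]
```

On frame 1 the camera is unchanged (O_1 · inv(O_1) = identity); on later frames the object's motion relative to frame 1 is folded into the camera.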


Download

http://lockObjectMotion.py.zip


