OBJ Textures and Rendering
I feel like I've run down a rabbit hole with how PFTrack handles OBJs with textures.
I've been slowly adding my own custom primitives and objects for image modeling or for checking my solution. I'd worked out how to get .obj files from my right-handed software into PFTrack's left-handed space, but I could never figure out why my textures never rendered (I really hadn't spent the time to investigate until now). I finally dug in on this and discovered that PFTrack only reads the texture from the MTL if a relative path is used. I totally understand that relative paths make moving OBJs more portable, but I was surprised that absolute paths don't work at all. Is there a reason for this? It seems like either should be handled.
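In case it helps anyone else hitting the same thing, here's a rough sketch of a script that rewrites absolute `map_*` texture paths in an .mtl to paths relative to the .mtl's own directory. The function name is my own and this assumes one path per `map_*` line (no statement options), so treat it as a starting point rather than a robust parser:

```python
import os

def make_mtl_paths_relative(mtl_text, mtl_dir):
    """Rewrite absolute texture paths in MTL text relative to mtl_dir.

    Assumes each map_* statement is just 'map_Kd <path>' with no options.
    """
    out = []
    for line in mtl_text.splitlines():
        stripped = line.strip()
        if stripped.lower().startswith("map_"):
            key, _, path = stripped.partition(" ")
            if os.path.isabs(path):
                # Convert e.g. /textures/wood.png -> textures/wood.png
                path = os.path.relpath(path, mtl_dir)
            out.append(f"{key} {path}")
        else:
            out.append(line)
    return "\n".join(out)
```

Running it over an .mtl next to the .obj (passing the directory that contains the .mtl as `mtl_dir`) gives paths PFTrack seems happy with.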
I also noticed that I get different rendering behavior for the OBJs I create depending on the node I add them to. When I create my OBJs for use with PFTrack, I have to flip the normals to get them to render properly in the Image Modeling node, but when I copied one of these to the Objects directory and added it to the Test Object node, it looks like it is flipped inward. I know the Test Object node has a coordinate system parameter, but the Image Modeling node does not. The same OBJ should work in both Image Modeling and Test Object, right? Lastly, with the Test Object node, when I set the display render to Texture, I can't get the texture from the MTL to render. It's a copy of the same object I'm using in the primitives node, so I would expect it to work. Are there details I'm missing here, or are these bugs?
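For anyone curious, this is roughly how I handle the right-handed to left-handed conversion before export. This sketch assumes the common convention of negating Z and reversing face winding so faces stay front-facing after the mirror; the function name is mine, and it only handles the basic `v`/`vn`/`f` statements:

```python
def rh_to_lh_obj(obj_text):
    """Mirror a right-handed OBJ into a left-handed space.

    Negates the Z component of positions and normals, and reverses
    face winding so the mirrored faces remain front-facing.
    """
    out = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] in ("v", "vn"):
            x, y, z = map(float, parts[1:4])
            out.append(f"{parts[0]} {x} {y} {-z}")
        elif parts and parts[0] == "f":
            # Mirroring flips winding order, so reverse the vertex list
            out.append("f " + " ".join(parts[:0:-1]))
        else:
            out.append(line)
    return "\n".join(out)
```

This keeps the normals consistent with the geometry, which is why I was surprised the same file renders differently between the two nodes.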

Thanks so much, Adam. 😀 Always appreciate the added detail. The normal flipping for the left/right coordinate system was a pretty easy one to deal with. I already went in and updated the .mtl files to use relative paths, but it's good to know that both should work. I should really be using relative paths anyway and just need to change some export settings for that, but in case I forget, now I know what else to look for. Thanks for looking at the other items too. Cheers