Hollywood director Carl Erik Rinsch found guilty of swindling Netflix out of $11M for luxury cars, watches and mattresses
https://www.dailymail.co.uk/tvshowbiz/article-15378535/Keanu-Reeves-director-Carl-Erik-Rinsch-guilty-Netflix.html
https://redd.it/1pktw89
@vfxbackup
Mail Online
Carl Erik Rinsch - who directed the 2013 Keanu Reeves movie 47 Ronin - has been convicted of scamming Netflix out of $11M, spending the funds on cryptocurrency speculation and a shopping spree.
For the past year I've been working on a sequel to my 3D animation. I've finally finished it, and here it is!
https://www.youtube.com/watch?v=YnnYLKGQK58
https://redd.it/1pkymcq
@vfxbackup
YouTube
The Backrooms - Tape 2
A Journey Through The Backrooms - Tape 2 is a sequel to my well-received VHS found-footage short horror film from 2024.
Inspired by the original "Backrooms" image, @kanepixels The Oldest View, Darkwood, my everyday life, the streets of Wrocław and…
Notes with drawings are the best
Just a quick vent: when someone supplements a note on a shot with some drawings, it's the best. It's faster to understand, and my artist brain picks up on the note more easily. I don't know why more leads and supes don't do it. Or even verbally chat with me and walk through a shot, so I can see and follow along with what they're seeing that needs to change. I'm a visual creature; I thrive on visual communication.
I get quite a few written notes about a shot telling me what's wrong, then not much follow-up on what to change or how to fix it. So then I have to keep messaging, taking screenshots, and doing my own doodles to get more substance out of my lead/supervisor so I can figure out how to tackle the note.
Anyways, notes that come with a drawing/doodle are awesome. I'd like more of that for Christmas, thank you.
https://redd.it/1pl2svh
@vfxbackup
VFX Before Color Grading: How Does It Work With Flat Log Footage?
Most advice I’ve seen says VFX should be done before color grading.
What I’m still unclear on is how this works specifically with log footage, which is usually very flat. Since many VFX assets aren’t in log format, how are they typically integrated before color grading?
Do you usually:
- do some form of color correction on the log footage before VFX, or
- attempt to make VFX assets visually match the flat log plates?
Related question: how do you handle Rec.709 footage that is intentionally designed to appear flat or log-like? How is it typically integrated alongside standard Rec.709 VFX assets?
Curious what the standard professional workflow is here. I’m mainly asking because the idea of “reverse” color grading VFX assets just to make them fit flat footage feels off to me.
Thanks
https://redd.it/1pl8z49
@vfxbackup
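For context, the usual professional answer to the question above is to composite scene-linear: decode the log plate to linear light, match and merge the VFX elements there, then re-encode to log so the colorist receives an ungraded plate. The sketch below is illustrative only, not a production pipeline: it assumes the plate is ARRI LogC3 (EI 800) and hand-rolls that curve, whereas real pipelines pull these transforms from an OCIO config.

```python
import math

# Published ARRI LogC3 (EI 800) curve constants -- assumed here as the
# plate's encoding for illustration; verify against your camera/OCIO config.
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def logc3_to_linear(t: float) -> float:
    """Decode a LogC3 code value (0..1) to scene-linear light."""
    if t > E * CUT + F:
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E

def linear_to_logc3(x: float) -> float:
    """Re-encode scene-linear back to LogC3 for delivery to the grade."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

def comp_over_log_plate(plate_code: float, element_linear: float) -> float:
    """Merge a linear-light VFX element (e.g. additive fire/glow) over a
    log plate: decode, comp in linear, re-encode -- the element is matched
    in linear space rather than 'reverse graded' to look log."""
    lin = logc3_to_linear(plate_code) + element_linear
    return linear_to_logc3(lin)
```

Because the decode/encode pair round-trips losslessly (18% grey, linear 0.18, encodes to roughly 0.391 and decodes back to 0.18), comping in linear and re-encoding leaves the plate's gradeability intact.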
What were the most unhelpful client notes you ever got?
Stuff like “make it more sellable!” for example
https://redd.it/1plb5ln
@vfxbackup
Just tried the "Gemini Christmas Prompt" on Media.io
So I stumbled on Media.io’s new “Gemini AI Christmas Prompt” page (yes free!) and gave it a go today. I uploaded a selfie and used one of their Xmas templates. Within seconds I got back a cozy holiday-style portrait: it is amazing.
https://redd.it/1pleetq
@vfxbackup
Laptop for work on the go
What laptops are you all using, and are you happy with your choice?
I’m looking to replace a 15" 2018 MBP and since what I do 95% of the time is just remote into my Windows workstation I’m really not concerned with having a high spec laptop, but what I do really want is a decent screen.
Although I've enjoyed my MBP, I think they're too expensive, and I don't really care about brands or macOS vs Windows.
https://redd.it/1plfxlm
@vfxbackup
How to do VFX for Log footage
I am thinking of doing some VFX for my upcoming short film using Blender, but I have no idea what the workflow should be. Should I first do the final grade of my video, then do the VFX shots to match my grade? I'll be doing the editing, directing, and VFX myself.
I'm new to doing VFX on log footage, so please help me out! I have no idea if anyone's asked this already, so some help would be very nice - and please explain it to me simply 🙏.
https://redd.it/1plk6wa
@vfxbackup
AI use in recent trailers
What shots in recent trailers do you think were either done outright by AI, or at least had AI used in the design process? The bus from the Supergirl trailer looks a bit suspect, and some of the cars from the Fantastic Four trailer are ones where I think AI was used in the concept art.
https://redd.it/1plnu0w
@vfxbackup
Light catcher in Houdini
Hi all,
I'm using Houdini 21, with Karma/Solaris.
I have a scene with an explosion in contact with the ground. I added a point light inside the explosion for better control over which frames the light appears on.
The problem is that I can't have the ground geometry in the render, because I'm applying the explosion to a plate that already has a ground.
So I want a way to make my ground geometry receive the light from both the explosion and the point light, but not appear in the render - similar to disabling primary visibility in Maya. I want something like a shadow catcher, but for light.
I tried using the Render Geometry Settings node with the render visibility set to -primary, but that makes the ground disappear completely instead of catching the lights.
Thanks in advance!
https://redd.it/1plqxxo
@vfxbackup
How Movies Finally Made De-Aging Look Good
https://www.youtube.com/watch?v=BBNXj9TA4n0
https://redd.it/1plu4yj
@vfxbackup
YouTube
This Is How De Aging Got Good
De-aging and deepfakes have gotten a lot better over the past few years. This video goes into how they've gotten so good and the techniques that filmmakers use.
Sources:
Callum's Paper: https://www.callumstuart.com/vfx-personal/de-aging-with-deepfakes-bsc…
Question about CG Animation
I saw a post on here that was showing before and afters of How To Train Your Dragon, and one showed just a skeleton in the before, and then the full dragon model with its body in the after.
My question is: are animators actually animating just the skeleton, with another department then putting the body on and doing muscle sims and the like? Or is it just for show, for the breakdown?
https://redd.it/1plycpi
@vfxbackup
Hollywood panics as Paramount-Netflix battle for Warner Bros
https://www.bbc.co.uk/news/articles/c8dyy47qy82o
https://redd.it/1pm1o0b
@vfxbackup
BBC News
Interviews with dozens of actors, producers and camera crews reveal an industry attempting to weigh the lesser of two horrible choices.
Spent 2 months making a Blender Short Film
https://www.youtube.com/watch?v=oAMuNjTNquE
https://redd.it/1pmbkcb
@vfxbackup
YouTube
Daydreaming | Blender Short Film
Hi everyone,
I spent 2 months making this short film in Blender
(compositing in DaVinci Resolve).
No AI was used in the making of this film.
I would love to hear your opinion on this film, the storytelling, visuals and your interpretation. I hope everyone…
I'm automating Blender Camera Tracking (And I need your help)
I'm a huge fan of open source, especially Blender, but when it comes to its internal motion-tracking system, it's slow. So I created an addon that tries to automate the native tracking system. I've been somewhat successful in making it work for simple shots.
Here's me showcasing some of its features: https://youtu.be/NzI5vurW5C4
But as I explained in the video, I've fine-tuned the system for different types of shots, and the addon learns the settings locally by adapting them using math.
But to truly make it smart, I need to create a neural network(AI) that follows the patterns pros follow while they manually track. For that, I need data, which you can help provide.
If you are interested in trying out my automated addon, please check out the video link above and the Github link: https://github.com/usamasq/autosolve
https://redd.it/1pmfouq
@vfxbackup
YouTube
AutoSolve: The Open Source AI Camera Tracker for Blender (Research Beta)
Download AutoSolve (Free/Pay-What-You-Want): https://usamasq.gumroad.com/l/autosolve
Source Code on GitHub: https://github.com/usamasq/autosolve
Reddit Post: https://www.reddit.com/r/blender/comments/1pgg0na/im_tired_of_telling_my_students_to_use_other/…
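For context, Blender's own Python API already exposes the pieces such an addon automates. A bare-bones version of the detect → track → solve loop might look roughly like this; the operator and property names are real Blender API, but the thresholds and sizes are illustrative guesses (exactly the knobs the addon would tune per shot), and the clip operators need a Movie Clip Editor context to run.

```python
# Sketch of driving Blender's native tracker from Python.
# The import is guarded so the logic can be read and tested outside Blender.
try:
    import bpy
except ImportError:
    bpy = None

def auto_track_and_solve(pattern_size=21, search_size=71,
                         motion_model='Loc', error_limit=0.5):
    """Detect features, track them forward through the clip, solve the camera.
    Returns the average solve error, or None when not running inside Blender.
    Must be called with a Movie Clip Editor area as the active context."""
    if bpy is None:
        return None

    clip = bpy.context.edit_movieclip
    settings = clip.tracking.settings
    # Per-shot defaults an adaptive addon would be tuning.
    settings.default_pattern_size = pattern_size
    settings.default_search_size = search_size
    settings.default_motion_model = motion_model  # 'Loc', 'Affine', 'Perspective', ...

    bpy.ops.clip.detect_features(threshold=0.3, min_distance=120)
    bpy.ops.clip.track_markers(backwards=False, sequence=True)  # forward pass
    bpy.ops.clip.solve_camera()

    solve_error = clip.tracking.objects.active.reconstruction.average_error
    if solve_error > error_limit:
        # A smarter pass would prune high-error tracks here and re-solve.
        pass
    return solve_error

# Outside Blender this degrades gracefully:
print(auto_track_and_solve())  # None when bpy is unavailable
```

The interesting part of an addon like AutoSolve is everything this sketch leaves out: choosing the settings per shot and iterating on the solve error automatically instead of by hand.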
Open to help with keying
Hi!
I worked as a junior compositing artist for two years and have been actively back in compositing for the last 6 months.
I'm currently looking to strengthen my showreel and would be happy to help with challenging keying shots on an unpaid basis, as long as I'm allowed to use the shot on my showreel.
If you are working on an indie project, short film, or any other production that needs solid keying - feel free to reach out.
Thank you!
https://redd.it/1pmg7bw
@vfxbackup
Any Budget VFX Asset Packs?
I'm an indie filmmaker making low-budget short films. Is there an all-in-one 2D VFX asset pack (muzzle flashes, explosions, smoke, fire, bullet impacts, blood splatters, etc.) available for around €50? I don't know if that's a realistic price, so don't judge
https://redd.it/1pmi7dz
@vfxbackup
To what extent is AI being used in movies and TV shows now?
I know it's been used for things like deepfakes for a while, but is it now used for more things like explosions, extras, creatures, etc.?
Also, at the rate AI is going, how much of the work previously done by humans on set and in post is being replaced by AI?
https://redd.it/1pmlvcz
@vfxbackup
Tips for optimizing VDB renders in Renderman/Katana?
I'm getting my ass kicked by some really dense VDB volumes that are being delivered to me in lighting. Adjusting dice settings helps, as does reducing the density, but I can only reduce the density so much before it really starts compromising the look, and I'm still not at a realistic render time. I tried a lot of other things without luck - does anyone have any magic bullets for helping with this? It doesn't help that the volumes are further being refracted through other surfaces, but the volumes are crazy slow on their own even when isolated for testing. I'm also open to suggestions that I could pass up the chain to FX to help.
https://redd.it/1pmoj0b
@vfxbackup