Laptop for work on the go
What laptops are you all using, and are you happy with your choice?
I’m looking to replace a 15" 2018 MBP, and since what I do 95% of the time is just remote into my Windows workstation, I’m really not concerned with having a high-spec laptop, but what I do really want is a decent screen.
Although I’ve enjoyed my MBP, I think they’re too expensive, and I don’t really care about brands or macOS vs Windows.
https://redd.it/1plfxlm
@vfxbackup
How to do VFX for Log footage
I am thinking of doing some VFX for my upcoming short film using Blender, but I have no idea what the workflow should be. Should I first do the final grade of my video and then do the VFX shots to match my grade? I will be doing the editing, directing, and VFX myself.
I'm new to doing VFX on log footage, so please help me out! I have no idea if anyone's asked this already, so some help would be very nice, and please explain it to me simply 🙏.
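The usual answer is to keep CG and comp work in a scene-linear working space and save the creative grade for last. Below is a minimal Blender sketch of that ordering; it illustrates common practice rather than anything from the post, and the file path and colorspace name are placeholders that depend on the OCIO config Blender is running with (the default config does not include camera-vendor log spaces such as S-Log3).

```python
import bpy

# Tag the log plate with its input colorspace so Blender converts it to scene-linear internally.
clip = bpy.data.movieclips.load("/path/to/plate.mov")   # hypothetical path
clip.colorspace_settings.name = "Filmic Log"            # placeholder; use your footage's log space

scene = bpy.context.scene
scene.view_settings.view_transform = "Standard"         # keep the view neutral while matching the plate
scene.render.image_settings.file_format = "OPEN_EXR"    # deliver linear EXRs and grade at the very end
scene.render.image_settings.color_depth = "32"
```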
https://redd.it/1plk6wa
@vfxbackup
AI use in recent trailers
What shots in recent trailers do you think were either done by AI, or at least had AI used in the design process? The bus from the Supergirl trailer looks a bit suspect, and some of the cars from the Fantastic Four trailer look like AI was used in the concept art.
https://redd.it/1plnu0w
@vfxbackup
Light catcher in Houdini
Hi all,
I'm using Houdini 21, with Karma/Solaris.
I have a scene with an explosion in contact with the ground. I added a point light inside the explosion to have better control over which frames this light appears on.
The problem is that I can't have the ground geometry in the render, because I'm applying the explosion to a plate that already has a ground.
So, I want a way to make my ground geometry receive the light from both the explosion and the point light, but not appear in the render, similar to when you disable primary visibility in Maya. I want something like a shadow catcher, but for light.
I tried using the Render Geometry Settings node with the render visibility set to -primary, but it makes the ground disappear completely instead of catching the lights.
Thanks in advance!
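For reference, the same per-prim visibility control that the Render Geometry Settings LOP writes can also be authored from a Python LOP, which makes it easier to experiment with token combinations. This is a hedged sketch, not a confirmed fix: the prim path is hypothetical, the "-primary" value is just the one already tried in the post, and the exact tokens that keep a prim contributing to bounce light and shadows while hidden from camera should be checked against the Karma documentation for your build.

```python
import hou
from pxr import Sdf, UsdGeom

node = hou.pwd()                      # run inside a Python LOP
stage = node.editableStage()

ground = stage.GetPrimAtPath("/geo/ground_plane")   # hypothetical prim path
pv = UsdGeom.PrimvarsAPI(ground).CreatePrimvar(
    "karma:object:rendervisibility", Sdf.ValueTypeNames.String)
pv.Set("-primary")   # hide from camera rays only; adjust the token list as needed
```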
https://redd.it/1plqxxo
@vfxbackup
How Movies Finally Made De-Aging Look Good
https://www.youtube.com/watch?v=BBNXj9TA4n0
https://redd.it/1plu4yj
@vfxbackup
YouTube
This Is How De Aging Got Good
De-aging and deepfakes have gotten a lot better over the past few years. This video goes into how it's gotten so good and the techniques that the filmmakers use.
Sources:
Callum's Paper: https://www.callumstuart.com/vfx-personal/de-aging-with-deepfakes-bsc…
Question about CG Animation
I saw a post on here that was showing before and afters of How To Train Your Dragon, and one showed just a skeleton in the before, and then the full dragon model with its body in the after.
My question is: are animators actually animating just the skeleton, with another department then putting the body on and doing a muscle sim and so on? Or is it just for show, for the breakdown?
https://redd.it/1plycpi
@vfxbackup
Hollywood panics as Paramount-Netflix battle for Warner Bros
https://www.bbc.co.uk/news/articles/c8dyy47qy82o
https://redd.it/1pm1o0b
@vfxbackup
BBC News
Hollywood panics as Paramount-Netflix battle for Warner Bros
Interviews with dozens of actors, producers and camera crews reveal an industry attempting to weigh the lesser of two horrible choices.
Spent 2 months making a Blender Short Film
https://www.youtube.com/watch?v=oAMuNjTNquE
https://redd.it/1pmbkcb
@vfxbackup
YouTube
Daydreaming | Blender Short Film
Hi everyone,
I spent 2 months making this short film in Blender
(compositing in DaVinci Resolve).
No AI was used in the making of this film.
I would love to hear your opinion on this film, the storytelling, visuals and your interpretation. I hope everyone…
I'm automating Blender Camera Tracking (And I need your help)
I'm a huge fan of open source, especially Blender, but when it comes to its internal motion tracking system, it's slow. So I created an addon that tries to automate the native tracking system inside of it. I have been somewhat successful by making it work for simple shots.
Here's me showcasing some of its features: https://youtu.be/NzI5vurW5C4
But as I explained in the video, I've fine-tuned the system for different types of shots, and the addon learns the settings locally by adapting them mathematically.
But to truly make it smart, I need to create a neural network (AI) that follows the patterns pros follow while they manually track. For that, I need data, which you can help provide.
If you are interested in trying out my automated addon, please check out the video link above and the GitHub link: https://github.com/usamasq/autosolve
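For readers curious what "automating the native tracking system" touches under the hood, here is a rough sketch of the Blender hooks involved. It is not the addon's code: the file path is a placeholder and the settings values are arbitrary examples of the kind of per-shot parameters the addon adapts.

```python
import bpy

clip = bpy.data.movieclips.load("/path/to/shot.mp4")   # hypothetical path
settings = clip.tracking.settings
settings.default_pattern_size = 31       # example values; per-shot tuning of these
settings.default_search_size = 151       # is exactly what the addon adapts
settings.default_motion_model = 'LocRotScale'

# The clip operators need a Clip Editor context, hence the override.
area = next(a for a in bpy.context.screen.areas if a.type == 'CLIP_EDITOR')
area.spaces.active.clip = clip
region = next(r for r in area.regions if r.type == 'WINDOW')

with bpy.context.temp_override(area=area, region=region):
    bpy.ops.clip.detect_features()
    bpy.ops.clip.track_markers(backwards=False, sequence=True)
    bpy.ops.clip.solve_camera()

print("average solve error:", clip.tracking.reconstruction.average_error)
```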
https://redd.it/1pmfouq
@vfxbackup
YouTube
AutoSolve: The Open Source AI Camera Tracker for Blender (Research Beta)
Download AutoSolve (Free/Pay-What-You-Want): https://usamasq.gumroad.com/l/autosolve
Source Code on GitHub: https://github.com/usamasq/autosolve
Reddit Post: https://www.reddit.com/r/blender/comments/1pgg0na/im_tired_of_telling_my_students_to_use_other/…
Open to help with keying
Hi!
I worked as a junior compositing artist for two years and have been actively back in compositing for the last 6 months.
I am currently looking to strengthen my showreel and would be happy to help with challenging keying shots on an unpaid basis, as long as I am allowed to use the shot in my showreel.
If you are working on an indie project, short film, or any other production that needs solid keying - feel free to reach out.
Thank you!
https://redd.it/1pmg7bw
@vfxbackup
Any Budget VFX Asset Packs?
I'm an indie filmmaker making low-budget short films. Is there an all-in-one 2D VFX asset pack (muzzle flashes, explosions, smoke, fire, bullet impacts, blood splatters, etc.) available for around €50? I don't know if that's a realistic price, so don't judge.
https://redd.it/1pmi7dz
@vfxbackup
To what extent is AI being used in movies and TV shows now?
I know it's been used for things like deepfakes for a while, but is it now used for more things like explosions, extras, creatures, etc.?
Also, at the rate AI is going, how much of the work previously done by humans on set and in post is being replaced by AI?
https://redd.it/1pmlvcz
@vfxbackup
Tips for optimizing VDB renders in Renderman/Katana?
I'm getting my ass kicked by some really dense VDB volumes that are being delivered to me in lighting. Adjusting dice settings helps, as does reducing the density, but I can only reduce the density so much before it really starts compromising the look, and I'm still not at a realistic render time. I tried a lot of other things without luck - does anyone have any magic bullets for helping with this? It doesn't help that the volumes are further being refracted through other surfaces, but the volumes are crazy slow on their own even when isolated for testing. I'm also open to suggestions that I could pass up the chain to FX to help.
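One concrete request that could be passed up the chain to FX is a pre-pruned delivery: deactivate near-zero density voxels and prune the tree before the VDB leaves FX, so the renderer has fewer active voxels to dice. A sketch using pyopenvdb is below; the file names, grid name, and density cutoff are assumptions, and the cutoff would need look checks against the original.

```python
import pyopenvdb as vdb

grids, meta = vdb.readAll("explosion_dense.vdb")         # hypothetical file name
for grid in grids:
    if grid.name == "density":                           # assumes the usual grid name
        acc = grid.getAccessor()
        # Collect voxel-level active values below the cutoff, then deactivate them.
        kill = [item.min for item in grid.iterOnValues()
                if item.count == 1 and item.value < 0.005]   # cutoff is an arbitrary example
        for ijk in kill:
            acc.setValueOff(ijk, 0.0)
        grid.prune()                                     # collapse now-uniform nodes
vdb.write("explosion_pruned.vdb", grids=grids)
```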
https://redd.it/1pmoj0b
@vfxbackup
What’s a common VFX mistake/error you hate seeing?
For me, I just hate seeing a shot where someone is using the rotobrush and it's glitching out during the scene. Once you see it, it's so hard to unsee.
https://redd.it/1pmuk86
@vfxbackup
How to build an actual 'creative' portfolio?
So I've been editing videos for a living for about a year now. I've done all sorts of editing - from talking head videos to live events recap. I'm currently working with various agencies and mostly edit talking head style videos based on the brief I'm given.
One of the career-related issues I'm facing is that I'm not able to navigate my path properly. While producing content for these big brands is good and I'm getting paid a decent amount, it has started to feel stagnant. All of this feels very 'corporate' to me. While I'm grateful that my proficiency in Premiere Pro and After Effects lets me make these videos in the first place, I still want to upskill myself and get into more 'creative' work.
For the last few months I've been learning VFX on the side and I've made 2-3 visualizers and edited a music video just to test my skill and I seem to really like this type of work. Though, my biggest problem is that I don't know how to get more into this field of creative work. I come across so many of these creatives who are actively directing and/or editing music videos or making cool visualizers and I wish to do the same, but I don't know how. Most of these people already have an established portfolio because obviously, they've been doing this stuff for a long time but I wanna know what would be a good starting point?
How do I actually go about making a 'creative' portfolio, or better yet, how do I make a 'niche' portfolio? Do I just start making random visualizers and re-edit music videos in my style and build a portfolio out of that?
Any piece of advice or opinion would be appreciated on this! Really struggling with how I wanna navigate my path into this field and make best use of my creative talents.
https://redd.it/1pmpmw0
@vfxbackup
VFX History: Star Trek, The Next Generation
https://www.youtube.com/watch?v=o3r2EGlsaYY
https://redd.it/1pn2baa
@vfxbackup
YouTube
The VFX History of Star Trek: The Next Generation
A look at the visual effects team behind the iconic television show and how they used practical effects, model photography and artistry.
This was the TV show that first sparked my interest in visual effects and over the years, I've come back to it time and…
Is it possible to distort a Paint node in Mari?
Hi, I'm a texture artist who is switching from Substance to Mari. I was wondering if it's possible to do a simple warp deform on a Paint node, and not just on noise and tile nodes by plugging in the distorted UVs. In Substance, it's very common in the workflow to deform and filter anything, and I'd like to get as close to that as possible, limitations permitting. Thanks for your help.
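As a side note on what the warp itself amounts to, independent of whether Mari exposes it on Paint nodes: it is just resampling the painted texture through a distorted UV map. A hypothetical illustration outside Mari follows, with made-up file names and an arbitrary distortion.

```python
import cv2
import numpy as np

paint = cv2.imread("baked_paint_layer.png")        # hypothetical export of the Paint node
h, w = paint.shape[:2]

# Identity UVs perturbed by a simple sine-based offset (stand-in for any distortion map).
xs, ys = np.meshgrid(np.arange(w, dtype=np.float32), np.arange(h, dtype=np.float32))
warped = cv2.remap(paint, xs + 8.0 * np.sin(ys / 40.0), ys, interpolation=cv2.INTER_LINEAR)

cv2.imwrite("baked_paint_layer_warped.png", warped)
```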
https://redd.it/1pn7l96
@vfxbackup
Looking for Photogrammetry Artists to Collaborate on a Free Asset Library
https://redd.it/1pnhe01
@vfxbackup
Is this frowned upon?
https://reddit.com/link/1pnhcqm/video/zaxsfejc7f7g1/player
Let me start off by saying that I, like most, am not really a big fan of generative AI stuff in general. It's hard to go to any social platform like Facebook, TikTok, Instagram, etc. and not find yourself scrolling and swiping past a plethora of AI content. Maybe those things were impressive when they came out, for like a minute, but after the oversaturation of it, after your 87th video of a cow coming out of a toaster and landing on a surfboard going around the Statue of Liberty, your brain just shuts it out.
But it got me thinking: what is the general consensus on using AI the way one would use VFX/CGI, where it's only there to achieve certain effects for shots or projects that aren't entirely AI-generated? Do we just care about the end result? Or if the end result is good, but you know it's AI, do you just dislike it by default?
This video is not perfect. It was an experiment I tried while marketing my electronic music on TikTok, as opposed to just uploading album art and slapping a track on it in the hope that it would make someone stop scrolling. It was my first attempt to integrate the AI shots between footage I took at the park without prompting it to do anything to the environment. I just wanted to see if I could get the frame of me jumping in mid-air to "keep going" until I landed and stay consistent.
I know that any person with even the most basic VFX knowledge can pinpoint inaccuracies in the video. The feet, the grass warping, and so on are all there, and again, it's not meant to be taken seriously, as it was just an experiment, but the consistency is impressive enough to me that it got me thinking about all this. Is there acceptable usage of AI in film in scenarios like this?
I'm guessing that for shots like those in the Star Wars and Fast and Furious series, Paul Walker is a digital double and they just track-and-paste his face onto the actor on set, right? AI was not used to just put his face in there, the way one can do now with all the AI sites out there?
https://redd.it/1pnhcqm
@vfxbackup
McDonald's Pulls Down AI-Generated Holiday Ad After Deluge of Mockery
https://futurism.com/artificial-intelligence/mcdonalds-ai-generated-commercial
https://redd.it/1pnk5sw
@vfxbackup
Futurism
McDonald's Pulls Down AI-Generated Holiday Ad After Deluge of Mockery
McDonald's Netherlands is catching flak for a stupefying AI-generated video, which was roundly condemned on social media.