r_vfx Subreddit VFX Reddit r/vfx Backup by AppleEditing (AE) on Telegram
21 subscribers
2K photos
2.02K videos
17.7K links
r/vfx subreddit backup on Telegram. A Backup Project by @RoadToPetabyte and @Reddit2telegram Join our subreddit backup on Discord and Pinterest: https://discord.gg/abCudZwgBr. Other Telegram backup https://discord.gg/jsAxt4rUCB
Any Budget VFX Asset Packs?

I'm an indie filmmaker making low-budget short films. Is there an all-in-one 2D VFX asset pack (muzzle flashes, explosions, smoke, fire, bullet impacts, blood splatters, etc.) available for around €50? I don't know if that's a realistic price, so don't judge.

https://redd.it/1pmi7dz
@vfxbackup
To what extent is AI being used in movies and TV shows now?

I know it's been used for things like deepfakes for a while, but is it now used for more things like explosions, extras, creatures, etc.?

Also, at the rate AI is going, how much of the work previously done by humans on set and in post is being replaced by AI?

https://redd.it/1pmlvcz
@vfxbackup
Tips for optimizing VDB renders in RenderMan/Katana?

I'm getting my ass kicked by some really dense VDB volumes that are being delivered to me in lighting. Adjusting dice settings helps, as does reducing the density, but I can only reduce the density so much before it really starts compromising the look, and I'm still not at a realistic render time. I tried a lot of other things without luck - does anyone have any magic bullets for helping with this? It doesn't help that the volumes are further being refracted through other surfaces, but the volumes are crazy slow on their own even when isolated for testing. I'm also open to suggestions that I could pass up the chain to FX to help.
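To be concrete about the "reducing the density" part, this is roughly the kind of pre-pass I mean, sketched with the pyopenvdb bindings (the file names and numbers are placeholders, and the mapOn/prune calls should be double-checked against your OpenVDB build, so treat it as an illustration rather than a recipe):

    import pyopenvdb as vdb

    SCALE = 0.6   # global density multiplier (placeholder value)
    CAP = 4.0     # clamp the hottest voxels that drive dicing cost up

    # readAll returns (list of grids, file metadata); file names are placeholders
    grids, metadata = vdb.readAll("pyro_dense.vdb")

    for grid in grids:
        if grid.name != "density":
            continue
        # Remap every active voxel: scale it down, then clamp extreme values
        grid.mapOn(lambda v: min(v * SCALE, CAP))
        # Collapse now-uniform regions back into tiles to shrink the tree
        grid.prune()

    vdb.write("pyro_thinned.vdb", grids=grids)

The obvious downside is that this nudges the look FX signed off on, which is why I'd still prefer a dicing/render-settings answer if one exists.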

https://redd.it/1pmoj0b
@vfxbackup
What’s a common VFX mistake/error you hate seeing?

For me, I just hate seeing a shot where someone is using rotobrush and it's glitching out during the scene. Once you see it, it's so hard to unsee.

https://redd.it/1pmuk86
@vfxbackup
How to build an actual 'creative' portfolio?

So I've been editing videos for a living for about a year now. I've done all sorts of editing - from talking head videos to live event recaps. I'm currently working with various agencies and mostly edit talking-head-style videos based on the brief I'm given.

One of the career-related issues I'm facing is that I'm not able to navigate my path properly. Producing content for these big brands is good and I'm getting paid a decent amount, but it has started to feel stagnant. All of it feels very 'corporate' to me. While I'm grateful that my proficiency in Premiere Pro and After Effects lets me make these videos in the first place, I still want to upskill and get into more 'creative' work.

For the last few months I've been learning VFX on the side, and I've made 2-3 visualizers and edited a music video just to test my skills, and I seem to really like this type of work. My biggest problem, though, is that I don't know how to get further into this field of creative work. I come across so many creatives who are actively directing and/or editing music videos or making cool visualizers, and I wish to do the same, but I don't know how. Most of these people already have an established portfolio because, obviously, they've been doing this for a long time, but what would be a good starting point?

How do I actually go about making a 'creative' portfolio, or better yet, how do I make a 'niche' portfolio? Do I just start making random visualizers and re-edit music videos in my style and build a portfolio out of that?

Any advice or opinions would be appreciated! I'm really struggling with how to navigate my path into this field and make the best use of my creative talents.

https://redd.it/1pmpmw0
@vfxbackup
Is it possible to distort a Paint node in Mari?

Hi, I'm a texture artist who is switching from Substance to Mari. I was wondering if it's possible to do a simple warp deform on a Paint node, and not just on noise and tile nodes by plugging in the distorted UVs. In Substance, it's very common in the workflow to deform and filter anything, and I'd like to get as close to that as possible, limitations permitting. Thanks for your help.


https://redd.it/1pn7l96
@vfxbackup
Looking for Photogrammetry Artists to Collaborate on a Free Asset Library
https://redd.it/1pnhe01
@vfxbackup
Is this frowned upon?

https://reddit.com/link/1pnhcqm/video/zaxsfejc7f7g1/player

Let me start off by saying that I, like most, am not really a big fan of generative AI stuff in general. It's hard to go on any social platform like Facebook, TikTok, Insta, etc. without scrolling and swiping past a plethora of AI content. Maybe those things were impressive for a minute when they first came out, but after the oversaturation, after your 87th video of a cow coming out of a toaster, landing on a surfboard, and going around the Statue of Liberty, your brain just shuts it out.

But it got me thinking: what is the general consensus on using AI the way one would use VFX/CGI, only to achieve certain effects in shots or projects that aren't completely AI-generated? Do we just care about the end result? Or if the end result is good but you know it's AI, do you dislike it by default?

This video is not perfect. It was an experiment I tried while marketing my electronic music on TikTok, as opposed to just uploading album art and slapping a track on it in the hope that it would make someone stop scrolling. It was my first attempt to integrate AI shots between footage I took at the park, without prompting it to do anything to the environment. I just wanted to see if I could get the frame of me jumping in mid-air to "keep going" until I landed, and stay consistent.

I know that anyone with even the most basic VFX knowledge can pinpoint inaccuracies in the video. The feet, the grass warping, and so on are all there, and again, it's not meant to be taken seriously, it was just an experiment, but the consistency is impressive enough to me that it got me thinking about all this. Is there acceptable usage of AI in film in scenarios like this?

I'm guessing that for shots like those in the Star Wars and Fast and Furious series, Paul Walker is a digital double and they just tracked and pasted his face onto the actor on set, right? AI wasn't used to just put his face in there, the way one can do now with all the AI sites out there?

https://redd.it/1pnhcqm
@vfxbackup
Small Render Farm Early Access

Hello,

I’m a hardware nerd, and I’m teaming up with a VFX artist to start a render farm. Everything is ready to go, just gotta turn the farm on at this point.

We are looking for a few artists like yourselves to be the first ones to use the farm. We will be starting small, but it will be a fully functional farm, with everything from professional-grade GPUs to 256-bit data encryption.

We know we will be slow in the beginning and aren't competing with the big firms out there, but we're here to offer our services to artists who need some extra GPU power/time without having to wait hours in a queue.

While, yes, this is an advertisement, we're here merely to say hello. If you're interested in getting in the door at an early-bird price point, send me a DM or comment below.

Thanks!

https://redd.it/1pnp6ak
@vfxbackup