r_vfx Subreddit VFX Reddit r/vfx Backup by AppleEditing (AE) on Telegram
r/vfx subreddit backup on Telegram. A backup project by @RoadToPetabyte and @Reddit2telegram. Join our subreddit backup on Discord and Pinterest: https://discord.gg/abCudZwgBr. Other Telegram backups: https://discord.gg/jsAxt4rUCB
Help VFX Newbie

Hello, I'm currently working on a project for school and can't figure out a proper solution.
Basically there are two shots: one has people dressed as ghosts, and the other has just a cloth falling to the ground. The task is to combine the two shots so it looks like the ghost falls to the floor and stops existing (leaving just a cloth on the ground).
We need to work in DaVinci Resolve Fusion, and my plan was to rotoscope the ghost and the falling cloth, grid warp them into each other for a smoother transition, and then put the new footage over a clean background. Can anybody share how one would go about this, or point out any major flaws in my approach?
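
For clarity, here is roughly the math I think that node tree boils down to: a minimal NumPy sketch of a roto'd, grid-warped ghost dissolving into the cloth over a clean plate. The array names are placeholders (RGB arrays are H×W×3, alphas H×W); this is not Fusion code, just the underlying compositing arithmetic.

```python
import numpy as np

def over(fg_rgb, fg_a, bg_rgb):
    # Standard premultiplied "A over B", i.e. what a Merge does:
    # foreground plus background attenuated by the foreground alpha.
    return fg_rgb + (1.0 - fg_a[..., None]) * bg_rgb

def ghost_to_cloth(ghost_rgb, ghost_a, cloth_rgb, cloth_a, clean_bg, t):
    # t runs 0 -> 1 over the handoff frames: 0 = warped ghost, 1 = cloth.
    # Both elements are assumed roto'd (alpha channels) and already
    # grid-warped so their silhouettes roughly line up during the blend.
    fg_rgb = (1.0 - t) * ghost_rgb * ghost_a[..., None] \
             + t * cloth_rgb * cloth_a[..., None]
    fg_a = (1.0 - t) * ghost_a + t * cloth_a
    return over(fg_rgb, fg_a, clean_bg)
```

If that holds up, it should be essentially what the roto masks, a GridWarp on each element, a dissolve between them, and a Merge over the clean plate would be doing in Fusion.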

Thanks in advance :)



https://redd.it/1pobx5v
@vfxbackup
Are you still using EbSynth, or was it replaced by AI?

Hi, I’m Šárka, one of the devs of EbSynth (the one from the tutorial video). We recently released a new version and I’d like to get some honest feedback from VFX artists on where to take it next.

Quick recap: EbSynth is a VFX tool that lets you edit/paint one video frame and then propagates the look across the shot. It doesn't use AI for the propagation. It's practical for rotoscoping, cleanups, stylized looks, etc. The new version has an interactive user interface that lets you paint directly on videos. Kind of like a light merge between Photoshop and After Effects.

Right now, we're trying to navigate this AI era and figure out how to make EbSynth useful for you. So, here's what I'd like to learn from you:

Do you use EbSynth in your workflow?
For what tasks? Does it solve any problem for you?
Are you using any AI video editing tools in production?
If you don’t use EbSynth, what would it need to be worth using?

I'd appreciate any blunt feedback. Also, feel free to ask me anything :)

Thank you so much!

https://redd.it/1pofxpg
@vfxbackup
Meta gig update?

Hi there. Just wondering if anyone knows anyone who got hired for this gig. So far I only know of one lol. Thanks!

https://redd.it/1pol7lp
@vfxbackup
Google Bows to the Mouse (abides by Cease and Desist Order to purge Disney infringement from Veo)

https://gizmodo.com/google-has-taken-down-ai-generated-content-following-disneys-cease-and-desist-2000698254


Disney’s cease and desist letter to Google was delivered on Wednesday, before the announcement of its OpenAI deal, according to Deadline.

Disney CEO Bob Iger told CNBC the cease and desist letter came after past conversations with Google about this material had been fruitless. At least some of the material mentioned in the Disney cease and desist order was generated with Google’s Veo, according to the trades. Google was apparently hosting IP from the Star Wars and Simpsons franchises, along with material featuring the auxiliary Marvel Cinematic Universe character Deadpool. Mickey Mouse was also on the list of Disney properties subject to the cease and desist, per Variety.

The episode follows OpenAI agreeing to a three-year licensing deal with Disney covering 200 specific characters and designs of a masked or anthropomorphic nature, without voice or human-talent data.



https://redd.it/1popako
@vfxbackup
My head hurts (and my eyes!)

OK, I have to write all this out so it's understood:

I just want to do reliable blemish removal over long clips with a lot of head movement. Over the past year I have bashed my head against the wall trying to make this as painless as possible, but to no avail. This is basically the summary:

I'm not looking for VFX-artist purity.

People feel insulted when I tell them that rotoscoping, planar tracking, and all those other high-and-mighty approaches are slow and a headache.

Yet I feel insulted every time they think I'm stupid because I can't get a simple blemish to disappear in a 3-minute clip without grinding myself toward an asylum.

I want speed, stability, and low mental load. I don't do this for a living, but I have to dive into blemish tracking every other month.

Is there really no app that:

* tracks as easily as Resolve,
* tracks as robustly as Mocha,
* and does it without node graphs, planar setups, precomp hell, or babysitting.

I'm feeling stuck.

Resolve’s idea is right (fast, timeline-based, minimal setup), but its tracker falls apart, while Fusion / Mocha / Silhouette / Nuke are “correct” but way too slow and brain-destroying just to fix one moving blemish over minutes of footage.

I'm not looking for “VFX-artist purity”; I want speed, stability, and low mental load.
Is there no other offline-based software that can do this without getting my eyes skewed for life?
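
(For scale of what I mean by "simple": the whole idea, tracking one small box and blending a clean patch over it every frame, is only a sketch's worth of code with OpenCV. This assumes opencv-contrib-python is installed; the file names, box, and patch source below are placeholders, and a real tool would still need re-tracking and grain handling.)

```python
# Sketch of "track a box, blend a clean patch over it each frame".
# Assumes opencv-contrib-python; file names and coordinates are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")                 # hypothetical input clip
ok, frame = cap.read()

blemish_box = (640, 320, 40, 40)                   # x, y, w, h around the blemish
x, y, w, h = blemish_box
clean_patch = frame[y - h:y, x:x + w].copy()       # grab nearby clean skin on frame 1

tracker = cv2.TrackerCSRT_create()
tracker.init(frame, blemish_box)

out = cv2.VideoWriter("clip_fixed.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"),
                      cap.get(cv2.CAP_PROP_FPS),
                      (frame.shape[1], frame.shape[0]))

while ok:
    found, box = tracker.update(frame)
    if found:
        bx, by, bw, bh = (int(v) for v in box)
        patch = cv2.resize(clean_patch, (bw, bh))
        mask = 255 * np.ones(patch.shape[:2], dtype=np.uint8)
        center = (bx + bw // 2, by + bh // 2)
        # seamlessClone blends the patch into the surrounding skin tones
        frame = cv2.seamlessClone(patch, frame, mask, center, cv2.NORMAL_CLONE)
    out.write(frame)
    ok, frame = cap.read()

cap.release()
out.release()
```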

https://redd.it/1poyoqj
@vfxbackup
Is it reasonable to work in the VFX industry in Turkey?

As my question states, there is a lot of competition abroad, but in Turkey there are only one or two studios, and recently they have started producing good work (1000Volt). Since competition is lower in Turkey, do you think this sector can be considered a long-term career?

https://redd.it/1poww8j
@vfxbackup
Jim Cameron: AI for millionaire actors and directors? Never. AI for VFX? Now that’s ok since VFX is too expensive

https://preview.redd.it/16giqa6l4t7g1.png?width=1696&format=png&auto=webp&s=441ac31df29f83c1a90f566d88f25b1b80460406

Curious to hear what others think about this

https://redd.it/1pp415o
@vfxbackup
Is the Vertex Selection on the PointCloudGenerator node bugged?

Because of it, I haven't been able to select vertices to create a mesh or to match the position of a card. Am I doing something wrong, or is there any other way I can match the position?
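
Not sure if it helps, but here is a crude fallback sketch for the Script Editor, assuming the card just needs its transform set to an averaged point position read off the cloud by hand (the coordinates and node name are placeholders):

```python
# Nuke Script Editor. Crude fallback: average a few point positions
# read manually from the point cloud and drop the Card there.
import nuke

pts = [(1.21, 0.34, -2.10),   # hypothetical point positions near the target surface
       (1.25, 0.31, -2.07),
       (1.18, 0.36, -2.14)]

cx = sum(p[0] for p in pts) / len(pts)
cy = sum(p[1] for p in pts) / len(pts)
cz = sum(p[2] for p in pts) / len(pts)

card = nuke.toNode('Card1')                 # placeholder node name
card['translate'].setValue([cx, cy, cz])
```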

https://redd.it/1pp5ns0
@vfxbackup
Were there earlier versions of Peter Jackson's Hobbit virtual filming CGI?

I don't know what you'd call it, but I remember from the BTS on The Hobbit how Peter Jackson had a camera with which he filmed an empty studio that had motion-tracking sensors placed around it. The footage would translate into real-time camera moves on a virtual set you could see on screen. I also remember it being used for the mockumentary shaky-camera style in Surf's Up (2007). Whatever this tech is called, who pioneered its use? Was Surf's Up one of the first?

https://redd.it/1pp9buv
@vfxbackup
What's it like working in the industry?

Hi,
I just wanted to know what it's like to work in animation, video games, or VFX.
What's a typical day like, what's it like to look for a job, or what's it like to connect with people in the industry?
Are you able to make a living doing what you love?

I'm just curious.

https://redd.it/1pp7503
@vfxbackup
How should I be doing stars?

What I'm really asking is: how do I do high-quality, sharp stars at 4K resolution?

In Nuke, I'm starting with an 8K latlong noise with the values crunched, which produces my stars and is then projected onto a sphere.

The problem is that some of my bigger stars are up to 7 pixels wide and ultimately look too soft. Even if I bump the noise to 16K (which becomes too much for my potato PC) the stars still feel soft. I also tried turning off filtering in my scanline, but the results are questionable.

I recognize the problem: since the stars are projected onto a sphere, the camera is only viewing a small section of the 8K star element. Is my only option to just increase the resolution of the star element? Is there no other way?
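
Putting rough numbers on that hunch (the camera FOV and output resolution below are just assumed for illustration):

```python
latlong_width = 8192                 # 8K latlong spans 360 degrees horizontally
deg_per_texel = 360.0 / latlong_width           # ~0.044 deg per star-map pixel

cam_hfov_deg = 50.0                  # assumed horizontal field of view
out_width = 4096                     # 4K output width
deg_per_out_px = cam_hfov_deg / out_width       # ~0.012 deg per rendered pixel (rough)

# How many rendered pixels one star-map texel gets stretched across:
print(deg_per_texel / deg_per_out_px)           # ~3.6
```

So each star-map texel already covers roughly 3-4 output pixels before any filtering, and doubling the map to 16K only halves that, which matches what I'm seeing.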

https://redd.it/1ppec7d
@vfxbackup
Hiring - VFX Artist/editor for Stream Promo Content!

Hey everyone!

I’m a streamer looking for a VFX artist/editor who can help push the visuals of my content through high-quality promo videos and intros, specifically compositing me into videos with clean, creative effects.

I put a ton of effort into making my stream feel different from the typical live setup—smooth visuals, high energy, and a style that’s always evolving. Now I’m looking for someone who can match that creativity with motion design that pushes things even further.

What I’m looking for:

• High-quality VFX and compositing (putting me into games, environments, or scenes)

• Clean, stylized effects synced to music

• Strong sense of timing, pacing, and visual flow

• Experience with tracking, masking/rotoscoping, and compositing

• (Bonus) Experience with gaming, streaming, or cinematic internet-style edits

• (Bonus) Ability to create short reusable promo intros or loops

This is an entry/mid-level opportunity that will grow quickly as the stream continues to scale. Ideally you’re someone who wants a long-term collaboration.

PLEASE REVIEW MY CONTENT BEFORE REACHING OUT.

Socials: @outofcheck

If you feel you can match the quality, creativity, and energy of the production, I’d love to connect.

Send your portfolio and a quick intro to info@outofcheck.com.

Can’t wait to work together!

https://redd.it/1ppik8y
@vfxbackup
Does Skydance Madrid sponsor work permit for non EU citizens?

Hi everybody,

I hope you are doing well.

I am from Mexico and I previously applied to one of Skydance's jobs for their studio in Madrid, but I didn't get past the HR meeting. I had almost all the requirements for the position and I tailored my CV to it. My question is: does anyone know of someone from a non-EU country who got them to sponsor a work permit? Or do they just automatically discard you once they see you don't have a work permit / aren't from an EU country?

I am mostly trying to figure out what I am doing wrong: whether I am not moving forward because of my citizenship and lack of a work permit, or whether I am messing up my initial meeting with HR.

Thank you for your time.

Edit. Forgot to delete something from an early draft

https://redd.it/1ppmzp4
@vfxbackup