In my experience, which I admit is not particularly broad, with the editing programs I have used (FCPX, LumaFusion, iMovie), it really doesn't matter. You can put practically anything in and get practically anything out; I don't see any impact on quality, and I have yet to see a demonstration that shows one.
I'd happily have somebody point me to an authoritative, peer-supported example where image quality is noticeably impacted by doing so. I'd also like to see the same for anything that demonstrates a visible difference between a 23.976, 25, or 24 fps timeline.
Quite frankly, this is exaggerated when it comes to video. My company offered, as one service, still-frame prints from 70mm film for motion picture companies. When they sold 8x10s from their films to fans, they would pick frames from a number of places in the film, our 70mm camera would make a 4x5 transparency or negative, and we would make hundreds of prints for them to take to openings or, more usually, fan fests.
Some people would ask me why they didn't just print from the still photos taken while the film was being shot. Simple: fans want the EXACT image they saw in the film at that exact moment. A production still isn't good enough, because it didn't come from the camera that shot the movie, so it's not the "real" picture.
OK, so why bring this up? Because our visual cortex doesn't process motion the way it processes a still image. While watching the movie, everything looks more than sharp enough; we don't really see the grain, and small defects go unnoticed. But when we print a single frame, it's different: suddenly it's soft, there may be scratches to fix, and the grain is prominent.
So, yes, some of the things people complain about, such as not using the full frame for video, aren't really as noticeable as some think. Sure, if someone is staring at the screen looking for it, and the monitor is good enough, they might see it. But at a normal viewing distance, you won't.