30 vs 60 Frames per Second

So I've been looking into recording my flights, and my favorite view when watching videos online is a camera in the center showing the avionics and the front view. The majority of them say they're shooting at 1080p with 30 fps and also using an ND filter, which reduces the light entering the camera and thus forces a longer exposure (slower shutter speed) to fix the prop distortion.

At 30 fps, the shutter speed is probably 1/60th of a second (based on the 180-degree rule), so at 60 fps the corresponding shutter speed would be 1/120th of a second, but there will still be twice the number of frames.
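
A quick sketch of that arithmetic in Python (the function name is mine, purely for illustration):

```python
# 180-degree rule: the shutter stays open for half of each frame interval,
# so exposure time = (180 / 360) * (1 / fps) = 1 / (2 * fps).

def shutter_time_180(fps):
    """Exposure time in seconds under the 180-degree rule."""
    return 1.0 / (2 * fps)

print(shutter_time_180(30))   # 0.01666... -> 1/60 s
print(shutter_time_180(60))   # 0.00833... -> 1/120 s
```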

My question: If you increase the video to 60 fps (which shortens the shutter time), are you really cancelling out the effect of the ND filter, which was put on to slow the shutter speed? Or will you still get a good video with the propeller blurred out?
 
The shutter speed is not tied to the frame rate. Sure, you can't have a shutter speed SLOWER than 1/frame rate, but the shutter speed can be MUCH faster than that. The GoPro can shoot at 1/8192.

Even 1/120 should be sufficient to blur the prop.
 
True, the shutter speed is not tied to the frame rate, but it will change when the frame rate changes. So the ND filter slows the shutter speed, but a higher fps speeds it back up. Wouldn't that cancel out the ND filter's purpose a little?
 
Until you hit the 1/fps limit (or close to it), the frame rate has no bearing on the exposure.
If the light levels suggest a 1/250 shutter speed, then the camera will shoot at 1/250 regardless of whether you are using a 30 FPS or 60 FPS capture rate.
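
A minimal sketch of that behavior (the numbers are hypothetical; a real camera meters more elaborately):

```python
def effective_shutter(metered_time, fps):
    """The metered exposure time is used as-is unless it exceeds the
    frame interval; it can never be longer than 1/fps."""
    return min(metered_time, 1.0 / fps)

# Light meter wants 1/250 s: the frame rate doesn't matter.
print(effective_shutter(1/250, 30))  # 0.004 s (1/250)
print(effective_shutter(1/250, 60))  # 0.004 s (1/250), identical

# Very dim light wanting 1/15 s: now the frame rate clamps it.
print(effective_shutter(1/15, 60))   # 0.01666... s (1/60)
```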
 

Did you bother to read that article? Shutter angle itself doesn't even exist for digital sensors. It's all electronic.

Shutter angle doesn't control either exposure or blur directly. It only makes sense when paired with the frame rate. And the reason? Because blur and exposure are determined by TIME. With an older film movie camera, yes, you used the shutter angle to control the exposure, and that's why cinematographers talk in those terms, but it's really TIME that matters.
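
To make the TIME point concrete, here's a small sketch converting shutter angle plus frame rate into an exposure time. Note that two different angle/rate pairs can land on the same time, and therefore the same exposure and blur (the values are illustrative):

```python
def exposure_time(shutter_angle_deg, fps):
    """Exposure time implied by a shutter angle at a given frame rate."""
    return (shutter_angle_deg / 360.0) / fps

# Same TIME from two different angle/rate combinations:
print(exposure_time(180, 30))  # 0.01666... -> 1/60 s
print(exposure_time(360, 60))  # 0.01666... -> also 1/60 s: same blur, same exposure
```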
 
To Flying Ron: I did read the article; why else would I be posting it for reference? But my apologies to you anyway.
 
Again, it still doesn't say what you are attempting to make it say. Blur and exposure are affected only by TIME. The frame rate just sets a hard upper limit on the exposure time (it can never be longer than 1/fps).

Now, what your second article is saying is that there are subtle human-perception issues where the frame rate and exposure time interrelate. This is because no matter what the frame rate is, you're still not taking moving pictures; you're taking a bunch of stopped frames. Your brain, when confronted with a sequence of these shown to it rapidly, interprets the changes as motion. You can hose it up by having too fast or too slow an actual shutter speed (as they note).

Another example of what I'm talking about can be seen just by going to any movie theatre that still uses FILM (or with some crude digital conversions). Movies historically are shot at 24 FPS. 24 FPS is slow enough that it would be perceived as flickering; in fact, movies are called "flicks" because of this effect. To get rid of the flicker without changing the frame rate, the projectors are shuttered so that each frame is shown twice. 48 flashes per second is fast enough that flicker is not perceived. Most of the time this works pretty well, but it can cause problems. If you have some small high-contrast object crossing the screen (imagine someone throwing a ball), it is shown twice at each location as it crosses the screen. Your brain, trying to turn that into smooth motion, will essentially be fooled into thinking there are TWO balls (you'll get an apparent double image). If there were some blur there, you'd not see it, because the images would merge together.
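
A toy sketch of that double-flash geometry (not a vision model; the speed and counts are made up):

```python
# Film captured at 24 fps, each frame flashed twice (48 flashes/s).
# A ball moving at constant speed appears at the SAME position on two
# consecutive flashes, then jumps -- the repeated positions are what the
# brain can read as a double image.

CAPTURE_FPS = 24
FLASH_RATE = 48          # each film frame is shown twice
SPEED = 480.0            # ball speed in pixels per second (arbitrary)

for flash in range(6):
    t_flash = flash / FLASH_RATE
    frame = int(t_flash * CAPTURE_FPS)    # which film frame is on screen
    pos = SPEED * frame / CAPTURE_FPS     # position frozen in that frame
    print(f"flash at t={t_flash:.4f}s shows frame {frame} at x={pos:.0f}")

# Positions repeat in pairs: x = 0, 0, 20, 20, 40, 40
```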

Believe me, I know. I spent a long time dealing with high-resolution images, high frame rates, and human perception on intelligence-imagery analysis displays. We got a lot of complaints at one point where people were telling us the images were blurring or double-imaging. I knew immediately from my experience with film that the graphics system was double-flashing the images. Since this was a hot item, weeks after 9/11, I put together a summit of the government agency (NIMA at that point), Sun Microsystems (who made the computer), Dome Imaging (who made the display cards), and us (the software) to sit down and find out what was causing the double flash. It turned out to be in the display driver that Dome wrote: their buffer swap was taking an entire frame time.

Frame rate makes for an interesting constraint in digital processing. We tended to run at either 60 or 70 FPS. 1/FPS was the time limit we had to get the next frame ready for display. If we missed it by a little, we'd get the double flash. If we missed it by longer, we'd get noticeable flicker (if we were regular about it) or jitter (if we were inconsistent).
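
A rough sketch of that deadline logic (the structure and names are illustrative, not the actual system described above):

```python
import time

FPS = 60
FRAME_TIME = 1.0 / FPS    # 1/FPS: the budget to get the next frame ready

def render_loop(render_frame, num_frames):
    """Run a fixed-rate display loop against a hard per-frame deadline.
    A small miss means the previous frame gets flashed again (the 'double
    flash'); larger or irregular misses read as flicker or jitter."""
    for i in range(num_frames):
        start = time.monotonic()
        render_frame(i)                       # must finish within FRAME_TIME
        elapsed = time.monotonic() - start
        if elapsed > FRAME_TIME:
            print(f"frame {i}: over budget by "
                  f"{(elapsed - FRAME_TIME) * 1000:.2f} ms -> double flash")
        else:
            time.sleep(FRAME_TIME - elapsed)  # wait for the next frame slot

# Hypothetical workload: frame 3 deliberately blows its budget.
render_loop(lambda i: time.sleep(0.030 if i == 3 else 0.005), 6)
```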

Don't even get me started about the motion effects you can get with the older interlaced video formats.
 
Awesome explanation. So basically, I should be able to use 1080p @ 60 fps with an ND filter, and it should still fix the prop distortion. The reason for 60p is that I've read higher fps can be used to stabilize the video in editing software.
 
It should work. Your prop blades are spinning past at about 75 times a second. If the ND can slow down the exposure enough, it should work. The neat thing about these digital sensors is that they are VERY light-sensitive, so I don't think you'll have any trouble exposing the frame sufficiently, even at 1/60.
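
Putting rough numbers on that (assuming a two-blade prop at about 2250 RPM, which gives the ~75 blade passes per second mentioned above; the figures are illustrative):

```python
# Degrees the propeller sweeps during one exposure. A two-blade prop at
# 2250 RPM means 2 * 37.5 = 75 blade passes per second.

RPM = 2250                   # assumed cruise RPM (illustrative)
rev_per_sec = RPM / 60.0     # 37.5 revolutions per second

def sweep_degrees(shutter_time):
    """Arc in degrees the prop turns while the shutter is open."""
    return 360.0 * rev_per_sec * shutter_time

print(sweep_degrees(1/60))    # 225.0 deg -> heavily blurred, looks natural
print(sweep_degrees(1/1000))  # 13.5 deg -> nearly frozen, 'bent prop' look
```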
 
My experience is that 60fps even without the ND filter is good enough, and gives an otherwise crisper image.
 
OK everyone, I can give you definitive answers to these questions. As some of you know, I do a lot of in-flight recording with filters.

Hands down, absolutely film at 60 FPS instead of 30. On a bright sunny day, a 64x (6-stop) ND filter works perfectly for blurring out the prop, while only making the image minimally less sharp: a more than fair tradeoff in my opinion.
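
For reference, the 64x-to-6-stop conversion is just a base-2 log, since each stop halves the light; a one-liner sketch:

```python
import math

def nd_stops(factor):
    """Stops of light an ND filter removes (each stop halves the light)."""
    return math.log2(factor)

print(nd_stops(64))   # 6.0 -> an ND64 filter is a 6-stop filter
print(nd_stops(8))    # 3.0
```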

Not only does the 60 fps make the prop effect less noticeable, it also dampens vibration, which is useful when mounting it on the strut.

In terms of shutter speed at 30 vs 60 FPS, it's pretty simple, really:

If shooting at 30 FPS, the shutter speed will be NO SLOWER than 1/30 s, and almost always MUCH FASTER than that in any decent light.

If shooting at 60 FPS, the shutter speed will be NO SLOWER than 1/60 s, and almost always faster than that in all but the darkest lighting conditions.

If you're in lighting so dim that the histogram would demand a slower shutter speed than the frame rate allows, the camera is up against a hard wall and will raise the ISO to keep the proper exposure. If the ISO then maxes out (at 6400, I believe), the only thing that can give at that point is for the image to be underexposed.
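
A rough sketch of that decision ladder (fixed aperture; the ISO ceiling is as described above, the rest of the numbers are illustrative, not actual GoPro firmware):

```python
def auto_expose(required_time, fps, max_iso=6400, base_iso=100):
    """Pick shutter time and ISO for a fixed-aperture camera.
    required_time is the exposure time the meter wants at base ISO.
    1) Slow the shutter, but never below the frame-rate wall of 1/fps.
    2) If still short of light, raise ISO, up to max_iso.
    3) If ISO maxes out, the image simply comes out underexposed."""
    shutter = min(required_time, 1.0 / fps)      # hard wall at the frame rate
    iso = base_iso * (required_time / shutter)   # make up the rest with gain
    if iso > max_iso:
        return shutter, max_iso, "underexposed"
    return shutter, iso, "ok"

print(auto_expose(1/250, 60))  # bright: (1/250 s, ISO 100, 'ok')
print(auto_expose(1/30, 60))   # dim: shutter walls at 1/60 s, ISO 200
print(auto_expose(2.0, 60))    # very dim: ISO caps at 6400, underexposed
```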

One thing to keep in mind is that the aperture on the GoPro (and similar POV cams, I'm sure) is fixed at f/2.8, regardless of how bright or dim the lighting is. To adjust for changes in light, the GoPro will only adjust the ISO and the shutter speed.

I find (unofficially) that when going from a brighter environment into a dimmer one, the GoPro will slow the shutter long before it raises the ISO, which I like, because I don't want to trade away a low-noise image for a faster shutter speed.

I hope this explanation makes sense.
 