Think of the clip you are recording as a series of still pictures, like an animation strip. Each still image is a frame. You could make a 60-second cartoon clip by drawing 60 still images and showing them one after another at 1 frame per second. At that rate there will be obvious flicker on the screen; since the human eye can't detect changes faster than about 1/16th of a second, you can eliminate the flicker by showing 16 still images per second - that is, 16 fps (frames per second). Twenty-five frames per second is good enough to give you motion-picture quality (you get to play around with these settings and check the quality of the output when you rip DVDs).

Resolution is the density or count of pixels. It is usually given either per inch, like 100 dpi or 200 dpi (as in a color printer's configuration, for instance), or as the number of pixels across and down the screen, like 640x480 or 800x600. The higher the resolution, the more detail in each frame. In digital movie cameras, VGA resolution (640x480) is decent enough, but nowadays cameras offer a lot more, giving the user a choice to play around with the video (like making stills, enlarging them, etc.).
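To make the arithmetic concrete, here's a quick back-of-the-envelope sketch in Python (the function names and figures are just illustrative, not from any video library):

```python
def total_frames(duration_seconds: float, fps: float) -> int:
    """Number of still images needed for a clip at a given frame rate."""
    return int(duration_seconds * fps)

def pixels_per_frame(width: int, height: int) -> int:
    """Pixel count of a single frame at a given resolution."""
    return width * height

# A 60-second clip at 1 fps is just 60 stills; at 25 fps it takes 1500.
print(total_frames(60, 1))    # 60
print(total_frames(60, 25))   # 1500

# VGA (640x480) vs 800x600: higher resolution means more pixels per frame.
print(pixels_per_frame(640, 480))   # 307200
print(pixels_per_frame(800, 600))   # 480000
```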
One more question, though. Is there any relation between resolution and fps? If my resolution is higher, do I need a higher fps to avoid seeing flicker, or are they completely independent?
They're pretty well independent. You might notice flicker more easily on a larger screen, because small, fast-moving objects in the distance that would be invisible on a smaller screen appear to jump - but that's about the only link you could draw between them.