The refresh rate of your display is independent of the game's frames per second, unless you link them with the vsync setting. Games will happily render more frames than your monitor can draw, and vice versa. For instance, if your display runs at 120Hz but your game renders at 60fps, your screen draws each image twice before moving on to the next. If your display is 120Hz and the game renders at 70fps, the image changes while the screen is mid-draw, so the top part of what you see is the "old" image while the bottom is the "new" one - which is what makes "screen tearing" happen. (Turning on vsync forces the game to present at a rate that divides evenly into the refresh rate - 120fps, 60fps, 40fps, and so on - which fixes the tearing effect.)
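You can sketch that mismatch with a toy model. This is just an illustration under the assumption that frames arrive at a perfectly steady rate (real games vary frame to frame); it shows which rendered frame is the "current" one at each refresh tick:

```python
# Toy model of refresh rate vs. framerate. Assumes perfectly steady frame
# times, and only samples at the START of each refresh - real tearing comes
# from the frame changing mid-scanout, which this simplification ignores.

def frames_shown(refresh_hz, game_fps, refreshes=12):
    """For each display refresh, the index of the most recent game frame."""
    return [int(t / refresh_hz * game_fps) for t in range(refreshes)]

# 120Hz display, 60fps game: every frame index appears exactly twice.
print(frames_shown(120, 60))  # [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5]

# 120Hz display, 70fps game: irregular pattern - some frames shown twice,
# some only once, and frame boundaries no longer line up with refreshes.
print(frames_shown(120, 70))  # [0, 0, 1, 1, 2, 2, 3, 4, 4, 5, 5, 6]
```

The 60fps case divides evenly, so the pattern is clean; the 70fps case is where frames end up changing partway through a refresh.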
If your video is recording at 120 frames per second while your game is playing at 60 frames per second, something similar happens. Every second frame of the video will look exactly like the previous frame, but the file will still play at 120fps. (In other words, you'll just waste a bunch of disk space on pointless, repeated frames.) If the game runs at something between 60 and 120fps, you get an irregular pattern instead - some frames captured once, some twice - which typically shows up as judder (uneven stuttering) in the recording rather than tearing, since the capture grabs completed frames. Either way it isn't nice.
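Here's a rough back-of-the-envelope calculation of that wasted space - again assuming perfectly steady rates, which real captures won't have:

```python
# Rough estimate of how many recorded frames are exact repeats when the
# capture rate exceeds the game's framerate. Assumes steady rates.

def duplicate_fraction(record_fps, game_fps):
    """Fraction of recorded frames that just repeat the previous frame."""
    if record_fps <= game_fps:
        return 0.0  # every capture tick has a fresh frame available
    return 1.0 - game_fps / record_fps

print(duplicate_fraction(120, 60))  # 0.5 - half the recording is repeats
print(duplicate_fraction(120, 90))  # 0.25 - repeats land in an irregular pattern
print(duplicate_fraction(60, 120))  # 0.0 - but now half the game's frames are skipped
```

Note the last case: recording below the game's framerate doesn't waste space, but it silently drops frames instead.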
When recording a game with something like NVIDIA's ShadowPlay, I think it records at a variable framerate. If the game is playing at 80fps, the video records at 80fps; if it slows down to 72fps, so does the recording. That way the encoder never has to deal with a frame repeating or changing mid-capture. I could be wrong, though - you'll have to do some searching if it matters to you.
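The idea behind variable-framerate video can be sketched in a few lines. This is a conceptual illustration, not ShadowPlay's actual implementation: in a VFR container each frame carries its own presentation timestamp, so the encoder stores frames as they arrive instead of padding them out to a fixed grid:

```python
# Constant framerate (CFR): frames sit on an evenly spaced time grid.
def cfr_timestamps(fps, n):
    """Presentation times (seconds) for n frames at a fixed framerate."""
    return [i / fps for i in range(n)]

# Variable framerate (VFR): each frame's timestamp is just the running
# sum of the previous frames' durations - whatever pace the game ran at.
def vfr_timestamps(frame_durations):
    times, t = [], 0.0
    for d in frame_durations:
        times.append(t)
        t += d
    return times

print(cfr_timestamps(60, 4))                     # even 1/60s spacing
print(vfr_timestamps([1/80, 1/80, 1/72, 1/72]))  # spacing follows the game
```

With VFR there are no duplicate frames to store and no judder from repeats; the player just honors each frame's timestamp on playback.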