@Paul Johnston
Did some experimenting..
Experiment I:
1. In the NVIDIA Control Panel, I've set the NVIDIA GPU as the default (my system also has an Intel HD GPU, but switching between the two made no difference in fps).
2. Left the NVIDIA vSync setting at its default, "Use the 3D application setting".
3. Ran Qube's speed test (Studio version).
Results:
With AppGameKit vSync on: 30fps (max).
With AppGameKit vSync off: ~350fps.
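For reference, here's roughly the kind of test loop I'm talking about: a minimal AGK Tier 1 sketch of my own (not Qube's actual speed test code), which uncaps AGK's internal frame limiter so only vSync can cap the fps, and lets you toggle SetVSync() at runtime:

// Minimal vSync test (my own sketch, not Qube's actual speed test).
// SetSyncRate(0, 0) disables AGK's internal frame limiter, so any fps
// cap seen afterwards should come from vSync alone.
SetWindowSize( 1024, 768, 0 )
SetSyncRate( 0, 0 )
vsync = 1
SetVSync( vsync )

do
    // Spacebar (raw key code 32) toggles AGK's vSync while running.
    if GetRawKeyPressed( 32 ) = 1
        vsync = 1 - vsync
        SetVSync( vsync )
    endif
    Print( "AGK vSync: " + Str( vsync ) )
    Print( "FPS: " + Str( ScreenFPS() ) )
    Sync()
loop

Running that under each of the three NVIDIA settings should make it easy to see which side (AGK or the driver) actually wins.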
Experiment II:
1. In the NVIDIA Control Panel, I've changed vSync to "On".
2. Ran Qube's speed test again.
Results:
With AppGameKit vSync on: still the same low 30fps, so no change at all from the default setting.
With AppGameKit vSync off (so the NVIDIA setting forces vSync on): 60fps. This was up around 350fps under the default "Use the 3D application setting", so a significant drop!
*Makes me think the AppGameKit vSync setting really doesn't do what it should.
Experiment III:
1. In the NVIDIA Control Panel, I've now set vSync to "Off".
2. Ran Qube's speed test again.
Results:
With AppGameKit vSync on: 60fps.
With AppGameKit vSync off: ~350fps.
Now this is more like what I'd expected to get when using the default NVIDIA vSync setting, "Use the 3D application setting".
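One thing worth ruling out when reading that 60fps: AGK also has its own frame limiter (SetSyncRate), separate from vSync, which I believe defaults to 60fps, so a 60fps reading on its own doesn't prove vSync is the thing doing the capping. A quick check, again my own sketch rather than anything from Qube's code:

// Separate AGK's frame limiter from real vSync (my own sketch).
SetSyncRate( 0, 0 )  // disable AGK's internal limiter (default is 60fps)
SetVSync( 1 )        // now any cap should come from vSync alone

do
    Print( ScreenFPS() )  // ~refresh rate if vSync engages, uncapped if not
    Sync()
loop

If this still shows ~60fps on a 60Hz monitor, vSync is genuinely engaging; if it runs uncapped, the earlier 60fps was probably just the frame limiter.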
Looks a bit like a miscommunication (between driver <> AGK <> OS?) to me. Or.. something changed in how this used to work, or I never understood how it worked in the first place, or my machine is behaving crazy.
Dunno how it works behind the scenes, but this is a bit weird I think, and it might be pointing to the needle in the haystack.