
Fusion's native "Frames per second" counter (accessed through "Storyboard controls", or simply "FrameRate" in the Expression Editor, and also displayed in the debugger) gives some seemingly inaccurate results.

Fusion's "FrameRate" value will usually be 2 or 3 times higher than what's actually going on on-screen. For example: if I experience very noticeable slowdown in my game (e.g. because I've opened the debugger, or because too much stuff is going on on-screen), and FRAPS confirms that I have, say, 42fps, Fusion will still report that the FPS is around 70, or even 120.

That makes me think that rather than Fusion being "wrong", it might simply be that what a traditional FPS counter like FRAPS (or EVGA Precision-X, or whatever) measures and what Fusion measures are two different things. This is just a crude guess, but perhaps FRAPS measures further up the GPU pipeline (and hence is more accurate about which frames are actually making it to the screen), whereas Fusion is measuring closer to the backend somewhere, counting some sort of computational 'frames', regardless of whether they all make it to the screen, or even to the GPU.

Does anyone have any more knowledge/insight/theories about how frames are actually counted in Fusion? And what is the most useful way to interpret Fusion's FPS figure?

Another thing I've noticed, which may or may not be related to this question, is that increasing the "Frame Rate" in Runtime Options actually seems to improve performance. I changed the framerate in my game from 120 to 1000, expecting that the game would slow to a crawl trying to do so much stuff per second.
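
To make my "two different measurements" guess a bit more concrete, here's a rough Python sketch of a generic fixed-timestep game loop. This is purely an illustration of how two FPS counters could diverge, not Fusion's actual runtime, and the names (TARGET_FPS, update_game_logic, render_and_present) are made up for the example: the logic counter keeps ticking at the target rate, while the presented-frame counter is limited by how fast frames can actually be drawn.

```python
import time

# Hypothetical stand-ins: NOT Fusion's actual runtime, just a generic
# fixed-timestep game loop showing how two "FPS" counters can diverge.
TARGET_FPS = 120            # analogous to the "Frame Rate" runtime setting
STEP = 1.0 / TARGET_FPS

def update_game_logic():
    pass                    # events, movement, collisions... (assumed cheap)

def render_and_present():
    time.sleep(0.02)        # pretend drawing/presenting is the bottleneck (~50 fps max)

logic_frames = 0            # what a Fusion-style counter *might* be reporting
presented_frames = 0        # what FRAPS sees: frames that actually reach the screen
accumulator = 0.0
prev = last_report = start = time.time()

while time.time() - start < 5.0:
    now = time.time()
    accumulator += now - prev
    prev = now

    # Run the logic at the target rate even when rendering can't keep up,
    # so the logic counter keeps climbing regardless of what hits the screen.
    while accumulator >= STEP:
        update_game_logic()
        logic_frames += 1
        accumulator -= STEP

    render_and_present()
    presented_frames += 1

    if now - last_report >= 1.0:
        print(f"logic 'FPS': {logic_frames}   presented FPS: {presented_frames}")
        logic_frames = presented_frames = 0
        last_report = now
```

If something along these lines is happening, Fusion's figure would be telling you how often the event/logic loop runs, while FRAPS would be telling you how often a finished frame actually reaches the screen. Again, that's speculation on my part, which is why I'm asking.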
