In our last blog we talked about refresh rate, so we now know that displays can refresh up to hundreds of times per second to provide a great image. People often confuse refresh rate with the term "frame rate". While both terms have to do with refreshing an image, they differ in an important way. Before we get into that, let's define what frame rate means.

What is frame rate?

Frame rate is the frequency at which consecutive images are displayed, usually expressed as frames per second (fps). The interesting thing about frame rate is that it applies across many forms of media and entertainment. Movies, TV shows, video games, video cameras and recordings all have their own ideas about the ideal frames per second. For many years, 30 frames per second was the standard video games were held to; today the standard is 60 or even 120 fps. Movies are typically shot at 24 frames per second, though occasionally you will see a movie released at a High Frame Rate (HFR), like The Hobbit trilogy. A higher frame rate reduces motion blur, which is great for things like sports and video games, where you want the cleanest, crispest picture imaginable. For movies and TV shows, however, a higher fps can clean up the picture a little too much. When that happens, the "movie magic" disappears and the image can look like it was recorded cheaply. Now the real question is: how does frame rate differ from refresh rate?

The differences between frame rate and refresh rate

Simply put, the biggest difference is the source of the image. Frame rate is the number of frames per second your CPU and GPU can produce. Refresh rate is the speed at which your monitor can completely redraw what you are seeing.

How they work together

Let's look at an example.
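To put those fps numbers in perspective, it helps to convert a frame rate into the time each frame stays on screen: the frame time in milliseconds is just 1000 divided by the fps. A quick sketch using the common rates mentioned above:

```python
def frame_time_ms(fps):
    """Time each frame stays on screen, in milliseconds (1000 / fps)."""
    return 1000 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
# 24 fps -> 41.67 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms
```

The jump from 30 fps to 120 fps means each frame is on screen for a quarter of the time, which is where the reduction in motion blur comes from.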
Say you are playing a game on your computer. Your computer is built for gaming and can output 120 frames per second, and your monitor has a refresh rate of 144 Hz. At this point you are experiencing the hand-in-hand nature of frame rate and refresh rate: the GPU pushes out frames, and the monitor refreshes fast enough to display every one of them. That is why it is important to have a refresh rate that can keep up with your frame rate; image quality and smoothness suffer when the two are out of sync.

In Conclusion

Do you need advice on how to configure your system? The technical staff here at General Technics are happy to help.
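For the technically curious, here is a toy model of why mismatched rates cost smoothness. It assumes the panel simply shows the most recently completed frame at each refresh tick, and it deliberately ignores vsync buffering and adaptive-sync technologies (G-Sync, FreeSync), so it is an illustration rather than how real hardware schedules frames:

```python
def frames_shown(fps, hz, seconds=1):
    """Toy model: at each refresh tick, the panel shows the most
    recently completed frame (frame i finishes at time i / fps)."""
    ticks = int(hz * seconds)
    return [int(t * fps / hz) for t in range(ticks)]

# 120 fps on a 144 Hz panel: frames don't line up with refreshes
shown = frames_shown(120, 144)
repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(repeats)  # 24 refreshes per second re-show the previous frame
```

In this model, 24 of the 144 refreshes each second repeat a frame, so motion advances unevenly (judder). When the rates match (say 120 fps on a 120 Hz panel), every refresh shows a fresh frame.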