Tuesday, October 14, 2014

Nobody wants a cinematic slideshow!




Does fps (frames per second) really matter in video games, or is it just a pretentious demand of the PC master race?!

Last week, the world level design director on Ubisoft's Assassin's Creed Unity project said:
"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird.” 

And the game's creative director, Alex Amancio, said the following:
"30 was our goal, it feels more cinematic. 60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum.” (The math is real!)

"It's like when people start asking about resolution. Is it the number of the quality of the pixels that you want? If the game looks gorgeous, who cares about the number?"

This sparked a whole discussion on the internet, and I thought I should share my opinion on the matter.

The quotes posted above show one of two things: either these guys have no clue how fps works in movies and games, or they are selling us a bunch of lies (bullsh*t) to cover up the fact that Ubisoft doesn't care enough about the PC market to put extra effort into porting the game properly, and that the next-gen consoles can't handle their graphics technology well enough to deliver a high fps (something quite clear to any gamer, but which developers avoid mentioning).
I will address the resolution issue first, because it is shorter and easier to explain. It is not the number we care about, it's the quality. An image gets scaled up on larger screens, and the bigger the screen, the more stretched the pixels and textures become and the uglier the image looks. The higher the resolution of the game and the screen, the better the image scales and the clearer the result. I would assume that a director at a video game development company should know how this works…
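To put rough numbers on that, here is a minimal sketch of my own (the screen sizes and the pixel-density formula are purely my illustration, nothing from Ubisoft) showing how the same 1080p image gets spread thinner and thinner as the screen grows, and how a higher resolution brings the sharpness back:

import math

def pixel_density(width_px, height_px, diagonal_inches):
    """Pixels per inch (PPI) for a screen of a given resolution and size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# The same 1920x1080 image on a 24" monitor vs. a 55" TV:
print(round(pixel_density(1920, 1080, 24)))   # ~92 PPI, reasonably sharp
print(round(pixel_density(1920, 1080, 55)))   # ~40 PPI, visibly blocky up close
# Raising the resolution restores the sharpness on the big screen:
print(round(pixel_density(3840, 2160, 55)))   # ~80 PPI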

…and now for the big topic: fps.

Movies run at 24 fps because most projection devices in the world used that speed; it became a worldwide standard and was embraced by Hollywood. Movies also have blur between frames, which smooths the action and improves the visual experience for the human eye (that's why a paused movie frame is often blurry and indecipherable), and it is this blur that makes the difference between 24, 30, 60 and even 120 fps far less noticeable in film than in games.
The reason The Hobbit movie looks so odd is that there are actors in costume and full make-up performing on sets with computerized effects, and the faster the image, the more noticeable all of that becomes.
When we talk about how fps works in video games, things change entirely. First of all, games don't have motion blur between frames. Many games offer motion blur as a graphical effect, but it doesn't do what it does in the movies; it is a post-processing effect that simulates the smeared image people perceive when they turn their head quickly from one direction to another. In-game motion blur actually hurts performance by lowering the fps and can even cause micro-stuttering (not to mention it can be tiring on the eyes), so most of the time gamers choose to disable it or tune it down in order to gain a performance boost.
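For the curious, the simplest possible version of that in-game "camera blur" is just blending the new frame with the previous one. The sketch below is my own toy illustration (real engines use per-pixel velocity buffers instead); the point is that it is an extra full-screen pass on top of rendering the frame itself, which is exactly where the performance cost comes from:

import numpy as np

def accumulate_motion_blur(current_frame, previous_frame, blend=0.35):
    """Toy 'camera blur' post-process: blend the new frame with the old one.

    This only imitates the smear you get when the camera whips around;
    real engines use per-pixel velocity buffers, but either way it is an
    extra full-screen pass that costs GPU time on top of the frame itself.
    """
    mixed = (1.0 - blend) * current_frame + blend * previous_frame
    return mixed.astype(current_frame.dtype)

# Two fake 1080p frames (random noise standing in for rendered images):
prev = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
curr = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
blurred = accumulate_motion_blur(curr, prev)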
The second and probably more important point is that games have a free camera controlled by the player inside a rendered virtual environment, where smoothness of motion matters even more. Every movement is synchronized with the player's input device, and any hitch is not only seen by the eye but also felt as an annoyance by the hand controlling the camera. (Oh, the beautiful human nervous system.)
When watching a movie we just sit back, relax and enjoy it (or not, depending on the movie), but when we play a game we are constantly providing input, and every major drop in fps is noticeable because the response to the player's actions starts to feel laggy and sluggish.
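To put a number on "laggy and sluggish", here is a minimal sketch of my own (the two-frame pipeline latency is an assumption for illustration, not a measurement of any real game): at 30 fps the game simply takes much longer before it can even begin showing a reaction to your input.

def frame_time_ms(fps):
    """Time budget for a single frame, in milliseconds."""
    return 1000.0 / fps

def worst_case_input_delay_ms(fps, pipeline_frames=2):
    """Rough worst-case delay before an input is visible on screen.

    Assumes the input just missed the current frame and the engine/display
    pipeline adds two more frames on top -- a simplified model for
    illustration only.
    """
    return (1 + pipeline_frames) * frame_time_ms(fps)

for fps in (30, 60, 144):
    print(fps, "fps:",
          round(frame_time_ms(fps), 1), "ms per frame,",
          round(worst_case_input_delay_ms(fps), 1), "ms until your input shows up")
# 30 fps: 33.3 ms per frame, 100.0 ms until your input shows up
# 60 fps: 16.7 ms per frame, 50.0 ms until your input shows up
# 144 fps: 6.9 ms per frame, 20.8 ms until your input shows up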
It is true that in some fixed-camera point-and-click adventure games or in 2D games the fps issue is not that big, but the statements quoted above were not made about such games.
There is no question about which is better: a higher (and constant) fps is always better than a lower one, and the human eye can adapt and notice the difference even above 100 fps. There can be a question about what gamers prefer or accept, but that is a subjective matter. So if someone wants to play a game at a lower fps, there should be an option to lock the game at the desired fps, but video games should not be fps-locked by default. It is insulting to PC consumers to do such things, as many gamers spend hundreds or thousands of dollars on powerful video cards and monitors that support up to 144 fps.
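And exposing that choice is not a heroic effort. Here is a minimal sketch of my own (not anything from Ubisoft's engine) of a game loop with an optional, user-selectable frame cap instead of a hard-coded 30 fps lock:

import time

def run_game_loop(update, render, fps_cap=None):
    """Basic game loop with an optional frame limiter.

    fps_cap=None leaves the frame rate uncapped (hardware and vsync permitting);
    fps_cap=30 or 60 sleeps off the unused part of each frame's time budget,
    so the lock is the player's choice instead of the developer's default.
    """
    frame_budget = (1.0 / fps_cap) if fps_cap else 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        dt = now - previous
        previous = now

        update(dt)   # advance the simulation by the real elapsed time
        render()     # draw the frame

        if fps_cap:
            elapsed = time.perf_counter() - now
            if elapsed < frame_budget:
                time.sleep(frame_budget - elapsed)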
There is no such thing as "more cinematic"; games are not movies. We don't buy video games to watch them, we buy them because they are interactive, fun and require our input.

Ubisoft staff, if you want something to be so cinematic, maybe you should stop developing games and start making movies. And you should know we are not stupid (!!!), so stop lying to our faces to cover up the fact that the next-gen consoles are already way behind PC hardware (and that you don't want to spend the extra money to create better ports for the PC market…); we know how fps works and how much better games play at a higher fps.


For those who are not convinced, here is a link to the difference between the "cinematic" 30 fps and 60 fps across different game genres:




Nodrim
