At such low detail settings, that CPU is a drastic bottleneck in that game, as you can see for yourself. Raise the detail settings and watch the fps and usage: you will see the same (or slightly lower) fps and higher GPU usage. If you want more FPS at low settings, get a stronger CPU.

Since I've seen some misinformation about bottlenecking floating around here over the past few days, let me explain how a CPU (processor) bottlenecks a GPU (graphics card). The CPU is the component that prepares frames for the GPU.
Preparing a frame means the CPU works out where every object in the frame is. Think of this as "game objects" and "physics".
The GPU takes a frame prepared by the CPU and adds the graphics to it (polygons, textures, colors, shadows, shading, illumination and other details); we will call this frame rendering.
The GPU renders the frame created by the CPU and outputs it to the buffer from which the monitor/display takes frames.
The objects and physics determine the CPU requirements. This is why open-world games like Assassin's Creed Origins or GTA V require a strong CPU (the number of objects is enormous).
The resolution, texture quality, mesh quality and other detail settings determine the GPU requirements. In other words, better-looking games are harder to run, and higher resolutions and detail settings are harder to run.
If a CPU can prepare 50 frames per second (50 fps) in some game X with its particular objects, but the GPU can render 100 frames per second at the graphical settings game X uses, the GPU is busy only about 50% of the time, since it has to wait for frames.
This "waiting" of the GPU to get frames prepared by the CPU is called CPU-GPU bottlenecking.
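The arithmetic above can be sketched as a toy two-stage pipeline. This is purely illustrative (the function and numbers are made up for this post, not measured from real hardware): the slower stage sets the effective frame rate, and the faster stage idles.

```python
# Toy model of a CPU-GPU frame pipeline (illustrative, not a benchmark).
# The slower stage sets the effective frame rate; the faster one waits.

def pipeline_stats(cpu_fps: float, gpu_fps: float) -> tuple[float, float]:
    """Return (effective_fps, gpu_utilization) for a two-stage pipeline."""
    effective_fps = min(cpu_fps, gpu_fps)      # the slowest stage wins
    gpu_utilization = effective_fps / gpu_fps  # fraction of time GPU is busy
    return effective_fps, gpu_utilization

# The example from the text: CPU prepares 50 fps, GPU could render 100 fps.
fps, util = pipeline_stats(cpu_fps=50, gpu_fps=100)
print(fps, util)  # 50 fps, GPU busy only 50% of the time -> CPU bottleneck
```

Swap the numbers around (CPU 100, GPU 50) and the GPU runs at full utilization while the CPU waits on it, which is the normal, desirable case.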
A GPU does not prepare anything for the CPU, so there is nothing the CPU has to wait for from the GPU - there is no such thing as a GPU bottlenecking a CPU.
A GPU can't bottleneck a CPU.
A monitor cannot bottleneck a GPU, since the GPU does not wait for anything from the monitor.
A CPU does not need to be at 100% usage to bottleneck a GPU. That depends purely on how a game/application uses the CPU: a CPU can sit at 30% overall usage and still bottleneck a GPU (in that case, it's the fault of the game, application or API).
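A minimal sketch of why the overall usage figure can mislead: usage meters average across all cores. Assuming a hypothetical 8-core CPU where the game's main thread saturates one core while the rest are mostly idle (the per-core numbers are invented for illustration):

```python
# Why "30% CPU usage" can still be a bottleneck: usage meters average
# all cores. If a game's main thread saturates one core of an 8-core
# CPU while the others idle, the meter shows low overall usage even
# though the game cannot run any faster.

def overall_usage(per_core_usage: list[float]) -> float:
    """Average per-core usage, as a typical task manager reports it."""
    return sum(per_core_usage) / len(per_core_usage)

# Hypothetical 8-core CPU: the main game thread pins core 0, two helper
# threads do some work, the remaining cores are nearly idle.
cores = [1.00, 0.60, 0.40, 0.10, 0.05, 0.05, 0.05, 0.05]
print(overall_usage(cores))  # ~0.29 -> "30% usage", yet core 0 is maxed
```

The single saturated core is the real limit here; the averaged figure hides it.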
Increasing the display resolution does not remove a bottleneck by itself; it increases the load on the GPU, which narrows the gap between the number of frames the CPU can prepare and the number the GPU can render.
A CPU that bottlenecks a GPU at 1080p might not bottleneck it at 4K.
If your GPU runs at 100% usage without problems alongside your current CPU in the games you play, there is a good chance you can upgrade your GPU and get even better performance (unless your CPU is also running at 100% load).

If I remember anything more, I will add it. #bottleneck #cpu #gpu #hardware
Edit: Do not use the website thebottlenecker.com - whoever made it does not understand bottlenecking. It claims a GPU can bottleneck a CPU and similar nonsense.