Benchmarks: 1

cpuset trick

Game settings: 1920x1080, 50% render scale, Low graphics settings (to get a CPU-bound scenario), FPS uncapped to 400. Windowed fullscreen, LXDE, X11.
------------------------------------
DXVK: compiled from git, commit 0eec9584
compile flags:
cpp_args and c_args = -march=native -O3 -pipe -flto -floop-strip-mine -fno-semantic-interposition -fipa-pta -fdevirtualize-at-ltrans
c_link_args = -flto -static-libgcc
------------------------------------
Wine-tkg: Fsync enabled
env vars: STAGING_SHARED_MEMORY=1 __GL_DXVK_OPTIMIZATIONS=1
build config: _protonify="true"
_GCC_FLAGS and _CROSS_FLAGS="-O3 -pipe -floop-strip-mine -fno-semantic-interposition -fipa-pta -fdevirtualize-at-ltrans"
------------------------------------
Kernel: linux-tkg
command-line: random.trust_cpu=on nowatchdog mitigations=off amdgpu.ppfeaturemask=0xffffffff noreplace-smp clearcpuid=514
compile settings: _sched_yield_type="0" _rr_interval="default" _tickless="1" _fsync="true" _compileroptlevel="2" _processor_opt="native" _smt_nice="true" _timer_freq="500"
------------------------------------
Naming convention:
- obs: OBS recording with the software x264 encoder, medium preset, FHD 60 fps, High profile, tune: animation
- cpuset-smt: cpusets split the virtual cores by SMT sibling: for each physical core, one SMT thread goes to one cpuset and the other thread goes to the second cpuset
- cpuset-ccx: cpusets separate the CCXs: one CCX for Overwatch, the rest for everything else (including the OBS recording, if enabled)
When a run name ends in 1 or 2, the computer was restarted and the benchmark rerun. A variation of about 5 fps appears to be within the margin of error.
------------------------------------
RAM: 4 DIMMs, single rank, 3400 MHz, CR 1, CL 15, RCDRD 19, RCDWR 15, RP 16, RAS 34
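The cpuset-smt split above can be derived from the kernel's CPU topology files (/sys/devices/system/cpu/cpu*/topology/thread_siblings_list). Here is a minimal sketch, assuming the common two-way SMT "n,n+8" numbering a 3700X usually gets under Linux; the function name and the sample topology are illustrative, not part of the original setup.

```python
def smt_split(sibling_lists):
    """Given one thread_siblings_list string per logical CPU (e.g. "0,8"),
    return two disjoint CPU lists: the first SMT thread of every physical
    core, and the second SMT thread of every physical core."""
    first, second = set(), set()
    for entry in sibling_lists:
        a, b = sorted(int(x) for x in entry.split(","))
        first.add(a)
        second.add(b)
    return sorted(first), sorted(second)

# 3700X-style topology: physical core n exposes logical CPUs n and n+8,
# and each of the two logical CPUs reports the same sibling pair.
topology = [f"{n},{n + 8}" for n in range(8) for _ in range(2)]
game_cpus, other_cpus = smt_split(topology)
print(game_cpus)   # one SMT thread per core, for the game cpuset
print(other_cpus)  # the sibling threads, for everything else
```

The two resulting lists would then be written to the cpuset.cpus files of two cgroups; actual sibling numbering varies between machines, so reading the topology files rather than hard-coding the ranges is the safer route.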

Distro: Arch Linux
Kernel: 5.10.8-112-tkg-upds
GPU Driver: 4.6 Mesa 20.3.3
GPU: Radeon RX 5600 OEM/5600 XT / 5700/5700 XT
CPU: AMD Ryzen 7 3700X 8-Core Processor
RAM: 32 GB