Obviously this doesn't show things like 1% lows (there's a quick sketch of how those are computed just after the quote) and it can vary game-to-game, etc., but as an illustration of the relative difference between a 'good' CPU and a 'great' CPU at 4k:
The Core i9-9900KS is Intel's new consumer flagship processor. It runs at 5 GHz boost no matter how many cores are active, which translates into 10% application performance gained over the 9900K. Gaming performance is improved too, but pricing is high, especially compared to what AMD is offering.
www.techpowerup.com
That's roughly a 2% average performance difference between an i5-8400 and an i9-9900KS at 4k.
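As an aside, '1% lows' just come from a frame-time capture: take the slowest 1% of frames and report their average frame rate (some tools use the 99th-percentile frame time instead). A minimal sketch, with made-up frame times rather than real benchmark data:

```python
# Minimal sketch: average fps and 1% low fps from a frame-time log.
# frame_times_ms below is hypothetical data, not from any real benchmark.

def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 1% lows: average frame rate over the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# Mostly ~11 ms frames with a handful of 30 ms stutters mixed in.
frame_times_ms = [11.1] * 990 + [30.0] * 10
avg, low = fps_stats(frame_times_ms)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")  # avg: 89 fps, 1% low: 33 fps
```

Note how a handful of stutters barely moves the average but craters the 1% low, which is exactly why averages alone can hide a choppy experience.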
Websites claiming to give an automated analysis of your CPU/GPU 'bottleneck' should be ignored; their output is essentially made up. Even sites that look like legitimate benchmarks, such as UserBenchmark, are at best junk and at worst misleading.
As for TR (ROTTR? SOTTR?) being GPU bound 'only' 65% of the time, it depends on what that means. If it means the game runs at, say, 90 fps for the 35% of the time you're CPU bound, there's no need for and no real benefit from eliminating that CPU 'bottleneck', especially on a 60 Hz monitor, because the CPU isn't what's causing choppy gaming.
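To put rough numbers on that (using the hypothetical 90 fps figure above and assuming a 60 Hz display): 90 fps is about 11 ms per frame, comfortably inside the ~16.7 ms a 60 Hz monitor gives you per refresh, so those CPU-bound stretches never show up on screen anyway:

```python
# Quick check: does a CPU-bound stretch actually blow the refresh budget?
# 90 fps is the hypothetical CPU-bound frame rate from the example above.

refresh_hz = 60
frame_budget_ms = 1000.0 / refresh_hz        # ~16.7 ms per refresh at 60 Hz

cpu_bound_fps = 90
cpu_bound_frame_ms = 1000.0 / cpu_bound_fps  # ~11.1 ms per frame

# If CPU-bound frames still fit inside the refresh budget, removing
# that 'bottleneck' changes nothing you can actually see.
print(cpu_bound_frame_ms <= frame_budget_ms)  # True: 11.1 ms < 16.7 ms
```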
As for future-proofing against next-gen titles, the best way to do that is to wait for next-gen hardware.
If the gaming experience itself is fine - and if most of the framerate drops you get are GPU-related anyway (as they would tend to be at 4k) - stick to your current system and wait for future games to actually be a problem before buying new stuff. By which time there will be newer stuff to buy.