Specific to 4K Blu-ray players. If you're outputting to a 4K TV, they're all going to look more or less the same if you're after minimal image processing.
On a 4K Blu-ray,
the luma (brightness) information is stored at 4K, but the chroma (color) information is stored at half resolution, effectively 1080p (this is 4:2:0 chroma subsampling).
So if nothing fancy is the goal, the only thing the player has to do is pass the luma through 1:1, upscale the chroma to 4K, and SEND.
The more expensive players will have slightly better chroma processing, but even comparing the most expensive chroma processing against the computationally cheapest, the difference is mostly invisible during a moving shot.
So with respect to 4K discs, even a budget player does an arguably good job, 4K to 4K.
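To make the "pass luma 1:1, upscale chroma" idea concrete, here's a minimal numpy sketch of the computationally cheapest option: nearest-neighbour chroma upsampling, where each half-resolution chroma sample is just repeated into a 2x2 block on the luma grid. This is an illustration of the concept, not what any particular player actually implements.

```python
import numpy as np

def upsample_chroma_nearest(y, cb, cr):
    """Bring half-resolution 4:2:0 chroma planes up to the luma grid.

    y: full-resolution luma plane (e.g. 3840x2160), passed through 1:1.
    cb, cr: half-resolution chroma planes (e.g. 1920x1080).
    Nearest-neighbour is the cheapest approach: repeat each sample 2x2.
    """
    cb_full = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)
    cr_full = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1)
    return y, cb_full, cr_full

# Toy example: a 4x4 luma plane with 2x2 chroma planes.
y = np.zeros((4, 4), dtype=np.uint8)
cb = np.array([[100, 200], [50, 150]], dtype=np.uint8)
cr = cb.copy()
_, cb_full, _ = upsample_chroma_nearest(y, cb, cr)
print(cb_full.shape)  # (4, 4)
```

The pricier players differ only in what replaces that `np.repeat` step (smarter interpolation filters); the luma path is identical either way.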
There are of course also "exciter" features, which reprocess the image for a -different look-, similar to audio EQ. That type of processing won't look good on a budget player, because it doesn't have the processing power to do it well.
So if you're a purist, expensive players do no more than budget players. If you're a muscle car / ricer kind of person, an expensive player has the bigger wheels.
The other half of the equation is whether you intend to put traditional 1080p Blu-rays in there and have the player upscale them from 1080p to 4K.
An expensive player HERE can greatly outperform a budget player.
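This is where the spread between players lives: turning 1080p into 2160p means inventing three new pixels for every original one, and the quality of that interpolation is what you're paying for. A minimal numpy sketch of the two ends of the spectrum, under the simplifying assumption of a grayscale frame and an exact 2x scale (1080p to 2160p), neither being what a real player's scaler actually does:

```python
import numpy as np

def upscale_2x_nearest(img):
    """Budget approach: pixel replication, no new information created."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def upscale_2x_bilinear(img):
    """One step up: new pixels are averages of their neighbours.
    Real scalers use far fancier filters; this only shows the idea."""
    img = img.astype(np.float64)
    h, w = img.shape
    # Pad the right/bottom edges so every sample has a neighbour.
    p = np.pad(img, ((0, 1), (0, 1)), mode="edge")
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = p[:h, :w]                                # originals
    out[0::2, 1::2] = (p[:h, :w] + p[:h, 1:w + 1]) / 2         # horizontal avg
    out[1::2, 0::2] = (p[:h, :w] + p[1:h + 1, :w]) / 2         # vertical avg
    out[1::2, 1::2] = (p[:h, :w] + p[:h, 1:w + 1]
                       + p[1:h + 1, :w] + p[1:h + 1, 1:w + 1]) / 4  # diagonal avg
    return out

frame = np.array([[0, 100], [100, 200]])
print(upscale_2x_bilinear(frame)[0, 1])  # 50.0 -- blended, not repeated
```

An expensive player's advantage is entirely in that second function: better interpolation, edge-aware filtering, and enough silicon to run it in real time.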
If you haz access to Arrrgh Matey, then today's landscape is such that an HTPC still offers the absolute BEST image quality output.
1080p content output at 4K needs roughly a GTX 1050 Ti.
2160p content output at 4K needs roughly a GTX 1060/1070.
Then finally there are colorimeters, which cost nearly as much as your new TV. I'd say it's worth it, but it's tough to impart the awesomeness unless you're already inclined in this area. For example: if you've ever taken screenshots of the same video from different sources and compared them to see which one's better. If you've stared at one corner of the TV for a solid 30 minutes and concluded that the bottom right corner looks better. (You know, insane people.)