You can only get movies in true 4K in a movie theatre: the screen is wide enough, and the projection uses 4:4:4 colour.
"4K" is a cinema standard for:
- a display that is at least 4096×2160, which is slightly wider than 16:9 of TVs.
- an image that fills either the width, the height or both.
- 4:4:4 colour, and
- The DCI-P3 colour space, which can have richer colours than a standard TV.
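To see why DCI 4K is "slightly wider" than a TV frame, you can compare the aspect ratios directly. This is just an illustrative sketch; the resolutions are the DCI 4K container and the 16:9 TV frame mentioned above:

```python
# DCI 4K container vs. the 16:9 frame used by TVs ("Full HD" and Ultra HD)
dci_w, dci_h = 4096, 2160
tv_w, tv_h = 3840, 2160  # Ultra HD ("4K UHD")

dci_ratio = dci_w / dci_h  # roughly 1.896:1
tv_ratio = tv_w / tv_h     # exactly 16:9, roughly 1.778:1

print(f"DCI 4K: {dci_ratio:.3f}:1")
print(f"TV 16:9: {tv_ratio:.3f}:1")
```

So a true DCI 4K image shown on a 16:9 panel has to be either cropped or letterboxed.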
Consumer displays that can show real 4K do exist but are rare.
The most accessible one would be a 21.5″ iMac "with Retina display", which has a 4096×2304 screen with the primaries of DCI-P3 but the gamut of "Full HD"/sRGB.
The actual name for the TV standard is not "4K" but "Ultra HD".
When it was under development, there was a lot of talk about making it support true 4K, and that did seep out into the press. Ultimately, it was made 16:9 like "Full HD" so that upscaled content would not get black bars.
Sony was the first to call its devices "4K UHD", as a way to show early adopters that this was what all that 4K work had resulted in … and it spiralled from there.
"High dynamic range" (HDR) is most often a buzz word for when a screen supports a better colours space than HD television/sRGB, and/or if it supports more bit depth than 8 (256 brightness levels) for each primary (R,G,B).
Ultra HD supports the Rec.2020 colour space, which is wider than DCI-P3. However … even if a screen is marked as supporting "HDR", that may refer only to the electronics. Whether the panel is actually capable of the full DCI-P3 or "Adobe RGB" (or whatever) colour space is another issue.
Rec.2020 is wider than practically any display on the market right now can reproduce, except perhaps some very expensive OLEDs. Not all LCD computer monitors can even cover the full sRGB colour space, and only the very best cover DCI-P3 or Adobe RGB (which is close to DCI-P3).
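You can put rough numbers on "wider" by comparing the triangles the primaries span on the CIE 1931 xy chromaticity diagram. The primary coordinates below are the published values for each standard; note that area in the xy plane is only a crude proxy for gamut size, since the diagram is not perceptually uniform:

```python
# CIE 1931 xy chromaticities of the R, G, B primaries for each standard
GAMUTS = {
    "sRGB/Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the triangle spanned by the three primaries."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

for name, pts in GAMUTS.items():
    print(f"{name}: xy area {triangle_area(pts):.4f}")
```

The ordering comes out as you would expect: sRGB is the smallest triangle, DCI-P3 is noticeably larger, and Rec.2020 is larger still.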