Next HDR Resolution!

Discussion in 'Computers and The Internet' started by wooleeheron, Apr 9, 2018.

  1. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    What many don't know is that 4k is widely considered as good as it gets, with 8k not worth the extra money because the difference is that slight. However, even 4k requires an outrageous computer and Internet connection, while 1440p can be made to look almost as good using the new HDR10+ standard. HDR monitors and TVs supporting it are about to flood the market, eventually making 1440p (~2.5k) the new resolution standard for anything but the cheapest TV or monitor. In video games and TV, it's always about faking it, and this is about as good a fake as anyone could ask for, with 144 Hz refresh rates and roughly double the brightness of current monitors and TVs.

    Using higher resolutions and more light, these screens can reproduce many more subtle grays and colors, as well as finer detail, often making the screen look like a window and everything on it appear more 3D. 4k has four times the pixels of a 1080p screen, and 1440p about 1.8 times, which, combined with the higher refresh rates, makes aiming easier and eliminates background blur and similar problems. Prices on the better models just now coming to market should drop to normal consumer levels over the next three to six years. You can already find cheap Asian versions, electronics being dirt cheap in most of Asia.
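    To put those pixel counts in concrete terms, here's a quick Python sketch (the dimensions are the standard ones for each resolution name; the ratios are plain arithmetic, not claims from this thread):

    ```python
    # Pixel counts for common display resolutions, and how each
    # compares to 1080p. Dimensions are the standard ones for
    # each resolution name.
    RESOLUTIONS = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4k":    (3840, 2160),
        "8k":    (7680, 4320),
    }

    def pixels(name):
        """Total pixel count for a named resolution."""
        w, h = RESOLUTIONS[name]
        return w * h

    for name in RESOLUTIONS:
        ratio = pixels(name) / pixels("1080p")
        print(f"{name}: {pixels(name):,} pixels ({ratio:.2f}x 1080p)")
    ```

    So a GPU that holds 60 fps at 1080p has to push roughly 1.8x the pixels at 1440p but 4x at 4k, which is why a 1080p rule of thumb scales to 1440p much more gracefully than to 4k.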

    Something like a GeForce GTX 1060 is enough to drive an HDR display for gaming, but a more powerful card with more VRAM gets you higher frame rates and settings, with 6-8 GB of VRAM being a practical minimum. A good rule of thumb is that if your current video card can play all your games at 1080p and 60 fps, it's good enough for 1440p gaming. The difference is guaranteed to knock your socks off, and it probably marks the end of the TV and monitor wars over the ideal flat-screen replacement for a CRT. Black Friday is the best time to look for sales on such monitors or any expensive electronics; websites will email you all day long with offers of fifty dollars or more off the regular price of a normally $350 monitor.
    Last edited: Apr 10, 2018
  2. wilsjane

    wilsjane Member

    The limiting factor of most digital video lies in the data. All too often, compression thresholds are set purely to cram the film onto a single disk.

    A refresh rate of 100 Hz is good for reducing visible flicker, but going above 100 is pointless when the camera is still shooting 25 images per second. Panasonic started introducing 300 Hz to accommodate the high-speed cameras used to film sports such as tennis, but as yet it has not been recognized as an SMPTE standard.

    Definition above 4k is also pointless when watching material originating from 35mm film, since 4k corresponds to the grain size of the print stock.

    Watching 4k in the cinema is so much better than at home (at the moment) simply because the signal is raw, drawn from several terabytes of data. However, bad setup and maintenance of cinema equipment, coupled with distributors' proposals to supply data via the internet, may change all that for the worse.

    For home use, OLED (organic light emitting diode) technology is superior, but reliability, life expectancy and poor fine control make buying one of these sets a risky investment. Hopefully Sony will soon set the standards.
  3. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    OLEDs are outrageously expensive, while competing technologies such as QLED can, in theory, already be mass-produced much more cheaply. Quantum-dot displays are possibly the long-term future, but the QLED hybrid looks more likely right now and even has a low environmental impact.

    The human eye is incredibly perceptive, and tests with gamers have shown they can tell the difference between even a 144 Hz monitor and one running well over 200 Hz. Things like cars appearing to blur in the background of a movie are an artifact of how few frames the camera captured, but in a video game you don't want the background always blurring like a movie; it can make you dizzy, and make you miss. Hollywood has yet to settle on a new standard for filming, but it always follows the common video format used for TVs and monitors. If filmmakers did not compensate for the limitations of current TVs and monitors, dark scenes would be almost impossible to see, and other problems would arise that would dramatically lower the value of their films. Hollywood is also increasingly moving into VR, which requires higher resolutions.

    I expect the industry to take its time sorting out many things, but HDR10+ looks extremely likely to be adopted as the new standard due to its low cost, and it's the kind of thing they would sort out quickly. I'd say within six years at most they should have the basics sorted out. A nice benefit of HDR is that you can always turn it off if it doesn't look right.
    Last edited: Apr 9, 2018
  4. wilsjane

    wilsjane Member

    The problem with 144 Hz is that it does not line up in whole multiples with the original material, meaning that a variable rate of frame repeat has to be employed in the standards converter. This is often an inexpensive addition with abysmal output quality. I believe 144 is somehow connected to the gaming market, but I am not certain.
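    The divisibility problem is easy to see with a little Python. This sketch just distributes display refreshes across source frames the way a naive frame-repeating converter would (illustrative only; real converters also use motion interpolation rather than bare repeats):

    ```python
    def repeat_pattern(source_fps, display_hz, n_frames=10):
        """How many times a naive converter shows each source frame."""
        shown = 0
        pattern = []
        for i in range(1, n_frames + 1):
            # Cumulative refreshes that should have been emitted
            # by the end of source frame i.
            target = round(i * display_hz / source_fps)
            pattern.append(target - shown)
            shown = target
        return pattern

    print(repeat_pattern(24, 144))  # every frame shown exactly 6 times
    print(repeat_pattern(25, 144))  # uneven mix of 5s and 6s -> judder
    print(repeat_pattern(24, 60))   # the familiar mix of 2s and 3s
    ```

    24 fps film divides cleanly into 144 Hz (six repeats per frame), but 25 and 30 fps TV material does not, which forces exactly the variable frame repeat described above.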

    The reason Panasonic proposed the 300 Hz system was that it can be used as a direct multiple of 25-frame European signals and 30-frame US signals. You need to remember that these two standards originate from the days when cameras derived their time base from the 50 Hz and 60 Hz mains electricity supplies. Hollywood and the rest of the film industry largely used 24 fps for filming, per the SMPTE standard. Originally, showing cinema film on TV raised the pitch of the analogue soundtracks, but digital sound has resolved this problem, now making it possible to show opera and classical music filmed in 35mm on TV.
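    The pitch problem mentioned at the end is simple arithmetic: running 24 fps film at the European 25 fps speeds everything up by 25/24, and analogue sound rises by the same ratio. A rough sketch:

    ```python
    import math

    # Speedup from showing 24 fps film at 25 fps, and the
    # resulting pitch shift in semitones.
    speedup = 25 / 24
    percent_faster = (speedup - 1) * 100
    semitones = 12 * math.log2(speedup)
    print(f"{percent_faster:.2f}% faster, pitch up {semitones:.2f} semitones")
    ```

    That's about 4% faster and roughly 0.7 of a semitone sharper, which is audible on music; digital sound can resample the audio back to the original pitch, as noted above.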
  5. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    All of these standards were adopted as merely the minimum acceptable frame rates; silent films, for example, used 18 fps for many years. Bringing the frame rate up to 24 fps was merely the minimum required for people to stop running around like wind-up toys in films and for most special effects to have any real impact. King Kong looked pretty good at lower frame rates, but not all special effects are limited to claymation and Buster Keaton stunts. Monitors come in all kinds of refresh rates, but 144 Hz is evenly divisible by the 24 fps movies use, and many VR applications require at least 90 fps per eye. To significantly reduce background blur when turning fast, you want at least 90-120 Hz, with 144 Hz being fast enough to knock it down to a minor issue. It's also a convenient multiple for modern multicore processing, for processing matrices, sparse voxel technology, and advanced compression. Ray tracing in particular might benefit a great deal from 144 Hz becoming a widespread standard.

    It might not sound like you'd notice any difference, but even just moving a mouse cursor around the screen, the refresh rate makes a significant difference in how much you have to concentrate on what you are doing. By increasing the resolution as well, you can double the amount of real estate on the desktop without making it any harder to navigate. Displays are becoming cheaper and larger, but somewhere around 30" is about as big a monitor as you can easily use sitting at a desk without having to move your head around all the time, and increasing the frame rate and resolution lets you squeeze more content onto a single display without driving the operator nuts trying to position a mouse cursor.

    Theoretically, there is no reason a display can't have a refresh rate of over a million frames per second, but that technology is still too expensive for the average consumer. It's all about what's cheap to do with current LCD technology; an ideal monitor might have a frame rate in the millions, 8k resolution, and 7-10x the brightness of current monitors, but that technology does not exist, and 8k would cripple any gaming rig on the planet. Again, it's not necessary either, since HDR makes 1440p almost as good as 4k, which is almost as good as 8k. The additional increase in frame rates is especially useful for VR, which can easily make users throw up when they move their head, because neither the 24 fps Hollywood has used for so long nor the 60 Hz of monitors and TVs is enough for a good experience.

    The industry doesn't legislate such standards, and individual manufacturers merely adopt whatever looks likely to become the next standard and is cheap to produce. If we were still using CRT television monitors for everything, it would be a non-issue, but that technology is not as cheap or consumer friendly as current flat screens. Of course, the idea of carrying an old fashioned CRT around in your pocket for a cellphone is absurd.

    Although people tend to think of acoustics as a long-since-mastered science, the truth is we are only now beginning to master it. Even modern surround-sound systems with seven or more speakers do not reproduce sound with anything remotely like realism, though manufacturers are making progress on new standards and cheap technology for 3D sound. For example, merely by adding another speaker on top of a bookshelf speaker at a 45-degree angle, it's possible to mix sound in mid-air right next to your ear, as if a character on the screen were whispering to you. The ideal dirt-cheap monitor and sound system has yet to be invented, but for around ten to twenty thousand dollars you can approximate one.
    Last edited: Apr 9, 2018
  6. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    Like I said, this technology is new and will require a few years to work out all the bugs. Early adopters know: ya takes your chances! In this case, the reviewer got ahold of a defective LG display, but it was Samsung and its partners that developed the HDR10+ standard and the QLED screens, and LG obviously has yet to do its homework. Usually LG is more on top of things, but this is a complicated standard that requires contextual metadata from the source material. However, the reviewer gives a good account of the variety of limitations and issues with the technology, and I would have to agree it's not quite ready for prime time. Early adopters like to take more risks, but that's usually because they know what they are doing, and this technology will only start to mature around the time it comes down to more competitive prices, perhaps in six years or less.
  7. wilsjane

    wilsjane Member

    Sound has been my specialty since the late 1960s, and in my opinion it has been downhill all the way since.
    Back then, a handful of films were recorded using 5 stage channels plus surround for 70mm film presentation. The first 70mm films ran at 30 fps and were striped with 6 magnetic tracks. The results were out of this world, relying on the acoustics of the theater building and horn-loaded low-frequency speakers coupled with multicell HF horns and drivers. The LF speakers were ribbon-wound with DC-induced magnets. We crossed over at 500 Hz, half-section (12 dB per octave).
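    For readers unfamiliar with crossover slopes: 12 dB per octave means a second-order filter, whose response falls by about 12 dB each time the frequency doubles past the crossover point. A rough sketch of the idea, using a textbook second-order Butterworth response rather than the actual passive LC half-section networks used back then:

    ```python
    import math

    def lowpass_db(f, fc=500.0, order=2):
        """Butterworth low-pass response in dB at frequency f."""
        mag = 1.0 / math.sqrt(1.0 + (f / fc) ** (2 * order))
        return 20 * math.log10(mag)

    # -3 dB at the 500 Hz crossover, then ~12 dB more per octave.
    for f in (500, 1000, 2000, 4000):
        print(f"{f:5d} Hz: {lowpass_db(f):6.1f} dB")
    ```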
    Oklahoma! was the finest recording ever made.
    The first drop in quality happened soon after, when 70mm-to-35mm laboratory optical internegative systems were developed, saving having to shoot the film twice for suburban release. This unfortunately involved running the 70mm camera at 24 fps, lowering the frequency response and signal-to-noise ratio of the soundtrack.
    A few years later, Kodak developed fine-grain negative stock, allowing cameras to run 35mm negative and reversing the process into an optical 35mm-to-70mm blow-up. However, the 'Todd-AO' and 'Super Panavision 70' lens systems were lost. The films had to be shot using a 2.35:1 anamorphic lens (CinemaScope) to accommodate the 70mm screen ratio. All this severely impacted the picture quality, but the sound remained unaltered.
    Next came 4 magnetic tracks on 35mm, but these were so narrow that tracking them was a nightmare, and the background noise from the lower film speed meant gating the effects speakers and filtering off the gating signals. The problems, the cost of the prints, and the unwillingness of cinema owners to install the sound equipment and replace all the projector sprockets sounded the death knell after about 4 films were made. It was subsequently used for a few West End presentations.

    Then Dolby Laboratories got involved. I could spend all night telling you how the sound went from bad to worse as one gimmick after another was added, each adding phase shift and intermodulation distortion, until by the early 1980s people were walking out of the cinema because they could hear the roar of the spaceships, but on more artistic films they could not understand half the dialogue.
    The cinema industry was on its knees, but no one had the sense to realize why.

    Early digital sound on film was the next disaster: to avoid a complete refit of every cinema, the tracks were printed between the perforations, and since readers had to be cobbled onto the projection equipment in all sorts of different places, the sound led the picture and was synchronized by delay processors. A complete shambolic mess.

    Digital film projection has opened a window of opportunity to the industry, but all the current signs are that it will fail completely.
  8. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    The problem with cameras, sound, and communications in general is that we have not been able to construct the metamaterials needed to capture and reproduce them cheaply. For example, it's only in recent years that physicists have constructed the first "perfect" microphone: wallpaper containing microscopic pores that capture the entire sound wave and store it as a single standing wave within the whole wall. You just can't get better fidelity than capturing all the sound the human ear can hear without adding noise, but it required their special wallpaper.

    In the near future, what's coming are camera lenses so flat that a cellphone camera can act like a telescopic lens or a high-powered microscope. These are the kinds of metamaterials required to capture fidelity inexpensively, with it even being possible to construct an entire sound system, complete with acoustic circuitry and amplifiers, that doesn't introduce noise. Theoretically, you can do the same with light and preserve an entire scene with absolute fidelity as far as the human eye and ear are concerned. Lightfields are still a bit in the future, but advances are coming rapidly, and the technology to do all these things should be mature within two decades at most. According to many people today, we are poised to step into your wildest Star Trek fantasies and beyond.

    A computer or game console could theoretically use all optics for the picture and all acoustic circuitry for the sound, reducing noise and power requirements to almost nothing. It's just not cheap to make right now, even if we had all the technology, yet it would be at least a million times more powerful than any desktop computer while requiring only a single-core processor with a few thousand optical transistors. No graphics card or fancy motherboard or anything, just a chip that draws perhaps 3 W and that you can plug a six-foot optical fiber into. The display can use the same technology to reproduce the original content with absolute fidelity, as if you were merely looking through a window.
    Last edited: Apr 9, 2018
  9. wilsjane

    wilsjane Member

    This may interest you.
    Few people have any idea what is involved in running a 70mm film.
  10. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    Yeah, I deliberately designed my workstation to look like something out of a physics laboratory. The NSA already has a satellite in orbit with a gigapixel camera capable of reading every gum wrapper on Manhattan island simultaneously in one picture. Theoretically, you could make wallpaper like that, or post-it notes. The guys working these machines are pioneers.
  11. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    This is an expert review of the newest Samsung QLED HDR monitor, including a lot of gaming footage and a detailed explanation and artistic impression of the display's strengths and weaknesses, beginning around minute 24. The game he uses is Modern Warfare WWII, and even without seeing the full texture resolution in the video, you can easily tell how much better HDR can be for rendering AAA titles. While watching, I could not help but wonder how much better the game would look with ray-traced lighting, which is coming to a console and HDR monitor or TV near you, soon.
