The Black Hole......

OLEDs, but this goes for other modern TVs I have seen as well, seem to suffer from an abundance of picture processing, mainly over-sharpening and over-aggressive noise reduction. Why they need this when transmissions are digital beats me. Turning these features off, or down to the lowest possible setting, helps a lot to get a more realistic picture.
 
Member
Joined 2004
Paid Member
The image processing is really essential once you learn how much info is lost on the encode side. Broadcast TV bandwidth is 20 Mb/s. An uncompressed 1080i signal is about 5 Gb/s. In other words, only about 1/250 of the original data can get through. The system reconstructs the rest with varying degrees of success. It's really obvious on moving images, especially a zoom (you will never look at a TV the same way after this). There has been huge competition among video processor makers to get the best results. 4K just makes it all harder to do.
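To put rough numbers on that, here is a back-of-the-envelope sketch. The frame size, frame rate, bit depths and chroma formats below are illustrative assumptions (the exact raw figure depends heavily on how you count), not broadcast-spec values:

```python
# Back-of-the-envelope: raw HD bitrate vs. a ~20 Mb/s over-the-air channel.
def raw_bitrate_bps(width, height, frames_per_sec, bits_per_pixel):
    """Raw (uncompressed) video bitrate in bits per second."""
    return width * height * frames_per_sec * bits_per_pixel

broadcast_bps = 20e6  # ~20 Mb/s broadcast channel

# Illustrative assumptions for 1080-line video:
for label, fps, bpp in [("1080i, 8-bit 4:2:0", 30, 12),
                        ("1080i, 10-bit 4:2:2", 30, 20),
                        ("1080p60, 10-bit RGB", 60, 30)]:
    raw = raw_bitrate_bps(1920, 1080, fps, bpp)
    print(f"{label}: {raw / 1e9:.2f} Gb/s, compression needed {raw / broadcast_bps:.0f}:1")
```

Whichever assumptions you pick, well over 95% of the raw data has to be thrown away, which is why the set's processor has so much reconstruction work to do.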

I gather that in Hollywood 1080p is the preferred production format, as much for the production tool chains as anything else. Scaling to 4K or 8K is not difficult and few would ever see the difference. Further, production design and the cinematographer may not want that hard-edged, ultra-resolution look anyway if it doesn't move the story forward.
 
Interesting how you read things but with friends like those....

The technical issue was simply that higher resolution can be both good and bad. Do you really want to see the warts?

This is what you said, what did you mean?

HDTV, sporting event cheerleaders, higher resolution of very skimpy costumes revealing issues that had not been noted before!
 
The image processing is really essential once you learn how much info is lost on the encode side. Broadcast TV bandwidth is 20 Mb/s. An uncompressed 1080i signal is about 5 Gb/s. In other words, only about 1/250 of the original data can get through. The system reconstructs the rest with varying degrees of success. It's really obvious on moving images, especially a zoom (you will never look at a TV the same way after this). There has been huge competition among video processor makers to get the best results. 4K just makes it all harder to do.

I gather that in Hollywood 1080p is the preferred production format, as much for the production tool chains as anything else. Scaling to 4K or 8K is not difficult and few would ever see the difference. Further, production design and the cinematographer may not want that hard-edged, ultra-resolution look anyway if it doesn't move the story forward.

I'm pretty sure they are mostly shooting / working in DCI 4K or 2K formats for films. I'm not sure if anyone shoots in raw; I see lots of Apple ProRes or similar codecs from Avid and RED. I don't work in Hollywood though, of course.

The resolution isn't overly consequential once you are talking about compressed video. For a given target bitrate you really don't see a lot of difference, in my experience, as the resolution varies, provided it's not too extreme. Most services seem to use a higher target bitrate for higher resolution settings, so the stream ends up looking better even if that extra resolution is partly wasted.
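One way to see why: at a fixed target bitrate, more pixels just means fewer bits per pixel for the encoder to spend. A quick sketch with made-up but plausible numbers (the 8 Mb/s target is an assumption, not any service's real setting):

```python
# At a fixed target bitrate, higher resolution means fewer bits per pixel.
def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

target_bps = 8e6  # assume an 8 Mb/s stream at 24 fps
for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{label}: {bits_per_pixel(target_bps, w, h, 24):.3f} bits/pixel")
# 1080p: ~0.161 bits/pixel; 4K: ~0.040 -- the same bit budget spread 4x thinner.
```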
 
www.hifisonix.com
Joined 2003
Paid Member
When I worked at the P company, I was surprised - perhaps shocked is a better word - to learn the software that ran in the main digital TV processor was over 2 GB. At one stage there were 600 people working on that stuff in Bangalore. And that was 2012.

A wise decision was made and that business was exited (along with mobile phone chipsets - another money pit) in 2013/4. A few of the other big players in that market left shortly afterwards as well. Last time I looked, a Taiwanese company (MediaTek) and the Chinese TV manufacturers owned that business. LG and Samsung do their own stuff, but few others do.

A mug's game with razor-thin margins.
 
Member
Joined 2014
Paid Member
The technical issue was simply that higher resolution can be both good and bad. Do you really want to see the warts?
News can be unnerving in HD with every pore and bead of sweat visible.


The image processing is really essential once you learn how much info is lost on the encode side. Broadcast TV bandwidth is 20 Mb/s. An uncompressed 1080i signal is about 5 Gb/s. In other words, only about 1/250 of the original data can get through.
It's odd, but when they started HD channels over the air, comparing them to the SD channels showed a significant difference. A couple of years on and it's hard to tell. Not sure if they have improved the SD codec or just gotten lazy. I do know that fire and explosions block up horribly.


And of course putting on a Blu-ray shows how much is thrown away.
 
Member
Joined 2016
Paid Member
Mind you, with a 50 inch screen ten feet away, the differences are hard to spot - though motion artefacts show up on streamed stuff when it gets throttled, HD, UHD or whatever it is...
Still, once they roll out 5G we'll all have downloads in the GB/s range :rofl:

But back in 2020, plenty of people have sub-10 Mb/s links to their exchange, and the backhaul is totally overloaded... I'm 3 km down the wire and get a steady 17 Mb/s VDSL2 connection, but BT frequently throttle that a lot as the pipe to the rural exchange is too thin.

OLEDs I'm told are bad at burn-in. Some TVs have code that moves the top-left channel ID logo around to stop it burning in....
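That logo-shifting trick is usually called pixel orbiting. A minimal sketch of the idea in Python, with a made-up offset schedule and interval (real sets bury this in the panel firmware, and their actual schedules are unknown to me):

```python
# Sketch of "pixel orbiting": periodically nudge a static overlay (e.g. a
# channel logo) by a pixel so no subpixel holds a fixed image for hours.
# The offset pattern and interval below are made-up illustrative values.
ORBIT_OFFSETS = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1),
                 (-1, 0), (-1, -1), (0, -1), (1, -1)]
ORBIT_INTERVAL_S = 180  # shift every few minutes; too slow to notice

def logo_position(base_x, base_y, elapsed_s):
    """Return where to draw the logo after elapsed_s seconds of screen time."""
    step = int(elapsed_s // ORBIT_INTERVAL_S) % len(ORBIT_OFFSETS)
    dx, dy = ORBIT_OFFSETS[step]
    return base_x + dx, base_y + dy

# Example: a logo anchored at (40, 30) wanders over the first hour.
for t in range(0, 3600, 900):
    print(t, logo_position(40, 30, t))
```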
 
...OLEDs I'm told are bad at burn-in. Some TVs have code that moves the top-left channel ID logo around to stop it burning in....

That is not my experience: on a pure white screen I am seeing zero burn-in after 18 months of use...but I am just using the OLED to watch TV, not as a computer monitor.

I really dislike LCDs with their angle-dependent picture quality and low contrast ratio, and could never get myself to buy one as a TV. From 1994 until 2017 I used a Sony 32" CRT which, although not HD, had a great picture. When I first saw OLED I knew I was going to buy one, and I gave that >150 pound CRT set to a friend; it is still going strong in his basement family room...what a beast! I can't say anything bad about last-gen Trinitron reliability.

I like the contrast ratio and angle-independence of emissive displays much better; indeed I disliked LCD PC monitors so much that I used a Sony GDM-FW900 (24", 16:10, 2304x1440, $$$ CRT) for years, until replacement CRTs were no longer available (it still works and is in my basement if anyone wants one with slightly low emission). When new they were unbeatable. I am currently using a new wide-angle 27" IPS-panel Dell as a PC monitor and it is OK for PC use, for movies not so much....I am looking forward to an emissive display for computer use once they make the phosphors/filters more long-lived. Whatever happened to the promising micro-CRT concept?

Other than that I have no opinion on displays...;)
Howie
 
You got me curious, so I did the calibration and some caps testing:
  • 2uF (2.048uF & D=0.001 as per my 0.7% Megger LCR): 2.043u, 1.3R
  • 100nF 1%: 100.6nF
  • 10nF 1%: 9981/9937/9944pF
  • 1nF 1%: 1001/995/996pF
On low cap values, the last 2 digits change a bit when the test is repeated, but still within 1%.
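A quick sanity check that those repeated low-value readings really do stay inside the parts' 1% tolerance (nominal values and readings copied from the list above):

```python
# Check the repeated readings against the capacitors' 1% tolerance.
# Nominal values and readings (in pF) are the ones quoted above.
readings = {
    10_000: [9981, 9937, 9944],  # 10nF 1% part
    1_000:  [1001, 995, 996],    # 1nF 1% part
}
for nominal_pf, measured in readings.items():
    for m in measured:
        dev_pct = 100 * (m - nominal_pf) / nominal_pf
        ok = "OK" if abs(dev_pct) <= 1.0 else "out of tolerance"
        print(f"nominal {nominal_pf} pF, read {m} pF: {dev_pct:+.2f}% {ok}")
```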

I'm blown away!!!

As announced, I replaced the six 680R and 470K resistors with 0.1%, 25 ppm versions.
A 0.1% 47K resistor now measures 47k11 between pins 1-2 and also between pins 1-3, which gave different results before.
All the caps that I tested came in easily within 0.5%.
A simple change with a big effect on accuracy.

Hans
 
That is not my experience: on a pure white screen I am seeing zero burn-in after 18 months of use...but I am just using the OLED to watch TV, not as a computer monitor.

I really dislike LCDs with their angle-dependent picture quality and low contrast ratio, and could never get myself to buy one as a TV. From 1994 until 2017 I used a Sony 32" CRT which, although not HD, had a great picture. When I first saw OLED I knew I was going to buy one, and I gave that >150 pound CRT set to a friend; it is still going strong in his basement family room...what a beast! I can't say anything bad about last-gen Trinitron reliability.

I like the contrast ratio and angle-independence of emissive displays much better; indeed I disliked LCD PC monitors so much that I used a Sony GDM-FW900 (24", 16:10, 2304x1440, $$$ CRT) for years, until replacement CRTs were no longer available (it still works and is in my basement if anyone wants one with slightly low emission). When new they were unbeatable. I am currently using a new wide-angle 27" IPS-panel Dell as a PC monitor and it is OK for PC use, for movies not so much....I am looking forward to an emissive display for computer use once they make the phosphors/filters more long-lived. Whatever happened to the promising micro-CRT concept?

Other than that I have no opinion on displays...;)
Howie

LG's TV panels are the only OLEDs that actually have near-perfect viewing angles. All the direct RGB panels I have seen suffer from microcavity-effect-related color shift, even the iPhone X and up and every new Samsung Galaxy. It manifests itself as a blue or yellow shift, usually when you tilt the device a bit.

(PDF) Analysis and optimization on the angular color shift of RGB OLED displays

I returned a Google Pixel 2 XL because of the extreme blue shift. My iPhone XS Max has some yellow shift at subtle angles, and I find it more offensive than the off-axis brightness loss you get with a good IPS LCD like those in the older LCD iPhones (8 Plus, etc. - not the XR, which is a cheap and inferior LCD). I guess it's preferable to the color shift you get with a VA LCD or the gamma shift with a TN panel.
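For the curious, the mechanism can be caricatured with an ideal Fabry-Perot cavity, whose resonance shifts as lambda(theta) = lambda0 * cos(theta_cavity). The index and on-axis wavelength below are illustrative assumptions; real OLED stacks are far more complicated, and whether you see blue or yellow depends on how the three subpixel cavities shift relative to each other:

```python
# Toy model of microcavity color shift: an ideal Fabry-Perot resonance moves
# as lambda(theta) = lambda0 * cos(theta_c), with theta_c from Snell's law.
# Index and on-axis wavelength are illustrative assumptions only.
import math

N_CAVITY = 1.8       # assumed effective index of the organic stack
LAMBDA_0 = 620e-9    # assumed on-axis resonance (red subpixel), metres

def resonance_m(theta_air_deg):
    theta_c = math.asin(math.sin(math.radians(theta_air_deg)) / N_CAVITY)
    return LAMBDA_0 * math.cos(theta_c)

for angle in (0, 15, 30, 45):
    shift_nm = (resonance_m(angle) - LAMBDA_0) * 1e9
    print(f"{angle:2d} deg off-axis: {shift_nm:+5.1f} nm")
# Each subpixel blue-shifts by a different amount off-axis, so the white
# point drifts - seen as the blue or yellow tint described above.
```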
 
LG's TV panels are the only OLEDs that actually have near-perfect viewing angles. All the direct RGB panels I have seen suffer from microcavity-effect-related color shift, even the iPhone X and up and every new Samsung Galaxy. It manifests itself as a blue or yellow shift, usually when you tilt the device a bit.

(PDF) Analysis and optimization on the angular color shift of RGB OLED displays

I returned a Google Pixel 2 XL because of the extreme blue shift. My iPhone XS Max has some yellow shift at subtle angles, and I find it more offensive than the off-axis brightness loss you get with a good IPS LCD like those in the older LCD iPhones (8 Plus, etc. - not the XR, which is a cheap and inferior LCD). I guess it's preferable to the color shift you get with a VA LCD or the gamma shift with a TN panel.


Thank you for explaining why I did not buy that Samsung tablet with the Super AMOLED display. I found out in the shop that there were thin blue lines on top of every black area. I didn't notice it on the Samsung phones, though.
 
Thank you for explaining why I did not buy that Samsung tablet with the Super AMOLED display. I found out in the shop that there were thin blue lines on top of every black area. I didn't notice it on the Samsung phones, though.

That particular issue sounds more like the PenTile subpixel matrix causing artifacts rather than the effect described above (which causes off-axis color shift), but I'm not sure. You can most easily see the color shift on white screens. PenTile artifacts tend to be less visible as the pixel density increases. It's also been improved over time, I think.

Subpixel rendering without color distortions for diamond-shaped PenTile displays - IEEE Conference Publication
 
As announced, I replaced the six 680R and 470K resistors with 0.1%, 25 ppm versions.
A 0.1% 47K resistor now measures 47k11 between pins 1-2 and also between pins 1-3, which gave different results before.
All the caps that I tested came in easily within 0.5%.
A simple change with a big effect on accuracy.

Hans

More tests, my Megger LCR131 (0.7%) against my Mega328 tester in stock form with the original 681 & 474K resistors, precision unknown (Megger reading / Mega328 reading / difference):
  • 1uF MKP: 0.9712uF / 0.9712uF / 0%
  • 100nF MKP: 97.88nF / 97.88nF / 0%
  • 10nF 1% PS: 10.032nF / 10.02nF / -0.12%
  • 1nF 1% PS: 996.8 pF / 1001pF / 0.42%
For the last two tests, the residual capacitance of the Megger was nulled out to 0.1pF; the Mega328 was untouched.
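For what it's worth, the difference column is just (Mega328 - Megger) / Megger; a few lines of Python reproduce it from the readings above:

```python
# Reproduce the difference column above: (Mega328 - Megger) / Megger, in %.
pairs = [
    ("1uF MKP",    0.9712e-6, 0.9712e-6),
    ("100nF MKP",  97.88e-9,  97.88e-9),
    ("10nF 1% PS", 10.032e-9, 10.02e-9),
    ("1nF 1% PS",  996.8e-12, 1001e-12),
]
for label, megger, mega328 in pairs:
    print(f"{label}: {100 * (mega328 - megger) / megger:+.2f}%")
# Prints +0.00%, +0.00%, -0.12%, +0.42% -- matching the list above.
```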

In stock form, this $10 tester delivers way more than expected.
Thanks for the resistor data, but I think I'll leave mine alone.
 
Hysteresis Distortion

I just read an interesting paper by Bruno Putzeys titled Hysteresis Distortion:
This Thing We Have About Hysteresis Distortion - PURIFI

The good news for Class D enthusiasts is that, thanks to faster FETs, switching frequencies will soon be high enough to use reasonably sized air-core coils in the output filters and avoid this.
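To put a rough number on why the switching frequency matters: for a half-bridge with bipolar PWM idling at 50% duty, the peak-to-peak ripple current is dI = Vbus / (4 * L * fsw), so the inductance needed for a given ripple scales as 1/fsw. A sketch with illustrative values (the bus voltage, ripple target and frequencies are my assumptions, not from the article):

```python
# Class D output inductor sizing: at idle (50% duty, bipolar PWM) a half
# bridge sees peak-to-peak ripple dI = Vbus / (4 * L * fsw), so
# L = Vbus / (4 * dI * fsw). All values below are illustrative assumptions.
V_BUS = 50.0       # supply rail, volts
RIPPLE_PP = 2.0    # allowed ripple current, amps peak-to-peak

def inductance_needed(f_sw_hz):
    return V_BUS / (4 * RIPPLE_PP * f_sw_hz)

for f_sw in (400e3, 2e6, 10e6):
    print(f"fsw = {f_sw / 1e6:4.1f} MHz -> L = {inductance_needed(f_sw) * 1e6:5.2f} uH")
# ~15.6 uH at 400 kHz (needs a ferrite core), ~0.63 uH at 10 MHz, where a
# small air-core coil - with no core hysteresis - becomes practical.
```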

Hey, at least it's about audio...
Howie
 
I just read an interesting paper by Bruno Putzeys titled Hysteresis Distortion:
This Thing We Have About Hysteresis Distortion - PURIFI
I'm reminded of the Wiegand effect:
Wiegand effect - Wikipedia
I had heard of this effect a couple of decades ago, but what I just learned from the article is:
The Wiegand effect is a macroscopic extension of the Barkhausen effect, as the special treatment of the Wiegand wire causes the wire to act macroscopically as a single large magnetic domain.

The good news for Class D enthusiasts is that, thanks to faster FETs, switching frequencies will soon be high enough to use reasonably sized air-core coils in the output filters and avoid this.

Hey, at least it's about audio...
Howie
The currents as well as the switching frequencies are pretty high, resulting in higher magnetic fields. I can imagine such coils would need to be shielded to keep them from inducing unwanted voltages into other parts of the circuit.