It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see and thinks they might benefit from some of the features has to either systematically try every possible combination of options or teach themselves video engineering and try to figure out for themselves what each one actually does.
This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.
The fact that I have to turn on closed captioning to understand anything tells me these producers have no idea what we want and shouldn’t be telling us what settings to use.
One problem is that the people mixing the audio already know what is being said:
Top-down processing (or, more specifically, top-down auditory perception).
This refers to perception being driven by prior knowledge, expectations, and context rather than purely by sensory input. When you already know the dialog, your brain projects that knowledge onto the sound and experiences it as “clear.”
TV shows seem to have changed completely in the streaming age.
These days they really are just super long movies with glacial pacing to keep users subscribed.
You know when something doesn't annoy you until someone points it out?
It's so obvious in hindsight. With shows like The Big Bang Theory, House, and Scrubs I very rarely caught two episodes consecutively (and when I did, they were on some release schedule, so you'd forgotten half the plot by the next week). But they are all practically self-contained, with only the thread of a longer-term narrative woven between them.
It's doubtful you could catch one random episode of any of these Netflix series and feel comfortable that you understand what's going on. Perhaps worse is the recent trend for mini-series, which are almost exactly what you describe: just a film without half of it being left on the cutting room floor.
It's a different medium, and it's intentional. And not even new either. The Singing Detective, Karaoke and Cold Lazarus did the same thing decades ago. Apparently they were successful enough that everybody does it now.
I'm fine with this. I always wished regular movies were much longer. I wish the Lord of the Rings movies included all the songs and poems and characters from the book and lasted like 7 hours each.
As opposed to the House model where every episode is exactly the same with some superficial differences?
I like the long movie format; there are lots of good shows to watch. Movies feel too short to properly tell a story: just a few highlights hastily shown and then it's over.
There's been a lot of speculation/rationalisation around this already, but one I've not seen mentioned is the possibility of it being at least a little down to a kind of "don't look back" collective arrogance (in addition to real technical challenges)
(This may also apply to the "everything's too dark" issue which gets attributed to HDR vs. SDR)
Up until fairly recently both of these professions were pretty small, tight-knit, and learnt (at least partially) from previous generations in a kind of apprentice capacity
Now we have vocational schools - which likely do a great job surfacing a bunch of stuff which was obscure, but miss some of the historical learning and "tricks of the trade"
You come out with a bunch of skills but less experience, and then are thrust into the machine and have to churn out work (often with no senior mentorship)
So you get the meme version of the craft: hone the skills of maximising loudness, impact, ear candy.. flashy stuff without substance
...and a massive overuse of the Wilhelm Scream :) [^1]
[^1]: once an in joke for sound people, and kind of a game to obscure its presence. Now it's common knowledge and used everywhere, a wink to the audience rather than a secret wink to other engineers.
Honestly, what I don't get is how this even happened: it's been, I think, 10 years with no progress on getting the volume of things to even out, even with all the fancy software we have. I would've thought that 5.1 should be relatively easy to normalize, since the center speech channel is a big obvious "the audience _really_ needs to hear this" channel that should be easy to amplify in any downmix... Instead, watching anything is still just riding the damn volume button.
I toyed with the idea of making some kind of app for this but while it may work on desktop it seems less viable for smart tvs which is what I primarily use.
Though I have switched to mostly using Plex, so maybe I could look into doing something there.
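If anyone does build such a tool, the core of it is basically a dialog-forward downmix. Here's a rough sketch of the idea using ffmpeg's pan filter from Python; the gain values are just guesses for illustration, not official downmix coefficients, the filenames are placeholders, and it assumes ffmpeg is on your PATH:

```python
import subprocess

# Downmix 5.1 to stereo with the centre (dialog) channel pushed forward.
# The gains below are illustrative guesses, not a broadcast standard, and the
# LFE channel is simply dropped. Assumes a "5.1" (back) layout; a 5.1(side)
# source would use SL/SR instead of BL/BR.
CENTER_GAIN = 1.6
FRONT_GAIN = 0.6
SURROUND_GAIN = 0.4

PAN = (
    f"pan=stereo|"
    f"FL={CENTER_GAIN}*FC+{FRONT_GAIN}*FL+{SURROUND_GAIN}*BL|"
    f"FR={CENTER_GAIN}*FC+{FRONT_GAIN}*FR+{SURROUND_GAIN}*BR"
)

def dialog_boost_downmix(src: str, dst: str) -> None:
    """Copy the video stream and re-encode the audio as a dialog-forward stereo track."""
    subprocess.run(
        ["ffmpeg", "-i", src, "-c:v", "copy", "-af", PAN, "-c:a", "aac", dst],
        check=True,
    )

# Placeholder filenames, just to show usage.
dialog_boost_downmix("episode.mkv", "episode_dialog_boost.mkv")
```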
Perhaps a mixing issue on your end? Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want. Unfortunately I think there is variability in hardware (and software players) in how to down-mix, which sometimes results in background music in the surround channels drowning out the dialog in the centre channel.
> Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want
Are you talking about the center channel on an X.1 setup or something else? My Denon AVR certainly doesn't have a dedicated setting for dialog, but I can turn up the center channel which yields variable results for improved audio clarity. Note that DVDs and Blurays from 10+ years ago are easily intelligible without any of this futzing.
It's reasonable for the 5.1 mix to have louder atmosphere and be more dependent on directionality for the viewer to pick the dialog out of the center channel. However, all media should also be supplying a stereo mix where the dialog is appropriately boosted.
It's an issue even in theaters and is the main reason I prefer to watch new releases at home on DVD (Dune I saw in the theater, Dune 2 I watched at home.)
The sound mixing does seem to have gotten much worse over time.
But also, people in old movies often enunciated very clearly as a stylistic choice. The Transatlantic accent—sounds a bit unnatural but you can follow the plot.
I "upgraded" from a 10 year old 1080p Vizio to a 4K LG and the sound is the worst part of the experience. It was very basic and consistent with our old TV but now it's all over the place. It's now a mangled mess of audio that's hard to understand.
I have the same sound issues with a lot of stuff. My current theory at this point is that TVs have gotten bigger and we're further away from them, but the speakers have stayed kinda shitty... while things are being mixed by people using headphones or otherwise good sound equipment.
It's very funny how, when watching a movie on my MacBook Pro, it's better for me to just use HDMI for the video to my TV but keep using my MBP speakers for the audio, since those speakers are just much better.
If anything I'd say speakers have only gotten shittier as screens have thinned out. And it used to be fairly common for people to have dedicated speakers, but not anymore.
Just anecdotally, I can tell speaker tech has progressed slowly. Stepping into a car from 20 years ago, the sound is... pretty good, actually.
I agree that speaker tech has progressed slowly, but cars from 20 years ago? Most car audio systems from every era have sounded kinda mediocre at best.
IMO, half the issue with audio is that stereo systems used to be a kind of status symbol, and you used to see more tower speakers or big cabinets at friends' houses. We had good speakers 20 years ago and good speakers today, but sound bars aren't good.
A high end amp+speaker system from 50 years ago will still sound good. The tradeoffs back then were size, price, and power consumption. Same as now.
Lower spec speakers have become good enough, and DSP has improved to the point that tiny speakers can now output mediocre/acceptable sound. The effect of this is that the midrange market is kind of gone, replaced with neat but still worse products such as soundbars (for AV use) or even portable speakers instead of hi-fi systems.
On the high end, I think amplified multi-way speakers with active crossovers are much more common now thanks to advances in Class-D amplifiers.
I can't find the source anymore, but I think I read that there's even a kind of small conspiracy in TV streaming: you turn your speakers up for the movie, and then when the advertisement break arrives you hear the ads even louder.
Officially it's just that they switch to a better encoding for the ads (like MPEG-2 to MPEG-4 for DVB), but unofficially it's for the money, as always...
I feel like the Occam's Razor explanation would be that the way TVs are advertised makes it really easy to understand picture quality and far less so to understand audio. In stores, they'll be next to a bunch of others playing the same thing, such that really only visual differences will stand out. The specs that will stand out online will be things like the resolution, brightness, color accuracy, etc.
> I can't find the source anymore, but I think I read that there's even a kind of small conspiracy in TV streaming: you turn your speakers up for the movie, and then when the advertisement break arrives you hear the ads even louder.
It's not just that. It's an obsession with "cinematic" mixing, where dialogue is not only quieter than it could be (so that explosions and other effects can be much louder), but also not far enough above the background effects.
This all works in a cinema, where you have good quality speakers playing much louder than most people have at home.
But at home you just end up with muddled dialogue that's too quiet.
I think the issue is dynamic range rather than a minor conspiracy.
Film makers want to preserve dynamic range so they can render sounds both subtle and with a lot of punch, preserving detail, whereas ads just want to be heard as much as possible.
Ads will compress sound so it sounds uniform, colorless and as clear and loud as possible for a given volume.
Which is just another drama that should not be on consumers' shoulders.
Every time I visit friends with newer TV than mine I am floored by how bad their speakers are. Even the same brand and price-range. Plus the "AI sound" settings (often on by default) are really bad.
I'd love to swap my old TV as it shows its age, but spending a lot of money on a new one that can't play a show correctly is ridiculous.
Soundbars are a good option, but spend some time reading reviews as there is a huge gap between the cheaper ones and good quality that will actually make a difference.
My brother has 2 of the Apple speakers in stereo mode and they sound pretty good IMO.
Conspiracy theory... TVs have bad sound so you're compelled to buy a soundbar for $$$.
I've certainly had the experience of hard to hear dialog but I think (could be wrong) that that's only really happened with listening through the TV speakers. Since I live in an apartment, 99% of the time I'm listening with headphones and haven't noticed that issue in a long time.
I don't think the bad sound is necessarily deliberate; it's more a casualty of TVs becoming so very thin that there's not enough room for a decent cavity inside.
I had a 720p Sony Bravia from around 2006 and it was chunky. It had nice large drivers and a big resonance chamber, it absolutely did not need a sound bar and was very capable of filling a room on its own.
Soundbars are usually a marginal improvement and the main selling point is the compact size, IMO. I would only get a soundbar if I was really constrained on space.
Engineering tradeoffs: when you make speakers smaller, you have to sacrifice something else. This applies to both soundbars and the built-in speakers.
Like all conspiracy theories, this seems rooted in a severe lack of education. How exactly do you expect a thin tiny strip to produce any sort of good sound? It's basic physics. It's impossible for a modern tv to produce good sound in any capacity.
It's easier to believe in conspiracy than do a few minutes of research to discover that you need a good quality sound system to have good quality sound.
I had the same thing with Severance (last show I watched, I don't watch many) but I'm deaf, so thought it was just that. Seemed like every other line of dialogue was actually a whisper, though. Is this how things are now?
Our TV's sound is garbage and I was forced to buy a soundbar; I got a Sonos one. Night mode seems to compress the soundtrack: loud bits are quieter and quiet bits are louder.
Voice boost makes the dialogue louder.
Everyone in the house loves these two settings and can tell when they are off.
Using some cheap studio monitors for my center channel helped quite a bit. It ain't perfect, I still use CC for many things, but the flat mid channel response does help with speech.
This is probably the sound settings on your TV. Turn off Clear Voice or the equivalent; disable Smart Surround, which ignores 2.0 streams and badly downmixes 5.1 streams; and finally, check your speaker config on the TV - they're often set to Showroom by default, which kills voice but boosts music and sfx, and there should also be options for wall proximity, which do matter, and will make the sound a muddy mess if set incorrectly.
For an interesting example that goes in the opposite direction, I've noticed that big YouTube creators like MrBeast optimize their audio to sound as clear as possible on smartphone speakers, but if you listen to their content with headphones it's rather atrocious.
There's a thing called 'hidden hearing loss' in which the ability to parse midband sounds, specifically in complex/noisy situations, degrades. This is missed by standard tests, which only look for the ability to hear a given frequency in otherwise silent conditions.
One big cause of this is playing a multi-channel audio track when all you have is stereo speakers. All of the dialog that should be going into the center speaker just fades away; when you actually have a center channel, the dialog usually isn't anywhere near as quiet.
Depending on what you're using there could be settings like stereo downmix or voice boost that can help. Or see if the media you're watching lets you pick a stereo track instead of 5.1
We've been mixing vocals and voices in stereo since forever and that was never a problem for clarity. The whole point of the center channel is to avoid the phantom center channel collapse that happens on stereo content when listening off center. It is purely an imaging problem, not a clarity one.
Also, in consumer setups with a center channel speaker it is rather common for it to have a botched speaker design and be of a much poorer quality than the front speakers, and to actually have a deleterious effect on dialog clarity.
Are there any creators who have evolved and shoot at high frame rates, eliminating the need for motion interpolation and its artifacts, or is the grip of the bad old film culture still too strong? (There are at least some 48fps films.)
Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens, which will sometimes double a frame, sometimes triple it, creating uneven motion. Viewing a well shot film with perfect, expressive motion blur on a proper film screen is surprisingly smooth.
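You can see the uneven cadence with a bit of arithmetic; this is a rough sketch that assumes the panel just repeats frames, with no interpolation:

```python
# 24 fps on a 60 Hz panel: each source frame occupies either 2 or 3 panel
# refreshes (3:2 pulldown), so on-screen durations alternate between ~33 ms
# and 50 ms instead of a steady ~42 ms -- that's the judder.
def repeats_per_frame(source_fps: int, panel_hz: int, n_frames: int) -> list[int]:
    """How many panel refreshes each source frame is held for."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        target = (i * panel_hz) // source_fps   # refreshes elapsed by frame i
        counts.append(target - shown)
        shown = target
    return counts

print(repeats_per_frame(24, 60, 8))   # [2, 3, 2, 3, 2, 3, 2, 3]  uneven cadence
print(repeats_per_frame(24, 120, 8))  # [5, 5, 5, 5, 5, 5, 5, 5]  even, no judder
```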
The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.
Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by the technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.
Yes! The other happy accident of movies that contribute to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.
I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.
And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.
> Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens
That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.
For me 24 fps is usually just fine, but then if I find myself tracking something with my eyes that wasn't intended to be tracked, then it can look jumpy/snappy. Like watching fast flowing end credits but instead of following the text, keeping the eyes fixed at some point.
> Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of happiest accidents in the history of Arts.
I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?
You've just learned to associate good films with this shitty framerate. Also, most established filmmakers have yet to learn (and probably never will) how to make stuff look good at high frame rates. It's less forgiving.
It'll probably take the next generation of viewers and directors..
I'm surprised they didn't mention turning off closed captioning, because understanding the dialog is less important than experiencing the creator's intent.
Incidentally, that's the reason I love the photography in Nolan's movies: he seems to love scenes with bright light in which you can actually see what's going on.
Most other movies/series instead are so dark that they make my mid-range TV look like crap. And no, it's not a hardware fault, as 500 nits should be enough to watch a movie.
I could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.
I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4k HDR version and maybe I'll be able to see what it was supposed to look like.
I have some *arrs on my server. Anything that comes from Netflix is bitstarved to death. If the same show is available on virtually any other streaming service, it will be at the very least twice the size.
No other service does this.
And for some reason, the HDR versions of their 1080p content are even more bitstarved than the SDR ones.
YouTube does this. When I open a video the quality is set to Auto by default. It'll also show the "actual" quality next to it, like "Auto 1080p". Complete lie. I see this and see the video looks like 480p, manually change to 1080p and it's instantly much better. The auto quality thing is a flat out lie.
He is absolutely right. The soap opera effect totally ruins the look of most movies. I still use a good old 1080p plasma on default settings. It always looks good.
I watched the most recent Avatar and it was some HDR variant that had this effect turned up. It definitely dampens the experience. There's something about that slightly fuzzed movement that just makes things on screen look better.
Careful what you wish for, or we might get AI-powered "Vibrant Story" filters that reduce 62 minutes of plot-less filler to a 5 minute summary of the only relevant points. Or that try to generate some logic to make the magic in the story make narrative sense.
I just said to a friend that the season 5 writing is so bad that I think AI would have done a better job. I hope someone tries that out once we get the final episode: give an LLM the scripts for the first 4 seasons and the outcome of the finale, and let it have a go at drafting a better season 5.
And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.
Much like a chain of email AI filters that turn short directions into full-fledged emails, that in turn get summarized into short directions on the receiving end.
> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.
To be fair, "vivid" mode on my old Panasonic plasma was actually an impressive option compared to how an LCD would typically implement it. It didn't change the color profile. It mostly changed how much wattage the panel was allowed to consume. Upward of 800w IIRC. I called it "light cannon" mode. In a dark room, it makes black levels look like they have their own gravitational field despite being fairly bright in absolute terms.
I miss my old Panasonic Plasma. I chose to leave it with my old home because of its size and its age. It was rock solid after 10+ years with many cycles to go. Solid gear! Sigh…
I set up my TV (LG OLED CX) with filmmaker mode in all relevant places and turned off a lot of knobs based on HDTVs [1] recommendations. LG definitely has better ways of tuning the picture just right than my old Samsung had. For this TV I had to manually calibrate the settings.
The interesting thing when turning on filmmaker mode is the feeling that the colors are too warm and dark. It goes away once your eyes get used to it, and it then lets the image pop when it's meant to pop, etc.
I also turned off the auto brightness [2] feature that is supposed to guard the panel from burn-in but just fails in prolonged dark scenes like in Netflix's Ozark.
Thanks for the thought but from what I’ve heard from friends I’ll be keeping the final season unwatched just like I did with the last 2 episodes of GoT.
It's been a while - I remember liking the first two seasons. Season three felt a bit silly to me without going into much detail (we need a spoiler text wrapper for HN). Season four has a lot of "zombie-esque" stuff which just doesn't have near the dread horror that the first two seasons did IMHO. Haven't seen any of the final season.
All of the characters are constantly arguing with each other. The story line requires constant suspension of disbelief given the endless succession of improbable events and improbable character behaviors. There are contradictions with earlier episodes and even with details within the same episode. It's really bad. I hope the final episode redeems it but I have my doubts. I want to have an LLM rewrite season 5 and see how much it improves.
It really isn't. I keep seeing comparisons to the last seasons of Game of Thrones, but while there is a dip in quality this season, it is nowhere near as bad as what happened to GoT.
I rewatched it in recent weeks and enjoyed all the bits that I enjoyed years ago during the first watch. The stories I found a bit tedious first time (High Sparrow plotline, Arya and faceless men) weren't as miserable; I think I was expecting them to drag on even more. My biggest grievance on the rewatch was just how poorly it's all tied up. I again enjoyed The Long Night through the lens of 'spectacle over military documentary'. The last season just felt like they wrote themselves into a corner and didn't have time and patience to see it through. By that point, actors were ready to move on, etc.
The TrueMotion stuff drives me crazy. Chalk it up to being raised on movies filmed at 24fps, plus a heavy dose of FPS games (Wolf, Doom, Quake) as a kid, but frame rate interpolation instantly makes it feel less like a movie and more like I’m watching a weird “Let’s Play.”
"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content."
------------------------
Settings that make the image look less like how the material is supposed to look are not "advances".
Q: So why do manufacturers create them?
A: They sell TVs.
Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology. If every manufacturer does just that, then their screens will all look extremely similar.
If your TV looks like everybody else's, how do you get people strolling through an electronics store to say, "Wow! I want that one!"? You add gimmicky settings that make the image look super saturated, bizarrely smooth, free of grain etc.. Settings that make the image look less like the source, but which grab eyes in a store. You make those settings the default too, so that people don't feel ripped off when they take the TV out of the store.
If you take a typical TV set home and don't change the settings from default, you're typically not going to see a very faithful reproduction of the director's vision. You're seeing what somebody thought would make that screen sell well in a store. If you go to the trouble of setting your screen up properly, your initial reaction may be that it looks worse. However, once you get used to it, you'll probably find the resulting image to be more natural, more enjoyable, and easier on the eyes.
>Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology.
That basically isn’t true. Or rather, there are real engineering tradeoffs required to make an actual consumer good that has to be manufactured and sold at some price. And, especially considering that TVs exist at different price points, there are going to be different tradeoffs made.
Yes, there are tradeoffs, but LCD, etc. technology is now sufficiently good that displays in the same general price category tend to look quite similar once calibrated. The differences are much more noticeable when they're using their default "gimmick" settings, and that's by design.
I don't know what kind of a joke you tried here, but I think a vast majority of TV screens can be put in game or PC mode, and all the input lag and stupid picture processing goes away. I run a 43" LG 4K TV as a PC monitor and never have I had a (flat screen) monitor with a faster response rate! My cinema TV is an old FullHD 42" Philips that has laughably bad black levels. I run it also in PC mode but the real beauty of this TV is that without further picture processing it produces nice and cinemalike flat color that is true to the input material that I feed it. Flashy capeshit will be flashy and bright, and a muted period drama will stay muted.
You might want to set up WireGuard on your Pihole device [1], so that you can VPN to it for DNS resolution remotely. It's crazy good. (And it can also be used as a full VPN, if you want to access anything remotely.)
Especially when the "content" is a blatant AI summary:
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision.
Dynamic Contrast = Low is needed on LG TVs to actually enable HDR scene metadata or something weird like that. 60->120hz motion smoothing is also useful on OLEDs to prevent visual judder; you want either that or black frame insertion. I have no idea what Super Resolution actually does, it never seems to do anything.
Also, as a digital video expert I will allow you to leave motion smoothing on.
Noo, motion smoothing is terrible unless you like soap operas and not cinema. Black frame insertion is to lower the pixel persistence even more, which really does nothing for 24fps content, which already has a smooth blur built into the image. The best is setting your TV to 120Hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder.
> Noo, motion smoothing is terrible unless you like soap operas and not cinema
That's what's so good about it. They say turning it off respects the artists or something, but when I read that I think "so I'm supposed to be respecting Harvey Weinstein and John Lasseter?" and it makes me want to leave it on.
> Black frame insertion is to lower the pixel persistence even more, which really does nothing for 24fps content, which already has a smooth blur built into the image
That's not necessarily true unless you know to set it to the right mode for different content each time. There are also some movies without proper motion blur, eg animation.
Or, uh, The Hobbit, which I only saw in theaters so maybe they added it for home release.
> The best is setting your TV to 120Hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder
That's not really a TV mode, it's more about the thing on the other side of the TV I think, but yes you do want that or VFR.
> regular backup of your mail. Google's Takeout service is a straightforward way to achieve this.
Takeout is a horrible way to do regular backups. You have to manually request it, it takes a long time to generate, and the download is manual too... I only use it for monthly full backups.
A much better way to do continuous incremental backups is an IMAP client that locally mirrors incoming emails (Mutt or Thunderbird). It can be configured to store every email in a separate file.
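For anyone who'd rather script it than rely on a mail client, here's a rough sketch of that mirror-to-one-file-per-email idea using Python's standard imaplib. The host, credentials and folder are placeholders (for Gmail you'd point it at imap.gmail.com with an app password), and a real version would want more folders and error handling:

```python
import imaplib
import pathlib

# One-way mirror: every message in INBOX is saved as its own .eml file,
# keyed by IMAP UID so re-running only downloads new mail.
HOST, USER, PASSWORD = "imap.example.com", "me@example.com", "app-password"  # placeholders
DEST = pathlib.Path("mail-backup")
DEST.mkdir(exist_ok=True)

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)
    _, data = imap.uid("search", None, "ALL")
    for uid in data[0].split():
        out = DEST / f"{uid.decode()}.eml"
        if out.exists():                      # already mirrored on a previous run
            continue
        _, msg_data = imap.uid("fetch", uid, "(RFC822)")
        out.write_bytes(msg_data[0][1])       # raw RFC822 message bytes
```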
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content.
I know I'm pretty unsophisticated when it comes to stuff like art, but I've never been able to appreciate takes like this. If I'm watching something on my own time from the comfort of my home, I don't really care about what the filmmaker thinks if it's different than what I want to see. Maybe he's just trying to speak to the people who do care about seeing his exact vision, but his phrasing is so exaggerated in how negatively he sees these settings that it makes it seem like he genuinely thinks what he's saying applies universally. Honestly, I'd have a pretty similar opinion even for art outside of my home. If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them. It doesn't really seem like you're doing a good job as an artist if you have to give people instructions on how to look at it.
The tone might be a miss, but I enjoy having access to information on the intended experience, for my own curiosity, to better understand the creative process and intentions of the artist, and to have the option to tweak my approach if I feel like I'm missing something other people aren't.
I hear you, artists (and fans) are frequently overly dogmatic on how their work should be consumed but, well, that strikes me as part-and-parcel of the instinct that drives them to sink hundreds or thousands of hours into developing a niche skill that lets them express an idea by creating something beautiful for the rest of us to enjoy. If they didn't care so much about getting it right, the work would probably be less polished and less compelling, so I'm happy to let them be a bit irritating since they dedicated their life to making something nice for me and the rest of us, even if it was for themselves.
Up to you whether or not this applies to this or any other particular creator, but it feels appropriate to me for artists to be annoying about how their work should be enjoyed in the same way it's appropriate for programmers to be annoying about how software should be developed and used: everyone's necessarily more passionate and opinionated about their domain and their work, that's why they're better at it than me even if individual opinions aren't universally strictly right!
> If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them.
That's arguably a thing, due to centuries of aged and yellowed varnish.
You can watch whatever you want however you want, but it's entirely reasonable for the creator of art to give tips on how to view it the way it was intended. If you'd prefer that it look like a hybrid-cartoon Teletubby episode, then I say go for it.
To me it's not about art. It's about this setting making the production quality of a billion dollar movie look like a cardboard SNL set.
When walking past a high end TV I've honestly confused a billion dollar movie for a teen weekend project, due to this. It's only when I think "hang on, how is Famous Actor in this?" that I realise: oh, this is a Marvel movie?
To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".
This is not high art. It's... well... the soap opera effect.
If films were shot at a decent enough frame rate, people wouldn't feel the need to try to fix it. And snobs can have a setting that skips every other frame.
Similar is the case for sound and (to a much lesser extent) colour.
Viewers need to be able to see and hear in comfort.
It is perfectly understandable that the people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.
Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.
Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.
From what I’ve read, you want to make sure that the setting is spelled FILMMAKER MODE (in all caps) with a (TM) symbol, since that means that the body who popularized the setting has approved whatever the manufacturer does when you turn that on (so if there’s a setting called “Cinephile Mode” that could mean anything).
With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.
At first I thought it was about turning off settings that allow me to watch garbage TV shows (or, in this case, garbage ending seasons of initially decent TV shows).
Just because someone has different taste doesn't make it bad taste. Books have lower resolution still, and they evoke far greater imaginative leaps. For me, the magic lies in what is not shown; it helps aid the suspension of disbelief by requiring your imagination to do more work filling in the gaps.
I'm an avid video game player, and while FPS and sports-adjacent games demand high framerates, I'm perfectly happy turning my render rates down to 40Hz or 30Hz on many games simply to conserve power. I generally prefer my own brain's antialiasing, I guess.
It is a well-known description for what each brand calls something different. As I wait in a physiotherapist office I am being subjected to a soap opera against my will. Many will have seen snippets of The Bold and the Beautiful without watching a single episode, but enough to know that it looks 'different'.
That choice was made long before Coppola made The Godfather, and so it was for virtually every other movie made over the past century.
Real artists understand the limits of the medium they're working in and shape their creations to exist within it. So even if there was no artistic or technical merit in the choice to standardize on 24 FPS, the choice to standardize on 24 FPS shaped the media that came after it. Which means it's gained merit it didn't have when it was initially standardized upon.
Real high framerate is one thing, but the TV setting is faking it with interpolation. There's not really a good reason to do this, it's trickery to deceive you. Recording a video at 60fps is fine, but that's just not what TV and movies do in reality. No one is telling you to watch something at half the intended framerate, just the actual framerate.
I find the rejection of higher frame rates for movies and TV shows to be baffling when people accepted color and sound being introduced which are much bigger changes.
I disliked the effect (of an unfamiliar TV’s postprocessing) without calling it that and without ever having seen a soap opera. What’s your analysis, doc?
It's called the soap opera effect because soap operas were shot on video tape, instead of film, to save money. It wasn't just soap operas, either. Generally, people focus on frame rate, but there are other factors, too, like how video sensors capture light across the spectrum differently than film.
I haven't thought about it or noticed it in nearly two decades.
My eyes 100% adjusted, I like higher frame and refresh rates now
I can't believe the industry just repeated a line about how magical 24fps feels for ages and nobody questioned it, until they magically had enough storage and equipment resources to abandon it. What a coincidence.
My TV is from around 2017 and some of those settings definitely suck on it. I'm curious if they have improved any of them on newer TVs.
Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.
I thought there is such a thing (although some TV sets probably do not have it) as "film maker mode" to do it according to the film maker's intention (although I don't know all of the details, so I don't even know how well it would work). "Dolby Vision Movie Dark" is something that I had not heard of.
(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)
Release your movie in native 120 fps and I'll turn off motion interpolation. Until then, minor flickering artifacts when it fails to resolve motion, or minor haloing around edges of moving objects, are vastly preferable to unwatchable judder that I can't even interpret as motion sometimes.
Every PC gamer knows you need high frame rates for camera movement. It's ridiculous the movie industry is stuck at 24 like it's the stone age, only because of some boomers screaming of some "soap opera" effect they invented in their brains. I'd imagine most Gen Z people don't even know what a "soap opera" is supposed to be, I had to look it up the first time I saw someone say it.
My LG OLED G5 literally provides a better experience than going to the cinema, due to this.
I'm so glad 4k60 is being established as the standard on YouTube, where I watch most of my content now... it's just movies that are inexplicably stuck in the past...
I hope AI tools allow for better fan edits. There's enough of a foundation and source footage to redo the later episodes of Stranger Things ... The Matrix ... etc.
I need to test the new audio demixing model out for fan edits. Separating music, dialog, and sound effects into stems would make continuity much easier. Minor rewrites would be interesting, but considering how badly Tron Ares botched its AI rewrite dubbing, I'm not holding my breath.
I wouldn't be surprised if the free/open voice cloning and lip-synch tools of today are better than whatever "professional" tools they were using however many months/year ago they did that edit.
Yes, I think that this is one place to be very bullish on AI content creation. There are many people with fantastic visions for beautiful stories that they will never be in a position to create the traditional way; oftentimes with better stories than what is actually produced officially.
(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)
Nothing is stopping you right now from buying or finding or creating a catalog of loops and samples that you can use to create your own Artistic Vision[tm]. The technology exists and has existed for decades, no AI required.
i often think about all the music ruined by self obsessed dorks singing soulless middle school poetry, and it's the main application of AI i'm quite excited for
Creative intent refers to the goal of displaying content on a TV precisely as the original director or colorist intended it to be seen in the studio or cinema.
A lot of work is put into this and the fact that many TVs nowadays come with terrible default settings doesn't help.
We have a whole generation who actually prefer the colors all maxed out with motion smoothing etc. turned to 11 but that's like handing the Mona Lisa to some rando down the street to improve it with crayons.
At the end of the day it's disrespectful to the creator and the artwork itself.
Totally agreed. I read somewhere that the only place these features help is sports. They should not be defaults. They make shows and films look like total crap.
Actually, they do not belong anywhere. If you look at the processing pipeline necessary to, for example, shoot and produce modern sporting events in both standard and high dynamic range, the last thing you want is a television that makes its own decisions based on some random setting that a clueless engineer at the manufacturer thought would be cool to have. Companies spend millions of dollars (hundreds of millions in the case of broadcasters) to deliver technically accurate data to televisions.
These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.
As someone who has built multi-camera live broadcast systems and operated them you are 100% correct. There is color correction, image processing, and all the related bits. Each of these units costs many times more and is far more capable with much higher quality (in the right hands) than what is included in even the most high end TV.
They're the equivalent of the pointless DSP audio modes on 90's A/V receivers. Who was ever going to use "Concert Hall", "Jazz Club", or "Rock Concert", with distracting reverb and echo added to ruin the sound?
I think it is helpful to have settings that you can change, although the default settings should probably match those intended by whoever made the movie or TV show that you are watching, according to the specification of the video format. (The same applies to audio, etc.)
This way, you should not need to change them unless you want nonstandard settings for whatever reason.
Yeah, televisions come full of truly destructive settings. I think part of the genesis of this virus is the need for TVs to stand out at the store. Brands and models are displayed side-by-side. The only way to stand out is to push the limits of over-enhancement along every possible axis (resolution, color, motion, etc.).
Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.
And don't get me started on horrible implementations of HDR.
This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.
that article ends with AI slop (perhaps all of it)
"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision."
This article seems to imply that the default settings are the manufacturer-recommended ones for streaming movies - is that bad UX? Should Netflix be able to push recommended settings to your TV?
The problem is it can be subjective. Some people really like the “smooth motion” effect, especially if they never got used to watching 24fps films back in the day. Others, like me, think seeing stuff at higher refresh rates just looks off. It may be a generational thing. Same goes for “vivid color” mode and those crazy high contrast colors. People just like it more.
On the other hand, things that are objective like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera, it’s really nifty. Lots of people comment on how good the picture on my TV looks, it’s just because it’s calibrated. It makes a big difference.
Anyways, while I am on my soap box, one reason I don’t have a Netflix account any more is because you need the highest tier to get 4k/hdr content. Other services like Apple TV and Prime give everyone 4k. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get better picture, when many viewers probably can’t even get 4k/hdr.
The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better. These motion interpolation settings are now ubiquitous and pretty much nobody cares about said effect anymore, which is great, because maybe now we can start having movies above 24FPS.
To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.
Personally, I have no issue watching things that are shot at 60fps (like YouTube videos, even live action) but the motion smoothing on TV shows makes it look off to me.
I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.
The motion smoother also has to guess which parts of the picture to modify. Is the quarterback throwing the ball the important part? The team on the sidelines? The people in the stands? The camera on wires zooming around over the field to get bird’s eye views? When it guesses wrong and enhances the wrong thing, it looks weird.
Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.
Now imagine smoothing synthesizing an extra 59 frames per second. If it only considers the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 positions, then 1 and 2, and so on. Instead of a circle, the tip of the hand would be tracing a dodecagon. That's fine, but it's not how your brain knows clocks are supposed to move.
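To make that concrete, here's a tiny sketch comparing the real arc with what naive two-frame interpolation would invent; the numbers are purely illustrative, not how any actual TV's smoother works:

```python
import math

def tip(angle_deg: float) -> tuple[float, float]:
    """Position of the clock hand's tip, angle measured clockwise from 12 o'clock."""
    a = math.radians(angle_deg)
    return (math.sin(a), math.cos(a))

# True motion: the tip sweeps an arc from the 12 position to the 1 position
# (0 to 30 degrees). A naive smoother only has those two frames, so any
# in-between frames it synthesizes sit on the straight chord between them.
start, end = tip(0), tip(30)
for t in (0.25, 0.5, 0.75):
    arc = tip(30 * t)                                   # where the tip really is
    chord = (start[0] + t * (end[0] - start[0]),
             start[1] + t * (end[1] - start[1]))        # what interpolation invents
    print(f"t={t:.2f}  deviation from the true arc: {math.dist(arc, chord):.3f}")
# Over a full revolution the interpolated path traces twelve chords
# (a dodecagon) instead of a circle.
```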
Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.
Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.
You’d need to actually support your assertion that higher FPS is objectively better, especially higher FPS via motion interpolation which inherently degrades the image by inserting blurry duplicated frames.
People are “used to” high FPS content: Live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc are all at 30-60FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of a lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.
Films rely on 24 fps or, rather, low motion resolution to help suspend disbelief. There are things that the viewer is not meant to see, or at least not to see clearly. Yes, part of that specific framerate is nostalgia and what the audience expects a movie to look like, but it holds a purpose.
Higher frame rates are superior for shooting reality. But for something that is fictional it helps the audience suspend their disbelief.
I think that whole complaint is just "people getting used to how it is". Games are just worse at lower framerates because they are interactive, and because we never had a 24 fps era; games had a lower framerate only if the studio couldn't get them to run better on the given hardware.
With one caveat: in some games that use animation-inspired aesthetics, the animation itself is not smoothed out but basically runs at a lower framerate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get quick reaction to your inputs.
I'm not sure I buy that it helps the audience suspend their disbelief.
If it did horror films would be filmed at higher frame rates for extra scares.
Humans have a long history of suspending disbelief in both oral and written lore. I think that "fps" may be functionally equivalent to the Santa Claus stories: fun for kids, but the adults need to pick up the bill.
Easy... because 24fps has that dream-like feel to it... the second you go past that it starts to look like people on a stage and you lose the illusion... I couldn't watch The Hobbit because of it.
Movies above 24fps won't become a thing; it looks terrible and should be left for documentaries and sports.
> The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better.
But synthesizing these frames ends up with a higher frame rate but with the same shutter angle / motion blur of the original frame rate, which looks off to me. Same reason the shutter angle is adjusted for footage that is intended to be slow motion.
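For anyone unfamiliar with shutter angle: the motion blur baked into each frame is roughly (shutter angle / 360) / fps, so interpolated frames keep the longer 24fps blur even at a synthetic 60fps. A quick back-of-the-envelope sketch (simplified; it ignores electronic-shutter quirks):

```python
# Blur (exposure) time per frame, in milliseconds, for a given shutter angle and fps.
def blur_ms(shutter_angle_deg: float, fps: float) -> float:
    return (shutter_angle_deg / 360.0) / fps * 1000.0

print(blur_ms(180, 24))   # ~20.8 ms of blur baked into each 24 fps frame
print(blur_ms(180, 60))   # ~8.3 ms, what true 60 fps footage with a 180-degree shutter would have
```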
> To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.
"Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.
For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cels and evoking snappier motion than what 24fps could otherwise show.
Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.
When people say “creator’s intent”, it sounds like a flavor. Like how food comes out of the kitchen before you put toppings on it to make it your own.
But vivid mode (et al) literally loses information. When the TV tries to make everything look vibrant, it’s effectively squishing all of the colors into a smaller color space. You may not be able to even tell two distinct objects apart because everything is similarly bright and vibrant.
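As a toy model of that squishing (not what any particular TV actually does, just the general shape of the problem):

```python
# Crude "vivid"-style contrast boost: stretch values away from mid-grey,
# then clip to the displayable range.
def vivid(x: float, gain: float = 1.8) -> float:
    boosted = 0.5 + (x - 0.5) * gain
    return min(1.0, max(0.0, boosted))   # clip to [0, 1]

for x in (0.80, 0.90, 0.95):             # three highlights you could tell apart
    print(f"in={x:.2f}  out={vivid(x):.2f}")
# All three come out as 1.00 -- crushed to the same white, and no later
# setting can bring that distinction back.
```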
Same with audio. The famous “smile” EQ can cause some instruments to disappear, such as woodwinds.
At the end of the day, media is for enjoyment and much of it is subjective, so fine do what you need to do to be happy. But few people would deliberately choose lower resolution (except maybe for nostalgia), which is what a lot of the fancy settings end up doing.
Get a calibration if you can, or use Filmmaker Mode. The latter will make the TV relatively dark, but there’s usually a way to adjust it or copy its settings and then boost the brightness in a Custom mode, which is still a big improvement over default settings from the default mode.
Without even clicking I know he’s talking about motion smoothing.
Went to the in-laws over the holidays and the motion smoothing on the otherwise very nice LG tv was absolutely atrocious.
My sister had her Nintendo Switch connected to it and the worst thing was not the low resolution game on the 4k display - it was the motion smoothing. Absolutely unbearable. Sister was complaining about input lag and it was most definitely caused by the motion smoothing.
I keep my own TV on game mode regardless of the content because otherwise all the extra “features” - which includes more than just motion smoothing - pretty much destroys picture quality universally no matter what I’m watching.
What about not filming the entire show in darkness? Or, I don't know, filming it in a way that it will look ok on modern televisions without having to turn off settings.
> filming it in a way that it will look ok on modern televisions without having to turn off settings.
That's a lost cause. You never know what sort of random crap and filters a clueless consumer may inflict on the final picture. You cannot possibly make it look good on every possible config.
What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.
The average consumer either never knew these settings existed, or played around with them once when they set up their TV and promptly forgot. As someone who often gets to set up/fix setups for aforementioned people, I'd say this is a good reminder.
...no, a lot of their content is clearly filmed and mastered for cinema. Too dark, voices too low or muddy; stuff that would sound and look fine in a dark room with a good, loud sound system, but meh everywhere else.
I'm not even convinced anyone really watches Stranger Things, so I don't see the point. Seems like something people put on as background noise while they are distracted by their phones.
The first seasons were captivating. This last one? I walked out of the room to do some housework, came back 10 minutes later, and asked what happened. The answer was a single sentence.
I was also gradually switching to treating this season as background noise, as it fails to be better than that. It is insultingly bad in places even when consumed this way.
People were clearly watching through at least season 4. That show used songs that nowadays most viewers would consider to be oldies that became hits again after the episodes containing them were released.
For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit again after appearing in season 4.
I see a tonne of “fan” content on the video sites tagged #strangerthings, which is strange since I have that tag blocked. It's almost like it's all paid promotion…
Ironically, the Apple TV Netflix app really wants you to skip the intro - going so far as to mute the intro to offer the "skip" button. You have to hit "back" to get the audio back during the intro.
Not sure why Netflix is destroying the experience themselves here.
Yeah, kiss m'ass. I agree that some of those settings do need to be turned off. When I visit someone and see their TV on soap opera mode, I fight the urge to fix it. Not my house, not my TV, not my problem if they like it that way, and yet, wow, is it ever awful.
But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions? What if it's not perfectly dark in my house, or I want to watch during the day without closing all the blinds?
I tried watching the ending of Game of Thrones without tweaking my TV. I could not physically see what was happening on the screen, other than that a navy blue blob was doing something against a darker grey background, and parts of it seemed to be moving fast if I squinted. I cranked the brightness and contrast for those episodes so that I could actually tell what was going on. It might not have aligned with the director's idea of how I should experience their spectacle, but I can live with that.
Note that I’d also roll my eyes at a musician who told me how to set my equalizer. I’ll set it as I see fit for me, in my living room’s own requirements, thanks.
I agree that the viewer should change the settings if they want different settings than the filmmaker intended, although it also makes sense to have an option (not mandatory) to use the settings the filmmaker intended (if these settings are known) in case you do not want to specify your own. (The same would apply to audio, web pages, etc.)
Sure. I’m all for having that as an option, or even the default. That’s a good starting place for most people. I think what I most object to is the pretentiousness I read into the quote:
> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.
I’m interested in trying the filmmaker’s intent, like I’ll try the chef’s dinner before adding salt because it’ll probably be wonderful. But if I think the meal still needs salt, or my TV needs more brightness or contrast, I’ll add it. And even if the filmmaker or chef thinks I’m ruining their masterpiece, if I like it better that way, that’s how I’ll enjoy it.
And I’m very serious about the accessibility bit. My vision is great, but I need more contrast now than I did when I was 20. Maybe me turning up the brightness and contrast, or adding salt, lets me perceive the vision or taste the meal the same way as the director or chef does.
100% agree. I’ve tried multiple times to use the cinema modes on my TVs, the ones that are supposed to be “as the director intended”, but in the end they’re always too dark and I find things hard to see. It turns out I just subjectively like the look of movies better on normal (or, gasp, sometimes vivid if it’s really bright in the room) than in the “proper” cinema mode. I don’t really care what the creator thinks; it looks better to me, so it’s better for me.
> What if I need higher contrast to make out what's happening on the screen?
The point you make isn't incorrect at all. I would say that TV's should ship without any such enhancements enabled. The user should then be able to configure it as they wish.
Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.
Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.
Going back to TV's: They should not ship with spyware, log-ware, behavioral tracking and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.
> I would say that TV's should ship without any such enhancements enabled.
I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but that. They shouldn’t be in “stand out among others on the showroom floor” mode; they should be set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.
(This may also apply to the "everything's too dark" issue which gets attributed to HDR vs. SDR)
Up until fairly recently both of these professions were pretty small, tight-knit, and learnt (at least partially) from previous generations in a kind of apprentice capacity
Now we have vocational schools - which likely do a great job surfacing a bunch of stuff which was obscure, but miss some of the historical learning and "tricks of the trade"
You come out with a bunch of skills but less experience, and then are thrust into the machine and have to churn out work (often with no senior mentorship)
So you get the meme version of the craft: hone the skills of maximising loudness, impact, ear candy.. flashy stuff without substance
...and a massive overuse of the Wilhelm Scream :) [^1]
[^1]: once an in joke for sound people, and kind of a game to obscure its presence. Now it's common knowledge and used everywhere, a wink to the audience rather than a secret wink to other engineers.
https://en.wikipedia.org/wiki/Wilhelm_scream
EDIT: egads, typing on a phone makes it far too easy to accidentally write a wall of text - sorry!
doesn't seem like anyone outside the audience thinks it's a serious problem (?)
Though I have switched to mostly using Plex, so maybe I could look into doing something there.
Doesn't solve for single units but could help with people who use soundbars or amps
Abandoned though - was basically just multiband compression and couldn't find a way to make it adaptable enough (some media always ended up sucking)
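For anyone curious, the general shape of that kind of multiband approach looks roughly like the sketch below (not the poster's actual code; the band edges, thresholds, ratios, and function names are all placeholder guesses, and numpy/scipy are assumed). It band-splits the signal, applies a crude static gain reduction per band, and sums the bands back together:

    import numpy as np
    from scipy.signal import butter, sosfilt

    def band_compress(x, sr, lo, hi, threshold_db=-30.0, ratio=3.0):
        """Band-pass x (float mono samples in [-1, 1]), then apply a crude
        static compressor to that band."""
        sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
        band = sosfilt(sos, x)
        # Envelope follower: rectify + smooth over roughly a 10 ms window.
        win = max(1, int(0.010 * sr))
        env = np.convolve(np.abs(band), np.ones(win) / win, mode="same")
        env_db = 20 * np.log10(np.maximum(env, 1e-9))
        # Reduce gain only above the threshold, by (1 - 1/ratio) per dB over.
        over = np.maximum(env_db - threshold_db, 0.0)
        gain_db = -over * (1.0 - 1.0 / ratio)
        return band * (10 ** (gain_db / 20.0))

    def multiband_compress(x, sr):
        # Arbitrary 3-band split; speech intelligibility mostly lives ~1-4 kHz.
        bands = [(40, 300), (300, 4000), (4000, min(16000, sr / 2 - 1))]
        # Squash the low/high bands harder, leave the speech band more dynamic.
        settings = [(-25.0, 4.0), (-30.0, 2.0), (-25.0, 4.0)]
        out = np.zeros_like(x)
        for (lo, hi), (thr, ratio) in zip(bands, settings):
            out += band_compress(x, sr, lo, hi, thr, ratio)
        return out

As noted above, the hard part isn't the compression itself but making it adaptive enough that it doesn't wreck some material.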
Would be super interested to hear what you tried!
Are you talking about the center channel on an X.1 setup or something else? My Denon AVR certainly doesn't have a dedicated setting for dialog, but I can turn up the center channel which yields variable results for improved audio clarity. Note that DVDs and Blurays from 10+ years ago are easily intelligible without any of this futzing.
Then I noticed that native speakers also complain.
Then I started to watch YouTube channels, live TV and old movies, and I found out I could understand almost everything! (depending on the dialect)
When even native speakers can't properly enjoy modern movies and TV shows, you know that something is very wrong...
But also, people in old movies often enunciated very clearly as a stylistic choice. The Transatlantic accent—sounds a bit unnatural but you can follow the plot.
It's very funny how, when watching a movie on my MacBook Pro, it's better for me to just use HDMI for the video to my TV but keep using the MBP speakers for the audio, since those speakers are just much better.
Just anecdotally, I can tell speaker tech has progressed slowly. Stepping into a car from 20 years ago, the sound is... pretty good, actually.
IMO, half the issue with audio is that stereo systems used to be a kind of status symbol, and you used to see more tower speakers or big cabinets at friends' houses. We had good speakers 20 years ago and good speakers today, but sound bars aren't good.
Lower spec speakers have become good enough, and DSP has improved to the point that tiny speakers can now output mediocre/acceptable sound. The effect of this is that the midrange market is kind of gone, replaced with neat but still worse products such as soundbars (for AV use) or even portable speakers instead of hi-fi systems.
On the high end, I think amplified multi-way speakers with active crossovers are much more common now thanks to advances in Class-D amplifiers.
I can't find the source anymore, but I think I read that it was even a kind of small conspiracy in TV streaming: you turn your speakers up for the movie, and then when the advertisement break arrives you hear the ads louder than your movie.
Officially it's just that they switch to a better encoding for the ads (like MPEG-2 to MPEG-4 for DVB), but unofficially it's for the money, as always...
It's not just that. It's the obsession with "cinematic" mixing, where dialogue is not only quieter than it could be, so that explosions and other effects can be much louder, but also not far enough above the background effects.
This all works in a cinema, where you have good quality speakers playing much louder than most people have at home.
But at home you just end up with muddled dialogue that's too quiet.
Film makers want to preserve dynamic range so they can render sounds both subtle and with a lot of punch, preserving detail, whereas ads just want to be heard as much as possible.
Ads will compress sound so it sounds uniform, colorless and as clear and loud as possible for a given volume.
Which is just another drama that should not be on consumers' shoulders.
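To make the loudness point above concrete, here's a toy comparison (not how broadcast processing actually works; the signal and the tanh "compressor" are made up for the demo). Two signals can share the same peak level while the heavily compressed one has a much higher average level, which is what the ear reads as louder:

    import numpy as np

    sr = 48_000
    t = np.arange(sr) / sr

    # "Film-style" signal: quiet dialogue-level tone with one loud transient.
    film = 0.05 * np.sin(2 * np.pi * 220 * t)
    film[24_000:24_050] = 0.95  # brief explosion-like spike

    # "Ad-style" signal: the same material squashed toward the peak ceiling.
    ad = np.tanh(8 * film)             # crude soft clipping / compression
    ad *= 0.95 / np.max(np.abs(ad))    # renormalise to the same peak

    def rms_db(x):
        return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

    print("peaks:", np.max(np.abs(film)), np.max(np.abs(ad)))  # same ceiling
    print("RMS dBFS, film:", round(rms_db(film), 1))           # much quieter
    print("RMS dBFS, ad:  ", round(rms_db(ad), 1))             # much louder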
Every time I visit friends with newer TV than mine I am floored by how bad their speakers are. Even the same brand and price-range. Plus the "AI sound" settings (often on by default) are really bad.
I'd love to swap my old TV as it shows its age, but spending a lot of money on a new one that can't play a show correctly is ridiculous.
They are notorious for bad vocal range audio.
I have a decent surround sound and had no issues at all.
I want to get a 3.0 setup with minimal changes to the equipment.
My brother has 2 of the apple speakers in stereo mode and they sound pretty good imo.
Nothing beats true surround sound though.
I've certainly had the experience of hard to hear dialog but I think (could be wrong) that that's only really happened with listening through the TV speakers. Since I live in an apartment, 99% of the time I'm listening with headphones and haven't noticed that issue in a long time.
I had a 720p Sony Bravia from around 2006 and it was chunky. It had nice large drivers and a big resonance chamber, it absolutely did not need a sound bar and was very capable of filling a room on its own.
Engineering tradeoffs--when you make speakers smaller, you have to sacrifice something else. This applies to both soundbars and the built-in speakers.
Voice boost makes the dialogue louder.
Everyone in the house loves these two settings and can tell when they are off.
Flatscreen TV's have shitty speakers.
https://www.audiology.org/consumers-and-patients/hearing-and...
The sound is mud, we've just become accustomed.
Depending on what you're using there could be settings like stereo downmix or voice boost that can help. Or see if the media you're watching lets you pick a stereo track instead of 5.1
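If you end up doing the downmix yourself (for example in a transcoding step rather than on the TV), the common ITU-style fold-down adds the centre channel into both left and right at roughly -3 dB; raising that coefficient is effectively a "voice boost". A rough sketch, with the function name and the center_gain knob being my own placeholders:

    import numpy as np

    def downmix_51_to_stereo(fl, fr, c, lfe, sl, sr_ch, center_gain=0.707):
        """Fold a 5.1 channel set (numpy arrays) down to stereo.

        0.707 (-3 dB) for centre and surrounds is the common ITU-style
        choice; raising center_gain (e.g. to 1.0) favours dialogue.
        The LFE channel is often dropped in stereo downmixes, as here.
        """
        left = fl + center_gain * c + 0.707 * sl
        right = fr + center_gain * c + 0.707 * sr_ch
        # Normalise only if the sum would clip.
        peak = max(np.max(np.abs(left)), np.max(np.abs(right)), 1.0)
        return left / peak, right / peak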
Also, in consumer setups with a center channel speaker, it is rather common for it to have a botched speaker design, be of much poorer quality than the front speakers, and actually have a deleterious effect on dialog clarity.
Unfortunately settings won't help Season 5 be any better, it verges on being garbage itself, a profound drop in quality compared to previous seasons.
The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.
Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of happiest accidents in the history of Arts.
Yes! The other happy accident of movies that contribute to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.
I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.
And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.
That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.
For me 24 fps is usually just fine, but then if I find myself tracking something with my eyes that wasn't intended to be tracked, then it can look jumpy/snappy. Like watching fast flowing end credits but instead of following the text, keeping the eyes fixed at some point.
> Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.
I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?
You've just learned to associate good films with this shitty framerate. Also, most established film makers have yet to learn (but probably never will) how to make stuff look good on high frames. It's less forgiving.
It'll probably take the next generation of viewers and directors..
Real is good, it’s ergonomic and accessible. Until filmmakers understand that, I’ll have to keep interpolation on at the lowest setting.
Most other movies/series instead are so dark that they make my mid-range TV look like crap. And no, it's not a hardware fault, as 500 nits should be enough to watch a movie.
Could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.
I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4k HDR version and maybe I'll be able to see what it was supposed to look like.
Despite being a subscriber I pirate their shows to get some pixels.
No other service does this.
And for some reason, the HDR versions of their 1080p content are even more bit-starved than the SDR ones.
For us nerds, there is a hidden "stats for nerds" option.
https://blog.sayan.page/netflix-debug-mode/
https://www.smbc-comics.com/comic/summary
And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.
To be fair, "vivid" mode on my old Panasonic plasma was actually an impressive option compared to how an LCD would typically implement it. It didn't change the color profile. It mostly changed how much wattage the panel was allowed to consume. Upward of 800w IIRC. I called it "light cannon" mode. In a dark room, it makes black levels look like they have their own gravitational field despite being fairly bright in absolute terms.
The interesting thing when turning on filmmaker mode is the feeling of too warm and dark colors. It will go away when the eyes get used to it, but it then lets the image pop when it’s meant to pop, etc. I also turned off the auto brightness [2] feature that is supposed to guard the panel from burn-in but just fails in prolonged dark scenes like in Netflix's Ozark.
[1] https://youtu.be/uGFt746TJu0?si=iCOVk3_3FCUAX-ye [2] https://youtu.be/E5qXj-vpX5Q?si=HkGXFQPyo6aN7T72
------------------------
Settings that make the image look less like how the material is supposed to look are not "advances".
Q: So why do manufacturers create them?
A: They sell TV's.
Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology. If every manufacturer does just that, then their screens will all look extremely similar.
If your TV looks like everybody else's, how do you get people strolling through an electronics store to say, "Wow! I want that one!"? You add gimmicky settings that make the image look super saturated, bizarrely smooth, free of grain etc.. Settings that make the image look less like the source, but which grab eyes in a store. You make those settings the default too, so that people don't feel ripped off when they take the TV out of the store.
If you take a typical TV set home and don't change the settings from default, you're typically not going to see a very faithful reproduction of the director's vision. You're seeing what somebody thought would make that screen sell well in a store. If you go to the trouble of setting your screen up properly, your initial reaction may be that it looks worse. However, once you get used to it, you'll probably find the resulting image to be more natural, more enjoyable, and easier on the eyes.
That basically isn’t true. Or rather, there are real engineering tradeoffs required to make an actual consumer good that has to be manufactured and sold at some price. And, especially considering that TVs exist at different price points, there are going to be different tradeoffs made.
Yes, I usually run ad blockers, Pi-hole, etc.; I’m away from home and temporarily without my filters.
[1] https://docs.pi-hole.net/guides/vpn/wireguard/
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision.
You'd think television production would be calibrated for the median watcher's TV settings by now.
Also, as a digital video expert I will allow you to leave motion smoothing on.
That's what's so good about it. They say turning it off respects the artists or something, but when I read that I think "so I'm supposed to be respecting Harvey Weinstein and John Lasseter?" and it makes me want to leave it on.
> black frame insertion is to lower even more the pixel persistence which really does nothing for 24fps content which already has a smooth blur built in to the image
That's not necessarily true unless you know to set it to the right mode for different content each time. There are also some movies without proper motion blur, eg animation.
Or, uh, The Hobbit, which I only saw in theaters so maybe they added it for home release.
> the best is setting your tv to 120hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder
That's not really a TV mode, it's more about the thing on the other side of the TV I think, but yes you do want that or VFR.
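For the record, the arithmetic behind that: 120 is an exact multiple of 24, so every film frame gets the same number of panel refreshes, while 60 is not, so frames have to alternate between 3 and 2 refreshes, which is the 3:2 pulldown judder. A tiny sketch (function name is mine):

    import math

    def repeat_pattern(fps, refresh_hz, n_frames=8):
        """How many display refreshes each source frame occupies."""
        counts, shown = [], 0
        for frame in range(1, n_frames + 1):
            boundary = math.ceil(frame * refresh_hz / fps)
            counts.append(boundary - shown)
            shown = boundary
        return counts

    print(repeat_pattern(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> uneven: judder
    print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even: no judder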
I know I'm pretty unsophisticated when it comes to stuff like art, but I've never been able to appreciate takes like this. If I'm watching something on my own time from the comfort of my home, I don't really care about what the filmmaker thinks if it's different than what I want to see. Maybe he's just trying to speak to the people who do care about seeing his exact vision, but his phrasing is so exaggerated in how negatively he seems to see these settings makes it seem like he genuinely thinks what he's saying applies universally. Honestly, I'd have a pretty similar opinion even for art outside of my home. If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them. It doesn't really seem like you're doing a good job as an artist if you have to give people instructions on how to look at it.
I hear you, artists (and fans) are frequently overly dogmatic on how their work should be consumed but, well, that strikes me as part-and-parcel of the instinct that drives them to sink hundreds or thousands of hours into developing a niche skill that lets them express an idea by creating something beautiful for the rest of us to enjoy. If they didn't care so much about getting it right, the work would probably be less polished and less compelling, so I'm happy to let them be a bit irritating since they dedicated their life to making something nice for me and the rest of us, even if it was for themselves.
Up to you whether or not this applies to this or any other particular creator, but it feels appropriate to me for artists to be annoying about how their work should be enjoyed in the same way it's appropriate for programmers to be annoying about how software should be developed and used: everyone's necessarily more passionate and opinionated about their domain and their work, that's why they're better at it than me even if individual opinions aren't universally strictly right!
That's arguably a thing, due to centuries of aged and yellowed varnish.
You can watch whatever you want however you want, but it's entirely reasonable for the creator of art to give tips on how to view it the way it was intended. If you'd prefer that it look like a hybrid-cartoon Teletubby episode, then I say go for it.
When walking past a high end TV I've honestly confused a billion dollar movie for a teen weekend project, due to this. It's only when I see "hang on, how's Famous Actor in this?" that I see that oh this is a Marvel movie?
To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".
This is not high art. It's... well... the soap opera effect.
Similar is the case for sound and (to a much lesser extent) colour.
Viewers need to be able to see and hear in comfort.
It is perfectly understandable that people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.
Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.
Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.
With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.
I like how it looks because it is a "high quality videogame" effect for me. 60 Hz, 120 Hz, 144 Hz - you only get this on a good videogame setup.
I'm an avid video game player, and while FPS and sports-adjacent games demand high framerates, I'm perfectly happy turning my render rates down to 40Hz or 30Hz on many games simply to conserve power. I generally prefer my own brain's antialiasing, I guess.
He had been told to slow down because 24hz simply could not capture his fast movements.
At 144hz, we would be able to better appreciate his abilities.
Real artists understand the limits of the medium they're working in and shape their creations to exist within it. So even if there was no artistic or technical merit in the choice to standardize on 24 FPS, the choice to standardize on 24 FPS shaped the media that came after it. Which means it's gained merit it didn't have when it was initially standardized upon.
>before Scorsese made The Godfather
Can you let me in on the joke?
...that being said, motion interpolation is an abomination
If you watch at a higher frame rate, the mistakes become obvious rather than melting into the frames. Humans look plastic and fake.
The people that are masters of light and photography make intentional choices for a reason.
You can cook your steak well done if you like, but that's not how you're supposed to eat it.
A steak is not a burger. A movie is not a sports event or video game.
The technical limitations of the past century should not define what constitutes a film.
What next, gonna complain resolution is too high and you can see costume seams ?
The film IS the burger, you said it yourself; it shows off where the movie cheaped out on things. If you want a steak, you need a steak framerate.
They literally had to invent new types of makeup because HD provided more skin detail than was previously available.
It’s why you’ll find a lot of foundation marketed as “HD cream”.
I'm a filmmaker. Yes, it was.
> What next, gonna complain resolution is too high and you can see costume seams ?
Try playing an SNES game on CRT versus with pixel upscaling.
The art direction was chosen for the technology.
https://www.youtube.com/shorts/jh2ssirC1oQ
> The film IS the burger, you said it yourself, it shows off where the movie cheapened on things. If you want a steak you need steak framerate
You don't need 48fps to make a good film. You don't need a big budget either.
If you want to take a piece of art and have it look garish, you do you.
Did you read an interview with the cow’s creator?
I haven't thought about it or noticed it in nearly two decades
My eyes 100% adjusted, I like higher frame and refresh rates now
I can't believe the industry just repeated a line about how magical 24fps feels for ages and nobody questioned it, until they magically had enough storage and equipment resources to abandon it. What a coincidence.
It was 10 stars before it was even released... Are humans still needed at all? Just have LLMs generate crappy content and bots upvote it.
Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.
(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)
Every PC gamer knows you need high frame rates for camera movement. It's ridiculous the movie industry is stuck at 24 like it's the stone age, only because of some boomers screaming of some "soap opera" effect they invented in their brains. I'd imagine most Gen Z people don't even know what a "soap opera" is supposed to be, I had to look it up the first time I saw someone say it.
My LG OLED G5 literally provides a better experience than going to the cinema, due to this.
I'm so glad 4k60 is being established as the standard on YouTube, where I watch most of my content now... it's just movies that are inexplicably stuck in the past...
(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)
I wouldn't call it a "technological advance" to make even the biggest blockbuster look like it was filmed with a 90s camcorder with cardboard sets.
Truemotion and friends are indeed garbage, and I don't understand how people can leave it on.
For those unfamiliar with the term you should watch Vincent Teoh @ HDTVTest:
https://www.youtube.com/hdtvtest
Creative intent refers to the goal of displaying content on a TV precisely as the original director or colorist intended it to be seen in the studio or cinema.
A lot of work is put into this and the fact that many TVs nowadays come with terrible default settings doesn't help.
We have a whole generation who actually prefer the colors all maxed out with motion smoothing etc. turned to 11 but that's like handing the Mona Lisa to some rando down the street to improve it with crayons.
At the end of the day it's disrespectful to the creator and the artwork itself.
These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.
This way, you should not need to change them unless you want nonstandard settings for whatever reason.
Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.
And don't get me started on horrible implementations of HDR.
This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.
"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision."
On the other hand, things that are objective like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera, it’s really nifty. Lots of people comment on how good the picture on my TV looks, it’s just because it’s calibrated. It makes a big difference.
Anyways, while I am on my soap box, one reason I don’t have a Netflix account any more is because you need the highest tier to get 4k/hdr content. Other services like Apple TV and Prime give everyone 4k. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get better picture, when many viewers probably can’t even get 4k/hdr.
No.
I don't care about the "filmmaker's intent", because it is my TV. I will enable whatever settings look best to me.
To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.
I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.
Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.
Now imagine smoothing synthesizing an extra 59 frames per second. If it only considers the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 positions, then 1 and 2, and so on. Instead of a circle, the tip of the hand would be tracing a dodecagon. That’s fine, but it’s not how your brain knows clocks are supposed to move.
Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.
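A small way to see that dodecagon effect with the hypothetical clock above: compare a straight-line interpolation between two adjacent hour marks with where the tip of the hand actually is halfway between them (the helper below is just for illustration):

    import math

    def hand_tip(minutes, radius=1.0):
        """Tip of a clock hand, measured clockwise from 12 o'clock."""
        angle = 2 * math.pi * minutes / 60.0
        return (radius * math.sin(angle), radius * math.cos(angle))

    # True positions at the 12 and 1 o'clock marks (0 and 5 minutes).
    p0, p1 = hand_tip(0), hand_tip(5)

    # Naive halfway interpolation between the two frames: a straight chord.
    interp = ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)

    # Where the tip really is halfway in time (2.5 minutes).
    true = hand_tip(2.5)

    print("interpolated:", interp)                    # cuts inside the circle
    print("true:        ", true)
    print("radius of interpolated point:", math.hypot(*interp))  # < 1.0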
Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.
People are “used to” high FPS content: Live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc. are all at 30-60FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.
Higher frame rates are superior for shooting reality. But for something that is fictional, the lower frame rate helps the audience suspend their disbelief.
With one caveat: in some games that use animation-inspired aesthetics, the animation itself is not smoothed out but basically runs at the lower framerate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get quick reaction to your inputs.
If it did, horror films would be filmed at higher frame rates for extra scares.
Humans have a long history of suspending disbelief in both oral and written lore. I think that 'fps' may be functionally equivalent to the Santa Claus stories: fun for kids, but the adults need to pick up the bill.
movies above 24fps won't become a thing, it looks terrible and should be left for documentaries and sports
But synthesizing these frames gives you a higher frame rate with the same shutter angle / motion blur as the original frame rate, which looks off to me. Same reason the shutter angle is adjusted for footage that is intended to be slow motion.
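For reference, the relationship being described is exposure time = (shutter angle / 360) x (1 / frame rate), so interpolating 24 fps up to 60 fps keeps the original 1/48 s of blur in every frame, whereas a camera actually running at 60 fps with a 180 degree shutter would only blur over 1/120 s. A minimal sketch of that arithmetic:

    def exposure_seconds(fps, shutter_angle_deg=180.0):
        """Per-frame exposure time for a rotary-shutter style camera."""
        return (shutter_angle_deg / 360.0) / fps

    # Native 24 fps at a 180-degree shutter: 1/48 s of motion blur per frame.
    print(exposure_seconds(24))   # ~0.0208 s (1/48)

    # A real 60 fps camera at 180 degrees would blur over only 1/120 s.
    print(exposure_seconds(60))   # ~0.0083 s (1/120)

    # Interpolated "60 fps" still carries the original 1/48 s of blur in
    # every frame, which is why it reads as smeared rather than crisp.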
"Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.
For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cells and evoking snappier motion than what 24fps could otherwise show.
Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.
1: https://www.youtube.com/watch?v=zAPf5fSDGVk
But vivid mode (et al) literally loses information. When the TV tries to make everything look vibrant, it’s effectively squishing all of the colors into a smaller color space. You may not be able to even tell two distinct objects apart because everything is similarly bright and vibrant.
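As a toy illustration of that information loss (this is not how any particular TV's vivid mode is implemented, just a naive saturation boost with clipping): two reds that are clearly different in the source can land on exactly the same value once the boost pushes both against the edge of the range.

    def boost_saturation(rgb, factor=2.5):
        """Naive saturation boost: push each channel away from the pixel's
        own gray level, then clip to the 0-255 range."""
        gray = sum(rgb) / 3.0
        return tuple(
            int(min(255, max(0, gray + factor * (c - gray)))) for c in rgb
        )

    a = (200, 60, 60)   # a red
    b = (230, 40, 40)   # a noticeably different, more intense red
    print(boost_saturation(a))  # (255, 0, 0)
    print(boost_saturation(b))  # (255, 0, 0) - the distinction is gone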
Same with audio. The famous “smile” EQ can cause some instruments to disappear, such as woodwinds.
At the end of the day, media is for enjoyment and much of it is subjective, so fine do what you need to do to be happy. But few people would deliberately choose lower resolution (except maybe for nostalgia), which is what a lot of the fancy settings end up doing.
Get a calibration if you can, or use Filmmaker Mode. The latter will make the TV relatively dark, but there’s usually a way to adjust it or copy its settings and then boost the brightness in a Custom mode, which is still a big improvement over default settings from the default mode.
Went to the in-laws over the holidays and the motion smoothing on the otherwise very nice LG tv was absolutely atrocious.
My sister had her Nintendo Switch connected to it and the worst thing was not the low resolution game on the 4k display - it was the motion smoothing. Absolutely unbearable. My sister was complaining about input lag, and it was most definitely caused by the motion smoothing.
They film for screens, regardless of where those might be.
The equalizer analogy is perfect.
Having said that, there are a lot of bad HDR masters.