Give Me 4K or Nothing
- Published 7 Mar 2023
- Watch the full WAN Show: • We Shattered a Co...
► GET MERCH: lttstore.com
► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
► OUR WAN PODCAST GEAR: lmg.gg/wanset
FOLLOW US ON SOCIAL
---------------------------------------------------
Twitter: linustech
Facebook: LinusTech
Instagram: linustech
TikTok: www.tiktok.com/@linustech
TikTok (LMG Clips): www.tiktok.com/@_lmgclips_
Twitch: www.twitch.tv/linustech
While I'm not a fan of Nvidia's consumer practices, I can't deny that the software they have is fantastic.
NVIDIA's product is almost always good. Doesn't matter what they make. That's why they piss me off so much
@Chrysippus true
@Gabriel M. yeah but if Nvidia can piss off well over half of gamers on a good day, imagine how much they care if all Linux users complain.
Sucks they've limited it to the 30 series and above.
Just get a job now
I’ve enjoyed VSR quite a bit in the past week or so. Of course there are some videos that don’t work well with it… but I agree with Linus that a good source being upscaled tends to look better than YouTube’s native 4K. Nowwwww a garbage source being upscaled isn’t going to fundamentally change it. But I’m sure it’ll improve as time goes on :)
In my opinion, the future of video codec lies in AI upscaling combined with extra data. You would encode your stream at 1080p, and then the AI would handle 90% of the upscaling, with the remaining 10% being perfected by providing hints to the AI. It's worth noting that the size of these hints could vary depending on the AI's performance. Looking even further ahead, I believe we may see an encoder AI and decoder AI that work so well together that they can generate just enough bytes for optimal video reproduction.
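A toy sketch of that "AI does 90%, hints do the rest" idea, purely illustrative: stream a downscaled frame, upscale it on the client, and transmit only a compressed residual (the "hints") to correct what the upscaler missed. Plain bicubic resizing stands in for the AI upscaler here, and `encode`/`decode` are invented names, not part of any real codec.

```python
# Residual "hint" coding sketch. Assumes frame dimensions divisible by scale.
import numpy as np
from PIL import Image

def encode(frame, scale=2):
    """Return the low-res stream plus the residual 'hints'."""
    w, h = frame.size
    lores = frame.resize((w // scale, h // scale), Image.BICUBIC)
    predicted = lores.resize((w, h), Image.BICUBIC)  # what the client will guess
    residual = np.asarray(frame, dtype=np.int16) - np.asarray(predicted, dtype=np.int16)
    return lores, residual  # residual is mostly near-zero, so it compresses well

def decode(lores, residual, scale=2):
    """Client side: upscale, then apply the hints to recover the frame."""
    w, h = lores.size[0] * scale, lores.size[1] * scale
    predicted = np.asarray(lores.resize((w, h), Image.BICUBIC), dtype=np.int16)
    return np.clip(predicted + residual, 0, 255).astype(np.uint8)
```

The better the upscaler's guess, the smaller the hints get, which is exactly the "size of these hints could vary depending on the AI's performance" point above.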
I find VSR works great for animated content like Cartoons and Anime.
Cleans up aliasing on the line art from lower-resolution uploads and sharpens up subtitle text in anime nicely.
@joseph Bryan Asuncion I do plan on buying One Punch Man once the manga wraps up. Want to do it all in one go.
Would really be nice if there was a streaming option that was respectful to the creators. Especially since it's getting harder to get your hands on physical media these days.
@Daniel Cobia if you really want to support the creator, buy their merch, read their work, and follow them on social media at least
@Daniel Cobia Like I said, there are some distributors on YouTube that broadcast anime literally legally. And the reason I say people tend to pirate anime is because of these localizer distributors and licensors like cringeroll, who make disrespectful decisions, like what they did to some anime this season.
@joseph Bryan Asuncion Not having a good way to legally get the content is terrible. I want to reward the creators of anime that I really like.
@Daniel Cobia That's the reason most people pirate. I guess try Bilibili; some anime in the current season can be streamed on YouTube legally.
Me with my 2080 Ti after hearing "40 and 30 series GPUs": "What am I, chopped liver?"
They're gonna port it to the 2000 series, it's just that the 30 and 40 series are priority.
This was exactly why I ended up getting a 3080 12GB card cause they just kept leaving my poor 2080 Super behind. The 20 series is great but just feels like the redheaded stepchild compared to 30 and 40 series...
I feel you. 🙁
1080 gang
I read somewhere that it should be coming to the 20 series later
*OK, FOR THOSE WONDERING HOW TO MAKE THIS WORK:* you only need to download Nvidia's current drivers (make sure it's the Game Ready Driver; it won't work with the Studio one), go into the Nvidia Control Panel, scroll all the way down to "Adjust video image settings", and on the right side check-mark the RTX video enhancement and set quality to 4 if you want 4K. You will also need the latest Chrome/Edge to use this; it won't work with Firefox atm.
Enjoy!
It's supposed to work with every Chromium-based browser; however, I haven't been able to get it to work with Brave yet
The ideas about encoding repeating patterns and moving objects are pretty similar to how compression already works. But I'm sure AI will make it a lot more efficient.
I've moved my Plex watching from the Plex Windows desktop app to the web app to take advantage of the Nvidia 4K upscaling. It works SOOO well for TV shows that were only aired in 1080p!
@Gary Stinten There's definitely still a noticeable quality difference obviously, but it does do a pretty decent job of smoothing things out a bit and cleaning it up. Better with it on than off IMO.
How about anything under 1080p, like, say, pre-HD stuff (counting old VCR rips from the '80s)?
I still don't understand how you guys do a whole 2+ hour stream and still do everything else. I respect the work ethic
@Simon Dowsett projecting much?
@Simon Dowsett
Trash Taste
Ear Biscuits
H3 podcast
The weekly planet
Sorry that you seem incapable of searching YouTube or using Google. That's at least 4 weekly podcasts that are 1-3 hours long, with people who also busy themselves with other time-consuming endeavours while being able to talk for a few hours each week on subjects they have an interest in.
And I don't see why it being live is any more impressive, as edited podcasts usually have more time cut from them, meaning they're actually longer than you experience. Live podcasts also have live chats to use as a conversational crutch.
The optimization topic reminds me a lot of SVG. No image format scales as well as the format that is literally just the math to redraw the image at any resolution. Similarly, I could see a new video format that is designed to draw the video frame by frame rather than a fixed stream of pre-rendered images
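A quick toy of that "video as math" idea in Python, with a made-up shape-list format invented just for this sketch: describe the frame as geometry, and you can rasterize the same frame at any output resolution with zero upscaling artifacts, just like SVG.

```python
# Resolution-independent "frame description" rendered at arbitrary sizes.
from PIL import Image, ImageDraw

SCENE = [  # coordinates are fractions of the frame, in [0, 1]
    ("rect",   (0.0, 0.7, 1.0, 1.0), "darkgreen"),  # ground
    ("circle", (0.8, 0.2, 0.1),      "yellow"),     # sun
]

def render(scene, width, height):
    img = Image.new("RGB", (width, height), "skyblue")
    draw = ImageDraw.Draw(img)
    for kind, geom, color in scene:
        if kind == "rect":
            x0, y0, x1, y1 = geom
            draw.rectangle((x0 * width, y0 * height, x1 * width, y1 * height), fill=color)
        elif kind == "circle":
            cx, cy, r = geom
            draw.ellipse(((cx - r) * width, (cy - r) * height,
                          (cx + r) * width, (cy + r) * height), fill=color)
    return img

render(SCENE, 640, 360)    # "360p"
render(SCENE, 3840, 2160)  # the same frame in 4K, perfectly sharp
```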
Linus' idea sounds like replay files in games. They only save the information about what happened, and the video is generated in-engine, where you can move the camera around freely, so no graphical information is saved in the replay at all. For podcasts that are always in front of the same background, that would work: the viewer just has to download the information about the set and the people once and can reuse it every episode. It wouldn't work for any other kind of video, though, and it would make things harder when, for example on WAN Show, someone shares their screen with the stream.
I'm playing around with it right now. It looks surprisingly good for upscaling lower-res video, but once you go past 1080p resolution, you can see a hint of the familiar DLSS smudginess depending on what's on screen.
This compression thing is a really cool idea for some theatre-kid programming student to tackle as a thesis or research project. I really want to see this go somewhere in the next 5 years.
300 watts of power demand when all you're doing is watching a YouTube video is honestly hilarious lmao. It very well describes the current state of PC hardware.
@billy101456 yes. Needing a PSU that wouldn't fit in the first place is exactly the problem.
@Studio Brock I think Luke's specific problem was it didn't fit in his case and he didn't have the right cord for the PSU he had. And needing more/different cables for an upgrade when you don't remember where your extra modular cables went is always a struggle. Questionable, had it fit, whether there was enough air in there to keep it remotely cool.
Yeah. Honestly I was about to start looking at a used 30 series and then Linus dropped that bomb.
It's really cool but seriously this trend of throw heat and power to the wall needs to stop. You shouldn't need to mount any GPU on top of your case and power it with a separate PSU like Luke is doing with his 7900 XTX. That's disgusting from a tech perspective but from a jank perspective I love it.
Power? Noooooo. We have so much power, we’re just giving it away!
Watching _a slightly more enhanced playback_ of a YouTube video!
There's no need to save on power consumption in this day and age anyway, right?
What's actually going to happen is that new video codecs are going to be created (experimental codecs are already being worked on) that have AI reconstruction built in on the codec level, and the trained neural net is part of the spec - that way everyone has the same decoder and gets the same experience. But the bitrate requirements are an order of magnitude lower.
No need to add metadata to modify an H.264 video, when it's far more efficient to _replace_ H.264 with something completely designed for AI from the ground up.
Nice! I've had this idea for a while when it comes to sci-fi holograms / VR presence: send a super low-quality transmission over and let the other end artificially upscale it and fill in missing details. In most use cases, you don't need the actual details, but the absence of details can be jarring.
Reminds me of the days of midi sound cards where music would sound different depending on what card you had
I use it to upscale shows, don't really care about it for PLclip videos as it's not as important to me. But it's pretty good. I watch a lot of 1080p content on a 1440p screen and that causes some MINOR fuzziness but VSR clears it up pretty well. Only thing is, at quality 4... damn. GPU uses like 310 watts and 30-40% usage. Which is why I actually believe Nvidia when they say the 20 series cards couldn't keep up and need additional development time
Honestly, the jump from 1080p to 1440p alone is such a big difference. I am really happy with 2K; anything beyond that I don't notice unless the display gets bigger than 32 inches.
@Keel3r Okay then. I checked LG's website. Doesn't list 2k anywhere. I checked Acer's website, doesn't list 2k anywhere. It does list 4k UHD though. I just checked Dell's website, it does not list 2k anywhere. They all list QHD or WQHD. Hell, even googling for "insert brand 2k monitors" you don't get any specific results, only for Dell do you get referred to a list of WQHD monitors.
So show me a place where monitor manufacturers refer to 1440p as 2k. Because so far, it seems you're the one who doesn't have a clue.
@MGsubbie "Never", you don't look at them then, so you shouldn't speak on what you have no clue about.
@Keel3r I am aware. It doesn't prove me wrong. And here is how this entire argument is debunked : 3840x2160 is not actual 4k, yet it is constantly referred to as 4k, as it's close enough to 4096x2160. Calling 2560x1440 2k is not consistent. Also, I have never seen a single monitor manufacturer refer to them as 2k. Ever. I only see the label QHD or WQHD used. But I could be mistaken there.
But let's just stay away from 2k and 4k as it's a mess anyway. We can stick to HD, FHD, QHD and UHD. Or just stating the pixel counts.
@MGsubbie You miss the point. 2k monitors at the official resolution, DO NOT EXIST. Every single 2k monitor is 2560x1440p resolution. If 2k monitors were the offical resolution, then you could say 2k monitors are 2048x1080p, but they do not exist. If you buy a 2k monitor expecting 2048x1080p you will not get that, it will be 2560x1440p, therefor a 2k monitor is 2560x1440p. Maybe instead of arguing about it you guys should go look at monitors and when you realize a 2k monitor is 2560x1440p then complain to the companies and tell them not to mislabel. But as long as every single 2k resolution labeled monitor is 2560x1440p, then a 2k resolution monitor is 2560x1440p. Saying otherwise is wrong.
@Keel3r No. Many people refer to it as such, but it's a common mistake. Calling 3840x2160 4k but then 2560x1440 2k makes zero sense. 1920x1080 is equally close to 2k as 3840x2160 is to 4k.
I would love it if we talked more about the good old MPV player. It's basically the same as VSR: people create algorithms, sometimes neural nets, and play/upscale/correct videos with them. It is life-changing.
The problems are that you cannot open DRM content (Twitch and YouTube, live or not, work though) and there is no good integration in the browser.
Better still, you can choose your algorithm depending on your hardware and what you are watching.
The current high tech encodings for video are almost "mocap" already (they move sections of the video as groups of pixels). But yeah, an AI could take that idea even further.
For example the "fuzzy snow" example is good. It'd need the encoder to know how to remove the "snow/confetti" and then tell the AI "Just draw snow/confetti back in please". :P
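For the curious, here's a toy version of that "move sections of the video as groups of pixels" step: the block-matching motion search classic encoders run. This is a rough sketch with invented function names, nothing like a production encoder's optimized sub-pixel, rate-distortion-aware search.

```python
# Naive block-matching motion estimation over two grayscale frames.
import numpy as np

def estimate_motion(prev_frame, curr_frame, block=16, search=8):
    """For each block in curr_frame, find the best-matching block in
    prev_frame within +/- search pixels. Returns per-block (dy, dx) vectors."""
    h, w = curr_frame.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            target = curr_frame[y:y+block, x:x+block].astype(int)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy and yy + block <= h and 0 <= xx and xx + block <= w:
                        cand = prev_frame[yy:yy+block, xx:xx+block].astype(int)
                        sad = np.abs(target - cand).sum()  # sum of absolute differences
                        if sad < best_sad:
                            best, best_sad = (dy, dx), sad
            vectors[by, bx] = best
    return vectors  # the encoder stores these vectors plus a small residual
```

Random confetti defeats this search because no block in the previous frame matches, which is why the "just tell the AI to redraw the confetti" idea is appealing.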
Watching this with 1080p-to-4K upscaling with RTX Super Resolution. It looks so amazing. Only issue is the laptop 3080 I have sits at 30% GPU usage. Makes the laptop noisy.
@Chaitanya Chaphalkar You're right. Depends how cheap your laptop was, but the factory stuff is usually OK for a few years. If you ever do a repaste, get the best stuff you can find, because it's not expensive.
@Houssam Alucad It didn't even hit 70°C... I doubt thermal paste reapplication is gonna help. Just gotta change the fan curves.
@William Morey-Baker just testing it honestly. No other reason. I’ll mess with the fan curves. Thanks.
@ASBESTOS Fibers just to test it out. Don’t tell me that’s pointless.
Consider reapplying the thermal paste, the noise should go down dramatically or you'll get more FPS depending on how the fan curve is configured
Video patterns being upscaled differently like you can play a midi file with different sound fonts is a hilarious idea.
I really liked that question someone asked about what Floatplane's bitrate is, and Luke just talked his way out of the answer. Does someone know what Floatplane's bitrate is?
You would just need to do some of the processing server-side, statically, then send that pre-processed data to the user, who would do some more processing (or none, just turning that data into an upscaled format). Maybe they've tried it and the bandwidth use of sending pre-processed data is too high? Still pretty incredible.
4K TVs have gotten pretty cheap over the past few years, but 4K monitor prices refuse to budge.
I've seen a 27 inch 4K monitor selling for about 200 bucks but it's only 60 Hz. I managed to snag a 4K 144Hz monitor for about 500 bucks during a sale.
Monitors have different requirements. TVs are subsidized by bloatware, don't need as good latency, typically have lower refresh rates, and often don't need to be as color accurate.
So you can get a crap 4k monitor for somewhat cheap now, but a good 4k monitor will probably be more expensive than a good 4k TV.
Also, monitors probably don't benefit from as much economies of scale due to there being far more possible ranges of form factors and sizes. TVs these days don't have curves, for example. And static elements on a TV don't really exist for the most part, so there's less chance of burn-in.
The tvs are subsidized by ads
I just want Nvidia to release a new shield with updated 4K upscaling
I use it; it works on everything from anime to live streams of video games. I wish there was a quick way to toggle it, though, because when the GPU becomes busy with a game or otherwise, it bogs down.
I wonder if Nvidia can use frame generation to turn 30 fps video into 60 fps or higher?
Works pretty well on older 360p/480p VHS type video that's been uploaded to YT, not perfect but still does a pretty decent job at increasing the quality and removing artifacts.
I absolutely love my 1440P w HDR. It's going to be a while before I have a real need to upgrade to 4k
Maybe in the future, the video description will contain an URL for the AI upscaler to retrieve a dataset to upscale the video in the best way, a few high-res layers for the AI to work in between or something. And if big tech has their way, there might even be an ad layer to dump ads into parts of the video image.
A URL
They are explaining how video compression already works. They already encode to move objects and patterns etc.
When doing content creation (not on this channel, so don't look for it; I don't advertise), 4K recording would be way too expensive, as the file sizes are massive and I don't like to delete my footage. This is good news.
Imagine if servers could send neural network fragments to users that are fine tuned for their particular content. Or imagine if there was some kind of hashing so they could borrow fragments you've already downloaded. I could imagine the server and client being like "Draw a giraffe here" - 'What's a giraffe?' - "It's like a horse but longer" - 'okay done'
@deiminator2 I'm imagining something more like a textual inversion or a LoRA. Just trying to get the concept across without getting bogged down in the details.
Neural network fragment? LMAO you have to be trolling
Nvidia Canvas is pretty cool. I'm writing a fanfic and described an environmental scene. Three moons of different colours, two rising while the third descended over a mountain range. One of my reviewers asked for an image of it. Which obviously didn't exist.
Jumped onto Canvas, painted a sky and a mountain range in a few seconds, cut them apart in a photoshop program and added in the moons. Took me an hour to take something that was an image in my mind and make a version of it other people could see.
It obviously still has serious limitations. I couldn't draw the entire scene via AI, just the 'backdrop', and then had to make and layer the moons manually. But as these tools get stronger, I could probably 'draw' an entire scene like that in Canvas if they add more element options to paint with. The trick then would be finding ways to retain consistency with previously generated designs so that each time you want to make a new one, it doesn't start all over and give you a totally different look in every image.
This and missing HDR support will probably drive me from Firefox to Chrome *grumpy*
Soon we'll be able to turn any film into any kind of quality footage we want. Like, okay, let's take the series Friends and tell it we want it to look like it was filmed on the expensive Hollywood cameras Christopher Nolan uses. Later, we'll be able to have it take an episode and tell it to turn it into a dark film directed by Christopher Nolan. Literally all the pieces are being developed right now, and I thought music was going to be the hard part, but Google's recent diffusion text2music paper showed it will be able to do it a helluva lot better, and we're a lot closer than I expected. The only thing it would need to get right is extreme subtlety in multiple dimensions: writing, acting, cinematography, sound, music, etc. That's going to be a huge hurdle, and until a few months ago I'd have said we were way far away from that. But now I know we're way closer. Of course we'll start simply, but the question will be how much better, and how fast, the improvements will be with the release of each new model.
There was a gap of like 5-6 months between Midjourney version 4 and 5, and they said they expect their next version to be released within 2 months! The "AI can't do hands" meme mockery will be something we barely remember.
I am SO glad Linus mentioned the power draw of RTX VSR. Gamers Nexus did an excellent video but for some reason they literally did not mention power draw whatsoever. I don't care much about power draw, but I don't really want to suck back 330W on my 3080 just to watch YouTube videos when CPU-bound games won't even use that much lmao
I love this feature so far. Wake up one day to an extra thing on my gpu. Not bad
For those who aren't aware, RTX VSR is currently built to run on the 30 series' sparse matrix improvements; this is why it requires work before it'll run on the 20 series.
What you're talking about with AI compression could be done with an autoencoder network. The encode side is a large network that condenses into fewer vectors, while the decode side takes those few vectors and expands back to the original size. The AI is then trained for encode/decode accuracy, and the compression results can be absolutely wild. Toss an AI upscaling network like Nvidia's on top of the output and you could probably compress videos to a tenth of their original size without losing viewing quality. Of course, doing all of this on the fly would require insane computing power.
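As a rough illustration of that bottleneck idea, here's a minimal convolutional autoencoder in PyTorch. The layer sizes are arbitrary picks for the sketch (a 24x squeeze on a 256x256 frame), not anything Nvidia actually ships.

```python
# Minimal per-frame autoencoder: big network, bottleneck in the middle.
import torch
import torch.nn as nn

class FrameAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: squeeze a 3x256x256 frame down to a small latent tensor.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # -> 32x128x128
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # -> 64x64x64
            nn.Conv2d(64, 8, 4, stride=2, padding=1),              # -> 8x32x32 bottleneck
        )
        # Decoder: expand the latent back to the original frame size.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(8, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Train for reconstruction accuracy; afterwards, ship only the encoder
# output (the "few vectors") and run the decoder on the viewer's GPU.
model = FrameAutoencoder()
frame = torch.rand(1, 3, 256, 256)  # stand-in for a real video frame
loss = nn.functional.mse_loss(model(frame), frame)
loss.backward()
```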
Can anyone tell me how (the 4k upscaler) it would affect say VCR rips to digital?? Or even a VCR stream to screen output??
Linus, might be a good video opp
From what I've seen 360p videos get washed out and smudged. So going lower (like to VHS) will probably just become big blobs of color. And that's assuming you can even run it since it only works in Chrome at 360p or higher currently.
8:37 Yeah, that's the main issue with streaming video games, and the MAIN reason to turn particle effects in games all the way down (or off if you can). No matter if you have a 4090, they will still tank your streaming image quality because of that compression at a given bitrate.
7:00 What you're talking about is called an autoencoder: a lossy machine-learning compression algorithm that's basically a big neural network with a bottleneck in the middle. You split the model at the bottleneck; the first half becomes the encoder, and the second half is the decoder.
It has an advantage over regular compression because it's able to leverage the predictability of the content it's compressing.
VSR works "fine" on lower power devices (if you ignore the glaring bugs that make it nearly impossible to use) but if we are talking about power consumption, I think nvidia just doesnt know how to manage higher power GPUS and just tries to throw the entire GPU at it instead of what is needed.
Compression is the biggest killer of quality. 1080p offline often looks better than 4k online
Imagine how this could affect game streaming. As little computation locally as possible regarding what happens in the game, but then heavy computation regarding rendering the streamed image with supporting meta data to be interpreted by the target device...
AMD could easily make a VFSR version of FSR, and I think Intel could make an XeVSS version of their XeSS, for similar results, but ideally also available for rendering videos. And the advantage of FSR is that it can work on all GPUs, including GPUs integrated into CPUs (aka APUs).
"Floatplane would be DRM protected too... well, don't worry about that"
put a pin in that Lafr
God, that looks good. The video is smoother even at native YouTube resolution.
my eyes when luke's hand thru mic's hand stand after he lift it up above to be seen become 3d something out of the adobe cut without final filter is already something and I'm testing this new super resolution
I wish I could get Floatplane, but I can never get my validation email!!!
They dropped upscaling for ages. It used to be a feature of that Nvidia DVD/video player back in the day.
There is already a video chat codec that transmits a still image and motion data of the other person and reconstructs video at the destination.
I hope they expand functionality to the Firefox browser as well
We switched back to the Apple TV for AirPlay, and I forgot how much I relied on the Nvidia Shield's AI upscaling!
Too bad even a 3060 can't handle the lowest upscale setting without dropping the video's framerate to like 40 fps at 95+% usage. I wish they added an option to upscale to 1080p and just remove artifacts instead of pushing for 4K.
I tried it and I thought it looked horrible, especially at level 4. That typical AI-enhanced watercolour effect was all over it, especially on close-ups of faces. Weirdly, I didn't have that problem with the Shield's upscaling.
Is it possible to make a video comparing the Shield upscaling to the new one? Like can you capture the video feed from a Shield directly?
So the NFL confetti cannon now shoots out square cards with 'confetti goes here' QR codes... Cool!
Hmm, I haven't been able to see a difference between VSR on or off. How do you know it's working?
This... Might make me buy a Nvidia GPU again.
I just tried this watching this video, and it didn't look great to me. No matter what setting I used, it made Linus' beard and neck look extremely AI-generated by smoothing them out like crazy.
Me enjoying upscaled 4K videos.
My gpu: 🔥
YouTube 4K can have banding depending on how you shoot it and the bitrate you use. There are YouTubers who do extremely high-quality 4K videos with no banding. This will just allow more low-quality videos.
What will REALLY bake your noodle is how we can produce a 360p (etc) resolution video designed to make the AI upscaling work better, by having the video compressor (essentially) use the same AI in reverse. (By “we” I mean “they” as in video tech wonks. I’m just an investor.)
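A speculative sketch of that compress-in-reverse idea: train a learned downscaler jointly with an upscaler, so the low-res stream is shaped to be maximally recoverable. Everything here is invented for illustration (toy networks, random data), not any shipped pipeline.

```python
# Jointly trained downscaler/upscaler pair (end-to-end reconstruction loss).
import torch
import torch.nn as nn
import torch.nn.functional as F

downscaler = nn.Conv2d(3, 3, 4, stride=4, padding=0)   # 4x smaller "stream"
upscaler = nn.Sequential(                               # 4x bigger again
    nn.ConvTranspose2d(3, 32, 4, stride=4), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
)
opt = torch.optim.Adam([*downscaler.parameters(), *upscaler.parameters()])

for _ in range(100):                   # toy training loop
    hires = torch.rand(8, 3, 64, 64)   # stand-in for real video frames
    lores = downscaler(hires)          # what would actually be streamed
    restored = upscaler(lores)
    loss = F.mse_loss(restored, hires) # both nets optimized end to end
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the downscaler is trained against the upscaler, it learns to preserve exactly the information the upscaler needs, rather than whatever a fixed bicubic filter happens to keep.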
Did he say 300W to watch a damn upscaled video? My jaw fell open, and there's no way in hell I'm enabling that.
I wonder if Kodi can upscale our libraries of videos ? 😊
Opera GX has been doing this for a while now. It's nice.
So does it work on Netflix? If so how will a vendor like Netflix be able to charge for hi-res streams if people can just upscale the low-res streams?
Considering who's behind it, I could see this being used in cloud gaming to improve responsiveness. Basically ship video signal at lower quality to reduce the amount of data sent, then enhance on client machine.
@Clawzz that is a very good point
@fanfanboy The current implementation is insanely computationally intensive (considering the 300W requirement). If they get that requirement down to something more reasonable, it could trickle down to low-spec models. Again, this is just a first attempt at something like this.
@Jceggbert5 While that is a good idea, that's not at all what I'm talking about. I'm saying if you already have the hardware for great graphics, just play the game on your machine; don't stream it and have it upscaled to a varied quality that may or may not be as good. Plus, the people who are streaming games probably have a mediocre PC, so they won't be able to upscale it.
@fanfanboy Instantly start playing after purchase: you stream the game while it downloads in the background.
Since it requires a 30/40 series, I don't think so. What would be the point of using cloud gaming if your hardware is already top of the line?
Okay, when do we get it on actual browsers? (Firefox)
I have been following this, and now that it's here, I see no difference. But high-bitrate 1080p from YouTube IS impressive. The Nvidia thing seems like BS.
Too bad I can't use that in Firefox on Linux
Haven't been impressed by 4k yet, just looks like sharper 1080p with 4x the performance requirements
This feature has been available on Nvidia SHIELD TV devices since November 2019 IIRC. The Shield TV uses an underpowered Tegra X1+ ARM SoC (essentially a Nintendo Switch running at stock clock speed), can anyone explain why the F**K it took Nvidia 3 years to bring the feature to their discrete PC GPUs that are an order of magnitude more powerful than the Tegra X1??? Honestly, I'm baffled they didn't debut this feature on PC and then optimize it for ARM. The Tegra X1+ doesn't use more than 20 watts while gaming so the AI upscaling can't use 300 watts for sure, so why is it so unpolished on PC and why does it seem like the requirements to make it work are a lot more complicated? The Tegra X1 is on Maxwell GPU architecture, and Nvidia can't even promise this will work on Turing?!? It doesn't make any sense!! lol
I've been wanting to say the same as well. And from what I've seen of it (and my own testing), it's not even better than old technology. NGU upscaling is free and looks sharper. Its noise reduction might not compare with Nvidia's, but I haven't really tried it, because noise reduction in general just makes things look blobby and washed out.
LOL, thank you. I was looking for this comment
We free users are locked to 1080p in the 21st century. After years of 4K, now we're missing that feature.
If the power consumption is this high, the first thing has to be to drastically lower the power usage.
If that can be achieved, streaming bandwidth will be heavily reduced, which overall would be good for everyone. The downside is that nobody will see the real source anymore, so a loss of data and reality would come along with that. Which in itself offers room for bad actors as well.
Finally, Linus Duke Nukem in 4K!!!!
I think at some point we will get a Netflix-type subscription where you give an input, from one line to multiple pages, describing the type of content you want to watch and its story, and you get a generated video.
Hopefully they just allow it instead of a subscription. We’re already using our own gpus and power.
What about 4K to 4K, or 1080p to 1080p, to deal with the insane over-compression they do on YT?
@Ari Barra It's YT; EVERYTHING is way over-compressed. :(
It's better to upscale a good-quality 720p video than an over-compressed 1080p one.
Can someone test this out with The Slow Mo Guys' glitter video where they broke YouTube's video compression?
Wait, you can upscale Netflix?
Isn't Netflix 4k a higher price tier?
I wonder how long until upscaling Netflix gets disabled
4:57 a quick 700mV undervolt would fix that :P
Imagine how much bitrate it would save if they stopped waving their arms as they speak.
Unfortunately it's not that simple. Most encoders really struggle with small movements. It's similar to the problem Luke mentioned. If a video is _completely_ still then it's very compressable. But if there's even a tiny bit of movement the bitrate quadruples. I can't understand why they haven't solved this yet considering how much of an effect it has on compression.
Moving your head amounts to as much bitrate as moving your arms and body around, essentially.
I hope this moves to Firefox, too. I am seriously considering moving AWAY from Chrome because of its way-too-dominant position over the web, the way Internet Explorer had, and we NEED an alternative to grow and force the world NOT to comply ONLY with Chrome...
They were basically just describing Adobe Flash
A moat is something that others can't do (CUDA). Everyone can do upscaling. Entertainment systems have been doing upscaling for a decade. The AI upscaling was announced within a week for both Intel and AMD. Personally, I'm sick of upscaling. I want pixel-perfect 4K, because hitting the limitations of technology is how you get better tech. Downgrading your processes to enable more with less is how you get taken as a consumer. Projectors rendering 3-4 1080p frames for 4K. Games with no optimizations relying on AI super sampling to be playable. Coax hitting 1Gb instead of fiber to the home, so you still have shit upload.
I feel like I'm taking crazy pills. I have a 40 series card and updated every possible bit of software on my PC, and I don't notice a single difference on this video at any resolution with VSR on, which I'm watching on a 4K TV.
Most people are saying the same. It barely makes a difference at all. And the hype videos that say "You can immediately see an insane jump in quality" are obviously blind. It's sad that those videos even exist in the first place.
There's barely any difference when you turn it on. So it'll just end up forgotten in a matter of weeks. Which is a shame. But at least better technology already exists. MadVR/NGU upscaling still gives better sharpness and it's been around for many years. I really wanted a successor to it but this ain't it.
I don't need upscaling, I just need YouTube to stop changing my quality settings from 4K down to garbage. Linus says almost no one uses 4K on YouTube; well, maybe it's because YouTube keeps turning it off.
I tried it while watching this video and, in my own opinion, it looks bad. It just makes everything look like a moving painting. As someone stated in another comment, it might work better for animated videos, so I'm gonna have to try that out. But for now I doubt I'll be using it.
Netflix: quality above 480p requires a more premium membership for $6 more a month.
Me: lol jokes on you, I'll just buy a 4090 and save myself the cash over a 40 year period.
Ok guys, time to start uploading WAN clips in 4K. Pretty please.
@jgal It can be a blank image and would still serve the purpose of the WAN Show 95% of the time.
@Ari Barra 1080p looks pretty offensive blown up to a 55” oled
What for? To watch their face wrinkles in more detail?
VSR unfortunately will NOT fix your favorite anime
Linus, the future you are envisioning is here.
Edit:
It's called "Event Based MetaVision" from France if I recall correctly.
Didn't YouTube want to remove 4K from free users? Only for Premium? Hah! Nvidia upscaling, here it comes!
Surely this should be available for older GPUs as well; AI upscaling can't be that demanding for high-end 10 series GPUs. I really have no idea, though, but I'd be happy for my 10 series GPU to be at 90% usage if I'm watching YouTube. I would love to get rid of YouTube's crap compression.
If a Tegra X1 can do it, why aren't the 9, 10, or 20 series capable?
As the owner of a 6700xt, I was initially very jealous of nvidia users when I heard about this feature. But if it doesn't work on Firefox, it's a non-starter for me - I have no desire to switch to Chrome or Edge.
Me, a lifelong Firefox user with a 4090: wat
Then again I'm on Linux so I guess I'm out of luck either way