AV1? That’s a codec, right? I see it in the preferences section for Piped. Is it better than AVC (h.2640)?
It’s also better than H.265/HEVC. Plus, it’s open-source and royalty free.
Sweet!
Better quality per file size than HEVC? cite?
I was curious, so I looked it up. AV1 is more efficient than HEVC by like 28%! On the downside, encoding is horrifically slooow. It’ll be interesting to see how much hardware support AV1 gets in the coming years, because encoding time will have a dramatic effect on its adoption rate.
Interesting to note: AV1 can be played in Kodi, Plex, Emby, Jellyfin, VLC, Chrome, Firefox, Edge, and Opera. So on the software side, it’s pretty widely supported.
As long as your CPU/GPU can handle it
Playback has pretty wide support by now.
Ah, yes, that’s correct, thank you. Your CPU/GPU must support it, or it won’t play.
That’s the definition of every piece of software and media ever.
They didn’t say that. They just said it’s better, but didn’t mention their rationale. But IMHO, it being FOSS makes it about a million times better than HEVC alone.
Yes, AV1 is the next big deal. You can compress the hell out of the video and it still looks close to the original. I’ve re-encoded some of my locally ripped movies for fun to see how it looks, and it’s really impressive.
Presumably you know, but for anyone else: the word for this is “transparent.” It’s when the codec leaves no noticeable artifacts.
Huh, there’s a term for that? TIL
Y’know, there’s a similar one used in the gemstone industry: “eye clean,” which only applies if the stone has no inclusions (artifacts) that can be seen with the naked eye. As you can imagine, it’s usually a pretty desirable trait, especially in diamonds. It doesn’t really matter if there’s random garbage floating around in it, it just has to be undetectable to our human eyes.
Misread as “gaming industry” and was briefly very confused.
How would you use that in a sentence? Like “You can compress the hell out of the video and it’s transparent”?
You’d describe the encoding, not the source. The fun part is that it also applies to audio. “At 256 kbps, MP3 is transparent.”
It only applies to lossy codecs. Lossless codecs, by definition, have no error. “Error” itself being a borrowed term. Good encodings don’t have fewer errors… they have less error. For example, measured as mean squared error, where an individual sample being very wrong counts more than many samples being slightly wrong.
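If you want to see that concretely, here’s a toy sketch (the sample values are made up purely for illustration):

```python
# Toy example: "fewer errors" vs. "less error", measured as MSE.
original     = [100, 100, 100, 100, 100, 100, 100, 100]
slightly_off = [101,  99, 101,  99, 101,  99, 101,  99]  # 8 samples off by 1
one_bad      = [120, 100, 100, 100, 100, 100, 100, 100]  # 1 sample off by 20

def mse(a, b):
    # Mean squared error: squaring makes one big miss outweigh many tiny ones.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

print(mse(original, slightly_off))  # 1.0  -- eight "errors", little error
print(mse(original, one_bad))       # 50.0 -- one "error", much more error
```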
Yeah, from the re-encodes I’ve done, I only noticed artifacts in clouds and the New Line Cinema intro to Lord of the Rings.
I wonder if the Apple Vision Pro is able to play AV1 files 🤔 I guess it would be really bad if not.
Sorry, doesn’t look like it supports AV1.
Oh dear 😂, that’s bad for the longevity of such an expensive device (or would it be doable with a software update?)
Luckily I’m poor and don’t have to think about that 😂 but it would be nice to get this first edition, since it’s probably the most jailbreakable Vision that will ever be made, I guess. Well, unless one ever gets released with a data port or similar.
Usually this kind of stuff is done in hardware for performance reasons. So likely no.
That’s probably not their main focus with that thing
Can you tell me more about reencoding to save space?
I don’t know anything about this new one they’re talking about, but here is a comparison between h.264 (the current industry standard) and h.265.
https://www.epiphan.com/blog/h264-vs-h265/
The short version: basically the same quality at half the file size, but it takes much longer to encode.
I wouldn’t call h.264 the current industry standard. It’s the lowest common denominator, since more or less every device that’s capable of streaming video can decode h.264. However, h.265 is pretty much standard for resolutions above 1080p. AV1 is nowhere near standard yet, though.
There is a trade-off between just getting more storage and reencoding. I enjoy seeing the results of the re-encodes, but it’s more cost effective to just get a larger hard drive.
If you reencode to a more efficient codec, you can save ridiculous amounts of space. If you’re interested in reencoding and are willing to play with self-hosting, look into Tdarr; it’s an app that can reencode your whole library. I’ve been using it for a while after switching from my personal solution, and it’s been wonderful. I just put files into my media directories and it picks them up, reencodes them, and replaces the originals if everything checks out.
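If you’d rather see the core idea without Tdarr, here’s a minimal sketch of that pick-up/re-encode/replace loop. To be clear, this is not how Tdarr actually works internally; the library path, codec settings, and the “checks out” test are all my own placeholders, and it assumes an ffmpeg build with SVT-AV1 (libsvtav1):

```python
import subprocess
from pathlib import Path

MEDIA_DIR = Path("/media/movies")  # hypothetical library path

for src in list(MEDIA_DIR.rglob("*.mkv")):
    if src.name.endswith(".av1.mkv"):
        continue  # skip leftover temp files from earlier runs
    tmp = src.with_name(src.stem + ".av1.mkv")
    # Re-encode the video track to AV1 (SVT-AV1), copy audio untouched.
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-c:v", "libsvtav1", "-crf", "35", "-preset", "6",
         "-c:a", "copy", str(tmp)],
        check=True,
    )
    # Crude "checks out" test: only swap the new file in if it's
    # non-empty and actually smaller than the original.
    if 0 < tmp.stat().st_size < src.stat().st_size:
        tmp.replace(src)  # atomically replace the original
    else:
        tmp.unlink()      # keep the original, drop the attempt
```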
I wonder if it’s possible to re-encode from H.265/HEVC to AV1
It’s always possible to re-encode video; it’s usually called transcoding. However, you lose a bit of quality every time you encode, so you might not gain much in the end. You can offset a bit of the quality loss by encoding at a higher bitrate/quality factor/etc. than you otherwise would, but that of course takes up extra space.
You can play around in HandBrake with AV1 encoding to see how it goes. I think I set the quality to 36 or something.
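If you want to script that instead of clicking through the GUI, a minimal sketch driving HandBrakeCLI from Python could look like this. The file names are placeholders, and 36 is just the quality value mentioned above, so treat it as a starting point:

```python
import subprocess

# Single-file HandBrake run via HandBrakeCLI: -e picks the encoder,
# -q sets constant quality (lower = higher quality, bigger file).
subprocess.run(
    ["HandBrakeCLI", "-i", "input.mkv", "-o", "output.mkv",
     "-e", "svt_av1", "-q", "36"],
    check=True,
)
```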
Thanks, I will give it a shot and see how it goes. The biggest thing holding me back is older hardware, like the Nvidia Shield for example, not supporting AV1.
Ahh, yeah, that could be an issue. It takes my laptop like 11 hours to encode one of the Lord of the Rings Blu-rays. I also change the audio to eAC3 while I’m in there for better client support.
It’s h.264, just FYI.