
Use shader for YUV to RGB conversion on received frames #168

Merged

chenosaurus merged 11 commits into main from dc/yuv_to_rgb_shader on Dec 5, 2025

Conversation

@chenosaurus (Contributor)

  • Replace the slow CPU conversion step with a GPU shader

Copilot AI left a comment

Pull Request Overview

This PR introduces GPU-accelerated YUV to RGB video frame conversion for LiveKit's Unity SDK. The change replaces the previous CPU-based RGBA conversion with an optimized shader-based approach that processes YUV420 planar data directly on the GPU.

Key changes:

  • Added YUV to RGB conversion shader using BT.709 limited range color space conversion
  • Refactored VideoStream to support both GPU and CPU conversion paths with fallback mechanism
  • Changed output texture from Texture2D to RenderTexture to support GPU rendering pipeline
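For context, the CPU path this PR replaces amounts to a per-pixel loop over the three I420 planes. The sketch below is illustrative only (plain Python, not the SDK's actual C# code); the function name and plane-list arguments are made up, but the constants are the same BT.709 limited-range literals used by the PR's shader.

```python
def clamp01(x):
    """Clamp to [0, 1], mirroring the shader's saturate()."""
    return min(max(x, 0.0), 1.0)

def i420_to_rgb(y_plane, u_plane, v_plane, width, height):
    """Convert one I420 frame (flat byte lists) to a list of (r, g, b) floats."""
    rgb = []
    chroma_w = width // 2
    for row in range(height):
        for col in range(width):
            # Y is full resolution; U and V are subsampled 2x2 (W/2 x H/2 planes).
            y = y_plane[row * width + col] / 255.0
            ci = (row // 2) * chroma_w + (col // 2)
            u = u_plane[ci] / 255.0
            v = v_plane[ci] / 255.0

            # BT.709 limited range, same constants as the shader in this PR.
            c = y - 16.0 / 255.0
            d = u - 128.0 / 255.0
            e = v - 128.0 / 255.0
            luma = 1.16438356 * c
            r = luma + 1.79274107 * e
            g = luma - 0.21324861 * d - 0.53290933 * e
            b = luma + 2.11240179 * d
            rgb.append((clamp01(r), clamp01(g), clamp01(b)))
    return rgb
```

Doing this per pixel on the CPU every frame is exactly the cost the shader-based path avoids, since the GPU performs the same arithmetic in parallel per fragment.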

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 4 comments.

Runtime/Shaders/YuvToRgb.shader: New HLSL shader implementing BT.709 YUV to RGB color space conversion
Runtime/Shaders/YuvToRgb.shader.meta: Unity metadata for the shader asset
Runtime/Shaders.meta: Unity metadata for the Shaders folder
Runtime/Scripts/VideoStream.cs: Refactored to use RenderTexture with GPU shader conversion and a CPU fallback
LICENSE.txt.meta: Unity metadata for the license file


@chenosaurus chenosaurus requested a review from Copilot November 7, 2025 23:25
Copilot AI left a comment

Pull Request Overview

Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.



@ladvoc (Contributor) left a comment

On-device testing was successful; just a few minor comments.

{
    public delegate void FrameReceiveDelegate(VideoFrame frame);
-   public delegate void TextureReceiveDelegate(Texture2D tex2d);
+   public delegate void TextureReceiveDelegate(Texture tex);
Contributor

I think this constitutes a minor breaking API change—are we able to use Texture2D here? If not, we can note this in the release notes.

Contributor Author

I tried, but converting back to a Texture2D seems to affect performance. Let's make a note that this changed.

Contributor Author

On the app side, the texture is still used the same way without any modifications.

    return o;
}

float3 yuvToRgb709Limited(float y, float u, float v)
Contributor

I believe you should be able to use half; that will save bandwidth and boost performance further.

Assuming you are using the legacy rendering pipeline:

inline half3 YUV709Limited_to_RGB(half y, half u, half v)
{
    half c = y - half(16.0 / 255.0);
    half d = u - half(128.0 / 255.0);
    half e = v - half(128.0 / 255.0);

    half Y = half(1.16438356) * c;
    half3 rgb;
    rgb.r = Y + half(1.79274107) * e;
    rgb.g = Y - half(0.21324861) * d - half(0.53290933) * e;
    rgb.b = Y + half(2.11240179) * d;
    return saturate(rgb);
}

half4 frag(v2f i) : SV_Target
{
    half y = tex2D(_TexY, i.uv).r;
    half u = tex2D(_TexU, i.uv).r; // U,V textures are W/2 x H/2
    half v = tex2D(_TexV, i.uv).r;
    return half4(YUV709Limited_to_RGB(y, u, v), 1.0h);
}
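As a side note, the literals in this snippet can be derived from the standard BT.709 constants. The check below is an editor's illustrative sketch (Python, not part of the PR); the variable names are made up, but the derivation follows the usual limited-range ("video range") quantization, with luma in 16..235 and chroma in 16..240 of the 0..255 code range.

```python
# BT.709 luma weights (Kg follows from Kr + Kg + Kb = 1).
Kr, Kb = 0.2126, 0.0722
Kg = 1.0 - Kr - Kb

# Limited-range scale factors: luma spans 219 codes, chroma spans 224.
y_scale = 255.0 / 219.0
c_scale = 255.0 / 224.0

r_v = 2.0 * (1.0 - Kr) * c_scale               # Cr contribution to R
b_u = 2.0 * (1.0 - Kb) * c_scale               # Cb contribution to B
g_u = -(Kb / Kg) * 2.0 * (1.0 - Kb) * c_scale  # Cb contribution to G
g_v = -(Kr / Kg) * 2.0 * (1.0 - Kr) * c_scale  # Cr contribution to G
```

Rounding these to eight digits reproduces the five constants in the shader (1.16438356, 1.79274107, 2.11240179, -0.21324861, -0.53290933), so the snippet's math checks out against the spec.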

Contributor Author

updated

chenosaurus and others added 7 commits November 19, 2025 09:18
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@chenosaurus chenosaurus merged commit 0d755e5 into main Dec 5, 2025
17 of 21 checks passed
@chenosaurus chenosaurus deleted the dc/yuv_to_rgb_shader branch December 5, 2025 00:11
@mpqst commented Jan 14, 2026

Hey guys, I was using version 1.3.1, and after upgrading to version 1.3.3 the frames have a white-ish tint on them, which I believe was caused by this update. Would you have any suggestion on what could be causing that issue? I'll share some of the code below as well. Appreciate any help or advice you might have!

private void HandleTrackSubscribed(IRemoteTrack track, RemoteTrackPublication publication, RemoteParticipant participant)
{
   if (track is RemoteVideoTrack videoTrack)
   {
       publication.SetVideoQuality(VideoQuality.High);

       _trackSidToParticipantSid[videoTrack.Sid] = participant.Sid;
       var stream = new VideoStream(videoTrack);
       stream.TextureReceived += (tex) => {

           OnVideoTextureReceived?.Invoke(videoTrack.Sid, tex);
       };
       stream.Start();
       StartCoroutine(stream.Update());
       _activeStreams[videoTrack.Sid] = stream;
   }
}
 
public void CreateOrUpdateVideoTile(string id, Texture tex)
{
   if (string.IsNullOrEmpty(id)) return;

   if (tex != null)
   {
       tex.filterMode = FilterMode.Bilinear;
       tex.anisoLevel = 2;
   }

   if (!_videoObjects.TryGetValue(id, out GameObject cell))
   {
       if (!string.IsNullOrEmpty(_fullScreenParticipantId))
       {
           _fullScreenParticipantId = null;
           Debug.Log("LiveKitUIManager: New user joined. Resetting to Grid view.");
       }

       cell = Instantiate(videoTilePrefab, participantTileParent.transform);
       cell.name = $"Cell_{id}";

       PressableButton btn = cell.GetComponent<PressableButton>();
       if (btn != null) btn.OnClicked.AddListener(() => ToggleFullScreen(id));

       RawImage image = cell.GetComponentInChildren<RawImage>();

       if (image != null)
       {
           image.texture = tex;

           if (!string.Equals(id, "local user"))
               image.transform.localEulerAngles = new Vector3(0, 180, 180);

           AspectRatioFitter fitter = image.GetComponent<AspectRatioFitter>();
           if (fitter == null) fitter = image.gameObject.AddComponent<AspectRatioFitter>();
           fitter.aspectMode = AspectRatioFitter.AspectMode.FitInParent;
           if (tex != null) fitter.aspectRatio = (float)tex.width / tex.height;
       }

       _videoObjects[id] = cell;

       RefreshVisibility();
       RebuildVideoLayout();
   }
   else
   {
       RawImage image = cell.GetComponentInChildren<RawImage>();
        if (image != null)
        {
            image.texture = tex;
        }
   }
}


5 participants