
You need to keep your videos inside the StreamingAssets folder, located in the Assets folder of your project. Unity Android supports any movie file type supported by Android (i.e., files with the .3gp extension) using one of Android's supported compression standards. However, device vendors are keen on expanding this set, so some Android devices are able to play formats beyond those guaranteed, such as HD videos. For more information about the supported compression standards, consult the Android SDK Core Media Formats documentation.

As soon as you call Handheld.PlayFullScreenMovie, the screen will fade from your current content to the designated background color. It might take some time before the movie is ready to play; in the meantime, the player will continue displaying the background color and may also display a progress indicator to let the user know the movie is loading. When playback finishes, the screen will fade back to your content.
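As a concrete sketch of this flow, a script might look like the following. The Handheld.PlayFullScreenMovie call is the API described above; the file name myMovie.3gp and the chosen control mode are placeholder assumptions.

```csharp
using UnityEngine;

public class MoviePlayer : MonoBehaviour
{
    void Start()
    {
        // "myMovie.3gp" is a hypothetical file name, resolved relative to
        // the StreamingAssets folder on Android. The screen fades to the
        // background color (black here) while the movie loads, and fades
        // back to your content when playback finishes.
        Handheld.PlayFullScreenMovie("myMovie.3gp",
                                     Color.black,
                                     FullScreenMovieControlMode.CancelOnInput);
    }
}
```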
If you select the Main Camera, you will note that a Post Process Layer and a Post Process Volume component are already attached to it. These components allow us to make use of the post-processing stack. Assigned to the Profile field of the volume is a post-process profile, OutlinePostProfile. This will contain the data that we will use to configure our outline effect.

Note that by default, Anti-aliasing in the layer is set to No Anti-aliasing. It can be useful to keep anti-aliasing disabled when developing screen space shaders, as it allows you to see the output of the shader without any further modification applied. For the "finished" screenshots in this tutorial, and for best results, anti-aliasing is set to Subpixel Morphological Anti-aliasing (SMAA) at High quality.

Open the Outline shader in your preferred code editor. Shaders written for Unity's post-processing stack have a few differences compared to standard image effects. Although the shader code itself is the same, it is encapsulated in HLSLPROGRAM blocks instead of CGPROGRAM blocks, and some functionality, such as texture sampling, is now handled by macros. Currently, the Outline file contains a simple fragment shader (named Frag) that samples the on-screen image and returns it without modification. There is also a function named alphaBlend defined; we will use it later to blend our outlines with the on-screen image.

By integrating our shader with the post-processing stack, we gain access to its powerful built-in anti-aliasing solutions. As well, our effect is automatically compatible with other post-process effects in the stack, such as Bloom.
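For orientation, a minimal post-processing stack effect shader has roughly the shape sketched below. This is a reconstruction under stated assumptions, not the tutorial's verbatim file: the include path targets the post-processing stack v2 package, and the alphaBlend body is a guess that the helper implements a conventional "over" blend.

```hlsl
// Sketch of a minimal post-processing stack (v2) effect, as would appear
// inside an HLSLPROGRAM block. VaryingsDefault and the texture macros come
// from the stack's StdLib.hlsl include.
#include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"

TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);

// Assumed implementation: composite 'top' over 'bottom' using top's alpha.
float4 alphaBlend(float4 top, float4 bottom)
{
    float3 color = top.rgb * top.a + bottom.rgb * (1 - top.a);
    float alpha = top.a + bottom.a * (1 - top.a);
    return float4(color, alpha);
}

float4 Frag(VaryingsDefault i) : SV_Target
{
    // Sample the on-screen image and return it unmodified.
    float4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);
    return color;
}
```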

To generate outlines, we will sample adjacent pixels and compare their values; if the values are very different, we will draw an edge. Some edge detection algorithms work with grayscale images. Because we are operating on computer-rendered images and not photographs, we have better alternatives: the depth and normals buffers. We will sample pixels from the depth buffer in an X shape, roughly centred around the current pixel being rendered. Add the following code to the top of the fragment shader.

```hlsl
float halfScaleFloor = floor(_Scale * 0.5);
float halfScaleCeil = ceil(_Scale * 0.5);

float2 bottomLeftUV = i.texcoord - float2(_MainTex_TexelSize.x, _MainTex_TexelSize.y) * halfScaleFloor;
float2 topRightUV = i.texcoord + float2(_MainTex_TexelSize.x, _MainTex_TexelSize.y) * halfScaleCeil;
float2 bottomRightUV = i.texcoord + float2(_MainTex_TexelSize.x * halfScaleCeil, -_MainTex_TexelSize.y * halfScaleFloor);
float2 topLeftUV = i.texcoord + float2(-_MainTex_TexelSize.x * halfScaleFloor, _MainTex_TexelSize.y * halfScaleCeil);
```

We first calculate two values, halfScaleFloor and halfScaleCeil. These two values will alternately increment by one as _Scale increases; for example, with _Scale at 3, halfScaleFloor is 1 and halfScaleCeil is 2. By scaling our UVs this way, we are able to increment our edge width exactly one pixel at a time, achieving the maximum possible granularity, while still keeping the coordinates centred around i.texcoord.

Both the normals and depth buffers are by default sampled using point filtering. This means that you cannot sample a point "in between" pixels and retrieve a blended result. For this reason, we make sure our UVs are incremented one pixel at a time, to ensure we are always correctly sampling the buffers. (Image: a comparison between point filtering, left, and bilinear filtering, right.) Note that it is possible to sample textures with different filtering types inside a shader using inline sampler states, but for our purposes we will continue to use point filtering.

Next, _Scale will need to be added as a configurable property. Properties are created a bit differently with the post-processing stack; we will first define _Scale as a float in our shader program, as usual.
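Shader-side, that definition is a single uniform. The sketch below shows the declaration, followed by a plausible continuation inside Frag that puts the four UVs to use: sampling the depth buffer at each corner of the X and measuring the change across the two diagonals. The _CameraDepthTexture binding is Unity's standard camera depth texture name, but this continuation illustrates the described approach under those assumptions; it is not code quoted from the tutorial.

```hlsl
// The configurable outline width, in pixels.
float _Scale;

// Unity's camera depth texture, bound with the stack's sampler macro.
TEXTURE2D_SAMPLER2D(_CameraDepthTexture, sampler_CameraDepthTexture);

// --- Inside Frag, after computing the four UVs above (illustrative) ---
float depthBL = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, bottomLeftUV).r;
float depthTR = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, topRightUV).r;
float depthBR = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, bottomRightUV).r;
float depthTL = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, topLeftUV).r;

// Magnitude of the depth change across the X's two diagonals; a large
// value indicates a discontinuity, i.e. an edge.
float edgeDepth = sqrt(pow(depthTR - depthBL, 2) + pow(depthTL - depthBR, 2));
```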
