Notch Layer Overview

Notch is a generative content creation platform that integrates with Disguise software.

The Notch layer allows users to use Notch Blocks exported from Notch Builder.

Notch content creation should be approached in a similar way to rendered content: specify as much as possible in advance and test the content on a real-world system before show time to reduce the likelihood of performance-related issues.

Warning: Extremely demanding Notch blocks can cause an oversubscription of available memory resources - see the full advisory for details.

Some Generative layers take their source from other layer types - either content or generative - by use of an arrow. Linking two layers with an arrow defines the layer the arrow is drawn from as the source, and the layer it points to as the destination. If you have an arrow between a content layer and an effect layer, the content layer is said to be ‘piped in’ to the effect layer. For more information on arrows, see the compositing layers topic.

To draw an arrow between two layers, hold down ALT and left-click & drag between the source and destination layer.

On the Notch layer, you can specify which source the layer is using (either a texture or an arrowed layer) with the Video Loader parameter. The Video parameter of the layer will show a thumbnail of either the texture chosen or the content coming from the arrowed layer, depending on the selection made.

 

Notch v1.0+

In 2024, Notch released the Notch v1.0 private beta (a v1.1 beta is also available). Designer officially supports Notch v1.0+ beta blocks from Designer version r27.7, released on 12/06/2024.

Designer supports Notch’s new colour management pipeline. You can find more information about Notch colour management at the bottom of this page.

 

Warning: While Notch v1.0+ beta blocks can be loaded and run on select servers with Designer r27.7+, Notch v1.0+ is still in beta, so care should be taken when using these blocks in real show environments. Please also contact Notch support about your plans to use the beta.

 

The following limitations affect Notch v1.0 beta blocks:

  • Windows OS must be Windows 10 1607 or higher.

  • Notch v1.0 only supports Nvidia GPUs with Maxwell architecture or newer.

  • Notch v1.1 only supports Nvidia GPUs with Turing architecture or newer.

Please see the compatibility table to discover if your machine supports Notch v1.0+ blocks.

 

Notch Advice

Please note: It's always recommended to combine multiple Notch blocks into one block, as opposed to using lots of small blocks in your project. Notch Builder has functionality to combine blocks.

  • Notch renders to the size of the resolution of the mapping being used on the Notch layer within Disguise software.

  • Rotation is displayed in degrees within Notch, but shown as radians once exposed within Disguise software (see the sketch after this list).

  • The Y and Z axes are different in Notch and Disguise software, and need to be flipped/converted manually using an expression (see the sketch after this list).

  • The general consensus is that one should NOT use Universal Crossfade alongside Notch. Universal Crossfade doubles the rendering load, and depending on the effect used it is unlikely to produce the desired result.

  • Only ever have one DFX file connected to a Kinect.

  • When using a Notch block with Kinect input, in order to see this input the machine will need the Kinect SDK installed. You will also need to enable Kinect input via Devices in Notch Builder.

  • For audio-reactive content, it's worthwhile to define the audio device being used on the server in your Notch project's audio device settings (Devices > Audio Device).

  • When using Sockpuppet, please advise the Notch content creators to use a unified naming convention for all exposed parameters (i.e. FX1, FX2, Speed1, Speed2, Color1, Color2, Color3, etc.), otherwise the exposed parameters will be difficult to manage.

  • Any block that stores frames (i.e. frame delay) needs to be managed extremely carefully or it may eat up all memory resources. If VRAM resources are being eaten up inexplicably, it’s worth checking whether the Notch block is storing frames for use anywhere.

  • As Notch blocks are not user definable in terms of DMX assignment order, it is always best to pre-determine the number of attributes one wishes to have exposed in the Master block.
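
As a rough illustration of the unit and axis conversions mentioned in the list above, here is a minimal Python sketch (plain Python, not the Disguise expression language). The axis mapping shown is an assumption for illustration only; verify the correct mapping against how your own block was built.

```python
import math

def degrees_to_radians(deg):
    # Notch Builder displays rotation in degrees; the same parameter
    # appears in radians once exposed inside Disguise software.
    return deg * math.pi / 180.0

def notch_position_from_disguise(x, y, z):
    # Hypothetical axis mapping: Notch and Disguise software disagree
    # on the Y and Z axes, so one of them must be flipped/swapped.
    # The exact mapping depends on how the block was built, so verify
    # it against your own scene.
    return (x, z, y)

print(degrees_to_radians(90))                       # 1.5707963... (pi / 2)
print(notch_position_from_disguise(1.0, 2.0, 3.0))  # (1.0, 3.0, 2.0)
```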

Workflows

The layer is specifically for playing back Notch blocks. Depending on your application and production needs, there are a number of workflows you can employ in order to integrate Notch effects into Disguise software. Below are a few recommended workflows for their respective applications. Bear in mind that these are stripped to the bare minimum elements for simplicity, and users will need a valid Notch Builder licence with export capabilities in order to follow along.

For more information on Notch Builder, see here.

 

IMAG effects

These are probably the most straightforward effects to implement on a show, as blocks are usually designed to be plug and play, with a video source and exposed parameters to control and interact with the effect. The workflow is outlined below.

 

In Notch
  1. Create a Notch effect with a node that accepts a video (usually Video Source).

  2. Expose the Video Source property and compile the block.

 

In Disguise
  1. Create a Notch layer, load the IMAG Notch block into it.

  2. Create a Video layer (or any layer you wish to output content from, i.e. generative layers).

  3. Set Video Source to Layer.

  4. Move the Notch layer above the other layer in the stack, then arrow the video layer to it.

You should now be able to see the IMAG effect applied to any content from the layer below. You can change the effects being applied under the parameter group’s Notch Layer parameter, if the effect is set up to use layers as individual effects.

 

Notes
  • The Disguise software does not yet support Video selection without layer arrowing. The Video parameter under Video Sources is mostly unused, though it can still be used for placeholder images. The images displayed there are taken from the DxTexture folder.

  • If multiple Notch layers are used and you wish the arrowed video to be the same for all of them, you will need to set up the exposed parameter’s Unique Identifier in Notch to be the same for all exposed video sources.

  • If multiple layers are arrowed into an effect that accepts multiple sources, the source layer is chosen in order of selection (i.e. the first layer selected will be the first Video source, the second will be the second, etc.) regardless of the order of the parameters themselves within the list.

3D virtual lighting simulations

These workflows often involve a 3D mapped object with projectors simulating light sources moving and affecting the object in real time. These effects rely on the virtual 3D space matching the real space and object, along with the coordinate systems of Disguise software and Notch.

 

In Notch
  1. Create a 3D Object node (or a Shape 3D node, etc.) and add it to the scene.

  2. Add a light source.

  3. Create a UV camera node to output the lit textures onto the object’s UVs.

  4. Expose the appropriate parameters (light positions, object positions, etc).

  5. Compile and export the block.

In Disguise
  1. Create a surface with the same object used in Notch.

  2. Calibrate the projectors to the surface with your preferred calibration method.

  3. Create a Notch layer.

  4. Apply the Notch layer via Direct mapping to the object.

  5. Move the lights around to see the object UVs being affected.

Notes

Warning: While it’s often advised to enable Deferred Rendering in Notch, it might negatively impact performance depending on the complexity of the scene. Use this functionality cautiously.

 

  • For the Notch scene to match the scene within Disguise software, accurate measurements need to be taken on stage and an origin point reference needs to be determined from the start; setting the origin point early in the process will make the line-up process easier.

  • Lights can be linked to MIDI, OSC or DMX controls like every parameter in Disguise software, or can be keyframed and sequenced on the timeline.

  • The mapped object’s movement can be linked to automation or tracking systems, and the positional data can then be used to drive the exposed position and rotation parameters.

  • If multiple objects are in a scene, you will need to create a larger UV layout that accommodates each object in a separate UV area, and then match the overall lightmap resolution by setting the surface resolution within Disguise software. You can use the UV Output section of the 3D Objects node in Notch to determine where in UV space a specific mesh will be output within the overall canvas (see the worked example below).
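
As a rough illustration of matching surface resolution to a shared UV layout, here is a minimal Python sketch. The object names, UV regions, and per-object pixel targets are hypothetical; it simply shows the arithmetic of sizing the overall canvas so each UV region keeps its intended lightmap detail.

```python
# Hypothetical example: two objects sharing one UV canvas. Each object
# occupies half the UV space horizontally and should receive roughly
# 1024 x 1024 px of lightmap detail, so the overall canvas (and the
# surface resolution set in Disguise software) must be 2048 x 1024.

objects = {
    "object_a": {"uv_region": (0.0, 0.0, 0.5, 1.0), "target_px": (1024, 1024)},
    "object_b": {"uv_region": (0.5, 0.0, 1.0, 1.0), "target_px": (1024, 1024)},
}

def required_canvas(objects):
    # Each UV region is (u0, v0, u1, v1) in 0..1 space. The canvas must
    # be large enough that every region still covers its pixel target.
    width = max(o["target_px"][0] / (o["uv_region"][2] - o["uv_region"][0])
                for o in objects.values())
    height = max(o["target_px"][1] / (o["uv_region"][3] - o["uv_region"][1])
                 for o in objects.values())
    return int(width), int(height)

print(required_canvas(objects))  # (2048, 1024)
```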

Particle systems and tracking regions

A very common application of Notch is to use it alongside tracking systems such as BlackTrax in order to generate particles from specific points in space, to be either projected on a surface or displayed on an LED screen. Below is a broad outline of the workflow using an LED screen, the BlackTrax system, and region camera to specify the tracking regions:

In Notch
  1. Create a particle system (Minimum required: Particle Root, Emitter, Renderer).

  2. Create a Region Camera.

  3. Expose the position parameters of the Particle Emitter, as well as the Region Camera’s Top Left X and Y, and Bottom Right X and Y.

  4. Compile the block and export to Disguise.

In Disguise
  1. Create an LED screen. This can be placed anywhere in the virtual stage, though it is recommended to place the LED in the correct position to match the physical space.

  2. Ensure the BlackTrax system is connected and tracking data is being received from the beacons, then select a stringer to use as a tracked point for the Notch particles.

  3. Create a Notch layer and load the exported block into it.

  4. Set the Play Mode to Free-run, or press play on the timeline in order for the particles to begin spawning. Particles are a simulation, and only spawn over a span of time.

  5. Right-click on the BT point being tracked to open a widget that displays the point’s current coordinates in 3D space.

  6. With the Notch layer open and the Particle Emitter position parameters visible, navigate to the tracked point’s widget, hover over one of the position values, then Alt+left click and drag an arrow from there to the corresponding position parameter in the Notch block.

  7. You should now see the particle effect either disappearing off-screen (if the beacon is not in range of the LED screen), or moving towards it.

  8. If the world coordinates of Notch and Disguise do not match, and the particle effect is limited to a particular screen or mapped area, a Region Camera can be used to mark the boundaries of the tracking region instead.

  9. To set the region camera, simply measure the XY position of the top-left corner of the LED screen, and do the same for the bottom-right, and enter these values in the exposed Region Camera parameters.
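
For clarity, here is a minimal Python sketch (not the Disguise API) of the arithmetic behind step 9: taking two measured or tracked corner points and picking the right axes for the exposed Region Camera values. The function name and orientation handling are hypothetical conveniences.

```python
def region_camera_values(top_left, bottom_right, orientation="vertical"):
    # top_left / bottom_right are (x, y, z) tuples read from a tracked
    # point's widget or measured on stage. Vertical surfaces use X and Y;
    # horizontal surfaces (floor effects) use X and Z - see the notes
    # below on choosing axes.
    a, b = (0, 1) if orientation == "vertical" else (0, 2)
    return {
        "Top Left X": top_left[a], "Top Left Y": top_left[b],
        "Bottom Right X": bottom_right[a], "Bottom Right Y": bottom_right[b],
    }

# Example: LED wall corners measured in stage coordinates (metres).
print(region_camera_values((-2.0, 3.0, 0.5), (2.0, 1.0, 0.5)))
```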

Notes
  • An often quicker method of finding the region camera values is to place a tracked BT point at the top left of the screen, and then the bottom right, and manually enter the XYZ coordinates displayed on the tracked point’s widget, as they are sure to match precisely.

  • A common mistake that leads to oddities with tracking regions is using the wrong axes. As a rule of thumb, for vertical surfaces you’ll need to take the X and Y position of the top left and bottom right corners, whereas for horizontal surfaces (i.e. for effects built to be displayed on the ground) you will need X and Z. This is also dependent on how the Notch block itself was built and the orientation chosen in the region camera node, so it’s important to double-check these details beforehand.

  • When using the region camera, particle size plays a fairly important role. It may be advisable to expose emitter size, particle size, and camera distance in order to achieve the desired result.

  • If multiple machines are outputting the same set of particles but are seeing different results, it is because each machine’s Particle Root is running a separate instance of the simulation. You can fix this by setting the Particle Root node for that emitter to Deterministic by ticking the box in the node editor in Notch Builder.

Notch Layer Properties

Notch layers consist of a set of default properties (detailed here) and additional properties that appear depending on what is in the Notch block. For an explanation of properties beyond the defaults, please refer to your Notch content creator.

 

Effect

The Effect parameter defines which Notch DFX File the layer is looking at.

 

Blend Mode

Blend Mode controls how the output of the layer is composited with the layers below.

 

Brightness

This property (which appears as a light bulb icon) controls the brightness of the layer output.

 

Mapping

The mapping property controls how the layer output is mapped onto the screen(s) in the Stage level.

Mapping parameter

 

For information on mapping, including how to use the different mapping types offered by Disguise software, please see the chapter Content Mapping.

 

Processing Size

There are two options:

Output size - the resolution of the screen the effect is mapped to (not the mapping itself).

Input size - the resolution of the effect as set in Notch Builder.

 

Dry-Wet Blend

The global intensity level for the effect, on a scale of 0-255.

Mode

 

This specifies the playback mode.

There are four modes; each one has a specific behaviour that is useful for a different situation.

  • Locked

    If the play cursor continues to play or stops at the end of a section, the video frame number locks to the timeline. When the play cursor holds at the end of a section, the video will play continuously.

  • FreeRun

    If the play cursor continues to play or stops at the end of a section, the video will play continuously. Jumping around the timeline while the cursor is playing or has stopped does not affect which frame is being played.

  • Normal

    When the play cursor stops, the video will also stop and the frame number will lock to the timeline position. When the cursor continues to play or holds at the end of a section, the video will play continuously. Jumping around the timeline while playing does not affect which frame is being played.

  • Paused

    The block will pause on the frame at which the Paused playmode was selected. Moving the play cursor around the timeline in any way will not affect which frame is held. Leaving the layer and re-entering the layer will also not affect which frame is held.

     

Notch Colour Management

This section outlines the colour management tools and conventions in Notch.

Before Notch v1.0

Before Notch v1.0, all blocks expect inputs and outputs in sRGB Gamma 2.2 (x^2.2) colour space. This cannot be changed.

Notch v1.0+

Notch v1.0 introduces colour management options. You can now set the input/output format of a Notch block to one of the following settings before exporting the block:

  • sRGB Linear

  • sRGB ‘Gamma 2.2’ (Note: this actually uses the standard sRGB curve, not a pure gamma 2.2 as the name might suggest)

  • ACEScg

Designer will automatically detect the input and output transforms that the Notch block expects, and will convert between Notch and Designer’s working space when passing textures in and out of the block.

For more information about colour management in Designer, see our colour transforms page.

For more information about colour management in Notch, see Notch’s v1.0 manual here.
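
To illustrate the note above about the sRGB ‘Gamma 2.2’ setting, here is a small Python sketch comparing the standard piecewise sRGB transfer function with a pure x^2.2 power law; the two curves are similar but not identical, which is why the distinction matters when matching content.

```python
def srgb_to_linear(v):
    # Standard piecewise sRGB electro-optical transfer function.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    # A pure power-law curve, as the setting's name might suggest.
    return v ** 2.2

for v in (0.05, 0.2, 0.5, 0.8):
    print(f"{v:.2f}: sRGB={srgb_to_linear(v):.4f}  gamma 2.2={gamma22_to_linear(v):.4f}")
```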

Please note: The Notch v1.0 manual is currently password protected while Notch v1.0+ is in beta, and is subject to rapid iteration and change.