From cgwiki

Editing materials in Unreal involves recompiling/saving each time, but it's considerably longer and more irritating than a quick recompile of a vop network. There's something similar to promoted parameters to avoid compilation (called 'parameters'), but to use them you need to create instances. The Houdini analogy would be promoting parameters on a vop network, but then to use them you'd have to create an HDA out of your vopnet, create a new instance, and you could only use those promoted parms on the new instances (and only within the vop network editor, not 'on the front' of it).

Scripting via UnrealJS

There's no vop/vex analogy, so native Unreal means either blueprint (vops) or C++; there's no middle ground built in. Epic added hooks to allow for scripting engines a while ago, and provided a Lua example which has since lapsed. SkookumScript looks pretty good, and I think I read somewhere that Epic plan to make their own scripting engine at some point. In the meantime someone has bolted Chrome's fast V8 javascript engine in, and it's very promising. For all the hate and heat javascript gets, it's close enough syntax-wise to vex to be non-threatening, and the browser wars mean that V8 is very fast. There are two YouTube videos explaining how to get the basics going, and an interesting general one on how a talented javascript developer who's done a lot of the Google Chrome experiments has fallen into UnrealJS and is doing cool things.

It's now available as a plugin directly in the Unreal asset library. I've managed to make it say 'hello world' and create text that displays in the game engine, but nothing beyond that yet.
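As a rough idea of what that first script looks like, here's a minimal sketch. `console.log` goes to the Unreal output log; the `TextRenderActor`/`GWorld` names are from memory of the helloworld example bundled with the plugin, so treat them as assumptions and check against the shipped script.

```javascript
// Minimal UnrealJS-style 'hello world' sketch. Engine bindings (GWorld,
// TextRenderActor) only exist when run inside the engine via the plugin,
// so they're guarded here; names are assumptions, verify against the
// plugin's own helloworld.js.
"use strict";

function helloWorld() {
    // inside the engine, console.log writes to the Unreal output log
    console.log('hello world');

    // spawning in-world text is only possible with the plugin's bindings
    if (typeof TextRenderActor !== 'undefined') {
        let actor = new TextRenderActor(GWorld, {Z: 100});
        actor.TextRender.SetText('hello world');
    }
    return 'hello world';
}

helloWorld();
```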

Houdini to prototype blueprint

On a fast gaming laptop blueprint is still a little too slow to use interactively. The basics of blueprint, especially in terms of texture operations, map loosely onto vops. I've been doing experiments in vops to work out uv operations, then using what I've learned there to recreate the networks in blueprint. There's an irony here of using Houdini for realtime feedback on realtime shaders, because the shader editor for a realtime engine like Unreal is too slow for my impatient needs. :)

Change material parameter with keypress at runtime

More effort than expected!


  1. define keypress event
    1. go to settings -> project settings, input section
    2. new axis mapping, name it
    3. define a key with the dropdown
    4. define a key to do the reverse action if needed, set its scale to -1
  2. Make a dynamic instance material from your instanced material at runtime:
    1. Level blueprint, 'create dynamic material instance' function
    2. set dropdown to the material instance
  3. Assign that material to your object at runtime
    1. Choose object, edit blueprint, construction script
    2. use that event to trigger a 'create dynamic material instance' function
    3. drag in a variable of the static mesh, use it as the target
    4. drag out the return value to set a variable we can call from another blueprint soon
  4. link keypress event to parameter
    1. open the event graph now for the same object
    2. drag in the variable you just made
    3. create a 'set scalar parameter value', link variable to target
    4. right-click, look for the name of the keypress axis event you defined earlier (should be in the menu under inputs -> axis events)
    5. link its trigger event to the event input of the 'set scalar parameter value'
    6. manually type the parameter name into the purple field (there MUST be a way for this to introspect the names right?)
    7. set the value you want in the green field
  5. force this blueprint to be aware of player keyboard input
    1. in the same graph, link an 'event begin play' to an 'enable input' function
    2. create a 'get player controller', feed that to the 'player controller' input
  6. incrementally add to the parameter when the key is pressed
    1. insert a 'get scalar parameter value' function between the axis event and the 'set scalar parameter value' function; wire it up so it also reads the same parameter name and is linked to the same dynamic instance material
    2. create a 'float +' node to add the return value from the 'get scalar parameter value' and the axis value from the axis event
    3. send this value to the 'set scalar' function
    4. if the increments are too big, insert a 'float x' after the input axis value, and set the second term to, say, 0.001 to slow it down.
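The logic that steps 4 to 6 wire up in blueprint is simple enough to write out as plain code. This is a plain-javascript sketch of just the maths, with no engine calls; all the names are made up for illustration.

```javascript
// Sketch of the accumulate logic: the axis event fires every tick with an
// axis value (+1 or -1 while a mapped key is held, 0 otherwise), and the
// material parameter accumulates a scaled step. No engine API here.
"use strict";

const SCALE = 0.001;   // the second term of the 'float x' node from step 6.4
let param = 0.0;       // stands in for the scalar parameter on the dynamic material instance

function onAxisEvent(axisValue) {
    // 'get scalar parameter value' -> 'float x' -> 'float +' -> 'set scalar parameter value'
    param = param + axisValue * SCALE;
    return param;
}

// hold the +1 key for three ticks, then the -1 key for one tick
onAxisEvent(1); onAxisEvent(1); onAxisEvent(1);
onAxisEvent(-1);   // param ends up around 0.002
```

Without the 'get scalar parameter value' in the loop (step 6.1), each keypress would just stamp a fixed value instead of nudging the current one.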

Unreal keypress.gif

Make widgets support clicks from gear vr


UMG is the buttons/sliders/etc tool for making menus and whatnot in unreal. A widget lets you take UMG and put it in 3d space, so a button can be in the level rather than just on a start menu. Widgets assume they'll get input from a keyboard press or mouse/gamepad, so there's some minor fiddling involved to make them work on the gear vr. A widget interaction component is needed, as a child of the player (which is a playercontroller), ideally under the camera, so that you have a virtual pointer in space. Then the playercontroller needs to bind the limited events the gearvr supports (which are actually just phone/tablet screen touch events), and treat them as a left mouse button click.

Creating a widget is covered in this guide: https://docs.unrealengine.com/latest/INT/Engine/UMG/HowTo/InWorldWidgetInteraction/index.html

To make the playercontroller do its thing, look in world settings (window -> world settings, it should appear in a tab), and find the playercontroller entry that's assigned. If you have a custom one already that can be edited, great, edit it; otherwise make a new playercontroller blueprint and assign it in the world settings.

Edit the playercontroller blueprint, make sure the component tab is visible (window -> components), add a widgetinteraction component, and parent it under the camera if your player controller has one.

Edit the event graph for the playercontroller blueprint, add an 'Input Touch' event, and make it drive a 'press pointer key' and a 'release pointer key' node, with the key set to 'left mouse button'. Control-drag in the widget interaction variable, and wire that up too. To make it easier to test on the desktop, you can bind a regular keyboard key to also drive the press and release pointer key functions.
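The wiring above boils down to a tiny event bridge: touch begin/end in, left-mouse press/release out. Here's a plain-javascript sketch of that mapping with no engine API; `widgetInteraction` and its method names are stand-ins for the blueprint nodes, not real Unreal calls.

```javascript
// Sketch of what the blueprint does: forward generic touch events to the
// widget interaction component as left-mouse-button presses/releases.
// All names here are hypothetical stand-ins for the blueprint nodes.
"use strict";

function makeTouchToClickBridge(widgetInteraction) {
    return {
        onInputTouchBegin: () => widgetInteraction.pressPointerKey('LeftMouseButton'),
        onInputTouchEnd:   () => widgetInteraction.releasePointerKey('LeftMouseButton'),
    };
}

// fake component that just records calls, to show the mapping
const calls = [];
const bridge = makeTouchToClickBridge({
    pressPointerKey:   (key) => calls.push(['press', key]),
    releasePointerKey: (key) => calls.push(['release', key]),
});
bridge.onInputTouchBegin();
bridge.onInputTouchEnd();
```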

Player controller blueprint widget.jpg

Click the 'class defaults' button at the top, find the input section towards the bottom of the details view, and set "auto receive inputs" option to 'player 0', so it will listen to touch and keyboard events.

Playercontroller widget input.jpeg

Now select the widget interaction component in the top-right component view, and on its details panel set the interaction distance to what you need (I set mine silly high, 5000), and set the interaction source to 'center screen'.

Widgetinteraction settings.jpg

With all that done, you should be able to go back to the widget and its event graph blueprint, and add an 'on pressed (button 1)' event to drive behaviour, and it should all work.

Widget onpressed bp.jpg