0. Who this is for

These guidelines are for users of the App SDK who want to use it to create a scene for rendering with V-Ray from the data in their host application scene. We will not discuss details on how to write code with the App SDK here - there are separate docs and examples for this included in the SDK package. Rather we will cover what V-Ray plugins to add to the scene, how to link them and what parameters to set.
Note: This document will be updated and extended over time.

1. Introduction

1.1. A note on terminology

1.2. V-Ray scene contents

We could informally define three kinds of plugins that build up a V-Ray scene.
The first kind is the so-called "top-level" plugins, which can exist on their own without being part of a plugin tree. Lights, for example, are top-level plugins. They can of course receive input from other plugins, but they do not output values to other plugins.
Plugins which are not top-level serve as input for parameter slots that require a plugin of a certain type. For example, a material plugin may have an optional texture input and you'd reference a texture plugin there. The texture plugin may in turn receive input from a UVW generator, and so on.
The third kind is a special type of top-level plugin that has only one instance (a singleton). These are basically settings plugins. Most of them have "Settings" in their name, but a few don't. The V-Ray camera is also defined by such a singleton settings plugin.

There is an example tiny scene in section 7. Its contents should become understandable by reading the following sections.

1.3. Parameter types

The following are the types recognized in a V-Ray scene (think .vrscene file). They have corresponding types in the different AppSDK language bindings. The SDK uses the respective basic language types wherever possible and defines custom type classes for the rest.

Parameter polymorphism is an important feature of V-Ray. Texture parameters accept simple (basic) values, so instead of creating an additional texture plugin which generates a single color, you can just set a Color value into the texture slot. The same goes for float textures and single float values, etc. You can also set the value of a texture parameter to the output parameter of another plugin, as described above.
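
A rough illustration in Python (a sketch, assuming the renderer.classes plugin factory and the vray.AColor helper type used in the SDK examples; TexChecker is just an arbitrary example of a texture plugin):

    import vray

    renderer = vray.VRayRenderer()
    brdf = renderer.classes.BRDFVRayMtl()

    # A plain color value can be assigned directly to the texturable diffuse slot...
    brdf.diffuse = vray.AColor(0.7, 0.1, 0.1, 1.0)

    # ...or a texture plugin can be plugged into the very same slot instead.
    checker = renderer.classes.TexChecker()
    brdf.diffuse = checker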

1.4. V-Ray scene file format

V-Ray uses a text-based scene file format (.vrscene). It is quite simple, which makes it convenient to debug and modify by hand. The format is case-sensitive and somewhat similar to JSON. Large data values, such as geometry definitions, can be compressed (and text-encoded). The file contains consecutive plugin instance definitions: type, instance name, parameter list. Nesting is not supported.
The main rules are:

Scene files are supposed to be created by V-Ray. You shouldn't try to write them yourself, but you are of course free to modify them by hand.

It is important to note that the order of definition of plugin instances generally does not matter. This also applies to a scene created in memory, even when you don't export it to a file. There are a few exceptions such as camera related settings plugins, which may not work correctly if V-Ray doesn't process them in a particular order.

1.5. Default values

Every parameter has a default value, so even after you create an "empty" plugin instance it is actually fully functional. Of course it might need data, as for example with geometry plugins, but a light will work right away (placed at the origin). That being said, some plugins have inconvenient default values which almost always need to be changed (for example some settings plugins, such as the image sampler settings, or some shadow parameters). We usually can't fix the defaults, because that would break existing user scenes. Nevertheless, unless you know what you're doing, it is recommended to stick to the default values. You are free to experiment, of course, but don't use values you don't understand, as they may have performance or quality implications or may even break physical plausibility.
Note that when you export (save) a vrscene file it will always contain a bunch of settings plugins, even if you didn't create them. They will have default parameter values. This is how V-Ray always saves files.

1.6. Debugging and help

Apart from documentation included with the App SDK and this guide, the help pages for 3dsMax and Maya on docs.chaos.com are a good source of parameter information and examples, although they use the user-friendly UI names for things and not the actual scene parameter names.
A very useful tool for basic parameter information is plgparams.exe included in the binary folder of the SDK. It lists all parameters for the specified plugin (or all plugins with -list) and their types, default values and text comments. Similar information can be obtained using the ListAllPluginsAndProperties example in the C++ folder (or equivalent code for another language).
It is often useful to save your scene out to a file to check whether you did everything properly. For example, you may have failed to set some parameter correctly and you will see this in the file as a missing or incorrect value, although you can also check the result of the set operation in your code. You can try to pinpoint problems by deleting parts of the scene (parameters or whole plugins) and re-rendering.
It can be very helpful to have V-Ray for 3dsMax or Maya and use it to export vrscene files to see what plugins and parameters are written out. The exporters for 3dsMax and Maya can be considered "ground truth" (even though they may have an occasional bug or missing feature).
If you're getting a black render, make sure your camera is positioned and oriented properly and is not inside an object. Keep in mind the default up-axis is Z, but it can be set to something else, usually Y. You might also get invisible or black objects if something is wrong with the attached material. In this case you can still see the object in the alpha channel, especially if there is nothing behind it.
Another thing to watch out for is V-Ray's errors and warnings, so always implement the DumpMessage callback.

Information on all V-Ray plugins is available in the C++ header file vrayplugins.hpp and, for C#, in the VRayPlugins.cs file.

If renderer is a VRayRenderer instance in Python or Node.js, information about a plugin can be obtained by calling:

  • renderer.getDescription() - shows the plugin description
  • renderer.getMeta() - shows info about all plugin properties/parameters
  • renderer.getCategories() - gets the category to which the plugin belongs

1.7. "subdivs" parameters

You will see subdivs (subdivisions) parameters on many light and BRDF plugins. They control the number of new rays spawned for calculating glossy and diffuse effects on materials or the number of light and shadow evaluations and so on. The number of actual rays is proportional to the square of the parameter value. These can be used to increase or decrease sampling for the respective light or material, but we highly recommend leaving these at default values. We also recommend disabling local subdivs values altogether - see the DMC sampler section for details. Some of the settings plugins also have subdivs parameters which are ok to change, like the Irradiance Map and Light Cache for example.

Note: The reason the renderer uses the square of the parameter value is a property of the Monte Carlo integration method: to reduce the noise by half you need four times as many samples (the variance is quartered), and to get 1/10 of the noise you need 100 times as many samples. Squaring the subdivs value means that a linear increase in the parameter produces a roughly linear reduction in perceived noise.
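
A quick worked example of the square law (illustrative arithmetic only):

    subdivs = 8
    samples = subdivs * subdivs            # 64 samples
    # Noise is proportional to 1/sqrt(samples), so halving the noise needs
    # 4x the samples, i.e. doubling the subdivs value:
    samples_for_half_noise = 4 * samples   # 256 samples, i.e. subdivs = 16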

2. Defining camera position

One of the first things you'd want to do is control your camera. This is done through the RenderView plugin. You will always want to create and set up exactly one instance of this plugin in your scene. (The exception is when you are baking textures - then you'd use BakeView instead.)

The main parameters are:

For advanced camera effects, such as DoF, exposure, distortion, vignetting, etc. you will need to enable the physical camera in addition to RenderView. See the Physical camera subsection in the settings section below for details. You can also use SettingsCameraDof instead, if you only need depth of field (see 6.11. Miscellaneous).
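
A minimal camera setup sketch in Python, assuming the renderer.classes factory and the vray.Transform/vray.Matrix/vray.Vector helper types from the SDK examples; the fov parameter (in radians) and the transform values are illustrative assumptions:

    import math
    import vray

    renderer = vray.VRayRenderer()

    view = renderer.classes.RenderView()
    # Rotation matrix (camera orientation) and position in world space.
    view.transform = vray.Transform(
        vray.Matrix(vray.Vector(1, 0, 0),
                    vray.Vector(0, 1, 0),
                    vray.Vector(0, 0, 1)),
        vray.Vector(0.0, -10.0, 2.0))
    view.fov = math.radians(45)   # horizontal field of view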

3. Creating lights

Good lighting is the most important thing for getting good photorealistic renders. It also affects image noise and performance. Some lights (generally the simpler ones, especially the first four in the list) render faster and with less noise than others. Things that may affect the noise and performance of a light are: having a texture instead of a flat color; having a complex mesh; being an area light with a small relative size.

Listed below are V-Ray's light plugins, ordered roughly by increasing photorealism. Common parameters are at the end and some specific parameters are described in the Reference at the end of this document. There are even more parameters that we will not mention here.

Some of these plugins have versions from 3dsMax with additional parameters, such as LightOmniMax, LightSpotMax etc.

Common light parameters:

Other parameters common to many, but not all of the lights (check respective parameter lists):

The rectangle, mesh and dome lights can be textured. Keep in mind that the texture files are re-sampled at a resolution controlled by a tex_resolution parameter which has a default value of 512 for dome and rectangle and 256 for mesh lights. So if the light texture looks too pixelated, especially on the dome environment, try increasing this resolution. This will of course increase memory usage and render time slightly.
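
A sketch of a textured dome light in Python; the dome_tex, use_dome_tex and tex_resolution parameter names are assumptions to be verified with plgparams:

    # Environment file loaded through BitmapBuffer + TexBitmap (see section 5.2).
    bitmap = renderer.classes.BitmapBuffer()
    bitmap.file = "environment.hdr"
    tex = renderer.classes.TexBitmap()
    tex.bitmap = bitmap

    dome = renderer.classes.LightDome()
    dome.use_dome_tex = True
    dome.dome_tex = tex
    dome.tex_resolution = 2048   # raise from the default 512 if the environment looks pixelated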

4. Creating geometry

4.1. The Node plugin

Before we get to defining your actual geometry data, there is a top-level plugin for non-light-emitting objects called Node. It links the geometry data (geometry parameter) with a material plugin (material parameter) and positions it in the world (transform parameter). You could reference the same geometry in different nodes with different positions.
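
A sketch of the Node linkage in Python (geometry and material creation are covered in the following sections; the renderer.classes factory is assumed):

    node = renderer.classes.Node()
    node.geometry = mesh        # a geometry source plugin, e.g. GeomStaticMesh
    node.material = material    # a material plugin, e.g. MtlSingleBRDF
    # Place the object in the world; another Node could reference the same
    # geometry with a different transform.
    node.transform = vray.Transform(
        vray.Matrix(vray.Vector(1, 0, 0),
                    vray.Vector(0, 1, 0),
                    vray.Vector(0, 0, 1)),
        vray.Vector(0, 0, 0))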

4.2. Geometry sources

Let's look at the main geometry source plugins (we will ignore some others):

4.3. Instancing

The Instancer and Instancer2 plugins can be used to efficiently instantiate a large number of objects in the scene from other geometry source plugins. They are often used for particles and vegetation.

Instancer2 is itself a geometry source and can be connected to a Node plugin:


5. Creating materials

Exporting materials is probably the most complicated part, because it may involve complex shader networks, especially in apps with high artistic control, such as the popular DCC tools from Autodesk. Nevertheless, you can get good results even by using only a few plugins.

There are two types of plugins involved - material (names start with "Mtl") and BRDF (names start with "BRDF") plugins. BRDFs describe what happens with light when it interacts with the surface, like refraction, diffuse scattering, etc. BRDFs get plugged into material plugins, which may extend their behavior or combine them. Finally the material plugin gets plugged into a Node that links it to a geometric object in the scene.

You should always connect a MtlSingleBRDF instance to the Node::material slot. Despite its name, the MtlSingleBRDF::brdf parameter accepts other Mtl plugins as well, so there is no problem with using any material before finally adding MtlSingleBRDF (see the note at the end of this section).
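
In Python the typical chain would look roughly like this (a sketch using the renderer.classes factory):

    brdf = renderer.classes.BRDFVRayMtl()
    mtl = renderer.classes.MtlSingleBRDF()
    mtl.brdf = brdf        # a BRDF (or another Mtl) plugged into MtlSingleBRDF
    node.material = mtl    # MtlSingleBRDF plugged into the Node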

The workhorse of your material setups should be BRDFVRayMtl. It combines three types of effects in one plugin - diffuse, reflection and refraction (we have them in separate plugins as BRDFDiffuse, BRDFMirror and BRDFGlass respectively). You can use only one of the layers, though, by setting the colors of the others to black. The plugin splits the incoming light energy according to the colors for the three layers. For example, by default the reflection and refraction have black color, so these layers are skipped. If you set a white reflection color, you will get a mirror. If you set a white refraction color, you will get glass. The layers are evaluated in this order: reflection, refraction, diffuse. The amount of energy that passes to the next layer is the complementary color to the color of the layer. If you have a (0.2, 0.2, 0.2) reflection color, 20% of the light energy will go into the specular reflection and 80% will pass on to the refraction layer. If your refraction layer has color (1.0, 1.0, 1.0), all of the remaining energy will continue as refracted light, so you will get the effect of a glass material (a little reflection and full refraction of the rest). If the refraction color is (0.5, 0.5, 0.5) instead, 50% of the remaining energy will refract and 50% will scatter off from the diffuse layer. If the diffuse color is white, all of that remaining energy is scattered around the scene, and if it's black all of it is absorbed (which would heat the object in the real world). So let's summarize what happens with the incoming energy with a few examples:

The absorption of energy in the diffuse layer matters for GI (global illumination). Having a closed room with pure white diffuse walls is like having a room of mirrors. V-Ray will bounce light around until it reaches its maximum GI depth and this may slow rendering down. If the walls were 88% white instead, only about 8% of the original energy would remain after 20 bounces (0.88^20 ≈ 0.08). At some point V-Ray will decide that the ray is too weak and insignificant and will stop tracing it. V-Ray makes a lot of decisions based on light intensity, so this matters. Samples carrying intense light are considered more important than weak ones.

An important aspect of BRDFVRayMtl is the Fresnel reflection parameter (which is off by default when you create an instance, while it is on by default in V-Ray for 3dsMax). When it is enabled, the reflection amount depends on the angle of incidence: grazing angles produce more reflection than looking head-on at the surface. This changes the conclusions we made above about how energy is distributed between the layers of the material. Even with 100% reflection (white color) some of the energy will go through to the refraction and diffuse layers. Fresnel reflection is a must for physically plausible materials. For Fresnel you need to set a correct index of refraction (IOR), even if the material doesn't refract any light. For example, metals have very high IORs.
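
As an illustration of the layer colors and Fresnel option discussed above, a glass-like BRDFVRayMtl could be set up as follows (a sketch; the fresnel, fresnel_ior, refract_ior, reflect and refract parameter names are assumptions to verify with plgparams):

    glass = renderer.classes.BRDFVRayMtl()
    glass.diffuse = vray.AColor(0, 0, 0, 1)    # no diffuse scattering
    glass.reflect = vray.AColor(1, 1, 1, 1)    # full reflection layer...
    glass.fresnel = True                       # ...attenuated by the Fresnel term
    glass.fresnel_ior = 1.52                   # typical glass IOR
    glass.refract = vray.AColor(1, 1, 1, 1)    # the remaining energy is refracted
    glass.refract_ior = 1.52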

BRDFVRayMtl also has a translucency option, but you should use BRDFSSS2 instead for subsurface scattering.

Important note on opacity: Many materials, including BRDFVRayMtl, have a separate opacity parameter slot (a value also known as "alpha", the inverse of transparency). Even if, for example, the texture in the diffuse slot has alpha values below 1.0, they will have no effect. You need to connect something like TexBitmap::out_alpha to the opacity slot separately. Also keep in mind that a material may look transparent even without any opacity set, because of its refraction layer.

Note: There are some material and BRDF plugins that still exist for compatibility reasons, but we don't recommend you use them. For example instead of connecting MtlDiffuse to a Node, you should create BRDFDiffuse -> MtlSingleBRDF::brdf -> Node::material. Plugins with names ending with "_ASGVIS" can also be ignored. We don't have an exhaustive list of deprecated plugins at the moment.

5.1. Material and BRDF plugins

See this page for details on GLSL and this page for details on OSL.

Some advanced materials:

Below is a list of BRDF plugins for the materials. The difference between the Blinn, Phong, Ward, Cook-Torrance and GGX models is in how the specular highlight behaves and what parameters you have to control it. The respective BRDF plugins combine a diffuse and a glossy reflection component, while BRDFVRayMtl provides more options, like a Fresnel term, refraction, etc.

Some BRDFs simulating specific real world materials:

And bump maps:

5.2. Textures and UVW generators

The BRDFs (and some lights) have many Color, AColor and Float parameter slots that can be textured. You will usually just load texture files, but V-Ray also offers many procedural textures, as well as texture plugins that serve as modifiers (e.g. apply some simple function, combine textures, extract data from one of the channels, etc.). There are over 200 texture plugins, so we will only look at the most important ones. Many of the plugins are made for specific host applications. Apart from the texture data itself, V-Ray uses UVW generator plugins to generate UVW mapping coordinates or modify the original data from the geometry source. Using a UVWGen plugin is not obligatory.
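
A sketch of a file texture with an explicit mapping channel in Python; the file, bitmap, uvwgen and uvw_channel parameter names are assumptions to be checked with plgparams (see the plugin lists below):

    bitmap = renderer.classes.BitmapBuffer()
    bitmap.file = "diffuse.png"

    uvw = renderer.classes.UVWGenChannel()
    uvw.uvw_channel = 0        # use the first mapping channel from the geometry

    tex = renderer.classes.TexBitmap()
    tex.bitmap = bitmap
    tex.uvwgen = uvw           # optional; without it the default mapping is used

    brdf.diffuse = tex         # plug the texture into a material slot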

For texture files:

Textures that modify and combine colors:

Some procedural textures:

Finally, the UVW generators:

6. Scene and render settings

A general note on settings plugins: when you create a new Renderer object there are no instances of them, so you will need to create them before changing parameters. If you start rendering, the AppSDK will create a SettingsOutput, and if the render mode is RT it will also create a SettingsRTEngine. On the other hand, if you're loading a scene from a file, it will already have instances of most (but not all) settings plugins and you need to use them. This is because every time V-Ray exports a vrscene file it automatically writes out the settings plugins, even if they are at default values.

Starting with version 1.09 (nightly builds after June 14th, 2016), the VRayRenderer class includes the method setImprovedDefaults(). It creates (if necessary) some settings plugin instances and sets several parameters to values different from the plugin defaults. These values match the defaults used in our main products, such as V-Ray for 3dsMax and Maya. The plugin defaults couldn't be changed for compatibility reasons (to keep old scenes working as they were), but the UI default values have changed over time. Many of the parameters and respective values set by setImprovedDefaults() are mentioned in the paragraphs below. These are just good initial values; feel free to use them or not, or to overwrite some of them, for example reducing quality to get faster renders.
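
In Python this is a single call on the renderer object (sketch):

    renderer = vray.VRayRenderer()
    renderer.setImprovedDefaults()   # creates/adjusts settings plugins to the recommended values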

6.1. Image and region size

These are controlled from the SettingsOutput plugin, but it is one of the few exceptions where you should not touch the plugin directly. The AppSDK has APIs for setting image and region size (i.e. renderer.setRenderRegion, depends on language).

6.2. Image sampling and filtering

V-Ray has several types of image samplers. These are algorithms that determine how many samples to take for each pixel. This is combined with an image filter, which can soften or sharpen the final image.

The image sampler is controlled from SettingsImageSampler. These are the four types for its type parameter:

The progressive sampler produces whole images iteratively, refining them with each pass. The other samplers work on small "buckets" and only return an image when it is complete.

For details on the adaptive and progressive sampler see this page. You can also see what our CTO has to say about sampling: https://www.youtube.com/watch?v=tKaKvWqTFlw.

Some of the default values of the SettingsImageSampler plugin (whether you create it explicitly or not) are not optimal. They are currently kept for compatibility reasons. Here are some guidelines for changing them:

Most filters have just one size parameter for the kernel radius. Catmull-Rom has no parameters. The available filter plugins are:

To apply a filter, just create an instance of one of those plugins. You can use only one at a time.

See 3dsMax docs or Maya docs for more info on filters.

Note: Do not create a SettingsImageFilter. It is deprecated. Use the plugins described above.

6.3. DMC sampler

For details on the Deterministic Monte Carlo Sampler see this page.

We recommend leaving the parameters of SettingsDMCSampler at their default values, with the exception of use_local_subdivs. Set this to 0, so that only the global subdivs settings are used.
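
For example (a sketch; if the scene was loaded from a file, the plugin instance may already exist and should be reused):

    dmc = renderer.classes.SettingsDMCSampler()
    dmc.use_local_subdivs = 0   # ignore per-light/per-material subdivs, use only the global settings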

6.4. Global illumination

See this page for details on the different GI engines.

By default global illumination (GI) is disabled in an empty scene (with the exception of RT GPU mode where you can't disable GI). You need to set SettingsGI::on to 1 to enable tracing of secondary rays. Any photorealistic render needs GI, so apart from some kind of debugging, you will always want to enable GI.
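
For example (sketch):

    gi = renderer.classes.SettingsGI()
    gi.on = 1   # enable global illumination
    # The primary_engine/secondary_engine parameters select the GI engines
    # configured in the subsections below (brute force, irradiance map, light cache).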

Other SettingsGI parameters of interest are:

We don't recommend changing the contrast and saturation parameters.

The choice of GI engines and their parameters is paramount, especially for interiors where most of the illumination is indirect. Bad choices can lead to excessive noise or artifacts and/or disproportionately long render times for a given image quality. The common setup for interiors is Irradiance Map primary with Light Cache secondary, while for exterior scenes it is often Brute Force primary with BF or LC secondary.

6.4.1. Brute force

The settings for Brute force are in SettingsDMCGI. There are only two parameters:

6.4.2. Irradiance map

The Irradiance map is configured through SettingsIrradianceMap.

See this page for details on the algorithm and its parameters.

The main two parameters are:

6.4.3. Light cache

The Light cache is configured through SettingsLightCache.

See this page for details on the algorithm and its parameters.

There are a lot of parameters for fine tuning and fixing specific problems but the main parameter is:

6.5. Environment

You can define environment background, lighting (GI), reflection, refraction colors or textures through the SettingsEnvironment plugin. Usually all slots have the same value. Environment textures use UVWGenEnvironment for spherical, cube etc. mapping.

You can also add scene-wide volumetric effects through the environment_volume list.

See 3dsMax Environment documentation for details.

A special case is the Sun-Sky system. V-Ray has a special procedural texture, TexSky, for these environment slots that is coupled with SunLight. The color of the environment depends on the position of the Sun.

TexSky takes its parameters from SunLight if TexSky::sun is set. You can also use a TexSky without a Sun. For parameter details see the reference section at the bottom.
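
A sketch of a Sun-Sky setup in Python; the bg_tex and gi_tex slot names of SettingsEnvironment are assumptions to verify with plgparams:

    sun = renderer.classes.SunLight()
    sun.transform = vray.Transform(
        vray.Matrix(vray.Vector(1, 0, 0),
                    vray.Vector(0, 1, 0),
                    vray.Vector(0, 0, 1)),
        vray.Vector(50, 30, 100))   # the Sun's placement determines the light direction

    sky = renderer.classes.TexSky()
    sky.sun = sun                   # the sky color follows the Sun

    env = renderer.classes.SettingsEnvironment()
    env.bg_tex = sky                # background environment
    env.gi_tex = sky                # GI environment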

6.6. Units (scale)

Some physically based calculations in V-Ray require accurate scaling of scene units to real-world units like meters, Watts, seconds, etc. This is controlled through the SettingsUnitsInfo plugin with the properties listed below. This affects the physical camera, IES lights, volumetric effects, etc.

6.7. Physical camera

Although the name doesn't hint at it, the CameraPhysical plugin is a settings plugin (singleton). It modifies the way camera rays are shot for effects like DoF (depth of field) and distortion and how they are integrated into the image - exposure. The position and orientation of the camera is still defined by RenderView. Many of the parameters are exactly the same as on a real world DSLR or video/movie camera.

Most parameters are well described in this page. We will only add a few things here:

6.8. Color mapping

This is controlled by SettingsColorMapping.

For parameter descriptions see the Maya docs.

The default values when you create a SettingsColorMapping plugin are different from the recommended values in 3dsMax and Maya for legacy reasons. These are the values you should use for new scenes:

6.8.1. Linear workflow

In order for V-Ray to perform mathematically correct calculations, input and output color data should be linear. The results should only be converted to a non-linear color space (such as sRGB) for display. This is called linear workflow.

For proper linear workflow, you need to set the SettingsColorMapping parameters gamma, adaptation_only and linearWorkflow to the values listed above. The linearWorkflow parameter is intended only for compatibility with old vrscene files and should not be set to 1, despite its misleading name.

By default the VFB (V-Ray Frame Buffer) has its sRGB option enabled, so you will see the original linear image with sRGB applied as a post-process. The actual files you save from the VFB or with VRayRenderer::saveImage() will be linear, like the underlying data, if saved in a floating-point format such as EXR or VRIMG. On the other hand, most 8/16-bit integer formats implicitly store their data gamma-encoded (for example JPEG applies ~2.2 gamma for dynamic range compression; applications that read JPEG handle that and display the data in the required color space, gamma-encoded or linear). Saving to such integer file formats with saveImage() or from the VFB button also bakes in any color corrections done in the VFB, including the sRGB conversion, so these files will look exactly like you see them in the VFB. This does not apply to EXR and VRIMG.

6.8.2. Bitmaps

There is one more place to control color mapping - BitmapBuffer - for texture file assets. It has two parameters for converting the image file colors - gamma and color_space. The gamma parameter is considered only when color_space=1 and makes texture reads apply inverse gamma to decode gamma-compressed values to linear. Most integer formats will require setting color_space=1 and gamma=0.454545. When color_space=2, sRGB decoding is performed (similar, but not identical to color_space=1 & gamma=0.4545).
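
For example, for an 8-bit file stored with 2.2 gamma (sketch):

    bitmap = renderer.classes.BitmapBuffer()
    bitmap.file = "diffuse.jpg"
    bitmap.color_space = 1      # use the explicit gamma value below
    bitmap.gamma = 0.454545     # 1/2.2 - decodes the gamma-compressed values to linear
    # For sRGB-encoded files, color_space = 2 can be used instead.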

6.9. Stereo and panorama rendering

For spherical panorama rendering you need to:

For cubic (6x1) panorama rendering you need to:

For stereo rendering:

Also remember to double the horizontal resolution for stereo. So if your normal render resolution is 640x480, make it 1280x480. The left half of the image will be the left eye view and the right half the right eye view.

6.10. RT Engine

If your renderer object was created with RT CPU or RT GPU mode parameter, you can control RT-specific parameters with SettingsRTEngine. Note that GI is always enabled when rendering on the GPU and cannot be disabled.

The render stops when any one of max_sample_level, max_render_time or noise_threshold is reached. If all are zero, the sampling goes on indefinitely.
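
For example (a sketch; the stop conditions are the parameters named above):

    rt = renderer.classes.SettingsRTEngine()
    rt.noise_threshold = 0.01   # stop when the noise drops below this level
    rt.max_render_time = 0.0    # 0 = no time limit (check the unit with plgparams)
    rt.max_sample_level = 0     # 0 = unlimited samples per pixel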

6.11. Miscellaneous

The SettingsRaycaster plugin has one main parameter of interest:

SettingsRegionsGenerator controls the size and order of generation of buckets in production mode.

SettingsOptions holds many miscellaneous options, some of which are:

See this page for more SettingsOptions parameters.

If you save the rendered image from the VFB and not from an AppSDK API, the corresponding Settings{JPEG|PNG|EXR|TIFF} plugin controls compression quality and bits per channel. You may need to change these according to your needs.

If you want to enable motion blur, set the SettingsMotionBlur plugin's on parameter to 1. It also has a geom_samples parameter that affects quality, but can cost a lot of render time if increased. It should match the number of geometry samples in the geometry data if the geometry is non-static. Note that this plugin (SettingsMotionBlur) conflicts with CameraPhysical.
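
For example (sketch):

    mb = renderer.classes.SettingsMotionBlur()
    mb.on = 1            # enable motion blur
    mb.geom_samples = 2  # raise only for strongly non-linear motion; it costs render time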

The SettingsLightLinker plugin allows you to define include or exclude lists for lights and objects, so that for example specific lights do not affect some objects etc. Refer to the plugin parameter metadata for explanations.

SettingsCaustics can enable improved rendering of caustic effects with photon mapping. This also requires setting some material parameters to make it work.

See this page for some details on caustics.

If you are not using CameraPhysical for depth of field, you can use SettingsCameraDof instead. Note that this plugin (SettingsCameraDof) conflicts with CameraPhysical.

The .vrscene file can store multiple cameras, but only one can be set as ‘renderable’ at a time. The other cameras can be chosen for rendering, for example in V-Ray Standalone, using the -camera command line flag. This is useful because a project often requires rendering different sequences from different cameras while nothing else changes in the scene - a simple example would be rendering two different views of the same visualization, so the same scene can be rendered twice with V-Ray Standalone. An example scene with two cameras would have one of them render by default, while the other has the dont_affect_settings flag raised.

In shading setups, textures can be projected from cameras. This can either be the same camera used for rendering, or another camera in the scene, used only for projecting a texture. In both cases, an extra SettingsCamera and RenderView plugin is exported for the camera projection with the dont_affect_settings flag raised. This is similar to the Multiple Cameras setup. In all cases, it’s a good idea to export one camera for rendering, and another one for the texture projection, even if it’s the same camera, as different properties apply for each.

7. Minimal renderable scene

This is a diagram of the simplest possible scene that will render an object. These are the plugins and parameters you will need to create and set.
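
A hedged Python sketch of such a minimal scene, assuming the renderer.classes factory, the vray helper types and the startSync()/waitForRenderEnd() calls from the SDK examples; all parameter values are illustrative:

    import math
    import vray

    renderer = vray.VRayRenderer()

    identity = vray.Matrix(vray.Vector(1, 0, 0),
                           vray.Vector(0, 1, 0),
                           vray.Vector(0, 0, 1))

    # Camera (section 2): placed above the origin, default orientation.
    view = renderer.classes.RenderView()
    view.transform = vray.Transform(identity, vray.Vector(0, 0, 5))
    view.fov = math.radians(45)

    # Light (section 3)
    light = renderer.classes.LightOmni()
    light.transform = vray.Transform(identity, vray.Vector(2, 2, 5))
    light.intensity = 30

    # Geometry: a single triangle facing the camera (section 4)
    mesh = renderer.classes.GeomStaticMesh()
    mesh.vertices = [vray.Vector(-1, -1, 0), vray.Vector(1, -1, 0), vray.Vector(0, 1, 0)]
    mesh.faces = [0, 1, 2]      # vertex indices, three per triangle

    # Material (section 5)
    brdf = renderer.classes.BRDFVRayMtl()
    brdf.diffuse = vray.AColor(0.7, 0.7, 0.7, 1)
    mtl = renderer.classes.MtlSingleBRDF()
    mtl.brdf = brdf

    # Node ties geometry, material and placement together (section 4.1)
    node = renderer.classes.Node()
    node.geometry = mesh
    node.material = mtl
    node.transform = vray.Transform(identity, vray.Vector(0, 0, 0))

    renderer.startSync()
    renderer.waitForRenderEnd()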


A. Brief plugin reference

This reference does not list all of V-Ray's plugins, but most users will rarely need the ones that are not mentioned here.

Also only some of the parameters are explained here. As a general rule, follow the advice in Debugging and help when looking for info on plugins.

A.1. Common plugins

Light plugins (see also section 3 above)

LightOmni

LightSpot

LightRectangle

LightSphere

LightMesh

LightDome

LightIES

SunLight

Geometry plugins

Node

GeomStaticMesh

GeomStaticNURBS

GeomStaticSmoothedMesh

GeomDisplacedMesh

GeomHair

GeomMeshFile

Textures and UVWGenerators

TexSky

A.2. Other plugins

CameraDome

EffectLens

GeomParticleSystem

VRayClipper

Volumetric plugins


The information from this page is also available at User Guide V-Ray Application SDK