
04. Integration With Other Systems

※Content marked in blue was added on 2013/10/03.

 This page explains the essential considerations when integrating Live2D with other systems.
    It does not cover platforms that already have an SDK available, such as Unity or cocos2d-x;
        if you are developing on one of those platforms, you may skip this section.
        If your target system supports the operations and image rendering described below, integration will be much more straightforward.

Live2D Initialization and Memory Release
    
    Call Live2D::init() to initialize Live2D.
    Call Live2D::dispose() to release memory and unload the library. (Games usually do not need to unload while running.)
Memory Management

    To control the memory Live2D uses (both CPU- and GPU-side), provide memory allocation externally, then start Live2D with
    Live2D::init().
    The Live2D memory manager requires at least 4 KB of memory.


Model and Texture - Reading and Initializing 

 Live2D provides only image-rendering functions, not resource management. External code or the game engine is expected to handle
    resources.
  • Standalone files such as the Live2D model are meant to be read from storage outside of Live2D by external code, and passed to the library as a byte array for initialization.
  • Likewise, textures are to be loaded and initialized outside of Live2D before being handed to it.

Motion and Expression
    
    Use event listeners, such as a mouse click, to issue Live2D commands such as starting a motion or changing an expression.
    Live2D's internal operations are not thread-safe, so do not update model data and render from multiple threads simultaneously.

Frame Refresh and Render for Model
 Once the initialization described above has run, the model is ready to accept the following commands every frame:
  • Setting the model's parameters; playing and stopping animation data.
  • The model's update() processing (mainly CPU work).
  • The model's draw() processing (mainly polygon construction) - explained in detail later.

Details About Image Rendering
 Unlike 3D rendering (which uses a depth buffer), Live2D renders 2D polygons (including transparent textures) in layer order to compose the
    image. (Think of Photoshop's layering feature, applied to real-time rendering.)

 Conceptually, rendering works as follows:
  • The depth test is off by default. (Individual polygons can still have their Z values changed for rendering.)
  • The shader only handles alpha blending of textures.
  • Textured polygons are rendered from vertex, UV, and index arrays.
    On platforms such as OpenGL, calling draw() on the Live2D model issues all of the rendering commands; they are called from inside the
    Live2D library. Although the source code for Live2D's renderer is not open to the public, special arrangements can be made by
    contract. Please contact our technical support team for details.

・・・
 Rendering a Live2D object directly on the same screen as 3D objects, and rendering it to an off-screen buffer first, behave differently. If you need to make your Live2D character semi-transparent, method 2 below is required.

    1. Rendering the Live2D image directly on screen

You can render a semi-transparent character by first rendering the semi-transparent 3D polygons, then calling Live2D's rendering to draw the character. However, using the depth buffer to place 3D objects in front of or behind a Live2D object is difficult to handle correctly and requires considerable skill. This method is only worthwhile for a 2D game that simply needs a Live2D character to look semi-transparent.
 
    2. Rendering the Live2D image to an off-screen buffer

Alternatively, use a Frame Buffer Object (FBO) to render the Live2D image first, and then draw that buffer on screen. For example, using an FBO:

2-1 Render the Live2D model onto a transparent FBO.
2-2 Render the FBO to the screen (the same as rendering any other semi-transparent texture).
First draw the Live2D image into the FBO, then adjust the opacity when rendering the FBO to the screen, and you get a semi-transparent Live2D model. (If you instead render the Live2D model directly to the screen and adjust its transparency, the model's layers mix their opacities with each other, causing problems such as the back-of-hair texture showing through a semi-transparent face texture.)


※When using an off-screen buffer, if the texture has not been premultiplied (each RGB value multiplied by its alpha value), then 1) the displayed colors may be inaccurate, and 2) the character's outline may show noisy pixels.
