Static and animated 3D content can provide more realistic and engaging mobile augmented reality experiences. However, creating 3D objects is not as straightforward as featuring video and photo AR layers. To help you out, we’ve written a blog in collaboration with our partner, Jānis Uzraugs, sharing how to use Blender to create 3D content for the Overly app.
Blender is free, open-source 3D creation software and offers much the same feature set as Cinema 4D, 3DS Max or Maya. Unless you already own paid software, there are no notable benefits to using a costly alternative over Blender.
Where does Unity3D come in?
The Overly app is built with Unity3D, so all 3D objects need to be prepared in a way that integrates with Unity. If you are creating 3D objects that do not move, you can export the files in .OBJ format. If you need more complex scenes with animations, you need to export these in .FBX format and prepare asset packs for the Overly app. However, do not worry about getting familiar with Unity: once you send over the .OBJ or .FBX files, our in-house team integrates your project into Unity and places it in the Overly app.
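To give a feel for the static case, here is a minimal sketch of what an .OBJ file actually contains. It is plain text: `v` lines list 3D points and `f` lines reference them by 1-based index. The Python below just writes a one-triangle mesh by hand (the file name is only an example); in practice Blender's exporter produces this for you.

```python
# Minimal sketch of the Wavefront .OBJ text format: a single triangle.
# "v x y z" lines define vertices; "f a b c" lines define a face by
# referencing those vertices with 1-based indices.
triangle_obj = "\n".join([
    "# one triangle in the x-y plane",
    "v 0.0 0.0 0.0",
    "v 1.0 0.0 0.0",
    "v 0.0 1.0 0.0",
    "f 1 2 3",
])

# Write it out as a valid, if tiny, .OBJ file.
with open("triangle.obj", "w") as f:
    f.write(triangle_obj)
```

Opening the resulting file in Blender (or any 3D viewer) shows a single triangle, which is why .OBJ works well for static objects but carries no animation data.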
Polygons (low poly)
While there are no strict requirements for the number of polygons in augmented reality models and scenes, keep in mind that people view this content on a wide range of devices. Some of those devices may be quite old, so it is best practice to keep the polygon count low.
While the most current devices can easily process 300 to 400 thousand polygons, to stay on the safe side it's best to keep your scenes at 250 thousand polygons for mobile devices. If you are creating a separate 3D object, do not exceed 100 thousand polygons.
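A quick way to sanity-check a budget is to count faces in an exported .OBJ file. Below is a hedged pure-Python sketch (not part of any Blender or Overly tooling); it treats an n-sided face as n − 2 triangles, the way it will be triangulated on the GPU, and compares the total against the safe-side scene figure mentioned above.

```python
def count_triangles(obj_text: str) -> int:
    """Count triangles in Wavefront .OBJ text.

    An n-sided face ("f" line with n vertex references) triangulates
    into n - 2 triangles, so a quad counts as 2, and so on.
    """
    total = 0
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "f":
            total += len(parts) - 1 - 2  # n vertex refs -> n - 2 triangles
    return total

MOBILE_SCENE_BUDGET = 250_000  # the safe-side figure suggested above

# One quad (4 vertices, one "f" line) triangulates into 2 triangles.
obj_text = "v 0 0 0\nv 1 0 0\nv 1 1 0\nv 0 1 0\nf 1 2 3 4\n"
tris = count_triangles(obj_text)
print(tris, tris <= MOBILE_SCENE_BUDGET)
```

Blender also shows these statistics directly in the viewport, but a script like this is handy for checking final exported files in bulk.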
Rather than restate the facts, we've borrowed some information from www.autodesk.com, as we think they explain polygons perfectly.
“Polygons are straight-sided shapes (3 or more sides), defined by three-dimensional points (vertices) and the straight lines that connect them (edges). The interior region of the polygon is called the face. Vertices, edges, and faces are the basic components of polygons. You select and modify polygons using these basic components.
When you model with polygons you usually use three-sided polygons called triangles or four-sided polygons called quadrilaterals (quads). Maya also supports the creation of polygons with more than four sides (n-gons) but they are not as commonly used for modeling.
An individual polygon is commonly called a face, and is defined as the area bounded by three or more vertices and their associated edges. When many faces are connected together they create a network of faces called a polygon mesh (also referred to as a polyset or a polygonal object). You create your 3D polygonal models using polygon meshes.”
To add a quick summary with a visualization: a vertex is a point where two or more curves, lines, or edges meet; the corners of polygons and polyhedra are vertices. A face is a 2D shape on a mesh defined by its surrounding vertices. An edge is the line where two faces meet.
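The relationship between these components can be made concrete with a small sketch. Assuming a mesh is stored simply as a list of faces, where each face is a tuple of vertex indices (an illustrative representation, not Blender's internal one), the unique vertices and edges fall out directly:

```python
def mesh_components(faces):
    """Given faces as tuples of vertex indices, recover the unique
    vertices and undirected edges of the polygon mesh."""
    vertices = set()
    edges = set()
    for face in faces:
        vertices.update(face)
        for i in range(len(face)):
            a, b = face[i], face[(i + 1) % len(face)]
            edges.add((min(a, b), max(a, b)))  # store edges undirected
    return vertices, edges

# A unit quad split into two triangles that share one edge:
faces = [(0, 1, 2), (0, 2, 3)]
verts, edges = mesh_components(faces)
print(len(verts), len(edges), len(faces))  # 4 vertices, 5 edges, 2 faces
```

Note the shared edge between the two triangles is counted only once; that sharing is exactly what makes connected faces a polygon mesh rather than loose triangles.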
Animations can be prepared using object transformations: changing an object's location, scale and rotation along the x, y and z axes. You can also create animations with an object rigging system. Rigging is helpful for animating objects such as humans, animals, cars or other mechanical devices that consist of more than one part. In one of our recent projects, created for a park tourism brochure, a rigging system was used to create the impression of trees swaying in the wind.
All animations are prepared in the Timeline, marking keyframes where changes to the object, bone or armature take place in x, y, z space.
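The keyframe idea can be sketched outside Blender as well: store values only at key frames and interpolate between them for every frame in between. The snippet below is a simplified, pure-Python model of linear interpolation on a timeline (the frame numbers and the x-location channel are made-up examples; Blender additionally supports other interpolation curves).

```python
def sample(keyframes, frame):
    """Linearly interpolate a keyframed value at a given frame.

    `keyframes` maps frame number -> value, like keys on the Timeline.
    Frames outside the keyed range clamp to the first/last key.
    """
    frames = sorted(keyframes)
    if frame <= frames[0]:
        return keyframes[frames[0]]
    if frame >= frames[-1]:
        return keyframes[frames[-1]]
    for f0, f1 in zip(frames, frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0..1 between the two keys
            return keyframes[f0] + t * (keyframes[f1] - keyframes[f0])

# X-location keyed at frames 1 and 25: the object slides from 0 to 4.
x_location = {1: 0.0, 25: 4.0}
print(sample(x_location, 13))  # halfway between the keys: 2.0
```

This is why only the keyframes need to be authored by hand; the in-between frames are computed automatically.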
Each object has its own animation with a unique title attached, so it is easier to identify them when objects and animations are uploaded to Unity. As any one object may have more than one animation, the title should note both the object it belongs to and what the animation does. This is mostly relevant to scenes with more than one object, so the assets stay organized and are not confusing for the Overly support team that handles the Unity integration.
Animations are created in the Timeline.
Animation titles and their attachment to specific objects are set in the Action Editor, a mode of the Dope Sheet editor.
Object as one or many separate layers
As with the park example featured above, any project created for the Overly app should be exported as a single file once it is complete. That single file should consist of clearly categorized objects, so there are no issues if something needs to be changed after the upload to Unity3D. All parameters, such as materials and textures, should be included for each object separately.
The origin point of all elements must be located at point 0 (zero) on the x, y and z axes.
Textures/materials/colors (color and light differences: rendered visualization vs AR)
If an object has only a color assigned, there is no need to create textures. You can just choose the object’s material and give it a specific color.
However, if you want to assign more than a single color to an object (for example color changes, patterns, hieroglyphs, etc.), you will need to create textures.
One option is to create these textures in Blender and bake them onto the specific object as .JPG or .PNG files. However, you must be careful with a couple of things: you shouldn't bake lighting or shadows from the scene, because they will look wrong alongside the Unity project's lights. Therefore, set the bake type to Diffuse (the default is Combined) and leave just the Color option turned on.
Before the texture is baked, check that the normals are correct and facing the right direction. Otherwise, polygons facing the wrong direction will be baked black into the textures and will appear black or transparent once imported into Unity.
In that case, the normals can be flipped so they all face the same direction.
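Why does direction matter? A face's normal follows from the winding order of its vertices: it is the cross product of two edge vectors, and reversing the vertex order flips it. The pure-Python sketch below illustrates this (it is the underlying math, not Blender's API; in Blender itself you would simply recalculate or flip normals in Edit Mode).

```python
def face_normal(a, b, c):
    """Normal of a triangle from its winding order: the cross product
    of two edge vectors. Reversing the vertex order flips the normal."""
    u = [b[i] - a[i] for i in range(3)]  # edge a -> b
    v = [c[i] - a[i] for i in range(3)]  # edge a -> c
    return [
        u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0],
    ]

# The same triangle in the x-y plane, wound two ways:
a, b, c = (0, 0, 0), (1, 0, 0), (0, 1, 0)
print(face_normal(a, b, c))  # [0, 0, 1]  -- faces +z
print(face_normal(a, c, b))  # [0, 0, -1] -- flipped winding, faces -z
```

A polygon whose normal points away from the camera is effectively "inside out", which is what produces those black or transparent faces after baking and import.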
When the textures are baked, all you need to do is assign them to specific objects and create a new material with a certain texture attached.
Before the objects are textured, a UV map has to be created for each one. It sounds more complex than it is.
In short, it is a two-dimensional image of where different textures appear on the 3D object. It is the same as drawing a layout for a cube on a piece of paper, then cutting it out, folding it and gluing it together.
Once you’ve created layouts, you can create textures for each object for importing to Unity.
One way to create textures was described above. However, you can also take those layouts into a different program, such as Photoshop, if you prefer.
Here is an example of the above building layout:
Different types of textures exist, responsible for making materials glossy, transparent, rough and so on. Each of these textures has to be saved as a .JPG file. This is quite useful for creating things such as reflections on windows or highlighting various aspects of a building's facade.
As the prepared files are later imported into Unity, there is no need to create lights and cameras in Blender. The only exception is rendering a preview for yourself or showing a scene or an object to a client.
File format, .FBX size
Once the animated file is prepared, it has to be exported as an .FBX file. This file format allows you to save textures, objects, animations and armatures.
When exporting, it is important to select just the mesh and armature. When you submit a file to Overly for uploading to Unity, it is best to also include a folder with all the textures, just to be on the safe side.