
Thursday, September 8, 2016

Introducing Farm Cloud

We have included a preview of our collaborative editing system in the latest Voxel Farm 3 release.

The following short video shows what it takes to set up and share a project from scratch:

We are still working on this system, but hopefully you get the idea. Our focus is to make the setup as simple as possible so people can start collaborating without significant effort.

This is also intended for more than collaboration and source control. This same layer of distributed persistence can be used by any application to store and sync the changes their users make.

This is a very active area of development for us. I will be posting more about this in the future, meanwhile as always I look forward to the questions you may have about this system.

Monday, August 22, 2016

Pyramid Scene in UE4

Here you can see a video of the Pyramid scene from the previous post:

There are many improvements in this new engine version. Aside from new features like textured voxels and procedural materials, we invested a great deal of time in the quality of the experience. There is still some pop-in, which comes from a little bug in the meta-material layer. Aside from that, the LOD changes are pretty stable and unnoticeable for the most part. The use of UV-mapped voxels also helps minimize the pop-in.

Another old problem area was the replanting of procedural flora (and other objects) every time there was an edit. The new instancing system provides a stable representation that is able to transfer the flora planting into the new geometry. Only edits that really affect the planted instances result in a visual change.

The destruction and building shown here are controlled by blueprints inside UE4. You could easily swap the bombs for a pickaxe or for rockets.

Thursday, August 4, 2016

System Integration

Here is a scene that combines all the recent developments in Voxel Farm:

This integration is subtle. There are no giant turtle mountains here, but the same system we used for the giant turtle is applied to produce the natural rock pillars connected by the bridges, as well as the platform where the pyramid rests:

These natural structures are defined by a low resolution mesh that captures the basic shape. This is expanded in realtime into detailed features like rocks, sand and dirt using procedural materials that we call Meta-materials.

The large terrain around the pillars and the pyramid's base is using Voxel Farm's standard procedural terrain.

This scene also uses UV-mapped voxels. This is a different method to apply textures to voxels. It allows much finer control. The entire pyramid is using them:

There is also the new instancing system, which is responsible for all the trees, plants, and rocks you see scattered around. You cannot see this in just screenshots, but this new system is able to preserve existing instances even in the event of user edits and destruction. This avoids the visible "replanting" of vegetation and rocks around edits.

And all this is running in Unreal Engine 4. I would say the UE4 integration is the last of the systems that went into taking these shots.

Friday, July 15, 2016

Intelligent Terrain Synthesis

Don't you hate it when your favorite TV series puts out an episode that is just clips of stuff that happened in earlier episodes? This post has some of that but hopefully will provide you with a better idea of how we see Procedural Generation in the near future.

This video shows the new procedural terrain system to be released in Voxel Farm 3:

In case you want to find out more about what is happening under the hood, these previous posts may help:

Geometry is Destiny Part I and Part II
Introducing Pollock
Terrain Synthesis

The idea is simple. Instead of asking an artist to labor over a hundred different assets, either by hand or by using complex generation tools like World Machine, we now have a synthetic entity that can do some of that work through a mix of AI and simulation. You do not have to be an expert, or initiated at all in the arts of procedural generation, to get a satisfactory outcome.

Why are AI and simulation important? After working for a while in procedural generation, it became clear to me there was no workaround for the entropy problem. I believe it can be stated like this: viewers of procedurally generated content will perceive only the "seed" information, not the "expanded" data. Yes, you may have a noise function that can output terabytes of data, but all this data will be compressed by the human psyche to the few bytes it takes to express the noise function itself. I posted in more detail about this problem here:

Uncanny Valley of Procedural Generation
Procedural Information
Evolution of Procedural

This does not mean all procedural generation is bad. It means it must produce information in order to be good. Good procedural generation is closer to simulation, automation and AI. You cannot have information without work being done, and if work is to be done, it is better left to the machine.

The video at the top shows our first attempt at having AI that can do procedural generation, stay tuned because more is coming.

Monday, July 11, 2016

Geometry is Destiny Part 2

This is a continuation of an earlier post. That post ended in a literal, virtual cliffhanger. We generated continent shapes and we used tectonic plate simulation to compute where mountain ranges would appear.

The remaining step was to assign biomes. We wanted biome placement to be believable so we did a bit of research on what biomes are where and why they appear.

Scientists have distilled this to a very convenient set of rules, which are captured by the following chart:

I stole this particular image from Mugan's Biology Page, but you will find it pretty much everywhere the occurrence of biomes on Earth is discussed. This is a classification made by Robert Whittaker based on annual precipitation and average temperature. His study suggests temperature and humidity are enough to determine biome occurrence.

This made it simple for us. We only needed to compute two parameters across the continent: temperature and humidity.

Let's start with the easiest, which is temperature. We figured some of this would have to be user input. As we did not know the latitude of the continent, we would not be able to determine how cold or hot it was. Instead of asking how much sun the landmass was getting (which is what latitude means in this case), we chose to ask for temperature values for each corner of the map:

To get the temperature for any position within the map we just do a linear interpolation from these four corner values.

Temperature also changes with elevation, dropping from 6 to 10 degrees Celsius for each extra kilometer of altitude. We had a rough elevation map from the tectonic plate simulation; with this additional piece we are able to compute a fairly good temperature value for any point on the map. This is the horizontal axis of the Whittaker chart.
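A minimal sketch of this temperature model (the function name, corner ordering and exact lapse rate are my assumptions for illustration, not Voxel Farm's actual code): bilinear interpolation of the four user-supplied corner temperatures, followed by a lapse-rate correction for elevation.

```python
def temperature(x, y, corners, elevation_km, lapse_rate=-6.5):
    """Temperature in Celsius at map position (x, y), both in [0, 1].

    corners = (nw, ne, sw, se): the four user-supplied corner temperatures.
    lapse_rate: degrees Celsius lost per extra kilometer of altitude.
    """
    nw, ne, sw, se = corners
    north = nw + (ne - nw) * x        # interpolate along the north edge
    south = sw + (se - sw) * x        # interpolate along the south edge
    base = north + (south - north) * y  # blend the two edges
    return base + lapse_rate * elevation_km

# A 2 km peak at the center of a map with corners 30/10/20/0 C:
t = temperature(0.5, 0.5, (30.0, 10.0, 20.0, 0.0), 2.0)  # → 2.0
```

At sea level this degenerates to plain bilinear interpolation, which matches the corner values exactly at the corners.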

The vertical axis is rainfall or humidity percentage as other biome charts have it. This one is trickier. In real life, water evaporates from large water masses like the sea into clouds. Air rises when it meets higher elevation. There it cools down losing some of its ability to hold water. The water excess becomes rainfall.

We chose to simulate exactly that process. We knew the continent would be surrounded by ocean water. This would be the main source of water. The next step would be to determine wind direction. Continents are exposed to different wind systems depending on where they are on the planet. Global wind simulation was out of scope for us, so we chose again to ask the user:

The input is quite simple, just a wind direction for each corner of the map. With this, we would be able to produce a wind vector field for the entire map.

If you remember the previous post, a key aspect of this simulation framework was the use of a mesh instead of a regular 2D grid:

This came in handy for simulating water transfer. Each node is seeded with an initial amount of water. Nodes over the ocean get 100% and nodes over the landmass get zero.

Then we perform multiple simulation steps. Each step looks at each pair of connected nodes and figures out how much water moved from one node to the next and how much was lost as precipitation.

Assume two connected nodes, A and B. The dot product between the wind vector at A, and the vector that goes from A to B will tell us how able the wind is to carry water from A to B. Then based on the water already contained in A and B, and the temperature changes, we can compute how much water moved and how much rainfall there is.
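A toy version of that per-edge step might look like this. All names and the rainfall coefficient are hypothetical; the real system presumably tunes these constants against the mesh, but the structure (dot product for carrying capacity, cooling driving precipitation) is the one described above.

```python
import math

def transfer(wind_a, pos_a, pos_b, water_a, temp_a, temp_b, rain_rate=0.2):
    """One simulation step over the edge A -> B.

    Returns (water_delivered_to_B, rainfall_on_the_edge).
    rain_rate is an illustrative coefficient: rainfall fraction per degree
    of cooling between A and B.
    """
    # Unit vector from A to B.
    dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    length = math.hypot(dx, dy)
    # Dot product of the wind at A with the A->B direction: how able
    # the wind is to carry water from A to B.
    carry = (wind_a[0] * dx + wind_a[1] * dy) / length
    carry = max(carry, 0.0)        # wind blowing away from B moves nothing
    moved = carry * water_a
    # Cooler air at B holds less water; the excess becomes rainfall.
    cooling = max(temp_a - temp_b, 0.0)
    rain = moved * min(rain_rate * cooling, 1.0)
    return moved - rain, rain

# Wind blowing straight from A toward B, with B two degrees cooler:
delivered, rain = transfer((1.0, 0.0), (0.0, 0.0), (1.0, 0.0),
                           water_a=1.0, temp_a=20.0, temp_b=18.0)
```

Running this over every connected node pair, for enough iterations to let moisture propagate across the graph, yields the humidity patterns shown below.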

After sufficient simulation steps to cover the node graph, the following pattern emerges:

Here the wind comes from the south. The grayscale shows humidity. The red areas show where the mountains are. As you can see most of the moisture carried by the wind precipitates right after entering the continent, leaving most of the land behind very dry.

If we switch wind direction to the north, a very different pattern emerges:

Once both temperature and humidity are known, assigning biomes is trivial. You can think of the Whittaker chart as a 2D matrix. Humidity determines which row you would use and temperature the column:
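As a sketch, the chart-as-matrix lookup can be as simple as the following. The biome names and band thresholds here are illustrative stand-ins, not the actual values from the Whittaker chart or from Voxel Farm:

```python
# Rows: humidity bands (dry, moderate, wet). Columns: temperature bands
# (cold, temperate, hot). Entries and thresholds are illustrative only.
BIOMES = [
    ["tundra", "cold desert",      "subtropical desert"],   # dry
    ["taiga",  "woodland",         "savanna"],              # moderate
    ["taiga",  "temperate forest", "tropical rainforest"],  # wet
]

def biome(temp_c, humidity):
    """Pick a biome from the matrix: humidity selects the row,
    temperature selects the column."""
    col = 0 if temp_c < 0 else (1 if temp_c < 20 else 2)
    row = 0 if humidity < 0.2 else (1 if humidity < 0.6 else 2)
    return BIOMES[row][col]
```

Swapping in a different matrix is exactly the generalization to other-worldly biome systems mentioned below.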

This could be generalized to other biome types by providing a different matrix, but I have not paid too much attention to other-worldly biome systems. I have not found any good examples of biome attribution beyond temperature and humidity.

Once you know the biome for each node in the mesh, obtaining a 2D map is a matter of drawing each triangle in the mesh to a bitmap. We had a software rasterizer for occlusion culling that came in pretty handy here.
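A minimal software rasterizer for this job might use an edge-function (half-space) test per pixel, writing the biome of the dominant vertex. This is a sketch under my own assumptions, not the engine's occlusion-culling rasterizer:

```python
def edge(a, b, p):
    # Signed doubled area of triangle (a, b, p); > 0 if p is left of a->b.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(bitmap, tri, biomes):
    """Draw one mesh triangle into a 2D bitmap (list of rows), writing
    the biome id of the nearest (dominant-weight) vertex at each pixel."""
    a, b, c = tri
    if edge(a, b, c) < 0:              # enforce counter-clockwise winding
        b, c = c, b
        biomes = (biomes[0], biomes[2], biomes[1])
    # Clamp the bounding box to the bitmap.
    xmin = max(int(min(a[0], b[0], c[0])), 0)
    xmax = min(int(max(a[0], b[0], c[0])), len(bitmap[0]) - 1)
    ymin = max(int(min(a[1], b[1], c[1])), 0)
    ymax = min(int(max(a[1], b[1], c[1])), len(bitmap) - 1)
    for y in range(ymin, ymax + 1):
        for x in range(xmin, xmax + 1):
            p = (x + 0.5, y + 0.5)     # sample at the pixel center
            w = (edge(b, c, p), edge(c, a, p), edge(a, b, p))
            if min(w) >= 0:            # inside the triangle (or on an edge)
                bitmap[y][x] = biomes[w.index(max(w))]

# Fill one triangle of an 8x8 map with three per-vertex biome ids:
bitmap = [[None] * 8 for _ in range(8)]
rasterize(bitmap, ((0.0, 0.0), (8.0, 0.0), (0.0, 8.0)), (1, 2, 3))
```

Looping this over every mesh triangle produces the final biome bitmap.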

I leave you with a few examples of what happens when you play with wind directions and base temperatures:
