How to "flatten" or "index" 3D-array in 1D array?

x + y*WIDTH + z*WIDTH*DEPTH. Visualize it as a rectangular solid: first you traverse along x, then each y is a "line" WIDTH cells long, and each z is a "plane" WIDTH*DEPTH cells in area.
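A minimal sketch of that mapping in Java, assuming x runs over WIDTH, y over DEPTH, and z over HEIGHT (the answer leaves the dimension bounds implicit, so these are illustrative assumptions):

class Flatten {
    // Illustrative sizes; any positive values work.
    static final int WIDTH = 4, DEPTH = 3, HEIGHT = 2;

    // Stepping y skips one WIDTH-long "line"; stepping z skips one
    // WIDTH*DEPTH "plane", matching the visualization above.
    static int flatten(int x, int y, int z) {
        return x + y * WIDTH + z * WIDTH * DEPTH;
    }

    public static void main(String[] args) {
        // Visits every flat index 0..WIDTH*DEPTH*HEIGHT-1 exactly once.
        for (int z = 0; z < HEIGHT; z++)
            for (int y = 0; y < DEPTH; y++)
                for (int x = 0; x < WIDTH; x++)
                    System.out.println(flatten(x, y, z));
    }
}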


I think the above needs a little correction. Let's say you have a HEIGHT of 10 and a WIDTH of 90; the single-dimensional array will have 900 elements. By the above logic, the last element of the array lands at 9 + 89*90 = 8019, which is obviously greater than 900. The correct algorithm is:

Flat[x + HEIGHT * (y + WIDTH * z)] = Original[x, y, z], assuming Original[HEIGHT, WIDTH, DEPTH]

Ironically, if HEIGHT > WIDTH you will not experience an overflow, just completely bonkers results ;)
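Here is a quick bijection check of the corrected formula, as a Java sketch; the concrete sizes are arbitrary illustrative choices matching the example above:

class FlatCheck {
    static final int HEIGHT = 10, WIDTH = 90, DEPTH = 3;

    // Flat[x + HEIGHT * (y + WIDTH * z)] = Original[x, y, z]
    static int toFlat(int x, int y, int z) {
        return x + HEIGHT * (y + WIDTH * z);
    }

    public static void main(String[] args) {
        boolean[] seen = new boolean[HEIGHT * WIDTH * DEPTH];
        for (int x = 0; x < HEIGHT; x++)
            for (int y = 0; y < WIDTH; y++)
                for (int z = 0; z < DEPTH; z++)
                    seen[toFlat(x, y, z)] = true; // would throw if out of range
        int covered = 0;
        for (boolean s : seen) if (s) covered++;
        // A bijection covers every slot: prints "2700 of 2700 indices covered".
        System.out.println(covered + " of " + seen.length + " indices covered");
    }
}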


The algorithm is mostly the same. If you have a 3D array Original[HEIGHT, WIDTH, DEPTH] then you could turn it into Flat[HEIGHT * WIDTH * DEPTH] by

Flat[x + HEIGHT * (y + WIDTH * z)] = Original[x, y, z]
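The inverse mapping peels one dimension off at a time with remainder and division. A sketch under the same naming, assuming HEIGHT and WIDTH are fields in scope:

// Inverse of Flat[x + HEIGHT * (y + WIDTH * z)]: recover the innermost
// index (x) first, then divide it away and repeat.
public int[] toOriginal(int idx) {
    int x = idx % HEIGHT;
    idx /= HEIGHT;
    int y = idx % WIDTH;
    int z = idx / WIDTH;
    return new int[]{ x, y, z };
}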

As an aside, you should prefer arrays of arrays (jagged arrays) over multi-dimensional arrays in .NET; the performance differences are significant.


Here is a solution in Java that gives you both:

  • from 3D to 1D
  • from 1D to 3D

Below is a graphical illustration of the path I chose to traverse the 3D matrix; the cells are numbered in their traversal order:

[Figure: two examples of 3D matrices, with cells numbered in traversal order]

Conversion functions:

// xMax and yMax are the sizes of the x and y dimensions (fields of the class).
public int to1D( int x, int y, int z ) {
    // z-major layout: each z slice holds xMax*yMax cells, each y row holds xMax.
    return (z * xMax * yMax) + (y * xMax) + x;
}

public int[] to3D( int idx ) {
    final int z = idx / (xMax * yMax); // which z slice the index falls in
    idx -= (z * xMax * yMax);          // remainder within that slice
    final int y = idx / xMax;          // which row within the slice
    final int x = idx % xMax;          // position within the row
    return new int[]{ x, y, z };
}
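The two functions are inverses, which is easy to verify with a round-trip check; zMax is a hypothetical field for the z dimension's size, introduced here only for the sketch:

// Hypothetical self-test: to3D(to1D(x, y, z)) must return {x, y, z}
// for every cell of the xMax * yMax * zMax volume.
public void roundTripCheck() {
    for (int z = 0; z < zMax; z++)
        for (int y = 0; y < yMax; y++)
            for (int x = 0; x < xMax; x++) {
                int[] back = to3D(to1D(x, y, z));
                if (back[0] != x || back[1] != y || back[2] != z)
                    throw new IllegalStateException("mismatch at " + x + "," + y + "," + z);
            }
}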