LiDAR (.las) to bare-earth DEM: average point spacing and cell size?
It really depends on what you're going for here. If your goal is to make accurate measurements, you will want the grid spacing to be smaller, around the average point spacing of the dataset. If this is part of a process of creating orthophotos or analyzing watersheds, or any other task which requires a more generalized terrain model, you will want the spacing to be larger.
Also, if you've filtered the data to the ground points, USE THOSE! You can ignore the other classifications.
Three times the point spacing is generally considered a good choice for generalization because it gives each grid cell several points (roughly three across) to average into a more consistent value. In my profession, when I produce orthophotos I sometimes allow a huge grid size, often 8-12 times the point spacing.
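To make this concrete, here is a minimal sketch of how you might estimate the average point spacing (all points versus ground-only) and derive candidate cell sizes from it. It assumes the laspy library and a placeholder file name, neither of which comes from the question, and the density-based estimate (spacing ≈ sqrt(bounding-box area / point count)) is only a rough approximation over the tile extent:

```python
import numpy as np
import laspy  # assumption: laspy 2.x, not specified in the question

las = laspy.read("tile.las")  # placeholder file name

x = np.asarray(las.x)
y = np.asarray(las.y)
ground = np.asarray(las.classification) == 2  # ASPRS class 2 = ground

def avg_spacing(px, py):
    """Rough density-based estimate: spacing ~ sqrt(bounding-box area / point count)."""
    area = (px.max() - px.min()) * (py.max() - py.min())
    return np.sqrt(area / px.size)

spacing_all = avg_spacing(x, y)
spacing_ground = avg_spacing(x[ground], y[ground])

print(f"average spacing, all points:    {spacing_all:.3f} m")
print(f"average spacing, ground points: {spacing_ground:.3f} m")
print(f"detailed DEM cell size  (~1x):  {spacing_ground:.2f} m")
print(f"generalized cell size   (~3x):  {3 * spacing_ground:.2f} m")
```

The 1x and 3x multipliers here simply mirror the rule of thumb above; pick whatever multiplier suits the product you're after.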
My point (heh) is: there is no right or wrong way to produce a bare earth model. Sure, there are specifications sometimes (and if this is one of those cases, I suggest you disregard whatever I'm saying and follow the specifications), but producing what you need to suit your problem can be totally heuristic.
Do you want a tighter, presumably more accurate model with harder edges, on which you can model the terrain more precisely? Use a smaller spacing. Do you want a smoother, more generalized terrain model that is less precise but shows the overall character of a large area? Use a larger spacing.
A lot of this is trial and error. That's the beauty of this kind of thing: it's about 30% science and 70% art.
I work as part of a production team that acquires LiDAR and generates DEMs for several government agencies, including the USGS. It is common to see specifications that limit the cell size to be no smaller than the nominal pulse spacing (NPS) of the entire project (all points). For instance, if the NPS of a dataset was 2 m, our LiDAR derivatives could have a cell size no smaller than 2 m. The reasoning is that when the cell size is smaller than the NPS, there is considerable interpolation of values between known points, which in turn produces a less accurate dataset.
EDIT: NPS defined
A good definition in this document:
Nominal pulse spacing (NPS) refers to the average point spacing of a LiDAR dataset typically acquired in a zig-zag pattern with variable point spacing along-track and cross-track. NPS is an estimate and not an exact calculation; standard procedures are under development by ASPRS for NPS calculations
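Since an exact NPS calculation is not standardized (per the definition above), a common working approximation is to treat the spacing of first returns, one per pulse, as the pulse spacing. A hedged sketch of that approximation, again assuming laspy and a placeholder file name:

```python
import numpy as np
import laspy  # assumption: not named in the answer

las = laspy.read("tile.las")  # placeholder path

# One first return per pulse, so first-return density approximates pulse density.
first = np.asarray(las.return_number) == 1
fx = np.asarray(las.x)[first]
fy = np.asarray(las.y)[first]

area = (fx.max() - fx.min()) * (fy.max() - fy.min())
nps = np.sqrt(area / first.sum())  # NPS ~ 1 / sqrt(pulse density)

print(f"estimated NPS: {nps:.2f} m -> keep the DEM cell size at {nps:.2f} m or larger")
```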
For comparison, I produced two bare-earth DEMs: in one case the cell size was the average point spacing of the whole LiDAR dataset, and in the other it was the average spacing of the ground points only (rounded up: it was 0.668 m, so I entered 1 and obtained a DEM with a resolution of 1 m).
The procedure was done with ArcGIS's LAS Dataset to Raster tool.
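For anyone who wants to script the same comparison, here is a rough arcpy sketch: it bins only the ground-classified returns and exports two rasters at the two cell sizes. The paths are placeholders, the whole-dataset spacing value is not given in my post, and the parameter strings follow the current LAS Dataset To Raster documentation, so adjust for your ArcGIS version:

```python
import arcpy

lasd = r"C:\lidar\project.lasd"  # placeholder LAS dataset

# Keep only ground returns (ASPRS class 2) for the bare-earth surface.
arcpy.management.MakeLasDatasetLayer(lasd, "ground_lyr", class_code=[2])

cell_all_points = 0.5   # placeholder: average spacing of ALL points (value not stated above)
cell_ground_only = 1.0  # ground-only spacing of 0.668 m, rounded up to 1 m

for cell, out in [(cell_all_points, r"C:\lidar\dem_all_spacing.tif"),
                  (cell_ground_only, r"C:\lidar\dem_ground_spacing.tif")]:
    arcpy.conversion.LasDatasetToRaster(
        "ground_lyr", out,
        value_field="ELEVATION",
        interpolation_type="BINNING AVERAGE LINEAR",
        data_type="FLOAT",
        sampling_type="CELLSIZE",
        sampling_value=cell)
```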
I am seeing a clear difference in favor of the second DEM. Focusing on the same portion of the landscape, the detail and the quality of the terrain representation are noticeably better. So I am very happy with the second DEM.