They seem like such a great idea until you have to deal with all that data.
You have your new iPad or iPhone with the built-in lidar scanner. Cool. You have technology in your pocket that would have cost tens of thousands of dollars just a few years ago.
Are you planning on using it for architecture? Maybe creating some as-builts? I hope so. Point clouds, here we come. Depending on what you plan to do with your scan, you might choose an app that either creates a point cloud or one that builds the mesh from the start.
The advantage of using something like Polycam, Forge, or even Scaniverse is that you get a usable mesh right out of the scan. The mesh exports with textures and to scale. But there is a drawback.
If you are scanning multiple rooms, or rooms with small items that need to be captured, you will be kicking yourself for not using an app that can register multiple scans together automatically. That would be something like SiteScape.
SiteScape lets you scan a room to a point cloud, then, before closing the app, scan the next room, and so on. These multiple scans can be brought into CloudCompare and registered automatically. The registered point clouds can then be brought into ReCap and exported as an E57 file for AutoCAD. That process is pretty easy and fluid, and the point cloud is quite manageable inside AutoCAD's point cloud system.
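If you do this often, CloudCompare also has a command-line mode that can script the register-and-merge step. The snippet below only builds the command rather than running it; the file names are placeholders, and the exact flags (-ICP, -MERGE_CLOUDS, -C_EXPORT_FMT) can vary by CloudCompare build, so treat this as a sketch, not gospel:

```python
import subprocess

# Hypothetical scan files; CloudCompare must be on your PATH to actually run this.
scans = ["room1.ply", "room2.ply"]

cmd = ["CloudCompare", "-SILENT"]
for s in scans:
    cmd += ["-O", s]             # open each scan
cmd += [
    "-ICP",                      # fine-register the clouds (pairwise)
    "-MERGE_CLOUDS",             # join them into one cloud
    "-C_EXPORT_FMT", "E57",      # export format ReCap/AutoCAD can read
    "-SAVE_CLOUDS",
]

print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run it
```

Note that -ICP works on a pair of clouds; with more than two rooms you would chain registrations or rely on SiteScape's in-app alignment first.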
If you are thinking that you will then take the data and get a realistic mesh out of it… that is where it all seems to fall apart. How do we get it into something usable for Blender? This is where a convoluted system gets even worse.
First step: Bring all your scans into CloudCompare. Join them into one file. Not that big of a deal.
Second step: Export the merged cloud as a .PLY file and bring it into Meshlab. It will most likely look all wacky until you compute the normals.
Third step: Compute the normals of the point cloud in Meshlab. This takes some time if you are working with a larger scan.
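Under the hood, "computing normals" means fitting a small plane to each point's neighborhood and taking the direction perpendicular to it. Here is a minimal NumPy sketch of that idea, using brute-force neighbor search, so it is only for tiny clouds; Meshlab does the same thing far faster:

```python
import numpy as np

def estimate_normals(points, k=16):
    """Estimate a normal per point via PCA on its k nearest neighbors."""
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        # Brute-force k-nearest neighbors (fine for a sketch, too slow for real scans).
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k]]
        # The normal is the eigenvector of the neighborhood covariance
        # with the smallest eigenvalue (the direction of least spread).
        cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
        w, v = np.linalg.eigh(cov)       # eigenvalues in ascending order
        normals[i] = v[:, 0]             # sign is ambiguous without a viewpoint
    return normals
```

The sign ambiguity in that last comment is why Meshlab asks about viewpoint/orientation options when you run its normals filter.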
Fourth step: Simplify the point cloud. Trying to mesh a point cloud that is super dense will lock up your machine for… let’s say for-EVER.
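One common way to simplify is voxel downsampling: snap points to a coarse grid and keep one averaged point per cell. A stdlib-only sketch (the 0.05 cell size is an arbitrary assumption; tune it to your scan's units):

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.05):
    """Keep one averaged point per voxel-sized grid cell."""
    cells = defaultdict(list)
    for p in points:
        # Map each point to the index of the grid cell it falls in.
        key = tuple(int(c // voxel) for c in p)
        cells[key].append(p)
    # Average the points in each occupied cell.
    return [tuple(sum(axis) / len(cell) for axis in zip(*cell))
            for cell in cells.values()]
```

Meshlab's point cloud simplification filters work on the same principle of trading density for tractability; halving the cell size roughly multiplies the surviving point count.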
Fifth step: Reconstruct the surface of the point cloud into a mesh.
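Steps three through five can also be scripted with pymeshlab, MeshLab's Python API, which at least makes the grind repeatable. The filter names below are from recent pymeshlab releases and have changed between versions (older releases used names like surface_reconstruction_screened_poisson), so check the filter list against your install; the file names and parameter values are placeholders:

```python
def pointcloud_to_mesh(in_ply, out_ply):
    """Normals -> simplify -> screened Poisson, mirroring the Meshlab steps."""
    import pymeshlab  # pip install pymeshlab

    ms = pymeshlab.MeshSet()
    ms.load_new_mesh(in_ply)
    # Step 3: estimate per-point normals from 16 neighbors.
    ms.compute_normal_for_point_clouds(k=16)
    # Step 4: thin the cloud so Poisson doesn't lock up the machine.
    ms.generate_simplified_point_cloud(samplenum=200_000)
    # Step 5: screened Poisson surface reconstruction.
    ms.generate_surface_reconstruction_screened_poisson(depth=10)
    ms.save_current_mesh(out_ply)
```

Poisson gives you geometry, not textures; the color that survives is per-vertex at best, which is a big part of why the results disappoint next to a Polycam-style mesh.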
Sixth step: Pray it looks decent. Nope.
If you are an expert with this process, I would love to hear how you get great textured-mesh results from a point cloud. Otherwise I am going to keep using point clouds as point clouds and meshes as meshes. You have to make a choice from the beginning based on your use case.