XGRIDS Pro Guide™ / Module 5: Project Scale

5.5 Data Management at Scale

How to organize, store, share, and work with large point cloud datasets after the fusion job is complete.

The Scale Problem

A single-floor commercial scan might produce 15 to 20 GB of processed point cloud data. A 200-minute fusion project across multiple floors of a large facility can produce 200 GB or more of raw data and 50 to 100 GB of processed output, depending on density settings and export format. At that scale, data management is no longer incidental; it is a project phase that requires the same deliberate planning as collection and processing.

The challenges that emerge at this scale:

  • File size and transfer time.
  • Storage capacity, on the processing machine and in client deliverable systems.
  • Software performance when loading very large datasets.
  • Sharing data with clients and collaborators who may not have the hardware to open a 100 GB point cloud directly.

The tools and approaches below address each of these challenges. None of them require additional XGRIDS software; they work on the standard output formats that LixelStudio already exports.

Folder Structure and File Organization

Large projects require an explicit folder structure established before collection begins and maintained through delivery. Inconsistent organization across a 200 GB project is not just untidy; it causes files to get lost, processing runs to reference the wrong segments, and deliverable preparation to take far longer than it should.

Recommended Project Folder Structure

Create a top-level project folder named with the project identifier and date. Inside it, maintain four subdirectories: Raw Data (untouched files transferred directly from the scanner), Processing (LixelStudio project files and intermediate outputs), Exports (final processed outputs in delivery formats), and Documentation (field notes, control point name lists, deviation logs, client correspondence).
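The layout above can be scaffolded with a short script so every project starts from the same structure. This is a minimal sketch; the function name and the `ProjectID_Date` folder naming are illustrative, so adapt them to your own convention.

```python
from pathlib import Path

def create_project_folders(root: str, project_id: str, date: str) -> Path:
    """Create the recommended four-subdirectory project layout."""
    project = Path(root) / f"{project_id}_{date}"
    for sub in ("Raw Data", "Processing", "Exports", "Documentation"):
        (project / sub).mkdir(parents=True, exist_ok=True)
    return project

# Example: creates ProjectX_20260219/ containing the four subdirectories.
# project = create_project_folders(".", "ProjectX", "20260219")
```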

Never process directly from the Raw Data folder. Always work from the Processing directory. If processing fails or produces a bad result, the Raw Data folder gives you a clean starting point without having to retransfer from the scanner.

File Naming at Scale

Apply your segment naming convention (BuildingName_Floor1_SegmentA) consistently from raw files through final exports. When you deliver a file named 200-minute-fused-FINAL2.las to a client, there is no way to trace which collection session it came from, which processing run produced it, or what version it represents. When you deliver BldgA_20260219_Fused.e57, the project, date, and content are traceable at a glance, and the name stays within the E57 filename limit discussed below.
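Naming discipline is easy to automate. The sketch below validates filenames against the BuildingName_Floor1_SegmentA convention; the regular expression is an assumption modeled on that single example, so adjust it to match your actual pattern.

```python
import re

# Hypothetical pattern for BuildingName_Floor1_SegmentA style names;
# adapt the regex to your own convention.
SEGMENT_NAME = re.compile(r"^[A-Za-z0-9]+_Floor\d+_Segment[A-Z]$")

def check_segment_name(filename: str) -> bool:
    """Return True if the stem of `filename` follows the convention."""
    stem = filename.rsplit(".", 1)[0]
    return bool(SEGMENT_NAME.match(stem))
```

Run a check like this over the Exports directory before delivery so a stray FINAL2-style name never reaches the client.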

Autodesk ReCap Pro

ReCap Pro is the Autodesk application designed specifically for managing and preparing large point cloud datasets for use in BIM workflows. It handles the file size and performance challenges of large scans more effectively than opening raw LAS or E57 files directly in Revit.

The RCP Project File Structure

ReCap Pro works with two file types. RCS is the individual scan region file containing actual point cloud data. RCP is a lightweight project file that references one or more RCS files without embedding the raw data. This structure means the RCP file you share with a Revit user may be only a few kilobytes, while the actual data remains in RCS files on a shared drive. The RCP file tells Revit where to find the data; it does not contain it.

When you import a LAS, LAZ, or E57 export from LixelStudio into ReCap, it creates an RCS file. That RCS file can then be added to an RCP project. Multiple RCS files from multiple scan sessions can be organized under one RCP project, giving Revit users a single import that represents the entire facility.

Decimation on Import

ReCap Pro applies a decimation grid on import that reduces point density. The default maximum is 100 mm spacing, meaning only one point is retained per 100 mm voxel. For architectural documentation where you need to identify 50 mm features, this matters. Adjust the import settings before converting if fine detail preservation is required for your deliverable.
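A decimation grid of this kind can be illustrated in a few lines: points are binned into voxels of a given spacing and one representative point is kept per voxel. This is a conceptual sketch only, not ReCap's actual implementation.

```python
def decimate(points, spacing=0.1):
    """Keep one point per (spacing x spacing x spacing) voxel.

    `points` is an iterable of (x, y, z) tuples in metres;
    spacing=0.1 corresponds to a 100 mm grid."""
    voxels = {}
    for x, y, z in points:
        key = (int(x // spacing), int(y // spacing), int(z // spacing))
        voxels.setdefault(key, (x, y, z))  # first point in the voxel wins
    return list(voxels.values())

# Two points 30 mm apart fall in the same 100 mm voxel and collapse to
# one, which is why 50 mm features can vanish at the default spacing.
```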

Working with Large Datasets in ReCap

  • Use region-based editing to clip the dataset to areas of interest before delivering to downstream users. A 200 GB full-facility scan delivered to a contractor needing only the mechanical room is not useful; clip it first.
  • Intelligent decimation in ReCap reduces file size for datasets where full density is not required for the deliverable. Apply it after verifying accuracy at key measurement points.
  • ReCap supports unified export: combining multiple RCS files into a single output while preserving coordinate system alignment. Use this when a client needs one file rather than a project folder.
  • Performance degrades with very large projects; plan on 32 to 64 GB of RAM for projects above 90 GB of total point cloud data.

E57 Export from LixelStudio

When exporting E57 from LixelStudio for use in ReCap, use pose_no_offset.csv for data that has absolute coordinates. Use pose.csv for data without absolute coordinates. This selection is easy to overlook, and choosing the wrong pose file produces a dataset with incorrect spatial registration that will not align with survey data in Revit.

E57 filenames must be 20 characters or shorter. LixelStudio's E57 export fails silently on longer filenames: the export completes, but the resulting file is corrupt or missing. Apply this limit before exporting; it applies to the filename only, not the full path.
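A pre-export check like the one below catches the problem before a silent failure does. Whether the extension counts toward the 20-character limit is not stated above, so this sketch conservatively includes it; loosen the check if you confirm otherwise.

```python
from pathlib import Path

E57_NAME_LIMIT = 20  # characters, filename only, per the limit above

def safe_e57_name(path: str) -> bool:
    """True if the filename (without its directory) is within the limit.

    Conservatively counts the .e57 extension toward the limit, since
    the exact rule is undocumented here."""
    return len(Path(path).name) <= E57_NAME_LIMIT
```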

CloudCompare

CloudCompare is a free, open-source point cloud viewer and editor with capabilities that are useful for quality review and lightweight delivery preparation, without requiring an Autodesk license. It is not a replacement for ReCap in BIM workflows, but it handles tasks that do not require a full BIM pipeline effectively.

Supported Input Formats

E57, LAS, LAZ, PLY, PTS, XYZ, OBJ, and several additional formats. All standard LixelStudio export formats are compatible; no conversion is needed.

Core Capabilities

Clipping and cropping, noise filtering, point cloud measurement, cross-section generation, color visualization, density analysis, and format conversion. Suitable for QC review and delivering cleaned subsets.

Practical Uses on Fusion Projects

  • Load the fused output and inspect segment boundaries before client delivery. Misalignment that is subtle in LixelStudio's viewer is often more visible when navigating in CloudCompare.
  • Generate horizontal cross-sections at multiple floor elevations to verify geometry consistency and check for drift artifacts that produce slanted floors or walls.
  • Clip the dataset to a specific area of interest and export that subset as a smaller LAS file for a client who does not need the full facility coverage.
  • Measure known dimensions against the point cloud to spot-check accuracy before formal delivery.
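Clipping to an area of interest can also be scripted outside any GUI. The sketch below filters an ASCII XYZ export (one `x y z` triple per line, a format CloudCompare opens directly) to an XY bounding box; the paths and bounds are hypothetical, and for binary LAS/LAZ you would use a point cloud library instead.

```python
def clip_xyz(in_path, out_path, xmin, xmax, ymin, ymax):
    """Copy only the points inside the XY bounding box to out_path.

    Streams line by line, so it handles files larger than RAM.
    Returns the number of points kept."""
    kept = 0
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            parts = line.split()
            if len(parts) < 3:
                continue  # skip blank or malformed lines
            x, y = float(parts[0]), float(parts[1])
            if xmin <= x <= xmax and ymin <= y <= ymax:
                dst.write(line)
                kept += 1
    return kept
```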

CloudCompare is particularly useful when a client or collaborator needs to review point cloud data but does not have Autodesk software or a LixelStudio license. The software is free to install, and a cropped LAZ export is a practical self-contained deliverable that most AEC professionals can open without additional tools.

Sharing and Delivery

How you deliver point cloud data should match the client's workflow and software environment, not just the format LixelStudio produced. Delivering a 100 GB LAS file to a client who needs to view it on a laptop is not a useful delivery, regardless of how good the data quality is.

Autodesk Ecosystem Clients

Deliver RCP/RCS for clients working in Revit, AutoCAD, Civil 3D, or Navisworks. The RCP project file structure allows large datasets to load efficiently in these applications. Provide the complete RCS file set and the RCP project file together; the RCP file is useless without the RCS files it references.

Non-Autodesk or Mixed Environments

Deliver LAZ (compressed LAS) as the general-purpose format. It opens in CloudCompare, ArcGIS, Leica Cyclone, Bentley, and most other AEC platforms without conversion. LAZ compression is lossless; the file size reduction relative to uncompressed LAS is typically 80 to 90%.

Large File Transfer

For files above 5 GB, avoid email and standard cloud services with upload size limits. Use a physically delivered external drive, a dedicated large-file transfer service, or a client-provided FTP or SFTP server. Include a file manifest: a plain-text list of every file in the delivery, its size, and its purpose, so the client can verify the transfer is complete.
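A manifest of this kind can be generated automatically; adding a checksum per file lets the client verify integrity as well as completeness. The tab-separated layout and MANIFEST.txt name here are assumptions, not a required format.

```python
import hashlib
from pathlib import Path

def write_manifest(delivery_dir: str, manifest_name: str = "MANIFEST.txt") -> Path:
    """List every file in the delivery with its size and SHA-256 checksum."""
    root = Path(delivery_dir)
    lines = []
    for f in sorted(root.rglob("*")):
        if f.is_file() and f.name != manifest_name:
            digest = hashlib.sha256(f.read_bytes()).hexdigest()
            size_mb = f.stat().st_size / 1_000_000
            lines.append(f"{f.relative_to(root)}\t{size_mb:.1f} MB\t{digest}")
    out = root / manifest_name
    out.write_text("\n".join(lines) + "\n")
    return out
```

The client reruns the same hashing on their side; any mismatch means a corrupted or incomplete transfer.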

Archiving Raw Data

Raw scan data from the scanner (the unprocessed HBC files and project data folders) should be archived separately from processed outputs and kept for a minimum period after project delivery. The raw data is the only source from which you can reprocess if a client requests a different output format, a higher accuracy reprocess, or if a quality dispute arises.

The practical storage requirement for a full 200-minute L2 Pro project is 200 to 270 GB of raw data. Cold storage on external drives or a NAS is appropriate. Do not archive raw data only to the processing machine's working drive; that drive needs to be cleared for the next project, and if it fails, the archive goes with it.

Keep a written record of what was archived, when, and where. A drive labeled "2026 scans" that contains six unidentified project folders is not an archive; it is a problem waiting to be discovered when a client calls three years later.
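The written record can be as simple as an append-only log. The sketch below writes one JSON line per archived project; the field names and the JSON-lines format are assumptions, so any consistent record keeping (even a spreadsheet) serves the same purpose.

```python
import json
from datetime import date

def record_archive(log_path, project_id, drive_label, raw_size_gb, notes=""):
    """Append one archive entry to a JSON-lines log file."""
    entry = {
        "project": project_id,
        "archived_on": date.today().isoformat(),
        "location": drive_label,       # e.g. a physical drive label
        "raw_size_gb": raw_size_gb,
        "notes": notes,
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```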

Next: diagnosing and resolving fusion failures.


©2026 Alpine Reality Capture LLC  •  XGRIDS Pro Guide™