# LaMAR dataset

This directory holds the data of the [LaMAR dataset](https://lamar.ethz.ch). Please refer to the GitHub repository for a description of the data format, the dataset, and sample code to evaluate localization algorithms: [https://github.com/microsoft/lamar-benchmark](https://github.com/microsoft/lamar-benchmark).

The data is provided under the Creative Commons license [Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/). This means that you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. See the LICENSE file for more details.

The data is released in two variants:

- the raw data, as recorded by the original devices (HoloLens, phones, NavVis scanner), is available in `raw/`;
- the data processed for the localization and mapping benchmark is available in `benchmark/`.

## Benchmark data

We release the benchmark data in 3 Capture directories, one for each of the CAB, HGE, and LIN scenes. The format is described in the [specification document](https://github.com/microsoft/lamar-benchmark/blob/main/CAPTURE.md).

The ground-truth poses (`sessions/map/trajectories.txt` and `sessions/query_val_{hololens,phone}/proc/alignment_trajectories.txt`) were obtained using SuperPoint + SuperGlue. For more details, please refer to https://github.com/magicleap/SuperPointPretrainedNetwork and https://github.com/magicleap/SuperGluePretrainedNetwork.

We received the following answer from Magic Leap regarding the use of SuperPoint/SuperGlue for the generation of ground-truth poses:

> We are fine with ETH Zurich publishing the LaMAR dataset, which was created in part with SuperPoint and SuperGlue, under a permissive license (CC BY-SA 4.0).
> No license waiver is needed since LaMAR is a research and benchmarking project.

To download the data to the current local directory, run:

```bash
export URL="https://cvg-data.inf.ethz.ch/lamar/benchmark"
for scene in "CAB" "HGE" "LIN"; do
    # wget saves the archive to the current directory, so unzip it from there.
    wget "$URL/$scene.zip" && unzip "$scene.zip" && rm "$scene.zip"
done
```

## Raw sensor data

We release the raw data in 3 Capture directories, where each session corresponds to a recording by one of the devices. This data includes additional sensor modalities that were not used for the benchmark, such as depth, IMU, or GPS. We also release the 3D laser scans (as point cloud and mesh) obtained by the NavVis scanner and used to estimate the ground-truth poses.

Each scene directory contains files `metadata_{phone,hololens}.json` that indicate, for each recording, its duration (in seconds) as well as whether some sensor modalities are missing.

## Versioning

Current data version: 2.2

Changelog:

- v2.2 - 28/09/2023
  - Release of the raw sensor data
- v2.1 - 16/03/2023
  - Update license information for ground-truth poses
- v2.0 - 23/12/2022
  - Replace incorrect query sequences in `CAB/query_phone`
  - Replace incorrect validation sequences in `{CAB,HGE,LIN}/query_val_{phone,hololens}`
- v1.0 - 15/11/2022
  - Initial release
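After downloading a scene, it can be useful to sanity-check that the extraction succeeded. The sketch below only verifies the presence of the ground-truth trajectory files named in the benchmark section; the stub directory in the demo is purely illustrative, and a real run would pass an actual scene directory such as `CAB` instead.

```shell
#!/bin/sh
# Check that a downloaded scene Capture directory contains the
# ground-truth trajectory files listed in the benchmark section.
check_scene() {
    scene="$1"
    status=0
    for f in "sessions/map/trajectories.txt" \
             "sessions/query_val_hololens/proc/alignment_trajectories.txt" \
             "sessions/query_val_phone/proc/alignment_trajectories.txt"; do
        if [ -f "$scene/$f" ]; then
            echo "ok: $scene/$f"
        else
            echo "missing: $scene/$f"
            status=1
        fi
    done
    return "$status"
}

# Demo on a temporary stub directory standing in for a real scene like CAB.
demo=$(mktemp -d)
mkdir -p "$demo/CAB/sessions/map" \
         "$demo/CAB/sessions/query_val_hololens/proc" \
         "$demo/CAB/sessions/query_val_phone/proc"
touch "$demo/CAB/sessions/map/trajectories.txt" \
      "$demo/CAB/sessions/query_val_hololens/proc/alignment_trajectories.txt" \
      "$demo/CAB/sessions/query_val_phone/proc/alignment_trajectories.txt"
check_scene "$demo/CAB" && echo "scene looks complete"
rm -rf "$demo"
```

The function returns a non-zero status if any file is missing, so it can also be used as a guard in a larger download script.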