
Research Datasets

Access our published datasets for urban modeling and spatial analysis research.

Dataset

YUTO Semantic

By Gunho Sohn, Sunghwan (Jacob) Yoo

YUTO Semantic is a multi-mission, large-scale aerial LiDAR dataset designed for 3D point cloud semantic segmentation. The dataset comprises approximately 738 million points covering 9.46 square kilometres of the York University campus in Toronto, Ontario, Canada. Each point is annotated with one of nine semantic classes.
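For orientation, here is a minimal sketch of loading one tile of such a dataset, assuming the points are distributed as LAS/LAZ files with the semantic label stored in the standard classification field; the file name is hypothetical, so check the actual release format and label encoding before relying on this:

```python
# Minimal sketch: load one labeled aerial LiDAR tile.
# Assumes a LAS/LAZ file with per-point labels in the classification
# field -- both the file name and the label field are assumptions.
import laspy
import numpy as np

las = laspy.read("yuto_semantic_tile_000.laz")  # hypothetical file name

# (N, 3) point coordinates and (N,) integer class labels
points = np.column_stack([np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)])
labels = np.asarray(las.classification, dtype=np.int64)

print(f"{points.shape[0]:,} points, classes present: {np.unique(labels)}")
```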

Dataset

YUTO MMS

By YeonJeong Jeong, Gunho Sohn

YUTO MMS: A Comprehensive SLAM Dataset for Urban Mobile Mapping with Tilted LiDAR and Panoramic Camera Integration

The York University Teledyne Optech (YUTO) Mobile Mapping System (MMS) dataset comprises four extensive sequences totalling 20.1 kilometres, assembled through two data collection campaigns on June 21, 2019, and August 12, 2020. Data were acquired with a purpose-equipped vehicle carrying a panoramic camera, a tilted LiDAR, a Global Positioning System (GPS) receiver, and an Inertial Measurement Unit (IMU), driven through two locations: the York University Keele Campus in Toronto and the Teledyne Optech headquarters in the City of Vaughan, Canada. The dataset serves as a robust benchmark for prevailing Simultaneous Localization and Mapping (SLAM) systems.

For more details on the YUTO MMS dataset, please refer to our paper.

<center> <a href="https://github.com/ausmlab/yutomms/tree/main/images/maverick_route.jpg"> <img src="images/maverick_route.jpg" height="170"> </a> </center>

Paper

Zhang Y, Ahmadi S, Kang J, Arjmandi Z, Sohn G. YUTO MMS: A comprehensive SLAM dataset for urban mobile mapping with tilted LiDAR and panoramic camera integration. The International Journal of Robotics Research. 2024;0(0). doi:10.1177/02783649241261079

@article{doi:10.1177/02783649241261079,
  author  = {Yujia Zhang and SeyedMostafa Ahmadi and Jungwon Kang and Zahra Arjmandi and Gunho Sohn},
  title   = {YUTO MMS: A comprehensive SLAM dataset for urban mobile mapping with tilted LiDAR and panoramic camera integration},
  journal = {The International Journal of Robotics Research},
  volume  = {0},
  number  = {0},
  pages   = {02783649241261079},
  year    = {2024},
  doi     = {10.1177/02783649241261079},
  url     = {https://doi.org/10.1177/02783649241261079}
}

Dataset Description

The directory structure of our YUTO MMS dataset is shown in the following figure.

<center> <a href="https://github.com/ausmlab/yutomms/tree/main/images/YUTO-Dataset-directory-structure.JPG"> <img src="images/YUTO-Dataset-directory-structure.JPG" height="400"> </a> </center>

YUTO MMS dataset general information

| Sequence | Number of image files | Number of LiDAR scans | Number of GPS+IMU records | Total directory volume |
|----------|-----------------------|-----------------------|---------------------------|------------------------|
| A        | 700                   | 1,432                 | 11,845                    | 3.8 GB                 |
| B        | 8,382                 | 17,395                | 143,637                   | 45.6 GB                |
| C        | 10,778                | 22,992                | 189,875                   | 59.2 GB                |
| D        | 4,500                 | 9,615                 | 79,506                    | 25.2 GB                |
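After downloading, a short script along these lines can re-derive the counts above as a sanity check; the root path and subfolder names here are hypothetical placeholders to be replaced with the actual names shown in the directory-structure figure:

```python
# Hedged sketch: tally per-sequence file counts and sizes.
# Directory names ("YUTO_MMS", "images", "lidar") are hypothetical;
# substitute the actual layout from the directory-structure figure.
from pathlib import Path

root = Path("YUTO_MMS")  # hypothetical dataset root
for seq in ["A", "B", "C", "D"]:
    seq_dir = root / seq
    n_images = sum(1 for _ in (seq_dir / "images").glob("*"))
    n_scans = sum(1 for _ in (seq_dir / "lidar").glob("*"))
    size_gb = sum(f.stat().st_size for f in seq_dir.rglob("*") if f.is_file()) / 1e9
    print(f"{seq}: {n_images} images, {n_scans} scans, {size_gb:.1f} GB")
```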

Dataset Evaluation

| Sequence | ORB-SLAM2 | VINS    | RPV-SLAM | HDPV-SLAM | LOAM | Cartographer | PVL-Cartographer |
|----------|-----------|---------|----------|-----------|------|--------------|------------------|
| A        | 5.894     | 3.997   | 1.618    | 1.4       | Fail | 4.023        | 0.766            |
| B        | 100.870   | 86.897  | 12.910   | 9.58      | Fail | 152.230      | 2.599            |
| C        | 155.908   | 160.765 | 30.661   | 11.93     | Fail | 183.619      | 3.739            |
| D        | 10.665    | 12.875  | 5.673    | 4.69      | Fail | 58.576       | 2.204            |
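The table reports per-sequence trajectory error (lower is better; see the paper for the exact metric and units). For readers reproducing such a comparison, a common choice is absolute trajectory error (ATE): the RMSE of position error after rigidly aligning the estimated trajectory to ground truth. A minimal sketch of that generic computation, assuming time-synchronized (N, 3) position arrays, is below; it is not necessarily the exact protocol used for YUTO MMS:

```python
# Generic ATE: rigid (Kabsch/Umeyama, scale = 1) alignment of the
# estimated trajectory to ground truth, then position RMSE.
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """est, gt: (N, 3) time-synchronized positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Rotation from the cross-covariance of the centered trajectories
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    aligned = (R @ (est - mu_e).T).T + mu_g
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```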

Our papers using the YUTO MMS dataset

M. Ahmadi, A. A. Naeini, M. M. Sheikholeslami, Z. Arjmandi, Y. Zhang and G. Sohn, "HDPV-SLAM: Hybrid Depth-Augmented Panoramic Visual SLAM for Mobile Mapping System with Tilted LiDAR and Panoramic Visual Camera," 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE), Auckland, New Zealand, 2023, pp. 1-8, doi: 10.1109/CASE56687.2023.10260361.

Y. Zhang, J. Kang and G. Sohn, "PVL-Cartographer: Panoramic Vision-Aided LiDAR Cartographer-Based SLAM for Maverick Mobile Mapping System," Remote Sensing, vol. 15, no. 13, p. 3383, 2023, doi: 10.3390/rs15133383.

J. Kang, Y. Zhang, Z. Liu, A. Sit and G. Sohn, "RPV-SLAM: Range-augmented Panoramic Visual SLAM for Mobile Mapping System with Panoramic Camera and Tilted LiDAR," 2021 20th International Conference on Advanced Robotics (ICAR), Ljubljana, Slovenia, 2021, pp. 1066-1072, doi: 10.1109/ICAR53236.2021.9659458.

Dataset

Q-Drone Benchmark

By Gunho Sohn

Precise positioning of an Unmanned Aerial Vehicle (UAV) is critical for many sophisticated civil and military applications in challenging environments. Many state-of-the-art positioning methods rely on active ranging sensors. Among the available ranging sensors, Ultra-wideband (UWB) offers high precision, power efficiency, and robustness to multipath propagation and noise. UWB has therefore been attracting considerable interest from the research community as a complementary positioning sensor. However, there is a significant lack of UWB benchmark data to support developing, testing, and generalizing positioning methods based on UWB sensors. Here we present a unique benchmark dataset of UWB and IMU signals acquired by a Q-Drone system in diverse environments: indoors, in an open field, close to buildings, underneath a bridge, and in a semi-open tunnel. The benchmark also provides ground-truth UAV positions independently measured with robotic total stations.
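To give a flavour of the positioning methods this benchmark is meant to support, the sketch below recovers a position from UWB range measurements to fixed anchors by nonlinear least squares. The anchor layout, noise level, and variable names are illustrative assumptions, not taken from the Q-Drone setup:

```python
# Hedged sketch: UWB multilateration via nonlinear least squares.
# Anchor positions and range noise are illustrative, not from Q-Drone.
import numpy as np
from scipy.optimize import least_squares

# Hypothetical fixed UWB anchor positions (metres)
anchors = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 3]], dtype=float)

def residuals(p, ranges):
    # Difference between predicted anchor distances and measured ranges
    return np.linalg.norm(anchors - p, axis=1) - ranges

# Simulate noisy ranges from a known position, then solve for it
rng = np.random.default_rng(0)
true_pos = np.array([4.0, 6.0, 1.5])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 0.05, len(anchors))

sol = least_squares(residuals, x0=np.zeros(3), args=(ranges,))
print("estimated position:", sol.x)  # should be close to true_pos
```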