Abstract
The 360° camera is a compact omnidirectional perception system that captures panoramic images with a field of view matching that of LiDAR, making it well suited to autonomous driving and robotics. However, most existing datasets of 360° panoramic images focus primarily on indoor or virtual environments, or offer only low-resolution outdoor imagery and limited LiDAR configurations. In this letter, we present PAIR360, a multi-modal dataset comprising high-resolution 360° camera images and 3D LiDAR scans, aimed at stimulating research in computer vision. To this end, we collected a comprehensive dataset at Kyung Hee University Global Campus, capturing 52 sequences from 7 different areas under diverse atmospheric conditions, including sunny, cloudy, and sunrise. The dataset features 8K-resolution panoramic imagery, six fisheye images, point clouds, GPS, and IMU data, all synchronized to LiDAR timestamps and calibrated across the visual sensors. We also provide derived data, such as depth maps, segmentation, and 3D maps, to demonstrate the feasibility of our dataset and its applicability to various computer vision tasks.
| Original language | English |
|---|---|
| Pages (from-to) | 9550-9557 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 9 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - 2024 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
Keywords
- data sets for SLAM
- data sets for robotic vision
- omnidirectional vision
- sensor fusion