}
---

# AeroGrid100

**AeroGrid100** is a large-scale, structured aerial dataset collected via UAV to support 3D neural scene reconstruction tasks such as **NeRF**. It consists of **17,100 high-resolution images** with accurate 6-DoF camera poses, captured over a **10×10 geospatial grid** at **5 altitude levels** with **multi-angle views** at each grid point.

## 🔗 Access

To access the full dataset, [**open the Google Drive folder**](https://drive.google.com/drive/folders/1cUUjdoMNSig2Jw_yRBeELuTF6T8c9e_b?usp=drive_link).
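
Once the folder has been downloaded (via the Drive web interface or a command-line helper such as `gdown`), a quick sanity check along the following lines can confirm that the image count and resolution match the figures in the overview below. The local directory name `AeroGrid100/` is an assumption; adjust it to wherever the folder was saved.

```python
from pathlib import Path

from PIL import Image  # pip install pillow

# Assumed local path of the downloaded Drive folder; adjust as needed.
DATASET_ROOT = Path("AeroGrid100")

# Collect every JPEG, whatever the subfolder structure looks like.
images = sorted(p for p in DATASET_ROOT.rglob("*") if p.suffix.lower() in {".jpg", ".jpeg"})
print(f"Found {len(images)} images (expected 17,100)")

# Spot-check the first image against the documented 4032 x 2268 resolution.
if images:
    with Image.open(images[0]) as im:
        print(f"First image size: {im.size}")
```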

## 🌍 Dataset Overview

- **Platform:** DJI Air 3 drone with wide-angle lens
- **Region:** Urban site in Claremont, California (~0.209 km²)
- **Image Resolution:** 4032 × 2268 (JPEG, 24 mm-equivalent wide-angle lens)
- **Total Images:** 17,100
- **Grid Layout:** 10 × 10 spatial points
- **Altitudes:** 20m, 40m, 60m, 80m, 100m
- **Viewpoints per Altitude:** Up to 8 yaw × 5 pitch combinations
- **Pose Metadata:** Provided in JSON (extrinsics, GPS, IMU); see the loading sketch below
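
The exact schema of the pose files is not reproduced here. Because the poses are distributed in a NeRF-compatible OpenGL format, the sketch below assumes an instant-ngp-style `transforms.json` layout with `frames`, `file_path`, and `transform_matrix` fields; treat the file name and field names as assumptions to adapt to the actual metadata.

```python
import json
from pathlib import Path

import numpy as np

# Assumed file name and schema (instant-ngp style); adjust to the actual metadata files.
META_PATH = Path("AeroGrid100/transforms.json")

with META_PATH.open() as f:
    meta = json.load(f)

# Each frame is assumed to carry an image path and a 4x4 camera-to-world matrix
# in the OpenGL convention (x right, y up, camera looking down -z).
for frame in meta["frames"][:3]:
    c2w = np.array(frame["transform_matrix"], dtype=np.float64)  # (4, 4)
    position = c2w[:3, 3]    # camera centre in world coordinates
    view_dir = -c2w[:3, 2]   # -z column gives the viewing direction
    print(frame["file_path"], position, view_dir)
```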

## 📦 What’s Included

- High-resolution aerial images
- Per-image pose metadata in NeRF-compatible OpenGL format (see the conversion sketch after this list)
- Full drone flight log
- Scene map and sampling diagrams
- Example reconstruction using NeRF
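
COLMAP-style and OpenCV-style toolchains expect cameras that look down +z with y pointing down, whereas the OpenGL poses shipped here have y up and the camera looking down -z. A minimal sketch of the standard axis flip for a camera-to-world matrix, assuming poses loaded as in the snippet above:

```python
import numpy as np

def opengl_to_opencv_c2w(c2w_gl: np.ndarray) -> np.ndarray:
    """Convert a 4x4 camera-to-world pose from the OpenGL to the OpenCV convention.

    OpenGL cameras use x right / y up / look down -z; OpenCV cameras use
    x right / y down / look down +z, so the y and z camera axes are flipped.
    """
    flip = np.diag([1.0, -1.0, -1.0, 1.0])
    return c2w_gl @ flip

# Example: an identity OpenGL pose becomes a y/z-flipped pose in OpenCV terms.
print(opengl_to_opencv_c2w(np.eye(4)))
```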

## 🎯 Key Features

- ✅ Dense and structured spatial-angular coverage
- ✅ Real-world variability (lighting, pedestrians, cars, vegetation)
- ✅ Precise pose annotations from onboard GNSS + IMU
- ✅ Designed for photorealistic NeRF reconstruction and benchmarking
- ✅ Supports pose estimation, object detection, keypoint detection, and novel view synthesis

## 📊 Use Cases

- Neural Radiance Fields (NeRF)
- View synthesis and novel view generation (see the ray-generation sketch below)
- Pose estimation and camera localization
- Multi-view geometry and reconstruction benchmarks
- UAV scene understanding in complex environments
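
As a starting point for view-synthesis experiments, per-pixel camera rays can be generated from the published resolution and a pose. The focal length below is derived under the assumption that the 24 mm figure is a full-frame-equivalent focal length (horizontal FOV ≈ 2·atan(36/48) ≈ 73.7°); treat these intrinsics as illustrative rather than calibrated.

```python
import numpy as np

def generate_rays(c2w_gl: np.ndarray, width: int, height: int, fov_x_rad: float):
    """Generate ray origins/directions in the world frame from an OpenGL c2w pose."""
    focal = 0.5 * width / np.tan(0.5 * fov_x_rad)  # focal length in pixels

    # Pixel grid centred on the (assumed) principal point at the image centre.
    i, j = np.meshgrid(np.arange(width), np.arange(height), indexing="xy")
    dirs_cam = np.stack(
        [
            (i - 0.5 * width) / focal,           # x: right
            -(j - 0.5 * height) / focal,         # y: up (OpenGL)
            -np.ones_like(i, dtype=np.float64),  # z: camera looks down -z
        ],
        axis=-1,
    )

    rays_d = dirs_cam @ c2w_gl[:3, :3].T         # rotate into the world frame
    rays_o = np.broadcast_to(c2w_gl[:3, 3], rays_d.shape)
    return rays_o, rays_d

# Assumed horizontal FOV for a 24 mm full-frame-equivalent lens.
fov_x = 2.0 * np.arctan(36.0 / (2.0 * 24.0))
# Downsampled grid for a quick check; use 4032 x 2268 for full resolution.
rays_o, rays_d = generate_rays(np.eye(4), width=4032 // 8, height=2268 // 8, fov_x_rad=fov_x)
print(rays_d.shape)
```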

## 📌 Citation

If you use AeroGrid100 in your research, please cite:

```bibtex
@inproceedings{zeng2025aerogrid100,
  title     = {AeroGrid100: A Real-World Multi-Pose Aerial Dataset for Implicit Neural Scene Reconstruction},
  author    = {Zeng, Qingyang and Mohanty, Adyasha},
  booktitle = {RSS Workshop on Leveraging Implicit Methods in Aerial Autonomy},
  year      = {2025},
  url       = {https://im4rob.github.io/attend/papers/7_AeroGrid100_A_Real_World_Mul.pdf}
}
```