Hey everyone,
I’m a complete beginner with RealityCapture and photogrammetry in general. I’m trying to create a point cloud of my house using photos from both my iPhone 14 Pro and DJI Mini 3 Pro, but I’m having trouble getting them to align properly.
Here’s what I’ve done so far:
• Captured a mix of ground-level photos with my iPhone and aerial shots with the DJI Mini 3 Pro.
• Imported all the photos into RealityCapture, but they don’t seem to align correctly.
Some potential issues I’ve noticed:
• The overlap between the two datasets might not be great.
• There could be differences in resolution or metadata (GPS info from the drone vs. none from the iPhone) — see the sketch after this list.
• I haven’t used control points yet because I’m not sure how to do that effectively.
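To sanity-check the metadata point, here’s a minimal sketch (assuming Python with Pillow installed; the file paths are placeholders, not my real folder names) that reports which photos actually carry GPS EXIF data — the drone shots should have it, and the iPhone ones may or may not:

```python
from PIL import Image

GPS_IFD_TAG = 0x8825  # standard EXIF tag id for the GPSInfo IFD

def has_gps(path: str) -> bool:
    """Return True if the image file carries GPS EXIF metadata."""
    with Image.open(path) as img:
        return GPS_IFD_TAG in img.getexif()

# Hypothetical paths -- substitute your own drone / phone photos.
# (HEIC files from the iPhone may need the pillow-heif plugin to open.)
for photo in ["drone/DJI_0001.JPG", "iphone/IMG_0001.JPG"]:
    print(photo, "GPS data:", has_gps(photo))
```

If the two sets really do differ in GPS metadata, that might explain why RealityCapture treats them differently during alignment.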
Does anyone have advice on how to get these two datasets to align properly? Are there specific steps I should follow or best practices to keep in mind when combining drone and smartphone photos?
Any help would be greatly appreciated! Thanks in advance.
submitted by /u/Scared-Class-9734