An Open Source Photogrammetry Pipeline
In this article, I’m going to show you how to transform your drone photos into a 3D model and put the result on a map.
Introduction
According to Wikipedia:
Photogrammetry is the science and technology of obtaining reliable information about physical objects and the environment through the process of recording, measuring and interpreting photographic images and patterns of electromagnetic radiant imagery and other phenomena.
First of all, I’d like to say that I am not an expert in the field; I’m only a developer trying to do more interesting things with my drone. With that said, creating a 3D model with photogrammetry requires a bunch of photos that overlap each other.
This is going to be our final result:
https://joyful-daffodil-16661f.netlify.app/
And here is the code:
https://github.com/kauly/church-map
The code is pretty simple but a very cool set of libraries and services is necessary to achieve our goal. The steps are as follows:
Flight planning
Every drone comes with a proprietary mobile app and, within it, intelligent flight mode options. I have a FIMI X8S2 and will use its app here as a reference, but the options are pretty much the same for all drones.
First, you need to choose your target. Here, it will be an old church in my neighborhood.
This is a good YouTube video explaining the whole flight-planning and capture process. It’s in Portuguese (my language, by the way), but it’s possible to activate subtitles in any language.
The idea is to take overlapped photos of the target from different angles. You can use two different flight modes to do that: Orbit or Waypoints. Waypoints is the most used one, but I chose Orbit because it’s more straightforward. In Orbit mode, it’s necessary to put the drone at the target’s center and choose a radius. The camera must always be pointed at the target.
With the flight plan configured, go to the camera options and choose “Lapse, 2 seconds.” Another important thing to configure is the drone’s velocity during the flight; I chose 3 m/s. The combination of velocity and lapse interval will give us overlapped photos. There are fancy calculations for this, but these are good defaults.
All right, put your drone in the air, fire the photo button, and keep an eye on the camera angle. It should have a good view of the target. One more thing: the more pictures, the better. I took 241 photos.
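Those “fancy calculations” can be roughed out in a few lines. This sketch estimates the spacing between shots and how many photos one full orbit yields; the 30 m orbit radius is a hypothetical value, and the 3 m/s and 2 s settings are the ones used above:

```python
import math

def orbit_photo_plan(radius_m: float, speed_m_s: float, interval_s: float):
    """Estimate shot spacing and total shots for one full orbit."""
    spacing = speed_m_s * interval_s          # distance travelled between shots
    circumference = 2 * math.pi * radius_m    # length of one full orbit
    shots = math.ceil(circumference / spacing)
    return spacing, shots

# 3 m/s with a 2-second lapse, orbiting at a (hypothetical) 30 m radius
spacing, shots = orbit_photo_plan(30, 3, 2)
print(spacing, shots)  # 6.0 m between shots, 32 shots per orbit
```

Closer shots mean more overlap, so if the spacing comes out large relative to your target, slow the drone down or shorten the lapse interval.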
Image Processing with OpenDroneMap (ODM)
ODM is an amazing open-source tool; it will take your drone photos and create lots of useful outputs.
The only thing necessary to run ODM is Docker. ODM also has good documentation that’s definitely worth reading. I’ll leave the GitHub link here; there, you will find more detailed instructions.
So, put your SD card in the computer and take a good look at the images. Delete the images that are not pointed at the target. Then create two folders for the project, like this:
├── church
│ ├── images
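The layout above can be created like this (the SD-card path in the comment is just an example; use wherever your card mounts):

```shell
# Create the folder layout ODM expects, then copy your drone photos in, e.g.:
mkdir -p church/images
# cp /media/sdcard/DCIM/*.JPG church/images/
```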
Inside the church folder, run the following command:
docker run -ti --rm -v "$(pwd):/datasets/code" opendronemap/odm --project-path /datasets
This command will take a long time to complete; here, it took more than one hour. After it finishes, you should have the files mentioned in the docs above in your project folder (church, in my case). There is a quality report in the odm_report folder with lots of information about the generated data and even some previews.
From the image below, you can see my Orbit flight mode and the positions of the camera shots:
The 3D model is located in the odm_texturing folder. You can render it using software like Blender, but in the next section, we’re gonna see the generated model on CesiumIon.
Upload Data to CesiumIon
An open platform for tiling, hosting, and serving geospatial data as 3D tiles.
We’re gonna use CesiumIon to host and serve the model properly. The model will be served as a 3D Tile. Go ahead and create a Cesium account. It’s free for developer use. The process of uploading the model to CesiumIon is really straightforward and is covered in the following tutorial:
Do not forget to upload all the files in the odm_texturing folder. I had a problem, though: Cesium could not find the model’s location on the globe, so I had to set it manually. Cesium also has a tutorial for this:
Here, I took the coordinates from one of the drone’s photos. You can find the coordinates in the image metadata.
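EXIF stores GPS coordinates as degrees/minutes/seconds plus a hemisphere reference (you can dump the tags with a tool like exiftool), so you may need to convert them to the decimal degrees Cesium expects. A small sketch; the sample values are hypothetical, formatted the way they appear in a drone photo’s GPS tags:

```python
def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert EXIF-style degrees/minutes/seconds to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # South latitudes and west longitudes are negative
    return -value if ref in ("S", "W") else value

# Hypothetical values as they might appear in a photo's GPS EXIF tags
lat = dms_to_decimal(27, 35, 48.8, "S")   # ≈ -27.5969
lon = dms_to_decimal(48, 32, 58.2, "W")   # ≈ -48.5495
```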
Coding
Yes, let’s finally do some coding. We will use various libraries to render this model on a map, but the final code will be very simple. These are the libraries:
- Vite — bootstrap the project
- react-map-gl — render a base map in a React way
- maplibre — used by react-map-gl, it’s a mapbox-gl replacement
- deck.gl — render the 3D model
- loaders.gl — load the 3D model
I’m also using Tailwind, but it’s just a habit. The CSS for this project is very simple. Also, I’m currently using pnpm, but npm or yarn is OK too. Let’s start by creating a project.
To bootstrap a React and TypeScript project:
pnpm create vite your-project-name --template react-ts
Go to the project folder, and install the dependencies:
cd your-project-name
pnpm add @deck.gl/core @deck.gl/layers @deck.gl/react @deck.gl/mesh-layers @deck.gl/geo-layers @deck.gl/mapbox @loaders.gl/3d-tiles react-map-gl maplibre-gl
Now, let’s create a folder to keep our components:
cd src
mkdir components
touch components/Loading.tsx components/ChurchMap.tsx
You can delete the app.css file and all the boilerplate code in App.tsx. After that, go to main.tsx, delete the app.css import, and add the maplibre CSS import. This CSS will make the base map render properly.
import React from "react";
import ReactDOM from "react-dom/client";
import App from "./App.tsx";
import "./index.css";
import "maplibre-gl/dist/maplibre-gl.css";
ReactDOM.createRoot(document.getElementById("root") as HTMLElement).render(
<React.StrictMode>
<App />
</React.StrictMode>
);
To avoid errors later, let’s render only the base map first. I’m using a raster base map here; vector maps are better but paid. react-map-gl requires a Mapbox-style object, so let’s create a file to keep it.
touch src/mapHelpers.tsx
This is a mapbox-style object. We can specify map sources and style them, among other things.
// src/mapHelpers.tsx
import { MapboxStyle } from "react-map-gl";
export const mapStyle: MapboxStyle = {
version: 8,
sources: {
osm: {
type: "raster",
tiles: ["https://a.tile.openstreetmap.org/{z}/{x}/{y}.png"],
tileSize: 256,
attribution: "© OpenStreetMap Contributors",
maxzoom: 19,
},
},
layers: [
{
id: "osm",
type: "raster",
source: "osm",
},
],
};
It’s time to create a map. First, we’re gonna render only the base map.
// src/components/ChurchMap.tsx
import Map, { NavigationControl } from "react-map-gl";
import maplibregl from "maplibre-gl";
import { mapStyle } from "../mapHelpers";
const INITIAL_VIEW_STATE = {
longitude: -48.5495,
latitude: -27.5969,
zoom: 9,
};
export default function ChurchMap() {
return (
<Map
mapLib={maplibregl}
mapStyle={mapStyle}
initialViewState={INITIAL_VIEW_STATE}
style={{ width: "100vw", height: "100vh" }}
>
<NavigationControl />
</Map>
);
}
With a working base map, we can add Deck.gl to it. Deck.gl has good documentation that teaches us how to integrate it with other libraries; in our case, that’s react-map-gl. There is a problem, though: lots of their examples use older versions of react-map-gl. To properly integrate these two libraries, we must use the following example:
Next, grab the asset ID and access token from your CesiumIon account. Create a .env.local file to store your access token.
touch .env.local
# .env.local
VITE_CESIUM=yourAccessToken
Update the ChurchMap.tsx component with Deck.gl and CesiumIon data.
import { Tile3DLayer } from "@deck.gl/geo-layers/typed";
import { CesiumIonLoader } from "@loaders.gl/3d-tiles";
import { MapboxOverlay, MapboxOverlayProps } from "@deck.gl/mapbox/typed";
import Map, { NavigationControl, useControl, MapRef } from "react-map-gl";
import maplibregl from "maplibre-gl";
import { mapStyle } from "../mapHelpers";
import { useRef, useState } from "react";
// CHANGE THE FOLLOWING WITH YOUR CESIUM DATA
const CESIUM_CONFIG = {
assetId: 1691493,
tilesetUrl: "https://assets.ion.cesium.com/1691493/tileset.json",
token: import.meta.env.VITE_CESIUM,
};
const INITIAL_VIEW_STATE = {
longitude: -48.5495,
latitude: -27.5969,
zoom: 9,
};
function DeckGLOverlay(
props: MapboxOverlayProps & {
interleaved?: boolean;
}
) {
const overlay = useControl<MapboxOverlay>(() => new MapboxOverlay(props));
overlay.setProps(props);
return null;
}
export default function ChurchMap() {
const mapRef = useRef<MapRef>(null);
const layer3D = new Tile3DLayer({
id: "layer-3d",
pointSize: 2,
data: CESIUM_CONFIG.tilesetUrl,
loader: CesiumIonLoader,
loadOptions: {
"cesium-ion": {
accessToken: CESIUM_CONFIG.token,
},
},
onTilesetLoad(tile) {
const { cartographicCenter } = tile;
if (cartographicCenter) {
mapRef.current?.flyTo({
center: [cartographicCenter[0], cartographicCenter[1]],
zoom: 19,
bearing: -80,
pitch: 80,
});
}
},
});
return (
<Map
mapLib={maplibregl}
mapStyle={mapStyle}
initialViewState={INITIAL_VIEW_STATE}
style={{ width: "100vw", height: "100vh" }}
ref={mapRef}
>
<DeckGLOverlay layers={[layer3D]} />
<NavigationControl />
</Map>
);
}
The model takes a few seconds to load, so in my repo, I put up a loading indicator.
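The Loading.tsx component created earlier isn’t shown in the article; a minimal sketch could look like this (the Tailwind classes are just one way to style it, and the real component in the repo may differ):

```typescript
// src/components/Loading.tsx — a minimal full-screen overlay sketch
export default function Loading() {
  return (
    <div className="absolute inset-0 z-10 flex items-center justify-center bg-black/50">
      <span className="text-white">Loading model…</span>
    </div>
  );
}
```

In ChurchMap, a `useState(false)` flag flipped inside `onTilesetLoad` can conditionally render it until the tileset arrives.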
That’s it, everyone. Feel free to post suggestions and questions in the comments section.
Open Source Photogrammetry Pipeline was originally published in Better Programming on Medium, where people are continuing the conversation by highlighting and responding to this story.