My colleague TUYISINGIZE Deogratias (“Deo”) and others at the Dian Fossey Gorilla Fund International have been studying golden monkeys (Cercopithecus kandti) in Rwanda. Golden monkeys are an endangered species found along the Albertine Rift (including the Virungas, home to the endangered mountain gorilla). They are also cute as can be, but more on that another time.
Deo has been leading efforts to track the golden monkeys in several locations across their range, observing their habits. Among the data gathered are the locations of the groups of monkeys as they move through their range. One thing we want to understand from these data is how much each group moves per day.
The raw data look something like this:
So we tweak things a bit to get ids in order of date and time, and also prep the data so that the date and time are proper types in PostgreSQL:
DROP TABLE IF EXISTS goldenmonkeys_sorted;
CREATE TABLE goldenmonkeys_sorted AS
WITH nodate AS (
	SELECT gid, geom, id, lat, lon, alt, dater || ' ' || timer AS dater, month AS monther, season, groupid FROM hr_g_all
),
sorted AS (
	SELECT gid, geom, id, lat, lon, alt, dater::TIMESTAMP WITH TIME ZONE AS datetimer, monther, season, groupid FROM nodate
	ORDER BY groupid, datetimer
)
-- Number the rows in group/time order so ids are sequential per the sort
SELECT gid AS id, ROW_NUMBER() OVER (ORDER BY groupid, datetimer) AS gid, datetimer, date(datetimer) AS dater, monther, season, groupid, geom FROM sorted
ORDER BY gid;
Resulting in the following:
Golden Monkey Ranging in Gishwati National Park
Ok. Now we want to turn this into traces of the movements of the monkeys every day. Something like this:
But for every trace, for every day for each group.
We will create a function that leverages WITH RECURSIVE. We’ve seen this before. WITH RECURSIVE allows us to take each record in sequence and perform operations with the previous record, in this case calculating travel time, travel distance, and combining the individual points into a single line with ST_MakeLine.
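A minimal sketch of such a recursive walk is below, assuming the goldenmonkeys_sorted table built above; the group id and date literals are placeholders, and the join on s.id = w.id + 1 assumes the sequential ids are gapless within a group's day:

```sql
-- Hypothetical sketch: walk one group's fixes for one day in id order,
-- accumulating travel time, travel distance, and a growing trace line.
WITH RECURSIVE walk AS (
    (SELECT id, datetimer, geom,
            interval '0' AS ttime,      -- cumulative travel time
            0.0 AS tdist,               -- cumulative travel distance
            geom::geometry AS trace     -- trace starts as a single point
     FROM goldenmonkeys_sorted
     WHERE groupid = 'group1' AND dater = date '2019-01-01'
     ORDER BY datetimer
     LIMIT 1)
    UNION ALL
    SELECT s.id, s.datetimer, s.geom,
           w.ttime + (s.datetimer - w.datetimer),
           w.tdist + ST_Distance(w.geom, s.geom),
           ST_MakeLine(w.trace, s.geom)  -- append the next point to the line
    FROM goldenmonkeys_sorted s
    JOIN walk w ON s.id = w.id + 1
    WHERE s.groupid = 'group1' AND s.dater = date '2019-01-01'
)
SELECT ttime, tdist, trace
FROM walk
ORDER BY id DESC
LIMIT 1;  -- the last row carries the full day's totals and trace
```

The final SELECT keeps only the last row of the walk, which holds the complete daily trace and totals for that group.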
Now to use our function, we need a list of dates and groups so we can calculate this for each day:
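With the recursive query wrapped in a set-returning function (the name gm_trace here is made up for illustration), a distinct list of group/day pairs drives the whole calculation:

```sql
-- Hypothetical: call the per-day trace function for every group/day pair.
SELECT d.groupid, d.dater, t.ttime, t.tdist, t.trace
FROM (SELECT DISTINCT groupid, dater FROM goldenmonkeys_sorted) AS d
CROSS JOIN LATERAL gm_trace(d.groupid, d.dater) AS t;
```

LATERAL lets the function see each row of the date/group list in turn, so one query yields every daily trace.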
Now we have traces not just for one day and group, but all traces and groups:
Often one has points in QGIS in a given coordinate system and wants them in latitude and longitude for various reasons. Solutions I have used in the past include converting to WGS84 (EPSG:4326) and then using the field calculator in QGIS to calculate X and Y values for longitude and latitude respectively, exporting to CSV while reprojecting to 4326, or pulling the data into PostGIS and writing a function.
I got tired of those shenanigans for our field survey around Gishwati National Park in Rwanda this week, and finally wrote something in QGIS functions.
For X, we use the field calculator with either x_min or x_max (it doesn't matter which for points) applied to the transform of the geometry from UTM Zone 35S (EPSG:32735, in our case) to geographic WGS84 (EPSG:4326):
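In QGIS expression syntax that looks something like the following (the first line for longitude, the second with the matching y_min for latitude; swap in your own source CRS for EPSG:32735):

```
x_min(transform($geometry, 'EPSG:32735', 'EPSG:4326'))
y_min(transform($geometry, 'EPSG:32735', 'EPSG:4326'))
```

For point geometries x_min and x_max return the same value, which is why either works.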
It is with sadness I leave the OSGeo Community Sprint. Folks will be continuing to do great work all week, so stay tuned for those updates elsewhere. It’s been a busy year travel wise and it’s time to get back to the family.
This was a bit of a homecoming for me, seeing all sorts of folks I haven’t seen in a while, including some of those founders and core contributors to the backbones of the open geospatial ecosystem (you know who you are…), as well as newer contributors to august projects.
PostGIS and GEOS
My first code sprint was in Boston in 2013, and I recall fondly the welcome, especially from the PostGIS crew. Though I'm a heavy user of PostGIS and expected I would contribute to the project, that still hasn't happened. But I got to sit near the PostGIS crew and listen in on their questions, problems, and directions.
Probably one of the more interesting things in that space lately is Paul Ramsey and Martin Davis' move to Crunchy Data, where they're doing a whole lot of work on PostGIS and GEOS.
Point Data Abstraction Library
PDAL continues to grow, and Howard Butler, Andrew Bell, Connor Manning, Brad Chambers (and others I’m so rudely forgetting at the moment) are in attendance. I had great conversations with all (some I was meeting for the first time), but boy does my mind get blown when talking to Howard. So many great connections, ideas, and hints of paths to follow that support where OpenDroneMap and photogrammetric point clouds can go. Also PDAL is going in interesting directions beyond the point cloud, so watch that space.
Anna Petrasova and Vashek Petras are in attendance. Among the other things they're working on, including PDAL, Python 3, and git, Anna is looking into building an extension to WebODM that will enable processing OpenDroneMap outputs further using GRASS. There are some really exciting possibilities for custom workflows.
Steve Lime and company are here too; hosted in Minneapolis, they are working this week on MapServer. My experience with MapServer was building a functional version of it for Mac OS X in 2004 in advance of a job interview (it only took me 160 hours, and it was the first project I ever built or compiled). It was so cool to see maps rendered on the fly and sent to the browser, and it inspired much of the FOSS4G work I have done since.
Also, Steve is a gracious host, and this is a productive, well-fed, and chill event.
Finally, I would be remiss if I didn't mention the cool work being done this week by David, Jake, and Nathan at Solspec. They've been integrating OpenDroneMap into their workflows and have lots of ideas for improvements, some of which they hope to contribute back in the form of code this week. Watch this space for more.
Ok, the plane is boarding soon. Time to go snuggle the children and help out around the house. Much love to all the above folks and all the rest I haven’t mentioned.
Code/community sprints have a fascinating energy. Below, we can see a bunch of folks laboring away at laptops scattered through the room at OSGeo's 2019 Community Sprint, an exercise in a fascinating dance of introversion and extroversion, of code development and community collaboration.
A portion of the OpenDroneMap team is here for a bit working away at some interesting opportunities. Tonight, I want to highlight an extension to work mentioned earlier on split-merge: distributed split-merge. Distributed split-merge leverages a lot of existing work, as well as some novel and substantial engineering solving the problem of distributing the processing of larger datasets among multiple machines.
Image of the code sprint.
This is, after all, the current promise of Free and Open Source Software: scalability. But while the licenses for FOSS allow for this, a fair amount of engineering goes into making this potential a reality. (HT Piero Toffanin / Masserano Labs) This also requires a new / modified project: ClusterODM, a rename and extension of Masserano Labs' NodeODM-proxy. It requires several new bits of tech to properly distribute, collect, redistribute, then recollect and reassemble all the products.
Piero Toffanin with parallel shells to set up multiple NodeODM instances
“Good evening, Mr. Briggs.”
The mission: To process 12,000 images over Dar es Salaam, Tanzania in 48 hours. Not 17 days. 2 days. To do this, we need 11 large machines (a primary node with 32 cores and 64GB RAM and 10 secondary nodes with 16 cores and 32GB RAM), and a way to distribute the tasks, align the tasks, and put everything back together. Something like this:
… just with a few more nodes.
Piero Toffanin and India Johnson working on testing ClusterODM
This is the second dataset to be tested on distributed split-merge, and the largest to be processed in OpenDroneMap to a fully merged state. Honestly, we don’t know what will happen: will the pieces process successfully and successfully stitch back together into a seamless dataset? Time will tell.
For the record, the parallel shells were merely for NodeODM machine setup.
While the last decade has been dominated by the growing hegemony of the global base map, mapping will swing now for a while towards the principle of mapping the world, one organic pixel at a time. 2014 is the beginning of artisanal satellite mapping, where we discover the value in 1-inch pixels from personally and professionally flown unmanned aerial systems (drones). There is, as with all things military-industrial, a dark side to drones. But as with all of these technologies, we will be discovering the great democratizing power of the artisanal, as applied to ‘satellite’ views.
So many FOSS options… .
After making the 2014 prediction post and then deciding to start OpenDroneMap, I had a doubt: what if someone comes along and creates something better? What if something better already exists? What if there is no point to the work? And then I remembered all the above and relaxed. Also, if someone comes along and creates something better, in the informal parlance: yay for the world!
The reality when I started the project was that there was an existing project that was FOSS and was photogrammetry for drones and other small cameras: MICMAC. It's exquisite: great quality, and fully FOSS under the CeCILL-B license (I like to think of it as a French version of the GPL, but IANAFL [I Am Not A French Lawyer] and am not completely sure that's correct). At the time, though, it was difficult to use for non-French speakers.
MICMAC has evolved a lot since then, with better docs and community posts in both French and English, though using it can still be a challenge. Free and Open Source is hard. It can be hard for users, and it can be hard for maintainers. So it is such a relief when FOSS becomes ever so much easier.
So, it is with some excitement that I turn your attention to NodeMICMAC. NodeMICMAC is a fork of NodeODM, and thus provides web API access to MICMAC, in the same way that NodeODM does for OpenDroneMap’s command line ODM application.
NodeMICMAC makes using MICMAC really easy, and should slot into the OpenDroneMap ecosystem pretty seamlessly, thanks to JP and the folks at DroneMapper.
When we spoke with JP, I was curious about his motives: this is such a cool move that changes the industry. Why? The answer: for the same reasons that we work on OpenDroneMap, to grow this really cool open photogrammetric ecosystem.
(Sidenote: check out DroneMapper’s geoBits: ArUco Ground Control Point (GCP) Targets and Detection for Aerial Imagery — more on that later — so cool!).
But, you may ask, how does it stack up? Frighteningly well. If you have been paying attention to the improvements in OpenDroneMap the last couple of years, you may have noted orders of magnitude improvements in every step in processing. Even with these, MICMAC shows us how a mature photogrammetry project should display its wares.
Point cloud comparisons
ODM vs. MICMAC’s point cloud over a house:
ODM vs. MICMAC’s point cloud over a drainage:
Orthophoto over truck, fence, road, vegetation:
Orthophoto over a duplex house:
MICMAC is, as its reputation indicates, a bit of a grail. It gives us some very nice results, and now simply at the expense of spinning up a Docker instance. Watch this space over the next few weeks: Masserano Labs will be working with DroneMapper on integrating it into WebODM and quickly graduating it to first-class citizenship in the OpenDroneMap ecosystem.
So, will NodeMICMAC replace NodeODM and all the work in ODM? Not so fast! There’s still space for what we’ve built. Remember my intro above? Of course you do. With upcoming capacity to handle massive datasets, NodeODM, PyODM, ODM, and other projects will still get our love. But as they say, if you can’t beat them, have them join you. Right? I think that’s the phrase… . Perhaps MICMAC isn’t joining OpenDroneMap, but we will be happy to fork their code and contribute back where we can, and thanks to JP and DroneMapper for making this possible!
Most of our blog posts on OpenDroneMap are meant for interested users. Every now and then we have a gem for our contributors / power users who like to dive into the code a bit and enhance things. For any of you who have done this, or have wanted to do this in the past…
For anyone using OpenDroneMap to process really large datasets, some good news came through early last year with improvements to how OpenSfM handles large datasets. This came in the form of an innovative, first-of-its-kind hybrid SfM method which combines the better attributes of global and incremental SfM approaches. TLDR: This helps make processing…
Hello from Musanze, Rwanda, a short 10 km from one of two remaining populations of Mountain Gorillas where I’ve spent the last couple of weeks with Dian Fossey Gorilla Fund International and students from University of Rwanda helping with important research in support of gorillas and biodiversity in the Virunga Massif.
But, this post is not about these things (more on these later) but a call to all those in the Free and Open Source Community to come and join the OSGeo Community Sprint in Minnesota this May!
What is a community sprint, you may ask? Well, it's like a code sprint in that there will be code geeks working away at improving the software of various OSGeo and OSGeo-caucusing projects, but also folks working on the equally important work of documentation, translation, tutorials, and other parts of the OSGeo ecosystem. More code is like more cowbell, so I'm calling on the coders, but also the other folks in our community, to join.
One of the problems we worked on in Kerala wasn't a drone problem at all, but an infrastructure problem. Given a distribution of locales, how can we cluster those according to distance to help reduce duplicated infrastructure services (internet, electricity, etc.)? There are lots of ways to solve this, but we chose to do it in PostGIS because… ok: because that's what I always do, whether I should or not.
I worked on this with Deepthi Patric, the Geomatics expert for the group. For the record, the points below are not the points we actually analyzed, but a randomized distribution within occupied areas of Kerala.
Points across Kerala for clustering
View over Trivandrum, Kerala
In this case, PostGIS was a good choice of tool, at least once we started using it intelligently. At first we tried something less intelligent and more blunt wherein we buffered and created convex hulls with left joins, etc.:
This was… well, not the best method. So, back to the drawing board. Let's use something a little smarter. Since PostGIS 2.2, we've had ST_ClusterWithin. This promisingly named function is just what we need. It does return a geometry collection, so we need to manipulate the results a bit, but the query is pretty reasonable in the end:
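The query ends up looking something like the sketch below; the table name is a placeholder, and 1000 is the clustering distance in the layer's map units (meters here):

```sql
-- Hypothetical sketch: ST_ClusterWithin aggregates all input points into an
-- array of GeometryCollections, one per cluster of points within 1000 m of
-- each other; unnest the array and number the clusters.
SELECT ROW_NUMBER() OVER () AS cluster_id,
       unnest(ST_ClusterWithin(geom, 1000)) AS geom
FROM kerala_points;
```

Each resulting row is one cluster as a GeometryCollection; wrapping the unnested geometry in ST_Centroid gives a single representative point per cluster if that's what the infrastructure planning needs.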
Thus changing individual points:
Inset of unclustered points.
to smart clusters based on proximity:
Inset of points clustered into points within 1km of each other