osmgraphing 0.11.1

Playing around with graphs created via parsing OpenStreetMap data

Welcome to the osmgraphing-repo! :) The goal of this repo is to parse openstreetmap-data in order to compute traffic-routes and related use-cases on it. The repo deals with analyzing selfish routing and with learning metrics for balancing load in street-networks. All calculations should run efficiently on a single desktop instead of an expensive cluster.

Setup and usage

cargo is Rust's build-tool and can be used to run everything except the scripts in scripts/. cargo run prints a help-message, which, among other things, tells you to use cargo run --example; running that command prints the names of the runnable examples. Refer to these examples for more details, or to the cargo-docs for details about the repo's setup and implementation.

Downloaded osm-data comes as xml (osm) or binary (pbf) files, where nodes carry their location as latitude and longitude. One problem is the size-limit when downloading from openstreetmap, but there are other providers of osm-data, like geofabrik for instance.

For testing, a simple text-based format called fmi is used. Since fmi-files are created manually for specific tasks, parsing them is, generally speaking, unstable. However, this repository provides a generator, which can create such fmi-files from pbf-files or from other fmi-files (e.g. for a different metric-order). The binary mapgenerator (binaries are in target/release after a release-build) helps with generating proper config-files, but have a look at resources/configs/blueprint for further explanations. A tool for creating fmi-map-files, which contain graphs contracted via contraction-hierarchies, is multi-ch-constructor.

Requirements for large maps

In general, the requirements depend on the size of the parsed map and on your machine. The following numbers are based on an 8-core-CPU and the pbf-map of Germany, running on archlinux. Further, they assume that you don't use more than 4 metrics (besides ignore and ids), because up to 4 metrics are inlined with SmallVec (a sketch of what this inlining means follows at the end of this section). You should change the number of inlined metrics according to your needs in the module defaults, because the memory is allocated during the build-phase anyway.

  • Parsing Germany (~50 million nodes, ~103 million edges, pbf-file) needs around 10 GB of RAM. (Using only one metric, inlined, the memory-peak is slightly over 8 GB.) After parsing, the memory-needs are lower due to the optimized graph-structure.
  • Preprocessing Germany (including parsing) needs around 4 minutes. This depends heavily on the number of cores.
  • A routing query on Germany over a distance of 620 km takes around 16 seconds with a bidirectional Dijkstra (see the generic sketch below this list). This could be improved by removing intermediate nodes (like b in a->b->c), but they are kept for now. An Astar is not used anymore, because its only benefit is reducing the search-space, which can be reduced much more with Contraction Hierarchies. Further, Astar has issues with multiple or custom metrics, because every metric needs its own admissible heuristic.

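To illustrate what such a routing query computes, here is a generic, unidirectional Dijkstra on a toy adjacency-list. This is only a sketch: the graph layout, the function dijkstra and the toy weights are made up here and are not osmgraphing's implementation, which uses a bidirectional variant on its own graph-structure.

```rust
// Generic Dijkstra sketch on a toy adjacency-list (not osmgraphing's code).
use std::cmp::Reverse;
use std::collections::BinaryHeap;

/// `adj[v]` lists `(neighbor, weight)` pairs of the outgoing edges of node `v`.
fn dijkstra(adj: &[Vec<(usize, u64)>], src: usize, dst: usize) -> Option<u64> {
    let mut dist = vec![u64::MAX; adj.len()];
    let mut queue = BinaryHeap::new();
    dist[src] = 0;
    queue.push(Reverse((0u64, src)));

    while let Some(Reverse((d, v))) = queue.pop() {
        if v == dst {
            return Some(d); // destination settled -> shortest distance found
        }
        if d > dist[v] {
            continue; // outdated queue-entry
        }
        for &(neighbor, weight) in &adj[v] {
            let new_dist = d + weight;
            if new_dist < dist[neighbor] {
                dist[neighbor] = new_dist;
                queue.push(Reverse((new_dist, neighbor)));
            }
        }
    }
    None // destination is unreachable
}

fn main() {
    // Toy graph: 0 -> 1 -> 2 (cost 1 + 1) and a direct, more expensive edge 0 -> 2 (cost 5).
    let adj = vec![
        vec![(1, 1), (2, 5)], // outgoing edges of node 0
        vec![(2, 1)],         // outgoing edges of node 1
        vec![],               // outgoing edges of node 2
    ];
    assert_eq!(dijkstra(&adj, 0, 2), Some(2));
}
```

A bidirectional variant explores from source and target at the same time and combines the two search-frontiers, which shrinks the explored search-space considerably.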
Small maps like Isle of Man run on every machine and are parsed in less than a second.
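
To make the inlined-metrics note from above concrete, here is a minimal sketch of the effect of SmallVec's inline capacity: up to capacity-many elements live on the stack, and one element more forces a heap-allocation. It assumes the smallvec crate as a dependency; the constant CAP and the values are made up for this sketch and do not mirror the actual defaults module.

```rust
// Minimal SmallVec sketch (hypothetical values, not osmgraphing's defaults).
use smallvec::{smallvec, SmallVec};

const CAP: usize = 4; // hypothetical stand-in for the inline capacity in `defaults`

fn main() {
    // Four metrics fit into the inline buffer, so no heap-allocation happens.
    let mut metrics: SmallVec<[f64; CAP]> = smallvec![1.0, 2.0, 3.0, 4.0];
    assert!(!metrics.spilled());

    // A fifth metric exceeds the inline capacity and spills to the heap.
    metrics.push(5.0);
    assert!(metrics.spilled());

    println!("inline capacity: {}, len: {}", metrics.inline_size(), metrics.len());
}
```

This is why the number of inlined metrics should match your use-case: the inline buffer is always reserved in full, whether you use all of its slots or not.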

Credits

The project started in mid-2019 as a student project. This page honors the contributors and helpers of this project, sorted by their last names.

Florian Barth
has been the project's supervisor since the beginning and always helps immediately with his experience and advice.

Dominic Parga Cacheiro
was part of the project's first weeks, when project-planning and learning Rust were in scope. He continues the work and is writing and improving the simulation.

Jena Satkunarajan
was part of the project's first weeks, when project-planning and learning Rust were in scope. He implemented the first (and working) approach of the A*-algorithm.