Workshop on Decision Theory and Network Science

There are plenty of problems arising in networks that lie at the interface between optimisation and statistics. To bring these two communities closer together, Simon Lunagomez and I are organising a workshop on this topic on 18 September in Lancaster.

Thanks to funding through STOR-i, participation is free of charge and we have four excellent invited speakers: Michal Valko (Inria, France), Sergio Bacallado (University of Cambridge), Wolfram Wiesemann (Imperial College), and Matthias Müller-Hannemann (University of Halle-Wittenberg). We also invite everyone interested to present a poster on research aligning with the workshop description.

I will post some more updates here; meanwhile do send me a message if you want to know more!

ATMOS 2017, update

So the deadline came and went (luckily, it did get extended first), the programme committee discussed and evaluated the submissions, and the results are already here – time flies!

The quality of submissions was outstanding this year, so there will be a first: instead of the usual 12 papers, this year’s proceedings will consist of 18. That also means the workshop itself will be longer, to accommodate the additional presentation slots – an exciting development for the workshop format.

I was lucky to have both my submissions accepted. The first one I already discussed here; it compares the quality of uncertainty sets for robust shortest path problems in a real-world network. The second one presents a new method to find periodic timetables in railway planning, which turned out to be surprisingly effective, beating all the best known solutions on the PESP library page by quite some margin. As usual, the proceedings will be published open access and can be found here once they are available.
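For readers unfamiliar with the problem class: periodic timetabling is usually modelled as the Periodic Event Scheduling Problem (PESP), where each event is assigned a time modulo a common period, and each pair of connected events must respect a periodic time window. A minimal sketch of checking a candidate timetable against such constraints (the tiny instance below is my own illustration, not from the paper):

```python
T = 60  # period in minutes (assumed for this toy example)

# PESP constraints: for each arc (i, j) with bounds [l, u], the periodic
# tension (pi_j - pi_i - l) mod T must lie in [0, u - l].
constraints = [
    ("A", "B", 5, 10),
    ("B", "C", 3, 7),
    ("C", "A", 40, 55),
]

def feasible(pi):
    # pi maps each event to its time within one period
    for i, j, l, u in constraints:
        if (pi[j] - pi[i] - l) % T > u - l:
            return False
    return True

print(feasible({"A": 0, "B": 7, "C": 12}))  # True: all three windows hold
```

The modulo in the constraint check is what makes the problem periodic – and what makes PESP instances so hard to solve in practice.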

Sadly, this year’s ALGO in Vienna will be held in parallel with the OR conference in Berlin, which means that I won’t meet those of you travelling to Berlin – but there is always a next time.

ATMOS 2017

The same procedure as last year: The ATMOS deadline is approaching fast! It’s this Saturday, 24th June. Let’s hope there will be a small extension again.


The opportunity to travel and visit collaborators is surely one of the nicer aspects of our work in academia. And few would say no to an invitation to give a seminar talk in Montpellier. I certainly didn’t.

The University of Montpellier, founded in 1160, is one of the oldest in the world and is now home to some excellent research, in particular in robust optimisation. We are currently working on the concept of K-adaptability, where the decision maker selects a set of solutions with fixed cardinality, and once the scenario is revealed, he or she can choose one of the solutions from the set. These problems are more general than classic min-max approaches, but more restrictive than fully adjustable robustness.
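To make the K-adaptability idea concrete, here is a brute-force sketch on a toy instance (the costs are invented for illustration, not from our work): the decision maker commits to K solutions, the adversary then picks the worst scenario, and the decision maker responds with the best solution from the committed set.

```python
from itertools import combinations

# cost of each candidate solution under each of three scenarios (made up)
costs = {
    "x1": [4, 9, 2],
    "x2": [6, 5, 7],
    "x3": [3, 8, 8],
    "x4": [7, 4, 6],
}
scenarios = range(3)

def k_adaptable_value(K):
    # min over K-subsets of solutions, max over scenarios,
    # min over the solutions kept in the subset
    best = float("inf")
    for subset in combinations(costs, K):
        worst = max(min(costs[x][s] for x in subset) for s in scenarios)
        best = min(best, worst)
    return best

print(k_adaptable_value(1))  # 7 -- K = 1 recovers the classic min-max value
print(k_adaptable_value(2))  # 4 -- keeping two solutions pays off
```

Even this tiny example shows why the concept is attractive: allowing the decision maker to defer the final choice between just two solutions already improves the guaranteed worst case.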

It’s unfortunate when a call deadline falls at the same time – but our application for a Newton Fund Institutional Link grant with Indonesian partners for disaster management has now been submitted.

Robust Timetabling and Stacking Problems

Last week I went to the “Mini-Workshop on Integrated Timetabling” in Göttingen, though I wonder whether “mini” is the appropriate name for a three-day event. We enjoyed excellent talks on various combinations of timetabling with other problems in public transport. My own presentation was on “A New Robust Local Search Method and its Application to Uncertain Timetabling”, with the abstract below:

Railway timetabling problems are challenging to solve, even if all problem parameters are known exactly. If travel times are uncertain, as is the case in practice, finding good solutions is even harder.

I introduce a new local search technique to find robust solutions under implementation errors. In this setting, it is not the problem parameters that are considered uncertain; rather, the actual implementation of a solution is subject to an error margin. I first discuss an existing general approach to this problem by Bertsimas, Nohadani and Teo. I then introduce a new approach that is able to overcome local optima, and discuss its application to the train timetabling problem. While this is still work in progress, some first experimental results are presented.
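As a rough illustration of the implementation-error setting (a sketch of my own with an invented objective, not the method from the talk): the robust value of a point is the worst objective value over its error neighbourhood, and the search descends on this robust value rather than on the nominal one.

```python
import math

GAMMA = 0.3  # radius of the implementation error (assumed)

def f(x):
    # toy nonconvex nominal objective, chosen for illustration only
    return math.sin(3 * x) + 0.5 * x * x

def robust_value(x, n=41):
    # worst case of f over [x - GAMMA, x + GAMMA], evaluated on a fixed grid
    return max(f(x + GAMMA * (2 * i / (n - 1) - 1)) for i in range(n))

def robust_local_search(x, step=0.05, iters=200):
    # greedy descent on the robust objective; a plain descent like this
    # can still get stuck in local optima of the robust landscape
    for _ in range(iters):
        x = min((x - step, x, x + step), key=robust_value)
    return x

x_robust = robust_local_search(1.0)
```

A narrow valley of f may contain the nominal optimum, yet have a poor robust value because small implementation errors climb its steep walls; the robust search prefers flatter regions instead.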

I then went on to Osnabrück and gave a seminar talk on robust selection problems.

Uncertainty Sets for Robust Shortest Path Problems

A lot has been written about solving robust problems when one knows exactly what the set of possible parameter realisations looks like. With the variable-sized robustness approach we have considered an alternative setting, in which the size of the uncertainty set is not really known – though we still assume its shape is given.

In a recent paper with the title “An Experimental Comparison of Uncertainty Sets for Robust Shortest Path Problems”, we go back another step and evaluate which uncertainty sets actually make sense for shortest path problems. We use real-world traffic data to generate uncertainty sets and compare the performance of the resulting solutions. It turns out that the hardness of the robust problem is not a good indicator of the usefulness of the resulting paths. See for yourself!

The abstract:

Through the development of efficient algorithms, data structures and preprocessing techniques, real-world shortest path problems in street networks are now very fast to solve. But in reality, the exact travel times along each arc in the network may not be known. This led to the development of robust shortest path problems, where all possible arc travel times are contained in a so-called uncertainty set of possible outcomes.
Research in robust shortest path problems typically assumes this set to be given, and provides complexity results as well as algorithms depending on its shape. However, what can actually be observed in real-world problems are only discrete raw data points. The shape of the uncertainty is already a modelling assumption. In this paper we test several of the most widely used assumptions on the uncertainty set using real-world traffic measurements provided by the City of Chicago. We calculate the resulting different robust solutions, and evaluate which uncertainty approach is actually reasonable for our data. This anchors theoretical research in a real-world application and allows us to point out which robust models should be the future focus of algorithmic development.
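As a small illustration of the modelling step the abstract describes (with invented observations, not the Chicago data): two common box-shaped uncertainty sets built from raw travel-time samples, and the worst-case cost of a path under them.

```python
import statistics

# hypothetical travel-time observations (seconds) for two arcs of a path
observations = {
    "a1": [10, 12, 11, 15, 13],
    "a2": [20, 21, 19, 25, 22],
}

def interval_set(samples):
    # interval uncertainty: each travel time ranges over [min, max]
    return (min(samples), max(samples))

def sigma_box_set(samples, k=1.0):
    # "mean +/- k sigma" intervals, a common alternative shape
    m = statistics.mean(samples)
    s = statistics.stdev(samples)
    return (m - k * s, m + k * s)

def robust_path_cost(path, sets):
    # for box-shaped sets the worst case decomposes per arc:
    # simply sum the upper bounds
    return sum(sets[a][1] for a in path)

sets = {a: interval_set(obs) for a, obs in observations.items()}
print(robust_path_cost(["a1", "a2"], sets))  # 15 + 25 = 40
```

The choice between such shapes is exactly the modelling assumption the paper puts to the test: sets that look similar on paper can lead to very different robust paths on real data.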