Abstract
Lars Arge
Improvements in mapping technologies have resulted in a major increase in both the amount and the quality of terrain data being acquired. However, as in many other big data applications, this has revealed scalability problems in existing terrain processing software. Often these problems arise because the underlying algorithms do not take the hierarchical structure of memory systems into account, especially the large difference in access time between main memory and disk. In this presentation, we will discuss these problems further and describe some of our work on overcoming them in the context of flood risk analysis using massive, detailed terrain data. More precisely, we will discuss so-called I/O-efficient algorithms, that is, algorithms that try to minimize the movement of data between internal memory and disk, for a number of flood risk analysis problems. We will also discuss how this work has led to a successful start-up company.
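The cost measure behind I/O-efficient algorithms can be made concrete with a minimal sketch (not from the talk; the function names and parameters below are illustrative assumptions). In the standard external-memory model, data moves between disk and memory in blocks of B elements, and cost is counted in block transfers rather than CPU operations, so a sequential scan of N elements costs roughly N/B I/Os while N random accesses can cost N I/Os:

```python
def scan_io_cost(n, block_size):
    """I/Os for a sequential scan of n elements: ceil(n / B) block transfers."""
    return -(-n // block_size)  # ceiling division

def random_access_io_cost(n):
    """Worst-case I/Os for n random accesses: each access may touch a new block."""
    return n

# Example: one million elements, 4096-element blocks (illustrative sizes).
N, B = 1_000_000, 4096
print(scan_io_cost(N, B))        # -> 245
print(random_access_io_cost(N))  # -> 1000000
```

The gap of a factor of B (thousands, for realistic block sizes) is why restructuring a terrain computation around scans and sorts, rather than pointer-chasing random accesses, can turn an infeasible computation into a practical one.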