TopoDOT User Conference Keynote

Geospatial visionary is developing a world-class transportation asset management system for Budapest, Hungary

Recently I had the opportunity to interview Gyula Sz. Fekete, Department Head of GIS Application Development and Data Capture for the Road Management Company of Budapest, Hungary. Gyula holds a degree in survey engineering. He is currently leading the development of one of the most advanced 3D transportation asset management systems in the world, which employs TopoDOT to support the critical 3D feature extraction workflow.

Gyula has agreed to be the keynote speaker for Certainty 3D’s upcoming TopoDOT User Conference, being held in Orlando, FL, May 9 – 13. The following is a preview of some of the best practices and lessons learned on this exciting project.

How did you get involved with 3D laser scanning and mobile lidar?

Being trained in remote sensing, I have known the technology since the beginning of my career in 2000, although I had mostly been working on image-sensor-based projects.

When I joined Budapest Közút (formerly BKK Közút) in 2012, my first task was to explore the market, find solutions, and then design and budget a fast, productive, precise, and efficient data production solution for road-related asset management activities. The area of interest (AOI) was the urban road network of Budapest, with 4,700 km of road length.


Mobile laser scanning (MLS) was just the perfect technology for such needs, in combination with smart MLS data capture solutions like TopoDOT, so I looked around the market. Those 2–3 months of research, testing, and conference visits helped me decide which were the best (not necessarily the cheapest) products to recommend. I also had the chance to test the flagship MLS products on the market and compare hardware, producers, and sample data sets.

What are the key benefits of using mobile lidar? 

Productivity, level of detail for feature extraction and 3D analysis, speed of large-area data acquisition, and (with the proper software packages and workflow) uniform 3D information products across a large and dense road network.

What are some of the key challenges and how have you managed them? 

There were many challenges with this project… 

MLS Data Production: 

1) MLS measuring technique (data acquisition) in a large city environment (4,700 km of road and street network):

Traffic and traffic rules are far from ideal for MLS scanning: GPS loss in narrow streets with high buildings, IMU uncertainty in traffic jams, and few proper places for dynamic alignment.


Proper design of daily MLS scanning, a special driving technique, and fluent on-board system control are essential. Proper daily documentation of the measured paths is also important for QC.

2) Network MLS scanning path design

Instead of “simple” linear road/highway MLS scanning, we had to scan paths in a busy city where roads and streets are connected to each other and we cover the same area many times.


Careful scanning preparation and a special approach to path design are needed to acquire the city street network. A good path design helps with post-processing the data and also makes on-site navigation of the predesigned scanning path easier.


3) Transfer of the mass of daily acquired data and QC of the data transfer

We collected 200–800 GB of raw data on every possible scanning day for 2 years. This data was copied to a centralized server, minimizing any raw data loss that could occur from a failure of MLS post-processing.


Strict data transfer workflows with daily transfers, plus scripts for an automated data check between the contents of the MLS control unit’s SSD and the data transferred to the server.
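The automated check between the scanner’s SSD and the server could work along these lines. This is a minimal illustrative sketch, not RODIS’s actual script: it assumes the two storage locations are mounted as ordinary directories and compares files by presence, size, and checksum.

```python
import hashlib
from pathlib import Path

def md5sum(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so multi-GB scan files never sit in memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_transfer(ssd_root: Path, server_root: Path) -> list[str]:
    """Return a report line for every file that is missing or differs on the server."""
    problems = []
    for src in ssd_root.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(ssd_root)
        dst = server_root / rel
        if not dst.exists():
            problems.append(f"MISSING {rel}")
        elif src.stat().st_size != dst.stat().st_size:
            problems.append(f"SIZE {rel}")          # cheap check first
        elif md5sum(src) != md5sum(dst):
            problems.append(f"CHECKSUM {rel}")      # expensive check last
    return problems
```

Only after `verify_transfer` returns an empty list would the SSD be cleared for the next scanning day.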

4) File management of complex network scanning

We have over 600 scanning paths to manage, with over 20,000 connections (crossings) between paths. It is hard to track which path is in which production stage (path design, data acquisition, under processing, under QC, finished, etc.).


– A WebGIS-based point cloud production system where each path is visualised and production metadata are stored

– A strict naming convention on the file server

– File-server-based data storage with automated file naming
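Automated file naming of this kind usually encodes the production metadata directly in the filename. The scheme below is purely hypothetical (the project name, field widths, and stage labels are my assumptions, not the project’s real convention), but it shows the idea of names that sort and filter cleanly on a file server.

```python
from datetime import date

# Hypothetical stage labels; the real workflow's stages may differ.
STAGES = {"design", "acquired", "processing", "qc", "final"}

def scan_file_name(path_id: int, scan_date: date, stage: str, ext: str = "las") -> str:
    """Build a sortable name: <project>_P<pathID>_<scanDate>_<stage>.<ext>."""
    if stage not in STAGES:
        raise ValueError(f"unknown production stage: {stage}")
    return f"RODIS_P{path_id:04d}_{scan_date:%Y%m%d}_{stage}.{ext}"

# scan_file_name(37, date(2014, 6, 2), "qc") -> "RODIS_P0037_20140602_qc.las"
```

Because the path ID and date are zero-padded and fixed-width, a plain alphabetical directory listing doubles as a production overview.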

5) Visualise and store in GeoDatabase the key information (metadata) for each MLS path

How do we extract from the raw data the exact scanned AOI in order to track the coverage of the scanned AOI? How do we add query capability?


– Scripts that extract the necessary information from MLS log files and automatically generate line and polygon features, which are stored in the geodatabase
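Turning a trajectory log into a geodatabase line feature can be sketched as follows. The log format here is an assumption for illustration (one comma-separated `GPS,<time>,<easting>,<northing>` record per line); real MLS vendors each have their own log layout, and the output is WKT simply because most GIS databases can ingest it directly.

```python
def trajectory_to_wkt(log_lines):
    """Parse assumed 'GPS,<time>,<easting>,<northing>' records into a WKT LINESTRING."""
    coords = []
    for line in log_lines:
        parts = line.strip().split(",")
        if parts[0] == "GPS" and len(parts) == 4:
            # Keep only the horizontal position; time is not needed for coverage.
            coords.append(f"{float(parts[2])} {float(parts[3])}")
    return "LINESTRING (" + ", ".join(coords) + ")"
```

In practice the line would then be buffered by the scanner’s effective range to produce the coverage polygon that answers “which streets are already scanned?” queries.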

6) Keep absolute accuracy as uniform as possible for the entire city in our local coordinate system


Custom scripts that compare connected paths, an additional solution to align them, well-referenced TLS scanning, and auto-generated planes from the TLS point cloud as reference objects for the MLS data.
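One simple way to compare connected paths at a crossing is to rasterize the lowest points of each cloud onto a small ground grid and take the median height difference in the shared cells. This is only a sketch of the idea, not the project’s actual alignment script; the 0.5 m cell size and the `(easting, northing, z)` tuple representation are assumptions.

```python
from statistics import median

CELL = 0.5  # matching grid cell size in meters (assumed)

def grid_min_z(points):
    """Lowest z per ground cell: a crude proxy for the road surface."""
    cells = {}
    for e, n, z in points:
        key = (int(e // CELL), int(n // CELL))
        if key not in cells or z < cells[key]:
            cells[key] = z
    return cells

def vertical_offset(path_a, path_b):
    """Median height difference in cells covered by both paths.

    A robust statistic (median) is used so cars, pedestrians, and other
    non-ground points in a few cells do not skew the estimate.
    """
    a, b = grid_min_z(path_a), grid_min_z(path_b)
    shared = [a[k] - b[k] for k in a.keys() & b.keys()]
    return median(shared) if shared else 0.0
```

Repeating this at all 20,000+ crossings gives a network of relative offsets that can then be adjusted against the TLS-derived reference planes.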


7) Classify and tile the large point clouds – often 40–120 GB – to be usable in feature extraction tools like TopoDOT


Custom tiling and a unique naming convention for each tile, organised in a well-ordered file structure.
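A regular grid makes tile names derivable from coordinates alone, so any tool can find the right file without a lookup table. The sketch below is illustrative only; the 50 m tile size and the `T_<col>_<row>` naming pattern are assumptions, not the project’s real convention.

```python
from collections import defaultdict

TILE = 50.0  # tile edge length in meters (assumed)

def tile_name(easting: float, northing: float) -> str:
    """Name of the grid tile containing a point; derivable from coordinates alone."""
    col = int(easting // TILE)
    row = int(northing // TILE)
    return f"T_{col:06d}_{row:06d}"

def tile_points(points):
    """Bin (easting, northing, z) points into per-tile buckets for writing out."""
    tiles = defaultdict(list)
    for e, n, z in points:
        tiles[tile_name(e, n)].append((e, n, z))
    return tiles
```

With this scheme, a feature extraction session over a given street only needs to open the handful of tiles whose names fall inside the street’s bounding box.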

8) Build the connections between different software tools

Many specialised tools are used in RODIS (data acquisition, MLS post-processing, GDB management software, GIS server, feature extraction software, CAD software).

The goal was to find the best solution for each step and tie them together. It was quite a complex issue.


Data interoperability and QA/QC scripts at each data transfer.

Feature Extraction 

1) Big size data management during feature extraction 

Navigating a 60 GB point cloud during feature extraction is challenging.


Tiling the point cloud, with an engine in the feature extraction software that handles these tiles and loads only the ones on screen. TopoDOT solved this issue quite well.

2) (Semi) automated feature extraction 

Automatic feature extraction in such complex MLS data produces errors. Validation and correction take more time than doing the work manually.

I believe that the computer should only give suggestions, and the operator should validate them on the fly. This way data production is faster and QC time is shorter. TopoDOT has very good tools for this.

Please describe your experience with TopoDOT.

I first tested TopoDOT in 2012. I had some very useful webinar training that gave me enough knowledge to work on my own.

We implemented many of the built-in solutions into our production workflow, and thanks to the flexibility and strong support of TopoDOT, we could build our own custom, semi-automated tools.

My role was to find the most efficient way to use the existing toolset of TopoDOT. It was also my task to connect the CAD DGN files with the database; this way we could build a GIS geodatabase that enjoys the advantages of both CAD drawing tools and the specialized MLS point cloud feature extraction solutions of TopoDOT.

How do you see the state of the art changing over the next few years? 

LIDAR is growing in importance in the market. Many professionals are beginning to understand the advantages of very detailed 3D data. MLS data gives the ability to analyze and provide automatic or semi-automatic results for complex analyses.

Regardless of how they are made (laser scanners or imaging sensors), 3D point clouds and models are important for visualization.

Any other thoughts that you would like to share? 

Here is a short video of RODIS – it helps to understand its goal and the important role TopoDOT plays in this project.

I also want to highlight the CSR (corporate social responsibility) side of this project. Almost half of the feature extraction and point cloud processing operators are deaf. After 2 years of experience, I can report that these colleagues are the best, most productive, and most reliable group on our team.


To learn more about this world-class asset management project, you can meet Gyula in person at the upcoming TopoDOT User Conference, May 9 – 13, in Orlando, FL, where he will be the keynote speaker. We’ll see you at TUC in sunny Florida.
