Justin Lalieu is a Data Scientist and Full-Stack Engineer at Kapernikov. With a Master’s in Data Science from UCLouvain, he specializes in bridging the gap between complex algorithms and user-friendly web interfaces.
He combines expertise in AI/ML with a strong passion for GIS and geospatial data, building intuitive tools for infrastructure managers.
What are you building for the Asset Master Data project?
We are building a centralized web application to manage asset data. The reality is that railway data lives in many different systems—GIS, SAP, signaling databases—and those systems don't always speak the same language.
We created a solution that ingests this data through a workflow engine and presents it in a unified visualization layer. It’s about bringing everything together in a way that makes sense.
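To make the idea concrete, here is a minimal sketch of that unification step. All names (the `AssetRecord` structure, the system labels, the sample asset IDs) are illustrative, not the project's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class AssetRecord:
    """One unified view of an asset, merged from several source systems."""
    asset_id: str
    attributes: dict = field(default_factory=dict)
    sources: list = field(default_factory=list)

def merge_sources(*source_feeds):
    """Merge per-system feeds (each a dict of asset_id -> attributes)
    into one unified record per asset."""
    unified = {}
    for system_name, feed in source_feeds:
        for asset_id, attrs in feed.items():
            rec = unified.setdefault(asset_id, AssetRecord(asset_id))
            rec.attributes.update(attrs)  # later feeds override earlier ones
            rec.sources.append(system_name)
    return unified

# Toy feeds standing in for GIS and SAP exports
gis = {"SW-001": {"geometry": "POINT(4.35 50.85)"}}
sap = {"SW-001": {"maintenance_status": "OK"}}

assets = merge_sources(("GIS", gis), ("SAP", sap))
print(assets["SW-001"].sources)  # ['GIS', 'SAP']
```

In the real application this merging happens inside the workflow engine rather than in a single function, but the principle is the same: every asset ends up with one record that knows which systems contributed to it.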
Why is a custom web application necessary?
Accessibility. The raw data is complex, and we can't expect every user to be a computer scientist or a database expert.
The web app hides that complexity. It provides a clean interface for domain experts to view, validate, and manage data without needing to write SQL or understand the underlying ETL pipelines. It empowers them to make decisions faster.
Is it just about viewing data?
No, the goal is impact. We don't just want a dashboard; we want to improve the source quality. The application is designed to feed updates back to the source systems, keeping everything as up-to-date as possible. It closes the loop between analysis and operations.
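A rough sketch of that "closing the loop" pattern, with entirely hypothetical names: a user edit in the web app is captured as a change proposal and routed to a connector for the system that owns the data.

```python
from dataclasses import dataclass

@dataclass
class ChangeProposal:
    """A validated user edit waiting to be written back to its source."""
    asset_id: str
    field_name: str
    new_value: str
    target_system: str  # e.g. "GIS" or "SAP"; labels are illustrative

def dispatch(proposal, connectors):
    """Route a change proposal to the connector for its source system."""
    connectors[proposal.target_system](proposal)

# A stub connector that just records what it would send to SAP
sent = []
connectors = {"SAP": sent.append}

dispatch(ChangeProposal("SW-001", "status", "replaced", "SAP"), connectors)
print(len(sent))  # 1
```

The point of the indirection is that the web app never writes to a source system directly; each system keeps its own connector, so ownership of the data stays where it belongs.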
What was the biggest challenge in building this?
It wasn't just the volume of data, though that is tremendous. The real challenge was the complexity of the data model.
Railway assets aren't just "dots on a map." They are interconnected linear assets spread across the country. Visualizing that geometry nicely—and allowing users to adapt it—is a massive challenge. You need a good look-and-feel while handling complex topological connections in the background.
We tried to make a user-friendly application where someone could work with this complex geospatial master data without needing weeks of training. Balancing that technical depth with a simple user experience was the hardest—and most interesting—part of the work.
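The "not just dots on a map" point is worth illustrating. Linear assets are typically located by distance along a line rather than by a coordinate, a technique known as linear referencing. A minimal sketch, using a plain polyline instead of a real geometry library:

```python
import math

def locate_along(line, distance):
    """Return the (x, y) point at `distance` along a polyline
    given as a list of (x, y) vertices (linear referencing)."""
    travelled = 0.0
    for (x1, y1), (x2, y2) in zip(line, line[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if travelled + seg >= distance:
            t = (distance - travelled) / seg  # fraction along this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        travelled += seg
    return line[-1]  # past the end: clamp to the last vertex

# A toy track section: 100 units east, then 50 units north
track = [(0, 0), (100, 0), (100, 50)]
print(locate_along(track, 120))  # (100.0, 20.0)
```

Production GIS stacks do this with library support (for example, PostGIS and Shapely both offer interpolation along a line), but the underlying idea is exactly this: an asset's position is a measure along a linear feature, and the topology between features is what the application has to keep consistent in the background.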
What’s your takeaway from bridging data and UX?
It’s the variety. You’re not just building a pipeline or a screen; you're reconciling data. From the ingestion workflows in the backend to the interactive components in the browser, you get to own the full delivery of value. That direct line to the user is what ensures the data actually makes sense.
The takeaway
Great data engineering needs a great interface. By treating internal data tools as products with real UX requirements, we ensure that the master data isn't just accurate—it's actually used.