Almost Not Applying
When I first saw the announcement for an internship opportunity on the Z_GIS internship dashboard (I built a customized version of the dashboard when my own search was tough, check it out here), I hesitated. My initial thought was: “They’ll probably require fluent German.” I had the technical background, but language felt like an insurmountable barrier. I was still adjusting to living in Salzburg for my master’s studies, and though I had picked up a few German words, I wasn’t confident enough to work in the language professionally. For a while, I almost closed the tab and moved on.
But something nudged me to apply. I sent an email to express my interest. In fact, I didn't even attach my resume; I just wanted to apply and see what would happen. Here is a screenshot of the brief email I sent.

The response I received changed everything. The next thing I knew, I was being asked about my availability for a meeting to discuss the internship. Not even a formal interview where you meet HR, the boss, and the boss's boss. Trust me, I went to that meeting with my updated portfolio, ready to show my work; it was as if I had gone to a fist fight with a grenade launcher. I preloaded all my best projects in a web browser to make sure no internet glitch could derail me. At the meeting, I was confident and demonstrated my expertise in a concise and memorable manner. After the demonstration, we talked about the timeline, an overview of the tasks, and the start date. I'd say I essentially met the boss without going through any stressful process. It was only after the meeting that HR reached out for my resume and other details. Perhaps I was surprised because, in my previous work experience, prospective interns had to go through several rounds of interviews, and even exams or tasks, before getting an offer.
Looking back now, I am so glad I took the step. That one email taught me an important lesson: sometimes the only thing between you and an opportunity is the courage to ask a simple question.
The Host Institution
The internship was at the Department of Geodata Infrastructure, Water Division of Land Salzburg, located at Michael-Pascher-Straße 36, 5020 Salzburg. The department is at the heart of geospatial data management for the state of Salzburg. Their responsibilities include collecting, maintaining, and disseminating spatial data that support governance, planning, and public services. They are also behind some of the state’s most important geospatial initiatives:
- SAGIS Online: a public-facing platform for accessing Salzburg’s geodata.
- SAGIS Web: supporting professionals and administrators with advanced mapping tools.
- GIS4Kids: an educational initiative that introduces young students to the world of GIS in fun and engaging ways.
Walking into my office for the first time, I was struck by how integrated their work was with daily life in Salzburg. Whether it was land use planning, environmental monitoring, or education, spatial data was central, and this department was the backbone making it possible. I was assigned a workstation with two monitors and a brand-new headset for online meetings. I was later introduced to the team and their internal systems, including time tracking, lunch booking, and the staff directory, among other things. It was particularly fascinating how they all work together in a highly connected environment. With my IT experience, I could appreciate the kind of integrations they must have built to enable such a robust internal system combining storage allocation, device provisioning, geodatabase management, and automation.
My Journey Through the Tasks
The internship was structured around a series of projects, each with real-world applications. These tasks not only tested my existing skills but also pushed me into entirely new areas of GIS.
1. Converting Kesselfallstraße Data
The very first assignment was a classic GIS operation: converting a CSV dataset of the Kesselfallstraße into a usable feature class. At first glance it seemed simple, but the coordinates messed with me a little. For someone who often works with clean data and uses it mainly for visualization, going through the process of converting meters to geographic coordinates was not something I found interesting. When the points finally rendered correctly on the map, it was a small but satisfying victory.
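In the actual workflow the conversion runs through an ArcGIS tool (arcpy.management.XYTableToPoint with the right spatial reference handles the meters-to-geographic part), but the parsing and validation step that precedes it can be sketched in plain Python. The field names, delimiter, and sample row below are invented for illustration:

```python
import csv
import io

def load_points(csv_text, x_field="X", y_field="Y"):
    """Parse a CSV of projected coordinates into (x, y) tuples,
    skipping rows with missing or non-numeric coordinates."""
    reader = csv.DictReader(io.StringIO(csv_text), delimiter=";")
    points = []
    for row in reader:
        try:
            points.append((float(row[x_field]), float(row[y_field])))
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed rows rather than crash the import
    return points

# Hypothetical sample: projected coordinates in meters, semicolon-delimited.
sample = "X;Y;Name\n425000.1;5258000.9;Kesselfall 1\n;;empty row\n"
pts = load_points(sample)
```

Validating rows up front like this means the ArcGIS conversion step only ever sees clean coordinates.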
2. Validating OSM Road Data for Salzburg
Next, I validated OpenStreetMap’s road network data against authoritative Salzburg datasets. This was an exercise in data quality assurance: catching overlaps, gaps, and attribute inconsistencies. Editing geometry can be tedious, but I came to appreciate the importance of reliable road data, especially for services like navigation, emergency response, and urban planning. I am not the kind of person who enjoys going through every feature in a table and updating its attributes based on what I see in the underlying orthophoto of Salzburg, so this task taught me patience. After a few iterations, I eventually built a tailored workflow that enabled me to do it faster.
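The attribute side of that kind of check can be sketched as a simple field-by-field comparison between an OSM record and the authoritative record; the field names and values below are made up for illustration:

```python
def compare_attributes(osm_row, ref_row, fields):
    """Return the fields whose values differ between an OSM road
    record and the authoritative record, with both values for review."""
    return {f: (osm_row.get(f), ref_row.get(f))
            for f in fields
            if osm_row.get(f) != ref_row.get(f)}

# Hypothetical records for one road segment.
osm = {"name": "Getreidegasse", "maxspeed": "30", "surface": "paved"}
ref = {"name": "Getreidegasse", "maxspeed": "20", "surface": "paved"}
diffs = compare_attributes(osm, ref, ["name", "maxspeed", "surface"])
```

Flagging only the mismatching fields keeps the manual review focused on the features that actually need editing.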
3. Automating Geosphere Precipitation Data
A core aspect of my internship focused on automating repetitive workflows. One of the most impactful projects was the development of a Python script to automate the import of precipitation data provided in BIL raster format by Geosphere. Previously, these datasets had to be imported manually, which was both time-consuming and prone to error. I wrote a script in ArcPy that parsed file directories, converted the rasters into geodatabase-compatible formats, and appended them to an existing feature class.
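The directory-parsing stage of such a script can be sketched with the standard library alone; in the real workflow each discovered raster would then be passed to an ArcPy conversion tool before being appended. The folder layout here is an assumption:

```python
from pathlib import Path

def find_bil_rasters(root):
    """Recursively collect .bil rasters under a delivery folder,
    sorted so files are processed in a stable, chronological order
    (assuming date-based file names)."""
    return sorted(Path(root).rglob("*.bil"))
```

Sorting before processing makes reruns deterministic, which matters when appending to an existing feature class.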
4. Converting Scanned PDF Maps to TIF
In addition to handling raster datasets, I was tasked with converting scanned PDF maps into TIF images for archival and analytical purposes. This task required developing a script that could detect whether a PDF was scanned and then convert it into a high-resolution TIF format suitable for georeferencing. I used Python libraries in combination with ArcGIS tools to automate the conversion process. One difficulty was ensuring that the output maintained both resolution and clarity, which is crucial when dealing with scanned documents that often have faint lines and text. I tested several conversion settings before finalizing one that balanced file size with visual clarity. The result was a batch processing tool that significantly streamlined the conversion of historical maps, making them ready for integration into GIS workflows.
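One way to detect whether a PDF page is scanned is a text-extraction heuristic: a scanned page is essentially one big image and yields little or no extractable text (with a library like pypdf, that text would come from `page.extract_text()`). This is a sketch of the heuristic only; the threshold is an invented assumption:

```python
def looks_scanned(extracted_text, min_chars=20):
    """Heuristic: if a page yields fewer than `min_chars` characters of
    extractable text, treat it as a scanned (image-only) page that
    should be rasterized to TIF."""
    return len(extracted_text.strip()) < min_chars
```

Pages flagged this way can then be routed into the high-resolution conversion step, while born-digital PDFs are handled differently.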
5. Building a 3D Web Application with Three.js
One of the more advanced and rewarding tasks of my internship was the development of a web application for visualizing 3D MTL objects. The goal was to create a lightweight browser-based tool that could be integrated into SAGIS Online, providing users with an interactive way to explore three-dimensional geospatial objects. To achieve this, I used the Three.js framework alongside Node.js and JavaScript. I began by setting up a local environment where I could load sample 3D models and experiment with rendering techniques. After successfully displaying static models, I extended the application to include interactive controls such as zoom, pan, and rotation, which significantly enhanced the user experience. One of the challenges I faced was ensuring that large models loaded efficiently without causing performance bottlenecks in the browser. By implementing asynchronous loading and optimizing the model textures, I was able to overcome this issue, although the models still need to be downloaded to the browser before rendering. The completed application allowed seamless integration of 3D content within SAGIS Online, offering users a modern and intuitive visualization tool.
6. Cleaning and Integrating INVEKOS Data
Another task involved working with INVEKOS agricultural data obtained through the INSPIRE directive. The dataset, which contained agricultural information, was large and complex, requiring careful processing before it could be integrated into the geodatabase. My first step was to download the data in its raw format, after which I carried out a series of cleanup and conversion operations from geopackages to feature classes.
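A useful first step when inspecting a delivery like this is listing what layers a GeoPackage actually contains. A GeoPackage is a SQLite database, so its `gpkg_contents` table can be read with the standard library alone; this is a minimal sketch, not the full conversion:

```python
import sqlite3

def list_gpkg_layers(gpkg_path):
    """List (table_name, data_type) entries from a GeoPackage's
    gpkg_contents table, e.g. to decide which layers to convert
    into feature classes."""
    with sqlite3.connect(gpkg_path) as conn:
        return conn.execute(
            "SELECT table_name, data_type FROM gpkg_contents").fetchall()
```

The actual geometry conversion into geodatabase feature classes would then run through ArcGIS tooling, layer by layer.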
7. Tracking Parcel Lineage Over Time
A particularly interesting project I worked on was the development of a script to track the lineage of land parcels over time. Land parcel data is dynamic, with frequent changes due to subdivision, merging, or reclassification. The task required designing a system that could detect and record these changes, thereby creating a historical record of each parcel’s evolution. I approached the problem by developing a Python script that compared successive versions of parcel datasets, identified modifications in boundaries and attributes, and logged these changes in a separate tracking table. This provided not only a snapshot of the current parcel configuration but also a traceable history of how each parcel had changed. The most challenging aspect of this task was handling complex cases such as multiple splits or merges within the same update cycle. I refined the decision logic within the script to ensure that all possible scenarios were accounted for. The result was a robust tool that can serve as an invaluable resource for land administration and planning.
8. Importing Geosphere Data into Geodatabase
Another major part of my work was importing multiple datasets from Geosphere into a centralized geodatabase. The datasets included INCA, SNOWGRID, and WINFORE, each of which came in distinct formats and with varying temporal coverage. I wrote a Python script that automated the download, extraction, and transformation of these datasets into geodatabase feature classes. The script also included quality checks to ensure that no corrupted or incomplete files were imported. This automation saved countless hours that would otherwise have been spent on manual processing. The challenge in this task was accommodating the differences in file structures between the datasets, as each had its own parameter formats and naming conventions. By creating modular functions within the script, I was able to handle these variations gracefully. The end result was a well-structured geodatabase containing harmonized datasets, ready for visualization and analysis.
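The modular approach can be sketched as a per-dataset configuration table that the shared download/import functions consult; every pattern and parameter name below is invented for illustration, not Geosphere's actual naming:

```python
# Hypothetical per-dataset configuration: each product gets its own
# file-name pattern and parameter list, so the shared import logic
# never hard-codes dataset specifics.
DATASETS = {
    "INCA":     {"pattern": "inca_{date}.nc",    "params": ["RR", "T2M"]},
    "SNOWGRID": {"pattern": "snow_{date}.nc",    "params": ["SWE"]},
    "WINFORE":  {"pattern": "winfore_{date}.nc", "params": ["ET0"]},
}

def expected_filename(dataset, date):
    """Resolve the file name for a dataset and date from its config."""
    return DATASETS[dataset]["pattern"].format(date=date)
```

Adding a fourth dataset then means adding one dictionary entry rather than touching the import functions.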
9. Visualizing Geosphere Data
Once the Geosphere datasets had been successfully imported, I created detailed maps in ArcGIS Pro to visualize their various parameters. Each parameter in the datasets (precipitation, snow cover, and wind forecast) was displayed in its own layer, with appropriate symbology applied to highlight spatial patterns. I used classification methods such as graduated color ramps to represent continuous variables and ensured that each map adhered to cartographic standards for clarity and readability. This task not only required technical knowledge of ArcGIS Pro but also a strong understanding of how best to communicate data visually. One of the insights I gained was how different datasets can complement each other when displayed together, revealing relationships that are not immediately obvious in isolation.
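The class breaks behind a graduated color ramp are simple to compute; here is a minimal equal-interval sketch (ArcGIS Pro offers this and other methods, such as quantiles and natural breaks, built in):

```python
def equal_interval_breaks(values, n_classes):
    """Compute equal-interval upper class bounds for a continuous
    variable, as used for graduated color symbology."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / n_classes
    return [lo + step * i for i in range(1, n_classes + 1)]
```

Equal intervals read well for smoothly varying fields like precipitation; for skewed distributions a quantile scheme usually communicates the pattern better.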
10. Checking PDF File Properties
I was also responsible for developing a script that examined PDF files to determine whether they contained the appropriate properties. This was necessary for quality control in workflows where documents were required to adhere to certain standards. Using Python, I created a tool that read metadata and structure information from PDF files and flagged any documents that did not meet the predefined requirements. The script was capable of checking properties such as page dimensions, embedded fonts, and presence of digital signatures. Initially, handling files with inconsistent encodings presented some difficulty, but by integrating specialized PDF libraries, I was able to resolve these issues.
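The flagging logic can be sketched as a comparison of extracted properties against a requirements table; the property names and required values below are invented, and in practice the properties dictionary would be filled by a PDF library:

```python
# Hypothetical quality-control requirements for incoming documents.
REQUIRED = {"page_size_mm": (210, 297), "has_signature": True}

def flag_pdf(properties):
    """Return the names of required properties a document fails,
    so non-conforming files can be routed for correction."""
    return [key for key, wanted in REQUIRED.items()
            if properties.get(key) != wanted]
```

An empty result means the document passes; anything else lists exactly what to fix.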
11. Analyzing Wind Speed Data
The final analytical task I carried out involved examining Geosphere wind speed data over the preceding three months. This required extracting temporal subsets of the dataset, conducting statistical analyses, and producing visualizations to highlight trends. I calculated monthly averages, identified peak wind events, and compared spatial patterns across different regions. ArcGIS Pro was used to generate maps displaying wind speed gradients, while Python scripts computed the descriptive statistics. One challenge was the dataset's high temporal resolution, which initially made it cumbersome to process. To overcome this, I created scripts that filtered the data by date range before analysis. The outcome was a clear picture of wind speed variability, which could be valuable for applications such as renewable energy planning and risk assessment. Initially, I created an animation using wind speed as the symbol size and wind direction as the symbol rotation, and published it as a map on ArcGIS Online. Afterwards, I was tasked with creating a rose diagram, because it was important to see which direction the wind mainly came from during the period in view.
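The binning step behind a wind rose can be sketched in a few lines: directions in degrees are grouped into compass sectors, whose counts become the petal lengths. This is a generic sketch, not the production script:

```python
from collections import Counter

def rose_bins(directions_deg, n_sectors=16):
    """Bin wind directions (degrees, 0 = north, clockwise) into
    compass sectors for a wind rose; sector 0 is centered on north."""
    width = 360 / n_sectors
    return Counter(
        int(((d + width / 2) % 360) // width) for d in directions_deg)
```

Plotting those counts on a polar axis immediately shows the dominant wind direction over the period.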

The Big Challenge: Parcel Lineage
If I had to name one project as the defining challenge of this internship, it would be the parcel lineage script. Land parcel data is central to land administration, and tracking how parcels evolve over time is notoriously complex. Ordinarily, ArcGIS Parcel Fabric would have taken care of this, but here I had to account for:
- Splits (one parcel dividing into many).
- Merges (multiple parcels joining into one).
- Identifier changes.
- Edge cases that didn’t follow neat rules.
It took multiple iterations and countless debugging printouts to get it right. But this challenge taught me perseverance, logical thinking, and how to approach complexity step by step. I also learned to use Python's set data type, which proved valuable in my comparison module. It was the project I felt most proud of.
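The core of that comparison can be sketched with plain Python sets. The parentage mapping below (which new parcel came from which old parcel) is a hypothetical stand-in for the links the real data carried:

```python
def diff_parcels(old_ids, new_ids, parentage):
    """Compare two parcel-ID snapshots using sets. `parentage` maps
    each new parcel to the old parcel(s) it derives from, so splits
    and merges can be told apart from simple renames."""
    old_ids, new_ids = set(old_ids), set(new_ids)
    created = new_ids - old_ids
    retired = old_ids - new_ids
    # A split: one old parcel is the parent of several new parcels.
    child_count = {}
    for child in created:
        for parent in parentage.get(child, ()):
            child_count[parent] = child_count.get(parent, 0) + 1
    splits = {p for p, n in child_count.items() if n > 1}
    # A merge: one new parcel has several parents.
    merges = {c for c in created if len(parentage.get(c, ())) > 1}
    return {"created": created, "retired": retired,
            "splits": splits, "merges": merges}
```

Set difference and intersection make the "what appeared, what vanished" questions one-liners, which is exactly why sets were so useful in the comparison module.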
What I Learned
The internship wasn’t just about completing tasks, it was about learning, adapting, and growing. Some of my biggest takeaways were:
- ArcPy mastery: I became more fluent in automating workflows in ArcGIS Pro.
- Web and 3D visualization: I learned to use Three.js and Node.js, opening a new dimension of geospatial applications.
- APIs and data integration: I understood how to pull data from Geosphere and INSPIRE using modern API calls.
- The role of interoperability: I saw firsthand how mismatched fields and projections can stall projects, and how scripts can solve these challenges.
- 3D printing in GIS: I got to see how DEMs can be physically printed, making abstract landscapes tangible. Using the Prusa XL 3D printer, I even printed some boats for my children!
Beyond technical skills, I learned resilience: how to keep going when things don’t work, and how rewarding it feels when persistence pays off.
Looking Back
As I reflect on this journey, I am deeply grateful that I didn’t let the fear of a language barrier hold me back. The work I did at Land Salzburg gave me a stronger validation of my Python skills, taught me new technologies, and let me contribute to projects that have a real impact on governance and daily life. Most of my work has now been integrated into SAGIS Online, and I am super happy to have contributed so much!
Finally, I received excellent feedback from my supervisor that made me very happy. In her words: "Oke, the speed at which you get things done is not normal, it's crazy, and we are so impressed that we are able to integrate your work directly without any corrections." This was followed by the internship certificate, and I am seriously thinking of framing it! Check it out below.

I know you don't speak German. Haha. Here's the translation.
