Case Study

Increased efficiency and security with Azure Data Factory

A leading service provider in the logistics industry modernized its IT infrastructure to meet the requirements of digital transformation. Data analysis improved, manual processes were reduced, and the cloud solution significantly raised both the security level and overall efficiency.
Industry: Logistics
Implementation: 2023
Company size: 220 employees
Company turnover: n.a.

Benefits and improvements achieved

  • Improved data analysis and internal transparency
  • Reduced manual processes
  • Increased efficiency
  • More professional external image
  • A milestone reached in the digital transformation
  • Significantly higher security level thanks to the cloud solution

Project overview

The company, a leading service provider in the logistics industry, was faced with the challenge of modernizing its IT infrastructure to meet the requirements of digital transformation.

The status quo was characterized by manual and in some cases outdated processes for handling large amounts of data that suppliers delivered in various formats such as XML and Excel. These inventory lists were uploaded manually to a SharePoint document library several times a day, a process that was both time-consuming and resource-intensive.

The target concept was to automate this process by introducing Azure Data Factory. The aim was to transfer the data received from suppliers seamlessly and promptly to a SQL database hosted in the cloud. This automation was intended not only to speed up data processing but also to prepare the data and make it available to a third-party provider for further use. The restructuring aimed to improve internal data analysis and create greater transparency for the company and its customers.

Challenges

The challenges included the introduction of Azure Data Factory as a new technology, data crawling, and the mapping of data sources. The learning curve was steep, as the team had no previous experience with Azure Data Factory. Change management was also demanding, as many changes and adjustments were necessary.

Solutions

Optimizing the data management processes with Azure Data Factory led to two key solutions, among others:

XML import pipeline:

This pipeline runs daily and retrieves new data from the SharePoint document library. The connection is made via a REST interface that accesses the SharePoint server directly. Once downloaded, the data is extracted according to a specific schema and inserted into a dedicated Azure SQL database, where a third-party provider can access it to develop BI reporting solutions. The pipeline processes data in XML and, where necessary, Excel format stored in the SharePoint document library.
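
The actual pipeline definition is not part of this case study, so the following Python sketch only illustrates the data flow the pipeline implements: list the files in the SharePoint library via the REST interface, download each one, extract the rows according to a fixed schema, and insert them into the Azure SQL database. All endpoints, table, element, and credential names (SITE, FOLDER, dbo.Inventory, Item, Sku, ...) are hypothetical placeholders, authentication is omitted, and in the real project these steps are orchestrated by Azure Data Factory activities rather than custom code.

```python
import requests
import pyodbc
import xml.etree.ElementTree as ET

# Placeholder values -- the customer's real site, library, schema and
# credentials are not part of this case study.
SITE = "https://example.sharepoint.com/sites/suppliers"
FOLDER = "/sites/suppliers/Shared Documents/Inventory"
SQL_CONN = ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=example.database.windows.net;DATABASE=Inventory;"
            "UID=etl;PWD=<secret>")

def list_xml_files(session: requests.Session) -> list[str]:
    """List XML files in the document library via the SharePoint REST API."""
    url = f"{SITE}/_api/web/GetFolderByServerRelativeUrl('{FOLDER}')/Files"
    resp = session.get(url, headers={"Accept": "application/json;odata=verbose"})
    resp.raise_for_status()
    files = resp.json()["d"]["results"]
    return [f["ServerRelativeUrl"] for f in files if f["Name"].endswith(".xml")]

def import_file(session: requests.Session, path: str, cur: pyodbc.Cursor) -> None:
    """Download one inventory list and insert its rows into Azure SQL."""
    url = f"{SITE}/_api/web/GetFileByServerRelativeUrl('{path}')/$value"
    resp = session.get(url)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    # The element and column names below stand in for the supplier schema.
    for item in root.iter("Item"):
        cur.execute(
            "INSERT INTO dbo.Inventory (Sku, Quantity, Supplier) VALUES (?, ?, ?)",
            item.findtext("Sku"),
            int(item.findtext("Quantity", "0")),
            item.findtext("Supplier"))

def run_daily_import() -> None:
    session = requests.Session()  # authentication (e.g. a bearer token) omitted
    with pyodbc.connect(SQL_CONN) as conn:
        cur = conn.cursor()
        for path in list_xml_files(session):
            import_file(session, path, cur)
        conn.commit()

if __name__ == "__main__":
    run_daily_import()  # in production, triggered once per day
```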

SQL on-prem to cloud pipeline:

The company operates a local SQL Server located in a protected data center. A secure connection to Azure Data Factory is established via a self-hosted Integration Runtime that runs locally on the server. This configuration makes it possible to transfer data to the cloud several times a day. The data is not only transferred but also transformed and synchronized for BI reporting. Here, too, a third-party provider has access to the Azure SQL server and develops BI reporting solutions based on it.
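
As a rough illustration of what each run of this pipeline does, the sketch below implements a watermark-based incremental copy in Python: rows modified since the last run are read from the on-premises server and merged into the Azure SQL database. All server, table, and column names (etl.Watermark, dbo.Orders, rpt.Orders, ...) are invented for the example, and the secure on-premises connectivity that the self-hosted Integration Runtime provides in Azure Data Factory is simply assumed.

```python
import pyodbc

# Placeholder connection strings; in the real project the on-premises server
# is reachable only through the locally installed Integration Runtime.
ONPREM = ("DRIVER={ODBC Driver 18 for SQL Server};"
          "SERVER=sql01.internal;DATABASE=Operations;Trusted_Connection=yes")
CLOUD = ("DRIVER={ODBC Driver 18 for SQL Server};"
         "SERVER=example.database.windows.net;DATABASE=Reporting;"
         "UID=etl;PWD=<secret>")

def sync_orders() -> None:
    """Copy rows changed since the last run and upsert them into the cloud DB."""
    with pyodbc.connect(ONPREM) as src, pyodbc.connect(CLOUD) as dst:
        dcur = dst.cursor()
        # High-water mark kept in the target database (hypothetical table).
        dcur.execute("SELECT MaxModified FROM etl.Watermark WHERE TableName = 'Orders'")
        watermark = dcur.fetchone()[0]

        scur = src.cursor()
        scur.execute(
            "SELECT OrderId, Status, Amount, Modified FROM dbo.Orders "
            "WHERE Modified > ?", watermark)
        new_mark = watermark
        for order_id, status, amount, modified in scur.fetchall():
            # A light transformation step for BI reporting could be applied here.
            dcur.execute(
                "MERGE rpt.Orders AS t USING (SELECT ? AS OrderId) AS s "
                "ON t.OrderId = s.OrderId "
                "WHEN MATCHED THEN UPDATE SET Status = ?, Amount = ?, Modified = ? "
                "WHEN NOT MATCHED THEN INSERT (OrderId, Status, Amount, Modified) "
                "VALUES (s.OrderId, ?, ?, ?);",
                order_id, status, amount, modified, status, amount, modified)
            new_mark = max(new_mark, modified)
        dcur.execute(
            "UPDATE etl.Watermark SET MaxModified = ? WHERE TableName = 'Orders'",
            new_mark)
        dst.commit()

if __name__ == "__main__":
    sync_orders()  # in production, scheduled several times a day
```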

Challenges and change management:

Working with Azure Data Factory was a new challenge for our team. Familiarization with the platform was complex, and since the project did not follow a Scrum approach, a rather consultative way of working with numerous changes dominated. The project management structure provided for an application manager to lead the specialist departments, which had a significant impact on change management within the project.

Quote:

"We were convinced that MAKONIS is professionally competent, and we still are. There are many service providers, but for us the human component is crucial."

Results

The project led to qualitative improvements in internal processes. Faulty processes were corrected thanks to the newly gained transparency. Employees' workload was reduced because many manual steps were eliminated, and data preparation and security were improved.

In addition, the implemented solutions enabled easy scaling of data processing operations, which helped to cope with peaks in data demand and keep pace with the company's growth. The improved data management also helped to meet data protection and compliance requirements and strengthened risk management through more accurate data analysis.

In addition, centralized data storage in the cloud facilitated collaboration between different departments and locations, allowing teams to work more efficiently and share information seamlessly. These comprehensive improvements helped the company become more agile, data-driven and competitive.

Techstack

The technology stack comprised Azure Data Factory for the data transformation process, Azure SQL Server for data storage, and Power BI for data visualization.

  • Azure Data Factory
  • Azure SQL Server
  • Power BI

Have we sparked your interest?

Find out more about our workshops and offers.