Case Study: End-to-End Middleware Integration and Data Migration Project

Project Title: Middleware Synchronisation and Data Migration Between SMT Equipment Providers

Client: EMS Manufacturer (“ClientTech”)

Technology Providers:

  • PROVIDER_A: Hanwha SMT Equipment and MES database (Microsoft SQL Server)
  • PROVIDER_B: Mycronic SMT Storage Systems and Traceability database (PostgreSQL)

Keywords: Middleware Integration, SMT Synchronisation, EMS Middleware Solution, MSSQL-PostgreSQL Bridge, Real-Time Production Synchronisation, Factory Automation, SMT Equipment Integration, Python Middleware Development, SMT Manufacturing Software, Industry 4.0 Integration, Data Migration, Factory Data Migration Solutions


1. Project Background

ClientTech operates a high-mix, medium-volume electronic manufacturing service (EMS) facility. Two principal vendors provided SMT equipment and storage solutions, resulting in fragmented production data.

Challenge: The facility lacked real-time synchronisation and efficient data migration between the PROVIDER_A (Hanwha, MSSQL Server) and PROVIDER_B (Mycronic, PostgreSQL) systems, which led to manual interventions, data mismatches, and operational inefficiencies.

Objective: Develop a fully automated, future-proof middleware that synchronises and migrates production data seamlessly, thereby enhancing production tracking accuracy and digitalisation.


2. Objectives

  • Enable near-real-time data synchronisation (polling every minute).
  • Establish bi-directional data flow for component registration and consumption updates.
  • Migrate reel, carrier, and stock information accurately between systems.
  • Minimise manual operations and error-prone reconciliations.
  • Prepare for future deployment in containerised (Docker) environments.

3. Solution Overview

A bespoke Python middleware service was developed to bridge PROVIDER_A’s Hanwha MSSQL Server and PROVIDER_B’s Mycronic PostgreSQL database via Web Services.

Key Features:

  • Real-time polling of storage system APIs.
  • Database view reading and stored procedure calls for updates.
  • Structured error handling and comprehensive logging.
  • Modular, scalable architecture ready for containerisation.
  • Seamless data migration of legacy records during initialisation.

4. System Architecture

PROVIDER_A (Hanwha MSSQL Database) <--> Python Middleware Service <--> PROVIDER_B (Mycronic PostgreSQL WebService API)

Components:

  • Database Connectors (pyodbc for MSSQL, psycopg2 for PostgreSQL)
  • XML WebService API Client
  • Data Mapper for field translation
  • Core Synchronisation Engine
  • Retry and Logging Mechanism
  • Optional REST API endpoints for manual operations
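The Data Mapper component can be illustrated with a minimal sketch. The column and field names below are hypothetical stand-ins, since the real Hanwha and Mycronic schemas are client-specific:

```python
# Minimal field-translation sketch for the Data Mapper component.
# Column/field names are illustrative assumptions, not the actual schemas.

FIELD_MAP = {
    # PROVIDER_A (Hanwha MSSQL) column -> PROVIDER_B (Mycronic) field
    "ReelID": "reel_id",
    "PartNumber": "component_name",
    "Qty": "quantity",
    "CarrierNo": "carrier_id",
}

def map_reel_record(mssql_row: dict) -> dict:
    """Translate one MSSQL row into the payload expected by the web service,
    silently skipping columns that have no mapping."""
    return {
        target: mssql_row[source]
        for source, target in FIELD_MAP.items()
        if source in mssql_row
    }
```

Keeping the mapping in one declarative table makes field-mapping changes (a recurring pain point in data migration) a one-line edit rather than a code change scattered across the sync engine.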


5. Sample Python Project Structure

middleware/
├── config/
│   ├── settings.yaml
│   └── logging.conf
├── connectors/
│   ├── mssql_connector.py
│   ├── postgres_connector.py
│   ├── webservice_client.py
├── core/
│   ├── sync_engine.py
│   ├── data_mapper.py
│   ├── error_handler.py
├── api/
│   ├── server.py
│   └── routes/sync_routes.py
├── utils/
│   ├── xml_utils.py
│   ├── service_utils.py
├── logs/
│   └── middleware.log
├── tests/
│   ├── test_sync_engine.py
│   ├── test_connectors.py
│   └── test_mapper.py
├── main.py
├── README.md
└── requirements.txt

6. Implementation Details

Development Stack:

  • Python 3.11
  • Libraries: requests, lxml, pyodbc, psycopg2, apscheduler
  • Security: SSL/TLS API communication, encrypted credential storage
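One way to keep credentials out of source control, sketched below with hypothetical environment-variable names (the actual secret store used in the project is not specified here):

```python
import os

def db_connection_string() -> str:
    """Build the MSSQL connection string from environment variables so that
    credentials never appear in the codebase. Variable names are illustrative."""
    user = os.environ["MIDDLEWARE_DB_USER"]
    password = os.environ["MIDDLEWARE_DB_PASSWORD"]
    host = os.getenv("MIDDLEWARE_DB_HOST", "localhost")
    # Encrypt=yes enforces TLS on the ODBC connection.
    return (
        f"DRIVER={{ODBC Driver 18 for SQL Server}};"
        f"SERVER={host};UID={user};PWD={password};Encrypt=yes"
    )
```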

Middleware Functionalities:

  • Poll PROVIDER_B (Mycronic WebService) every 60 seconds for updated component data.
  • Insert or update reels and carrier data during the migration process.
  • Push new reel/component information into PROVIDER_B when registered in PROVIDER_A.
  • Update reel quantities back into PROVIDER_A upon unload events from PROVIDER_B.
  • Log all operations and errors with automatic retries.
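The 60-second polling cycle can be sketched as follows. The fetch and apply callables are injected, so the cycle stays testable; in the real service they would wrap the Mycronic web-service client and the MSSQL connector:

```python
import time

POLL_INTERVAL_S = 60  # near-real-time cadence from the project objectives

def poll_once(fetch_updates, apply_updates) -> int:
    """One polling cycle: fetch changed component data from PROVIDER_B
    and push it into PROVIDER_A. Returns the number of updates seen."""
    updates = fetch_updates()
    if updates:
        apply_updates(updates)
    return len(updates)

def run_forever(fetch_updates, apply_updates) -> None:
    """Loop forever; a failed cycle is logged and retried on the next tick
    rather than crashing the service."""
    while True:
        try:
            poll_once(fetch_updates, apply_updates)
        except Exception as exc:
            print(f"poll failed, retrying next cycle: {exc}")
        time.sleep(POLL_INTERVAL_S)
```

In production the project used apscheduler for scheduling; the plain loop above is only meant to show the shape of one cycle.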

7. Testing Strategy

  • Unit Testing: Each component tested independently.
  • Integration Testing: Mocked databases and APIs used.
  • System Testing: Full validation against production data.
  • Validation Scenarios: Successful migration, synchronisation, and failure recovery.
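The integration-testing approach (mocked databases and APIs) can be sketched with `unittest.mock`. The `sync_reels` step and its method names are hypothetical simplifications of the real sync engine:

```python
from unittest import mock

def sync_reels(mssql_conn, ws_client) -> int:
    """Hypothetical sync step: read new reels from MSSQL and register
    each one with the PROVIDER_B web service."""
    reels = mssql_conn.fetch_new_reels()
    for reel in reels:
        ws_client.register_reel(reel)
    return len(reels)

def test_sync_reels_pushes_every_reel():
    # Mock both sides so the test needs no database or network.
    mssql = mock.Mock()
    mssql.fetch_new_reels.return_value = [{"reel_id": "R1"}, {"reel_id": "R2"}]
    ws = mock.Mock()

    assert sync_reels(mssql, ws) == 2
    assert ws.register_reel.call_count == 2
    ws.register_reel.assert_called_with({"reel_id": "R2"})
```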

8. Deployment Approach

Initial Deployment:

  • Native Python service installed on ClientTech’s on-premises servers.

Future Roadmap:

  • Prepare Dockerfile and docker-compose templates for containerisation.

9. Results

  • Synchronisation success rate: >99.9% in the production environment.
  • Manual interventions reduced by 85%.
  • Real-time production and material visibility achieved.
  • Legacy data fully migrated and validated.
  • Foundation laid for scalable Industry 4.0 transformations.

10. Lessons Learned

  • Proper timeout configurations are crucial for heavy database loads.
  • Field mapping precision is critical for successful data migration.
  • Dead Letter Queue design proved vital for resilience.
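A minimal sketch of the retry-plus-dead-letter pattern mentioned above, assuming an in-memory queue (a production DLQ would be persisted):

```python
import collections

# In-memory stand-in for a persistent dead-letter store.
dead_letter_queue = collections.deque()

def send_with_retry(payload, send, max_attempts: int = 3) -> bool:
    """Try to deliver payload up to max_attempts times; park it in the
    dead-letter queue on final failure so no record is silently lost."""
    for _attempt in range(max_attempts):
        try:
            send(payload)
            return True
        except Exception:
            pass  # real code would log the error and back off before retrying
    dead_letter_queue.append(payload)
    return False
```

Parked records can then be reconciled or replayed once the downstream system recovers, which is what made the design vital for resilience.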

11. Conclusion

ClientTech successfully implemented a robust, scalable middleware and data migration solution, ensuring real-time synchronisation between SMT systems and significantly improving production efficiency and data integrity.

Date: April 2025

Anton Fieraru
