Est. 1985 Software Architecture • Engineering • Leadership 40 Years of Excellence

Fred Lackey

Software Architect, Engineer & Leader

A Company Mid-Acquisition. Dozens of IT Estates. Zero Coherence.

Jenoptik was moving fast. The manufacturer was in the middle of an aggressive global expansion — acquiring companies across Ireland, continental Europe, and the United States in rapid succession. Each acquisition brought a new IT estate: different operating systems, different user directories, different hardware inventories, different phone systems, different email infrastructure. No two were alike.

The result was a sprawling, fragmented environment where nobody had a complete picture of what existed, where it was, or who owned it. New employees arrived to find their machines unready, their accounts not provisioned, their access incomplete. Departing employees left behind hardware that sat idle for months. Procurement was reactive and redundant. Compliance across international jurisdictions was managed, if at all, by hand.

“The company was buying faster than it could integrate. The question was whether the infrastructure could keep pace with the ambition.”

Fred Lackey was brought in to answer that question. The mandate: unify the IT and HR operational fabric across every acquired entity, at global scale, using whatever the underlying platforms happened to be — because standardizing the platforms themselves was not an option.

Global Role Mapping: The Foundation

Before a single machine could be provisioned automatically, a fundamental problem had to be solved: no one had defined what roles meant across the organization. A “Systems Administrator” at a newly acquired company in Germany was not necessarily the same role as one at a legacy site in Ireland — different responsibilities, different access requirements, different hardware needs.

The first system Fred built gave Jenoptik’s central HR team the ability to define and map equivalent roles across all acquired organizations. Each role was described in terms of its responsibilities and then linked to a canonical set of IT requirements: what hardware was needed, what software licenses were required, what system access was appropriate, what phone configuration should be applied.

This role-mapping layer became the master dictionary that everything else ran against. When HR onboarded a new employee, the system knew exactly what that person needed based solely on their role — regardless of which acquired company they were joining or which country they were in.
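In modern terms, that master dictionary can be sketched as a two-level lookup: local job titles resolve to a canonical role, and the canonical role carries the requirement set. The Python below is purely illustrative — the company names, titles, and requirement fields are invented for the example, and the original system was not written in Python.

```python
from dataclasses import dataclass

@dataclass
class RoleRequirements:
    """Canonical IT requirements attached to one role in the master dictionary."""
    hardware: list
    software_licenses: list
    system_access: list
    phone_config: str

# Hypothetical alias table: a local title at each acquired company
# resolves to one canonical role, regardless of country or entity.
ROLE_ALIASES = {
    ("DE-Acquired-Co", "Systemadministrator"): "sysadmin",
    ("IE-Legacy-Site", "Systems Administrator"): "sysadmin",
}

# Hypothetical canonical role map: one requirement set per canonical role.
CANONICAL_ROLES = {
    "sysadmin": RoleRequirements(
        hardware=["workstation"],
        software_licenses=["os-admin-tools"],
        system_access=["domain-admins", "unix-wheel"],
        phone_config="desk extension + voicemail",
    ),
}

def requirements_for(company: str, local_title: str) -> RoleRequirements:
    """Resolve a local job title to its canonical requirement set."""
    canonical = ROLE_ALIASES[(company, local_title)]
    return CANONICAL_ROLES[canonical]
```

The point of the structure is the one the text makes: two differently titled hires at two acquired companies resolve to the same canonical requirement set, so downstream provisioning never needs to know which entity they joined.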

From Role Assignment to Full Provisioning

With role definitions in place, the provisioning system could execute end-to-end without manual intervention. What had previously required days of coordination between HR, IT, and procurement teams was compressed into an automated workflow triggered by a single event: a new hire being entered into the system.

01. Role lookup. The system matched the new employee’s role against the canonical role map to determine the full set of hardware, software, and access requirements applicable to their position and location.

02. Inventory check. Available hardware at the relevant site was queried. If suitable equipment existed, it was reserved. If inventory was insufficient, a procurement workflow was automatically initiated — generating purchase order requests and routing approval emails to the appropriate managers.

03. Machine imaging. The assigned machine was imaged using Norton Ghost, applying the standardized operating environment for that role. Bootable images on CD/DVD eliminated dependency on local IT expertise — a field technician with no specialized knowledge could deploy a machine correctly.

04. Distributed command execution. Provisioning instructions were dispatched to remote nodes across the global network via the asynchronous messaging system. Each node executed its assigned tasks — creating domain accounts, configuring SSH credentials, setting phone extensions and voicemail, granting email access — and returned status reports to the central orchestrator.

05. Travel-aware access. The system accounted for employees who worked across multiple sites. Access was provisioned for every location the employee needed, not just their home office — ensuring continuity for the globally mobile workforce that Jenoptik’s acquisition strategy required.

The same workflow ran in reverse for departures and role changes: hardware was recovered or re-provisioned, access was revoked, and assets were returned to inventory rather than left to accumulate in desk drawers.
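The five steps and their reverse can be summarized as one pipeline triggered by an HR event. This Python sketch is illustrative only — every function name and data value is invented for the example; nothing here is the original system’s API.

```python
# Minimal sketch of the provisioning pipeline (steps 01-05) and its reverse.

def role_lookup(role):
    # 01. Resolve the role against the canonical role map (hypothetical data).
    role_map = {"engineer": {"hardware": "workstation", "access": ["domain", "email"]}}
    return role_map[role]

def inventory_check(site):
    # 02. Reserve on-hand equipment, or raise a purchase-order workflow.
    stock = {"jena": 3, "dublin": 0}  # invented stock levels
    return "reserved" if stock.get(site, 0) > 0 else "purchase-order-raised"

def provision(employee):
    reqs = role_lookup(employee["role"])
    return {
        "asset": inventory_check(employee["site"]),
        "imaged": True,                      # 03. standardized image applied
        "dispatched": list(reqs["access"]),  # 04. instruction files to remote nodes
        "sites": employee["travel_sites"],   # 05. travel-aware access, every site
    }

def deprovision(employee):
    # The same workflow in reverse for departures and role changes.
    return ["revoke-access", "recover-hardware", "return-to-inventory"]
```

Note how step 02 branches: a site with stock yields a reservation, while an empty site yields a procurement event instead of a failure — the workflow always advances.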

A Distributed Command Processor Before the Pattern Had a Name

Coordinating automated tasks across a heterogeneous global network in the late 1990s meant building the coordination infrastructure from scratch. No suitable middleware existed. Cloud orchestration did not exist. What Fred built instead was a modular, asynchronous file-based messaging system — a command bus implemented over FTP.

CENTRAL ORCHESTRATOR  // Role map + workflow engine
  HR event received → role lookup → generate instruction files
  Instruction files written to FTP staging locations

  ↓ instruction files dispatched via FTP

REMOTE NODE PROCESSORS
  Windows NT Domain Controller    → user accounts, group memberships
  Unix / Sun / SGI / BSD          → SSH credentials, home directories
  Novell NetWare                  → file system permissions, print access
  Phone system (Nortel / Tadiran) → extension assignment, voicemail
  Email (MSMail / Unix mail)      → mailbox provisioning, routing

  ↓ status and result files returned via FTP

CENTRAL ORCHESTRATOR
  Status consumed → workflow advances → next instruction dispatched
  Exceptions escalated via email approval routing

Each node processor was independently executable: new platform support was added by deploying a new processor module, with no changes to the orchestrator required.

The architectural insight was treating the FTP file system as a message queue. Nodes polled their designated directories, consumed instruction files, executed locally, and deposited result files for collection. The central orchestrator never needed a direct connection to a remote machine — it only needed FTP access, which was available everywhere. New platform types were supported by writing a new processor module and deploying it; the orchestration layer required no modification.
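The file-as-message pattern is simple enough to sketch. The Python below uses local directories in place of FTP staging areas and invented file-naming conventions; it is a sketch of the pattern, not a reconstruction of the original implementation.

```python
# Sketch: a directory acts as the message queue; files are the messages.
import json
import pathlib
import tempfile

def write_instruction(staging: pathlib.Path, name: str, payload: dict):
    # Orchestrator side: deposit an instruction file for a node to consume.
    # Write-then-rename, so a polling node never sees a half-written file.
    tmp = staging / (name + ".tmp")
    tmp.write_text(json.dumps(payload))
    tmp.rename(staging / (name + ".cmd"))

def poll_once(staging: pathlib.Path, results: pathlib.Path):
    # Node side: consume each instruction file, execute it locally,
    # and deposit a result file for the orchestrator to collect.
    handled = []
    for cmd in sorted(staging.glob("*.cmd")):
        payload = json.loads(cmd.read_text())
        outcome = {"task": payload["task"], "status": "ok"}  # stand-in for real work
        (results / (cmd.stem + ".result")).write_text(json.dumps(outcome))
        cmd.unlink()  # consuming the message removes it from the queue
        handled.append(payload["task"])
    return handled
```

The orchestrator only ever writes to and reads from staging directories — exactly the property the text describes: no direct connection to any remote machine, only file transfer.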

This pattern — now described in terms of event-driven architecture and decoupled processing — was built from first principles to solve a practical problem in an era when those concepts had not yet been formalized.

Every Platform. Every Vendor. No Exceptions.

The acquisitions had made no attempt at platform consistency — because acquisitions rarely do. Each company came with the infrastructure it had built independently over years. The provisioning system had to work against all of it, simultaneously, without requiring any of the acquired companies to change their underlying infrastructure first.

Operating Systems

Windows NT, Novell NetWare, Sun Unix, SGI Indigo, BSD, Slackware Linux

Email Infrastructure

Unix mail, MSMail (predecessor to Microsoft Exchange) — each requiring distinct provisioning paths

Phone Systems

Nortel, Tadiran, Meridian — extension and voicemail configuration handled per-system

Programming Languages

C, C++, Visual Basic — selected per-processor based on what each platform required

The cross-platform breadth was not incidental. Each processor module was written in whatever language best suited the target environment. The orchestration layer was platform-agnostic by design — it dispatched instruction files and consumed result files; what happened in between was entirely the processor’s concern. This separation meant the system could grow to cover any new platform acquired in future without architectural changes.
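That separation is, in today’s vocabulary, a plug-in registry: the orchestration layer dispatches by platform key and never looks inside a processor. A hedged Python sketch, with invented platform names and messages:

```python
# Illustrative plug-in registry. The dispatch layer below never changes;
# supporting a new platform means registering one more processor function.
PROCESSORS = {}

def processor(platform):
    """Decorator that registers a processor module under a platform key."""
    def register(fn):
        PROCESSORS[platform] = fn
        return fn
    return register

@processor("windows_nt")
def provision_nt(instruction):
    return f"NT account created for {instruction['user']}"

@processor("netware")
def provision_netware(instruction):
    return f"NetWare permissions set for {instruction['user']}"

def dispatch(platform, instruction):
    # Platform-agnostic by design: what happens inside the processor
    # is entirely the processor's concern.
    return PROCESSORS[platform](instruction)
```

Adding, say, a Unix processor touches only the registry — the dispatch function is untouched, which is the architectural property that let the system absorb future acquisitions.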

What It Changed

The measurable outcomes were significant. But the less quantifiable change — giving the central IT and HR teams visibility and control over an infrastructure they had never been able to see in full — was arguably more consequential for a company still in the middle of acquiring more entities.

  • Reduced IT workload by over 60% — provisioning that previously required days of manual coordination became an automated workflow triggered by a single HR event
  • Eliminated hardware waste through real-time inventory awareness; assets were recaptured and re-provisioned rather than lost to turnover and role changes
  • Enabled centralized reporting and global visibility into resource allocation across all acquired entities simultaneously
  • Allowed HR and IT to scale operations without expanding headcount — the platform absorbed the volume that acquisition-driven growth generated
  • Standardized compliance across international jurisdictions without requiring any acquired company to change its underlying infrastructure
  • Made deployment hardware-agnostic through bootable machine images, removing the requirement for local IT expertise at remote and newly acquired sites
  • Created an extensible architecture that could absorb new platforms and new acquisitions by adding processor modules rather than rearchitecting the core system
🏆

Jenoptik’s Highest Internal Recognition

The “Mission from God” Award

In recognition of the platform’s impact on Jenoptik’s global operations, Fred was presented with a gold statue, the company’s highest internal honor. The name was a fair description of the brief: it reflected both the scope of what had been asked and the scale of what was delivered.

Stack & Craft

The system was built without the benefit of modern orchestration frameworks, message queue services, or cloud infrastructure. Everything that needed to exist was built. The FTP-based messaging pattern, the role-mapping schema, the processor module architecture, the inventory tracking layer — none of these had off-the-shelf equivalents suited to the problem at the time.

Languages: C · C++ · Visual Basic
Architecture: FTP-Based Command Bus · Async File Messaging · Modular Processor Nodes · Email Approval Routing
Platforms & Systems: Windows NT · Novell NetWare · Sun Unix · SGI Indigo · BSD / Slackware · MSMail · Nortel / Tadiran / Meridian · Norton Ghost Imaging