fredlackey.com Showcase — Notable Engagements Software & Architecture

Fred Lackey

Senior Software Engineer & Architect

Before It Was Amazon

In 1995, the World Wide Web was not yet a year old as a commercial medium. Mosaic had recently given way to Netscape. E-commerce was theoretical. And in Seattle, a startup called Amazon.com was trying to answer a deceptively simple question: how do you put every book in print online?

The International Standard Book Number — the ISBN system — was the key. Every published book had one. The ISBN database held records for millions of titles: author, publisher, edition, availability. If you could connect to that database and pull it into a web-accessible catalog, you could build a bookstore that sold any title in print, without manually entering a single record.
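
What made the ISBN usable as a catalog key was that each identifier carried its own validity check. A ten-character ISBN of the era ends in a mod-11 check digit, so a record's identifier could be verified before it was trusted. A minimal sketch of that standard ISBN-10 rule (the function name is ours):

```python
def isbn10_is_valid(isbn: str) -> bool:
    """Validate an ISBN-10 using its mod-11 weighted checksum.

    Weights run 10 down to 1 across the ten characters; the final
    character may be 'X', representing a check value of 10.
    """
    chars = isbn.replace("-", "").upper()
    if len(chars) != 10:
        return False
    total = 0
    for weight, ch in zip(range(10, 0, -1), chars):
        if ch == "X" and weight == 1:
            value = 10          # 'X' is only legal as the check digit
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += weight * value
    return total % 11 == 0

print(isbn10_is_valid("0-306-40615-2"))  # → True (a well-known valid example)
```

A single transposed digit changes the weighted sum, so corrupted identifiers fail the check rather than silently entering the catalog.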

1995
Netscape Navigator overtakes Mosaic. The commercial web begins. Amazon.com prepares to launch its online bookstore.
2
Number of architects selected to build competing proofs-of-concept for the dynamic catalog system. Both were kept.
1M+
ISBN records available in the database — every title in print. The catalog that needed to be made searchable and purchasable.

Amazon contracted a consulting firm and asked for architects who could solve this. Through a rigorous selection process, two were chosen. Fred Lackey was one of them.

A Rigorous Selection

The process was not a standard job interview. Amazon needed a working proof-of-concept, not a whiteboard diagram. Each candidate was asked to design and build a system capable of interfacing with the ISBN database, extracting publication records, normalizing the data, and loading it into a structure that could power a dynamic, searchable online catalog.

The candidates did not know each other. They worked independently. The goal was to see who could actually solve the problem — not who could describe solving it best.

Two architects. Two independent proofs-of-concept. Both strong enough to keep. The outcome was not competition — it was collaboration.

When the results were reviewed, both submissions demonstrated enough technical merit to stand on their own. The decision was made not to choose one over the other, but to combine them: to take the strongest elements of each design and merge them into a single, more robust system. That combination became the technical foundation for the Amazon.com catalog.

The Proof of Concept

In 1995, there was no REST API to call. There was no managed cloud database service to spin up. The ISBN database was accessed via RS-232 serial communication — a direct hardware-level connection that required building the data pipeline from the physical layer up.

The work proceeded in three phases: establish the connection and extract the records, parse and normalize the raw data into a consistent structure, and load that structure into a relational SQL database capable of serving dynamic web queries.

  ISBN Database (RS-232 Serial)
             │
             ▼
  ┌──────────────────────┐
  │   Serial Interface   │  RS-232 connection layer
  │   (hardware layer)   │  raw byte stream extraction
  └──────────┬───────────┘
             │
             ▼
  ┌──────────────────────┐
  │  Parser / Normalizer │  field extraction, encoding
  │                      │  normalization, deduplication
  └──────────┬───────────┘
             │
             ▼
  ┌──────────────────────┐
  │     SQL Database     │  relational schema, indexed
  │    (Catalog Layer)   │  queryable, web-accessible
  └──────────┬───────────┘
             │
             ▼
  Online Bookstore Catalog
  (browse · search · purchase)
  • 01
    RS-232 Serial Interface

    Established a hardware-level communication pipeline to the ISBN database via RS-232 serial connection — the standard way external data systems exchanged bulk records before ubiquitous internet connectivity made such direct links unnecessary. Handled the raw byte stream, timing, and connection state management at the physical layer.
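
One core problem at this layer is that a serial read loop delivers arbitrarily sized chunks, so record boundaries rarely align with reads. As an illustrative sketch only — the actual 1995 link protocol is not documented here — assume records framed by STX (0x02) / ETX (0x03) control bytes:

```python
# Hypothetical framing convention for illustration; the real protocol
# details from 1995 are not preserved.
STX, ETX = 0x02, 0x03

def frame_records(chunks):
    """Yield complete records from an iterable of raw byte chunks,
    reassembling records split across chunk boundaries."""
    buffer = bytearray()
    in_frame = False
    for chunk in chunks:
        for byte in chunk:
            if byte == STX:            # start of record: reset state
                buffer.clear()
                in_frame = True
            elif byte == ETX and in_frame:
                yield bytes(buffer)    # record complete
                in_frame = False
            elif in_frame:
                buffer.append(byte)    # bytes outside a frame are noise

# A record split across two reads is still reassembled:
reads = [b"\x02TITLE|Snow C", b"rash\x03\x02TITLE|Dune\x03"]
print(list(frame_records(reads)))  # → [b'TITLE|Snow Crash', b'TITLE|Dune']
```

The same state-machine shape handles the connection-state concerns the text describes: the frame boundaries, not the read boundaries, define the unit of work.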

  • 02
    Data Extraction & Parsing

    Designed the parsing layer to extract structured records from the raw ISBN data stream. Each record contained publication metadata: title, author, publisher, edition, format, availability status. The parser handled malformed records, encoding inconsistencies, and the variability inherent in a database built up over decades by many contributing publishers.
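
The shape of that layer can be sketched as follows. The field layout here is hypothetical — the real 1995 feed format is not reproduced — but the defensive posture is the point: malformed or mis-encoded records are rejected, not allowed to corrupt the catalog.

```python
# Hypothetical record layout for illustration only: pipe-delimited
# ASCII fields in a fixed order.
FIELDS = ("isbn", "title", "author", "publisher", "edition", "availability")

def parse_record(raw: bytes):
    """Decode one raw record into a dict, or None if it is malformed."""
    try:
        text = raw.decode("ascii")       # era-typical 7-bit encoding
    except UnicodeDecodeError:
        return None                      # encoding inconsistency: skip
    parts = text.split("|")
    if len(parts) != len(FIELDS):
        return None                      # wrong field count: skip
    return dict(zip(FIELDS, (p.strip() for p in parts)))

rec = parse_record(b"0306406152|An Example Title|A. Author|Example Press|1|in print")
print(rec["title"])  # → An Example Title
```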

  • 03
    Normalization & Deduplication

    ISBN data in 1995 was not clean. Publisher names were inconsistent, author fields varied in format, and duplicate ISBNs existed for different editions of the same work. The normalization layer applied rules to standardize the data before it entered the relational database, ensuring that the catalog would be coherent and queryable rather than a faithful reflection of the source data’s imperfections.
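
A minimal sketch of what such rules look like in practice — the alias table and rule set below are hypothetical, standing in for whatever the 1995 system actually applied:

```python
# Illustrative normalization rules only; the actual 1995 rule set is
# not preserved. The alias table is hypothetical.
PUBLISHER_ALIASES = {
    "penguin books ltd": "Penguin Books",
    "penguin books ltd.": "Penguin Books",
    "penguin": "Penguin Books",
}

def normalize(record: dict) -> dict:
    """Standardize a parsed record before it enters the catalog."""
    rec = dict(record)
    key = " ".join(rec["publisher"].lower().split())     # case/spacing
    rec["publisher"] = PUBLISHER_ALIASES.get(key, rec["publisher"].strip())
    rec["isbn"] = rec["isbn"].replace("-", "").upper()   # canonical ISBN
    return rec

def deduplicate(records):
    """Keep the first record seen for each canonical ISBN."""
    seen, unique = set(), []
    for rec in map(normalize, records):
        if rec["isbn"] not in seen:
            seen.add(rec["isbn"])
            unique.append(rec)
    return unique

raw = [
    {"isbn": "0-306-40615-2", "publisher": "PENGUIN  BOOKS LTD"},
    {"isbn": "0306406152",    "publisher": "Penguin"},
]
print(deduplicate(raw))  # one record survives, with one publisher spelling
```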

  • 04
    Relational SQL Database Schema

    Designed and populated the SQL schema that would serve as the live catalog: tables for titles, authors, publishers, categories, and availability, with indexes appropriate for the query patterns a browsable online bookstore would generate. The schema was designed for automated repopulation — as new ISBNs were issued, the pipeline could ingest them without manual intervention.
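
The original schema is not preserved, but its outline can be sketched. The tables, columns, and indexes below are hypothetical, and SQLite stands in for the relational engine of the era:

```python
import sqlite3

# Hypothetical minimal catalog schema, illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE publishers (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL UNIQUE
    );
    CREATE TABLE titles (
        isbn         TEXT PRIMARY KEY,     -- natural key from source
        title        TEXT NOT NULL,
        author       TEXT,
        edition      TEXT,
        availability TEXT,
        publisher_id INTEGER REFERENCES publishers(id)
    );
    -- Indexes match the query patterns of a browsable storefront:
    CREATE INDEX idx_titles_title  ON titles(title);
    CREATE INDEX idx_titles_author ON titles(author);
""")

# Automated population: records flow straight from the pipeline.
conn.execute("INSERT INTO publishers (name) VALUES ('Example Press')")
conn.execute(
    "INSERT INTO titles (isbn, title, author, publisher_id) "
    "VALUES ('0000000000', 'Example Title', 'A. Author', 1)"
)
row = conn.execute(
    "SELECT title FROM titles WHERE isbn = '0000000000'"
).fetchone()
print(row[0])  # → Example Title
```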

  • 05
    Automation & Live Catalog Maintenance

    Documented the design and operational parameters for the system’s ongoing maintenance: how to refresh data from the ISBN source, how to handle updates and corrections to existing records, and how to scale the pipeline as the catalog grew beyond its initial seed. The system was designed to maintain itself, not to require human data entry as a permanent dependency.
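
The refresh behavior described above amounts to an upsert: new ISBNs are inserted, existing ones are corrected in place. An illustrative sketch, assuming a simplified single-table schema and using SQLite's modern upsert syntax as a stand-in for the era's equivalent:

```python
import sqlite3

# Simplified hypothetical schema for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE titles (
        isbn         TEXT PRIMARY KEY,
        title        TEXT NOT NULL,
        availability TEXT
    )
""")

def refresh(conn, records):
    """Apply one batch from the ISBN source: insert new ISBNs,
    update existing ones in place. No manual data entry."""
    conn.executemany(
        """INSERT INTO titles (isbn, title, availability)
           VALUES (:isbn, :title, :availability)
           ON CONFLICT(isbn) DO UPDATE SET
               title        = excluded.title,
               availability = excluded.availability""",
        records,
    )
    conn.commit()

# Initial seed, then a later correction to the same ISBN:
refresh(conn, [{"isbn": "0000000001", "title": "First Edition",
                "availability": "in print"}])
refresh(conn, [{"isbn": "0000000001", "title": "First Edition",
                "availability": "out of print"}])
print(conn.execute("SELECT availability FROM titles").fetchall())
# → [('out of print',)]
```

Because the same `refresh` path handles both seeding and ongoing corrections, the catalog maintains itself from the source feed rather than depending on human data entry.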

Two Designs, One System

When both proofs-of-concept were reviewed, the decision was made to merge them. The two architects — who had worked in parallel without coordination — were brought together to combine the strongest elements of each design into a unified implementation.

This outcome is more unusual than it might appear. A standard POC process produces a winner and a loser. Amazon's decision to synthesize rather than select reflected either a genuine recognition that both submissions contributed something the other lacked, or a pragmatic understanding that the problem was large enough to absorb both approaches. Either way, the result was a system that incorporated design decisions from two independent architectural perspectives.

The combined system became the technical backbone of the Amazon.com online bookstore. The RS-232 pipeline, the parsing and normalization logic, the relational schema — these were not throwaway prototypes. They were the working architecture that Amazon launched with.

What the Bookstore Became

It is difficult to trace a straight line from a 1995 proof-of-concept to the Amazon of today. Systems are rewritten. Architectures are replaced. The RS-232 connection and the original SQL schema are decades in the past. But the decision to build a dynamic, database-driven catalog — rather than manually curating a static list of titles — was the foundational choice that made everything downstream possible.

1995
The Proof of Concept

RS-232 ISBN pipeline, SQL catalog schema, dynamic bookstore architecture. Fred Lackey and one other architect, selected through a competitive process, build the system that Amazon launches with.

1997
One Million Titles

Amazon passes one million catalog entries. The scale of the original dynamic catalog vision proves out — a manually curated list would have been impossible to maintain at this volume.

1998
Beyond Books

Amazon begins selling music and DVDs. The catalog architecture that began with ISBN records expands to cover new product categories — the dynamic, database-driven model proves transferable.

Today
The Everything Store

The world’s largest e-commerce platform. Hundreds of millions of catalog entries. What began as a decision to dynamically populate a bookstore from a serial connection to an ISBN database became the founding architectural assumption of modern retail.

The question was never whether an online bookstore was possible. The question was whether it could populate itself. The answer was yes — and that answer changed everything.

Stack & Craft

The technology choices of 1995 were constrained by what 1995 had available. There was no cloud. There was no managed database. There was no package ecosystem to pull from. Every component of this system was built from first principles against the hardware and software primitives of the era.

Connectivity
  • RS-232 Serial
  • Hardware Interface
  • Physical Layer Protocol
Data Pipeline
  • ISBN Record Parsing
  • Field Extraction
  • Data Normalization
  • Deduplication Logic
Storage
  • Relational SQL Database
  • Catalog Schema Design
  • Query Optimization
  • Automated Population
Process
  • Proof of Concept
  • Competitive Selection
  • Collaborative Merge
  • Operational Documentation