Digital SKU (Stage 1): The Intake Valve

Mar 11, 2026 | Digital Roadmap

Where Product Identity Enters the Refinery

Last issue, we laid the lifecycle map on the table — six refining stages, in a fixed sequence, each one producing a higher grade of product data for the next stage to consume.

Now we walk into the refinery.

The first station is SKU Definitions. It is not the most exciting stage. It is not the one leadership asks about in quarterly reviews. But it is where the fuel enters the system — and if the identity of a product can’t be trusted at intake, nothing downstream will be stable.

This is the intake valve. Everything that follows depends on what it lets through.

What SKU Definitions Actually Are — and Why They Come First

A quick framing note before we go further. At this point in the series, we are not operating a fully built refinery. We are learning how to build one. Blog 7 laid out the blueprint. Now, at Station 1, we’re constructing the intake valve and installing the filter. If your organization doesn’t have this station running yet, you’re not behind — you’re in the right place.

SKU Definitions are the answer to the most fundamental question the business has to resolve about any product: what is this thing, and can we describe it consistently?

That sounds simple. It is not.

In some organizations, product creation is tightly controlled. One or two people create every SKU, following a single process. Consistency comes almost by default — the same people making the same decisions the same way. The intake valve is narrow, and what gets through is relatively clean.

But in many B2B distributors, the reality is radically different. You may have hundreds — even thousands — of locations with the ability to create products independently and immediately. Each location is solving for the customer standing at the counter or on the phone right now. They’re creating a part number, entering a description, assigning a unit of measure, and moving on — because the immediate priority is service, not data governance.

That speed is a genuine business advantage. It empowers employees to say “yes” to customers more often, and more quickly. It should not be dismissed. But it’s more than just a service advantage. Every SKU created at the point of need is also a data point about what the market is asking for. New product lines the organization didn’t know it needed. Suppliers it should be engaging. Pricing gaps it hasn’t identified. Customer buying patterns within industries it wasn’t tracking. Decisions made closest to the customer give the business its clearest view of the customer. That raw material pool isn’t a mess to be tolerated — it’s the asset.

The problem has never been that the pool is too big or too murky. The problem is that there’s no filter between the pool and the refinery. Everything flows in undifferentiated, and every downstream station has to sort through it before it can do its job.

And every SKU created in that moment has the potential to eventually serve the warehouse, the website, the pricing engine, customer service, and every other downstream function. What gets created at the counter rarely arrives ready to do all of that. But the raw material is there — and the opportunity it represents is enormous if the organization builds the infrastructure to refine it.

This is where B2B complexity separates from consumer. When a B2C company creates a SKU, the intended lifecycle is almost always known. It’s going on a shelf. It’s (likely) going on Amazon. The UPC gets scanned, the data populates, the customer knows what they’re getting.

In B2B, the intended use of a SKU at the point of creation is often uncertain. Is this a component repair item — a part that only exists as a replacement piece inside a larger assembly, one that will never be promoted or merchandised beyond its role as a component? Or is this a full-scale, merchandised product that needs to be imported from overseas, classified, priced across channels, warehoused in a distribution center, and presented on a website? Or is it something in between — a product that starts as a simple transaction today but might need to become something more tomorrow?

That uncertainty is baked into the first refining station from day one. And it’s why this stage comes first in the sequence: every downstream stage consumes what this one produces. Regulatory can’t classify a product whose identity isn’t stable. Operations can’t assign dimensions and packaging to a SKU that exists in three different versions. Fulfillment can’t price what it can’t clearly identify. Marketing can’t merchandise what nobody trusts.

If contamination enters the refinery at the intake valve, it flows through every station that follows.

What Breaks When SKU Definitions Are Weak

When the intake valve isn’t governed, the problems don’t announce themselves immediately. They accumulate — quietly, across the catalog, across locations, across years — until the organization tries to do something that requires trust in product identity. That’s when the cost becomes visible.

The first thing that breaks is visibility into your own catalog. When multiple locations can create products independently — from manufacturers, from master distributors, from local sources — you end up with redundant items that nobody recognizes as redundant. The same product lives under multiple identities, set up by different people at different times with different descriptions and different part number structures. The data captured at creation isn’t rich enough to connect them. You can’t leverage your own inventory because you don’t know what you already have. (If that sounds like a problem that should have been solved years ago, you’re right. It should have been.)

Unit of measure inconsistency follows. One category owner sets up a product line of zip ties with a unit of measure of “bag.” The next person comes along and sets up the same type of product with a unit of measure of “each.” Nothing systematic governs which is correct, and now you have customers attempting to buy zip ties per individual tie — which is unheard of in the industry, but perfectly logical if that’s what the data says. Multiply this across thousands of product lines, and you have a catalog that looks complete on the surface but is wildly inconsistent underneath. “Here you go Mr. Customer, check out our massive catalog and enjoy your shopping experience!”

Then the conversion cascade begins. When the organization decides to standardize — collapsing duplicates under one SKU, aligning to one unit of measure — someone’s customer has to change. That means customer service calls explaining that the part number is changing on the back end. It means the invoice might look different. It means the unit of measure the customer has been ordering by for years is being converted, and the company has to capture that conversion somewhere in the ERP. Every one of those conversions is a potential point of failure, and every one creates a customer experience disruption that teams have to manage manually. The mess being cleaned up at Station 1 temporarily creates its own mess at the customer interface — which is precisely why most organizations avoid doing it until the cost of not doing it becomes impossible to ignore.

The inconsistency becomes externally visible when teams try to mature products for e-commerce. Products within the same line show up with different units of measure, different descriptions, different structures — not because anyone made an error, but because no governance existed at the point of creation to prevent it. The digital experience inherits and amplifies every inconsistency that the first refining station allowed through. You can invest in photography, product descriptions, taxonomy, and search optimization — and all of it sits on top of an identity that can’t be trusted.

And here is the part that doesn’t appear in most data management discussions: the organization often has tens of thousands — sometimes millions — of SKUs that were only ever intended to be component items. Repair parts. Replacement pieces. Items that exist solely as a component of a larger assembly and were never meant to be enriched, promoted, or merchandised independently. When there’s no classification at the point of creation, those items sit in the same catalog as fully merchandisable products with no systematic way to distinguish between them. Every enrichment initiative, every data quality project, every web catalog launch has to sort through the entire population before it can focus on what actually matters. The intake valve let everything through without labeling any of it.

Building the Filter

To be fair, most organizations don’t have zero governance at the point of creation. There is almost always a minimum set of fields required to create a SKU in the ERP. In most distributors, that baseline looks something like this: a vendor part number, a vendor code or designation within the ERP (often selected by searching for the vendor’s name), a unit of measure, a weight, a vendor cost, and a basic product description.

A note on terminology, because it matters here: “vendor” and “supplier” are generally used interchangeably across the industry. Both are umbrella terms. Underneath that umbrella, there are two distinct types — manufacturers (the source of truth for the product) and distributors or master distributors (who are redistributing the manufacturer’s product). This distinction will become important in a moment.

That baseline set of required fields is sufficient for the ERP to do its core job. It enables the organization to buy, quote, invoice, and sell. That’s the transactional foundation — and for many SKUs, it may be all that’s needed. But those fields alone don’t tell you whether this product is a duplicate, whether the unit of measure matches the rest of the category, whether you’re sourcing from the manufacturer or a redistributor, or whether this SKU is intended to go any further than a basic transaction.

The filter adds the governance that answers those questions at the point of entry — before the SKU moves downstream. It doesn’t replace the existing creation process. It builds on top of it.

The first layer of the filter is category confirmation and association. This is the step that makes the rest of the filter possible, because nearly every other governance decision — unit of measure standards, enrichment expectations, catalog placement — is tied to knowing what category a product belongs to. The part creation process doesn’t always require a category assignment, and even when it does, the person creating the SKU may not fully understand the taxonomy. A branch employee creating a product at the counter knows what the customer needs — they don’t necessarily know where that product fits within a structured category hierarchy. So the filter has to account for two scenarios: either the category is assigned during creation and needs to be confirmed by someone with category expertise, or it isn’t assigned at creation and needs to be picked up by the category team afterward based on their knowledge of the product and the vendor it comes from. Either way, correct category placement has to be resolved before the downstream governance layers can function reliably. Without it, category-level standards have nothing to attach to.

The second layer is unit of measure governance at the category level. Once a product is correctly categorized, standard units of measure can be enforced at that level. If you know a product is a zip tie, the expected UOM is predetermined. You don’t get to choose. The taxonomy team or category owners set these standards because if the organization is ever going to market and present a catalog of that product line, every item in the line should follow the same unit of measure. There will be fringe cases — there always are — but the standard is the standard, and exceptions are handled as exceptions rather than treated as normal. The reality is that different manufacturers within the same product category will sometimes ship in different units. When that happens, the distributor needs to set their own standard UOM and manage the conversion within the ERP. Most B2B ERPs have the capability to handle these transformations. It is often underutilized by business teams simply because the governance decision was never made.
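The category-level standard plus an ERP-style conversion table can be sketched in a few lines. Everything here is illustrative — the category keys, conversion factors, and pack sizes are assumptions for the sketch, not data from any specific ERP:

```python
# Hypothetical sketch of category-level unit-of-measure governance.
# Category names, UOMs, and pack sizes are illustrative assumptions.

CATEGORY_UOM_STANDARDS = {
    "fasteners/zip-ties": "bag",
    "office/staples": "box",
}

# Conversion factors from a vendor's shipping UOM to the standard UOM,
# analogous to the UOM conversion tables most B2B ERPs maintain.
UOM_CONVERSIONS = {
    ("each", "bag"): 1 / 100,   # assumption: 100 ties per bag
    ("case", "box"): 10,        # assumption: 10 boxes per case
}

def standard_uom(category: str) -> str:
    """Return the governed UOM for a category; creators don't get to choose."""
    try:
        return CATEGORY_UOM_STANDARDS[category]
    except KeyError:
        raise ValueError(f"No UOM standard defined for category {category!r}")

def to_standard_qty(category: str, vendor_uom: str, qty: float) -> float:
    """Convert a vendor quantity into the category's standard UOM."""
    target = standard_uom(category)
    if vendor_uom == target:
        return qty
    factor = UOM_CONVERSIONS.get((vendor_uom, target))
    if factor is None:
        raise ValueError(f"No conversion from {vendor_uom!r} to {target!r}")
    return qty * factor
```

The point of the sketch is the lookup order: category first, then UOM, then conversion — a vendor shipping zip ties as “each” gets converted into the category’s standard “bag” rather than creating a second unit in the catalog.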

The third layer is UPC capture at creation. The Universal Product Code is the single simplest mechanism for identifying duplicate SKUs. Requiring it at the point of creation — at each package level where it exists (unit, case, and sometimes master pack) — gives the organization a matching key that cuts through differences in part numbers, descriptions, and vendor structures. UPCs trace back to the manufacturer, not the master distributor, which means two SKUs sourced from two different vendors but representing the same manufacturer’s product can be connected. Not every product has a UPC — non-branded commodity items, specialty parts, and custom-manufactured products often don’t — but the majority of branded items do. Capturing the UPC doesn’t solve every deduplication problem, but it solves the most common one, and it’s a data point that’s almost always available at the point of creation for the products that carry it.
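To show why the UPC works as a matching key across different part numbers and vendors, here is a minimal duplicate check — the SKU records, part numbers, and UPC values are invented for illustration:

```python
from collections import defaultdict

# Hypothetical SKU records; field names and values are illustrative.
skus = [
    {"sku": "BR-10442", "vendor": "AcmeMfg",    "upc": "036000291452"},
    {"sku": "LOC7-881", "vendor": "MasterDist", "upc": "0-36000-29145-2"},
    {"sku": "BR-55010", "vendor": "AcmeMfg",    "upc": ""},  # no UPC captured
]

def normalize_upc(upc: str) -> str:
    """Strip separators so the same code matches across entry styles."""
    return "".join(ch for ch in upc if ch.isdigit())

def duplicate_groups(records):
    """Group SKUs by normalized UPC; any group with 2+ entries is a likely duplicate."""
    groups = defaultdict(list)
    for rec in records:
        key = normalize_upc(rec["upc"])
        if key:  # SKUs without a UPC can't be matched this way
            groups[key].append(rec["sku"])
    return {upc: ids for upc, ids in groups.items() if len(ids) > 1}
```

Here the branch-created SKU and the master-distributor SKU carry the same manufacturer UPC entered in two different styles, so they land in the same group — exactly the connection that part numbers and descriptions alone can’t make.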

The fourth layer is vendor governance — and this is where a single mechanism can solve multiple problems at once. Most ERPs already require a vendor code to create a SKU. That vendor code maps to a vendor record. If the organization establishes a simple designation within that vendor record — manufacturer versus distributor — then at the point of creation, the system already knows whether the information being received is coming from the source of truth or from a secondary source redistributing the manufacturer’s product. If that vendor already exists in the system, someone has already created it, and you immediately have a connection point for identifying overlapping or redundant sourcing.

From that same vendor-level logic, you can derive intent signals. Products coming from master distributors are being supplied to you and your competitors alike — it is likely that these products are ultimately going to be merchandisable and sellable on a website. Products sourced directly from a manufacturer, on the other hand, might include customer-specific items — custom prints, special orders, parts made to a particular customer’s specification that should never be marketed or promoted to other customers. A simple question at the point of creation — is this a standard item or a customer-specific item? — gives the creator an opportunity to provide input that gets passed forward to the teams responsible for enrichment. It’s not a perfect classification system. But it’s a starting point that provides a signal where none previously existed.
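That decision logic is simple enough to express directly. The function below is a hypothetical sketch; the vendor-type values and signal labels are assumptions for illustration, not standard ERP fields:

```python
def intent_signal(vendor_type: str, customer_specific: bool) -> str:
    """
    Derive a rough enrichment-intent signal at SKU creation.

    vendor_type: "manufacturer" or "distributor", read from the vendor record.
    customer_specific: the creator's answer to "standard or customer-specific item?"
    """
    if customer_specific:
        # Custom prints and made-to-spec parts should never be promoted
        # to other customers, regardless of source.
        return "do-not-merchandise"
    if vendor_type == "distributor":
        # Master distributors supply the whole market; these items are
        # likely to end up merchandisable on a website.
        return "likely-merchandisable"
    # Manufacturer-direct standard item: needs a human look downstream.
    return "review"
```

It is deliberately crude — two inputs, three outputs — but that is the argument of this layer: even a coarse signal captured at creation beats no signal at all.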

Five layers. Category confirmation. Category-level UOM standards. UPC capture. Vendor designation. And a basic intent classification informed by vendor type and creator input. None of these require a new system. None of them require a massive implementation project. They require governance decisions — made once, enforced consistently, and maintained over time. That’s the filter. And it’s what separates an intake valve from an open pipe.

What “Good” Looks Like at This Stage

Let’s be direct about something: “good” at this stage is not a pristine, fully governed end state where every SKU in the catalog is perfect. If you’re waiting for that before you call the first refining station healthy, you’ll be waiting a long time.

Good at this stage means operational readiness. It means the organization has identified what downstream enrichment and execution require of a SKU, and is building the infrastructure to deliver it — even if the full catalog hasn’t been converted yet.

In practical terms, this looks like a few observable things.

The organization knows, at the point of creation, what a SKU is intended to be. Not every detail — but the classification. Is this a component item that only needs to exist for transactional purposes? Or is this a product that will eventually need to be fully enriched, warehoused, priced across channels, and presented to customers digitally? That distinction matters because it tells the rest of the refinery what to expect and what level of investment each SKU warrants.

Naming conventions and unit of measure standards exist and are enforced — not by individual discipline, but by process. When someone creates a new product in a given category, they don’t get to decide on the unit of measure by personal preference. That decision has already been made at the category level and is applied consistently, regardless of which location or which person initiates the creation.

The organization has a mechanism — or is actively building one — for discontinuing, reissuing, and consolidating SKUs with minimal disruption to customers. This is the clearest signal of maturity at Station 1, because it means the business has accepted that SKU cleanup isn’t a one-time project. It’s a continuous operational capability. Products get created, standards evolve, consolidation happens, and the organization can execute those transitions efficiently rather than treating every change as a customer service crisis.

New product introductions — whether activating entire manufacturer catalogs or strategically building product lines a few items at a time — follow a defined workflow that captures enough information at the point of entry to prevent the duplication, inconsistency, and orphan problems that compound over time.

None of this requires perfection. It requires recognition of what the downstream refining stations need, and deliberate progress toward delivering it.

Leadership Mindset for This Stage

SKU Definitions sit in an unusual position within most organizations. The stage is foundational — arguably the most foundational in the entire lifecycle — but it rarely gets the leadership attention that later stages demand. Nobody walks into a board meeting asking about SKU naming conventions. They ask about the website, the digital experience, the customer-facing capabilities. The intake valve does its work quietly, and when it works well, nobody notices.

When it doesn’t work well, everybody notices — they just don’t trace the problem back to its source.

Here is what leadership needs to understand about this stage: the raw material your people are generating every day — every SKU created at a branch, at a counter, on the phone — is the single biggest untapped asset in your data operation. Those locations creating products at the point of need are not just servicing customers. They are telling you where the market is moving. Every product created is a signal — a new product line the organization didn’t know it needed, a supplier it should be engaging, a pricing opportunity it hasn’t captured, a buying pattern within an industry it wasn’t tracking. This is your largest, most direct source of market intelligence. It’s just arriving unrefined.

In decentralized businesses, this pool of raw material is also a genuine competitive advantage. Competitors looking at the catalog from the outside can’t cross-reference against it, quote against it, or service the customer as quickly. What looks like chaos from the inside can look like a moat from the outside.

But an unfiltered lake of raw material is still just a lake. The asset only becomes valuable when the organization can refine it. And that’s what this stage builds: the filter. Data governance at the point of creation — naming conventions, unit of measure standards, classification of intended use — is the filter that determines what can move cleanly into the next refining station and what needs to be addressed first. The filter doesn’t shrink the lake. It makes the lake usable.

Leadership’s job is not to slow down the creation of raw material. It’s to ensure that a filter exists, that someone owns it, and that the organization treats it as infrastructure rather than an afterthought. Not every individual SKU — that would be impossible in a decentralized environment — but the rules that every SKU follows. The naming conventions. The unit of measure governance. The classification of intended use. The process for transitioning a SKU when standards change. If nobody owns these things, you don’t have an intake valve. You have an open pipe. And you cannot blame the refinery for what comes out the other end when the input was never filtered.

This is governance, not control. The distinction matters. Control says “only these two people can create a product.” Governance says “anyone can create a product, but it follows these rules, and someone is accountable for maintaining those rules over time.” One restricts. The other scales.

Where to Start

If this resonates, here is one thing you can do before the next issue.

Pick one product line. Not your entire catalog — one line. Pull every active SKU in that line and evaluate them against five questions:

First: Is there a consistent naming convention across every SKU in this line?

Second: Is the base unit of measure standardized, or are different SKUs in the same line using different units?

Third: Can you identify every duplicate or location-specific variant — and do you know which one is the standard?

Fourth: Do you know which of these SKUs are component or repair items versus fully merchandisable products?

Fifth: How does this product line look from the customer’s perspective?

That’s it. Five questions, one product line. It won’t take long. What it will tell you is whether your first refining station is producing clean fuel or whether downstream teams are quietly compensating for what the intake valve is letting through.

If the answers are clean, your Station 1 is healthier than most. If they aren’t — and for most B2B organizations, they won’t be — you now have a specific, observable starting point for improvement. Not a catalog-wide overhaul. Not a multi-year data initiative. One product line, five questions, and a clear picture of where you stand.
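For the mechanical parts of the exercise (questions one through four), a short script can do the first pass — the records, the naming pattern, and the field names below are all invented for illustration, and the fifth question still takes a human looking at the catalog the way a customer would:

```python
import re
from collections import Counter

# Hypothetical records for one product line; field names are illustrative.
line = [
    {"sku": "ZT-0400-BLK", "uom": "bag",  "upc": "036000291452", "type": "merch"},
    {"sku": "ZT-0400-NAT", "uom": "bag",  "upc": "036000291469", "type": "merch"},
    {"sku": "ZIPTIE4BK",   "uom": "each", "upc": "036000291452", "type": ""},
]

# Assumed house naming convention for this line, for the sake of the sketch.
NAME_PATTERN = re.compile(r"^ZT-\d{4}-[A-Z]{3}$")

def audit(records):
    """First-pass answers to questions 1-4 for a single product line."""
    uoms = Counter(r["uom"] for r in records)
    upcs = Counter(r["upc"] for r in records if r["upc"])
    return {
        # Q1: does every SKU follow the naming convention?
        "naming_consistent": all(NAME_PATTERN.match(r["sku"]) for r in records),
        # Q2: is there exactly one unit of measure in the line?
        "uom_standardized": len(uoms) == 1,
        # Q3: which UPCs appear under more than one SKU (likely duplicates)?
        "duplicate_upcs": [u for u, n in upcs.items() if n > 1],
        # Q4: which SKUs have no component/merchandisable classification?
        "unclassified": [r["sku"] for r in records if not r["type"]],
    }
```

On the sample data, the third record fails every check at once — a rogue name, a rogue unit of measure, a duplicate UPC, and no classification — which is exactly the profile of a counter-created SKU that slipped through an ungoverned intake valve.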

What’s Next

Once you can trust the identity of a product — what it is, how it’s described, what it’s intended to be — the next question becomes whether that product can legally and safely move through your operation.

Next issue, we move to the second refining station: Regulatory Data. It’s the stage that determines whether a product is even eligible to proceed through the rest of the lifecycle — and the one most organizations discover the hard way they can’t afford to skip.
