Operational Data: Cheapest to Fix, Most Expensive to Ignore

Apr 16, 2026 | Digital Roadmap

Here’s a question worth sitting with for a minute. If you added up every freight surcharge your company paid last year because the weight or dimensions in your system didn’t match what the carrier measured, every hour your warehouse team spent working around data they didn’t trust, every customer complaint rooted in a shipment that arrived wrong, and every pick error caused by a barcode that returned the wrong quantity, what would that number be?

Most organizations can’t answer that. Not because the number is small, but because the costs are scattered across so many line items that nobody has ever connected them to a single root cause: the physical product data is wrong, incomplete, or was never captured in the first place.

Last issue, we built the compliance gate, the second refining station where regulatory data determines whether a product is cleared to move. The third station is Operational Data: Dimensions. Weights. Packaging configurations. UPCs. Freight classification. It is the station most organizations assume they already have under control, because products are shipping and orders are filling. That assumption is costing them money on every single shipment.

What operational data actually is, and why it comes third

Operational data is your products’ digital fingerprint as it relates to your physical operations. Not what it’s called, not what it’s approved to do. What it actually weighs, how big it is, what it’s packed in, and how many of them fit in a box. This is the data that warehouses consume at receiving, carriers consume at pickup, WMS systems consume at slotting, and e-commerce platforms consume every time a customer clicks “Add to Cart” and expects a shipping estimate that bears some resemblance to reality.

It comes third because the first two stations provide its inputs. Identity and clearance are prerequisites to documenting the physical reality of how a product moves through a supply chain.

What makes this station deceptive is how simple it looks. Dimensions, Weights, and Barcodes (Oh My!). Leadership looks at this and thinks: clerical work. Every organization shipping product already has some version of this data somewhere, in the ERP, on the shelf label, in somebody’s head. The assumption is that because products are moving, the data must be fine.

But it’s likely not fine. It’s just not failing loudly enough for anyone to notice. The symptoms show up everywhere, while the cause goes undetected.

The gap between “we have a product in inventory” and “we can fulfill an order accurately and profitably” lives at this station. Shipping and shipping correctly are very different things. Both cost money, but one costs a lot more, and it hides the evidence across freight invoices, warehouse labor reports, and customer complaints that never get traced back to a missing height dimension on a product that’s been in the catalog for six years.

The problem is that the digital fingerprint doesn’t actually match the physical product, and labor generally compensates for the gap.

The foundational question at this station, the one that every other data point depends on, is deceptively simple:

What is a one?

What defines the lowest sellable unit? Is a “one” a single blade, or a pack of five blades? Is it a foot of chain, or a roll? Is it an individual fastener, or a box of fifty? And once you’ve defined what a one is, what are the packaging tiers above it: case, master carton, pallet? How many ones fit in each tier? Every downstream system in the operation, from WMS to carrier integration to procurement to pricing to e-commerce, depends on the answer being clear, consistent, and standardized. When it’s not, the consequences are physical, expensive, and remarkably difficult to undo.

This question gets harder as the organization grows. A small company focused on one product line might not feel the friction. Start growing organically, adding product lines and manufacturer relationships, and the inconsistencies multiply with every new product someone sets up without a category-level standard to follow. Start growing through acquisition, and the problem surfaces overnight. Consolidating catalogs between organizations that each defined a one differently is one of the most painful exercises in B2B integration. What is arguably the biggest challenge I’ve seen organizations face is that there is no documentation, anywhere, that outlines how these decisions were made or what the standard should be. The strategy was never written down because it was never treated as strategy. It was treated as data entry.

This station also carries a manufacturer-distributor dynamic that creates its own friction. Manufacturers originate physical product data. They designed the packaging. They know the weights and dimensions. Distributors consume that data, but they also define their own selling units, which may not match the manufacturer’s packaging logic. A manufacturer sells fasteners by the box. The distributor wants to sell them by the piece. That translation layer, between how the product was manufactured and how the distributor wants to sell and fulfill it, is where a significant share of operational data problems originate.

E-commerce accelerates every one of these problems. Operational data was always important in traditional distribution. Counter sales, phone orders, experienced inside reps could compensate for data gaps because they knew the products. E-commerce doesn’t know anything. Every product page, every shipping estimate, every add-to-cart quantity, every warehouse pick ticket generated from a web order depends on this data being accurate, structured, and connected to the systems that act on it. The website doesn’t have tribal knowledge. It has fields.

What breaks when operational data is weak

The failures at this station tend to be quieter than a compliance violation and steadier than a shipment hold. They don’t announce themselves with a fine or a customs detention. They can often be resolved on the fly. They announce themselves with a thousand small costs, distributed across every warehouse shift, every freight invoice, and every customer interaction. None of them dramatic enough on their own to trigger a root cause investigation. All of them adding up to a number that would horrify leadership if anyone ever totaled it.

The unit of measure tax

At Fastenal, we had a problem with reciprocating saw blades that will sound familiar to anyone who has managed product data at scale. Two parts from the same supplier, within the same product line, set up with different unit of measure logic. One was a “one” that was actually a pack of five blades. The other was set up as five individual blades, meaning a quantity of five in the system was five individual blades, not five packs.

Same supplier. Same line. Two different people set them up at different times, applying different logic to the same question: what is a one? And both had a reason. An operations person looks at the product and sees a package. That’s a one. A salesperson looks at the same product and sees individual blades, because they’re trying to compare price per blade in a competitive quoting environment. Both perspectives make sense in isolation. Neither person had a category standard telling them which definition the business wanted.

The operations team built a cheat guide. Down to the part number. Pickers had to reference it to make sure they weren’t picking five packs when the order meant five blades, or one pack when the order meant one blade. The team set a no-break on one of the parts and a quantity override on the other. Manual workarounds, maintained by the warehouse, to compensate for a decision that was never coordinated at product creation.

And then those products sold well. Once they gained traction and got stocked across multiple distribution centers, the cost of unwinding the mistake became a project in itself. Discontinuing the incorrect setup means pulling inventory from every DC that stocks it. Repackaging. Re-slotting. Updating every system that references that part number. Communicating the change to every customer who has it on a recurring order. Rebuilding sales history under the corrected SKU.

That’s just the operations perspective. Now consider what happens across the rest of the organization. Inventory has to inflate to stock both the old product and the replacement simultaneously during the transition. Sales has to tell their customers that the part number they’ve been ordering is changing, because of a mistake that was not the sales team’s fault. The price isn’t changing, but the order quantity is. Try explaining that to a customer who’s been buying this product on a recurring PO for two years. Try explaining it to the sales rep who has to make that call.

It can be done. Ask any operations team about it and watch their expression. Then ask them about re-slotting and replacing or repackaging incorrect unit of measure products. You’ll get an answer that’s more colorful than anything I could write here.

This pattern doesn’t happen once. It happens across categories, across suppliers, across years of product creation by different people applying different logic to the same question. And every instance embeds itself deeper into the operation the longer the product stays active.

The selling unit versus the operational unit

Some products have a structural problem that isn’t about inconsistency at all. It’s about the product itself existing differently depending on whether you’re buying it, storing it, selling it, or fulfilling it.

Take chain. You sell it per foot. You buy it per roll, a roll being, say, 500 feet. You store it as a roll. You never want to break that roll unless a customer orders a cut length. You might even want to offer a discount at the full-roll level and make sure your teams never touch cutting that roll for small orders.

So what’s a one? If you set everything up as feet, your inventory system says you have 500 units. How does your operations team distinguish a full uncut roll from 500 feet of remnant pieces scattered across three locations? How do you offer a full-roll discount if the system doesn’t know what a roll is? How does the picker know which roll is earmarked for a full-roll order and which one has already been cut?

If you set it up as rolls, the customer who wants 50 feet of chain can’t order what they need without the system understanding the conversion and the cut process.
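To make the trade-off concrete, here is a minimal sketch of one possible resolution, assuming the roll is the stocking unit and the foot is the selling unit. The 500-foot roll size and the allocation rule are hypothetical; the point is that the system has to know what a roll is before any of this logic can exist.

```python
FEET_PER_ROLL = 500  # hypothetical: one uncut roll is 500 feet

def sellable_feet(full_rolls, remnant_feet):
    """Inventory exposed to the storefront, expressed in feet."""
    return full_rolls * FEET_PER_ROLL + remnant_feet

def allocate_cut(order_feet, full_rolls, remnants):
    """Prefer cutting from the shortest remnant that covers the order;
    break a full roll only when no remnant is long enough."""
    candidates = sorted(r for r in remnants if r >= order_feet)
    if candidates:
        return ("remnant", candidates[0])
    if full_rolls > 0 and order_feet <= FEET_PER_ROLL:
        return ("break_roll", FEET_PER_ROLL)
    return ("backorder", 0)

# 2 full rolls plus remnants of 60 ft and 30 ft on the shelf:
print(sellable_feet(2, 90))          # 1090 feet sellable
print(allocate_cut(50, 2, [60, 30])) # cut from the 60 ft remnant
```

If everything were stored as undifferentiated feet, neither function could be written, which is exactly the gap the picker's tribal knowledge ends up filling.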

This isn’t a mistake anyone made. It’s a design decision that has to be made, and the right answer depends entirely on how the organization wants to sell and service that product. A category owner who understands both the commercial intent and the operational reality needs to be the one making that call. The product data team can execute the structure once the decision is made, but the decision itself is a business call, not a data call. Because once the UOM is set in the ERP and the product starts transacting, the organization lives with that decision for the life of the product.

Freight cost and customer service exposure

Wrong weights or wrong dimensions mean wrong freight class. Wrong freight class means the carrier charges what they measure, not what the data says. The organization pays the difference on every affected shipment, sometimes for months before anyone connects the freight cost variance to a product data problem. Invoice disputes take weeks to resolve and rarely get resolved in the shipper’s favor.

What gets overlooked in these conversations is the customer side. When inaccurate product data changes the freight cost on an order, that cost change often hits the customer. On a one-time order, it’s an unpleasant surprise. On repeat orders, where the customer expects consistency, it becomes a service problem. The freight cost was one amount last month and a different amount this month for the same product, the same quantity, the same lane. The customer doesn’t see a data problem. They see an unreliable supplier. And they start looking at alternatives.

Freight invoices are, as it turns out, one of the most expensive ways to discover your data is wrong.

Warehouse operations: the cost of working around bad data

Without accurate dimensions and weights, slotting doesn’t work. Products end up in bins they don’t fit, aisles they block, or locations that don’t match their handling requirements. Pick efficiency drops. Damage increases. The warehouse team develops workarounds: mental maps of which products are actually bigger than the system says, sticky notes on shelves with the real weights, tribal knowledge about which bins to avoid for certain SKUs.

The WMS has the data. The data is wrong. So the team works around it, and the labor cost of those workarounds is absorbed into the warehouse operating budget where nobody questions it because it’s always been that way.

But flip this around and there’s a real opportunity hiding in the same data. If occupancy is a significant piece of SG&A, then accurate product volumetrics become a tool for optimization, not just error prevention. Clear visibility to the actual dimensions and weight of every product allows supply chain teams to maximize space utilization, optimize slotting for pick efficiency, and reduce the handling time that ultimately drives warehouse labor costs. The same data that prevents errors when it’s accurate creates measurable efficiency when it’s actually used to manage the facility. Accurate operational data doesn’t just stop bad things from happening. It makes good things possible.

Broken trucks and busted orders

You plan a truck based on what the system says will fit. The system says wrong. Too much product for the space, or the weight exceeds the limit. Partial shipments. Return trips. Expediting fees. The buyer who planned a full pallet pickup discovers that the pallet quantity in the system doesn’t match what’s actually on the pallet, and now they’re either paying for space they didn’t use or scrambling to fit product they didn’t plan for.

UPC and barcode failures

A receiver scans a barcode and the WMS doesn’t recognize it. Or it returns the wrong UOM level. Or the packaging quantity doesn’t match what the barcode says. The receiver stops, investigates, and manually processes the receipt. Multiply that interruption across hundreds of receiving events per day, and you start to understand why warehouse teams build their own reference sheets and stop trusting the system.

The customer experience base layer

A customer adds to cart and the shipping estimate is wrong because the dimensions are wrong. The product arrives in packaging the customer didn’t expect because the UOM description didn’t match the physical reality. The product page says “each” but the customer receives a box. Or worse, the product page says “box” and the customer receives one piece at a box price.

These aren’t fulfillment failures. They’re operational data failures that surface at the point of customer experience. Misshipments, inaccurate freight costs, wrong quantities: every one of them rooted in physical product data that was never captured accurately or never maintained.

The treacherous part is the silence. Unlike a compliance violation that triggers a fine or a SKU duplication that shows up in a catalog audit, operational data gaps hide. The product has a clean description, a correct category, a valid regulatory profile. Everything upstream looks healthy. But the weight is wrong, the dimensions are estimated, the UPC is mapped to the wrong tier, and the freight class hasn’t been validated since the product was set up four years ago. Those gaps ride silently through every transaction until someone, a carrier, a warehouse manager, a frustrated customer, surfaces them. By then, they’re not a data problem. They’re a cost problem, a service problem, and a credibility problem.

Building the measurement standard

If you’ve been following the series, this structure will feel familiar. The intake valve needed a filter with governance layers. The compliance gate needed a category-driven rulebook with toggles. The measurement station needs its own filter: a defined set of physical data requirements that every product must meet before it’s considered ready for the warehouse floor.

The measurement standard works differently than the filters at the first two stations. At Station 1, the filter was sequential, each layer building on the one before it. At Station 2, the filter was conditional, with the category activating which requirements applied. At Station 3, the filter is structural. Every product needs every layer. There are no toggles. A product either has accurate physical data or it doesn’t, and the warehouse discovers which one it is at the worst possible time.

Layer 1: UOM hierarchy definition

Everything else at this station depends on this. What is a one? What are the packaging tiers above it: case, master carton, pallet? How many ones fit in each tier?

The descriptor problem is worth addressing directly, because it’s where a surprising number of organizations get tripped up. “Box,” “bag,” “each,” “unit,” “pack” can all mean the same thing. A box is a one. A bag is a one. An each is a one. The descriptor helps a human understand what they’re looking at. It does nothing for the operational system. A good WMS doesn’t care whether it’s a box or a bag. It cares that a one is 12 inches long by 3 inches wide by 2 inches deep and weighs 1.4 pounds. That’s a one. Everything else is a label.
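In data terms, "everything else is a label" looks something like the sketch below: the descriptor is a display field, while the operational systems key off quantity, dimensions, and weight. The structure and field names are illustrative, not any particular WMS schema; the case tier values are made up.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UomTier:
    """One packaging tier in a product's UOM hierarchy."""
    descriptor: str       # human label only: "box", "bag", "each"
    qty_of_base: int      # how many "ones" this tier contains
    length_in: float      # the fields the WMS actually cares about
    width_in: float
    height_in: float
    gross_weight_lb: float

# The "one" from the text: 12 x 3 x 2 inches, 1.4 pounds.
base = UomTier("box", 1, 12.0, 3.0, 2.0, 1.4)
# A hypothetical case tier holding 24 of them.
case = UomTier("case", 24, 13.0, 12.5, 8.5, 35.0)
```

Swap "box" for "bag" in `base` and nothing operational changes. Change `qty_of_base` and everything downstream does.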

The irreversibility of this decision cannot be overstated. Once a UOM is set in the ERP and the product starts transacting, changing it isn’t a field update. It’s a project. Sales history is tied to that UOM. Pricing is built on it. Inventory counts reference it. Customer orders expect it. The longer the product sells, the more embedded the decision becomes, and the more expensive the correction gets. The saw blade pattern at Fastenal wasn’t a data entry error. It was a governance gap. No standard existed to ensure that two different people creating products in the same category would answer the question the same way.

And for products where the selling unit differs from the operational unit, whether that’s chain, hose, cable, wire, or anything sold by a measure that differs from how it’s stored, the UOM decision is a design exercise, not a data entry task. It requires a category owner who understands how the product will be bought, stored, picked, sold, and fulfilled. The product data team translates that decision into the right data structure. But the decision itself belongs to the person who understands the business intent behind the product, and it needs to happen before the product enters the ERP.

Layer 2: Weight capture at each packaging level

Gross weight at each UOM tier. Carriers price by weight first. The warehouse needs it for handling decisions and equipment requirements. Procurement needs it for freight cost planning. The e-commerce platform needs it for shipping estimates. Net weight, the weight of the product without the packaging, is useful for technical specifications, but the operational systems that consume this data need gross weight. The box weighs what the box weighs, and that’s what gets loaded on the truck.

This needs to be standardized. Pounds or kilograms. Pick one and enforce it across the catalog. When half the catalog is in pounds and half is in kilograms and nobody is sure which products are which, the conversion errors compound across every freight calculation, every warehouse handling decision, and every shipping estimate.

Layer 3: Dimensional capture at each packaging level

Length, width, and height at each UOM tier. Again, standardized. Inches or centimeters, not a mix. These feed the WMS for slotting decisions, the carrier integration for dimensional weight calculations, and the e-commerce platform for shipping estimates and packaging recommendations.

Dimensions are one of those data points that get estimated at product creation and never revisited. Someone eyeballs the box, types in “12 x 8 x 6,” and moves on. The actual box is 14 x 9 x 7. Close enough, until the slotting algorithm puts it in a bin designed for a 12 x 8 x 6, and the warehouse team discovers the problem when the product doesn’t fit. Measuring a box isn’t hard. Doing it once, doing it accurately, and entering it correctly would save a remarkable amount of downstream frustration. And yet.
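The cost of that "close enough" estimate is easy to quantify on the parcel side, where carriers bill the greater of actual and dimensional weight. A sketch, assuming a divisor of 139 cubic inches per pound, a common US-domestic parcel figure; your carrier contract may specify something different:

```python
import math

def dim_weight_lb(length_in, width_in, height_in, divisor=139):
    """Dimensional weight in pounds; 139 in^3/lb is a common
    US-domestic divisor (confirm against your carrier contract)."""
    return (length_in * width_in * height_in) / divisor

def billable_weight_lb(actual_lb, l, w, h):
    """Carriers bill the greater of actual and dimensional weight,
    typically rounded up to the next whole pound."""
    return math.ceil(max(actual_lb, dim_weight_lb(l, w, h)))

# A 3 lb product in the estimated box vs. the real one:
print(billable_weight_lb(3.0, 12, 8, 6))  # 5 lb billable
print(billable_weight_lb(3.0, 14, 9, 7))  # 7 lb billable
```

Two inches here and an inch there moved the billable weight 40 percent. The carrier measures the real box on every shipment, whether the catalog does or not.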

Layer 4: Freight classification

Freight class is derived from density, specifically weight per cubic foot, but it’s not a simple calculation for every product category. Some categories have specific NMFC classifications that override the density-based default. Hazmat products may carry freight class requirements dictated by their regulatory profile from Station 2. Getting freight class wrong doesn’t just affect one shipment. It affects every shipment of that product, on every lane, until someone catches it. And the carrier isn’t going to volunteer that you’ve been underclassifying.

This layer connects the weight and dimensional data from Layers 2 and 3 to the shipping cost structure. Freight class should be captured at the product level and connected to the carrier integration, not maintained in a spreadsheet that someone in logistics updates when they remember to.

Layer 5: UPC and barcode assignment with system linkage

A UPC at each packaging level: unit, case, pallet where applicable. But the number alone is just a number. What matters is the linkage. When a receiver scans that barcode, does the WMS return the correct packaging level? The correct quantity? The correct dimensions? The UPC is the bridge between the physical product sitting on the dock and the data sitting in the system. If the bridge is broken, if the UPC is wrong or unregistered or mapped to the wrong tier, the receiver is working blind. Every manual intervention that follows is time and accuracy the warehouse shouldn’t have to spend.
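The linkage amounts to a lookup that must resolve every scan to a SKU, a tier, and a quantity. A minimal sketch, with placeholder barcode values (not valid check-digited GTINs) and a hypothetical mapping:

```python
# Each barcode maps to (sku, packaging tier, quantity of base units).
BARCODE_MAP = {
    "012345678905":   ("SKU-1001", "each", 1),
    "10012345678902": ("SKU-1001", "case", 24),  # case-level code
}

def on_scan(barcode):
    """What the WMS should return when the receiver scans a label."""
    record = BARCODE_MAP.get(barcode)
    if record is None:
        # The failure mode from the text: the receiver stops,
        # investigates, and processes the receipt manually.
        return ("MANUAL_REVIEW", None, 0)
    return record

print(on_scan("012345678905"))    # resolves to the unit tier
print(on_scan("99999999999999"))  # unregistered -> manual intervention
```

Every entry missing from that map, and every entry pointing at the wrong tier, is a receiving interruption the warehouse pays for in labor.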

Layer 6: Packaging configuration documentation

How units nest within cases. How cases stack on pallets. The conversion math that procurement, warehouse, logistics, and e-commerce all need to function. How many ones in a case? How many cases on a pallet? What’s the pallet pattern: how many layers, how many cases per layer?

This is the layer that ties Layers 1 through 5 together into a complete physical profile. Without it, the individual data points exist but don’t connect. Procurement can’t calculate a pallet quantity. The warehouse can’t plan a receiving layout. E-commerce can’t offer case or pallet pricing. The conversion math needs to be documented in a way that systems can consume, not locked in a buyer’s notebook or a warehouse manager’s memory.
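The conversion math in question is arithmetic any system can do, provided the inputs exist as structured data rather than in a buyer's notebook. A sketch with hypothetical configuration values:

```python
def pallet_quantity(units_per_case, cases_per_layer, layers_per_pallet):
    """How many base units ('ones') ship on a full pallet — the
    figure procurement needs to negotiate a pallet-level price break."""
    cases_per_pallet = cases_per_layer * layers_per_pallet
    return cases_per_pallet * units_per_case

# Hypothetical configuration: 24 units per case, 10 cases per layer,
# 4 layers per pallet.
print(pallet_quantity(24, 10, 4))  # 960 units per pallet
```

The math is trivial. The point is where the three inputs live: if any of them exists only in someone's memory, every system that needs the 960 is guessing.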

What “good” looks like at this stage

As with the first two stations, “good” here is not a catalog where every product has been measured to the millimeter and verified against laboratory specifications. That’s the destination. Good is the organization having a defined measurement standard, a process for capturing and validating physical data, and the ability to identify where gaps exist.

Every active product has a defined UOM hierarchy with documented conversion quantities. What is a one, what is a case, what is a pallet. How many ones in each tier. Documented, not assumed. When a new person takes over the category, the standard is there to follow, not a conversation they need to have with the person who set the products up three years ago.

Weights and dimensions exist at each packaging level and are structured for system consumption. The WMS uses them for slotting. The carrier integration uses them for freight calculations. The e-commerce platform uses them for shipping estimates. All three are pulling from the same source, and that source reflects what the product actually weighs and how big it actually is.

Freight class is captured at the product level and connected to shipping systems. Not looked up manually at the time of shipment. Not managed through a spreadsheet that one person in logistics maintains. Captured, connected, and auditable.

UPCs are assigned at each packaging level and verified against what the WMS returns when scanned. When a receiver scans a case barcode, the system returns the correct tier, the correct quantity, and the correct physical attributes. No guesswork. No cheat sheets. No “yeah, ignore that barcode, use this one instead.”

Warehouse teams are not maintaining shadow systems. This is the clearest signal. When the people closest to the physical product stop building their own reference materials to compensate for data the system should carry, you know the measurement station is functioning. When they’re still taping sticky notes to shelves with the real weights, you know it isn’t.

New products entering inventory arrive against a defined operational data standard. The receiving team knows exactly what data needs to be captured and validated before the product goes to a bin. The standard exists. It’s documented. It’s not a suggestion.

The leadership mindset for this stage

There’s a temptation to treat this station as a data management problem. Dimensions go in fields. Weights get captured. Barcodes get assigned. It looks like clerical work. It invites delegation to the most junior person available.

That instinct is wrong, and it’s wrong in a way that costs the organization far more than the salary of the person who should actually be making these decisions.

The operational data decisions at this station are business strategy decisions disguised as data entry tasks. And the leadership failure that creates the most damage here isn’t that nobody owns the data. It’s that nobody has documented the category strategy that should be driving the data decisions in the first place.

Think about what happens when a new product line gets created. Understanding the suppliers who provide the line. Understanding how the sales team wants to sell it. How the organization wants to go to market. How pricing and discount structures will work. How you want to buy. How you want to fulfill and service. How you want to ship. These are all category strategy questions, and they all intersect with operational data decisions, starting with the unit of measure.

Setting UOM standards at the category level means deciding how you communicate what a unit is to your customers and to your teams. Consistently. So that the person who manages the category after you understands the standard and doesn’t unintentionally derail it by going in a different direction. This isn’t documentation for documentation’s sake. It’s protecting the operational integrity of decisions that are embedded in the ERP and lived with for the life of every product in that category.

Distributors carry a unique burden here that manufacturers don’t always appreciate. A manufacturer knows their product. They designed the packaging. Their unit of measure makes sense internally. But a distributor sits across five manufacturers in the same category, each with a different UOM logic, and the distributor has to decide: do we present this consistently to our customers, or do we accept whatever each manufacturer gives us?

That’s a selling strategy decision. It determines how every product in that category gets set up, priced, picked, shipped, and experienced by the customer. If it doesn’t get made deliberately, it gets made accidentally, by whoever happens to set up the next product. And the inconsistency compounds with every new product line, every new manufacturer relationship, every new hire who brings their own interpretation of what a one should be.

The smaller and simpler the organization, the less visible this problem is. You’re partnering with one or two brands per category, the product line is manageable, and experienced people know the products well enough to compensate. But the moment the organization starts pursuing the endless aisle, the infinite product catalog, the ability to sell anything from any manufacturer whether you stock it or not, these decisions matter at scale. Every new product added without a category-level standard is another potential inconsistency embedded in the system.

The underappreciation trap is real. Technically minded teams treat this station as a data management problem to streamline. Capture the fields, normalize the formats, build the integrations. But the decisions haven’t been made. The governance can’t exist without the strategy. Without category-level decisions about selling units, operational units, and packaging standards, the technical team is building structure around nothing. And it breaks down very quickly.

These decisions belong on the business side. Close to the sales and commercial teams. The further you get into the technical side, the further you get from understanding what you’re trying to solve, sell, and strategically position the company for. I’ve watched very capable technical teams build clean, efficient data management processes around operational data that was set up wrong from a business perspective. The plumbing was perfect. The water was contaminated at the source.

There’s also a visibility problem that leadership needs to own. Most leaders only see the symptoms of bad operational data. They see freight cost overruns, warehouse labor issues, and customer complaints. They don’t see the root cause because nobody is surfacing it in terms they can act on. A few simple questions can change that. How many parts entering a facility right now are missing weight or dimensions? How many active products lack a pallet quantity that the purchasing team could use to negotiate a price break? How many products don’t have barcodes, creating inefficient labor within the distribution team? If leadership can’t answer those questions, they have no visibility into where the opportunity for improvement lies. Setting those expectations for the distribution teams, giving them a clear picture of what complete operational data looks like and where the gaps are, turns this from a vague “data quality” conversation into a measurable improvement initiative.

Here’s the other side of this. Outside of unit of measure, which is a creation-time design decision that’s genuinely difficult to fix retroactively, the rest of operational data is arguably the easiest area in the entire refinery to create measurable value. Weights. Dimensions. Freight class. UPCs. These are capturable. They’re correctable. They’re directly translatable to cost reduction in freight, warehouse labor, and customer service. This is the station where the return on investment shows up fastest and most visibly in the P&L. The work isn’t glamorous. The results are.

Where to start

Pick one product category with high shipping volume. Not the most complex category. Not the one with the most regulatory requirements. The one that moves the most product through the warehouse and onto trucks. Pull the active SKUs in that category and evaluate them against five questions.

UOM hierarchy: Does every SKU have a defined UOM hierarchy, with a clear definition of what a one is, what a case is, what a pallet is, along with documented conversion quantities? Or are there products where the answer depends on who you ask?

Weight: Do you have gross weight at each packaging level? Not estimated weight. Not weight from the manufacturer’s spec sheet that was published before they redesigned the packaging. Actual, verified, current gross weight.

Dimensions: Do you have length, width, and height at each packaging level, in standardized units? And when was the last time anyone verified that those numbers match what’s actually sitting on the shelf?

Freight class: Is freight class captured at the product level and connected to your carrier integration, or is it being looked up manually at the time of shipment by someone in logistics who knows what to do?

Barcode integrity: If your warehouse team scans a barcode on any product in this category right now, does the WMS return the correct packaging level and quantity? Or does the team already know which barcodes to trust and which ones to work around?

Five questions, one high-volume category. What it reveals is whether your measurement station is producing reliable data or whether your warehouse and logistics teams have been quietly compensating for gaps that the system should be filling, and billing you for the privilege on every freight invoice and labor report.
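The audit itself can start as nothing more than a gap report over the category's SKUs. A sketch, where the field names are hypothetical stand-ins for whatever your ERP actually calls these attributes:

```python
# Fields corresponding to the five questions; names are illustrative.
REQUIRED = ["uom_hierarchy", "gross_weight_lb", "dimensions_in",
            "freight_class", "barcode_verified"]

def gap_report(products):
    """Return, per required field, the list of SKUs missing it."""
    gaps = {field: [] for field in REQUIRED}
    for p in products:
        for field in REQUIRED:
            if not p.get(field):
                gaps[field].append(p["sku"])
    return gaps

# Two hypothetical SKUs: one complete, one with gaps.
catalog = [
    {"sku": "A-100", "uom_hierarchy": True, "gross_weight_lb": 1.4,
     "dimensions_in": (12, 3, 2), "freight_class": 92.5,
     "barcode_verified": True},
    {"sku": "A-101", "uom_hierarchy": True, "gross_weight_lb": None,
     "dimensions_in": None, "freight_class": None,
     "barcode_verified": False},
]
print(gap_report(catalog))
```

An afternoon of this against one high-volume category usually produces a number leadership can act on, which is more than most "data quality" conversations ever deliver.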

What comes next

Once you can trust what a product is, whether it’s cleared to move, and how it physically exists in your operation, the next question is commercial: how does the organization buy it, price it, and commit to delivering it?

Next issue, we move to the fourth refining station: Fulfillment & Pricing. Lead times, minimum order quantities, pricing conversions, supplier cost structures. It’s where the physical truth about the product meets the commercial promises the business makes to its customers, and where the distance between those two things determines whether the organization is operating on confidence or on hope.

Need a GPS for your digital roadmap? How can we help? info@b2b-squared.com | b2b-squared.com

