# Bespoke Engineering in Industrial AI

The dirty secret of industrial AI: 60-70% of every deployment is custom engineering that has nothing to do with AI.

Where the hours go:

- **Data pipeline construction.** Every plant runs a different historian system (OSIsoft PI, Aspen InfoPlus.21, AVEVA), so every integration is bespoke.
- **Domain-specific event labelling.** Tagging "this is a fouling event" or "this is a grade change" requires a process engineer who understands the specific chemistry; it cannot be automated without domain expertise.
- **Threshold calibration and alarm tuning.** Every plant has different normal operating ranges, and tuning alarm thresholds requires operator buy-in and weeks of iteration.
- **Model validation against plant data.** The most time-consuming step: proving that the model's predictions match reality under real operating conditions. There are no shortcuts.
- **Change management.** Getting operators to trust and use the system is high-touch, does not compress with scale, and is often the difference between a deployed system and shelfware.

The implication for investors: when a company claims "60-70% platform reuse," ask for the itemized hours breakdown across 3+ deployments. If total hours per deployment aren't decreasing, the "reusable platform" may be real, but the bespoke 30-40% is where all the time and cost lives.

Related: [[Deployment Velocity]], [[Consultancy-to-Platform Transition]], [[Industrial AI Unit Economics]], [[Industrial AI MOC]]

---
Tags: #deeptech
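The investor check above can be sketched as a simple calculation. All figures below are hypothetical, invented purely to illustrate the test: if per-deployment totals barely move while the vendor claims 60-70% reuse, the bespoke work is absorbing the hours.

```python
# Hypothetical hours breakdown across three deployments of the same "platform".
# Category names and all numbers are illustrative assumptions, not real data.
deployments = [
    {"pipeline": 400, "labelling": 300, "tuning": 200, "validation": 500, "change_mgmt": 350},
    {"pipeline": 380, "labelling": 290, "tuning": 190, "validation": 490, "change_mgmt": 350},
    {"pipeline": 370, "labelling": 285, "tuning": 185, "validation": 485, "change_mgmt": 350},
]

# Total engineering hours per deployment.
totals = [sum(d.values()) for d in deployments]
print(totals)  # [1750, 1700, 1675]

# The diligence test: are total hours actually declining deployment-over-deployment?
declining = all(later < earlier for earlier, later in zip(totals, totals[1:]))
print("Total hours declining:", declining)

# Even when they decline, the rate matters: a ~4% drop across three deployments
# is hard to square with a claim of 60-70% platform reuse.
drop_pct = 100 * (totals[0] - totals[-1]) / totals[0]
print(f"Decline from first to third deployment: {drop_pct:.1f}%")
```

Note how change management stays flat in this sketch: it is the category the note flags as not compressing with scale, so it sets a floor under total hours regardless of platform reuse.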