

Case Study

AI Automation for Industrial Aftermarket Maintenance


About the Client

A heavy industrial equipment and power generation company operating large-scale plants requiring rigorous preventive maintenance programs. The client serves as both an OEM and operator of critical industrial assets, managing complex equipment installations where maintenance quality directly impacts plant uptime, safety, and profitability. This solution is applicable to any organization in industrial manufacturing, power generation, or heavy equipment sectors that relies on structured preventive maintenance checksheets to manage asset reliability.


Challenge

Creating preventive maintenance (PM) checksheets is a critical but deeply manual and time-consuming process traditionally carried out by analysts, plant engineers, and domain experts. The process presented several compounding challenges.


Fragmented documentation at scale: Source material—including OEM manuals (400+ pages), vendor handbooks, commissioning guides, and site-specific procedures—was scattered across disconnected sources. Processing a standard 12-asset plant required approximately 480 hours (60 working days) of human consulting time.


Hidden knowledge risk: Human consultants focused primarily on clearly labeled chapters such as structured schedules and standard maintenance sections, systematically missing "hidden knowledge" embedded in safety footnotes, appendices, commissioning guides, and environmental condition triggers—precisely the condition-based and low-frequency tasks that prevent catastrophic equipment failure.


Inefficiency and inconsistency of manual extraction: Each asset required 8–10 hours and 15–20 manual touchpoints (collecting, reading, extracting, formatting, copy-pasting, table creation, and review), making the process both error-prone and impossible to scale across large plant portfolios.


Inability to scale: Manual scaling required armies of consultants, with 100 plants estimated to take 16 years to process, rendering enterprise-wide PM program modernization effectively impossible.

Without a scalable, intelligent solution, organizations risked incomplete maintenance coverage, undetected safety hazards, and extended equipment downtime caused by missed low-frequency but high-impact maintenance tasks.


Key Results

  • 160x faster processing speed, reducing per-asset processing time from 40 hours to 15 minutes, and enabling 100 plants to be completed in 12.5 days versus the manual baseline of 16 years.

  • 469% knowledge expansion (4.7x more maintenance activities discovered) compared to human consultant baselines, uncovering valid OEM-specified tasks from sub-procedures, installation guides, and appendices that human reviewers missed.

  • 94–96% reduction in extraction costs per asset, from $2,000–$3,000 per asset (manual) to $126 per asset via AI extraction; total pilot cost was $1,530 for 121 documents processed.

  • 92% overall extraction accuracy and automation readiness, validated using the RYGB framework across 2 production power plants (~48,400 pages, 54,630 API calls) with >99% system reliability.

  • 100% reduction in copy-paste and table creation time, collapsing the per-document workflow from 15–20 touchpoints over 8–10 hours to 2–3 touchpoints in approximately 12 minutes.

Solution

High-Level Architecture:

Scalable serverless document processing pipeline: Client requests are received via AWS API Gateway, orchestrated through a Node.js API Lambda, and fanned out to parallel Python Lambda workers (up to 100) that fetch PDFs from Uploadcare CDN, run inference and extraction via Gemini 3 Pro, write results to S3, and track job status asynchronously in DynamoDB.
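The fan-out step can be sketched as a simple partitioning of a job's document list across the worker pool. This is an illustrative sketch, not the production code: the function name, payload shape, and batching policy are assumptions; only the 100-worker ceiling comes from the case study.

```python
import math

MAX_WORKERS = 100  # the pipeline's stated ceiling on parallel Python Lambda workers

def partition_documents(doc_ids, max_workers=MAX_WORKERS):
    """Split a job's document list into batches, one per worker invocation.

    In the pipeline, the Node.js API Lambda would invoke one Python worker
    per batch asynchronously and record each batch's status in DynamoDB.
    """
    if not doc_ids:
        return []
    workers = min(max_workers, len(doc_ids))
    batch_size = math.ceil(len(doc_ids) / workers)
    return [doc_ids[i:i + batch_size] for i in range(0, len(doc_ids), batch_size)]
```

Under this batching policy, the 121-document pilot would fan out into 61 batches of at most two documents each, all processed concurrently.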

Sophisticated chunking strategy with parallel AI processing: Large-scale OEM documents were decomposed using a multi-stage semantic chunking strategy and processed through a parallel Gemini worker architecture, enabling 15-minute per-asset throughput at peak capacity of 1,500+ documents per day.
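A minimal sketch of the multi-stage chunking idea: cut the extracted text at heading-like boundaries first, then enforce a size budget so every chunk fits in a single model call. The heading regex and the 4,000-character budget are assumptions for illustration, not the production heuristics.

```python
import re

MAX_CHARS = 4000  # hypothetical per-chunk budget for the model call

# heading-like lines: numbered sections ("4.2 Lubrication") or ALL-CAPS titles
HEADING = re.compile(r"^\s*(?:\d+(?:\.\d+)*\s+\S|[A-Z][A-Z ]{3,})", re.MULTILINE)

def semantic_chunks(text, max_chars=MAX_CHARS):
    """Stage 1: split at section headings; stage 2: enforce the size cap."""
    cut_points = [m.start() for m in HEADING.finditer(text)]
    sections = [text[a:b] for a, b in zip([0] + cut_points, cut_points + [len(text)])]
    sections = [s for s in sections if s.strip()]
    # hard-split any section that still exceeds the budget
    chunks = []
    for s in sections:
        chunks.extend(s[i:i + max_chars] for i in range(0, len(s), max_chars))
    return chunks
```

Keeping cuts aligned with section boundaries is what lets downstream classification see a task together with its surrounding context, rather than a fragment of a sentence.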

Google Gemini semantic analysis (beyond traditional OCR/keyword search): Rather than matching keywords, the system employed semantic classification to identify maintenance tasks regardless of document section (e.g., Safety vs. Maintenance), context-aware validation to distinguish OEM requirements from general guidance, and cross-document linking to connect knowledge across disparate sources.
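One way to encode that section-independent, context-aware behavior is in the extraction prompt itself. The template below is a hypothetical sketch of such a prompt; the field names and category labels are invented for illustration and are not the case study's actual schema.

```python
def classification_prompt(chunk_text, source_doc):
    """Build a hypothetical extraction prompt that asks the model to classify
    tasks by meaning rather than by the section they appear in."""
    return (
        "You are extracting preventive-maintenance tasks from OEM documentation.\n"
        f"Source document: {source_doc}\n"
        "For every task in the excerpt below, return a JSON object with fields:\n"
        '  "task", "interval", "trigger" (time-based or condition-based),\n'
        '  "category" (one of: oem_requirement, general_guidance, safety_note).\n'
        "Classify by meaning, not by section heading: a mandatory task that\n"
        "appears in a Safety appendix is still an oem_requirement.\n\n"
        f"Excerpt:\n{chunk_text}"
    )
```

Recording the source document alongside each extracted task is also what makes the later cross-document linking and source attribution possible.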

NLP-based entity resolution and conflict resolution: Redundant or conflicting tasks from OEM manuals, vendor handbooks, and site procedures were deduplicated and reconciled into unified, source-attributed maintenance tasks (e.g., consolidating "inspect stator cooling ducts," "clean air passages," and "check airflow obstruction" into a single unified task with a 98.5% confidence score and a 6-month interval).

RYGB confidence framework for validated output: Extracted tasks were classified as Green (exact match to human baseline, zero intervention), Yellow (valid new findings beyond the baseline), or Red (requires tribal knowledge validation, approximately 155 activities flagged), providing auditable automation readiness scoring at 92%.
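The merge-then-triage flow can be sketched as below. The token-overlap similarity is a simple stand-in for the NLP similarity model, and the 0.5 merge threshold and 0.8 confidence cutoff are assumed values, not figures from the case study.

```python
def _tokens(text):
    return set(text.lower().replace("-", " ").split())

def similarity(a, b):
    """Jaccard token overlap; a crude stand-in for the NLP similarity model."""
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb)

def resolve_tasks(tasks, threshold=0.5):
    """Greedily merge near-duplicate task descriptions into one record,
    keeping every contributing source for attribution."""
    merged = []
    for task in tasks:
        for record in merged:
            if similarity(task["text"], record["text"]) >= threshold:
                record["sources"].append(task["source"])
                break
        else:
            merged.append({"text": task["text"], "sources": [task["source"]]})
    return merged

def rygb_label(task_text, confidence, baseline):
    """RYGB triage: Green = exact match to the human baseline, Yellow =
    confident new finding, Red = needs expert (tribal-knowledge) review."""
    if confidence < 0.8:  # assumed cutoff for illustration
        return "Red"
    return "Green" if task_text in baseline else "Yellow"
```

The triage step is what concentrates expert attention: only Red-labeled tasks (roughly 155 activities in the pilot) go back to consultants for validation.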

Human role redefinition: The solution redeployed consultants from manual data entry to strategic validation, enabling expert time to be focused on high-judgment Red-category tasks rather than mechanical extraction.



Technologies Used

  • Google Gemini (Large Language Model for semantic extraction and analysis)

  • AWS (Cloud infrastructure and scalable compute)

  • Natural Language Processing (NLP) / Large Language Model (LLM) pipeline

  • Semantic chunking and parallel document processing architecture

  • PDF/document ingestion and OCR pipeline

Other Case Study Items

Revolutionizing Personal Loans with AI-Driven Underwriting

A leading Indian personal loan provider revolutionized their underwriting process by leveraging AI and machine learning to automate 80% of loan decisions. By integrating social and financial data into a sophisticated predictive algorithm, the company drastically reduced decision times to seconds, expanded access to underserved segments, and achieved lower default rates than human underwriters.

Artificial Intelligence - Powered Tyre Dimension Extraction System

JashDS developed an AI-powered computer vision system for a leading automotive e-commerce platform, enabling accurate extraction of tire dimensions from images. The solution, which increased conversion rates by 25% and reduced customer support inquiries by 80%, utilized advanced technologies such as YoloV8 for instance segmentation and custom-designed augmentation techniques to simplify the online tire purchasing process.

Enhanced Jira Data Analysis for Strategic Insights

JashDS developed a flexible framework for analyzing Jira project data that is capable of handling varying export structures and custom fields. The solution leveraged GenAI and LLM technologies to provide actionable insights, identify productivity trends, and uncover potential risks across diverse software projects, resulting in a ___% improvement in team efficiency and a ___% increase in successful project outcomes.
