
Operationalizing AI at scale

February 14, 2025

AI


Over the past couple of years, enterprises have seen an explosion in AI pilots, experimenting with this innovative technology. Now, we're entering a phase where AI is being scaled across organizations. While this shift represents a tremendous opportunity, it also brings significant challenges in delivering ROI. Several factors affect ROI from AI at scale, including business use cases, computing infrastructure, and data. Among them, let's focus on what remains the biggest bottleneck in the AI puzzle: data.

Over the last 25 years, the volume of data has grown by an astounding 100,000 to 150,000 times. Remarkably, 90% of the world's data has been generated in the last two to three years, driven largely by unstructured data. Despite this, only about 1% of unstructured data is effectively utilized, leaving a massive untapped opportunity. With Gen AI, there's an opportunity to harness this data. However, unlocking this potential is far from straightforward because of the challenges of working with that data.

Managing data has always been challenging, with issues such as the lack of front-to-back (business-to-data) integration, poor data quality, and silos across data, systems, and organizations. With Gen AI, organizations encounter three additional challenges:

  • Contextualization: Off-the-shelf Gen AI models typically get you only 40-60% of the way to an answer. To reach the desired 90%, LLMs must be contextualized using deep domain and enterprise-specific data.
  • Operationalization: Existing Data and ML Ops frameworks fall short when dealing with LLMs. They struggle to manage the data required for training models and to execute the lifecycle of operationalizing AI at scale.
  • Security: LLMs operate on a two-way data flow, consuming input data to generate outputs. Input data is fed back into the model for refinement, which introduces significant risks of data leakage, including leakage of personally identifiable information (PII) or an organization's proprietary data.
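One common mitigation for the data-leakage risk above is to redact sensitive fields before any text leaves the enterprise boundary. Here is a minimal sketch, assuming simple regex patterns and placeholder tokens of my own choosing; a production system would use a dedicated PII-detection tool covering many more entity types:

```python
import re

# Illustrative patterns only; real PII detection covers names, addresses,
# account numbers, and other entity types, typically via a dedicated library.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace detected PII with placeholder tokens before the text
    is sent to an external LLM, reducing the risk of data leakage."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Contact Jane at jane.doe@example.com or 555-123-4567."
print(redact_pii(prompt))  # → Contact Jane at [EMAIL] or [PHONE].
```

The same gate can run in reverse on model outputs, so that proprietary data retrieved for context is not echoed back to an end user who should not see it.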


To navigate these challenges and operationalize AI effectively, enterprises need to reimagine how they think about data management. I propose a five-part framework to achieve this:

  1. Narrowing Down the Business Problem: Narrowing the business problem and identifying the specific data required is the most critical step. This reduces the scope of data challenges significantly by ensuring that only relevant data is curated, loaded, and processed through the data stack.
  2. Building AI and Data Products: Data products are vertical slices of the data stack anchored on specific business outcomes, enhancing efficiency and speed in problem-solving through reusability and repeatability.
  3. Gen AI Enablement: Enterprises must enable multiple LLMs and SLMs to balance cost, efficiency, performance, and accuracy while building the most appropriate solution for varied use cases. At the same time, it is essential to be selective about the data fed into these models to ensure security, optimize outcomes, and maintain cost-effectiveness.
  4. Building a Knowledge Engineering Layer: A knowledge engineering layer enables organizations to search and retrieve the right data for contextualizing Gen AI outputs. This step is crucial for ensuring that models are trained, and their outputs customized, on high-quality, relevant enterprise data.
  5. Modernizing the Data Stack: The nature of data itself is evolving. Organizations need to modernize their data architecture to support vector databases and multi-modal data models, which are essential for enabling Gen AI.
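The knowledge engineering and vector-database steps above can be sketched as a toy similarity search. This is a minimal illustration under assumed inputs: the document names and the hard-coded 4-dimensional "embeddings" are invented for the example, and a real system would use an embedding model and a vector database rather than an in-memory dictionary:

```python
import math

# Toy in-memory vector store; the 4-dim vectors stand in for real
# embeddings produced by an embedding model (values are illustrative).
DOCS = {
    "refund policy":  [0.9, 0.1, 0.0, 0.2],
    "shipping times": [0.1, 0.8, 0.3, 0.0],
    "warranty terms": [0.7, 0.2, 0.1, 0.6],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def retrieve(query_vec, k=1):
    """Return the k documents most similar to the query embedding;
    in a retrieval pipeline these become the context fed to the LLM."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query embedding close to the "refund policy" vector:
print(retrieve([0.85, 0.15, 0.05, 0.25]))  # → ['refund policy']
```

Retrieved documents are then injected into the model's prompt, which is how the contextualization described earlier lifts an off-the-shelf model's partial answer toward the enterprise-specific answer the business needs.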

Organizations have made significant investments over the years in building their existing data infrastructure. Constructing an entirely new AI-specific stack from scratch is neither practical nor cost-effective, as it would make ROI realization difficult. Instead, the optimal approach lies in narrowing the data scope, enabling Gen AI at scale, contextualizing it with enterprise and domain-specific data, and selectively modernizing infrastructure to support Gen AI capabilities. This strategy ensures a balanced path toward achieving ROI from AI at scale.

Author: Nitin Seth, Co-Founder & CEO, Incedo Inc.




