90% of Indian AI Firms Use Multiple Cloud Service Providers Simultaneously; Esya Centre Study Finds India’s Entire AI Value Chain Dynamic and Contestable
New Delhi: The Esya Centre today released a new report examining competition and market structure across India’s artificial intelligence value chain. Titled An Empirical Assessment of India’s AI Value Chain: Market Structure, Competition, and Innovation Dynamics, the study draws on a primary survey of 227 firms across the infrastructure, foundation model, data, and application segments of the AI ecosystem.
The findings come amid growing global scrutiny of AI markets, particularly around concerns related to concentration, cloud dominance, access to compute infrastructure, and lock-in effects. However, the study finds that India’s AI ecosystem remains dynamic and contestable across layers, with firms actively adopting multi-cloud strategies, combining open-source and proprietary models, and experimenting with decentralised deployment architectures.
Nearly 90 percent of surveyed firms use or plan to use hybrid or multi-cloud infrastructure (multiple cloud service providers at the same time), over 80 percent use open-source foundation models, and three-quarters deploy or test smaller language models on edge devices rather than relying exclusively on centralised cloud infrastructure.
Key Findings
• Infrastructure Layer: Nine in ten surveyed firms use or plan to use hybrid or multi-cloud infrastructure, with 62 percent relying on a long tail of small and medium-sized providers alongside in-house data centres. Competition among cloud service providers is driving prices down, with 99 percent of respondents perceiving some drop in compute costs. Vendor flexibility (78.6%) and cost optimisation (77.7%) are the top drivers of multi-cloud adoption. Notably, over 80 percent of firms operate without exclusivity clauses in their cloud contracts.
• Price is not the primary determinant behind firms’ selection of cloud service provider: The survey finds that cloud provider choice is driven by performance, scalability, and service breadth, with pricing as an important but not exclusive factor. The strongest preference emerges around ‘performance and security’, which nearly three-quarters of surveyed firms rate as extremely important. ‘Flexibility and scalability’ follow closely, with almost 69 percent of respondents considering them extremely important. The ‘breadth and depth of service offerings’ also matter significantly, with close to 60 percent of respondents rating this factor as extremely important.
• Foundation Model Layer: Indian organisations engage with AI foundation models in a plural, overlapping manner. Open-source model adoption stands at approximately 83 percent, complemented by strong use of proprietary (67%) and custom-trained models (63%). Three in five firms simultaneously use multiple models within the same category, while 81 percent deploy a mix of small and large models depending on the application.
• Data Asset Layer: Firms draw on broad, overlapping mixes of data sources, with open-source datasets used by 85 percent of respondents. Around 85 percent of firms believe data quality matters more than quantity for model performance, and two-thirds report that third-party data meets their needs to a considerable extent. The primary barriers to data access are poor quality and low availability of government datasets, copyright uncertainty, and regulatory ambiguity under the Digital Personal Data Protection Act, 2023.
• Application Layer: India’s GenAI startup base grew 3.6 times between the first half of 2023 and the first half of 2024. Three-quarters of organisations actively deploy or test smaller language models on edge devices, reducing dependence on centralised cloud infrastructure. The only meaningful entry barrier identified is a skills gap, not financial or structural concentration.
“The survey data paints a picture of an AI ecosystem that is far more dynamic and competitive than many assume. Indian firms are making sophisticated strategic choices: mixing cloud providers, combining open-source and proprietary models, and deploying innovation at the edge. The findings suggest that policy conversations should increasingly focus on strengthening enabling conditions such as better data infrastructure, interoperability, governance clarity, and AI skills development,” said Meghna Bal, Director, Esya Centre.
Overall, the survey reveals a highly dynamic and contestable AI value chain, with no indication of concentration or tipping at any layer. The results also indicate that, if anything, the government should work towards systematically enabling the sector by removing regulatory impediments around copyright and the DPDPA that make it hard for start-ups to access data. It should also improve the quality and accessibility of its own datasets and infrastructure, as these are important resources for most companies.