Is the Public Cloud the Best Place for AI Applications? The Case for Localised Solutions

By Stewart Laing, CEO, Asanti.


The public cloud has long been championed as the most effective solution for modern IT needs, providing scalability, flexibility, and cost-efficiency. However, our recent survey of senior IT decision makers reveals an emerging trend that challenges the cloud-first approach. 52% of organisations now plan to host and deliver their AI applications on-premise or through colocation facilities rather than relying on the public cloud. This shift is largely driven by the unique requirements of AI applications, which often conflict with the expected benefits and cost structure of the cloud.

The financial and operational challenges of AI in the cloud

Whilst public cloud offers an enticing model of pay-as-you-go flexibility, our research reveals that this model may not be the most cost-effective for high-performance AI applications. 77% of organisations reported that the operational costs of the public cloud exceeded their expectations, and 63% found that overall costs were higher than previous non-cloud solutions. The monthly expenses for cloud storage and processing can quickly add up for data-intensive AI tasks, leading some organisations to question whether the cloud is sustainable at scale.
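To see why data-intensive AI workloads strain a pay-as-you-go model, consider a rough monthly comparison. The sketch below is illustrative only: every price, data volume and amortisation period is an assumption chosen for the example, not a quote from any provider or a figure from our research.

```python
# Illustrative monthly cost comparison for a continuously running AI workload.
# All prices and volumes below are hypothetical assumptions for this sketch.

GPU_HOURS_PER_MONTH = 730            # one GPU instance running around the clock
CLOUD_GPU_RATE = 3.00                # assumed $/hour for a cloud GPU instance
CLOUD_STORAGE_TB = 50                # training and feature data held in the cloud
CLOUD_STORAGE_RATE = 23.00           # assumed $/TB per month
CLOUD_EGRESS_TB = 10                 # results and datasets moved back out each month
CLOUD_EGRESS_RATE = 90.00            # assumed $/TB egress

cloud_monthly = (GPU_HOURS_PER_MONTH * CLOUD_GPU_RATE
                 + CLOUD_STORAGE_TB * CLOUD_STORAGE_RATE
                 + CLOUD_EGRESS_TB * CLOUD_EGRESS_RATE)

# Colocation alternative: hardware amortised over its life plus rack fees.
SERVER_CAPEX = 40_000                # assumed GPU server purchase price
AMORTISATION_MONTHS = 36             # assumed useful life of the hardware
COLO_RACK_FEE = 1_200                # assumed monthly power, space and cooling fee

colo_monthly = SERVER_CAPEX / AMORTISATION_MONTHS + COLO_RACK_FEE

print(f"Cloud:      ${cloud_monthly:,.0f} per month")
print(f"Colocation: ${colo_monthly:,.0f} per month")
```

Under these assumed numbers the steady-state workload is markedly cheaper in colocation; the point is not the specific figures but that utilisation, storage growth and egress charges compound month on month, which is why so many respondents found cloud costs exceeding expectations.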

Real-time data processing is another critical challenge for AI in the cloud. 36% of those who moved applications back from the cloud cited faster data transfer needs for real-time applications as a primary reason for their decision. Public cloud often struggles to meet these demands, whereas local data centres offer proximity to data sources that supports low-latency processing.
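A back-of-envelope calculation shows how quickly distance alone eats into a real-time latency budget, before any queuing or processing is counted. The distances below are assumptions chosen purely for illustration.

```python
# Propagation delay over fibre: light travels at roughly 200 km per millisecond
# in glass, so round-trip time grows linearly with distance to the data centre.
# The example distances are assumptions, not measurements.

SPEED_IN_FIBRE_KM_PER_MS = 200

def round_trip_ms(distance_km: float) -> float:
    """Propagation delay only; queuing, routing and processing add more on top."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

for label, km in [("Local colocation facility", 30),
                  ("Regional public cloud zone", 600),
                  ("Overseas cloud region", 5_000)]:
    print(f"{label:28s} ~{round_trip_ms(km):5.1f} ms round trip")
```

For an application with a single-digit-millisecond budget per inference, tens of milliseconds of unavoidable network round trip can rule out a distant region before any other factor is considered.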

Data sovereignty: the overlooked factor in AI deployment

In addition to financial and latency concerns, data sovereignty remains a top priority for organisations handling sensitive information. AI applications frequently process large amounts of proprietary data, making compliance with data regulations like GDPR essential. 41% of the survey respondents pointed to security and compliance concerns as key drivers for moving AI applications out of the cloud. Ensuring that data remains within specific geographical boundaries can be difficult in the cloud, where data may be stored across global data centres.

The ability to exercise control over data storage and access policies is particularly attractive to regulated industries such as healthcare and finance. 39% of respondents bringing applications back on-premise noted the lack of control and customisation in the public cloud, which can make compliance with regulatory standards challenging. By using on-premise or colocated infrastructure, organisations can establish strict data governance practices that align with regulatory requirements and offer a higher degree of data oversight.

The appeal of local data centres for AI workloads

Given these challenges, local data centres present a practical solution for organisations seeking to meet AI’s demands. Colocation facilities can provide infrastructure with power and cooling systems optimised for AI processing. Organisations can maintain full control over their hardware and data while benefiting from professional infrastructure management. This setup allows companies to meet the data storage, processing, and compliance needs of AI without incurring the unpredictable costs associated with the cloud.

By adopting a hybrid model that combines cloud and on-premise/colocation solutions, companies can achieve the best of both worlds: scalability for AI modelling and robust, low-latency performance for deployed AI. This hybrid approach was endorsed by 67% of IT decision makers, who indicated they wished they had taken a mixed approach from the beginning. This finding underscores the importance of evaluating each application's specific needs and aligning infrastructure accordingly, as the sketch below illustrates.
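One way to make "evaluating each application's specific needs" concrete is a simple placement rule of thumb. The sketch below is a minimal illustration, assuming the three factors highlighted in this article: latency sensitivity, data-sovereignty constraints and bursty versus steady demand. The field names and thresholds are hypothetical, not a prescribed framework.

```python
# Minimal sketch of a hybrid workload-placement decision. The criteria mirror
# the factors discussed above; all names and rules are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_sensitive: bool   # e.g. real-time inference close to data sources
    sovereignty_bound: bool   # data must stay within a known jurisdiction
    bursty: bool              # short-lived, elastic demand such as training runs

def place(w: Workload) -> str:
    if w.latency_sensitive or w.sovereignty_bound:
        return "on-premise / colocation"
    if w.bursty:
        return "public cloud (elastic capacity)"
    return "either; decide on projected cost at steady-state volume"

for w in [Workload("real-time fraud scoring", True, True, False),
          Workload("quarterly model retraining", False, False, True)]:
    print(f"{w.name}: {place(w)}")
```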

Implications for IT infrastructure strategy

The choice of on-premise and colocation for deployed AI workloads suggests a potential reset in IT infrastructure thinking. For years, the "Cloud First" approach has dominated IT decision-making, but our findings suggest that companies are now reconsidering this dogma. A mixed infrastructure model may offer more flexibility and control, particularly for organisations developing AI modelling applications. 45% of respondents stated they should have conducted more due diligence when moving to the cloud, underscoring the importance of careful planning and ongoing evaluation.

This shift could have profound implications for the IT industry, as demand for specialised data centre services is likely to grow. Providers of colocation and private cloud solutions are poised to capture a larger market share, especially among organisations prioritising data sovereignty, low latency, and security for their AI applications.

The road ahead: building an adaptive infrastructure

AI’s impact on IT infrastructure is accelerating, and organisations are recognising the need for a more adaptable approach. The findings from our research suggest that the public cloud may not be the optimal solution for every workload, particularly when it comes to some aspects of AI development. Instead, a hybrid model that leverages the strengths of cloud and local hosting provides organisations with a flexible solution that blends performance, cost and control.

As AI continues to drive change, the ability to adapt infrastructure accordingly will be critical for businesses aiming to remain competitive and compliant in a rapidly evolving digital landscape.
