Perplexity AI’s CEO sparked another controversy when he said that multi-billion-dollar data centers could become irrelevant as artificial intelligence tools move toward on-device processing.
His remarks, reported by publications such as Mint, The Times of India, and Yahoo Finance, come at a time when the AI industry is rethinking how future systems will be built, powered, and scaled.
His comments point to a growing shift in the next phase of AI development. Instead of relying only on cloud-based data centers, AI tools may increasingly run directly on the smartphones, tablets, and laptops that consumers use every day.
Speaking about AI infrastructure, Aravind Srinivas pointed out that advances in hardware efficiency and model optimization could reduce the need for huge data centers that currently power most large-scale AI systems.
According to him, models that run locally, rather than on remote servers, could handle a wider range of tasks as chips become more powerful and energy-efficient.
Why On-Device AI Matters
In artificial intelligence services, speed matters. When a model runs locally on a device, requests no longer make a round trip to a remote server, which cuts latency and reduces the infrastructure needed to serve each user. It also improves privacy, since sensitive information does not always need to be sent to external servers.
What It Means for India and Consumers
For Indian users and startups, on-device AI could be especially important. Lower dependence on cloud infrastructure may reduce costs and improve access, particularly in areas with limited internet connectivity.
Local AI processing could also boost innovation among developers building apps tailored to Indian languages, everyday use cases, and privacy needs.
As smartphones remain the main computing device for millions in India, advances in on-device AI could directly shape how people use technology in the years ahead.
So, according to Aravind Srinivas, instead of relying only on large data centers, future AI systems may use users’ smartphones as the main processing hub, synced globally, which could make responses much faster by cutting latency.
However, relying solely on smartphones, laptops, and tablets for processing may not be practical, since heavy workloads will still outstrip consumer hardware. A more likely outcome is a hybrid model: routine tasks run on-device while demanding ones go to the cloud. Such a model could require far less infrastructure than today’s multi-billion-dollar data centers while still delivering fast, low-latency results.
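As a rough illustration of how such a hybrid setup might route work, here is a minimal sketch. It is not from Srinivas or Perplexity; all names, thresholds, and functions are hypothetical stand-ins for an on-device model and a cloud service.

```python
# Hypothetical sketch of hybrid AI routing: prefer the on-device model,
# fall back to the cloud only when a request exceeds local capability.

from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    complexity: int  # rough cost estimate, e.g. expected output tokens


# Assumed capability ceiling of the small on-device model (made-up value).
LOCAL_COMPLEXITY_LIMIT = 100


def run_local(req: Request) -> str:
    # Stand-in for a small model running on the phone's own chip.
    return f"local:{req.prompt}"


def run_cloud(req: Request) -> str:
    # Stand-in for a large model hosted in a data center.
    return f"cloud:{req.prompt}"


def route(req: Request) -> str:
    """Serve simple requests locally (low latency, data stays on the
    device); send only heavy requests to the cloud."""
    if req.complexity <= LOCAL_COMPLEXITY_LIMIT:
        return run_local(req)
    return run_cloud(req)


print(route(Request("translate this sentence", complexity=40)))
print(route(Request("summarize a 300-page report", complexity=5000)))
```

In this toy version, most everyday requests never leave the device, which is the latency and privacy benefit the article describes, while the data center handles only the minority of tasks too large for local hardware.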