I just wrapped up my interview with Albert Liu at CES 2026, and Kneron’s vision for edge AI is more ambitious than I expected. After three days of sensory overload on the show floor—where every company seems to be shouting “AI” from the rooftops—it was refreshing to hear a concrete strategy that addresses real-world limitations of current AI deployment.
Liu, sporting his trademark casual blazer over a t-shirt, spoke with the confidence of someone who’s weathered the AI hype cycles since founding Kneron in 2015. “We’re not just putting AI anywhere and everywhere,” he told me, leaning forward in his chair. “We’re solving the fundamental contradiction between computational demands and device constraints.”
What makes Kneron’s approach noteworthy is their recognition that edge AI—artificial intelligence that runs directly on devices rather than in cloud data centers—isn’t just about technical specifications. It’s about reimagining how AI integrates into our daily lives without compromising privacy or increasing costs.
The company unveiled their KL730 chip at the show, which they claim delivers an 8x performance improvement over the previous generation while consuming 40% less power. According to data from Counterpoint Research, edge AI processing is expected to grow at a 37% CAGR through 2029, significantly outpacing cloud-based AI growth.
“The economics simply don’t work for running everything in the cloud,” Liu explained. “When you’re processing billions of data points daily across millions of devices, the bandwidth and server costs become prohibitive. Edge is inevitable.”
This perspective aligns with what I’ve been hearing from other industry insiders. Kate Morrison, principal analyst at Gartner, whom I spoke with yesterday, noted that “companies are increasingly recognizing the hidden costs of cloud AI dependence, from latency issues to rising inference costs.”
Kneron’s strategy hinges on what Liu calls “hybrid intelligence”—distributing AI processing across the edge, local servers, and cloud depending on the task requirements. This approach is particularly compelling for automotive and IoT applications, where continuous connectivity can’t be guaranteed.
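To make the idea concrete, the routing logic behind a "hybrid intelligence" scheme might look something like the sketch below. This is purely illustrative, not Kneron's actual implementation; the tier names, thresholds, and task attributes are all my own assumptions about how such a dispatcher could weigh latency, privacy, and model size against connectivity.

```python
# Hypothetical sketch of hybrid edge/local/cloud task routing.
# All thresholds and names are illustrative assumptions, not Kneron's design.
from dataclasses import dataclass

@dataclass
class Task:
    latency_budget_ms: float   # how quickly a result is needed
    model_size_mb: float       # footprint of the model the task requires
    sensitive: bool            # does the input contain private data?

EDGE_MODEL_LIMIT_MB = 50       # assumed on-device model capacity
LOCAL_MODEL_LIMIT_MB = 500     # assumed local-server model capacity

def route(task: Task, connected: bool) -> str:
    """Pick a processing tier for a task: privacy-sensitive or
    latency-critical work stays on-device, heavier work moves to a
    local server, and the cloud is used only when connectivity allows."""
    if task.sensitive or task.latency_budget_ms < 20:
        return "edge"
    if task.model_size_mb <= EDGE_MODEL_LIMIT_MB:
        return "edge"
    if task.model_size_mb <= LOCAL_MODEL_LIMIT_MB:
        return "local-server"
    # Degrade gracefully: if the network is down, keep the work local
    # rather than failing, mirroring the offline resilience described above.
    return "cloud" if connected else "local-server"
```

The key design choice such a scheme embodies is that the fallback path never requires connectivity, which is what lets devices keep functioning through network outages.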
The demonstration I witnessed showed their system maintaining functionality during network outages—something that cloud-dependent solutions struggle with. It reminded me of the massive cloud service outage last year that rendered millions of “smart” devices temporarily useless.
Privacy considerations also feature prominently in Kneron’s pitch. With growing consumer awareness and regulatory pressure around data collection, the ability to process sensitive information locally rather than transmitting it to remote servers offers clear advantages.
The MIT Technology Review recently highlighted this shift, noting that “edge AI represents a fundamental architectural change that addresses many of the privacy concerns plaguing cloud-based systems.” Liu echoed this sentiment: “Users shouldn’t have to sacrifice privacy for intelligence.”
What struck me most during our conversation was Liu’s candor about the challenges ahead. Unlike many tech executives who paint only rosy pictures, he acknowledged the difficulties in convincing device manufacturers to integrate more sophisticated AI chips when they’re focused on keeping costs down.
“We’re not just selling chips; we’re selling a vision of what’s possible when intelligence is distributed,” he said. “That requires education and demonstration of tangible benefits.”
Their partnership strategy reflects this reality. Rather than trying to compete directly with the silicon giants, Kneron is focusing on specific verticals where edge AI delivers obvious advantages—security cameras that don’t need to stream video to the cloud, automotive systems that function reliably regardless of connectivity, and industrial equipment that can make split-second decisions locally.
Industry analysts seem cautiously optimistic about Kneron’s approach. A recent report from IDC predicts that specialized edge AI processors will capture 32% of the AI chip market by 2028, up from just 8% today.
However, challenges remain. The fragmented nature of edge computing standards makes scalability difficult. And despite improvements in efficiency, the most advanced AI models still exceed the capabilities of edge devices—a reality Liu readily admits.
“We’re not claiming edge will replace the cloud,” he clarified. “It’s about putting intelligence where it makes sense—sometimes that’s the device, sometimes it’s not.”
As I left the interview, navigating through crowds admiring flashier CES attractions, I couldn’t help reflecting on how Kneron’s practical approach stands in contrast to some of the more grandiose AI promises being made. In a field dominated by theoretical possibilities, their focus on solving actual deployment challenges feels like a breath of fresh air.
Whether their strategy will propel them to the forefront of the AI chip market remains to be seen. But at a moment when many companies are still figuring out what AI should do, Kneron seems focused on a more fundamental question: where it should happen. And that might just make all the difference.