In context: Arm may not have the consumer desktop presence of Intel or AMD, but in smartphones, servers, and even the latest MacBooks, the company's processor architecture is essentially unmatched. Despite that ubiquity, though, the Arm architecture hasn't seen a major generational upgrade since v8, released way back in October 2011.

That's about to change. The semiconductor firm has officially unveiled Armv9, heralding a new era of processor architecture designed to meet growing computational demand in data security and artificial intelligence.

Focusing on the latter first, Arm CEO Simon Segars says Armv9 is the "answer" to a future that will be "defined by AI." To face that inevitability head-on, Segars claims his company needs to lay a foundation of "leading-edge compute" capabilities.

How, exactly? Through the Scalable Vector Extension (SVE), the technology at the heart of the world's fastest supercomputer, Fugaku. Arm has partnered with Fugaku's builder, Fujitsu, to develop SVE2 for Armv9. In theory, SVE2 should enable next-generation machine learning and digital signal processing capabilities.

As for how v9 will tackle security, you need only look to Arm's new Confidential Compute Architecture (CCA). The company describes CCA's functionality as follows:

Confidential computing shields portions of code and data from access or modification while in-use, even from privileged software, by performing computation in a hardware-based secure environment.

Through the CCA, and something called "dynamically created Realms," Armv9 will be able to protect sensitive data from prying eyes while it's actively in use, "at rest," or being transferred to another location.

Last, but certainly not least, Armv9 targets CPU performance gains of over 30 percent, which Arm expects to deliver over the "next two generations" of mobile and server CPUs.

Arm hopes all of these generational upgrades will put its architecture on track to process "100 percent of the world's shared data" in at least some capacity – whether it's at the "endpoint" or somewhere in the cloud.