
Apple opens up new Private Cloud for security researchers; top bounties of a fat $1m

"For the first time ever, we’ve created a Virtual Research Environment (VRE) for an Apple platform"

Apple has taken the unusual step of opening up its new “Private Cloud Compute” environment for prodding and probing by security researchers.

It is offering bounties of up to $1 million to those who find weaknesses in the PCC – a “hardened” private cloud for running generative AI workloads. 

(Apple over the summer published details on a ∼3 billion parameter model designed to run locally, and a larger model that will run in this PCC environment to power part of its pending "Apple Intelligence" offering.)

The company has, in the past, persistently fought back against attempts to open up its walled garden systems for deep security research and analysis – for example suing security firm Corellium Inc for simulating its iOS operating system to help researchers find security flaws in Apple devices.

(Corellium created virtualization software called CORSEC that emulates various operating systems and, on non-Apple hardware, simulates an environment that can run iOS – in effect it “enables users to create a virtual iPhone.” In May 2023 the 11th U.S. Circuit Court of Appeals ruled in favour of Corellium, finding that CORSEC’s deployments fell under “fair use.” Apple appealed on copyright grounds, then settled in December 2023.)

Apple’s very first Virtual Research Environment 

“For the first time ever, we’ve created a Virtual Research Environment (VRE) for an Apple platform,” wrote the Apple Security Engineering and Architecture (SEAR) team on October 24, introducing the new platform.

The VRE comes four months after Apple introduced Private Cloud Compute (PCC) – which it said is the “most advanced security architecture ever deployed for cloud AI compute at scale” and equipped with “custom Apple silicon and a hardened operating system designed for privacy.”

As Apple explained in a June 10 blog post, the PCC is built on “custom-built server hardware that brings the power and security of Apple silicon to the data center [using tools like] Secure Enclave and Secure Boot.

“We paired this hardware with a new operating system: a hardened subset of the foundations of iOS and macOS tailored to support Large Language Model (LLM) inference workloads while presenting an extremely narrow attack surface. This allows us to take advantage of iOS security technologies such as Code Signing and sandboxing…”

What does the VRE allow?

Apple said that the VRE “runs the PCC node software in a virtual machine with only minor modifications. Userspace software runs identically to the PCC node, with the boot process and kernel adapted for virtualization. 

“The VRE includes a virtual Secure Enclave Processor (SEP), enabling security research in this component for the first time — and also uses the built-in macOS support for paravirtualized graphics to enable inference.”

See also: Apple "rephrases the web" to cut LLM compute and data usage

“You can use the VRE tools to:

  • List and inspect PCC software releases
  • Verify the consistency of the transparency log
  • Download the binaries corresponding to each release
  • Boot a release in a virtualized environment
  • Perform inference against demonstration models
  • Modify and debug the PCC software to enable deeper investigation”
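One of those tools, verifying the consistency of the transparency log, boils down to checking that a newer view of an append-only log still contains the older view unchanged. Apple’s real log uses cryptographic (Merkle-tree style) proofs; the toy sketch below only illustrates the property being checked, and every name in it is hypothetical:

```python
import hashlib

def log_head(entries: list[bytes]) -> str:
    """Rolling SHA-256 over an append-only list of entry hashes (illustrative only)."""
    h = hashlib.sha256()
    for e in entries:
        h.update(e)
    return h.hexdigest()

def is_consistent(old_entries: list[bytes], new_entries: list[bytes]) -> bool:
    """An append-only log is consistent iff the old view is a prefix of the new one."""
    return new_entries[:len(old_entries)] == old_entries

# Two successive views of the same (toy) log:
entries_v1 = [hashlib.sha256(b"release-1").digest()]
entries_v2 = entries_v1 + [hashlib.sha256(b"release-2").digest()]

assert is_consistent(entries_v1, entries_v2)          # honest append: consistent
assert not is_consistent(entries_v1, entries_v2[1:])  # rewritten history: inconsistent
```

A production log replaces the naive prefix comparison with logarithmic-size consistency proofs, so clients never need the full entry list.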

Strikingly for Apple, it is also making source code “for certain key components of PCC” available under a “limited-use” research licence.

The bounties available for researchers, on paper, are as follows.

Remote attack on request data:
  • Arbitrary code execution with arbitrary entitlements – maximum bounty $1,000,000
  • Access to a user's request data or sensitive information about the user's requests outside the trust boundary – $250,000

Attack on request data from a privileged network position:
  • Access to a user's request data or other sensitive information about the user outside the trust boundary – $150,000
  • Ability to execute unattested code – $100,000
  • Accidental or unexpected data disclosure due to deployment or configuration issue – $50,000

Many security researchers have in the past expressed dissatisfaction with Apple's approach to disclosures, and a trust-building process may be needed. The classic risk for a researcher is disclosing a detailed attack path only for the vendor to say "this has already been reported / a fix is underway", close it off with a patch shortly afterwards, and pay no reward. (Vulnerability duplication is, of course, also common.)

Apple has further promised that "every production Private Cloud Compute software image will be published for independent binary inspection – including the OS, applications, and all relevant executables, which researchers can verify against the measurements in the transparency log. Software will be published within 90 days of inclusion in [a] log, or after relevant software updates are available, whichever is sooner."
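The independent binary inspection Apple describes amounts to hashing a published software image and comparing the digest against the measurement recorded in the transparency log. A minimal sketch of that idea follows – the chunked SHA-256 and hex measurement format here are illustrative assumptions, not Apple's actual scheme:

```python
import hashlib
import tempfile

def measure_binary(path: str, chunk_size: int = 65536) -> str:
    """SHA-256 the file in chunks, as one would for a large OS image."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def verify_release(path: str, logged_measurement: str) -> bool:
    """Accept the binary only if its digest matches the transparency-log entry."""
    return measure_binary(path) == logged_measurement

# Demo with a stand-in "binary" on disk:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"pcc-release-image")
    path = f.name

expected = hashlib.sha256(b"pcc-release-image").hexdigest()
assert verify_release(path, expected)        # digest matches the logged measurement
assert not verify_release(path, "0" * 64)    # mismatched measurement is rejected
```

Because the log is append-only and publicly auditable, a mismatch here would be evidence that the shipped binary differs from what Apple published.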

"Apple Intelligence" is now available in beta.

SDKs and documentation for developers are available here.

See also: Apple open-sources its Homomorphic Encryption library
