
Apple Offers To Pay You $1 Million If You Can Hack Their AI Cloud

Started 1 week ago by Paschal in Technology

Apple Inc., the $3.5 trillion tech giant, is putting its server security to the test by offering a substantial bounty to anyone who can successfully hack its new AI cloud. The company has launched a "security research challenge" alongside the release of its AI-powered Apple Intelligence features in iOS 18.1.

Central to this challenge is Private Cloud Compute (PCC), the server platform that processes many Apple Intelligence requests. Apple wants to ensure PCC is protected against cyberattacks and security breaches, and the initiative invites both amateur hackers and seasoned security experts to test its resilience.

In a recent statement, Apple extended an open call to the security community: "Today we’re making these resources publicly available to invite all security and privacy researchers—or anyone with interest and a technical curiosity—to learn more about PCC and perform their own independent verification of our claims." The company also expanded its Apple Security Bounty to include PCC, offering substantial rewards for any security or privacy issues identified.

To support this effort, Apple has released a comprehensive security guide for the PCC server, explaining its functions, authentication processes, and security measures. Parts of the PCC source code have also been made available on GitHub.
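
A central design point the guide describes is that a device will only hand a request to a PCC node whose software it can verify against a publicly logged release. As a rough sketch of that gating idea (assuming nothing about Apple's real API: every type, function, and measurement name below is hypothetical), a client could check a node's attestation against a set of known-good release measurements before trusting the node's key:

```swift
import Foundation
import CryptoKit

// Hypothetical illustration of attestation gating, not Apple's API.
// A node presents a measurement (a hash of the software it claims to
// run) and a public key; the client releases a request only if the
// measurement matches a published release.

struct NodeAttestation {
    let codeMeasurement: Data                       // hash of the node's software image
    let nodeKey: Curve25519.KeyAgreement.PublicKey  // key the request would be encrypted to
}

enum AttestationError: Error {
    case unknownMeasurement
}

/// Accept the node only if its measurement appears in the set of
/// known-good releases (standing in for PCC's transparency log).
func verify(_ attestation: NodeAttestation,
            against knownMeasurements: Set<Data>) throws -> Curve25519.KeyAgreement.PublicKey {
    guard knownMeasurements.contains(attestation.codeMeasurement) else {
        throw AttestationError.unknownMeasurement
    }
    return attestation.nodeKey
}

// A node running a published release passes; any other node is refused,
// so unattested software never receives a user request.
let release = Data(SHA256.hash(data: Data("pcc-release-1.0".utf8)))
let knownGood: Set<Data> = [release]

let node = NodeAttestation(codeMeasurement: release,
                           nodeKey: Curve25519.KeyAgreement.PrivateKey().publicKey)

do {
    let trustedKey = try verify(node, against: knownGood)
    print("Node attested; encrypting request to its key: \(trustedKey)")
} catch {
    print("Rejecting node: \(error)")
}
```

In the real system the measurement would come from hardware attestation and the known-good set from a cryptographically auditable log; the sketch only captures the refuse-unless-verified logic that the bounty's "unattested code" tier targets.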

Bug Bounty Rewards

Apple has outlined a tiered reward system based on the severity and complexity of the discovered vulnerabilities:

  • $50,000: Accidental or unexpected disclosure of data due to a deployment or configuration issue.
  • $100,000: Ability to execute unattested code.
  • $150,000: Access to a user's request data or other sensitive user information outside the PCC trust boundary, from a privileged network position.
  • $250,000: Access to a user's request data or sensitive information about a user's requests outside the trust boundary, via a remote attack.
  • $1,000,000: Arbitrary code execution with arbitrary entitlements, via a remote attack on request data.

Apple regards PCC as the "most advanced security architecture ever deployed for cloud AI compute at scale" and looks forward to collaborating with the research community to enhance the system's security and privacy.

Apple also encourages researchers to report security issues not explicitly covered by these categories, and says it will consider such findings for bounties as well.


  • No one has replied to this thread yet. Be the first to reply!