Apple is offering rewards of up to $1 million to security researchers who can find flaws that could jeopardize the security of its Private Cloud Compute service, which is set to launch next week. The program, described in full on the company’s security blog, is part of Apple’s ongoing effort to strengthen the security of its AI infrastructure.
Researchers who uncover flaws enabling remote execution of harmful code on Private Cloud Compute servers will receive the maximum reward of $1 million. Those who privately disclose flaws that could allow the extraction of sensitive user information or prompts submitted to the cloud service can earn up to $250,000. Apple notes that rewards may also be granted for any serious security flaw not already covered by the listed categories, with payouts of up to $150,000 for attacks that access private data from a privileged network position.
The program is a logical extension of Apple’s existing bug bounty, which rewards hackers and security professionals for reporting vulnerabilities before they can be exploited. In recent years, Apple has taken significant steps to improve the security of its iPhones amid mounting malware threats, including creating a dedicated research iPhone that makes it easier for security experts to hunt for flaws.
In a recent blog post, Apple shared further details about the security features of its Private Cloud Compute service, including information about its source code and documentation. Private Cloud Compute is positioned as a cloud-based extension of Apple Intelligence, the company’s on-device AI system, designed to handle more complex AI tasks while protecting customer privacy. The move underscores Apple’s commitment to safeguarding user data in an increasingly intricate digital environment.