In this edition we are covering publications from the #AI and #Tech world, mainly from JustAINews.

Apple Offers $1 Million Reward for Finding Flaws in AI Cloud

To show its commitment to security, Apple has announced a $1 million reward for anyone who can find vulnerabilities in its upcoming AI cloud service, Private Cloud Compute. With the service launching next week, the initiative aims to improve the security of Apple’s offerings, particularly those powered by AI.

Inviting Security Experts to Test Apple’s AI Servers

Apple is inviting the wider security community to thoroughly test the Private Cloud Compute system. This network of servers is designed to handle complex AI tasks that are beyond the capabilities of iPhones, iPads, or Macs. The system has strong privacy features, including end-to-end encryption and immediate deletion of user requests after processing. These measures ensure that even Apple cannot access the data.

At first, Apple invited only a small group of security researchers to participate, but it has now opened the invitation to the entire security community. Participants get access to the source code of key components of Private Cloud Compute, along with a Virtual Research Environment that runs on macOS. This setup gives researchers the tools they need to analyze the system and identify any weaknesses.

Reward Structure and Bounty Program Details

Adding Private Cloud Compute to Apple’s Security Bounty program is a major expansion. The top reward of $1 million goes to anyone who can remotely compromise the servers and execute malicious code with system-level privileges. There are also rewards of up to $250,000 for finding ways to extract sensitive user data, and smaller rewards of up to $150,000 for accessing user information from a privileged network position.

Apple also said it would consider giving rewards for vulnerabilities that don't fit into its predefined categories if they have a significant impact on the system's security. “We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI at this scale,” Apple said. “We look forward to working with the research community to build trust in the system and make it even more secure over time.”

Expanding Apple’s Bug Bounty Program

The expanded bug bounty for Private Cloud Compute is part of Apple’s ongoing effort to improve security across its products. In recent years, Apple has made its systems more open to scrutiny, for example by providing special research-only iPhones through its Security Research Device program, allowing security experts to carry out in-depth analyses and find potential vulnerabilities.

With Private Cloud Compute, Apple aims to extend its on-device AI capabilities to the cloud so that more complex tasks can be handled securely. By inviting the security research community to test its systems, Apple demonstrates its commitment to privacy and transparency in AI. The combination of strong privacy protections like end-to-end encryption and substantial rewards reflects Apple’s dedication to keeping users’ data safe.

Author and Reference

This article was written by Genaro Palma, originally inspired by Apple Invites Hackers to Test Private AI Cloud for $1 Million Reward, published at JustAINews.com.