Hackers and security researchers who uncover vulnerabilities in certain Microsoft products could take home part of a $4 million bug bounty.
On Tuesday, the company announced a new invitation-only hacking event called Zero Day Quest. Touted as the largest of its kind, the event will invite top-ranked researchers to find and report high-impact security flaws. Microsoft also announced a research challenge that is open to anyone.
The research challenge starts today and runs until January 19, 2025. Part of Microsoft's AI Bounty Program, the challenge encourages participants to hunt for bugs in Microsoft AI, Microsoft Azure, Microsoft Identity, M365, and Microsoft Dynamics 365 and Power Platform.
Before diving in, first-time researchers and other curious parties should check out the MSRC Researcher Resource Center to learn how to submit security vulnerabilities to Microsoft.
Zero Day Quest is scheduled to be held in 2025 at Microsoft's campus in Redmond, Washington. Microsoft's top 10 ranked researchers from each of the 2024 Annual Azure, Dynamics, and Office Leaderboards will be able to attend the hacking event. Another 45 researchers will be accepted based on the quality of their submissions to the research challenge.
Those invited will get round-trip economy airfare, a five-night hotel stay, transportation between the airport and hotel, and the chance to take home a hefty bug bounty. With $4 million ready to dole out, Microsoft will award researchers who discover flaws in areas including:
- Critical and important severity Remote Code Execution
- Critical and important severity Elevation of Privilege
- High-impact scenarios in the Azure Bounty Program
- High-impact scenarios in the Microsoft Dynamics 365 and Power Platform Bounty Program
- High-impact scenarios in the M365 Bounty Program
Beyond the hefty bug bounties, Microsoft will also offer qualifying researchers a chance to work with its engineers and security experts.
"To advance AI security, starting today we will offer double AI bounty awards," Tom Gallagher, VP of Engineering at the Microsoft Security Response Center, said in the blog post. "We will also offer researchers direct access to the Microsoft AI engineers focused on developing secure AI solutions, and our AI Red Team. This unique opportunity will allow participants to sharpen their skills with cutting-edge tools and techniques and work with Microsoft to raise the bar for AI security across the ecosystem."
How to qualify
What will it take for you to qualify? The goal of the bounty program is to find significant security flaws that directly affect the security of Microsoft customers, so you will need to identify a vulnerability not previously reported to or known by Microsoft. The vulnerability must be rated Critical or Important in severity and must be reproducible.
Finally, you will need to provide clear steps, in writing or on video, showing Microsoft engineers how to reproduce and fix the bug.