This question is identical to the 2020 IJCAI Angry Birds question except for the date.
Angry Birds is a game that requires predicting the physics-based effects of projectiles (the flightless birds), each with different properties, on various structures. Playing well involves aiming the birds, making use of their varied properties, and exploiting explosions and other effects.
For several years, an Angry Birds AI competition has been held to evaluate and encourage ML systems that play Angry Birds. In this competition, entrants are provided with "a basic game playing software that includes a computer vision module, a trajectory planning module, and the game interface that works with the Chrome version of Angry Birds."
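At its core, the trajectory planning such a module performs is the classic projectile problem: given a launch speed and a target point, find the release angle. The sketch below is purely illustrative and assumes simple point-mass physics; the function name and parameters are hypothetical and are not part of the competition framework's actual API.

```python
import math

def launch_angles(v, target_x, target_y, g=9.81):
    """Return the (low, high) launch angles in radians that hit a target at
    (target_x, target_y) relative to the launch point, given launch speed v
    and gravitational acceleration g. Returns None if the target is out of range."""
    # Discriminant of the standard projectile equation for tan(theta)
    disc = v**4 - g * (g * target_x**2 + 2 * target_y * v**2)
    if disc < 0:
        return None  # target unreachable at this launch speed
    root = math.sqrt(disc)
    low = math.atan2(v**2 - root, g * target_x)   # flat, fast shot
    high = math.atan2(v**2 + root, g * target_x)  # lofted shot
    return low, high

# Example: two candidate angles for a target 30 m away and 5 m up, launched at 20 m/s
print(launch_angles(20.0, 30.0, 5.0))
```

A real agent would of course have to go well beyond this, accounting for the birds' special abilities and predicting how structures collapse on impact.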
Part of the competition is a Man vs Machine Challenge, pitting the best ML systems against highly skilled humans.
Thus far, no AI program has outscored the best humans. In the 2016 competition, the human and AI players competed on four levels over the course of 10 minutes. Although some AIs completed some of the levels, none completed all four (some humans did, albeit with difficulty). The best human players finished with roughly double the best AI scores. That is actually a weaker showing than a follow-up to the 2015 challenge, in which an AI reached about 2/3 of the best human scores.
Resolution
The question will resolve as ambiguous if neither the IJCAI Angry Birds AI competition nor any other similar competition runs in 2019. It will be left to the informed discretion of the admins what counts as a "similar competition" -- the gist of the question is whether Angry Birds will be verifiably solved.
Data
Data on competitions from 2012 to 2017 can be found here: https://aibirds.org/past-competitions.html. Highly limited data on 2018 is available here: https://aibirds.org/man-vs-machine-challenge/…