We no longer need to blindly accept the output of even the most sophisticated AI/ML platforms. In fact, no artifact, whether produced by humans or machines, should be considered valid knowledge unless it includes supporting data and analyses, complete with provenance, along with an explanation of its underlying plausibility.
There is no question that generative AI is here to stay, but its use in mission-critical work has some way to go before it can be trusted to operate without close oversight.
If we are not careful and proactive, the very concept and importance of knowledge may soon become blurred or lost.
Then there is the inevitable demand for more automation, from the flight planning and clearance process to the operation of the air vehicles themselves. No human or group of humans could possibly keep track of so many constantly changing variables.