Safeguarding Pilot Quality and Airman Development Through Robust FAA Testing and Oversight Systems

By Jason Blair

April 21, 2025

The U.S. aviation industry stands as a pillar of modern society, connecting economies, facilitating global trade, and ensuring the safe transport of millions of passengers annually. At the heart of this enterprise is the airman, an individual whose skill, judgment, risk management abilities, and professionalism determine the safety of every flight.

The Federal Aviation Administration (FAA) is tasked with regulating and overseeing pilot training and certification in the United States. The FAA is responsible for ensuring that these airmen are not merely credentialed, but also comprehensively developed to meet the demands of an increasingly complex airspace. As of April 2025, with pilot hiring expected to remain active, technological advancements accelerating, and training providers facing economic pressures, the FAA’s role in maintaining a testing system free from self-interest has never been more critical.

The stakes are high. The National Transportation Safety Board (NTSB) consistently identifies pilot error as a leading cause of general aviation accidents, with 2023 data attributing over 50% of fatal incidents to deficiencies in decision-making, risk management, or basic skills—areas that robust training and testing should address. Meanwhile, the demand for pilots tempts training organizations to expedite certification, often at the expense of depth and quality. In this environment, systems that allow self-interested parties—such as flight schools or operators with financial stakes in student success—to control testing without external oversight risk eroding the very standards that ensure safety. The FAA’s challenge is to resist this drift, maintaining a framework that evaluates pilots not just for minimum competency but for the airmanship needed to navigate real-world challenges.

A testing system dominated by self-interested providers, in which they assess their own students, creates an inherent conflict. The drive to generate certificated pilots can overpower the commitment to rigorous standards. This “fox watching the henhouse” scenario, as seen in some Part 141 self-examining programs or the proposed Organization Designation Authorization (ODA) for airmen testing, prioritizes throughput over quality. Schools profit from student completions while facing pressure to maintain high pass rates to attract new clients. A 2021 AOPA survey found that 28% of Certificated Flight Instructors (CFIs) at such programs felt compelled to approve unprepared students, highlighting how financial and business goals can compromise objectivity. Without external checks, the likely result is that we produce pilots who meet certificate requirements on paper but lack the resilience, adaptability, or proficiency to operate safely, undermining public trust and industry stability.

The FAA’s mission, therefore, must extend beyond facilitating certificate completion to fostering pilot development—a holistic process that builds technical mastery, situational awareness, and professional judgment. The multi-step certification process—from Private Pilot to Instrument Rating, Commercial Pilot, and Airline Transport Pilot (ATP)—is designed for this purpose, offering multiple testing events to sample and refine a pilot’s capabilities over time. Each stage, evaluated against established standards, must ensure that deficiencies are caught and addressed, preventing gaps from persisting into advanced roles. Yet, this system’s effectiveness hinges on impartiality and oversight, areas where self-interested testing falls short. A focus on development, rather than just checking boxes, prepares pilots for the complexities of modern aviation—glass cockpits, congested airspace, and automation failures—ensuring they are airmen, not just certificate holders.

Central to this mission are the Designated Pilot Examiner (DPE) program and FAA staff, twin pillars of quality control that counteract the biases of self-interested providers. Since the program’s inception, DPEs have served as independent evaluators, delegated by the FAA to conduct practical tests with no stake in a student’s training outcome. Their impartiality, rooted in extensive experience and adherence to established FAA standards, provides a critical external check, ensuring that pilot skills meet national standards, not institutional agendas. DPEs serve as a filter that self-examining programs, with pass rates nearing 85%, often lack. This rigor safeguards quality, catching weaknesses that internal evaluators might overlook due to pressure or familiarity. It is counterintuitive that the FAA’s own internal metrics trigger additional oversight for a DPE whose pass rate exceeds 90%, treating it as an outlier and a perceived indicator that the DPE may not be fully applying standards, while at the same time expecting a Part 141 program with self-examining authority to maintain that very pass rate. If that expectation is in place, internal evaluators at less quality-focused training providers will certainly adjust their testing efforts to ensure they meet those pass percentages.

Equally vital are FAA aviation safety inspectors (ASIs), who oversee training program systems, audit DPEs, and occasionally administer tests themselves. With fewer than 4,000 inspectors nationwide, and DPE oversight being just one part of the duties of some of those inspectors, their role as a direct federal presence ensures accountability across the system. Not all of them are qualified for, or engaged in, providing oversight of DPEs.

Historically, FAA staff conducted 5% of practical tests in 1989, setting a benchmark that deterred lax standards—a presence diminished today by resource constraints to less than 1% of testing efforts. Both DPEs and FAA staff embody the external oversight needed to prevent certification from becoming a rubber stamp, yet their effectiveness depends on robust support. Examiner shortages (down to 900 in 2020 but back up to 1,035 as of the end of 2024) and inspector vacancies (15% in 2023) threaten this safeguard, necessitating reforms like those proposed by the Designated Pilot Examiner Reforms Working Group (DPERWG)[1] and competitive compensation to attract talent.

It is time for the FAA to reinforce these mechanisms to ensure testing systems remain focused on standards and pilot development, not to abdicate them and allow training providers to simply self-certify their students or “train to proficiency.”

By leveraging DPEs and FAA staff as impartial guardians, and resisting the pull of self-interested delegation, the FAA can uphold its mandate, delivering a pilot workforce equipped for the skies of today and tomorrow.

A History of the FAA Designated Pilot Examiner Program

The FAA DPE program stands as a cornerstone of the United States’ aviation certification system, ensuring that pilots meet standards for each certificate or rating they complete. By delegating the authority to conduct practical tests (commonly known as “checkrides”) to designated private individuals, the FAA has leveraged the expertise of the aviation community to maintain safety while managing its limited resources.

Origins: The Early Days of Aviation Oversight

The roots of the DPE program can be traced to the rapid growth of aviation in the United States during the early 20th century. Following World War I, the proliferation of aircraft and pilots necessitated a structured approach to regulation. The Air Commerce Act of 1926 marked the federal government’s first significant step into aviation oversight, tasking the Department of Commerce with licensing pilots and certifying aircraft. At this time, the Aeronautics Branch (predecessor to the FAA) employed government inspectors to test and certify pilots. However, as aviation expanded, spurred by advancements in technology and the rise of commercial air travel, the demand for pilot certification quickly outpaced the capacity of federal personnel.

By the late 1930s, the Civil Aeronautics Authority (CAA), established under the Civil Aeronautics Act of 1938, recognized the need for a scalable solution. In 1939, the CAA introduced the concept of designating experienced pilots from the private sector to conduct flight tests on behalf of the government. This marked the formal birth of the DPE program. These early examiners were typically veteran aviators, often with military or commercial backgrounds, who possessed the expertise to evaluate applicants against the CAA’s emerging standards. The program allowed the agency to focus its resources on oversight and policy while outsourcing the labor-intensive task of testing to trusted individuals.

Post-World War II Expansion

World War II catalyzed a dramatic increase in aviation activity, training thousands of pilots and advancing aircraft technology. After the war, many military aviators transitioned to civilian roles, swelling the ranks of pilot applicants. The CAA, reorganized into the Civil Aeronautics Administration under the Department of Commerce, faced an unprecedented certification workload. The DPE program became a critical tool for managing this surge. By 1947, the DPE system was firmly entrenched as a delegated authority model, and it remained so when the CAA was later absorbed in 1958 into the newly created Federal Aviation Agency (renamed the Federal Aviation Administration in 1967).

During this period, the program’s structure began to formalize. DPEs were required to hold appropriate pilot certificates and ratings, demonstrate significant flight experience, and undergo CAA approval. The tests they administered—consisting of oral examinations and in-flight evaluations—aligned with practical test standards that evolved from rudimentary checklists to more detailed guidelines. The post-war era also saw the introduction of specialized designations, such as examiners for private, commercial, and flight instructor certificates, reflecting the growing complexity of aviation.

The FAA Era: Standardization and Growth

The Federal Aviation Act, signed into law by President Dwight D. Eisenhower on August 23, 1958, established the FAA and marked a new chapter for the DPE program. The agency inherited a robust but decentralized system of examiners, and its early efforts focused on standardization. The FAA maintains Order 8000.95 (originally issued as earlier directives) to codify DPE qualifications, responsibilities, and oversight procedures. This includes minimum flight hour requirements and mandatory training to ensure consistency across examiners.

By the 1960s and 1970s, the DPE program expanded alongside the aviation industry. The rise of general aviation, fueled by affordable small aircraft and a growing middle class, increased demand for private pilot certificates. Simultaneously, the jet age brought new challenges, requiring examiners qualified to test pilots on complex, high-performance aircraft. The FAA responded by creating additional DPE categories, such as Airline Transport Pilot (ATP) examiners and those specializing in multi-engine or instrument ratings. The program’s flexibility allowed it to adapt to technological and regulatory shifts, such as the introduction of simulator-based testing in the 1980s.

Statistical data from this period underscores the program’s scale. In 1989, approximately 1,600 DPEs conducted roughly 105,000 practical tests, accounting for approximately 95% of all practical exams administered that year, while FAA inspectors handled the remaining 5%. This reliance on DPEs highlighted their role as the backbone of the certification process, freeing FAA staff to focus on air carrier oversight and safety investigations.

Challenges and Oversight: The 1990s and Beyond

As the DPE program grew, so did scrutiny of its operations. In 1991, the National Transportation Safety Board (NTSB) issued recommendations to the FAA following concerns about examiner consistency and oversight. The NTSB pointed to instances where DPEs applied standards unevenly or failed to detect deficiencies in applicants, potentially compromising safety. In response, the FAA bolstered its monitoring, requiring annual recertification and periodic observation of DPEs by Flight Standards District Office (FSDO) inspectors. The agency also refined the Practical Test Standards (PTS), providing clearer benchmarks for pass/fail decisions.

The 1990s also exposed economic tensions within the program. Unlike FAA inspectors, who conducted tests at no cost to applicants, DPEs operated as private contractors, setting their own fees based on market demand. Costs for a private pilot practical test ranged from $200 to $500, while advanced ratings could exceed $1,000. This disparity led some pilots to seek FAA inspectors, straining agency resources and prompting debates about affordability and access. The FAA maintained that pricing was a market function, not a regulatory matter, leaving DPEs to balance profitability with service to the aviation community.

The 21st Century: Modernization and Strain

The turn of the millennium brought new dynamics to the DPE program. The terrorist attacks of September 11, 2001, shifted FAA priorities toward security, but the need for pilot certification persisted. The introduction of the Airman Certification Standards (ACS) in 2016 replaced the Practical Test Standards (PTS) for many of the main airman certification events (a process that is still being implemented for less common certifications), integrating risk management and scenario-based testing to better assess pilot decision-making. DPEs adapted to these changes to implement the ACS effectively.

However, the early 21st century also revealed a growing shortage of examiners. By 2014, the number of DPEs had dropped to fewer than 836, conducting just 74,849 tests, a significant decline from 1989 levels.[2] This reduction coincided with an aging examiner population, many of whom retired, and stricter FAA oversight that deterred new applicants. Meanwhile, a pilot shortage, driven by airline demand during the post-2008 recession recovery, increased training activity, creating a bottleneck in practical test availability. Wait times stretched to months, and fees rose, with some DPEs charging $800 or more per test. Testing fees now frequently exceed $1,000, and sometimes $2,000 for certain tests, and wait times to schedule a test approach a month or more in high-density training areas.

The FAA responded with some limited policy adjustments. In 2018, at the urging of the Flight School Association of North America (FSANA), the agency lifted geographic restrictions, allowing DPEs to operate nationwide, and raised the daily test limit from two to three. In recent years, however, the Designee Management System (DMS) has further limited flexibility in DPE scheduling: it requires more than 24 hours’ notice for any scheduled event, limits the ability to switch out testing activities at the last minute, and restricts what tests can be conducted in a given period (recent changes to FAA Order 8000.95, for example, do not allow a DPE to conduct more than one initial CFI related testing activity within any 24-hour period). It has also added bureaucratic responsibility to the system with little additional data collection being provided to the industry or oversight agencies as a result of its implementation.

Legislative and Contemporary Developments

The FAA Reauthorization Act of 2018 mandated a review of the DPE program, leading to the formation of the Designated Pilot Examiner Reforms Working Group (DPERWG). In 2021, the DPERWG delivered recommendations, including improved locator systems, national surveys of examiner activity, and incentives to recruit new DPEs. The FAA’s 2022 response acknowledged progress—such as an updated online designee database—but implementation has lagged, frustrating training industry stakeholders.

The 2024 FAA Reauthorization Act, signed into law on May 16, 2024, introduced additional reforms. It also set in motion the FAA’s implementation of a national DPE oversight division for centralized management. As of April 2025, the FAA reports incremental increases in DPE numbers, though challenges remain, with wait times and costs still a concern in high-demand regions.

The DPE Program Today: A Balancing Act

As of April 5, 2025, the DPE program remains a vital yet evolving component of FAA operations. Approximately 1,035 DPEs serve nationwide, conducting the vast majority of practical tests. Their qualifications, rooted in extensive flight experience, instructor credentials, and ongoing training, ensure that pilots meet modern safety standards. Yet, the program faces a delicate balance: maintaining rigorous oversight without deterring qualified candidates, addressing shortages without compromising quality, and ensuring accessibility in an era of rising costs. A major challenge has been the increase in testing volume over the past few years: from a low point in 2013, when around 860 DPEs conducted roughly 51,000 tests, to 2023, when roughly 969 DPEs conducted 140,954 tests. These figures show that demand for testing has risen sharply as airline hiring has pulled more people into the training system, while the examiner pool has grown only modestly.
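A rough per-examiner workload calculation, using the approximate figures above, illustrates how sharply the burden on individual examiners has grown:

\[
\frac{51{,}000 \text{ tests}}{860 \text{ DPEs}} \approx 59 \text{ tests per DPE in 2013}, \qquad \frac{140{,}954 \text{ tests}}{969 \text{ DPEs}} \approx 145 \text{ tests per DPE in 2023}
\]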

The DPE system’s delegated authority model, now over 85 years old, reflects a pragmatic partnership between government and industry. Its success is evident in the millions of pilots certified since 1939, contributing to the U.S.’s reputation for aviation safety. However, its future hinges on addressing persistent shortages, leveraging technology (e.g., simulator testing), and responding to stakeholder feedback. It is critical that the FAA recruit qualified individuals to be DPEs and have enough FAA staff to manage and support the program to ensure continued provision of airmen testing services in our system to meet demand requirements while also ensuring safe airmen production.

The Role of DPEs as Impartial Guardians of Standards

A key counterbalance to the risks of self-examining authority lies in the FAA’s Designated Pilot Examiner (DPE) program, which is explicitly designed to provide an impartial measure of pilot training standards. DPEs, private individuals delegated by the FAA to conduct practical tests, serve as independent evaluators free from the vested interests that training organizations inherently possess. Unlike instructors or check airmen within a self-examining school (who may be motivated by the success of their students, the reputation of their program, or the financial health of their employer), DPEs have no stake in the training process itself. Their sole mandate is to assess whether an applicant meets the FAA’s applicable Airman Certification Standards (ACS) or Practical Test Standards (PTS), ensuring that the quality of training translates into competent, safe pilots.

Other industries understand the value of an impartial validation of training and the skills developed during it. For example, in the medical world, where lives and safety are at stake, medical schools are not entrusted with issuing medical licensure. An external testing process, passing the medical boards, is used to validate the training after it has been delivered.

Would you trust doctors to be trained by their medical schools without ever passing a medical board assessment? The same concept applies here: training is validated through an external testing effort.

This impartiality is foundational to the DPE’s role as a quality check in the pilot training ecosystem. When a student completes a course at a flight school, their instructor’s evaluation may be clouded by familiarity, pressure to maintain pass rates, or the desire to advance the student to the next revenue-generating phase. In contrast, a DPE enters the practical test with a fresh perspective, adhering to standardized criteria that prioritize safety over institutional loyalty. For example, the ACS requires demonstration of skills like emergency procedures and risk management, areas where an external examiner can objectively identify deficiencies that a self-interested evaluator might overlook. Data from the FAA’s 2023 annual report underscores this rigor: DPE-administered practical tests had a first-time pass rate of 68% for private pilots, compared to 85% for self-examining Part 141 programs, suggesting a stricter threshold that filters out unprepared candidates.

By serving as an external quality gate, DPEs mitigate the dilemma inherent in self-examining systems. They disrupt the closed loop where trainers judge their own work, introducing accountability that aligns with the FAA’s safety mission rather than a school’s business goals. This is particularly vital in an era of heightened demand for pilots, where the temptation to expedite certification can compromise standards. A 2022 AOPA study found that students tested by DPEs reported greater emphasis on real-world scenarios during practical tests, reflecting the examiners’ focus on practical airmanship over rote performance, a distinction often diluted in self-certifying environments.

However, the DPE system’s effectiveness depends on its reach. In self-examining programs, where internal evaluators replace DPEs, this impartial check is bypassed, leaving quality to the discretion of those with conflicting interests. Reinstating or expanding DPE involvement—such as requiring random external practical tests for self-examining graduates—could restore this safeguard, ensuring that pilot training outcomes are measured by disinterested experts rather than self-interested insiders. As aviation evolves, the DPE’s role as a neutral arbiter remains a critical bulwark against the erosion of standards, preserving the integrity of the certification process.

Why Self-Examining Authority without Outside Quality Control Is a Danger to Pilot Quality

The aviation industry’s commitment to safety hinges on the quality of its pilots, a standard upheld through rigorous training and evaluation processes overseen by the Federal Aviation Administration (FAA). Within this framework, certain flight training organizations, particularly those operating under 14 CFR Part 141 or as large Part 61 schools, have been granted self-examining authority, allowing them to certify their own students without requiring an external Designated Pilot Examiner (DPE) or FAA inspector to administer practical tests. While this delegation aims to streamline certification and reduce administrative burdens, it introduces significant risks when not paired with robust outside quality control. And that control has been abdicated or diminished in most FAA offices in recent years, as FAA staffing levels have struggled to keep up and to attract candidates capable of applying effective oversight to the self-examining efforts of many training providers.

Self-examining authority (SEA), absent well-conducted, independent oversight, jeopardizes pilot quality by creating a self-interested testing scenario. Financial incentives and the drive to generate certificated pilots overpower the imperative of maintaining high standards in a system where the demand to “produce more pilots to fill the seats” is heavily felt. As of April 2025, with aviation facing workforce shortages and increasing complexity, these dangers demand urgent attention.

Yet a current effort, a “141 modernization effort,”[3] is taking place that aims to reduce oversight, allow more self-examining, and even push for “train to proficiency” systems in which outside testing would potentially not be required at all.

The FAA’s “P3 Workgroup” even noted in some recent plans that “The P3 WG discussed whether greater training efficiencies could be achieved by recommending PFTOs [professional flight training organizations] train pilots to a purely competency-based standard, with either no fixed minimum training and experience requirements or reduced requirements.”[4]

Their stated goals have been to train “to proficiency,” and some of these efforts have indicated a desire to develop “additional pathways” that might circumvent the FAA’s (and congressionally mandated, signed-into-law) requirements for ATP or R-ATP minimums (1,500 hours, or 1,250 or 1,000 hours depending on the training path, if a collegiate aviation program was utilized). There is an obvious push for “partner pathway institutions,” such as airline-owned, academy-style training programs, to be able to leverage lower-hour options than the 1,500 hours of experience required under current regulations.

The FAA, in its Air Carrier Training Aviation Rulemaking Committee[5] (ACT ARC), proposed the establishment of a new “Professional Flight Training Organization (PFTO) Certification” for schools satisfying certain standards. This workgroup included representatives from pilot, flight attendant, and dispatcher training stakeholders across Part 121 air carriers, Part 135 air carriers and operators, and Part 142 training centers. Notably missing from the ACT ARC was anyone from the ab initio flight training segment of the industry. A goal of developing a large-scale, operationally limited pathway seems to be at odds with our traditionally broader, multi-tiered approach to airmen development; it treats airmen as widgets to be produced and delivered to airlines for service rather than as aviators to be developed.

One of the leads of the P3 WG is actually Republic Airlines, an airline whose CEO led in fighting unions and safety regulations, specifically in a failed bid with the FAA to request “reduced minimums” recognition for training operations at Lift Academy, a training provider owned by Republic Airlines. This request was countered by comments from a number of organizations, including the Flight School Association of North America (FSANA), in 2024.

These efforts are reducing the safety systems that we have established in our airmen training process. While there are challenges to providing enough testing capacity, challenges with some levels of standardization, and potential pricing questions in some cases, those challenges should not be the motivator to remove the independent testing processes designed to guarantee quality sampling in our training process for our future generations of pilots.

The Mechanics of Self-Examining Authority

Self-examining authority emerged as a practical solution to the FAA’s resource constraints. Under Part 141, approved flight schools with a proven track record can designate their own check instructors to conduct internal certification events, generally based on the same criteria, the ACS or PTS, for certificates and ratings such as Private Pilot or Instrument Rating. This privilege, granted after rigorous FAA vetting, allows schools to bypass the traditional DPE process, reducing wait times and costs for students. Similarly, large Part 61 operations, often tied to collegiate aviation or academy-style training programs, may secure exemptions or agreements to self-certify, aligning with the FAA’s goal of fostering efficient training pipelines.

On paper, this system offers benefits. In 2023, the FAA reported that Part 141 schools conducted over 30,000 practical tests internally, accounting for nearly 40% of all general aviation certifications that year. By delegating authority, the FAA can focus its limited staff—approximately 4,000 inspectors nationwide—on air carrier oversight and safety investigations, while schools handle the volume of initial pilot training. Proponents argue that self-examining authority incentivizes schools to maintain high standards, as their FAA approval depends on consistent performance and periodic audits.

However, the absence of routine, independent quality control exposes a critical flaw: the entity responsible for training is also the arbiter of success, creating an inherent conflict of interest. Without external checks, the system relies on the integrity of the school and its instructors—a reliance that crumbles under the pressures of business interests and market demands.

The Fox Watching the Henhouse

The phrase “the fox watching the henhouse” aptly describes the peril of self-examining authority without oversight. In this analogy, the fox (the training organization) has a vested interest in the hens (the students) passing their practical tests, not because of a commitment to quality, but because its survival depends on it. When schools self-certify, the same instructors who train students evaluate their readiness, blurring the line between educator and examiner. This dual role undermines objectivity, as instructors may feel pressure, implicit or explicit, to pass students they have mentored, even if deficiencies remain. While many of these programs have “check instructors” who conduct final stage checks or certificate event rides, in our current system of frequent instructor turnover, few of these instructors have much experience either. We have seen “check instructors” with 600-1,000 hours who also turn over rapidly as they are hired by airlines, either at restricted ATP minimums of around 1,000 total flight hours or at the more traditional ATP requirement of 1,500 total hours.

The concern about testing students whom an individual has personally trained is evident in the FAA’s own policy, FAA Order 8000.95D, which states that “DPEs…must not…Test applicants trained by the examiner” [8000.95D CH 2, h (18)], except under some very specific, FAA-approved conditions.[6] Why would we allow training providers to do this if we do not allow DPEs to do so?

Historical parallels reinforce the concern here. In the 1980s, the FAA briefly experimented with allowing air carriers to self-certify maintenance programs under reduced oversight, only to reverse course after incidents linked to lax standards exposed the risks. Similarly, the National Transportation Safety Board (NTSB) has flagged self-regulation in training as a potential weak link. A 2018 NTSB report[7] on a fatal Part 135 crash cited inadequate pilot training at a self-examining academy, noting that the pilot’s practical test records showed inconsistencies overlooked by internal evaluators. While not conclusive proof of systemic failure, such cases highlight the danger of entrusting quality control to those with a stake in the outcome. We need to learn from this experience and not allow the same mistake to be made in our pilot certification efforts.

The lack of external scrutiny exacerbates this issue. Unlike DPEs, who are subject to annual FAA observation and must adhere to standards, self-examining schools face audits only sporadically—sometimes years apart. A 2022 Inspector General report[8] found that 15% of Part 141 schools with self-examining authority had not been inspected in over three years, citing staffing shortages and prioritization of airline oversight. This gap allows substandard practices to persist undetected, eroding the firewall between training and evaluation.

The Motivation to Push Pilots Through

A primary driver of this danger is the overwhelming motivation to “get people through and on to the next generation of clients.” Flight training is a competitive, profit-driven industry, with schools vying for students in a market where reputation and throughput dictate success. The pilot shortage, acute since the 2010s and intensified by post-COVID recovery, has amplified this pressure. The Regional Airline Association’s 2024[9] report projected a need for 5,000 new pilots annually to meet demand, pushing schools to accelerate training and certification. For self-examining programs, each certificated pilot represents not just a success story but a revenue stream—tuition fees often exceed $60,000 for a zero-to-commercial pipeline—and a marketing tool to attract the next cohort. Having a lot of students “fail” is bad marketing.

This dynamic creates a perverse incentive: the faster a school graduates pilots, the more students it can enroll, and the greater its financial stability. Instructors, often salaried or paid per student, may face subtle or overt pressure to pass borderline candidates rather than require additional training, which delays completion and risks losing clients to competitors. A 2021 AOPA survey of flight instructors found that 28% reported feeling pressured to approve students for practical tests before they were fully prepared, with many citing “business needs” as the underlying factor. In a self-examining environment, where no external examiner serves as a gatekeeper, this pressure can translate into lowering of standards.

The consequences are stark. Pilots rushed through training may lack the depth of skill or judgment needed for real-world challenges. We even see some schools marketing that they can get a pilot from “zero to hero” (student pilot to commercial pilot and CFI certificates) in less than six months.

The NTSB’s 2023 Aviation Accident Database identified “inadequate training” as a contributing factor in 22% of general aviation fatal accidents, with several cases linked to graduates of high-volume, self-certifying programs. While correlation does not equal causation, these trends suggest that the drive to generate pilots can outstrip the commitment to quality, particularly when oversight is absent.

If we allow this to be how we certify our next generation of pilots, these pilots are going to be in our commercial air service community for a full generation. Failing to address failures in standards now, or even worse, reducing those standards, is going to haunt us for many years to come.

Going the Wrong Way: The FAA’s Organization Designation Authorization (ODA) and Pilot Testing: Draft Order 8100.15C

In its ongoing effort to modernize and streamline aviation oversight, the Federal Aviation Administration (FAA) has proposed expanding the Organization Designation Authorization (ODA) program to include pilot testing through Draft Order 8100.15C[10], published for public review in July 2024. This initiative builds on the FAA’s long-standing practice of delegating certain certification functions to qualified private entities, a concept rooted in Title 14 CFR Part 183, Subpart D. Historically applied to aircraft design and airworthiness certifications, the ODA framework is now poised to encompass airmen certification testing with the introduction of the Airmen Certification ODA (AC ODA) type. This development, detailed in Draft Order 8100.15C, aims to alleviate the FAA’s resource constraints while empowering approved organizations (such as Part 141 flight schools, Part 135 operators, or Part 142 training centers) to conduct practical tests and related certification tasks with less direct oversight of each testing activity.

The AC ODA model allows eligible organizations, which must hold an appropriate air carrier or air agency certificate under 14 CFR Parts 121, 135, 141, 142, 145, or 147, to assume responsibilities traditionally reserved for FAA inspectors or Designated Pilot Examiners (DPEs). Under this system, an ODA holder’s designated unit members—trained and approved personnel within the organization—can administer practical tests for certificates and ratings, such as Private Pilot or Instrument Rating, without requiring an external evaluator for each test. Draft Order 8100.15C outlines a systems-based oversight approach, shifting the FAA’s focus from micromanaging individual testing events to evaluating the organization’s overall processes, procedures, and compliance with FAA standards. This delegation is not mandatory; organizations can opt into the AC ODA framework, supplementing rather than replacing existing mechanisms like DPEs or Part 141 examining authority.

Additionally, this offering would not preclude an approved “ODA” from providing testing services to the broader public, unlike a Part 141 program with examining authority, which is only authorized to certify individuals who have completed training through its approved training course outline. Hypothetically, a couple of large “ODA” providers could essentially privatize the provision of testing throughout our national airspace system, offering testing to anyone who trained in a Part 141 program, theirs or someone else’s, or who trained under the 14 CFR Part 61 training requirements.

The intent behind this proposed expansion, as articulated in the draft order, is to enhance efficiency amid growing demand for pilot certifications—a response to the persistent pilot shortage and the FAA’s limited staffing, which includes fewer than 4,000 inspectors nationwide as of 2025. By leveraging the expertise of industry partners, the FAA aims to scale testing capacity while maintaining safety standards. For example, a Part 141 school with AC ODA status could certify its own students and potentially offer testing services to the public, reducing reliance on DPE availability, which has been strained by shortages and regional disparities. The draft order incorporates lessons from the Aircraft Certification, Safety, and Accountability Act of 2020, emphasizing protections against interference with ODA unit members and ensuring open communication with FAA advisors to safeguard integrity.

But what is really happening here is simply abdication of the oversight process in testing due to the FAA’s inability to meet the workload demand. Without adequate resources to do it “the old way,” the FAA is seeking a new process to shift the burden of testing and its oversight. As it does, some industry participants are looking to leverage the open door and self-certify more effectively to keep “airmen production” moving for business purposes.

This shift raises concerns about reduced oversight of individual testing activities. Unlike DPEs, who undergo annual FAA observation and operate under strict, test-by-test scrutiny, AC ODA unit members function within an organization’s internal framework, subject to periodic FAA audits rather than constant monitoring. The ODAs would be responsible internally for managing and overseeing the individuals they hire and the testing quality they provide. This further removes the FAA from the actual testing quality oversight.

We have already seen what happens when ODA goes bad in a very visible part of our aviation system: the self-certification of aircraft by Boeing. Articles like “Boeing’s Self-Inspection Program Is Deeply Flawed” describe how “the government has allowed Boeing to conduct its own inspections related to many manufacturing and safety issues — and during that time, government reports, experts, and whistleblowers have issued more than a dozen warnings that the self-inspection program has led to serious production issues and contributed to two fatal crashes.” We should be very reluctant to go down the same path with our pilot development.[11]

While Draft Order 8100.15C mandates an approved procedures manual and a systems-based oversight model, the diminished presence of real-time external checks could allow inconsistencies or leniency to creep in and permeate longer, particularly if organizational pressures prioritize throughput over quality. As the FAA refines this approach, the balance between delegation and accountability will be critical to ensuring that pilot testing under ODA upholds the rigorous standards aviation demands.

The FAA’s “Section 103 Organization Designation Authorizations (ODA) for Transport Airplanes Expert Panel Review [Final] Report” notes that:[12]

“The reliance on the limited experience and expertise is troubling when recruitment is difficult for the entire aviation industry. It also lessens the opportunity for knowledge to pass from one generation to the next when the more advanced experts are required to perform more and more delegated functions.

With the diminishing senior engineering resources…less time may be available for the mentoring and training of less experienced engineers, which may lead to lower first pass quality on certification plans and reports, test parameters, and other documentation used to support showings of compliance.”

If we substitute “flight instructors” for “engineers,” we can see how a highly inexperienced cadre of CFIs, who provide the daily base of flight instruction in our training system, could easily fall into the same safety adherence traps that Boeing experienced. We should learn from this experience and not go down the same path with our airman evaluation processes.

Certificate Completion over Quality Control

Compounding this issue is the way certificate completion aligns with financial and business goals, often at the expense of rigorous quality control. For training organizations, a certificated pilot is a tangible deliverable, a metric of success that satisfies students, justifies costs, and bolsters accreditation. In contrast, quality control (ensuring that each pilot not only meets minimum standards but excels in airmanship) is intangible, time-consuming, and costly. Additional training hours, remedial instruction, or failing a student who requires more time all cut into profit margins and disrupt the pipeline.

This tension is evident in the economics of flight training. A private pilot certificate, requiring a minimum of 40 hours under Part 61 or 35 under Part 141, often costs $15,000 – $20,000 in 2025, with commercial programs exceeding $60,000. Schools operating on thin margins—facing rising fuel costs, aircraft maintenance, and instructor salaries—rely on high student turnover to remain viable. Failing a student or extending their training by 10-20 hours (a common need, per AOPA’s 2022 Flight Training Experience Survey) risks alienating customers who expect a predictable timeline and budget. In a self-examining system, where the school controls the outcome, the temptation to prioritize completion over competence grows irresistible.

Moreover, the FAA’s own metrics reinforce this bias. Part 141 schools are evaluated partly on pass rates, with high success often seen as evidence of effective training. A school boasting a 90% first-time pass rate can market itself as superior, even if that rate reflects leniency rather than excellence. Without external quality control, there’s no mechanism to verify whether passing students truly meet ACS standards or simply satisfy internal benchmarks tailored to business goals.

The Safety and Systemic Impact

The dangers of this model extend beyond individual pilots to the broader aviation ecosystem. Pilots who emerge from lax self-examining programs may enter professional roles—charter operations, regional airlines, or flight instruction—with gaps in skill or judgment. The FAA’s 2024 Airline Safety Report noted a 15% increase in training failures among new-hire pilots at regional carriers, with some traced to accelerated programs lacking oversight. These deficiencies burden employers with additional training costs and, in worst-case scenarios, contribute to incidents that erode public trust in aviation safety.

It is somewhat anecdotal, but many DPEs who have been interviewed report that applicants whose initial private or instrument certification was conducted under internal examining authority regularly demonstrate degraded knowledge and skills at higher-level certification events. For example, a pilot who received very rote-level testing from a “check airman” in a Part 141 program with self-examining authority may have training and knowledge deficiencies that “get through” there but are found when that applicant tests for commercial or flight instructor certificates. These missing training blocks at the base level become fundamental gaps that follow the pilot all the way through their career. The U.S. aviation training system is a stepping-stone, building-block approach that develops basic and then advanced airmanship on top of those initial blocks. Undermining it with sub-par quality sampling of initial training carries through our entire system.

Systemic risks also arise from uneven standards. While reputable schools may uphold integrity, less scrupulous operators can exploit self-examining authority to churn out minimally qualified pilots, creating disparities that undermine the FAA’s uniform certification framework. Over time, this erodes the credibility of certificates as a reliable indicator of competence—a cornerstone of aviation regulation since 1926.

The Need for More, NOT Less Oversight

Addressing these risks requires improving, not removing robust outside quality control. Options include mandating periodic DPE or FAA practical tests for a sample of self-examining graduates, increasing audit frequency, and tying authority to objective safety outcomes rather than pass rates. The FAA’s 2024 Reauthorization Act[13], which expanded oversight of training programs, offers a starting point, but implementation remains slow. Until external checks balance internal incentives, the fox will continue guarding the henhouse, and quality will remain at risk.

Self-examining authority without outside quality control is a danger to pilot quality because it places training organizations in an untenable position: tasked with both producing pilots and policing their own standards. The “fox watching the henhouse” dynamic, fueled by the motivation to push students through and the prioritization of certificate completion over quality, creates a system where financial and business goals too often trump safety and competence. As of April 2025, with aviation’s future hinging on a skilled, reliable pilot workforce, the industry cannot afford to let these risks fester. Independent oversight is not just a safeguard—it’s a necessity to ensure that pilot training serves the skies, not just the bottom line.

Why Training Must Focus on Airmen Development, Not Just Certificate Completion

The aviation industry has long relied on a structured certification process to ensure that pilots possess the skills and knowledge necessary to operate aircraft safely. This has made the FAA pilot training system the best in the world by most accounts. But we need to keep it that way if we are going to remain the best and continue to improve safety for the future.

A key factor that must be maintained in our training and testing system is that the focus of training should be on airmen development, on building knowledge, skills, and risk management abilities, not simply on the production of airmen certification events.

While this system has produced generations of competent aviators, an overemphasis, especially in recent years, on certificate completion and rapid generation of new pilots risks undermining the broader goal of airmen development. Pilot training must focus on overall airmen development—encompassing technical proficiency, decision-making, risk management, and professionalism—over a myopic focus on checking boxes and accelerating throughput. As of April 2025, with aviation facing evolving challenges such as technological complexity and workforce shortages, this shift in focus is more critical than ever.

During a roundtable at the 2023 FAA Aviation Safety Summit[14], the President and CEO of the Regional Airline Association (RAA) said, in a paraphrased point, that “We just need to find enough pilots to fly the flights to meet the demand for our airlines….”, while still claiming that quality in training was a priority.

This mindset typifies current, dangerous approaches to pilot sourcing: “Find us warm bodies to keep the aircraft moving,” or “Keep the certification machine grinding out enough pilots.” The alternative is to fly the flights for which we have well-qualified pilots, and not to fly flights that would rely on marginal pilot skills just to get the flight done.

We saw in the aftermath of the Colgan Air Flight 3407 crash on February 12, 2009, that “training to proficiency” a captain who eventually met standards was listed as a contributing factor in the failure to properly handle the conditions that resulted in a tragic crash. The NTSB determined[15] that the captain of the accident flight inappropriately responded to an impending aerodynamic stall by overriding the stick shaker and pulling aft on the control column. Fundamental flight skills were missing.

A similar track record of marginal pilot training performance was noted in the NTSB report for the February 2019 Atlas Air Inc. Flight 3591 accident.[16] The report notes, “Safety issues identified…[including] flight crew performance, Atlas’ evaluation of the first officer, industry pilot hiring process deficiencies…”

In a more recent event on July 29, 2023, fortunately non-fatal, a United Airlines Boeing 767 sustained significant damage when the First Officer (FO) at the controls made a “hard landing.” NTSB report DCA23LA384[17] noted that the first officer was at the controls when the incident happened. The report states that:

“After a stabilized approach, the main landing gear of United Airlines flight 702 touched down and the nosewheel contacted the runway harder than expected. The airplane then bounced, and the first officer (the pilot flying) reacted by pulling the control yoke aft to keep the nosewheel from impacting the runway a second time. The first officer applied the thrust reversers, the speed brakes deployed, and the nosewheel bounced a second time.

Subsequently, the nosewheel impacted the runway a third time and, the airplane began to decelerate normally. The abnormal nosewheel impacts with the runway resulted in substantial damage to the fuselage.

Although the first officer stated he held aft pressure on the control column during the initial touchdown, flight data showed that he also made nose-down column inputs during the landing sequence. These nose-down inputs contributed to the nosewheel abnormally impacting the runway.”

Crew performance is mentioned in this report again, highlighting the potential that crew member (the FO) deficiencies were present in training prior to this event.

The report goes on to state that:

“The first officer’s training records showed inconsistencies in airplane handling as recent as a few months before the accident. He received an unsatisfactory performance rating and, upon re-evaluation, a satisfactory rating with a condition to recheck after 90 days (instead of 9 months). This was due to, among other things, marginal performance with landings.”

Yet, somehow, this crew member was pushed into service and at the time of the incident, “The first officer had accumulated [only] about 129 flight hours in the accident airplane make and model.”

It would additionally be very interesting to know whether this first officer had previously failed training events at United, at other airlines, or in ab initio training. Is there a track record of challenged training success for this airman?

A pilot is trained in a multi-step process, with multiple testing events along the way. The point of this process is to see whether an airman continues to demonstrate failures in proficiency development.

Of interest here is that we may not even know whether a “failure” is occurring if the airman is “allowed to resign” to avoid a failure during the training process. This can also happen at an airline when certain proficiency gates are not met. We have seen airmen “self-select” out of positions for competency deficiencies that are then not a part of their airman record. A future airline may have no insight into these during the hiring process.

I am sure we can find many examples of sub-par pilot performance in training. Just throwing more training at people until they finally get it right once and then pushing them through isn’t a good approach.

Pilot quality matters. A continued demonstration of failure to meet standards means something is missing in the quality of the pilot’s development.

I give strong credit to the United check airman who completed the first officer’s first evaluation and did not give an approval. I also give strong credit to the check airman who was unwilling to retest that individual based on what they saw. I am disappointed that United found another check airman who pushed that individual through even though it seems from the NTSB report that concerns still remained about the pilot’s performance in landing operations.

The result was a heavily damaged aircraft. Fortunately, no one was injured or killed in this event.

In the recent 2025 Endeavor crash in Toronto, we saw a First Officer (FO) who met R-ATP (Restricted ATP) minimums for the flight (less than 1,500 hours of flight time) paired with a Captain who was an APD/LCA (Aircrew Program Designee/Line Check Airman) and had been with regional airlines for a number of years. While at the outset this may seem like a good thing, that “senior captain” was primarily employed in simulator instruction and was on that flight to keep his “in airplane” currency.

The Transportation Safety Board of Canada (TSB) report[18] shows that the captain had a total flight time of 3,570 hours. He had originally been hired by a regional airline back in 2007. If we spread his time, assuming he was hired in 2007 at the minimum of 1,500 flight hours, this pilot flew only an additional 2,070 hours in an airplane over 18 years, an average of merely 115 hours per year. Most full-time professional airline pilots fly in the range of 700-900 hours per year. In reality, few pilots were being hired by regional airlines at the minimums in 2007, so the captain’s yearly in-aircraft experience is likely even lower. This is an example of the regional airlines’ “fill the seat with a warm body” approach to getting the flights done.
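The back-of-the-envelope arithmetic behind that 115-hour figure, using the stated assumption that the captain was hired in 2007 with the 1,500-hour minimum, is simply:

\[
3{,}570 \text{ hr total} - 1{,}500 \text{ hr at hire} = 2{,}070 \text{ hr}, \qquad \frac{2{,}070 \text{ hr}}{18 \text{ years}} \approx 115 \text{ hr per year}
\]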

The Certificate-Centric Paradigm: Strengths and Limitations

The FAA’s certification framework, rooted in the Air Commerce Act of 1926 and refined over decades, provides a clear pathway for pilot training. Each certificate and rating—whether Private Pilot, Instrument Rating, or Airline Transport Pilot (ATP)—comes with specific aeronautical experience requirements, knowledge tests, and practical tests. It is a stepping stone, building-block approach to airmen development and testing along the process. This structure ensures standardization, allowing regulators, employers, and the public to trust that a certificated pilot meets minimum competency standards.

This certificate-centric approach has undeniable strengths. It offers measurable benchmarks, facilitates scalability in training programs, and aligns with the aviation industry’s need to produce pilots efficiently, particularly during periods of high demand, such as the post-World War II boom or the airline hiring surge of the 2010s. However, its limitations become apparent when the pursuit of certificates overshadows the development of well-rounded airmen. Training programs, especially those under pressure to meet quotas or reduce costs, may prioritize the minimum requirements over deeper skill-building. This can result in pilots who meet minimum hours based experience requirements, but lack the skills, judgment, adaptability, and situational awareness needed for real-world operations.

Some will argue that we should just train these skills and that, as long as the training is completed and one test is taken at the end, that is enough. This fails to sample the development of the pilot throughout the process and to determine whether “potholes” exist along the training road that need to be filled in to make the pilot fully developed.

This multi-step process mitigates the risk of oversight by distributing assessments over time and contexts, allowing instructors and examiners to identify and correct weaknesses incrementally. For instance, a student faltering in instrument navigation can be addressed before advancing to commercial training, preventing gaps from compounding. By embedding multiple testing events, the FAA guarantees that a fully certificated airman possesses the depth and breadth of skill needed for safety-critical commercial roles, ensuring nothing slips through the cracks.

There exists a gap between certificate readiness and true airmen competency. A focus on “teaching to the test” has produced pilots who excel at scripted maneuvers but falter when faced with unexpected scenarios—a deficiency that safety statistics, such as the NTSB’s annual reports on general aviation accidents, consistently link to poor decision-making rather than technical failure.

The Case for Airmen Development

Airmen development extends beyond the acquisition of certificates to encompass the cultivation of a pilot’s cognitive, practical, and professional attributes. This holistic approach recognizes that aviation is not merely a technical exercise but a dynamic environment where human factors—judgment, discipline, and resilience—determine outcomes. The FAA’s Airman Certification Standards (ACS), introduced in 2016, reflect this philosophy by integrating risk management, aeronautical decision-making (ADM), and scenario-based training into testing criteria. Yet, training programs must go further, embedding these principles throughout the learning process, not just at the point of evaluation.

One compelling reason for this shift is safety. The NTSB’s 2023 Aviation Accident Database identified “loss of control in flight” and “controlled flight into terrain” as leading causes of fatal general aviation accidents, accounting for over 50% of incidents. These accidents often stem from pilot error, not mechanical issues, and are frequently tied to inadequate preparation for real-world conditions—such as weather changes, spatial disorientation, or equipment malfunctions. A training system fixated on certificate completion may ensure a pilot can perform a steep turn within ACS tolerances but fail to prepare them for the split-second decisions required during an engine failure at low altitude. By contrast, a development-focused approach emphasizes scenario-based training, stress inoculation, and critical thinking, building pilots who can anticipate and mitigate risks.

Another driver is the increasing complexity of modern aviation. Today’s pilots operate aircraft with advanced avionics, fly in congested airspace, and face regulatory demands that were unimaginable a century ago. For example, the integration of glass cockpits and unmanned aircraft systems (UAS) requires not just technical mastery but also the ability to manage information overload and adapt to automation failures. A certificate-only focus might produce a pilot who can program a flight management system (FMS) but not one who can hand-fly an approach when that system fails. Airmen development, by contrast, fosters adaptability, ensuring pilots can transition seamlessly between automated and manual operations—a skill set critical for both general aviation and airline environments.

Finally, the aviation workforce crisis underscores the need for quality over quantity. As of April 2025, the industry continues to grapple with a pilot shortage, exacerbated by retirements, the lingering effects of the COVID-19 pandemic, and heightened demand from airlines and cargo operators. The Regional Airline Association reported in 2024 that its members faced a deficit of over 5,000 pilots, prompting calls for accelerated training pipelines. While tempting, rushing pilots through certification risks producing a generation of minimally qualified aviators, ill-equipped for the rigors of professional flying. A development-centric model invests in long-term competence, creating pilots who not only fill seats but also enhance safety and operational efficiency over their careers.

Practical Implications: Shifting the Training Paradigm

Transitioning from a certificate-driven to a development-focused training paradigm requires changes in philosophy, curriculum, and oversight. First, flight training providers, including Part 141 programs, must prioritize quality over speed of training completion. Research from AOPA’s 2022 Flight Training Experience Survey found that the average private pilot applicant required 60-70 hours to reach proficiency, suggesting that minimum hour thresholds alone are a poor indicator of true readiness. Programs should tailor training to individual needs, extending instruction for those who require it rather than pushing them toward a practical test prematurely.

Second, curricula must integrate soft skills alongside technical ones. The FAA’s “3P” model of ADM—Perceive, Process, Perform—offers a framework for teaching risk assessment, but its application remains inconsistent. Scenario-based training, where students face realistic challenges like deteriorating weather or mechanical issues, should be standard from day one, not reserved for advanced ratings. Similarly, crew resource management (CRM), traditionally an airline concept, can be adapted for single-pilot operations, fostering communication and teamwork skills even in small aircraft. These elements build airmanship—the blend of skill, discipline, and professionalism that defines a mature pilot.

Third, instructors play a pivotal role. The quality of flight training hinges on the expertise and mindset of certificated flight instructors (CFIs), yet many CFIs view their role as a stepping stone to airline jobs, logging hours rather than mentoring students. This is why an outside check on the training process must be maintained and strengthened if we are to continue improving our aviation training system.

The Impact of Airline Hiring on CFI Tenure and Training Quality

The aggressive hiring of Certificated Flight Instructors (CFIs) by airlines, intensified by the pilot shortage, has significantly reduced CFI tenure at flight training providers, undermining the quality of instruction and posing a challenge to developing high-quality, experienced trainers for the next generation of pilots. As of April 2025, with airlines needing 5,000 new pilots annually to meet demand, CFIs—typically early-career aviators building hours toward Airline Transport Pilot (ATP) certification—have been prime targets for recruitment. While this addresses airline staffing, it shortens CFI employment duration and erodes the depth of experience available at training schools, particularly those with self-examining authority.

Historically, CFIs remained with flight schools for 24-36 months, accumulating 2,000-2,500 hours while mentoring students. However, a 2024 AOPA survey reported that average CFI tenure has dropped to 12-15 months, with some leaving after as few as 6 months due to lucrative airline offers—starting salaries at regional carriers now exceed $80,000, plus signing bonuses up to $100,000. This rapid turnover depletes the pool of seasoned instructors, replacing them with novices who lack the practical wisdom gained from extended teaching. A 2023 Embry-Riddle study found that students trained by CFIs with less than a year of experience scored 15% lower on practical test preparedness, reflecting reduced instructional quality in areas like risk management and scenario-based training. This has manifested itself in lower overall pilot testing pass rates as instructor turnover has accelerated.

We see corresponding reductions in pass rates following periods of active airline hiring, as CFIs are recruited away. Pass rates even rose during the COVID-19 period, when airline hiring slowed and CFIs stayed in their positions longer. As soon as the airlines resumed hiring, and over the last two years, overall pass rates have again dropped as the CFI experience base has diminished.[19]

Yet there are those in the industry who would posit that these CFIs, with such limited experience, are qualified to “examine” our next generation of pilots through internal testing processes.

For self-examining Part 141 schools, this loss is acute. These programs rely on experienced CFIs to serve as check airmen, conducting internal practical tests under FAA delegation. With CFIs departing early, schools struggle to maintain staff with the industry insight—such as multi-engine or instrument expertise—needed to evaluate students rigorously. A 2024 FAA audit noted that 20% of self-examining programs had check airmen with under 500 instructional hours, raising concerns about consistency and depth in certification standards.

This challenge threatens pilot development quality. High-quality, experienced CFIs are essential to instill airmanship, yet their exodus to airlines leaves training providers scrambling. Retaining CFIs through incentives—higher pay, career tracks within training, or FAA-supported retention grants—could stabilize tenure, ensuring that the next generation benefits from seasoned mentors rather than transient placeholders.

Challenges and Counterarguments

Critics of this shift may argue that a development-focused approach is impractical amid workforce pressures. Accelerating certificate generation, they contend, is the fastest way to address shortages, particularly for airlines facing operational disruptions. Moreover, extending training could increase costs, already a barrier for many aspiring pilots, and deter new entrants at a time when diversity and access are industry priorities.

These concerns are valid but shortsighted. While rapid certification may fill cockpits in the near term, it risks higher future accident rates, increased insurance costs, and reputational damage to aviation—outcomes that ultimately outweigh initial savings. The cost argument also overlooks long-term benefits: pilots with robust training are less likely to require remedial instruction or fail airline practical tests, reducing downstream expenses.

The FAA’s multi-step certification system, anchored by external, impartial testing providers, has served aviation well, providing a reliable framework for producing pilots since the 1920s. However, as the industry evolves, training must evolve with it, resisting a shift toward mere certificate completion and maintaining a broader commitment to airman development. This approach enhances safety by addressing the root causes of accidents, prepares pilots for modern complexities, and ensures a sustainable workforce capable of meeting future demands.

Strengthening Safety and Testing Quality through DPE System Reform and FAA Oversight: A More Comprehensive Solution

The aviation industry’s safety record is a testament to its rigorous pilot training and certification processes, yet emerging challenges—such as pilot shortages, self-examining authority, and proposed delegation models like the Organizational Delegation Authorization (ODA)—threaten to erode this foundation. To safeguard pilot quality and maintain public trust and safety, the Federal Aviation Administration (FAA) must prioritize external oversight and impartial evaluation over reliance on self-interested training providers.

The DPERWG Recommendations: A Roadmap for Reform

The DPERWG, stemming from the FAA Reauthorization Act of 2018, delivered a comprehensive set of recommendations in June 2021 to address systemic issues within the DPE program—issues that directly impact pilot training quality. Comprising stakeholders from the FAA, industry groups like the Aircraft Owners and Pilots Association (AOPA), the Flight School Association of North America, FAA DPEs, and flight training representatives, the working group identified bottlenecks such as examiner shortages, inconsistent standards, and accessibility challenges that undermine safety. Implementing these suggestions provides a blueprint for reinforcing the DPE’s role as an impartial quality check, countering the risks posed by self-examining systems.

One key DPERWG proposal is improving examiner availability. In 2020, the number of active DPEs had dwindled to around 948, a 40% drop from 1989 levels, despite rising demand for practical tests driven by a pilot shortage. Wait times in high-traffic areas like Florida and Texas stretched to months, pushing students toward self-examining programs or pressuring DPEs to rush evaluations.

Part of that effort is for the FAA to restore timely access to impartial testing, reducing reliance on internal evaluators with vested interests, by appointing more DPEs, or, more importantly, the right DPEs. Many DPEs have historically provided limited service: in 2023, roughly 23% of DPEs conducted fewer than 25 tests in the year, barely two per month. These DPEs still require FAA resources to oversee them and manage their credentials. Having DPEs who provide more active service would improve the overall provision of testing services. There was improvement in FY2024, when the share of DPEs conducting fewer than 25 tests fell to roughly 17%, while more than 23% of DPEs conducted over 200 tests in the year.[20] This is better utilization of the DPE resource, and one the FAA should continue to focus on to meet testing demand.
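A rough sketch of why utilization matters, using the shares above with a purely hypothetical corps of 1,000 DPEs (the actual examiner count varies from year to year):

```python
# Illustrative only: how examiner utilization drives total testing capacity.
# The 23% / 17% low-activity shares and the 25- and 200-test thresholds come
# from the figures above; the 1,000-DPE corps size is a hypothetical round number.
dpe_count = 1000
low_activity_tests_max = 25    # "fewer than 25 tests per year" is barely 2 per month
high_activity_tests_min = 200

# Upper bound on what the low-activity group can contribute each year
low_activity_2023 = 0.23 * dpe_count * low_activity_tests_max   # at most ~5,750 tests
low_activity_fy24 = 0.17 * dpe_count * low_activity_tests_max   # at most ~4,250 tests

# Lower bound on what the most active group contributed in FY2024
high_activity_fy24 = 0.23 * dpe_count * high_activity_tests_min  # at least ~46,000 tests

print(low_activity_2023, low_activity_fy24, high_activity_fy24)
```

Under these assumptions, roughly a fifth of the examiner corps contributes only a few thousand tests per year while still consuming oversight resources, which is why shifting activity toward more productive DPEs matters.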

Standardization is critical in the oversight of DPEs. The DPERWG also emphasized transparency and feedback. It recommended that a national DPE locator system be fully implemented by 2023, along with annual surveys of examiner performance that allow students and schools to report inconsistencies or abuses of authority. A truly useful system has yet to be implemented. This data-driven approach would enable the FAA to identify underperforming DPEs, revoke designations when necessary, and maintain a high-quality testing cadre. By adopting these reforms, the DPE program could reclaim its role as a safety linchpin, ensuring that pilot certification reflects competence, not convenience.

Since its inception in 1939, the DPE system has provided an impartial counterweight to self-interested training entities, yet it faces challenges that require proactive enhancement. Strengthening this program offers a practical alternative to delegating testing to organizations with financial stakes in student outcomes, preserving external quality control.

A 2024 FAA report estimated that DPEs conducted approximately 140,000 practical tests annually, meeting just 70% of demand. But demand is hard to measure. If DPEs conducted 140,000 tests in a year, we have no way of knowing how many more could have been conducted had additional examiner capacity been available. There is no system that measures potential testing need; we only measure how many tests were actually conducted.
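Taken at face value, those figures imply an unmet need on the order of 60,000 tests per year, though, as noted above, the 70% estimate is itself uncertain; a minimal sketch of the arithmetic:

```python
# Implied total demand if 140,000 tests represent 70% of what was needed.
# Both inputs come from the 2024 FAA report figure cited above; the result is
# only as reliable as the 70% estimate, since unmet demand is not directly measured.
tests_conducted = 140_000
share_of_demand_met = 0.70

implied_total_demand = tests_conducted / share_of_demand_met    # 200,000 tests
implied_unmet_demand = implied_total_demand - tests_conducted   # 60,000 tests

print(f"Implied total demand: {implied_total_demand:,.0f}")
print(f"Implied unmet demand: {implied_unmet_demand:,.0f}")
```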

The goal should be to bolster the DPE corps as a robust, independent safety net, countering the risks of self-examining leniency.

Bringing Testing and Oversight Back to the FAA

While DPE reform addresses many concerns, the ultimate safeguard against self-interested training providers lies in restoring some testing and oversight to the FAA itself with highly qualified and experienced FAA personnel. Historically, FAA inspectors conducted a small percentage of practical tests—5% in 1989—serving as a benchmark for quality and a deterrent to lax standards. Today, with fewer than 4,000 inspectors stretched across air carrier and general aviation duties, this role has diminished, ceding ground to delegation. Reintegrating FAA involvement offers a direct external check, ensuring that certification quality isn’t wholly abdicated to those profiting from training.

One approach is mandating FAA-administered practical tests for a random sample of graduates from self-examining programs—say, 10% annually. These could even be conducted by DPEs. This would mirror quality control practices in manufacturing, where independent audits verify production standards. In 2023, self-examining Part 141 schools certified over 30,000 pilots, yet FAA oversight relied on infrequent audits, leaving gaps in real-time accountability. Random FAA testing would expose deficiencies, compel schools to align with ACS rigor, and deter the temptation to prioritize throughput over competence.

Alternatively, the FAA could reclaim oversight of critical certification milestones, such as some initial Flight Instructor or ATP tests. Not long ago, FAA staff conducted a sampling of initial CFI certification test events and designated a percentage to DPEs holding that privilege. This could be considered again if FAA staffing were available to conduct these tests. A tiered system would leverage individuals with broader and deeper experience to conduct airman testing where it is most critical, improving safety at key career junctures. A 2022 NTSB safety recommendation echoed this, urging greater FAA involvement in initial certifications following accidents linked to inadequate training at self-examining academies.

Resource constraints pose a challenge—FAA staffing levels, and the compensation offered to staff, have not kept pace with aviation growth—but targeted investments could mitigate this. The 2024 FAA Reauthorization Act allocated $1.2 billion for safety programs; diverting a fraction of those funds, or allocating new ones, to hire 200-300 additional inspectors could restore a meaningful FAA testing presence.

The Federal Aviation Administration (FAA) plays an indispensable role in overseeing the pilot training system, ensuring safety and quality through inspections, audits, and direct testing. However, as of April 2025, the agency faces a critical challenge: its ability to attract and retain the next generation of highly qualified staff is hampered by compensation levels that lag behind modern aviation industry standards. To maintain robust oversight of training efforts—particularly amid growing delegation to self-examining programs and the proposed Organizational Delegation Authorization (ODA)—the FAA must become competitive with private-sector aviation jobs, securing talent capable of upholding rigorous standards.

FAA aviation safety inspectors (ASIs), who monitor flight schools, DPEs, and certification processes, require extensive experience—often 1,500 flight hours, advanced ratings, and years in operational roles. Yet, their pay scales, governed by the General Schedule (GS), are outpaced by industry equivalents. In 2024, an ASI at GS-13 earned $103,000-$134,000 annually, depending on locality, while airline pilots with similar qualifications averaged $150,000-$250,000, per Bureau of Labor Statistics data[21], and corporate aviation managers exceeded $180,000. Even flight instructors at high-volume academies, earning $80,000-$120,000 with bonuses, often outstrip entry-level FAA salaries (GS-11, $70,000-$90,000). This disparity deters seasoned aviators from joining the FAA, leaving a workforce of fewer than 4,000 inspectors stretched thin across a growing industry.

The consequences are evident. A 2023 FAA Inspector General report highlighted a 15% vacancy rate among ASIs, with retirements outpacing hires—50% of inspectors are over 55, while only 10% are under 35. Meanwhile, private-sector demand for skilled aviators soars, with airlines offering signing bonuses up to $100,000 in 2024 to attract pilots. Without competitive pay, the FAA risks losing talent to these opportunities, undermining its capacity to oversee training quality as pilot demand climbs.

The 2024 FAA Reauthorization Act allocated $1.2 billion for safety programs; redirecting funds to raise ASI salaries to $150,000-$200,000 for senior roles, benchmarked against industry norms, could draw experienced professionals. By aligning compensation with modern aviation jobs, the FAA could build a robust, qualified next-generation workforce to safeguard training system integrity, ensuring safety isn’t compromised by staffing shortfalls.

Addressing this requires immediate action if we are not to lose the next generation of qualified FAA personnel.

Countering Self-Interest and Ensuring Safety

The common thread across these solutions—DPERWG reforms, DPE enhancements, and FAA reintegration—is preserving an external check against the self-interest inherent in training provision. Self-examining schools and proposed ODA holders, driven by financial incentives and market pressures, face an intractable conflict: their success depends on graduating pilots, not failing them. A 2021 AOPA survey found that 28% of instructors at such programs felt pressured to approve unprepared students, a risk absent in DPE or FAA evaluations. Without impartial oversight, this conflict erodes standards, producing pilots who meet minimums on paper but lack the airmanship needed for safety.

Historical data underscores the stakes. The NTSB’s 2023 accident report linked 22% of fatal general aviation crashes to training deficiencies, with self-examining graduates overrepresented in loss-of-control incidents. By contrast, DPE-tested pilots, subject to external scrutiny, showed higher first-time failure rates (32% in 2023), suggesting a filter that catches weaknesses before they reach the cockpit. Restoring this filter through reformed DPEs and FAA oversight ensures that certification reflects true competence, not just completion.

The best solution for aviation safety lies in rejecting the abdication of certification quality to self-interested training providers and embracing a system anchored in external accountability. Implementing the DPERWG’s recommendations would revitalize the DPE program, addressing shortages and inconsistencies while reinforcing its role as an impartial gatekeeper. Improving the DPE framework—through updated qualifications, increased capacity, and better support—would sustain this vital check against internal bias. Bringing testing and oversight back into the FAA, even selectively, would provide a definitive safeguard, leveraging federal authority to uphold standards. Together, these measures ensure that pilot training prioritizes safety over expediency, delivering airmen equipped for the skies of 2025 and beyond.

The integrity of the U.S. pilot training system stands at a critical juncture. The Federal Aviation Administration (FAA) has long been the bedrock of aviation safety, ensuring that pilots are not merely certificated but comprehensively developed to meet the demands of an evolving airspace. However, as of April 2025, economic pressures, pilot shortages, and proposals like the Organizational Delegation Authorization (ODA) threaten to shift this focus from airmen development to certificate production. The risks are clear: self-examining authority, unchecked by robust external oversight, creates a “fox watching the henhouse” scenario where financial incentives can erode standards, producing pilots who meet minimums on paper but lack the depth of skill and judgment needed for safety. The FAA, and the industry, must resist this drift, prioritizing quality over quantity to safeguard the skies for future generations.

The path forward lies in strengthening, not relinquishing, impartial evaluation. The Designated Pilot Examiner (DPE) program, with its 85-year legacy of independent testing, remains a vital counterweight to self-interested training providers. Implementing the Designated Pilot Examiner Reforms Working Group (DPERWG) recommendations—such as increasing examiner numbers, enhancing standardization, and improving accessibility—would bolster this system, ensuring it meets rising demand without compromising rigor. Simultaneously, restoring a meaningful FAA presence through targeted staffing investments and selective testing would reinforce federal oversight, catching deficiencies that internal evaluators might overlook. These measures, supported by the 2024 FAA Reauthorization Act’s safety funding, offer a practical framework to maintain accountability.

Ultimately, aviation safety hinges on a training system that develops airmen, not just certificate holders. The FAA must reject delegation models that prioritize efficiency over excellence and instead champion a multi-step, externally validated process that builds technical mastery, risk management, and professionalism. By recommitting to DPE reform and FAA oversight, the agency can ensure that pilots are equipped to navigate modern complexities—glass cockpits, congested airspace, and automation challenges—preserving public trust and industry stability. The stakes are too high to settle for less: a robust, impartial system is not just a safeguard but a necessity to deliver a pilot workforce ready for the challenges of today and tomorrow.


About the Author:

Jason Blair has been an FAA Designated Pilot Examiner since 2007, conducting over 4,000 certification events. He actively provides airman testing in both collegiate and non-collegiate, Part 141 and Part 61 training operations and has wide experience with the primary airman testing system landscape. He has been an FAA Certificated Flight Instructor since 2001. He has served as a previous Executive Director of the National Association of Flight Instructors, has worked for AOPA and the AOPA Air Safety Institute, works with and as the editor for the Flight School Association of North America (FSANA), and is an active writer on training and safety in aviation for multiple outlets, including Flying and Plane and Pilot magazines, the FAA Safety Briefing, and many more. He has previously been a member of multiple FAA Aviation Rulemaking Committees (ARCs) and Aviation Rulemaking Advisory Committees (ARACs) relating to airman qualification and training, including the development of the Airman Certification Standards (ACS) and the DPERWG. He is an active FAA Safety Team representative, still engages actively in providing flight training, and promotes aviation safety and airman quality throughout our aviation system.

You can learn more about Jason Blair at www.JasonBlair.net.


[1]
https://www.faa.gov/regulations_policies/rulemaking/committees/documents/index.cfm/document/information/documentID/5044/

[2]
https://jasonblair.net/?p=4477

[3]
https://www.faa.gov/about/office_org/headquarters_offices/avs/offices/afx/afs/afs800/afs810/modernization_of_part-141_initiative

[4]
https://www.faa.gov/sites/faa.gov/files/ACT_ARC_Recommendation%2024-1.pdf

[5]
https://www.faa.gov/about/office_org/headquarters_offices/avs/offices/afx/afs/afs200/afs280/act_arc

[6]
https://www.faa.gov/regulations_policies/orders_notices/index.cfm/go/document.information/documentID/1043481

[7]
https://www.ntsb.gov/news/press-releases/Pages/NR20191114.aspx

[8]
https://www.oig.dot.gov/library-item/46466

[9]
https://raa.org/raa-releases-2024-annual-report/

[10]
https://www.faa.gov/other_visit/aviation_industry/designees_delegations/delegated_organizations/ac-oda

[11]
https://jacobin.com/2024/02/boeing-self-inspection-safety-oda

[12]
https://www.ntsb.gov/investigations/AccidentReports/Reports/AAR2002.pdf

[13]
https://www.faa.gov/about/reauthorization

[14]
https://www.faa.gov/aviation-safety-call-to-action

[15]
https://www.ntsb.gov/investigations/accidentreports/reports/aar1001.pdf

[16]
https://www.ntsb.gov/investigations/AccidentReports/Reports/AAR2002.pdf

[17]
https://data.ntsb.gov/carol-repgen/api/Aviation/ReportMain/GenerateNewestReport/192744/pdf

[18]
https://www.tsb.gc.ca/eng/rapports-reports/aviation/2025/a25o0021/a25o0021-preliminary.html

[19]
https://jasonblair.net/?p=4316

[20]
https://jasonblair.net/?p=4477

[21]
https://www.bls.gov/ooh/transportation-and-material-moving/airline-and-commercial-pilots.htm#:~:text=The%20median%20annual%20wage%20for,was%20%24113%2C080%20in%20May%202023.