A good, worthwhile (meaning: “worth the huge interruption and distraction”) Red Team doesn’t just “happen” by itself — it takes a lot of advance preparation effort, organization, and setup. Here’s how we planned and executed the best-ever Red Team Review.

A Blue Team Blueprint

First, we had a “Blue Team,” held immediately after we had shredded out the RFP requirements to our proposal outline. This Blue Team comprised the same individuals who would later review our proposal draft. The objective of the Blue Team was to confirm that we were answering all of the RFP questions and including our win themes and strategies in our proposal where the evaluators expect to find them. This sanity check, before we started actual writing, would avoid major re-organizing at the last minute. With this confirmation (and redirection, if needed), we knew that our proposal was on the right track and that post-Red Team changes would consist of rewriting, not major reorganizing!

The best individuals for the Red Team are people who don’t know much about your company; ideally, the only thing they should know about your company is what they read in your proposal, because that more accurately replicates your customer’s evaluation environment. Given that “ideal,” for economy most Red Teams comprise senior managers or specialists from your own company or other divisions, perhaps supplemented by one or two consultants. When this happens, it is all the more important to follow a structured review that evaluates your proposal against the RFP Section M – Evaluation Factors for Award, and actually scores it against the evaluation criteria. The scoring can be numerical (1, 2, 3, etc.), subjective (+, OK, -), or color-coded (Red – Unacceptable/Non-responsive; Yellow – Marginal; Green – Acceptable; or Blue – Exceptional, with added benefits to the customer). We prefer color-coding because it does not try to “measure a brick with a micrometer,” because it is very graphic, and because it is the scheme the DoD evaluators will probably use. The value of actually scoring the proposal is that it focuses the reviewers’ attention on the evaluation criteria and results in a more objective review.
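
To make the color scheme concrete, here is a minimal sketch, in Python, of how a Red Team might record one color score per proposal section and flag anything below Green for rework; the section names and scores are hypothetical, not from any actual proposal.

```python
# Minimal sketch of a color-coded Red Team scorecard.
# Section names and scores are hypothetical examples.

COLOR_SCALE = {
    "Red": "Unacceptable/Non-responsive",
    "Yellow": "Marginal",
    "Green": "Acceptable",
    "Blue": "Exceptional, with added benefits to the customer",
}

# Reviewer scores, keyed by proposal section.
scores = {
    "3.1 Technical Approach": "Green",
    "3.2 Management Plan": "Yellow",
    "3.3 Past Performance": "Blue",
    "3.4 Risk Mitigation": "Red",
}

# Anything below Green needs rewriting before submittal.
needs_rework = [section for section, color in scores.items()
                if color in ("Red", "Yellow")]

for section in needs_rework:
    print(f"{section}: {scores[section]} ({COLOR_SCALE[scores[section]]})")
```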

The Importance of a Structured Review

On one proposal, the Technical Panel Leader was the company’s Vice President of Engineering and his panel was his staff. His debriefing to the proposal team underscored the value of a structured review: “We read the proposal. It is one of the best proposals we have ever done! We looked great! The proposal included all of our themes and win strategies! It was a sure winner! My staff asked that, since we were done, could they go back to their ‘real jobs.’ I told them ‘no,’ that we still had to ‘score’ the proposal against the RFP Section M. Guess what? Even though our proposal read great, it was virtually non-responsive! It was a sure loser!”

How DoD Doesn’t Evaluate Proposals

Before we set up the formal Red Team Review process, a few words are in order to explain exactly how DoD customers do and don’t evaluate proposals. First, let’s clear up some popular misconceptions. The evaluators:

Do not read the entire proposal straight through.
Do not score proposals against each other.
Do not score proposals against the RFP — even against Section M.
Do not score the Specific Evaluation Factors.

How DoD Does Evaluate Proposals

Here is how the Federal Acquisition Regulation (FAR) requires DoD customers to evaluate and score competitive proposals; these regulations are generally followed very closely in order to avoid protests.

The evaluation team is organized into a Source Selection Evaluation Board (SSEB) that will evaluate and score the proposals. There is also a Source Selection Advisory Council (SSAC) that will advise the decision maker on other factors germane to the acquisition (e.g., political, budgetary, programmatic). Finally, there is the Source Selection Authority (SSA), the specific individual who will make the actual decision, based upon all of the information available.

This SSA is under incredible pressure from the eventual users, competing interests, Government oversight agencies, Congress, the White House (on large programs), and the public. It is a safe assumption that the SSA makes the decision that is safest for his or her career, and that can be fully justified to Representatives or Senators of the losing bidders.

A Fully Justified, Winning Approach

For example, in 1991, then-Secretary of Defense Dick Cheney justified his selection of the F/A-18E/F over the F-14 to Senator C.S. Bond:

“In selecting the F/A-18E/F, we considered not only performance and unit price, but also a host of other factors which impact on cost, such as weapon system reliability, maintainability, safety, maintenance costs, squadron manning requirements and cost per flight hour.

“In the final analysis the F/A-18E/F was the clear choice over the F-14. It is three times more reliable, twice as easy to maintain, has a safety record which is fifty percent better, requires about twenty-five percent fewer maintenance personnel, and costs about twenty-five percent less to operate per flight hour. When combined, these factors clearly show that the F/A-18E/F is the more cost effective aircraft.”

It’s hard to argue against that, and your Red Team should ensure that your proposal is equally convincing. To do so, it has to convey a powerful message to the evaluators.

How the SSEB Scores Proposals

The SSEB will usually be divided into Panels that represent the disciplines described in the RFP Section M Specific Evaluation Factors. These specific factors are not scored directly, but they indicate the various disciplines of interest to the SSEB. There will likely be a Technical Panel, a Management Panel, and a Cost Panel (costs are not scored, but they are evaluated, usually for realism and reasonableness). The Technical Panel may be further divided into Sub-Panels for specific disciplines such as (for an aircraft) Performance, Airframe, Propulsion, Flight Controls, Weapons, Computer Systems, etc.

Now it gets a little tricky: The SSEB scores the proposals in accordance with the approved and published Source Selection Plan (SSP), not against the RFP Section M. (We’ve seen only one RFP that included this SSP information.) The SSP is not the RFP Section M, but Section M is based upon the SSP. The SSP must be approved and locked in the safe before any proposals are accepted, in order to preclude any bias being built into the SSP to favor a particular competitor. The SSP cannot differ significantly from the RFP Section M, but it expands Section M to provide specific guidance to the evaluators by identifying the “standards” against which the proposals will be evaluated.

For each specific evaluation factor or sub-factor, the SSP defines that factor or sub-factor (usually following the RFP Section M Specific Evaluation Factor/Sub-factor fairly closely) and then states: “The standards are met when the offeror:” followed by a list of very specific quantitative or qualitative values that comprise the minimum acceptable requirements for the proposal to be in the “acceptable range.” These will relate to the minimum specification and SOW requirements that the customer will accept.

Now, here’s the tricky part: The SSEB does not score the specific factors per se, but scores each standard according to the RFP Section M Assessment Criteria — Understanding the problem, Compliance with requirements, and Soundness of approach. The SSEB will give each of these assessment factors a score (either numerical or color-coded) for each standard.

Let’s go over that again: the SSEB scores each SSP standard according to your proposal’s understanding of the problem, compliance with requirements, and soundness of approach.
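
As an illustration only (offerors never see the actual SSP, so the standards and colors below are invented), this sketch shows that two-level structure in Python: each SSP standard carries one color score per Section M assessment criterion, and any Red drags the standard down.

```python
# Illustrative only: scoring each SSP standard against the three Section M
# assessment criteria. The standards and colors below are invented.

SEVERITY = {"Blue": 0, "Green": 1, "Yellow": 2, "Red": 3}  # best to worst

standard_scores = {
    "Detection range meets the minimum specification": {
        "Understanding the problem": "Green",
        "Compliance with requirements": "Blue",
        "Soundness of approach": "Green",
    },
    "Mean time between failures meets the minimum specification": {
        "Understanding the problem": "Yellow",
        "Compliance with requirements": "Red",
        "Soundness of approach": "Yellow",
    },
}

# Roll each standard up to its worst criterion score.
for standard, criteria in standard_scores.items():
    worst = max(criteria.values(), key=SEVERITY.get)
    print(f"{standard}: worst criterion score = {worst}")
```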

Additionally, there may be General Criteria announced in the RFP Section M that the SSEB applies across all disciplines, such as Risk and (for Army proposals) MANPRINT. DoD uses three risk categories: cost, schedule, and performance. The FAA uses ten facets of risk (technical, operability, producibility, supportability, cost, schedule, programmatic, management, funding, and political) that you need to address for FAA proposals. The Army’s MANPRINT emphasizes integration of seven domains (manpower, personnel, training, human factors engineering, system safety, health hazards, and soldier survivability) that you need to address for Army proposals.
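
As a quick sketch (the facet names come from the FAA list above; the mapping of sections to facets is made up), a Red Team could use a simple coverage check like this to confirm that every required risk facet is addressed somewhere in the proposal.

```python
# Sketch of a risk-facet coverage check. The facet list is the FAA list above;
# which proposal sections address which facets is invented for illustration.

FAA_RISK_FACETS = {
    "technical", "operability", "producibility", "supportability", "cost",
    "schedule", "programmatic", "management", "funding", "political",
}

# Hypothetical mapping of proposal sections to the risk facets they address.
coverage = {
    "4.1 Risk Management Approach": {"technical", "cost", "schedule"},
    "4.2 Program Risks": {"programmatic", "management", "funding"},
    "4.3 Production Readiness": {"producibility", "supportability"},
}

addressed = set().union(*coverage.values())
missing = FAA_RISK_FACETS - addressed
print("Risk facets not yet addressed:", ", ".join(sorted(missing)))
```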

For our specific proposal, Section M also identified seven key “tenets” that the evaluators would look for in order to make a “best value” award. We included these in our Red Team instructions. If your RFP includes key performance parameters (KPPs), then you can bet that these will be well represented in the Source Selection Plan criteria.

The SSEB presents its findings to the SSA, usually in color codes without identifying the specific offeror by name. The SSAC advises the SSA on other factors that might affect his/her decision, and the SSA makes the final decision. By the way: the SSA almost always reads the Executive Summaries!

Our Red Team Blueprint

That said, now we need to set up a structured Red Team Review that will (hopefully) ensure that your proposal makes it through the SSEB. You can see, from the above discussion, that simply handing your Red Team a copy of the RFP and your proposal may make you feel better, and might even improve your proposal, but has little value in significantly improving your win probability.

This is how we set up the Red Team Review for the proposal mentioned at the start of this article:

1. We identified experts for each technical and management discipline germane to our proposal, and matched them in advance to specific proposal volumes and sections.

2. Every proposal section was reviewed by at least two reviewers, but no one had to review the entire proposal.

3. We set up the room the night before with tables, name plates with the individuals’ functions noted, and reference materials.

4. We provided each member with a three-ring notebook with his/her name on the cover and the following material inside:

a. Copy of the Red Team Review Plan and schedule
b. List of all Red Team Members, their review assignments, and contact information
c. List of proposal team management, volume leaders, authors, publications personnel, and their contact information
d. Copy of the RFP Section L Instructions to Offerors (proposal organization instructions)
e. Copy of RFP Section M Evaluation Factors for Award
f. Complete proposal Master Table of Contents (outline)

5. We provided copies of the proposal sections that each member would be reviewing at his/her location.

6. We provided copies of the RFP Shred-out to Proposal, showing the actual RFP Requirements that we were addressing in each numbered proposal paragraph (which had been “blessed” by the Blue Team much earlier).

7. We provided convenient copies of the complete RFP and reference documents for their use, if needed.

8. We provided paper, Post-Its, paper clips, and red pens (no black pens or pencils allowed, since black markups are too hard to see against the black proposal text).

9. We provided stacks of Proposal Deficiency Reports (PDRs) with places to score understanding of the problem, compliance with requirements, soundness of approach, and risk assessment and mitigation (a sketch of such a record appears after this list). All “yellow” and “red” scores required a PDR.

10. Because of the amount to be reviewed and the short time remaining before submittal, the reviewers were allowed to mark simple changes directly on the proposal draft with their red pens, placing a Post-It on the page so that we could find their markups easily.
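
For illustration, a single PDR could be captured as a simple record like the sketch below; the field names and example values are ours and do not represent any prescribed format or tool.

```python
# Hypothetical sketch of one Proposal Deficiency Report (PDR) record.
# Field names and values are illustrative, not a prescribed format.
from dataclasses import dataclass

@dataclass
class PDR:
    control_number: str      # Red Team Deficiency Control Number
    proposal_paragraph: str  # numbered proposal paragraph being scored
    reviewer: str
    author: str
    understanding: str       # color score: Blue/Green/Yellow/Red
    compliance: str
    soundness: str
    risk: str
    narrative: str = ""      # what is deficient and how to fix it

example = PDR(
    control_number="PDR-0042",
    proposal_paragraph="3.2.1",
    reviewer="Technical Panel reviewer",
    author="Volume 2 author",
    understanding="Green",
    compliance="Yellow",
    soundness="Yellow",
    risk="Red",
    narrative="Response does not address the SOW maintainability requirement.",
)
print(example)
```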

After self-introductions all around, the Program Manager briefed the program, and I, as Red Team Leader, briefed the review process and Red Team Review Plan. And yes, we told them where to find coffee, donuts, and the bathrooms. We also provided lunch. All volume managers and authors attended this opening session so the proposal team and Red Team would know each other; afterward, the authors were allowed to leave (even to go home, since it was a weekend) but were asked to stay within reach in case the reviewers needed to contact them.

At the end of the first day, we held a Plenary Session to determine how the review was going. There have been times on other proposals when the Plenary brought to light the fact that the proposal draft was so rough and incomplete that it could not be reviewed. On one such proposal, I reorganized the Red Team into a “Tiger Team” and assigned each member proposal sections to “Fix or Repair as Needed.” That was the only way we got the proposal out (which, not surprisingly, lost anyway).

For this proposal, however, the review was going well; but because of the complexities of the proposal and our offering, it was apparent that we would not be able to review the entire proposal as planned. Therefore, I split the team and reorganized the assignments, and we were able to complete the review successfully on the second day.

Automated Proposal Deficiency Reports

As the Red Team reviewers completed proposal sections, they gave their completed PDRs to their panel leader, who reviewed them and then passed them to our Red Team Management group, where three of us entered the PDRs into the POW2000™ Red Team Review function. When we entered a PDR, POW2000 automatically added the reviewer’s and author’s contact information and the RFP Section M evaluation factor under which the proposal paragraph was evaluated. These PDRs could be organized and printed in order of Red Team Deficiency Control Number, proposal paragraph number, Red Team Panel, reviewer, or author. We printed and distributed hard copies, and e-mailed applicable copies to the volume leaders and authors and to our teammates across the country.
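
POW2000 is a proprietary tool, so we are not showing its interface; the sketch below simply illustrates, with made-up records, how a batch of PDRs can be re-ordered by any of those keys before printing and distribution.

```python
# Illustration only: re-ordering a batch of PDRs by different keys for
# distribution. The records are made up; this is not the POW2000 interface.
from operator import itemgetter

pdrs = [
    {"control": "PDR-0042", "paragraph": "3.2.1", "panel": "Technical",
     "reviewer": "R. Smith", "author": "J. Doe"},
    {"control": "PDR-0007", "paragraph": "2.1.4", "panel": "Management",
     "reviewer": "A. Jones", "author": "K. Lee"},
    {"control": "PDR-0019", "paragraph": "3.4.2", "panel": "Technical",
     "reviewer": "R. Smith", "author": "M. Chan"},
]

# One listing per sort key, mirroring the hard-copy reports distributed
# to volume leaders and authors.
for key in ("control", "paragraph", "panel", "reviewer", "author"):
    print(f"--- PDRs by {key} ---")
    for pdr in sorted(pdrs, key=itemgetter(key)):
        print(f'{pdr["control"]}  para {pdr["paragraph"]}  {pdr["panel"]}  '
              f'{pdr["reviewer"]} -> {pdr["author"]}')
```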

Red Teams

Time well spent or a waste of time? This Red Team was very difficult: the time to prepare the proposal and review materials was short, the RFP and our proposal were complex, there was a large amount of material to review, and there was little time to respond before submittal. We spent many 16-hour days preparing for it. The reason it worked was that we planned and implemented a workable process, we provided everything the reviewers needed in a convenient format, and the volume managers, authors, publications personnel, and reviewers all cooperated. For this Red Team Review, it was “Time Well Spent.”

OCI can provide Red Team support along with other proposal support services. Please get in touch here if you’d like to learn more.