How to lose a $100m bid – in 3 simple steps

Lack of acronym definitions, unreadable language and poor consistency will kill your pursuit.

In early November 2015, the US Government Accountability Office (GAO) denied a protest lodged by Federal Acquisition Services Alliant Joint Venture (FASA). The protest concerned task order ID05140054: information technology support for the United States Department of Agriculture’s National Information Technology Center (NITC). The task order was estimated to be worth $100m.

In fact, not only was FASA eliminated; the General Services Administration (GSA) eliminated 16 of the 18 submissions as technically unacceptable. The FASA protest and the GAO determination give us great insight into the reviewer’s mindset. For instance, the determination cited these specific issues:

“riddled with grammatical errors . . . lack of contractor vs. government identification; spelling errors; lack of acronym identification, consistency and accuracy; inconsistent reference and terminology; and punctuation errors.”

When you bid on a contract, your opportunity cost can be very large. So losing on the grounds of a poor-quality proposal is painful, because it is so preventable. Let’s consider some of the elements that contributed to this loss and how you can avoid them.

We use some sample copy below by way of explanation. Note that these samples are not from the failed FASA submission.

Lack of Acronym Identification

Government agencies expect you to fully define every acronym in a consistent way. While the meaning may be obvious to you, it is extremely dangerous to assume that your reviewer will understand undefined acronyms.

Here’s the type of copy we frequently see in proposals:


“Because IPTs are necessarily made up of peers from different organizational functions, both within AEES/EI and other components of FEMA shared accountability, and willingness to reach consensus, open-ended discussion and active problem solving involving the entire team is essential.”


Quite apart from the acronym soup, this is wrong on many levels. You hardly know where to start.

But let’s try. Begin by untangling the acronyms. IPT was not defined prior to first use; neither were AEES/EI and FEMA. Put yourself in the reviewer’s shoes: how would you feel if you saw this from one of your suppliers?

Now the good news is that spotting this kind of acronym overload is easy. For example, we run reports in VisibleThread Docs that instantly flag undefined acronyms across hundreds of pages.

Here’s an example report:

In this case, you get an alphabetized listing of every acronym along with any issues in a couple of mouse clicks. If you’re reviewing tens or hundreds of pages, the time savings are obvious.
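
By the way, if you don’t have a dedicated tool to hand, you can script a rough version of this check yourself. Here’s a minimal Python sketch, purely our own illustration rather than how any commercial tool works, that flags acronyms used before, or without, a definition, assuming the common ‘Expanded Phrase (ACRO)’ convention:

    import re

    def check_acronyms(text):
        """Flag acronyms used before, or without, a definition.

        Assumes definitions follow the 'Expanded Phrase (ACRO)' convention,
        e.g. 'Integrated Product Team (IPT)'. Real tools handle more cases.
        """
        # Record where each acronym is defined, i.e. where '(ACRO)' appears
        defined = {}
        for m in re.finditer(r"\(([A-Z]{2,})s?\)", text):
            defined.setdefault(m.group(1), m.start())

        # Treat every all-caps token of two or more letters as an acronym use
        issues = {}
        for m in re.finditer(r"\b([A-Z]{2,})s?\b", text):
            acro, pos = m.group(1), m.start()
            if acro not in defined:
                issues.setdefault(acro, "never defined")
            elif pos < defined[acro]:
                issues.setdefault(acro, "used before definition")
        return issues

    sample = ("Because IPTs are necessarily made up of peers from different "
              "organizational functions, both within AEES/EI and other "
              "components of FEMA ...")

    for acro, problem in sorted(check_acronyms(sample).items()):
        print(acro, "-", problem)   # AEES, EI, FEMA and IPT: never defined

Run over real copy, a script like this surfaces the acronym soup in seconds, though plurals, possessives and approved exception lists are exactly why dedicated tools exist.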

The big takeaway from the GSA’s determination is that you must check for acronym integrity. Whether you do it by hand or with tooling, you really don’t have a choice. So wire it into your color team process.

Proposal Readability and Clarity

Now let’s look at proposals from a clarity point of view.

Here’s what the FASA determination stated in this regard:

the agency was unable to clearly interpret a significant amount of the proposal, which was considered to “present performance risk in terms of quality control execution, which, combined with the inability to interpret the proposal in its entirety, resulted in the proposal being rendered unacceptable.”

So, as you consider your own proposals, ask: will a reviewer easily understand the content? If they can’t, they will likely question your ability to deliver on the program.

A quick self-test is simply to read the sentence aloud. If you need to re-read it to understand it, you have a problem. Here is the previous sample again; try reading it aloud.


“Because IPTs are necessarily made up of peers from different organizational functions, both within AEES/EI and other components of FEMA shared accountability, and willingness to reach consensus, open-ended discussion and active problem solving involving the entire team is essential.”


I have to say, when I tried it, it took a couple of reads to get to grips with it. While this is just my opinion, wouldn’t it be great if we could move from a subjective, opinion-based assessment to a more objective (i.e. repeatable) scoring mechanism? That would allow us to apply this check in our color review process in a systematic way.

Turns out we can. And it’s not quite as tricky as you might think. We can score content using standard readability measures.
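
To see what’s behind such scores, here’s a minimal Python sketch of the two standard Flesch formulas. The vowel-group syllable counter is a crude stand-in for the pronunciation dictionaries real tools use, so expect its numbers to drift slightly from commercial scores:

    import re

    def count_syllables(word):
        # Crude heuristic: count groups of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        words_per_sentence = len(words) / sentences
        syllables_per_word = syllables / len(words)
        # Standard Flesch Reading Ease and Flesch-Kincaid Grade Level formulas
        ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
        grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
        return ease, grade

    sample = ("Because IPTs are necessarily made up of peers from different "
              "organizational functions, both within AEES/EI and other "
              "components of FEMA shared accountability, and willingness to "
              "reach consensus, open-ended discussion and active problem "
              "solving involving the entire team is essential.")

    ease, grade = readability(sample)
    print("Flesch Reading Ease: %.0f, US Grade Level: %.0f" % (ease, grade))

The key point is that the scoring is repeatable: the same text always produces the same number, which is what lets you set a pass threshold in a color review.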

For instance, here is the same copy scored in VisibleThread Docs using various readability measures. Just to explain: the blue text shows long, run-on sentences, the maroon text indicates passive voice, and the scores are standard readability measures.

This reveals several immediate issues:

  1. Poor readability score – The copy scores 21 out of 100 on the Flesch Reading Ease index, with a US Grade Level of 16. Both scores suggest the reader needs an advanced degree to follow it.
  2. Sentence length – The sentence is simply too long; multiple ideas compete for attention and there is no coherent message.
  3. Passive voice – Using active voice instead of passive makes clear who is responsible and clarifies your message.
  4. Bloated word count – It bloats the document word count, forcing you to omit more valuable information.
  5. High cognitive burden – The content forces the reader to read and re-read to understand it, and risks disengagement.

Consistency, Lack of Alignment & Non-Compliance

The final nail in the proverbial coffin for FASA was inconsistency and non-compliance. This is a broad area, and the determination cited a number of instances of poor alignment and consequent non-compliance. Let’s focus on one in particular: resume qualifications and the staffing plan.

Here’s one example from the GSA’s determination:

…the labor category skill level descriptions in FASA’s proposal specified [DELETED] certification for the project manager labor categories proposed under CLINs 009 and 010. However, inconsistent with these descriptions, the resume of the proposed key personnel [DELETED]–who was proposed to fill a project manager labor category under CLIN 010–did not reflect [DELETED] certification.

For the sake of example, let’s assume that the certification required under CLIN 010 was Project Management Professional (PMP), a fairly typical certification held by program managers. So basically the RFP required this certification and the resume of the proposed project manager did not reflect it.

Checking for this manually is tedious. Unfortunately, when you’re in a time crunch, omissions like this can easily be overlooked. We also see similar examples around security and clearance levels such as ‘Top Secret’ or ‘TS/SCI’.

Using software tooling, it’s really easy to spot these types of misses early.

Here’s an example where we’re searching for key capabilities across a group of resumes. It becomes obvious who has the skills we need. More importantly, we can easily see who does not!

The list on the left shows the capabilities we are searching for. Each column on the right is a specific individual’s resume. Numbers and shading show occurrence and density.
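
You can prototype the same kind of scan in a few lines of code. This sketch assumes the resumes have been exported to plain-text files in a resumes/ folder; the folder name and capability list are made up for illustration:

    import re
    from pathlib import Path

    # Capabilities we need evidence of (illustrative list)
    TERMS = ["PMP", "Top Secret", "ITIL", "AWS"]

    def term_counts(text, terms):
        # Case-insensitive whole-phrase occurrence count for each term
        return {t: len(re.findall(r"\b" + re.escape(t) + r"\b", text, re.IGNORECASE))
                for t in terms}

    resumes = sorted(Path("resumes").glob("*.txt"))
    matrix = {p.stem: term_counts(p.read_text(errors="ignore"), TERMS)
              for p in resumes}

    # One row per capability, one column per resume; a zero is a gap to chase
    print("Capability".ljust(14) + "".join(name.ljust(12) for name in matrix))
    for t in TERMS:
        print(t.ljust(14) + "".join(str(matrix[name][t]).ljust(12) for name in matrix))

A zero in the ‘PMP’ row under the proposed project manager’s column is exactly the kind of miss the determination describes.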

This was one of a series of inconsistencies that made the proposal non-compliant in FASA’s case.

Another example of poor alignment and inconsistency involved the implementation plan requirements. The mandate was to specify a ‘phase-in plan’ and a ‘transition plan’ separately within the implementation plan. Again, this did not happen.

Suffice it to say, tooling like VisibleThread Docs can slice through large volumes of documents and identify gaps in this area too.

—–

Takeaways:

  • You will get nailed for poor or non-existent acronym definitions.
  • Wire an acronym check into your color team review process. Check acronym definitions during red team review, but at the very least as part of your gold team review.
  • Readable content is very important. Simplify text by using short sentences and active voice where possible.
  • Where sentences have multiple clauses, read them aloud. Try to restructure to have one concept or message per sentence.
  • Use readability measures as a way to objectively score your documents. Specialist solutions like VisibleThread Docs work well. If you don’t have access to a dedicated solution, MS Word scores readability at the document level.
  • Make sure your proposal complies with the requirements and is consistent. In particular, review the staffing and implementation plans at red and gold team stages.
  • While manual checks may be laborious, they are a must. Make the process far more efficient by using tooling like VisibleThread Docs or an equivalent solution.
  • If you don’t wire these checks into your process, you dramatically increase the risk of submitting a non-compliant bid and losing.

I hope this post was helpful. Do you have any examples of compliance issues that torpedoed a proposal?

Let us know in the comments.

To see how you can instantly check acronyms, measure readability or check for consistency, sign up for a 7-day free trial of VisibleThread Docs.