
In this article, we'll outline the key concepts and activities involved in the next stage of the process: evaluating and scoring each proposal received from suppliers in response to an RFP.

What is an RFP Analysis?

Request for proposal (RFP) analysis, or RFP evaluation, is the set of criteria your business uses to guide the scoring of vendor proposals. It standardises the approach, sets expectations and minimises subjectivity when judging the proposals received. A well-defined RFP analysis helps your business make the best decision when selecting software.

How to Evaluate RFPs and Score Each Proposal


Evaluating RFPs can be an arduous and lengthy undertaking, depending on:

  1. The number of scorable elements in and associated with the RFP
  2. The number of proposals to be scored
  3. The length and complexity of the contracts involved
  4. The comparability of the pricing elements submitted in each proposal


Some time can be saved because each member of the RFP evaluation team has been assigned specific areas to evaluate based on their background and expertise, allowing much of the scoring to be conducted in parallel.

Knowing the evaluation criteria early on can also save time and set expectations.

Conversely, some time may be lost waiting for suppliers to provide clarifications or further information about aspects of their proposals queried during the scoring process.

An effective RFP evaluation process and scoring requires close attention to the following:

1. Rules of engagement

  1. No member of the evaluation team should:
    1. Have a personal interest in any participating supplier
    2. Have a financial interest in any supplier recommended for award
    3. Place themselves in any other type of conflict of interest situation
    4. Discuss any aspect of a proposal, evaluation outcomes or the organisation's intentions with respect to the RFP with anybody other than other evaluation team members and appropriate organisational personnel
    5. Have any direct or indirect contact with any representative of a participating supplier prior to, during or after the course of the RFP process, unless authorised by and in the presence of the Designated Contact
    6. Act alone or in collusion with any other evaluation team members for the benefit or to the detriment of any participating suppliers
    7. Accept gifts of any kind from participating suppliers, whether monetary or non-monetary, that are not in compliance with law or organisational policy
    8. Copy, disseminate or use for personal benefit of any kind, any information, whether marked as confidential or not, obtained by any method as an evaluation team member
    9. Reveal the identities of any other evaluation team members
    10. Bypass any security settings in a scoresheet and make unauthorised changes to the assigned requirement weights or group percentages

  2. Each member of the evaluation team must:
    1. Independently, fairly and impartially review and score each proposal
    2. Comply with their obligations regarding confidentiality
    3. Act professionally at all times and in accordance with the organisation's policies and ethical / integrity standards
    4. Advise team management immediately on becoming aware of any actual or potential conflict of interest, or the appearance of it, concerning the evaluator
    5. Promptly advise team management on becoming aware of any issues affecting the evaluator's ability to meet scoring deadlines
    6. Evaluate proposals received in response to the RFP based solely on the merits of the information presented
    7. Exercise prudence, judgment and common sense when providing any commentary in a scoresheet or issues log. Controversial remarks must be avoided at all times, especially where local legislation may require evidence of how the preferred proposal was selected, and of the fairness of the evaluation, to be made public, and may allow challenges to the RFP outcome as a consequence
    8. Register any gifts received from participating suppliers.


2. Evaluation team readiness

Scoring RFP responses requires each evaluation team member to have a thorough understanding of their role and of the tools available to help them.

A scoring process guideline and/or training, plus early review of each proposal prior to the official start of scoring, should cover:

  1. Reinforcement of the timelines and the potential consequences of delay
  2. The rules of engagement
  3. Each team member's overall scoring responsibilities, as assigned in stage 4 of the RFP process
  4. Each item a team member is responsible for scoring, whether a requirement, a cost element, the quality of a presentation or something else
  5. What the supplier has proposed with respect to each item, where applicable
  6. What needs to be considered in allocating a score for any particular item, such as how well a proposal satisfies a requirement
  7. The scoring scales to be used
  8. The scoring context that gives meaning to the scores assigned
  9. The scoresheets used to record the assigned scores and issue logs used to capture any comments or questions
  10. Other tools used to facilitate various activities during the RFP process.


3. Supplier team responsiveness

Evaluating and scoring RFPs typically requires a large amount of interaction with the participating suppliers.

Following release of the RFP to the invited suppliers, this interaction mainly involves the suppliers raising questions and comments about the requirements, the timetable and so on for the evaluation team to deal with.


This situation is mostly reversed on receipt of supplier proposals, with the evaluation team raising questions and comments about the submitted proposals for the suppliers to deal with.

A supplier team's response time to questions can vary with its workload priorities. When a supplier is dealing with multiple RFPs simultaneously, the more complex and typically higher-value RFPs are far more likely to get its attention, especially if the bid team is relatively small.

For the evaluation team, the knock-on effects of low supplier responsiveness are frustration and potential schedule overruns. For the slow-responding supplier, the effect can be lower scores.

It's important to prevent either side from hampering the other's ability to respond to questions, and to deal effectively with any response delays that do occur. This can be achieved by both sides:

  1. Working together to establish an issue prioritisation scheme with defined response times and escalation paths applicable to both sides (a sketch of such a scheme follows this list)
  2. Tracking the performance of each side in meeting those response times and triggering any needed escalation
  3. Clearly expressing questions or concerns, with a focus on eliminating ambiguity and minimising opportunities for misinterpretation
  4. Submitting questions and concerns as they are discovered, rather than batching them until a certain number has accumulated or until a set time of day or certain days of the week
  5. Promptly responding to requests for clarification of any questions or comments received
  6. Arranging conference calls managed by the Designated Contact and attended by the relevant questioners, as soon as the use of email for Q&A becomes unproductive.
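
To make the first point concrete, here is a minimal sketch of what an agreed prioritisation scheme might look like if captured in code. The priority names, response times and escalation paths are illustrative assumptions, not prescriptions from this guide.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PriorityLevel:
    name: str
    max_response_hours: int  # agreed turnaround before escalation is triggered
    escalation_path: str     # who gets involved when the turnaround is missed

# Illustrative scheme only: the levels, turnarounds and paths are assumptions
PRIORITY_SCHEME = [
    PriorityLevel("blocking", 4, "Designated Contact, then both team leads"),
    PriorityLevel("high", 24, "Designated Contact"),
    PriorityLevel("routine", 72, "raise at the next scheduled status call"),
]

def needs_escalation(hours_open: float, level: PriorityLevel) -> bool:
    """Flag an issue once its agreed response time has been exceeded."""
    return hours_open > level.max_response_hours
```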


4. Consistent score allocation

Some aspects of a proposal are simple to score, but others require varying levels of judgment and foresight. A consistent approach to allocating scores is required both within and across proposals.

Inconsistency, or drift, can creep in with the first proposals evaluated, while the evaluator settles into the rhythm of scoring. It can also occur with the later proposals, as evaluation fatigue sets in.


There are two ways to deal with drift:

  1. Review each proposal's scores the day after scoring has been completed to determine if the assigned scores still seem valid, and make any adjustments necessary
  2. After all proposals have been individually reviewed, do a side-by-side check of the scores assigned in the same areas of each proposal where the supplier responses are very similar. If some scores seem unjustifiably different given the similarity of the responses, make any adjustments necessary (a sketch of this check follows).
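
As a rough illustration of the second check, the sketch below lists each scored item side by side across proposals and flags large spreads for a second look. The data shape and the flag threshold are assumptions for illustration, and a flag only matters where the supplier responses really were similar.

```python
# scores[item][supplier] = score assigned; the values are made up
scores = {
    "REQ-001": {"Supplier A": 4, "Supplier B": 4, "Supplier C": 1},
    "REQ-002": {"Supplier A": 3, "Supplier B": 3, "Supplier C": 3},
}

SPREAD_THRESHOLD = 2  # assumed gap big enough to warrant a second look

for item, by_supplier in scores.items():
    spread = max(by_supplier.values()) - min(by_supplier.values())
    if spread >= SPREAD_THRESHOLD:
        # only meaningful where the underlying responses were very similar
        print(f"{item}: spread of {spread} across {by_supplier} - recheck")
```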


5. Price normalisation

Pricing is commonly a difficult area to evaluate. To compare supplier prices on a like-for-like basis, discrete pricing for each cost element needs to be provided.

Where insufficient detail is provided to allow some cost elements to be directly compared across suppliers, despite a request for more information, a normalisation process is required to provide estimates of what the comparable costs might be.


The normalisation process must be tailored to the exact circumstances. As an example, consider the case where one supplier provides a bundled price for a software licence covering (i) the licence, (ii) upgrades, and (iii) maintenance and support, while the other suppliers have provided individual pricing for each of these elements.

When a recalcitrant supplier declines to provide the requested level of detail, the normalisation process might proceed as follows (a worked sketch in code follows the list):

  1. For each supplier who provided the three required element prices:
    1. Calculate the bundled price as the sum of the three element prices
    2. Calculate the percentage of the bundled price that each element's price represents
  2. For each element, calculate the average percentage of the bundled price across all suppliers who provided the required details
  3. For the supplier who provided only a bundled price, apply each element's average percentage to that bundled price. This produces estimated element prices that can be used for like-for-like comparison across suppliers.
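
Here is that arithmetic worked through in a short sketch. The supplier names and prices are made up purely for illustration.

```python
# Itemised pricing from the suppliers who provided the requested detail
itemised = {
    "Supplier A": {"licence": 70_000, "upgrades": 10_000, "maintenance": 20_000},
    "Supplier B": {"licence": 90_000, "upgrades": 15_000, "maintenance": 45_000},
}
bundled_only = {"Supplier C": 130_000}  # declined to break out element prices

elements = ["licence", "upgrades", "maintenance"]

# Steps 1-2: each element's share of every itemised supplier's bundled price,
# then the average share per element across those suppliers
shares = []
for prices in itemised.values():
    bundle = sum(prices.values())
    shares.append({e: prices[e] / bundle for e in elements})
avg_share = {e: sum(s[e] for s in shares) / len(shares) for e in elements}

# Step 3: apply the average shares to the bundled-only price to estimate
# comparable element prices
for supplier, bundle in bundled_only.items():
    estimates = {e: round(bundle * avg_share[e]) for e in elements}
    print(supplier, estimates)
    # Supplier C {'licence': 84500, 'upgrades': 13000, 'maintenance': 32500}
```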


6. Scoring focus

We've already discussed how each element of a proposal requires its own scoring scale and scoring context, and how this information will be provided for guidance in each scoresheet. The focus of scoring can vary for different areas of the RFP, as follows:

  1. Best fit. Used where the scoring scale describes certain fixed outcomes (say, high cost and low risk, high cost and high risk, etc.) and requires the evaluator to use judgement to estimate supplier risk levels
  2. Level of acceptability. This is another judgement call on the part of the evaluator. It is commonly used for acceptability assessment of:
    1. a supplier, in terms of financial viability, quality of proposal / presentation, reference checks and so on
    2. a contract clause, where the evaluator typically needs to state any objection and propose alternative wording for the clause
  3. Level of fit. Typically required where a very specific requirement has been stated (say, widgets must come in bronze, silver and gold colours), and the score reflects how well a proposal satisfies that requirement
  4. Relative ranking. Required for items like pricing or delivery guarantees, where a direct comparison across supplier offerings is needed. The highest score is assigned to the supplier whose offering is most attractive, depending on whether a higher or lower offering is better. Scores for the other suppliers should decrease in line with the decreasing comparative attractiveness of their offerings (one possible formula is sketched after this list).
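
There's no single formula for relative ranking, but one common approach is to give the most attractive offering the maximum score and to scale the others by how close they come to it. A minimal sketch under that assumption:

```python
MAX_SCORE = 10  # assumed top of the scoring scale

def rank_scores(offerings: dict[str, float], lower_is_better: bool) -> dict[str, float]:
    """Score each supplier relative to the most attractive offering."""
    best = min(offerings.values()) if lower_is_better else max(offerings.values())
    if lower_is_better:
        return {s: round(MAX_SCORE * best / v, 1) for s, v in offerings.items()}
    return {s: round(MAX_SCORE * v / best, 1) for s, v in offerings.items()}

# e.g. total price, where lower is better
print(rank_scores({"Supplier A": 100_000, "Supplier B": 150_000,
                   "Supplier C": 130_000}, lower_is_better=True))
# {'Supplier A': 10.0, 'Supplier B': 6.7, 'Supplier C': 7.7}
```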


7. Scoring analysis

Depending on the legislative environment that the RFP is subject to, greater or lesser care may be needed to minimise opportunities for challenges to the RFP outcomes.

Irrespective of this, there is an overwhelming need to ensure that the best possible outcome for the organisation is obtained from the RFP.


The following checks need to be conducted on the scores received to discover if there is any cause for concern:

  1. On receipt of each evaluation team member's individual scoresheets:
    1. Delete any scores assigned outside the specific elements the evaluator is required to score
    2. Check if any requirement weights or group percentages differ from the master scoresheet templates and reset as necessary
  2. Following aggregation and averaging of all individual scoresheets into consolidated total scores, discussion with the relevant evaluators, and possibly further action, may be necessary if analysis shows that:
    1. Identical scores have been assigned to the same proposal elements by different evaluators, a potential indicator of collusion
    2. The scores of any evaluator for any specific proposal diverge consistently and significantly from the mean scores across all evaluators for that proposal, a possible indicator of bias for or against that proposal (both checks are sketched below).
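
A minimal sketch of both checks, assuming the individual scoresheets for one proposal have been collected into a simple mapping. The divergence threshold is an illustrative assumption and would need tuning to the scoring scale in use.

```python
from statistics import mean

# scores[evaluator][element] = score assigned to one proposal (made-up data)
scores = {
    "Evaluator 1": {"REQ-001": 4, "REQ-002": 3, "REQ-003": 5},
    "Evaluator 2": {"REQ-001": 4, "REQ-002": 3, "REQ-003": 5},  # identical to 1
    "Evaluator 3": {"REQ-001": 1, "REQ-002": 1, "REQ-003": 2},  # consistently low
}

# Check 1: identical scoresheets from different evaluators
sheets = {e: tuple(sorted(s.items())) for e, s in scores.items()}
evaluators = sorted(scores)
for i, a in enumerate(evaluators):
    for b in evaluators[i + 1:]:
        if sheets[a] == sheets[b]:
            print(f"{a} and {b} assigned identical scores: possible collusion")

# Check 2: consistent, significant divergence from the mean
DIVERGENCE_THRESHOLD = 1.5  # assumed; tune to the scoring scale in use
elements = list(next(iter(scores.values())))
means = {el: mean(s[el] for s in scores.values()) for el in elements}
for e, s in scores.items():
    avg_gap = mean(s[el] - means[el] for el in elements)
    if abs(avg_gap) > DIVERGENCE_THRESHOLD:
        print(f"{e} diverges from the mean by {avg_gap:+.1f}: possible bias")
```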


Wrap-Up


This stage of the RFP process deals with evaluation and scoring of supplier proposals, covering:

  • How the evaluation team members should prepare and conduct themselves in relation to the scoring process
  • The need for both sides to work together to minimise friction and delay
  • Approaches for achieving consistency in scoring and normalising pricing models to allow like-for-like comparison
  • Clarification of scoring contexts and methods for detecting potentially suspicious scoring.

You should now be able to plan, prepare for and, when ready, conduct the RFP stage activities outlined in this article.

Our next article in this RFP Guide series will outline the key concepts and activities involved in identifying any proposals warranting further consideration, obtaining updated proposals and negotiating final offers, negotiating and executing a contract, then finalising the RFP.


If you are considering running an RFP for a Contract Management Solution and would like to hear more about Gatekeeper, then please contact us today.

Rod Linsley

Rod is a seasoned Contracts Management and Procurement professional with a senior IT Management background, specialising in ICT contracts.

