Company Name:
Competency Framework Used:
Typical Interview Duration:
Number of Interview Stages:
Competency Framework & Role Profiling
Implement a competency framework that includes organizational core competencies (applicable to all roles), functional competencies (specific to job families), and role-specific competencies (unique to individual positions). Reference established frameworks such as the SHL Universal Competency Framework, Lominger/Korn Ferry competencies, or the CIPD Profession Map. Ensure each competency has a clear definition, behavioral indicators at multiple proficiency levels, and observable examples that can be assessed during interviews.
Conduct competency-based job analysis involving current role holders, managers, and stakeholders to identify the five to eight competencies most critical for role success. Distinguish between competencies required at entry (must-haves for selection) and those that can be developed on the job (developable through training). Prioritise competencies based on their importance for role performance and their difficulty to develop, focusing interview assessment on the highest-priority, hardest-to-develop competencies.
Create a proficiency scale (typically four to five levels) for each competency with specific behavioral descriptors at each level. For example, a competency like 'Stakeholder Management' might progress from Level 1 (builds rapport with immediate colleagues) through Level 4 (influences senior executives and external stakeholders to achieve strategic outcomes). Link each role's required proficiency level to the appropriate point on the scale. Use these levels to anchor interview scoring rubrics.
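A proficiency scale like this can be stored as simple structured data so that interview guides and scoring rubrics pull their anchors from one source. The sketch below is illustrative only — the names (PROFICIENCY_SCALES, ROLE_REQUIREMENTS, the "Account Director" role) and level wording are assumptions built around the 'Stakeholder Management' example above:

```python
# Hypothetical four-level proficiency scale, keyed by competency.
# Each list index + 1 is the proficiency level; the string is the
# behavioral anchor interviewers score against.
PROFICIENCY_SCALES = {
    "Stakeholder Management": [
        "Builds rapport with immediate colleagues",                        # Level 1
        "Manages expectations of internal stakeholders across teams",      # Level 2
        "Negotiates priorities with senior internal and external parties", # Level 3
        "Influences senior executives and external stakeholders "
        "to achieve strategic outcomes",                                   # Level 4
    ],
}

# Each role maps to the required proficiency level per competency.
ROLE_REQUIREMENTS = {
    "Account Director": {"Stakeholder Management": 4},
}

def required_level_anchor(role: str, competency: str) -> str:
    """Return the behavioral anchor for the role's required level."""
    level = ROLE_REQUIREMENTS[role][competency]
    return PROFICIENCY_SCALES[competency][level - 1]

print(required_level_anchor("Account Director", "Stakeholder Management"))
```

Keeping anchors in one structure means a change to a level descriptor propagates to every guide and rubric that references it.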
Allocate competency assessment across interview stages and interviewers so that each competency is assessed at least once and critical competencies are assessed by multiple interviewers for reliability. Avoid assessing more than three competencies per interviewer per session to ensure sufficient depth. Create a competency-interview mapping matrix for each role that shows which competencies are assessed at which stage and by whom. Share this matrix with all interviewers as part of the briefing process.
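The mapping-matrix rules above — every competency assessed at least once, critical competencies double-rated, no more than three competencies per interviewer per session — can be validated mechanically before interviews are scheduled. This is a minimal sketch under assumed names; the stages, interviewers, and competencies are invented:

```python
# Hypothetical competency-interview mapping matrix:
# (stage, interviewer) -> list of competencies assessed in that session.
MATRIX = {
    ("Stage 1", "Interviewer A"): ["Communication", "Problem Solving"],
    ("Stage 2", "Interviewer B"): ["Stakeholder Management", "Problem Solving"],
    ("Stage 2", "Interviewer C"): ["Leadership", "Stakeholder Management"],
}
REQUIRED = {"Communication", "Problem Solving", "Stakeholder Management", "Leadership"}
CRITICAL = {"Problem Solving", "Stakeholder Management"}

def validate(matrix, required, critical, max_per_session=3):
    """Return a list of rule violations; an empty list means the plan is valid."""
    issues = []
    coverage = {}
    for (stage, interviewer), comps in matrix.items():
        if len(comps) > max_per_session:
            issues.append(f"{interviewer} has too many competencies at {stage}")
        for c in comps:
            coverage[c] = coverage.get(c, 0) + 1
    for c in required - coverage.keys():
        issues.append(f"{c} is never assessed")
    for c in critical:
        if coverage.get(c, 0) < 2:
            issues.append(f"critical competency {c} needs a second rater")
    return issues

print(validate(MATRIX, REQUIRED, CRITICAL))  # [] means the plan is valid
```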
Question Design & Interview Guides
Craft questions that ask candidates to describe specific past experiences demonstrating each competency, using the STAR format: Situation (context), Task (responsibility), Action (what the candidate specifically did), and Result (outcome and learning). Develop two to three primary questions and two to three follow-up probes for each competency. Ensure questions are open-ended and role-relevant, and favour past-focused behavioral questions over hypothetical scenarios as the primary tool: meta-analytic research finds behavioral questions (past-focused) carry somewhat higher predictive validity (r = 0.51) than situational questions (r = 0.47).

Develop printed or digital interview guides for each role that include an opening script (welcome, introduction, process overview), competency questions in logical sequence, probing prompts for each question, time allocation per competency, scoring rubric with behavioral anchors, closing script (candidate questions, next steps), and note-taking template. Standardised guides ensure all candidates receive an equivalent experience and all interviewers follow the same process, which is essential for both validity and legal defensibility.
Train interviewers in probing skills to ensure candidates provide complete responses rather than vague or generic answers. Common probes include: 'What specifically was your role?' (clarifying Task), 'Walk me through exactly what you did' (deepening Action), 'What was the measurable outcome?' (strengthening Result), and 'What would you do differently?' (assessing reflection). Teach interviewers to distinguish between 'we' responses (team contributions) and 'I' responses (individual contributions) and to probe accordingly.
Supplement behavioral questions with situational questions ('What would you do if...') for candidates transitioning from other industries, early-career applicants, or roles requiring competencies that are difficult to assess through past experience alone. Design scenarios that are realistic and based on actual challenges the role faces. Create scoring rubrics that evaluate the quality of the candidate's reasoning, approach, and awareness of complexity rather than looking for a single 'correct' answer.
Track which questions yield the most differentiated and predictive responses by correlating interview scores with post-hire performance data. Retire questions that show poor discrimination between candidates or lack correlation with job success. Add new questions based on evolving role requirements, candidate feedback about question clarity and relevance, and emerging competency-based interviewing research. Maintain the library centrally and version-control all changes.
Interviewer Skills & Calibration
Require all interviewers to complete a training program covering competency-based interviewing principles, question delivery technique, active listening and note-taking, STAR response recognition, effective probing, scoring rubric application, and bias awareness. Use role-play with trained actors or recorded candidate responses to provide realistic practice. Certify interviewers upon successful completion and require recertification annually. Research shows that trained interviewers achieve significantly higher inter-rater reliability and predictive validity.
Organise regular calibration sessions where interviewers independently score the same candidate response (recorded video or written transcript) and then compare and discuss their ratings. Identify sources of scoring inconsistency and refine shared understanding of rubric anchors. Calculate inter-rater agreement statistics (e.g. intraclass correlation coefficient) and target a minimum of 0.70 for acceptable reliability. Address persistent low-agreement interviewers through additional coaching or temporary removal from panels.
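The intraclass correlation coefficient mentioned above can be computed directly from a complete ratings matrix. This is a sketch of ICC(2,1) — two-way random effects, absolute agreement, single rater, a common choice for interviewer calibration — assuming every interviewer scored every response; the `scores` data is invented:

```python
def icc2_1(ratings):
    """ICC(2,1): rows are scored responses, columns are interviewers."""
    n = len(ratings)                  # number of scored responses
    k = len(ratings[0])               # number of interviewers
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    msr = k * sum((r - grand) ** 2 for r in row_means) / (n - 1)   # responses
    msc = n * sum((c - grand) ** 2 for c in col_means) / (k - 1)   # raters
    sse = sum(
        (ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(n) for j in range(k)
    )
    mse = sse / ((n - 1) * (k - 1))                                # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Five responses, three interviewers, on a 1-5 rubric.
scores = [[4, 4, 5], [2, 3, 2], [5, 5, 5], [3, 3, 4], [1, 2, 2]]
print(f"ICC(2,1) = {icc2_1(scores):.2f}")  # compare against the 0.70 floor
```

Running the check after each calibration session gives the panel a concrete number to track against the 0.70 target rather than a subjective sense of agreement.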
Provide evidence-based training on biases most prevalent in interviews: first impression bias (forming judgements in the first few minutes), confirmation bias (seeking information that confirms initial impressions), similar-to-me bias (favouring candidates who share the interviewer's background), halo/horn effect (allowing one strong or weak impression to color all assessments), and contrast effect (comparing candidates against each other rather than against the rubric). Teach specific mitigation strategies for each bias type.
Require new interviewers to observe experienced colleagues conducting two to three interviews before interviewing independently. Follow observation with a co-interviewing phase where the new interviewer asks questions and scores alongside an experienced colleague who provides feedback. Progress to independent interviewing only after the new interviewer demonstrates consistent scoring aligned with calibration standards. This graduated approach builds competence while protecting candidate experience and assessment quality.
Evaluation & Decision-Making
Require each interviewer to complete their scorecard immediately after the interview, recording scores for each competency with specific behavioral evidence from the candidate's responses. Prohibit discussion of candidates between interviewers before individual scores are submitted. Use technology (ATS or dedicated evaluation platforms) to lock scores before the debrief begins. This practice is essential for preventing anchoring, groupthink, and social pressure from compromising assessment independence.
Structure the debrief by discussing one competency at a time rather than giving global impressions of candidates. For each competency, have the assigned interviewer share their score and evidence first, followed by additional evidence from other panellists. Discuss and resolve scoring discrepancies using evidence rather than seniority or persuasion. Document the agreed competency scores and the evidence supporting them. Make the overall recommendation based on the competency profile rather than a holistic gut feeling.
Create a weighted scoring matrix where the most critical competencies receive higher weightings in the overall score calculation. Define minimum threshold scores for essential competencies that candidates must achieve regardless of their overall score. Use the matrix to generate a rank-ordered candidate list based on evidence. Document the decision rationale and ensure it can withstand scrutiny from unsuccessful candidates, legal review, or audit. The matrix transforms subjective impressions into a defensible, transparent decision process.
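The weighted matrix with threshold gating can be sketched in a few lines. Everything here — the weights, the 1-to-5 scale, the single essential threshold, and the candidate names — is illustrative, not prescriptive:

```python
# Hypothetical weights (sum to 1) and one essential-competency floor.
WEIGHTS = {"Stakeholder Management": 0.40, "Problem Solving": 0.35, "Communication": 0.25}
THRESHOLDS = {"Stakeholder Management": 3}   # essential: must score >= 3 of 5

def evaluate(scores):
    """Return (weighted_score, passes_thresholds) for one candidate."""
    weighted = sum(scores[c] * w for c, w in WEIGHTS.items())
    passes = all(scores[c] >= floor for c, floor in THRESHOLDS.items())
    return round(weighted, 2), passes

candidates = {
    "Candidate A": {"Stakeholder Management": 4, "Problem Solving": 3, "Communication": 5},
    "Candidate B": {"Stakeholder Management": 2, "Problem Solving": 5, "Communication": 5},
}

# Rank only candidates who clear every essential threshold.
ranked = sorted(
    (name for name, s in candidates.items() if evaluate(s)[1]),
    key=lambda name: evaluate(candidates[name])[0],
    reverse=True,
)
print(ranked)  # Candidate B is excluded despite a competitive weighted total
```

Note the design choice: the threshold acts as a hard gate before ranking, which is what makes a high overall score unable to compensate for a failed essential competency.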
Offer all interviewed candidates specific, competency-based feedback on their performance, highlighting both strengths and development areas. Frame feedback in terms of observable behaviors and competency gaps rather than personal characteristics. Ensure feedback is consistent with the scoring rationale and does not expose the organization to legal risk. Timely, specific feedback enhances candidate experience, strengthens employer brand, and enables candidates to improve for future opportunities.
Quality Assurance & Continuous Improvement
Conduct criterion-related validity studies by correlating competency interview scores with subsequent performance ratings, objective productivity metrics, and retention outcomes. Calculate validity coefficients for each competency and the overall interview process. Identify which competencies are most predictive of success and which are weak predictors. Use findings to refine competency selection, question design, and scoring rubrics. Aim for validity coefficients consistent with published meta-analytic benchmarks (r = 0.51 for structured interviews).
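At its simplest, a criterion-related validity check is a Pearson correlation between interview scores and a post-hire outcome, computed per competency and overall. The sketch below uses invented data and assumes matched records for each hire:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two matched samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

interview_scores = [3.2, 4.1, 2.8, 4.6, 3.9, 2.5]  # per-hire interview totals
performance = [3.0, 4.3, 3.1, 4.4, 3.6, 2.7]       # 12-month manager ratings
r = pearson_r(interview_scores, performance)
print(f"criterion validity r = {r:.2f}")  # benchmark against r = 0.51
```

In practice the sample should be much larger than six hires, and range restriction (only hired candidates have outcome data) will bias r downward, so observed coefficients understate true validity.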
Analyse pass rates for each demographic group at the screening, interview, and offer stages. Apply the four-fifths rule and statistical testing to identify any stage where adverse impact may exist. Investigate root causes, which may include biased questions, inconsistent application of scoring rubrics, non-diverse panels, or cultural bias in competency definitions. Take corrective action to eliminate sources of adverse impact while maintaining the validity and rigour of the process.
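The four-fifths rule itself is mechanical: compare each group's selection ratio at a stage against the highest group's ratio and flag anything below 80% of it. Group labels and counts below are invented, and as noted above a statistical significance test should accompany this check given small samples:

```python
def adverse_impact(applicants, passed):
    """Flag groups whose selection ratio falls below 4/5 of the highest."""
    rates = {g: passed[g] / applicants[g] for g in applicants}
    top = max(rates.values())
    return {g: (rate, rate / top < 0.8) for g, rate in rates.items()}

applicants = {"Group 1": 120, "Group 2": 80}
passed = {"Group 1": 48, "Group 2": 20}

for group, (rate, flagged) in adverse_impact(applicants, passed).items():
    print(f"{group}: pass rate {rate:.0%}" + ("  <-- below four-fifths threshold" if flagged else ""))
```

Here Group 1 passes at 40% and Group 2 at 25%; since 25/40 = 0.625 is below 0.8, the stage would be flagged for investigation.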
Survey interviewers after each hiring cycle on the quality and relevance of interview questions, the usability of scoring rubrics, the effectiveness of the debrief process, and overall confidence in hiring decisions. Identify common frustrations and suggestions for improvement. Use feedback to update interview guides, refine training content, and simplify administrative processes. Demonstrating responsiveness to interviewer feedback increases engagement with the structured process.
Review the organizational competency framework annually to ensure it reflects evolving business strategy, market requirements, and workforce expectations. Assess whether the interviewing methodology remains aligned with current best practice and research evidence. Engage with external experts, professional bodies (CIPD, BPS, SIOP), and peer organizations to identify innovations and improvements. Update the framework, questions, and processes as needed, while maintaining sufficient stability for trend comparison.
The Competency-Based Interviewing (CBI) Framework is a structured assessment methodology where every interview question is specifically designed to evaluate a predefined competency — a skill, behavior, or attribute identified through job analysis as critical for success in the role. It eliminates the guesswork from candidate evaluation by replacing subjective impressions with evidence-based, scorable responses tied to measurable performance indicators.
CBI has its roots in the pioneering research of David McClelland at Harvard University, who argued in the 1970s that competencies are substantially better predictors of job performance than traditional measures like IQ tests, academic credentials, or years of experience. The behavioral interview methodology was further developed and standardised by the UK Civil Service, and has since been adopted by organizations worldwide — from the NHS and the BBC to Amazon, Deloitte, and the United Nations.
This competency assessment framework covers competency identification through job analysis, behavioral question design using the STAR method, standardised scoring rubrics with behavioral anchors, interviewer training and calibration, panel interview coordination, and evidence-based debrief protocols. It creates a consistent, research-validated approach to talent assessment that produces better hiring decisions, reduces unconscious bias, and delivers a fairer, more transparent candidate experience.
Unstructured, conversational interviews are among the weakest predictors of job performance in the entire selection toolkit — meta-analytic research consistently ranks their predictive validity well below that of structured methods. Competency-based behavioral interviews, by contrast, are among the strongest predictors when combined with standardised scoring rubrics and calibrated assessors. The predictive validity difference between these approaches is both statistically significant and practically substantial.
For your hiring managers, this skills-based interview framework provides a clear, usable structure that makes interviewing less stressful and more effective. Instead of struggling with what to ask or relying on "culture fit" gut feelings, interviewers have focused behavioral questions that target specific, job-relevant competencies with clear scoring criteria. This consistency also significantly strengthens your organization's legal defensibility against discrimination claims in hiring.
The framework also measurably improves candidate experience. Research by the Talent Board shows that candidates rate competency-based interviews as fairer and more professional than unstructured conversations because the evaluation criteria are transparent and consistent. Candidates can prepare effectively for behavioral questions, which means you see their genuine capability rather than their ability to handle conversational ambiguity or interview surprises.
The framework begins with competency identification — how to use systematic job analysis, role profiling, and stakeholder input to determine the five to eight key competencies that most strongly predict success in a specific role. It covers competency libraries and taxonomies, levelling frameworks that define what each competency looks like at different seniority levels, and prioritisation techniques for selecting which competencies to assess through behavioral interviews versus complementary methods like work samples or cognitive assessments.
The core section addresses behavioral interview design in depth. It provides step-by-step guidance on crafting STAR-format questions (Situation, Task, Action, Result), writing effective follow-up probes that distinguish genuine competency from rehearsed answers, creating scoring rubrics with specific behavioral anchors at each performance level, and distributing competency assessment responsibilities across multiple interviewers to maximise coverage and minimise interviewer fatigue.
The framework also covers interviewer training and calibration protocols, structured debrief and decision-making procedures, continuous assessment quality improvement, and adaptation guidance for different formats — panel interviews, virtual video interviews, assessment centre exercises, and sequential one-on-one sessions — and for different career levels, from graduate and entry-level roles through senior executive and board-level hiring.
Choose the Brief version for a ready-to-deploy competency interview guide with a scoring template and behavioral question bank organised by common workplace competencies, or the Detailed version for a comprehensive CBI program design guide including competency libraries, question design workshops, scoring rubric templates, and interviewer training materials.
Customize the framework by entering the specific competencies relevant to your target roles, your preferred interview format and panel structure, the number of assessment stages in your hiring process, and the seniority levels you are hiring for. The template fields help you build a tailored behavioral interview process specific to your organization's competency model and hiring needs.
Download as a PDF or DOCX to share with hiring managers, interview panel members, and your recruiting team. Hyring's free framework generator helps you implement competency-based interviewing best practices — the same evidence-based assessment methodology used by the world's most rigorous hiring organizations — quickly and professionally.