Creating Assessments from Competencies

In the second half of your job analysis workshop, work with SMEs to develop material for your job-related assessments.

The SMEs should decide which assessment types make sense for evaluating each competency and then create pass/fail assessments that measure the identified proficiency levels.

As a best practice, applicants should pass through multiple hurdles with multiple SMEs before an agency considers them qualified. To eliminate unqualified applicants in each round, each hurdle should measure something different. For example, one assessment can measure breadth of knowledge and the second can measure depth of knowledge. Or, a written demonstration project can measure technical competencies while a phone interview measures competencies more easily assessed through oral communication. Applicants who do not pass the first assessment do not progress to the second.
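
If it helps to picture the flow, here is a minimal sketch (in Python; the applicant IDs, hurdle names, and results are hypothetical, not part of any prescribed tooling) of applicants being considered qualified only after passing each hurdle in order:

    # Minimal sketch (illustrative only): applicants advance through hurdles in
    # order and are qualified only after passing all of them. IDs, hurdle names,
    # and results are hypothetical.
    applicants = {
        "A001": {"written_assessment": "pass", "structured_interview": "pass"},
        "A002": {"written_assessment": "fail", "structured_interview": None},
        "A003": {"written_assessment": "pass", "structured_interview": "fail"},
    }

    # Each hurdle should measure something different (e.g., breadth vs. depth).
    hurdles = ["written_assessment", "structured_interview"]

    def qualified(results: dict) -> bool:
        """An applicant is qualified only after passing every hurdle."""
        return all(results.get(hurdle) == "pass" for hurdle in hurdles)

    for applicant_id, results in applicants.items():
        print(applicant_id, "qualified" if qualified(results) else "not qualified")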

Use your defined competencies, proficiencies, and required experience to create possible assessment materials in breakout groups. Have the SMEs from other groups test the materials and record their test responses. Use those responses to iterate on the questions (especially on the time required to answer them) and create guidance for SMEs reviewing applicant responses, including possible follow-up questions for interviews. You can also use test responses later when training SMEs on the final materials.

This page covers some available assessment options.

Assessment options summary

During job analysis you will select additional assessment hurdles to measure an applicant’s competencies. For example:

  • Work samples
    • These can be requested at the time of application
  • Written assessments
  • Structured interviews
    • We recommend phone calls or conference lines over video meetings
  • USAHire assessments

You should discuss options thoughtfully with hiring managers before the job analysis workshop, then plan the content of the assessments with SMEs during that workshop. If you use more than one assessment, each must measure something different. This allows you to disqualify applicants from the second assessment if they do not pass the first.

If you are creating a new work simulation assessment with a passing score, think through the following:

  • Is the assessment linked to the job analysis, reflecting what is required to do the job from day one?
  • You are required to describe this passing-score assessment hurdle in the job announcement language under “how you will be evaluated.” Make it clear that applicants must pass the assessments to be considered qualified.
  • The scoring for the assessment must be clear so it’s obvious to all SMEs which applicants pass and which do not.

See a longer list at this reference link.

Scoring criteria

Scoring criteria should be defined when the assessment is created. Based on the scoring criteria, the applicant will either meet or not meet the required competency and proficiency level being tested. If the hiring action uses the assessment both as a pass/fail screen against the minimum bar and to later create categories, reviewers can also note on the scoring sheet whether an applicant “exceeds” the requirement. In this case, provide reviewers with a definition of what it means to “meet” versus “exceed” the required proficiency level. SMEs can also define what a “good” answer looks like ahead of time, while they are creating the assessment.

Example competency scoring: Analytical Ability

  • Meets Requirements: The submission addresses the issues and evidence identified in the answer key table and provides a recommendation.
  • Exceeds Requirements: In addition to meeting the “Meets Requirements” definition, the applicant’s summary includes additional detail and nuance demonstrating mastery of the subject matter.
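
To make the pass/fail and “exceeds” logic concrete, here is a minimal sketch (in Python; the competency names, rating labels, and category-rating rule are illustrative assumptions, not a prescribed tool) of how a reviewer’s scoring sheet could be tallied:

    # Minimal sketch (illustrative only) of tallying one reviewer's scoring sheet.
    # Competency names and rating labels below are hypothetical.
    scoring_sheet = {
        "Analytical Ability": "exceeds",
        "Written Communication": "meets",
        "Problem Solving": "meets",
    }

    def passes(sheet: dict) -> bool:
        """Pass/fail: every required competency must be rated at least 'meets'."""
        return all(rating in ("meets", "exceeds") for rating in sheet.values())

    def exceeds_count(sheet: dict) -> int:
        """Optionally used later to place passing applicants into categories."""
        return sum(1 for rating in sheet.values() if rating == "exceeds")

    print(passes(scoring_sheet))         # True -> meets the minimum bar
    print(exceeds_count(scoring_sheet))  # 1 -> could inform category placement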

Written assessments

Work sample essay

In this type of assessment, applicants are provided with data, information, policies, or memos. Their task is to review the materials and provide a structured response based on them. This may involve interpreting the data, discussing the pros and cons of a policy, or designing a communication plan or roadmap to address the materials. The work product should be tied to one or more competencies and reflect the type of work applicants would be assigned upon first starting the job.

Work sample essay example:

Imagine you are asked to modernize and update [example]. Create a research plan to outline how you would identify pain points for [example].

General essay: short answer

This type of assessment requires the applicant to provide short answer responses to a set of questions based on a given scenario. Each question should be tied to a specific competency and have explicit scoring criteria and instructions. You may have one or multiple questions per competency. This method is more open-ended and can be subject to an SME’s interpretation.

Short answer example

There are currently 10 different systems that support 30 different programs within an agency. The agency plans to consolidate these 10 systems into just one. You have been brought on to help with this initiative.

  1. What questions would you want to answer? What assumptions would you make?
  2. What would you propose to the agency for how to accomplish this?
  3. What would a potential project plan / roadmap look like?

General essay: long answer

This type of assessment requires the applicant to provide a longer essay response based on a given scenario. The essay response should allow applicants to demonstrate their proficiency level across the various competencies. This method is more open-ended and can be subject to an SME’s interpretation.

Long answer prompt example

You’ve been tasked with solving “x” using “y” methods. Detail a plan of action, with the steps you’d take. In your plan, account for various risks, such as what you would do if a stakeholder blocked your efforts and alternative methods you would consider.

Written assessment tips

  • The assessment type should be appropriate to evaluate the applicant’s proficiency level within the pre-defined competencies.
  • Ideally, you will have time during the workshop for live testing of assessment questions. If you end up revising the assessment after the workshop, the revised assessment will need to be tested.
  • As part of testing, record the time it will take to conduct the written assessment. Add that to the job announcement and assessment instructions.
  • You should provide an estimated timeframe within the JOA in which applicants can expect a written assessment to be sent (e.g., “If you are selected to move forward, you should expect to receive a written assessment within approximately 3-5 days of this announcement closing.”).
  • To keep the applicant burden manageable, give applicants a maximum word count for their responses.

Structured interviews

All pilots to date have included a structured interview as the final assessment step. This pass/fail interview allows an SME to speak with an applicant to evaluate their proficiency and verify that they qualified without external assistance. This avoids a process in which an unqualified candidate lands on the cert and is discovered only during fit interviews, risking a cancelled certificate.

If using two structured interviews

Phone Assessment Interview 1 tests the applicant’s breadth of experience by evaluating their basic knowledge across all required competencies. Write one question per competency and include follow-up questions to determine whether the applicant meets the required proficiency level. Breadth interviews tend to take 30 minutes and could be conducted asynchronously. Asynchronous interviews allow the agency to use tools where applicants receive an interview link and have a certain number of days to return a recording of their responses for SME review. The benefit of the asynchronous approach is flexibility for applicants and SMEs in recording and reviewing answers. The primary risk of the asynchronous approach is a poor applicant experience, as well as applicant concerns that AI is being used to assess their recordings.

Phone Assessment Interview 2 tests the applicant’s depth of knowledge across all required competencies. Depth questions test how an applicant reacts and responds to changes in the presented situation. Depth interview questions can cover one to two competencies per question and should include multiple follow-up questions that add complexity to the original question, such as “Now imagine…”. Depth interviews tend to take a full hour to conduct and should be live so that SMEs can ask follow-up probe questions as needed.

Types of interview questions

Ask applicants questions about past experience, hypothetical situations, and their viewpoints. In all cases, you can establish additional probe questions to help SMEs draw out more information, such as “What was your role?” or “Can you tell me more?”

  • Past Experience: Ask the applicant for a story from their work experience that describes the required level of proficiency (for example, “Tell me about a time…”). A qualified applicant will give specific and detailed answers, including the events leading up to the story, why they made the decisions they did, the lessons they learned, and what they would do differently.
  • Hypothetical Situation: Give the applicant a situation they are likely to encounter in the role you’re interviewing for (for example, “Imagine we have a problem with…”). Ask them to analyze what might be the cause, develop a plan or solution, or describe the pros and cons of a proposed approach. The setup to this question may be longer than other types and might be based on real challenges their organization has faced in the past. Ground a hypothetical situation in an applicant’s own experience by always including a follow-up question, such as “Can you tell me about a time you experienced a similar situation in your recent work experience?”
  • Applicant Viewpoints: Ask the applicant’s opinion about an issue with different schools of thought (for example, “What do you think about…”). Assess the applicant on their ability to justify their perspective, or, if they lack a strong preference one way or the other, to identify and contrast different opinions.

To get the best information about an applicant related to the competencies and proficiencies you’ve documented, avoid creating these types of questions:

  • Brain teasers or puzzles create stress for the applicant and don’t test their skills.
  • Self-assessing strengths and weaknesses creates disingenuous answers that don’t relate to competencies.
  • Five-year plans and future goals don’t test competencies and can reveal inappropriate information that introduces bias.