HR Open Technology

Innovations in Assessments

June 16, 2020 HR Open Standards Season 1 Episode 3

Learn about the value of standards in staged assessments. Throughout this podcast we'll cover:

  • The increasing trend and advantages of short assessments
  • New assessment technologies and media
  • How the standards impact an assessment workflow with ATSs and recruiting companies
  • Seamlessly integrating interviews
  • Impacts of short assessments, lessons learned, and next steps
Kim Bartkus:

This is Kim Bartkus with the HR Open Standards Consortium. Today, we have several speakers sharing their expertise on innovations in assessments, interviewing, and the standards. I'd like to introduce David Steckbeck, the project lead of the assessments work group. David, welcome.

David Steckbeck:

Hi Kim. Thank you for having me here today. We wanted to highlight and discuss a few of the items that we're working on as the assessments work group here at HR Open. Before we get into that, let me introduce our presenters today. So you've heard my name: I'm David Steckbeck, a consultant and the project lead with the assessments work group. We also have Rick Barfoot with HRNX, and he's their CTO. Jim Elder is a principal consultant with DDI World, and we have Greg Meyer, who's the VP of Technology Strategy at Modern Hire. So to get us started with some of the items that we're working on in the work group, let me jump into staged assessments. I'm going to pass the mic over to Rick and Jim to talk about staged assessments a little bit.

Jim Elder:

Well, thank you, David. Staged assessments are a way for us in the HR tech industry to develop a standard that can be used to connect multiple shorter assessments in a single assessment order. It allows clients, assessment vendors, applicant tracking systems, and even learning platforms to connect shorter assessments, video interviewing tools, phone interview tools, content simulations, all sorts of new and innovative tools, and to sequence those tools using conditions and progression, so clients can really build out their hiring process or their career development processes in their platforms. And Rick, feel free to jump in if you want to add to that.

Rick Barfoot:

Yeah. I just wanted to add that one of the trends we're seeing in dealing with many different clients and assessment vendors is the move toward shorter assessments, built directly into the application process or as part of it. And there's a lot of sensitivity to applicant drop-off if the process is too long. So having the ability to stage an assessment or chunk it down, whether it's within a single assessment vendor or across multiple assessment vendors, or even other types of screening vendors, is a huge new capability that this new standard will support.

David Steckbeck:

Well, thank you, gentlemen. And then we wanted to talk about short assessment advantages. Rick, would you mind discussing the short assessment advantages?

Rick Barfoot:

Sure. Yeah, and I'll ask Jim to jump in also; I think he has a lot of experience with this at DDI as well. As I was mentioning, we're seeing a trend toward shorter assessments instead of what I've seen, and gone through myself, in the past: 90-minute or sometimes 60-minute assessments. That's just not practical during an application process; it will absolutely lead to drop-off. If the goal is to make sure that you're getting people through the process, and you especially don't want to scare off your qualified and interested candidates with a lengthy application process, having a shorter assessment is definitely an advantage. The results of the earlier assessments can then be used to determine whether the candidate progresses to a certain stage in the recruiting cycle, and/or whether they're offered a different assessment, potentially a lengthier offline one.

Greg Meyer:

Yeah, this is Greg. I think another area we should talk about is that this allows for some creativity with respect to workflows. You could imagine a core assessment you'd like to do with anyone you might be interviewing, and use that to redirect them to, say, another opportunity within the company, which may have a different assessment or a different skills assessment related to it. So now you can chunk it down instead of having them go through four big, long assessments for four different positions. It's like: okay, let's do this quick check; now maybe send them an assessment specific to another opportunity inside the company they may be a better fit for.

Jim Elder:

Absolutely, thank you, Greg and Rick. It also leads to better research and analytics: shorter assessments can be focused on single competencies, and we can quickly adapt and improve scoring algorithms for shorter, more focused assessments. Having the progression built into the staged assessments standard gives us the opportunity to create progression for clients in selection, so candidates proceed from, perhaps, that core behavioral assessment on to acumen, potential, or other types of assessments that can really help pinpoint skills and provide focused analytics for onboarding and development as candidates become employees and progress through their career development. The advantages are numerous, and we've learned through our meetings and discussions how staged assessments really open doors to several innovations in the industry.

David Steckbeck:

Thank you, guys. And with respect to new technologies, Jim, would you mind leading us off and discussing that?

Jim Elder:

Not at all. Thank you, David. It builds off the innovations in the industry. We know that there is a lot of interest in mobile assessments these days; mobile really is a new platform for assessment vendors to focus on, providing quick and just-in-time development and assessment opportunities using mobile devices. Research shows that shorter assessments work much better given the UI and the experience of mobile assessments. There are also other technologies, such as games, simulations, and video interviewing tools, as well as content and sequenced learning paths, that can help clients take advantage of some of the new emerging technologies in our field.

David Steckbeck:

And regarding the staged assessments workflow that we've been working on, what you can see here is a diagram of how this works. So we have a candidate who applies for a position, and when they come in, the ATS, the assessment vendor, or the customer, whoever is expecting this candidate, prepares an assessment order package. That order package is effectively a bunch of JSON or XML, depending on which standard you are affiliated with in terms of your technology. As you can see from here, we've got the cyan-colored and the green-colored boxes indicating the assessment order request flow. The request is then sent over to the ATS or to the assessment vendor, and then a response is returned back. In the HR Open assessments standard, this is where we define what is being put in there. As a work group right now, we're adding the staging into this, and this is what you can see here. So you create the order, the invitation is sent over to the candidate or the person taking the test, they take the test, and that is processed. Then when it comes back through the system, a report request is sent in, either by the client company or the ATS, to the assessment vendor. Rick and Jim, could you describe a little bit about how this works with the workflows that you see out there?
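To make the "order package" idea concrete, here is a minimal Python sketch of what a staged assessment order might look like as JSON. All field names (`assessmentOrder`, `stageId`, `progressIfScoreAtLeast`, and so on) are hypothetical illustrations, not the actual HR Open assessments schema:

```python
import json

def build_staged_order(candidate_id, stages):
    """Build a hypothetical staged assessment order package.

    Field names here are illustrative only; the real HR Open
    assessments standard defines its own structure.
    """
    return {
        "assessmentOrder": {
            "candidateId": candidate_id,
            "stages": [
                {
                    "stageId": i + 1,
                    "assessmentId": s["assessmentId"],
                    # Minimum score on this stage before the candidate
                    # progresses to the next one (0 = always progress).
                    "progressIfScoreAtLeast": s.get("threshold", 0),
                }
                for i, s in enumerate(stages)
            ],
        }
    }

order = build_staged_order(
    "cand-123",
    [
        {"assessmentId": "core-behavioral", "threshold": 70},
        {"assessmentId": "skills-check"},
    ],
)
payload = json.dumps(order, indent=2)  # the JSON body sent in the order request
```

The point of the sketch is that the whole sequence, including the progression thresholds, travels inside one request, which is exactly what lets the staging logic live outside the ATS.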

Rick Barfoot:

Sure, it's Rick here, I can take that one. First, keeping in mind that a lot of the recruiting systems that have built or exposed APIs to facilitate assessment integration have assumed that it's just a single test being ordered. So it's generally just a request, or it's the assessment vendor calling in to pick up a request, for a single test. This effectively allows us to expand what we can do in terms of externalizing the logic of progressing an individual candidate through a series of assessments. From the recruiting system point of view, or I guess an LMS as well, it's generally looking at it as a single request for a single test, and then expecting a single result for that order that initially went in. So with what I would argue is a relatively small change, the request can be modified so that it includes more of that logic. The logic doesn't have to be coded in the recruiting system itself; it's effectively encapsulated in the JSON that forms part of that request. And so, from the ATS point of view, it's still a single order request that goes out; it just has more details in it, more logic, more parts to it. All of that work happens externally, and at the end of the day, it's still a single result, or report, or set of results that gets posted back into the recruiting system. That's how I see it being effectively used to augment what is typically found out there today. Jim, I don't know if you had some other thoughts on that.

Jim Elder:

I do. That's a really good point, too, Rick. I think the standard allows us to level the playing field a little bit. The capability to have all of that progression and those rules built into the assessment order request itself allows different assessment vendors, different content vendors, and different platforms, such as applicant tracking systems and learning management systems, to take advantage of the progression that's built into the order itself, rather than trying to code it into the platform. That makes it interoperable across a wider range of platforms, and really allows the client to use a wide range of systems and build that progression, to their specification, into the request itself.

Rick Barfoot:

That's a good point. Also, just to add to that, Jim, on the competitive aspect: there are very, very few recruiting systems that I'm aware of in the market that have that type of logic built in for managing a battery of assessments, for example. So this could be an effective way for other recruiting systems to close that competitive gap.

David Steckbeck:

Thank you, gentlemen, for those insights into the staged assessment feature that we've been putting into the standard. Now we wanted to talk a little bit about integrating interviews into the assessments work. Greg, would you mind talking a little bit about that?

Greg Meyer:

Sure thing, thank you. This carries on from the previous discussion. What we've discovered is that an interview is just another form of assessment, and generally a very structured one: there are specific questions that might target certain skills or competencies, and then there's rating and scoring associated with that. What we're discovering, as clients mature their level of expertise with interviewing, is that they're starting to combine these interviews with assessments. The opportunity we see is in those situations where, as an interviewing company, we have our own internal assessments, but we also partner with other assessment providers to package those into a single assessment that a recruiting system might order. So the slight difference in this workflow is that we may be a vendor for a recruiting system, and we're going to deliver some kind of staged assessment with a structured interview, and then a skills assessment or a personality assessment, maybe a gamified assessment, that gets packaged, with a single status returned back to the applicant tracking system and the recruiting system. We're seeing that for a couple of different reasons. One is our clients maturing; they're looking for a more robust assessment to be performed. But it also provides an opportunity for other assessment providers that may not necessarily have a direct integration with that recruiting system or applicant tracking system. Especially in the enterprise market, becoming a vendor and integrating with other systems can be incredibly expensive and time-consuming, not just from a technology perspective, but from a security review perspective. This is a way for us to work with other vendors, combine that assessment into a series of staged assessments, and return that final result to the client.
And again, those staged assessments could have areas of guidance where, if a candidate scores at a certain level, they may not necessarily auto-progress to the next stage. So that really gives us the opportunity to package that and work with other vendors to provide a complete assessment for a candidate. We're seeing that because our clients want to do both traditional interviews, or some sort of traditional interview, along with assessments. We're getting requests for integration with skills assessment providers, personality assessments, and gamification, and that all rolls back into what we described earlier: a smooth candidate experience, where we can keep those assessments short and deliver them consistently, so the candidate doesn't feel like they're being handed off from platform to platform. It gives us the ability to deliver to the market quicker and gives our clients higher confidence that they're making the right decision with respect to that candidate. It also gives us a way to package up the results in a way that's a little more consumable. So instead of having four different results to review, this gives us a way to package them, maybe delivered as one recommendation based on the four, or two, different assessments a candidate might have gone through. And then finally, the other area that we're really excited about is that this gives us more data to pull in, and then, using that with machine learning, we can maybe combine those assessments into a single guidance score for our clients, to help prioritize which candidates they should focus on first.
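One way to picture the single recommendation Greg describes is a simple weighted roll-up of stage scores. This is a deliberately naive sketch with made-up weights and a made-up cutoff; real providers use far more sophisticated (and validated) scoring models:

```python
def combined_recommendation(stage_scores, weights=None):
    """Roll several stage scores (0-100) into one guidance score.

    A naive weighted average, illustrative only; `weights` and the
    recommendation cutoff of 70 are arbitrary assumptions.
    """
    if weights is None:
        weights = {name: 1.0 for name in stage_scores}  # equal weighting
    total_weight = sum(weights[name] for name in stage_scores)
    score = sum(
        stage_scores[name] * weights[name] for name in stage_scores
    ) / total_weight
    return {"guidanceScore": round(score, 1), "recommend": score >= 70}

# Hypothetical candidate with three completed stages; the interview
# is weighted twice as heavily as the other assessments.
result = combined_recommendation(
    {"interview": 82, "skills": 74, "personality": 65},
    weights={"interview": 2.0, "skills": 1.0, "personality": 1.0},
)
```

The value of even a crude roll-up like this is that the client reviews one number per candidate instead of three or four separate vendor reports.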

Rick Barfoot:

So if I could just add a couple of points to that, David: I had focused my earlier comments more on the recruiting system vendor side, and it is a possibility that this new staged assessment standard is implemented on the recruiting system vendor side. But as Greg just mentioned, it could equally be implemented on the vendor side: the assessment vendor, the video interview vendor, whoever's receiving that order or acting as, let's say, the primary vendor and managing that downstream process, whether it's multiple things within a single vendor's capabilities or it's reaching out and tying into other vendors, as in some of the examples that Greg gave. The third option is that it could also be part of the integration platform. If there is an iPaaS, an integration platform as a service, or something else in the middle brokering the requests from the recruiting system to the vendors, that logic could be controlled there as well; the iPaaS could act as the broker of all of those requests. The thing about those last two examples is that it really gives the ATS a capability that it didn't have before, with little to no changes on the ATS side, because instead of ordering assessment A, you could be setting up a package one-two-three that is combined with all of this logic. All the ATS knows is that it's ordering package one-two-three, and all of that downstream logic, brokering between the different vendors and/or within a single vendor for multiple tests, is done by that external entity, whether it's an iPaaS or the vendor themselves.
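The broker role Rick describes, whether it lives in an iPaaS, the primary vendor, or the ATS, boils down to one control loop: deliver each stage, check the threshold, stop or continue. A minimal sketch, in which `deliver` is a hypothetical stand-in for the real vendor API call:

```python
def run_staged_order(stages, deliver):
    """Broker loop: deliver each stage via its vendor, stopping early
    when a stage's progression threshold is not met.

    `deliver(vendor, assessment_id)` is a placeholder for a real
    vendor integration and returns a numeric score; everything here
    is illustrative, not an actual HR Open or iPaaS API.
    """
    results = []
    for stage in stages:
        score = deliver(stage["vendor"], stage["assessmentId"])
        results.append({"assessmentId": stage["assessmentId"], "score": score})
        if score < stage.get("threshold", 0):
            break  # candidate does not progress; report what we have
    return results

# Fake vendor responses for illustration.
fake_scores = {"core-behavioral": 55, "skills-check": 90}
deliver = lambda vendor, assessment_id: fake_scores[assessment_id]

results = run_staged_order(
    [
        {"vendor": "vendorA", "assessmentId": "core-behavioral", "threshold": 60},
        {"vendor": "vendorB", "assessmentId": "skills-check"},
    ],
    deliver,
)
# The loop stops after the first stage because 55 is below the threshold of 60.
```

Because this loop is the only place the progression rules live, the ATS never needs to change when stages are added, removed, or re-ordered.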

Greg Meyer:

Yeah. The brokering opportunity you're bringing up is actually a great idea because, as you can imagine, you've got two or three vendors that would like to work together, and if you select one vendor, there might be some discomfort with allowing that one vendor to have control of the situation. But using a brokering service, like you described, two or three vendors could work together on that combined experience and use the broker to make sure there's a neutral entity managing the staging between the different assessments.

Rick Barfoot:

Yeah, I think a pretty common use case out there is that a client has worked with a vendor for a long time, maybe an assessment vendor doing a more traditional, let's say, personality assessment or job-fit type assessment. They're happy with that vendor, but there's a startup that focuses on culture fit, and that's something their current assessment vendor simply doesn't do. They trust their existing assessment vendor, they've worked with them for a long time, so they're going to tap them to manage that external vendor they want to tie into the process, and bake it in wherever it fits logically in the stages for a particular job or a particular client.

Greg Meyer:

Yeah. That's exactly the use case we're running into. Take a large enterprise company like a Siemens or a Procter and Gamble, where the effort to move something into their mainstream workflow and integrate it into the enterprise environment is less of a technical hurdle and more of a corporate hurdle in terms of the formal review process. And so this really makes it much easier to do something like that.

David Steckbeck:

Thank you, gentlemen, for that insight. And now we want to talk about learning assessments and the content related to those. Jim, would you mind helping us out?

Jim Elder:

Not at all, David, thank you. Just to tie back into Rick and Greg's conversation: whether it's the supplier, the requester, or a broker who helps build that integration on behalf of the client, it really empowers the client at the end of the day to build their selection and development processes. As we talk about learning assessments and content, it really empowers the client to blend their processes the way they think best, whether that's a single assessment vendor, a single content vendor, or multiple content or assessment vendors, video interviewing, or other new, innovative opportunities. This new standard really allows clients to design their learner and candidate experience the way they'd like to see that process flow. The new staged assessments work very well with content. We work very often with leadership development, and research has shown that more planned and sequenced learning has a more impactful outcome. As leaders learn to embrace new opportunities, as they're identified as potential leaders, we can use new technologies such as virtual reality assessment tools to assess intelligence or emotional intelligence, and content and assessments can be built into a single workflow. Some of the other new technologies we talked about earlier, games, video interviews, all of those can be built for the purpose of selection and for development. Having the chance to do that in a planned and sequenced way, in which the empowered client can control the process and work with consultants and industry experts to sequence that learning experience or that selection process, is really a wonderful opportunity we believe in, and we've worked hard to build that into our staged assessments. We also believe that shorter assessments and shorter bits of content help with retention.
Research shows it also helps reduce the number of distractions, leading to more focused learners. A short piece of content or a short assessment that's maybe 15 or 20 minutes long has a better outcome than the longer 60- or 90-minute assessments. It also provides opportunities for learners to take a break, and to use the devices we talked about earlier, such as assessments and content on mobile devices. It even allows for things that are manually scored, such as simulations or longer-lasting assessments, to be delivered in sequence and in small, shorter segments. So staged assessments really provide the opportunity to use these emerging technologies and to request multiple pieces of content or assessments in one request. For our work group and for DDI, it really opens the door to what we believe is a better experience for learners and candidates.

David Steckbeck:

All right, well, thank you, Jim. Team, what's next? We've got multiple suppliers with a single requester under requester control; multiple suppliers with a single requester under distributed supplier control; connecting assessment results to learning systems; and learning platform API standards, in order to bring the content of those learning programs into the HR Open standard and make up a hybrid payload. So those are just some of the items we've got coming up that we've been discussing, and we would love to have your help with them. Are there any of these feature items that you gentlemen want to discuss real quick?

Rick Barfoot:

Well, it's Rick here again. I'll jump in on the first two points there, because we focused this discussion mostly on the first use case, where there's a single controller, centralized control. We talked about how the recruiting system could be acting as the controller, or it could be an iPaaS, an integration platform or middleware, acting as the controller, or it could be a vendor themselves acting as the controller between themselves and other vendors, or even just within their own system. But the second one, the idea of distributed supplier control, is almost like a daisy-chain type idea, where the order gets passed from one vendor to the next. Each vendor gets enough information about the previous one and the next one so they know what to do and how to respond. So, for example, the order gets initiated from the recruiting system and goes to vendor A. Vendor A completes their piece, or I should say the candidate completes the assessment from vendor A, and based on the result that comes back, vendor A makes the decision: do I report that directly back to the recruiting system to say we're done, or, if it passes a certain threshold on the test, do I pass it on to the next vendor in the list? As you can imagine, that's a lot more complicated. We had initially looked at that one as a working group and decided to focus first on the simpler case with a single requester. It is a very interesting approach or use case as well, to have distributed supplier control, because it takes the onus off any single vendor, or part of the integration value chain, so to speak, to act as the controller or build the logic. It really lets each vendor focus on what they need to do and then make the decision: do I pass it back to the original requester, or do I move it on to the next vendor in the list?
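The daisy-chain decision Rick walks through, report back or pass the order on, can be sketched from one vendor's point of view. This is a hypothetical illustration of the deferred use case, not anything the work group has specified:

```python
def next_hop(my_score, threshold, remaining_chain):
    """Decide, from one vendor's position in a daisy chain, whether to
    forward the order to the next vendor or report back to the requester.

    Illustrative only: the working group deferred this distributed-control
    use case, so the field names and rule here are assumptions.
    """
    if my_score >= threshold and remaining_chain:
        # Candidate passed this stage and vendors remain: pass it on.
        return {"action": "forward", "to": remaining_chain[0]}
    # Either the candidate did not pass, or this was the last stage:
    # report the outcome back to the original requester.
    return {"action": "report", "to": "requester"}

# Candidate scores 80 on vendor A's test against a threshold of 70,
# so the order moves on to the next vendor in the chain.
decision = next_hop(80, 70, ["vendorB", "vendorC"])
```

Note that each vendor only needs local knowledge, its own threshold and the next hop, which is exactly what removes the need for any single party to hold the whole workflow.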

David Steckbeck:

Yeah, that's a good point. We did encounter challenges with that one, and we did elect to tackle the first, simpler use case of staged assessments that we created here. Jim or Greg, did you want to briefly talk about connecting assessment results to learning systems, or the learning platform API standard?

Jim Elder:

Sure, David, I'll be happy to jump in. On an API standard for connecting learning systems: while there are commonly adopted industry standards for integrating content into learning platforms and for transferring and collecting information on the learner experience, there really isn't a fully built-out, end-to-end API standard for connecting learning platforms. We believe the assessments standard is well fit for that opportunity. We've already built the ability for assessment results to transfer to learning platforms, and staged assessments, as we've discussed throughout this presentation, are really a great opportunity to connect content to learning platforms.

Greg Meyer:

Yeah, this is Greg. We're starting to see that demand from our clients: they want the opportunity to present materials to candidates, possibly as part of their learning platform, to educate candidates about the opportunity and the company they might be working for. They already have that material in their learning platform and want to be able to integrate it into the candidate experience as part of the whole interviewing and assessment process. So we see this as a potentially huge integration opportunity between our products and learning systems.

David Steckbeck:

Well, thank you, Rick. Thank you, Jim. Thank you, Greg, for all of your input. We want to thank you, the listener, for taking the time to follow along with us. We certainly hope that this was educational enough for you to want to make a change out there. We would want you to adopt these standards, use them to help empower your communication abilities, and make your processes more efficient. Thank you for listening.