While some people shouldn’t ever be driving, your average car is pretty idiot-proof these days. If you leave your headlights on, it beeps at you. If you don’t have your seatbelt on, more beeping. You can’t even lock your keys inside a car these days because the door will refuse to lock. Can’t get lost due to built-in GPS. Heck, if you don’t want to park or drive, they’ll do that for you now. Mostly, it’s for your own safety and convenience. The car makers know that humans make mistakes and they do their best to safeguard against inevitable errors.
Your Applicant Tracking System (ATS) can certainly cost as much as a car each year, and there are probably even more ways for human error to creep in. With multiple employees entering data by hand, your overall data quality can quickly go south and render any sort of analysis faulty. What can you do to “idiot-proof” your ATS usage?
Know Where You Make Errors
Especially if you have a lot of historical data to correct, you need to examine which fields in your ATS are frequently prone to errors. Depending on your ATS, you might be able to pull a report with null or “0” fields, to start. Aggregating those fields might show a report like this:
In this case, the “fee” field had the most errors, and correcting these would result in a nearly 25% increase in data quality. To continue finding data issues, you can pull a report with the fee set to, say, anything above 100% to find Job Orders where the fee was probably entered incorrectly.
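The audit above can be sketched in a few lines of Python. This is a minimal example, not tied to any particular ATS: the field names (`fee`, `salary`, `start_date`) and the 100% fee ceiling are assumptions you would swap for your own export's columns and business rules.

```python
from collections import Counter

def count_missing_fields(records):
    """Count how many records have a null, empty, or zero value per field."""
    counts = Counter()
    for rec in records:
        for field, value in rec.items():
            if value in (None, "", 0, "0"):
                counts[field] += 1
    return counts

def suspicious_fees(records, ceiling=1.0):
    """Return records whose fee exceeds the ceiling (here, above 100%)."""
    return [r for r in records if (r.get("fee") or 0) > ceiling]

# Hypothetical exported ATS records; field names are placeholders.
records = [
    {"fee": 0,    "salary": 90000, "start_date": "2024-01-15"},
    {"fee": 0.20, "salary": 0,     "start_date": ""},
    {"fee": 1.50, "salary": 85000, "start_date": "2024-03-01"},
]
print(count_missing_fields(records))  # one flagged value in each field
print(suspicious_fees(records))       # the record with a 150% fee
```

Running a pass like this against a CSV export gives you the same aggregate view as the report, without waiting on your ATS vendor's reporting module.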
Use an Average as a Placeholder
Perhaps from this exercise, you find that there are over a hundred fees with 0% values. Do you have the time to fix them all? Eventually, you should make the time as better data means better predictive analysis, but it’s often unrealistic to plow through all data errors in one sitting. Meanwhile, you want to view overall trends on this data. What to do?
A good approach is to compute your average and apply it to all missing data as a placeholder. This maintains the status quo of your trend over time, and while it’s not perfect data, it’s better than zeros that might completely throw off the trend lines.
To the left you can see that this set of data has an adjusted value of 20% applied to any fee fields flagged as having erroneous data. This can be easily applied in one step via Excel and uploaded to your ATS as a “quick fix” until you have the bandwidth to really scrub your data errors.
Make Some Noise About Errors
Honestly, if cars didn’t beep at us incessantly each time our seat belts were off, we’d probably wear them less. Take a cue from the seat belt alarm and make some noise whenever you see your employees making data errors. After all, it’s for their own good. They want full credit for their placements, right? Unless they enter data correctly into the ATS, they run the risk of not getting credit for a win. Your motto should be “if it’s not in the ATS, it never happened.”
In the example to the right, Nicholas Copernicus has a slew of high-priority errors on his Job Orders. A “beep” each time Nicholas doesn’t log his data carefully will go a long way toward getting him on track to using your ATS better, improving both his track record and the data quality of your entire organization. Another way to do this is to send nightly emails that point out data errors and expose them to the entire company. No one wants to be constantly on the naughty list; you can expect to see better compliance in just a few days.
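The nightly digest can be sketched as a simple group-by over flagged errors. Everything here is hypothetical, including the `owner` and `message` fields and the sample names; actually emailing the digest (e.g. via `smtplib`) is left out of the sketch.

```python
from collections import defaultdict

def nightly_digest(errors):
    """Group flagged data errors by the employee who owns the record."""
    by_owner = defaultdict(list)
    for err in errors:
        by_owner[err["owner"]].append(err["message"])
    lines = []
    for owner, messages in sorted(by_owner.items()):
        lines.append(f"{owner}: {len(messages)} high-priority error(s)")
        lines.extend(f"  - {m}" for m in messages)
    return "\n".join(lines)

# Hypothetical flagged errors pulled from the audit step.
errors = [
    {"owner": "Nicholas Copernicus", "message": "Job Order missing fee"},
    {"owner": "Nicholas Copernicus", "message": "Fee above 100%"},
    {"owner": "Ada Lovelace",        "message": "Missing start date"},
]
print(nightly_digest(errors))
```

Piping this output into a scheduled company-wide email is the “beep”: visible, repeated, and easy to make go away by fixing the data.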
How do you drive better usage of your ATS? Sound off in the comments below.