This article made it seem that there is an alternative predictor that is just as valid as cognitive testing. This is enlightening, but how likely are companies to use biodata over cognitive testing? Cognitive testing has received much more credibility and praise in the workplace. I would find it hard to believe that cognitive testing could be replaced by biodata for jobs that require high levels of cognitive ability.
The validation criteria for the biodata measure in this study were ratings by the immediate supervisor(s) of the degree to which the individual performed the job duties and of the individual's ability to meet the job requirements. Interestingly, validities for the ability ratings "averaged consistently higher than those for the duty ratings, even after correction for unreliability." Why might this occur? Could this be due to fundamental attribution error on the part of the raters?
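For reference, the "correction for unreliability" quoted here is presumably the standard correction for attenuation, which adjusts an observed validity coefficient for measurement error in the criterion ratings. A minimal sketch of the usual formula, where r_xy is the observed validity and r_yy is the criterion (rating) reliability; the study's exact procedure may differ:

    r'_{xy} = \frac{r_{xy}}{\sqrt{r_{yy}}}

Because duty and ability ratings may differ in reliability, the correction matters for comparing them, which is what makes the quoted finding interesting.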
I think I agree with the authors that it is plausible to have validity generalization for biodata predictors of job performance within a job family. However, the authors examined only first-line supervisors in this study. Could biodata measures demonstrate validity generalization across other job families or jobs?
How far back do you think it is reasonable to collect biodata? That is, how far into their pasts do you think participants should need to look to answer the questions? How long before someone's past behavior becomes irrelevant, if ever?
It seems that multilevel modeling would work well with these measures, because you could take the organizational level into account. I don't understand why they said biodata keys can decay over time. How would this happen?
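To make the multilevel idea concrete, here is a minimal sketch (hypothetical column names; assuming statsmodels is available) of a random-intercept model with supervisors' ratings nested within organizations, so the biodata-performance slope is estimated while allowing organization-level differences in average ratings:

    # Minimal sketch of a random-intercept multilevel model:
    # biodata scores predicting rated performance, with people
    # nested within organizations (hypothetical column names).
    import pandas as pd
    import statsmodels.formula.api as smf

    # Expected columns: org_id, biodata_score, performance_rating
    df = pd.read_csv("supervisor_ratings.csv")

    # Random intercept for each organization captures organization-level
    # differences in average ratings; the fixed slope is the pooled
    # biodata-performance relationship across organizations.
    model = smf.mixedlm("performance_rating ~ biodata_score",
                        data=df, groups=df["org_id"])
    result = model.fit()
    print(result.summary())

A model like this would also show how much the intercepts (and, if extended with a random slope, the validities) vary across organizations, which speaks directly to the validity-generalization question.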
I am eager to hear Roni's commentary on this article. I find it helpful and practical. If a well-developed autobiographical questionnaire instrument can be shown to generalize across a job family as large as first-line supervisors, this could be a very helpful tool for businesses to add incremental validity to a selection procedure.