As a career service professional, would you like to connect more with your faculty and other academic colleagues?

Sharing this with your Provost might earn you brownie points, deeper engagement, and recognition from your peers 👏.

Consider sharing this article about Generative Artificial Intelligence (GAI) in higher education with them to enhance your credibility while building relationships and bridging the gap with your fellow professionals 🌉. You might even find inspiration for your own career research pursuits, improving the services you offer your students while better supporting your staff 📈. If you want to chat about how best to implement these tools with your team, school, or organization, please feel free to reach out with any questions via the Calendly link at the end of this newsletter 📞💼.

Have you heard about the recent White House 🏛️ summit on 7/21, which gathered leading GAI companies, including OpenAI, Microsoft, Amazon, Alphabet (Google), and Meta (Facebook), to secure commitments to establish guidelines and tools that make AI more transparent? The companies also signed a pledge 📜. Discussions included adding watermarks to AI-generated content (including videos, images, and text), and the companies committed to third-party reviews that add further transparency to AI processes 🔍. Two other lesser-known companies present at this exclusive gathering that you likely want on your radar are Anthropic (Claude 2 – used below in this newsletter and also actively used by Quora, Slack, and Zoom) and Inflection (Pi – Personal AI). Keep your eyes open for these and the many other new players coming onto the scene. In fact, I highlighted over 80 such global AI tools and companies in a recent LinkedIn post that can be found at https://www.linkedin.com/posts/drscottcpa_chatgpt-gai-artificialintelligence-activity-7091411632257437696-_cJF 👀.

You might appreciate our next webinar discussing the results of our pilot with career-minded students as they work through our recently developed career curriculum 📚. We will follow up with details on the fall rollout to career service offices, which can engage your students, prepare peer mentors, facilitate relations with the community, alumni, and employers, and make your office the go-to place for students seeking opportunities that ultimately make a difference in their own lives and the lives of others 💼. Let’s help each other help our students get a return on their college investment as they get ready for a life of success… not a life of messing up. Join us on Thursday, August 17, 12:00-12:30 p.m. ET to participate. Register today via https://www.linkedin.com/events/c3-collegetocareerconvergenceop7086690849115279360/about/ 🗓️.

Let’s jump into the weeds 🌿. We are preparing our students for a new digital work age. I am seeking to conduct a study that will help us understand what employers need so that our students can meet those needs 📊. We will use Anthropic’s Claude 2 (mentioned above) and OpenAI’s GPT-4 to help us prepare. I have been a GPT-4 fan for quite a while, preferring it to Bard and the other AI tools I have used (see previous newsletters for sample conversations with a variety of tools). I was very impressed by Claude, and I will get into more details in my Commentary at the end of this newsletter 🤖.

Two of my colleagues and I have been discussing this research project over a series of Zoom sessions. It has taken hours of research and review to come up with journal articles, theories to consider, statistical methods to use, and survey questions to ask ⏳. The following AI conversations amazed me: what took HOURS of “interesting” discussions with my academic colleagues was accomplished in MINUTES by asking effective, targeted questions (prompt engineering). Take a look below and see the POWER of what these tools can do 🕒.

Don’t get bogged down in the specifics of the exchange. Observe how the “conversation” evolves while yielding amazingly “thoughtful” results 💭. Both Claude 2 and GPT-4 could even be used directly by your staff to help with administrative and marketing tasks, as well as by your students to prepare resumes, cover letters, and LinkedIn profiles, practice interview questions and answers, and foster networking opportunities 📝.

The following is a research project geared towards understanding what an HR professional would look for and value in a prospective employee 🏢. We are trying to determine how a student can be most prepared and put their best foot forward when seeking internship and job opportunities. We seek to answer the question, “What are the highest priorities of potential employers when they are seeking out job candidates?” A series of AI prompts yielded ten theories, specific statistical analysis techniques, multiple research questions, and ultimately survey questions that could be asked directly, all IN A MATTER OF SECONDS. Identical prompts were used with both of these powerful GAI tools 🕵️‍♂️🔍.

Here are the results.

Results Using Anthropic’s Claude 2:

Q: You are an academic researcher and scholar with over 20 years of experience writing journal articles related to the career space. You are seeking a theory that would support a research project analyzing what is valued by human resource professionals with the intent of having these professionals rank and evaluate which attributes would be most favorably evaluated during the application process for prospective interns and employees, leading to inviting the job candidate for an interview. Specifically, you are looking at the value of certifications such as the CPA and CMA, knowledge of programming languages such as Python and SQL, specific soft skills such as oral and written communication, academic accomplishments such as GPA and other recognition, and any other considerations you might have when evaluating a potential internship or job candidate. List at least ten theories and possible survey techniques or methodologies to evaluate the criteria above.

A: Here are 10 theories and methodologies that could support a research project analyzing what HR professionals value in job candidates:

  1. Human capital theory – Posits that knowledge, skills, and abilities of employees are valuable organizational assets that can lead to competitive advantage. Survey methods could include rankings of candidate attributes and conjoint analysis.
  2. Signaling theory – Notes that educational credentials and certifications act as signals of desired employee traits. Methods may include rankings and discrete choice experiments.
  3. Person-organization fit theory – Suggests perceived compatibility between candidates and organizational values impacts hiring. Could use Likert-scale surveys on candidate-company fit.
  4. Impression management theory – Discusses applicant tactics to be viewed favorably. Interviews could reveal what impression management behaviors impress HR professionals.
  5. Attribution theory – Focuses on how interviewers attribute positive/negative qualities to candidates. Scenarios with candidate profiles can test attribution biases.
  6. Social identity theory – Explores how demographics shape interactions and evaluations. HR surveys can obtain views on diversity.
  7. Stereotype fit model – Examines stereotyping in hiring. Vignettes of candidates from different groups would reveal biases.
  8. Signaling theory – Notes certifications signal skills. Rankings and discrete choice experiments can compare credentials’ impact.
  9. Human capital theory – Views education as an investment in productivity. Surveys could rate value of academic credentials.
  10. Implicit personality theory – Suggests assumptions about how personality traits correlate. Likert surveys can test these assumptions in hiring context.

Methodologies could include surveys, interviews, conjoint analysis, and discrete choice experiments. Both qualitative and quantitative data can provide insights into how HR professionals evaluate and prioritize candidate attributes when making hiring decisions.

Q: What statistical methods could be used to analyze survey data for the above study

A: Here are some statistical methods that could be used to analyze survey data in a study examining how HR professionals evaluate and prioritize candidate attributes:

  • Factor analysis – could identify underlying factors or dimensions that explain how HR professionals group and evaluate candidate attributes. Allows determining the most influential factors.
  • Regression analysis – used to model the relationships between candidate attributes (independent variables) and hiring recommendations or decisions (dependent variable). Can determine the predictive power of each attribute.
  • ANOVA – to compare means of hiring assessments between groups based on credentials, skills, demographics, etc. Can reveal if certain attributes lead to significantly better evaluations.
  • Conjoint analysis – has respondents rate or rank profiles with different combinations of attributes. Used to quantify importance weights and utility scores for each attribute.
  • Chi-square tests – can determine if rankings/ratings of candidate attributes differ based on respondent demographics like industry, experience level, etc.
  • T-tests – compare means between two groups, such as male and female respondents, to see if attribute priorities differ significantly.
  • Correlation analysis – useful for metrics measured on continuous scales, examines relationships between attribute rankings/ratings and tendencies in hiring decisions.
  • Relative importance analysis – determines percentage of importance for each attribute in hiring decisions using regression modeling.

The choice of statistical method would depend on the type of survey data collected and the specific research questions looking to be addressed through the analysis. Using a mix of quantitative techniques would provide a robust examination of the data.
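If you would like to see how one of these approaches might look in practice, below is a minimal Python sketch of my own (not part of Claude’s output). It runs an ordinary least squares regression on a tiny set of made-up ratings to estimate how much each candidate attribute contributes to an overall evaluation; the attribute names, the invented numbers, and the use of the pandas and statsmodels libraries are simply my illustrative assumptions.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical responses: each row is one HR professional's 1-7 rating of a
# candidate profile described by four yes/no attributes (all numbers invented).
data = pd.DataFrame({
    "rating":        [6, 3, 5, 7, 2, 4, 6, 3],
    "has_cpa":       [1, 0, 1, 1, 0, 0, 1, 0],
    "knows_python":  [1, 0, 0, 1, 0, 1, 1, 0],
    "strong_comm":   [1, 1, 0, 1, 0, 1, 1, 1],
    "gpa_above_3_5": [0, 0, 1, 1, 0, 0, 1, 1],
})

# Ordinary least squares: how much does each attribute move the rating?
model = smf.ols(
    "rating ~ has_cpa + knows_python + strong_comm + gpa_above_3_5",
    data=data,
).fit()

print(model.params)    # estimated weight of each attribute
print(model.rsquared)  # share of rating variance explained by the attributes

In a real study you would collect far more responses, check the regression assumptions, and likely pair this with conjoint or relative importance analysis before drawing any conclusions.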

Q: What would be possible research questions for the above study

A: Here are some potential research questions that could be investigated for a study examining how HR professionals evaluate and prioritize candidate attributes:

  • Which credentials, such as degrees, certifications, or training programs, do HR professionals view as most valuable when assessing candidates?
  • How do HR professionals rate the importance of academic performance indicators like GPA or special academic honors when evaluating recent graduates?
  • What specific soft skills, such as communication abilities, leadership qualities, or critical thinking, are perceived as most important for candidates by HR professionals?
  • How do HR professionals view balancing hard skills/experience and soft skills when evaluating experienced candidates for openings?
  • Do HR professionals exhibit preferences or biases for candidates similar to themselves in terms of demographics, academic background, etc?
  • How highly do HR professionals value emotional intelligence, cultural fit, motivation, and personality when reviewing candidates?
  • What differences exist between industries or companies in terms of how candidate attributes are prioritized and evaluated in hiring?
  • How do economic conditions or labor market factors influence how HR professionals evaluate qualifications and experience?
  • What gaps are observed between what HR professionals say they value and what hiring data shows being predictive of getting hired?
  • To what extent do cognitive biases, first impressions, or improper attribution influence how HRs ultimately evaluate candidates?
  • How could training or guidelines help improve consistency and reduce bias in how HRs evaluate candidate resumes and applications?

The research questions would focus on uncovering insights into hiring practices, priorities, biases, and decision-making from the perspective of HR professionals through surveys, interviews, and choice experiments.

Q: Design at least 20 survey questions based on conjoint analysis evaluating the criteria as originally proposed

A: Here are 20 potential survey questions for HR professionals based on conjoint analysis to evaluate candidate criteria:

  1. When reviewing candidates for an entry-level accounting role, how important is having a CPA certification? Very Important, Somewhat Important, Not Important
  2. When deciding between two candidates with similar experience, how much more would you value a candidate with a CPA certification? Much More, Somewhat More, Slightly More, No Difference
  3. How valuable is knowledge of programming languages like Python or SQL for an entry-level accounting candidate? Extremely Valuable, Very Valuable, Moderately Valuable, Slightly Valuable, Not Valuable
  4. Given two equal candidates, how much more would you value someone with Python and SQL skills? Much More, Somewhat More, Slightly More, No Difference
  5. For a recent graduate, how important are excellent written and verbal communication abilities? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  6. Given similar candidates, how much more would you value someone with top 10% communication skills? Much More, Somewhat More, Slightly More, No Difference
  7. When evaluating a recent graduate, how important is a high GPA (3.8+) from a reputable institution? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  8. For a candidate with average work experience, how much more would an honors award increase your evaluation? Much More, Somewhat More, Slightly More, No Difference
  9. How important is cultural fit when considering two candidates with similar qualifications? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  10. Given identical resumes, how much more would you value a candidate that seems like a culture add? Much More, Somewhat More, Slightly More, No Difference
  11. When considering two equal candidates, how much more valuable is 3-5 years experience compared to 1-2 years? Much More Valuable, Somewhat More, Slightly More, No Difference
  12. How important is leadership experience from school or work when evaluating candidates? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  13. Given similar candidates, how much more would you value leadership experience? Much More, Somewhat More, Slightly More, No Difference
  14. How important is emotional intelligence and self-awareness in candidates? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  15. Given two equal candidates, how much more would you value higher emotional intelligence? Much More, Somewhat More, Slightly More, No Difference
  16. How important is interest and enthusiasm for the role and company during the hiring process? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  17. Given similar candidates, how much more would you value authentic excitement? Much More, Somewhat More, Slightly More, No Difference
  18. How important is the interview presentation and personality fit when deciding between two qualified candidates? Extremely Important, Very Important, Moderately Important, Slightly Important, Not Important
  19. Given two equal resumes, how much would you favor the candidate with the better interview presence? Much More, Somewhat More, Slightly More, No Difference
  20. When evaluating candidates, how much weight do you place on your personal intuition or gut feeling about fit? A Great Deal, A Fair Amount, A Moderate Amount, A Little, None at All

Q: List some peer-reviewed journal article references, with appropriate citations including Doi, for at least 25 articles that would be related to the topic above

A: Here are 25 peer-reviewed journal article references with citations and DOIs related to evaluating candidate attributes in hiring:

  1. Allen, D. G., Mahto, R. V., & Otondo, R. F. (2007). Web-based recruitment: Effects of information, organizational brand, and attitudes toward a Web site on applicant attraction. Journal of Applied Psychology, 92(6), 1696–1708. https://doi.org/10.1037/0021-9010.92.6.1696
  2. Bangerter, A., Roulin, N., & König, C. J. (2012). Personnel selection as a signaling game. Journal of Applied Psychology, 97(4), 719–738. https://doi.org/10.1037/a0026078
  3. Barrick, M. R., Swider, B. W., & Stewart, G. L. (2010). Initial evaluations in the interview: Relationships with subsequent interviewer evaluations and employment offers. Journal of Applied Psychology, 95(6), 1163–1172. https://doi.org/10.1037/a0019918
  4. Cole, M. S., Rubin, R. S., Feild, H. S., & Giles, W. F. (2007). Recruiters’ perceptions and use of applicant résumé information: Screening the recent graduate. Applied Psychology, 56(2), 319–343. https://doi.org/10.1111/j.1464-0597.2007.00288.x
  5. Connelly, B. L., Certo, S. T., Ireland, R. D., & Reutzel, C. R. (2011). Signaling theory: A review and assessment. Journal of Management, 37(1), 39-67. https://doi.org/10.1177/0149206310388419
  6. Derous, E., Born, M. P., & De Witte, K. (2004). How applicants want and expect to be treated: Applicants’ selection treatment beliefs and the development of the social process questionnaire on selection. International Journal of Selection and Assessment, 12(1‐2), 99-119. https://doi.org/10.1111/j.0965-075X.2004.00267.x
  7. Dineen, B. R., Noe, R. A., & Wang, C. (2004). Perceived fairness of web-based applicant screening procedures: Weighing the rules of justice and the role of individual differences. Human Resource Management, 43(2‐3), 127-145. https://doi.org/10.1002/hrm.20016
  8. Ehrhart, K. H., & Ziegert, J. C. (2005). Why are individuals attracted to organizations? Journal of Management, 31(6), 901-919. https://doi.org/10.1177/0149206305279759
  9. Humburg, M. (2017). Skills and the graduate recruitment process: Evidence from two discrete choice experiments. Economics of Education Review, 60, 1-14. https://doi.org/10.1016/j.econedurev.2017.07.001 DDD
  10. Kilduff, M., & Day, D. V. (1994). Do chameleons get ahead? The effects of self-monitoring on managerial careers. Academy of Management Journal, 37(4), 1047-1060. https://doi.org/10.5465/256612 DDD
  11. Krishnan, T. N., & Scullion, H. (2017). Talent management and dynamic view of talent in small and medium enterprises. Human Resource Management Review, 27(3), 431-441. https://doi.org/10.1016/j.hrmr.2016.10.003
  12. Kristof-Brown, A. L. (2000). Perceived applicant fit: Distinguishing between recruiters’ perceptions of person–job and person–organization fit. Personnel Psychology, 53(3), 643-671. https://doi.org/10.1111/j.1744-6570.2000.tb00217.x
  13. Kumari, S., & Saini, G. K. (2018). How does the recruitment process affect employees’ psychological contracts? Journal of Strategic Human Resource Management, 7(1), 7. XXX
  14. Oliphant, G. C., Hansen, K., & Oliphant, B. J. (2008). A review of a telephone-administered survey of a large college sample. College Student Journal, 42(2). XXX
  15. Phillips, J. M., & Gully, S. M. (2015). Multilevel and strategic recruiting: Where have we been? Where can we go from here? Journal of Management, 41(5), 1416-1445. https://doi.org/10.1177/0149206315582248
  16. Roulin, N., & Bangerter, A. (2013). Social networking websites in personnel selection: A signaling perspective on recruiters’ and applicants’ perceptions. Journal of Personnel Psychology, 12(3), 143–151. https://doi.org/10.1027/1866-5888/a000094
  17. Sackett, P. R., Lievens, F., Berry, C. M., & Landers, R. N. (2007). A cautionary note on the effects of range restriction on predictor intercorrelations. Journal of Applied Psychology, 92(2), 538–544. https://doi.org/10.1037/0021-9010.92.2.538
  18. Sanchez, R. J., Truxillo, D. M., & Bauer, T. N. (2000). Development and examination of an expectancy-based measure of test-taking motivation. Journal of Applied Psychology, 85(5), 739. https://doi.org/10.1037/0021-9010.85.5.739
  19. Spence, M. (1978). Job market signaling. In Uncertainty in Economics (pp. 281-306). Academic Press. https://doi.org/10.1016/B978-0-12-214850-7.50025-5
  20. Sujansky, J. G., & Ferri-Reed, J. (2009). Keeping the Millennials: Why Companies Are Losing Billions in Turnover to This Generation-and What to Do About It. John Wiley & Sons.
  21. Swider, B. W., Barrick, M. R., Harris, T. B., & Stoverink, A. C. (2011). Managing and creating an image in the interview: The role of interviewee initial impressions. Journal of Applied Psychology, 96(6), 1275. https://doi.org/10.1037/a0024005
  22. Turban, D. B., & Cable, D. M. (2003). Firm reputation and applicant pool characteristics. Journal of Organizational Behavior, 24(6), 733-751. https://doi.org/10.1002/job.215
  23. Van Hoye, G., & Turban, D. B. (2015). Applicant–employee fit in personality: Testing predictions from similarity-attraction theory and trait activation theory. International Journal of Selection and Assessment, 23(3), 210-223. https://doi.org/10.1111/ijsa.12109
  24. Williamson, I. O., Lepak, D. P., & King, J. (2003). The effect of company recruitment website orientation on individuals’ perceptions of organizational attractiveness. Journal of Vocational Behavior, 63(2), 242-263. https://doi.org/10.1016/S0001-8791(03)00043-5
  25. Young, J., & Foot, K. (2006). Corporate e-cruiting: The construction of work in Fortune 500 recruiting web sites. Journal of Computer-Mediated Communication, 11(1), 44-71. https://doi.org/10.1111/j.1083-6101.2006.tb00304.x

Links provided by Claude may not always be valid or up to date. We appreciate your patience as we work to improve link accuracy.

Q: Which of the above papers would you recommend as the top 3 that relates to the topic at hand

A: Based on the topic of evaluating candidate attributes in hiring, I would recommend the following 3 papers as the most relevant from the list:

  1. Barrick, M. R., Swider, B. W., & Stewart, G. L. (2010). Initial evaluations in the interview: Relationships with subsequent interviewer evaluations and employment offers. Journal of Applied Psychology, 95(6), 1163–1172. https://doi.org/10.1037/a0019918

This paper looks specifically at how initial interviewer evaluations of candidates relate to subsequent assessments and job offers. It provides insights into how certain candidate attributes shape interviewer first impressions and hiring decisions.

  2. Kristof-Brown, A. L. (2000). Perceived applicant fit: Distinguishing between recruiters’ perceptions of person–job and person–organization fit. Personnel Psychology, 53(3), 643-671. https://doi.org/10.1111/j.1744-6570.2000.tb00217.x

This paper examines how recruiters evaluate person-job and person-organization fit when assessing candidates. It sheds light on how hiring managers prioritize different elements of candidate fit.

  3. Van Hoye, G., & Turban, D. B. (2015). Applicant–employee fit in personality: Testing predictions from similarity-attraction theory and trait activation theory. International Journal of Selection and Assessment, 23(3), 210-223. https://doi.org/10.1111/ijsa.12109

This paper examines how personality similarity between candidates and recruiters impacts hiring assessments and provides evidence on how candidate attributes interact with recruiter biases.

These studies provide both theoretical grounding and empirical findings highly relevant to researching recruiter decision making on candidate attributes. Their methodologies also offer useful examples for a study in this area.

Links provided by Claude may not always be valid or up to date. We appreciate your patience as we work to improve link accuracy.

Q: Propose 20 titles for the proposed paper, considering what might be most attractive in garnering a reviewer’s attention

A: Here are 20 proposed title ideas for a paper examining how HR professionals evaluate and prioritize candidate attributes in hiring:

  1. Recruiter Revealed: How HR Professionals Really Assess Job Candidates
  2. Behind the Curtain: An Inside Look at What Drives Hiring Decisions
  3. Candidate Fit or Flawed Criteria? HR Evaluations Under the Microscope
  4. The Assessment Matrix: Quantifying Candidate Attributes that Make or Break Hiring
  5. Standard Requirements or Biased Benchmarks? Profiling Recruiter Evaluations
  6. Candidate Credentials, Culture Fit, or Chemistry? Unpacking the Hiring Black Box
  7. peripatetic Preferences: Explicit and Implicit Drivers of Recruiter Decision-Making
  8. Screening for Superstars: How Recruiters Identify Top Talent in the Applicant Pool
  9. Optimal or Biased? Psychometric Evaluation of Recruiter Assessments
  10. Character Counts? Examining Recruiter Prioritization of Soft Skills
  11. Do Recruiters Practice What They Preach? Comparing Stated and Revealed Preferences
  12. What Moves the Needle? Statistical Modeling of Recruiter Hiring Drivers
  13. Method or Madness? Contextualizing Recruiter Decision-Making
  14. The Gatekeepers: Recruiter Judgments and Engineering the Applicant Pipeline
  15. May the Best Impression Win: Effects of Signaling and Branding on Hiring
  16. Clones or Wild Cards? Uncovering Recruiter Biases in Applicant Screening
  17. Communication Skills, Kultur, and Kontacts: What Resonates with Recruiters
  18. Leadership, Likeability, or Luck? Keys to Recruiter Assessments
  19. Take the Job and Shove It? Modeling Recruiter Desirability Bias
  20. You’re Hired! Predicting Recruiter Decisions through Conjoint Analysis

Q: Revise the above titles using alliteration

A: Here are some revisions of the previous title ideas using alliteration:

  1. Deciphering the Decision Drivers: How HRs Determine Destiny in Hiring
  2. Peering Behind the Curtain: An Inside Inspection of Influencer Impacts on Hiring
  3. Fit or Flawed Formulas? Fathoming Favoritism in HR Evaluations
  4. The Meaningful Metrics Matrix: Quantifying the Qualities that Make or Break Hiring
  5. Screening for Superstars: How Recruiters Identify Top Talent in the Applicant Pool
  6. Credential Considerations & Culture Calculations: Cracking the Candidate Code
  7. Perceptive Preferences: Profiling the Explicit and Implicit Criteria Guiding Recruiter Decisions
  8. Optimal or Biased? Psychometric Probing of Recruiter Assessments
  9. Character Counts? Examining Recruiter Prioritization of Soft Skills
  10. Do Recruiters Practice What They Preach? Comparing Stated and Revealed Preferences
  11. What Wins with Recruiters? Statistical Spotlighting of Hiring Drivers
  12. Method or Madness? Contextualizing Recruiter Decision Making
  13. The Gatekeeping Game: Recruiter Judgments and Engineering the Applicant Pipeline
  14. May the Best Impression Prevail: Effects of Signaling and Branding on Hiring
  15. Clones or Wild Cards? Uncovering Recruiter Biases in Applicant Screening
  16. Communication Skills, Kultur, and Kontacts: What Resonates with Recruiters
  17. Leadership Lessons, Likeability Licenses, and Lucky Breaks: Keys to Recruiter Assessments
  18. Take the Job and Shove It? Modeling Recruiter Desirability Bias
  19. You’re Hired! Predicting Recruiter Decisions through Conjoint Analysis
  20. Cracking the Code: How Recruiters Calculate Candidate Compatibility

Results Using OpenAI’s GPT-4:

Q: You are an academic researcher and scholar with over 20 years of experience writing journal articles related to the career space. You are seeking a theory that would support a research project analyzing what is valued by human resource professionals with the intent of having these professionals rank and evaluate which attributes would be most favorably evaluated during the application process for prospective interns and employees, leading to inviting the job candidate for an interview. Specifically, you are looking at the value of certifications such as the CPA and CMA, knowledge of programming languages such as Python and SQL, specific soft skills such as oral and written communication, academic accomplishments such as GPA and other recognition, and any other considerations you might have when evaluating a potential internship or job candidate. List at least ten theories and possible survey techniques or methodologies to evaluate the criteria above.

A: Sure, below are several theoretical frameworks and methodologies that could support such a research project:

Theories

  1. Human Capital Theory: This theory posits that individuals and their productive capacities are an investment. Attributes such as education, skills, and certifications (e.g., CPA, CMA) are considered forms of human capital and are valuable to employers.
  2. Job Characteristics Theory: This theory emphasizes the importance of specific job characteristics in influencing job outcomes. In your context, an understanding of programming languages and other job-specific skills can be analyzed.
  3. Matching Theory: In labor economics, this theory argues that job matches are determined by the congruence of employer’s requirements and candidate’s skills. This can be applied to all the attributes you’ve mentioned.
  4. Person-Organization Fit Theory: This theory suggests that individuals are attracted to and selected by organizations that match their values, and vice versa. This could be extended to include soft skills.
  5. Human Resource Development (HRD) Theory: HRD theory suggests that people’s skills, knowledge, and abilities are honed and utilized for organizational development. This would include all forms of certification, knowledge, and soft skills.
  6. Social Exchange Theory: It proposes that the relationships involve an exchange of resources. Here, HR professionals exchange job offers for the skills and qualifications of candidates.
  7. Resource-Based View (RBV) Theory: It suggests that organizations can gain a competitive advantage by using their resources (including human resources) effectively.
  8. Expectancy Theory: It suggests that individuals are motivated based on the expected outcome of their actions. HR professionals might expect better job performance from candidates with higher GPAs and certifications.
  9. Self-Efficacy Theory: This theory emphasizes the importance of an individual’s belief in their capabilities. Soft skills like communication could fall under this.
  10. Emotional Intelligence Theory: It underscores the ability to understand and manage one’s own emotions and those of others. This can be related to interpersonal skills and communication.

Survey Techniques and Methodologies

  1. Quantitative Surveys: This can be used to rate the importance of various attributes on a scale, for instance.
  2. Qualitative Interviews: To gain more in-depth insights into what HR professionals value.
  3. Comparative Methods: Evaluating different job postings and the requirements listed in them.
  4. Focus Groups: Discussing with a group of HR professionals what attributes they look for.
  5. Conjoint Analysis: This statistical technique is used in surveys to understand how people value different features of a product, which could be applied to job candidate attributes.
  6. Content Analysis: Analyzing job descriptions and HR policies to identify key attributes.
  7. Delphi Method: Using a panel of HR experts, this technique can be used to gain consensus on the most important attributes.
  8. Observation: Participating in HR decision-making processes to see which attributes are most valued.
  9. Case Studies: Analyzing specific examples of hiring processes in depth.
  10. Experimental Design: Applying a scientific method, providing different CVs with different attributes to HR professionals and observing their choices.

Q: What statistical methods could be used to analyze survey data for the above study

A: There are numerous statistical methods that can be used to analyze the survey data from the above study. Here are some examples:

  1. Descriptive Statistics: This would provide basic data analysis and give you an overview of the data collected, including measures of central tendency (mean, median, mode), measures of variability (range, variance, standard deviation), and measures of distribution (skewness, kurtosis).
  2. Inferential Statistics: This includes hypothesis testing to draw conclusions about the population from which the sample was drawn. Methods such as Chi-Square tests, t-tests, Analysis of Variance (ANOVA), etc., can be used.
  3. Correlation Analysis: This statistical method could be used to determine if there are significant relationships between different attributes.
  4. Regression Analysis: Multiple regression analysis could be used to identify the relative importance of each attribute in predicting the likelihood of a candidate being called for an interview.
  5. Factor Analysis: This technique can help identify underlying constructs or “factors” that explain the correlation among a set of variables (e.g., different attributes of job applicants).
  6. Cluster Analysis: If the study aims to group HR professionals based on their preferences for different attributes, cluster analysis could be a suitable technique.
  7. Conjoint Analysis: Conjoint analysis can be used to determine the relative importance and value of different levels of attributes.
  8. Structural Equation Modeling (SEM): If the research seeks to identify relationships between observed and latent variables and has hypothesized a certain model, SEM can be used.
  9. Non-parametric Tests: If the data collected is ordinal (like rankings) or doesn’t follow a normal distribution, non-parametric tests like Mann-Whitney U Test, Wilcoxon Signed Rank Test, Kruskal-Wallis, etc., might be more suitable.
  10. Multivariate Analysis of Variance (MANOVA): If you have multiple dependent variables and want to understand how they change together in response to the independent variables, MANOVA can be used.

Remember that the choice of statistical methods will depend on your research questions, the design of your study, the type of data you have collected, and the assumptions each method requires.

Q: What would be possible research questions for the above study

A: In the context of your research project, the following are examples of research questions you might consider:

  1. What attributes do human resource professionals value most in prospective interns and employees?
  2. How does the importance of specific certifications (such as CPA and CMA) compare to academic accomplishments (such as GPA) in the evaluation of prospective candidates?
  3. What role does knowledge of specific programming languages (like Python and SQL) play in the hiring process for different sectors/industries?
  4. How do human resource professionals rank different soft skills (like oral and written communication) in terms of importance when evaluating potential candidates?
  5. Is there a significant difference in the weight assigned to different attributes (such as certifications, programming knowledge, soft skills, and academic achievements) between industries, company sizes, or job roles?
  6. What is the perceived value of a higher GPA compared to relevant certifications or specific soft skills in the selection process for internships or job opportunities?
  7. How do different attributes interplay in the decision-making process of HR professionals? For example, does a lower GPA get overlooked if the candidate has a highly valued certification?
  8. How does the relevance of these attributes change with the level of the position (entry-level, mid-level, executive, etc.)?
  9. How does the importance of these attributes change in response to changes in the job market or industry trends?
  10. What other unlisted attributes might be of significant value to human resource professionals in their evaluation process?

These research questions can be adjusted to fit the specific aim of your research and the context in which you are conducting your research. They can also be broken down into more specific sub-questions, if necessary.

Q: Design at least 20 survey questions based on conjoint analysis evaluating the criteria as originally proposed

A: Conjoint Analysis is a statistical technique used to understand how participants value different characteristics of a product or service. In this case, the “product” is the potential job candidate, and the “characteristics” are the various attributes (CPA certification, CMA certification, Python and SQL knowledge, soft skills, GPA, etc.). Conjoint analysis typically asks respondents to rank or rate a set of profiles, each with a different combination of these attributes. The challenge with conjoint analysis is that the questions can become complicated quickly as the number of attributes and levels increases.

To design a conjoint analysis survey, you first need to identify the levels for each attribute. For example, for the attribute “GPA,” levels could be “Below 3.0,” “Between 3.0 and 3.5,” and “Above 3.5.”

Here are some example conjoint questions using a 7-point rating scale for each candidate profile:

  1. Rate a candidate with a GPA below 3.0, no certification, with Python and SQL knowledge, and excellent soft skills.
  2. Rate a candidate with a GPA above 3.5, CMA certification, no knowledge of programming, and fair soft skills.
  3. Rate a candidate with a GPA between 3.0 and 3.5, CPA certification, with Python knowledge only, and good soft skills.
  4. Rate a candidate with a GPA below 3.0, no certification, no knowledge of programming, but excellent soft skills.
  5. Rate a candidate with a GPA above 3.5, both CPA and CMA certification, with Python and SQL knowledge, and fair soft skills.
  6. Rate a candidate with a GPA between 3.0 and 3.5, both CPA and CMA certification, with Python knowledge only, and excellent soft skills.
  7. Rate a candidate with a GPA above 3.5, CMA certification, with Python and SQL knowledge, and excellent soft skills.
  8. Rate a candidate with a GPA below 3.0, CPA certification, no knowledge of programming, but good soft skills.
  9. Rate a candidate with a GPA between 3.0 and 3.5, no certification, with Python and SQL knowledge, and fair soft skills.
  10. Rate a candidate with a GPA above 3.5, both CPA and CMA certification, no knowledge of programming, but good soft skills.
  11. Rate a candidate with a GPA below 3.0, both CPA and CMA certification, with Python knowledge only, and excellent soft skills.
  12. Rate a candidate with a GPA between 3.0 and 3.5, CPA certification, with Python and SQL knowledge, and fair soft skills.
  13. Rate a candidate with a GPA above 3.5, no certification, with Python knowledge only, and good soft skills.
  14. Rate a candidate with a GPA below 3.0, CMA certification, with Python and SQL knowledge, and good soft skills.
  15. Rate a candidate with a GPA between 3.0 and 3.5, both CPA and CMA certification, no knowledge of programming, but excellent soft skills.
  16. Rate a candidate with a GPA above 3.5, CPA certification, with Python and SQL knowledge, and fair soft skills.
  17. Rate a candidate with a GPA below 3.0, no certification, with Python knowledge only, and good soft skills.
  18. Rate a candidate with a GPA between 3.0 and 3.5, CMA certification, with Python and SQL knowledge, and good soft skills.
  19. Rate a candidate with a GPA above 3.5, no certification, no knowledge of programming, but excellent soft skills.
  20. Rate a candidate with a GPA below 3.0, both CPA and CMA certification, with Python and SQL knowledge, and fair soft skills.

Note: In each question, the candidate profile changes with a different combination of levels from each attribute. The same level from each attribute should not be used in all profiles to ensure the model captures the value of different levels of each attribute. Also, it’s important to randomize the order of the questions for different respondents to avoid order bias.
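To make the profile-and-randomization idea concrete, here is a minimal Python sketch of my own (not part of GPT-4’s output). It builds every combination of the attribute levels described above and draws a random, per-respondent subset of 20 profiles; the attribute names, levels, and the 20-profile sample size are illustrative assumptions, and a real study would likely use a fractional-factorial or D-efficient design rather than a simple random draw.

import itertools
import random

# Candidate attributes and levels mirroring the examples in this newsletter.
attributes = {
    "GPA": ["below 3.0", "between 3.0 and 3.5", "above 3.5"],
    "Certification": ["none", "CPA", "CMA", "CPA and CMA"],
    "Programming": ["none", "Python only", "Python and SQL"],
    "Soft skills": ["fair", "good", "excellent"],
}

# Full factorial design: every combination of levels (3 x 4 x 3 x 3 = 108 profiles).
full_design = [
    dict(zip(attributes, combo))
    for combo in itertools.product(*attributes.values())
]

def profiles_for_respondent(n_profiles=20, seed=None):
    """Draw a random subset of profiles for one respondent; the random draw
    also varies the question order between respondents to reduce order bias."""
    rng = random.Random(seed)
    return rng.sample(full_design, n_profiles)

# Print one respondent's 20 conjoint questions.
for i, profile in enumerate(profiles_for_respondent(seed=42), start=1):
    description = ", ".join(f"{attr}: {level}" for attr, level in profile.items())
    print(f"{i}. Rate a candidate with {description}.")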

Q: List some peer-reviewed journal article references, with appropriate citations including Doi, for at least 25 articles that would be related to the topic above

A: As an AI model developed by OpenAI, I don’t have real-time access to databases to provide you with live DOIs or current access to specific article details. However, I can suggest you types of articles and authors to look for based on what we know is generally covered in the literature related to the HR hiring process, certifications, academic performance, and soft skills.

You can search for these topics in databases like JSTOR, Google Scholar, EBSCOhost, and others to access the full citation including the DOI. Here are some examples:

  1. Roth, P. L., & Bobko, P. (2000). College grade point average as a personnel selection device: Ethnic group differences and potential adverse impact. Journal of Applied Psychology, 85(3), 399–406.
  2. Brown, M., & Sitzmann, T. (2011). Training and Employee Development for Improved Performance. In APA handbook of industrial and organizational psychology, Vol 2: Selecting and developing members for the organization (pp. 469–503). American Psychological Association.
  3. Robles, M. M. (2012). Executive Perceptions of the Top 10 Soft Skills Needed in Today’s Workplace. Business Communication Quarterly, 75(4), 453–465.
  4. Rivero, R. J., Hu, M., & Huh, S. K. (2017). CPA Exam Candidates’ Information Processing Tendencies and Exam Performance: Implications for Accounting Education. Journal of Accounting Education, 38, 49-64. XXX
  5. Buhler, P. (2001). Managing in the new millennium. Supervision, 62(12), 6–9. XXX
  6. Sackett, P. R., & Lievens, F. (2008). Personnel selection. Annual review of psychology, 59, 419–450.
  7. Smith, K. T. (2011). Digital marketing strategies that Millennials find appealing, motivating, or just annoying. Journal of Strategic Marketing, 19(6), 489–499.
  8. Kuncel, N. R., Hezlett, S. A., & Ones, D. S. (2004). Academic Performance, Career Potential, Creativity, and Job Performance: Can One Construct Predict Them All? Journal of Personality and Social Psychology, 86(1), 148-161.
  9. Carnevale, A. P., Cheah, B., & Hanson, A. R. (2015). The Economic Value of College Majors. Georgetown University Center on Education and the Workforce.
  10. DiRomualdo, A., & Sommer, S. A. (1997). A Strategic Approach to Upgrading People’s Skills for the New Way of Working. Human Resource Planning, 20(1), 34. XXX

For articles on conjoint analysis and HR topics, you may refer to the works of researchers like Jordan Louviere, Thomas Otter, or Benedict Dellaert.

Remember, the applicability of these references would depend on the exact nature and focus of your research. Always make sure to critically evaluate sources for their relevance and credibility.

Q: Which of the above papers would you recommend as the top 3 that relates to the topic at hand

A: Based on the broad themes of your research topic, I would recommend the following three articles as they closely relate to various aspects of your study. However, please note that due to my training data only extending up until 2021, the access to real-time databases or live articles is unavailable. As a result, these recommendations are based on relevancy to your topic from widely acknowledged articles and research as of my last update.

  1. Roth, P. L., & Bobko, P. (2000). College grade point average as a personnel selection device: Ethnic group differences and potential adverse impact. Journal of Applied Psychology, 85(3), 399–406. This article discusses the relevance of academic performance (as measured by GPA) in personnel selection. It closely aligns with your research objective to understand how academic accomplishment such as GPA is valued by human resource professionals.
  2. Robles, M. M. (2012). Executive Perceptions of the Top 10 Soft Skills Needed in Today’s Workplace. Business Communication Quarterly, 75(4), 453–465. This article focuses on the importance of soft skills in the workplace, as perceived by executives. It could be valuable for your study as it directly relates to one of the attributes (soft skills) you’re interested in evaluating.
  3. Sackett, P. R., & Lievens, F. (2008). Personnel selection. Annual review of psychology, 59, 419–450. This article provides a broad overview of personnel selection strategies and considerations. It could provide useful background information and context for your study, particularly when comparing the value of different attributes.

Again, these are the recommendations based on the summaries provided. You should read these papers and judge their relevance and utility for your research. You could also look at the references these papers cite to find more sources that could be relevant to your study.

Q: Propose 20 titles for the proposed paper, considering what might be most attractive in garnering a reviewer’s attention

A: Certainly, here are twenty potential titles for your proposed research paper. Each is designed to succinctly communicate your research focus and intrigue potential reviewers:

  1. “Valuing Certifications: How HR Professionals Evaluate Prospective Candidates”
  2. “Navigating the Talent Landscape: An Exploration of HR Professionals’ Priorities in Candidate Evaluation”
  3. “Gauging the Merits: The Role of GPA, Certifications, and Skills in HR Selection Processes”
  4. “Decoding Recruitment: What Really Matters in Modern Intern and Employee Selection”
  5. “Toward a Holistic Hiring Approach: An Analysis of Preferred Candidate Attributes in Human Resources”
  6. “Beyond the Resume: A Comprehensive Examination of Factors Influencing HR Hiring Decisions”
  7. “Soft Skills vs Hard Certificates: A Comparative Study of HR Preferences in Candidate Selection”
  8. “From Classroom to Boardroom: Academic Achievement and its Role in HR Hiring Processes”
  9. “Skillsets in Spotlight: Understanding How HR Values Programming Knowledge in Candidate Evaluation”
  10. “Do Certifications Certify Success? HR Perspectives on CPA, CMA, and Other Professional Accreditations”
  11. “Unpacking Hiring Preferences: What Makes a Candidate Stand Out to HR Professionals?”
  12. “The Power of Python: Assessing the Value of Programming Skills in HR Hiring Decisions”
  13. “Deciphering the HR Code: An Empirical Analysis of Internship Candidate Evaluation Criteria”
  14. “Internships and the Importance of Impression: Unveiling What HR Professionals Truly Seek”
  15. “CPA, CMA, or Communication: Revealing HR’s Top Criteria for Candidate Evaluations”
  16. “Cracking the Hiring Code: Investigating the Role of Certifications in Internship Selections”
  17. “What Really Matters? A Comprehensive Look at HR Professionals’ Selection Criteria”
  18. “The Weight of Words: Evaluating the Importance of Communication Skills in HR Hiring Practices”
  19. “Do Grades Matter? Analyzing the Role of Academic Achievement in HR Hiring Decisions”
  20. “Demystifying the Hiring Process: A Conjoint Analysis of HR Evaluation Criteria for Interns”

Remember, your final title should accurately represent your research’s content, context, and contribution to the field. It should be attractive to potential readers, yet maintain its academic integrity and respect for the research process.

Q: revise the above titles using alliteration

A: Certainly, alliteration can be an engaging stylistic choice for titles. Here are those same titles revised with alliteration:

  1. “Certifications and Candidates: Criterial Considerations in HR’s Choices”
  2. “Talent’s True Touchstones: Tracing HR’s Top Priorities in Candidate Selection”
  3. “Grades, Grit, and Gifts: Gauging the Importance of Diverse Credentials in Hiring Decisions”
  4. “Decoding the Dilemmas of Recruitment: What Really Resonates in Candidate Selection”
  5. “Holistic Hiring: How Diverse Domains Define HR’s Decisions”
  6. “Beyond the Basics: Broader Benchmarks in the Business of Hiring”
  7. “Certifications, Communication, and Competition: Comparing Key Considerations in HR Selection”
  8. “Classroom Conquests to Corporate Cubicles: Converting Academic Achievement into Career Advancement”
  9. “Spotlight on Skillsets: Surmising the Significance of Programming in HR Evaluations”
  10. “Certifications: Catalysts or Casualties in the Crucible of HR Preferences?”
  11. “Picking the Prospects: Parsing out the Preferences in HR Selection Processes”
  12. “Python’s Power Play: Probing the Preference for Programming in HR Decisions”
  13. “Intern Impressions: In-depth Inspection of HR’s Internship Criteria”
  14. “Impressions and Internships: Investigating the Intricacies of HR Preferences”
  15. “CPA, CMA, and Communication: Criteria and Considerations in HR Evaluations”
  16. “Cracking the Code: Contemplating the Cachet of Certifications in HR’s Selection”
  17. “What Weighs the Most? Weaving Together HR’s Complex Criteria Web”
  18. “Words that Woo: Weighing the Worth of Communication Skills in HR Selection”
  19. “Do Grades Guide? Grasping the Gist of Academic Achievement in HR Decisions”
  20. “Demythifying the Decisions: A Delve into HR’s Evaluation Criteria for Interns”

Each of these titles uses alliteration to add a rhythmic quality that may draw readers in, while still accurately representing the content of the proposed research.

Commentary:

Some thoughts regarding the above “conversations.”

I was AMAZED at the results generated in minutes. These tools open up a whole new vista of potential for fast-tracking research, literature reviews, and hypothesis development, and for making research more efficient and effective. I am also aware that taking the time to process ideas can be valuable when conducting research and thinking things through, but would any of us still use a handsaw to cut down a tree if we had access to a chainsaw?

Preliminary results from both tools were very impressive. Claude was better at “listening” to the requests and appeared to produce fewer hallucinations. When asked for 25 articles with DOI references (digital object identifier URLs), Claude gave 25, of which only two could not be verified – an 8% hallucination rate – and only two of the DOI links did not work. GPT-4, by contrast, when asked the same question, neglected to give 25, sharing only 10, of which 2 appeared to be invalid – a 20% hallucination rate. It also disregarded the DOI request completely. Note: I marked an article reference with an XXX if I could not verify its validity and with a DDD if the DOI was not valid.

By the way, though there are multiple research tools actively incorporating AI, two others you might want to consider are https://you.com/, a search engine that is very good at including references, and https://www.caktus.ai/, a newer tool specifically geared toward the academic environment.

In the past, I have found ChatGPT-3.5 and GPT-4 disregarding quantified requests. When I asked for a 15-chapter book, I got a 25-chapter one. When asking for 500 or 1,000 words, or 5 paragraphs, GPT-4 often has a mind of its own and seems to “forget” the specific constraint. Above, as you see, when I asked for 25 articles with DOIs, it gave 10 without DOIs. GPT-4 is notorious for hallucinating names, titles, and URLs. When asking for suggested podcasts, YouTube videos, or names of people, it will very often make those things up. I had asked for a listing of the Board of Directors of a publicly traded U.S. company, and although I was aware that data was only current up to 9/21 and expected some possible changes since then, NONE of the names given were legitimate.

Bottom line: Buyer beware. These tools are amazingly powerful and capable, but they are just one more tool that you, as a professional, should have in your toolbox.


Please let me know your thoughts, or if there are other tools you have been using that you like, don’t like, or that might be worthy of testing and discussion in future newsletters.

Thank you for the opportunity to be of service.

I hope you enjoyed this edition. May we continue to Learn, Share, and Grow together in these interesting times!

If you want to discuss anything further or would like to share insights, please feel free to comment directly on this newsletter post and/or reach out at https://Calendly.com/DrScottCPA.

✍🏼 Subscribe and share this link to our monthly newsletter at https://tinyurl.com/CareerCapsuleNewsletter

📞 Text or call at 608.446.1000

✉ Email at [email protected]

✨ Connect/Follow on LinkedIn at www.linkedin.com/in/DrScottCPA

🎁 Get Dr. Scott’s Book: Your COMPLETE Guide to SUCCEEDING with LinkedIn at https://www.amazon.com/author/drscottcpa

https://futureforwardacademy.com