Certified Analytics Professional
INFORMS prepares to launch first-of-its-kind program.
By Scott Nestler, Jack Levis and Bill Klimack
The Institute for Operations Research and the Management Sciences (INFORMS), publisher of Analytics magazine, is launching an analytics certification program (Certified Analytics Professional or CAP) designed to “enable analytics professionals (and their employers) to have confidence that a person will bring a core set of analytics skills to a project team.” Open to all qualified analysts, the first-of-its-kind certification will consist of a standardized test, as well as a review of the candidate’s resume or work portfolio. The program was outlined earlier this year in the March/April issue of Analytics. Following is an update on the program, with the first exams scheduled for the 2013 INFORMS Conference on Business Analytics & Operations Research in San Antonio, Texas, April 7-9, 2013, followed by a second opportunity at the 2013 INFORMS Annual Meeting, Oct. 6-9, 2013 in Minneapolis.
INFORMS has now published the eligibility criteria for the CAP certification at INFORMS Online (www.informs.org/Build-Your-Career/Certification), which includes the following:
- BA/BS (or higher) degree, and
- at least five years of analytics work-related experience for BA/BS holder in related area, or
- at least three years of analytics work-related experience for MA/MS (or higher) holder in related area, or
- at least seven years of experience for those with BA/BS in unrelated area, and
- verification of soft skills/provision of business value by employer.
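Read together, the criteria form a simple decision rule: a degree, plus a minimum amount of analytics experience that depends on degree level and field, plus employer verification. As an illustration only (the function name and inputs below are hypothetical, not part of the official CAP application process, and the handling of an advanced degree in an unrelated area is an assumption), the logic can be sketched as:

```python
def meets_cap_eligibility(degree_level, related_field, years_experience, employer_verified):
    """Sketch of the published CAP eligibility rules (illustrative only).

    degree_level: "BA/BS" or "MA/MS+" (bachelor's vs. master's or higher)
    related_field: True if the degree is in an analytics-related area
    years_experience: years of analytics work-related experience
    employer_verified: employer has verified soft skills / provision of business value
    """
    if degree_level not in ("BA/BS", "MA/MS+"):
        return False  # a BA/BS or higher degree is required
    if not employer_verified:
        return False  # employer verification is required in all cases
    if degree_level == "MA/MS+" and related_field:
        required = 3   # advanced degree in a related area
    elif related_field:
        required = 5   # BA/BS in a related area
    else:
        required = 7   # degree in an unrelated area (assumed to apply at any degree level)
    return years_experience >= required
```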
Job Task Analysis
In general, a Job Task Analysis (JTA) is a comprehensive description of the duties and responsibilities of a profession, occupation or specialty area; our approach consists of four elements: 1) domains of practice, 2) tasks performed, 3) knowledge required for effective performance on the job, and 4) domain weights that account for the importance of and frequency with which the tasks are performed. More specifically, the JTA for the CAP program can be viewed as an outline of a partial body of knowledge, as it represents a delineation of common or typical tasks performed and knowledge applied by analytics professionals, grouped together in a hierarchical domain structure. In the course of analytics work, these tasks may be performed multiple times with modifications based on data, findings and results, as part of ongoing feedback loops that are routinely a part of practice. The JTA serves as the test blueprint for exam development and links what is done on the job with what is measured by the certification examination. This linkage is necessary to establish a valid, practice-related examination. It is important to realize that the JTA is a dynamic document that will change in the future to reflect best practices and changes in the analytics profession.
The JTA outlined in this article was developed by the INFORMS Analytics Credentialing Job Task Analysis Working Group, comprising 12 subject matter experts (SMEs) (see box) who are: highly regarded in their field; diverse in geography, sector (public-private), organization type (e.g., large companies-smaller consulting firms, practice-academia, etc.) and application area (e.g., finance, logistics, software, consumer goods, etc.); and representative of the descriptive, predictive and prescriptive segments of analytics. Additionally, the working group contains four members in common with the task force and two INFORMS directors, helping to ensure continuity with existing governance structures.
Additionally, since CAP is designed to attract analytics professionals who are not currently members of INFORMS, the working group contains some non-members. In developing the JTA, members of the working group relied upon: their knowledge of practice gained from years of experience, academic program content, corporate job descriptions in analytics and articles from professional and scholarly publications. As outlined in the earlier update, the JTA Working Group proposed, and the Credentialing Task Force and Board of Directors approved, that the CAP assess to some level of depth across the breadth of knowledge needed in analytics. The evaluation of more detailed knowledge in specific areas or applications will be done later in “add-on” certifications, pending the successful development and deployment of CAP.
Domains Provide Top-Level Structure
The 36 typical tasks and 16 knowledge statements (not provided here) in the analytics JTA are organized in seven domains, as listed in Figure 1. Tasks are specific goal-directed work activities or groups of closely related work activities that describe identifiable behaviors, while knowledge is an organized body of information that, when applied, makes possible the competent and effective performance of the work activities described by a task.
Figure 1 also shows domain weights, which are based on the SMEs’ assessments of the importance of tasks and the frequency of their performance. Mean weights of the working group members were used as starting points for discussion and debate that continued until consensus was reached. The weights will be used in the exam construction process to ensure content mixture validity. Within each of these seven domains, the JTA Working Group identified a number of tasks that must be performed by practitioners of analytics.
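Using domain weights to "ensure content mixture validity" amounts to allocating exam questions across domains in proportion to their weights. As a sketch only (the weights shown are placeholders standing in for Figure 1, and largest-remainder rounding is our assumption about how fractional allocations might be resolved, not a documented CAP procedure):

```python
def allocate_items(domain_weights, total_items):
    """Allocate exam questions to domains in proportion to their weights,
    using largest-remainder rounding so the counts sum to total_items."""
    total_weight = sum(domain_weights.values())
    raw = {d: total_items * w / total_weight for d, w in domain_weights.items()}
    counts = {d: int(r) for d, r in raw.items()}
    # hand leftover items to the domains with the largest fractional remainders
    leftover = total_items - sum(counts.values())
    for d in sorted(raw, key=lambda d: raw[d] - counts[d], reverse=True)[:leftover]:
        counts[d] += 1
    return counts

# placeholder weights for illustration -- the actual weights appear in Figure 1
weights = {"Domain 1": 15, "Domain 2": 17, "Domain 3": 22,
           "Domain 4": 15, "Domain 5": 16, "Domain 6": 9, "Domain 7": 6}
print(allocate_items(weights, 100))
```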
Successful performance of the tasks requires specific knowledge, which is what will be tested. At this time, the supporting knowledge statements are being used to develop items (questions) for the first exam and, as such, are not publicly available. We anticipate releasing them as part of the CAP Candidate Handbook and at INFORMS Online in October 2012.
Figure 1: Domains, descriptions and weights in the JTA.
In order to ensure that the JTA Working Group had not missed anything important in the practice of analytics, the JTA and an associated questionnaire were sent out to a random sample of INFORMS members and non-members. More than 200 analytics professionals from various regions of the United States, Europe and Asia/Pacific responded to the survey; approximately three-quarters of these were not INFORMS members. Non-member respondents included previous respondents of the 2011 certification feasibility study and registered subscribers of Analytics magazine. Survey participants were asked to perform three tasks in their review of the draft JTA document:
- Identify those domains, tasks, or knowledge statements they would like to remove, reword, or revise.
- Suggest new domains, tasks, knowledge or skill statements that they would like to add.
- Confirm or suggest changes to the weights based on their ranking of importance and frequency.
After reviewing the results of the survey, including a thorough report prepared by the certification consultant, the JTA Working Group met again by telephone in February 2012 to clarify and improve the JTA. The agreed-upon changes primarily included the addition of examples of concepts and definitions to most of the knowledge statements in order to ensure understandability.
In May 2012, a call for question writers went out via the INFORMS subdivision (section and society) officers and also to those who responded to the feasibility study in 2011 saying they wanted to help with development of an analytics certification program. Nearly 50 volunteers, a mix of academics and practitioners, initially responded; most, but not all, are INFORMS members. In early June 2012, a consulting psychometrician trained about 30 volunteers on how to write multiple-choice questions for the exam. The training had several goals: to maximize the quality of the test; to ensure the accuracy, fairness and validity of the test; to minimize the measurement error of the test; and most importantly, to minimize errors in the classification of candidates as certified or not. Item writers were asked to keep the following general guidelines in mind:
- Does the question test something relevant and non-trivial?
- Does the question reflect current best-practice?
- Is the question stated clearly enough so that the knowledgeable candidate will be able to select the correct choice without undue hesitation?
- Is the context, setting and content of the question equally appropriate and familiar to all segments of the candidate population, including minority groups?
- Is the question free of language and/or descriptions that might be offensive to any segment of the candidate population?
- Is the content of the question free of language, descriptions or terminology that could reinforce common stereotypes concerning any segment of the candidate population?
Over a six-week period, this group worked individually to develop multiple-choice questions for the certification exam. In addition to the question, each writer provided a correct answer, three incorrect but plausible answers, a reference or citation, and a suggestion of which domain, task and knowledge statements the question was useful to assess. In the first round, this group provided more than 200 questions, which were reviewed by 12 volunteers in late June 2012. The question writer and reviewer group included four members of the JTA Working Group, who helped to guide the review effort.
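Each submitted item therefore carries a fixed structure: a question stem, one keyed answer, three incorrect but plausible distractors, a supporting reference, and the JTA domain/task/knowledge it assesses. A minimal sketch of that structure, with validation of the distractor rule (the field names are illustrative, not an official CAP schema):

```python
from dataclasses import dataclass

@dataclass
class ExamItem:
    """One multiple-choice exam item as described in the article.

    Holds the question stem, the keyed (correct) answer, three incorrect
    but plausible distractors, a supporting reference or citation, and the
    JTA domain, task and knowledge statement the item is meant to assess.
    """
    stem: str
    correct: str
    distractors: list
    reference: str
    domain: str
    task: str
    knowledge: str

    def __post_init__(self):
        # enforce the submission format described above
        if len(self.distractors) != 3:
            raise ValueError("each item needs exactly three incorrect but plausible options")
        if self.correct in self.distractors:
            raise ValueError("the keyed answer must not also appear among the distractors")
```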
In conjunction with a certification consultant, the INFORMS Certification Task Force prepared a draft Policies and Procedures (P&P) manual for the CAP program. Further work on an independent governance board for the program continues as well. Additionally, the INFORMS staff is working on a branding and marketing plan for CAP. The marketing department developed a logo that was recently approved by the Certification Task Force.
Test development efforts, including additional question writing, creation and review of two exam forms, and a cut score study, are ongoing. Development of a detailed marketing and communications plan, a candidate handbook, marketing materials and a Website, as well as the back-end accounting and IT systems, is also underway. The INFORMS Certification Task Force recognizes the importance of making additional details available as soon as possible given the proposed administration date of the first certification exam. To find out more, check the CAP page at the INFORMS Website from time to time. Similarly, keep reading OR/MS Today and Analytics magazine, as further updates will be shared there as well.
Scott Nestler is a colonel in the U.S. Army, currently attending the Army War College. Jack Levis is the director of Process Management at UPS and the INFORMS VP for Practice Activities. Bill Klimack is a decision analysis consultant at Chevron and the INFORMS VP for Meetings. All three are members of both the INFORMS Certification Task Force and also the INFORMS Analytics Certification Job Task Analysis Working Group.
INFORMS Analytics Credentialing Job Task Analysis Working Group
Jeff Camm (Univ. of Cincinnati)
Arnie Greenland (IBM Global Bus. Serv.)
Bill Klimack (Chevron) * #
Jack Levis (UPS) * #
Daymond Ling (Canadian Imperial Bank of Commerce) N
Freeman Marvin (Innovative Decisions Inc.)
Scott Nestler (Naval Postgraduate School) #
Jerry Oglesby (SAS)
Michael Rappa (NC State / Inst. Adv. Analytics) #
Tim Rey (Dow Chemical)
Rita Salam (Gartner) N
Sam Savage (Stanford / Vector Economics)
* - INFORMS Board Member
# - Credentialing Task Force Member
N – Non-Member of INFORMS